
Slide 1: Evaluating Open Source Software
William Cohen, NCSU CSC 591W, January 14, 2008
Based on David Wheeler, “How to Evaluate Open Source Software / Free Software (OSS/FS) Programs,” http://www.dwheeler.com/oss_fs_eval.html

Slide 2: Why Evaluate Existing Software?
● Determine if suitable software already exists
● Select the best software possible
● Make use of already available work

Slide 3: Evaluation Steps
● Identify candidates
● Read reviews
● Compare leading candidate programs for basic attributes
● Analyze top candidates in more depth

Slide 4: Identify Candidates
● Search sites for OSS software packages:
  ● Freshmeat: http://freshmeat.net
  ● SourceForge: http://sourceforge.net
  ● Savannah: http://savannah.gnu.org/
  ● FSF Software Directory: http://directory.fsf.org/
● Use a search engine

Slide 5: Effective Searching
● Possible search-engine biases:
  ● May not show software competing with their own offerings
  ● Paid advertising placed before the links of interest
● Search terms:
  ● Identify key words and combinations
  ● Use previously found package names
  ● Use data formats that the software might use, e.g. jpg

Slide 6: Read Reviews
● Make use of other people's experience with the software:
  ● May be faster than installing the software locally
  ● May compare the various software packages you are already considering
● Search for reviews via search engines
● Review biases:
  ● The review's publisher may accept ad revenue
  ● Software fans may bias votes or comments on the software

Slide 7: Software Market Share
● Hard to determine how many copies are running:
  ● Users do not have to report the number of copies they run
● Downloads can hint at relative use, but not absolute use:
  ● One download may be used many times
  ● Software fans may influence download numbers
● Data on network-visible attributes:
  ● Websites indicate the software used to build them
  ● Web browsers identify themselves, e.g. Mozilla, Internet Explorer
● Look at which distributions include the software package
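The network-visible-attribute idea above can be put into code: a web server's HTTP `Server` response header usually names the software (and often the version) serving the site. Below is a minimal sketch of a parser for such a header, assuming the common `product/version` token format; the function name and sample headers are illustrative, not from the original deck.

```python
import re

def server_products(header):
    """Split an HTTP Server header into (product, version) pairs.

    Comments in parentheses, e.g. "(Ubuntu)", are stripped first;
    each remaining token is treated as "product/version".
    """
    tokens = re.sub(r"\([^)]*\)", "", header).split()
    pairs = []
    for tok in tokens:
        name, _, version = tok.partition("/")
        pairs.append((name, version or None))
    return pairs

print(server_products("Apache/2.4.57 (Ubuntu)"))  # [('Apache', '2.4.57')]
print(server_products("nginx/1.24.0"))            # [('nginx', '1.24.0')]
```

Collected over many sites, such headers give a rough (and spoofable) picture of which server software is actually deployed.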

Slide 8: Compare Candidates for Basic Attributes
● Reduce the list of possible software down to “likely candidates”
● Examine the OSS project web pages for basic attributes:
  ● Brief description of the project
  ● Frequently Asked Questions (FAQs)
  ● Project documentation
  ● Mailing lists
  ● Links to related projects

Slide 9: Basic Attributes
● Functionality
● Cost
● Market Share
● Support
● Maintenance
● Reliability
● Performance
● Scalability
● Usability
● Security
● Flexibility/Customizability
● Interoperability
● Legal/License issues

Slide 10: Functionality
● Does it work on the desired platform, e.g. Microsoft Windows, Apple OS X, or Linux?
● Have a list of the functionality needed/desired
● Does the program do what you want?
● How hard would it be to add missing functionality to the OSS?

Slide 11: Cost
● “Free” in Free Software refers to liberty
● There may be nominal costs for getting the software on media
● Setup costs:
  ● Initial installation
  ● Migrating existing data
  ● Training
  ● Hardware needed
  ● Staffing
● License fees
● Upgrade/maintenance
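The cost categories above can be combined into a simple total-cost-of-ownership comparison: one-time setup costs (installation, migration, training) plus recurring annual costs (licenses, maintenance, staffing) over the planning period. All figures below are made-up placeholders to show the arithmetic, not real prices.

```python
def total_cost(setup, annual, years):
    """One-time setup cost plus recurring annual costs over a period."""
    return setup + annual * years

# Hypothetical: OSS is setup-heavy (migration, training), the
# proprietary option is recurring-fee-heavy (licenses, maintenance).
oss = total_cost(setup=12000, annual=3000, years=3)
proprietary = total_cost(setup=4000, annual=9000, years=3)
print(oss, proprietary)  # 21000 31000
```

The point of the slide survives the toy numbers: “free” license fees do not make the total cost zero, and the comparison depends heavily on the time horizon chosen.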

Slide 12: Market Share
● Is a significant number of people using the software?
● A large user base means:
  ● Many people find the software useful
  ● Support for the software is more likely

Slide 13: Support
● Help with training, installing, and fixing issues with the software
● Is the documentation readable and useful?
● Community support:
  ● Generally free
  ● FAQs, IRC, and mailing lists
● Commercial support:
  ● Companies
  ● Consultants

Slide 14: Maintenance
● Is the project mature?
● Is there active work on the project?
● Is there a bug/issue tracking system?
● Are issues with the software being addressed in a timely manner?

Slide 15: Reliability
● How does the website describe the maturity of the software?
● Are people using the software in production environments?
● Are there test suites to check that the software functions?
● Are the test suites run regularly to make sure that the code base continues to function?
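The last two bullets are easy to check concretely: a healthy project ships automated tests and runs them on every change. As a minimal sketch of what such a regression test looks like, here is a hypothetical helper with a `unittest` case run programmatically (the helper and its name are illustrative, not from the deck):

```python
import unittest

def parse_version(s):
    """Hypothetical helper under test: "1.2.3" -> (1, 2, 3)."""
    return tuple(int(part) for part in s.split("."))

class ParseVersionTest(unittest.TestCase):
    """A regression test a project might run on every commit, e.g. in CI."""
    def test_parse(self):
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))

result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ParseVersionTest))
print(result.wasSuccessful())  # True
```

When evaluating a candidate, look for a `tests/` directory and evidence (build logs, CI badges) that suites like this actually run regularly.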

Slide 16: Performance and Scalability
● Performance:
  ● How fast (or slow) is the software?
  ● Posted benchmarks may not be applicable to your application
  ● You may need to benchmark yourself when doing detailed analysis
● Scalability:
  ● Are people using the software in similarly sized configurations?
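The advice to benchmark with your own workload can be sketched with the standard-library `timeit` module: time each candidate on the same representative input. The two functions below are stand-ins for two candidate implementations, not real packages.

```python
import timeit

def candidate_a(data):
    """Stand-in for one candidate implementation."""
    return sorted(data)

def candidate_b(data):
    """Stand-in for a competing implementation of the same task."""
    out = list(data)
    out.sort()
    return out

# Use an input shaped like your real workload, not the vendor's demo data.
data = list(range(1000, 0, -1))
for fn in (candidate_a, candidate_b):
    t = timeit.timeit(lambda: fn(data), number=1000)
    print(f"{fn.__name__}: {t:.4f}s")
```

Verifying first that both candidates produce identical results on your data keeps the timing comparison honest.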

Slide 17: Usability
● How easy is it to use the software?
● How much work is it to set up a new user?
● How difficult is it to do common tasks?
● Interfaces:
  ● Library Application Programming Interfaces (APIs)
  ● Graphical User Interface (GUI)
  ● Command Line Interface (CLI)
● Does the software follow interface guidelines?

Slide 18: Security
● Static analysis of OSS:
  ● Coverity: http://www.coverity.com/
  ● Klocwork: http://www.klocwork.com/
● Security issues:
  ● MITRE's CVE: http://cve.mitre.org/
● Are developers analyzing and reviewing the code?
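To make the static-analysis idea concrete, here is a toy check in the spirit of the tools above, though vastly simpler than how Coverity or Klocwork actually work: parse source into a syntax tree with Python's standard-library `ast` module and flag calls to `eval()`, a common source of code-injection bugs.

```python
import ast

def find_eval_calls(source):
    """Return the line numbers of direct eval() calls in Python source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "eval"):
            findings.append(node.lineno)
    return findings

sample = "x = eval(user_input)\ny = len(x)\n"
print(find_eval_calls(sample))  # [1]
```

Real analyzers track data flow across functions and files; the value for evaluation purposes is seeing whether a project runs any such tool at all and fixes what it reports.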

Slide 19: Flexibility/Customizability
● How easy is it to adapt the code?
● Examples:
  ● GCC's ability to be ported to new architectures
  ● Plugins for web browsers

Slide 20: Interoperability
● Able to work with standard formats
● Important for data exchange with others and for use of existing data
● Examples:
  ● Image manipulation systems
  ● Word processors
  ● Databases

Slide 21: Legal/License issues
● Some End User License Agreements (EULAs) have clauses that may be unacceptable:
  ● Compliance audits
  ● Providing the vendor with private information
● Available source code doesn't guarantee OSS: check the license; the copyright owner may restrict copying
● License information (OSS license compatibility):
  ● OSI: http://www.opensource.org/licenses
  ● FSF: http://www.fsf.org/licensing/licenses/
● Restrictive vs. non-restrictive OSS licenses
● Legal suits
● Software patents

Slide 22: Evaluating Attributes
● Attribute importance varies with the application:
  ● Market share may be relatively unimportant for a research project
  ● Legal/license issues can be very important if you plan to redistribute code
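One common way to act on this slide is a weighted scoring matrix over the basic attributes from slide 9: weight each attribute by importance for your application, score each candidate, and compare the weighted sums. The weights, candidate names, and scores below are purely illustrative.

```python
# Hypothetical weights (importance for *your* application, 1-5 scale).
weights = {"functionality": 5, "support": 3, "license": 4, "market_share": 1}

# Hypothetical per-candidate scores on the same 1-5 scale.
candidates = {
    "package_a": {"functionality": 4, "support": 2, "license": 5, "market_share": 3},
    "package_b": {"functionality": 3, "support": 4, "license": 3, "market_share": 5},
}

def weighted_score(scores, weights):
    """Sum of weight * score over the attributes being compared."""
    return sum(weights[attr] * scores[attr] for attr in weights)

for name, scores in candidates.items():
    print(name, weighted_score(scores, weights))  # package_a 49, package_b 44
```

A research project might set `market_share` to 0 and a redistributed product might double the `license` weight, which can flip the ranking; the matrix makes that trade-off explicit.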

Slide 23: Analyze Top Candidates in Detail
● By now the field is narrowed to a small list of possible software packages
● Get firsthand experience with the candidate software
● Examine the code base with an eye to adding functionality
● Benchmark the code
● Check reliability (does it handle problems gracefully?)

Slide 24: Additional Information
● http://www.dwheeler.com/oss_fs_eval.html
● http://www.karinvandenberg.nl/Thesis.pdf
● http://www.openbrr.org

