
1 A Suggested Methodological Framework for Evaluating and Selecting an Open Source LMS
Dr Philip Uys, Manager, Educational Design and Educational Technology, Centre for Enhancing Learning and Teaching
Matt Morton-Allen, Teaching, Learning and Community Source Liaison Officer

2 Introduction
- Set out in Feb 2006 to enhance the virtual learning environment
- Became the Online Learning Environment (OLE) Programme
- Originally focused on individual tools but morphed to a framework emphasis
- Started with 12 possible solutions: a mix of open source, commercial and in-house options
- Ended with two open source options; selected Sakai

3 Fast Track Approach
- Initially used a "fast track" approach
- Attempted to avoid lengthy investigation of requirements
- Focused on reusing previously supplied high-level business requirements
- Success hinged on the ability to easily identify a low-risk solution

4 Fast Track Approach (cont.)
- The reality was that too little information meant too many options
- Too many options meant too high a risk
- High risk combined badly with a lack of process transparency
- Also did little to bring together cross-silo issues between requirements

5 A Different Approach
- Once "fast track" was abandoned, an alternative was needed
- Extensive experience in the group was not sufficient to address OSS complexities
- A short environment scan showed two possible frameworks:
  - Business Readiness Rating
  - Open Source Maturity Model

6 Business Readiness Rating
http://www.openbrr.org/
- Geared at helping evaluate OSS
- Identifies 12 criteria, each with its own tests
- Suggests only a portion of these be applied
- Assigns a weight to each test within a criterion, giving an end score (a sketch of this style of scoring follows below)
- Has online records of submissions made by others
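To make the two-level scoring concrete, here is a minimal sketch of a BRR-style calculation. The criteria, weights and scores are invented for illustration; this is not the official openBRR worksheet, only the shape of it: weighted tests inside each criterion roll up into a weighted overall rating.

```python
# Hypothetical BRR-style calculation (all data invented): each criterion
# holds weighted tests scored on a 1-5 scale, and the weighted criterion
# scores roll up into a single rating.

criteria = {
    "Functionality": {
        "weight": 0.5,
        "tests": [  # (test name, weight within criterion, score 1-5)
            ("Feature coverage", 0.6, 4),
            ("Standards support", 0.4, 3),
        ],
    },
    "Community": {
        "weight": 0.5,
        "tests": [
            ("Contributor activity", 0.5, 5),
            ("Release cadence", 0.5, 4),
        ],
    },
}

def brr_rating(criteria):
    total = 0.0
    for crit in criteria.values():
        # Weighted average of the tests inside this criterion.
        crit_score = sum(w * s for _, w, s in crit["tests"])
        total += crit["weight"] * crit_score
    return total

print(round(brr_rating(criteria), 2))  # 4.05 for the sample data above
```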

7 Business Readiness Rating (cont.)
- Looked at closely, BRR has some flaws
- The measures of the tests within criteria are high level and generic
- This leads to a need to revise them for local use
- Or be faced with the possibility of an undiminished pool of options
- Either way, you need to move beyond BRR for a final decision
- In retrospect it could have been useful early on as a filter

8 Open Source Maturity Model
http://www.navicasoft.com/pages/osmm.htm
- Another model that could be considered, but limited
- The OSMM assesses the maturity level of all key product elements:
  - Software
  - Support
  - Documentation
  - Training
  - Product integration
  - Professional services

9 A Different Approach
- When neither BRR nor OSMM seemed to fit, we began to consider afresh
- Agreed on the need to be:
  - Flexible: willing to adapt throughout
  - Aligned: consistent with strategy
  - Comprehensive: extensive and in-depth investigation
  - Transparent: rigorous debate
- Devised the FACT framework for our own needs

10 The FACT Framework
1. Identify requirements
2. Weigh the requirements
3. Identify possible solutions
4. Identify "killer" requirements
5. Apply "killer" requirements
6. Determine short list
7. Identify overarching concerns
8. Apply overarching concerns

11 1. Identify Requirements
- Utilised a collaborative process to create an extensive (> 40) requirements list
- Sources included strategy documents, feature lists from commercial and OSS products, and team member experience
- Split into high, medium and low priority
- Identified levels of compliance with each requirement or "criterion"

12 2. Weigh the Requirements
- Next we gave a weighting to each requirement
- Again followed a highly collaborative process
- Required several iterations to reach consensus
- Split 1000 points over the 40 requirements
- Revised several weightings when unable to differentiate possible solutions
- Always done in a collaborative and transparent way (a sketch of the scoring arithmetic follows below)
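A minimal sketch of the arithmetic described on slides 11-12, assuming a simple weighted-sum model: 1000 points split across the requirements, each option scored by its level of compliance with each one. The requirement names, weights and compliance figures below are all invented for illustration.

```python
# Weighted-sum scoring sketch (all data invented): 1000 points are split
# across the requirements, and each candidate is scored by how fully it
# complies with each one.

requirements = {            # weight drawn from a 1000-point pool
    "SCORM support": 40,
    "Gradebook": 35,
    "Availability of local support": 50,
    # ... in practice, 40+ requirements summing to 1000
}

# Compliance level per requirement: 0 = none, 0.5 = partial, 1 = full.
compliance = {
    "Moodle": {"SCORM support": 1.0, "Gradebook": 1.0,
               "Availability of local support": 0.5},
    "Sakai":  {"SCORM support": 0.5, "Gradebook": 1.0,
               "Availability of local support": 1.0},
}

def weighted_score(option):
    return sum(weight * compliance[option].get(req, 0.0)
               for req, weight in requirements.items())

for option in compliance:
    print(option, weighted_score(option))
# Moodle 100.0
# Sakai 105.0
```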

13 3. Identify Possible Solutions
- Compiled a list of possible solutions
- Derived from a number of sources including team expertise, industry reports, peer institutions, etc.
- Resulted in a list of 12 options
- It was hoped this might be trimmed

14 4. Identify Killer Criteria
- Realised evaluating 12 products against 40 requirements would take a long time
- Decided some requirements were "show stoppers" and thus "killed" an option
- Collaboratively decided which of the requirements had a compliance level that was unacceptable

15 5. Apply "Killer" Criteria
- Applied the killer criteria to each of the possible solutions
- Once a killer had been reached, further analysis was stopped
- Not all "killers" were considered equal: some options needed more than one to be removed (see the sketch after this slide)
- Reduced the list of 12 options down to 5: Blackboard, Angel, in-house, Moodle and Sakai
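A sketch of the filtering logic this slide describes, under two assumptions drawn from the text: analysis of an option stops as soon as it is killed, and killers carry unequal weight, so some options are only removed by a combination of them. All criteria, severities and option names below are invented.

```python
# Killer-criteria pass (all data invented): "hard" killers remove an
# option on their own, "soft" ones only in combination, and evaluation
# of an option stops as soon as it is killed.

KILLERS = {
    "vendor merger": 2,        # severe enough to remove an option alone
    "no local support": 1,     # only kills in combination with another
    "tiny user base": 1,
}
THRESHOLD = 2

def survives(failed_criteria):
    severity = 0
    for criterion, weight in KILLERS.items():
        if criterion in failed_criteria:
            severity += weight
            if severity >= THRESHOLD:
                return False   # killer reached: stop further analysis
    return True

options = {
    "Option A": {"vendor merger"},                       # removed
    "Option B": {"no local support"},                    # survives
    "Option C": {"no local support", "tiny user base"},  # removed
}
short_list = [name for name, failed in options.items() if survives(failed)]
print(short_list)  # ['Option B']
```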

16 5. Apply "Killer" Criteria (cont.)
- Options were removed for a number of reasons:
  - Mergers
  - Insufficient local support
  - Small user base
- Interestingly, cost did not rule out any options at this point

17 6. Determine Short List
- From the list of 5 we then removed:
  - Blackboard: concerns over lack of competition, little leverage to control costs
  - In-house: the advantage of reinventing the wheel was questionable; OSS seemed to offer the same benefits without starting from scratch; lack of agility
  - Angel: user base too small, too high a risk; the detriments of commercial without the benefits of size
- Leaving us with Moodle and Sakai

18 7. Develop OACs
- After many months of effort the quantitative analysis gave near-identical scores: Sakai 2428, Moodle 2402
- If quantitative comparisons had come up empty, what about qualitative ones?
- Developed "overarching concerns" or OACs:
  - Completely qualitative
  - Focused on general ideology, not current features
  - Designed to ensure alignment between the culture of the solution and the University

19 7. Develop OACs (cont.)
- Ended up with 10 OACs covering a range of issues, for example:
  - Was the community's decision making centralised or decentralised?
  - Was the product enterprise oriented?
  - Was the product stronger in the secondary or tertiary sector?
  - Was the community more technically or more pedagogically focused?

20 8. Apply the OACs
- Once compiled, the OACs were applied to the short list of Moodle and Sakai
- Four major stakeholder groups were asked to decide on Sakai or Moodle for each OAC (a sketch of the tallying follows below)
- The end result looked like …
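A small sketch of how such an exercise tallies up: each stakeholder group picks Moodle or Sakai per overarching concern, and the choices are counted. The group names, OACs and votes below are hypothetical; the slides do not publish the actual vote breakdown.

```python
# Tallying OAC choices (all data invented): per-OAC picks by each
# stakeholder group are counted into an overall comparison.

from collections import Counter

oac_votes = {
    "decision-making style": {"Academic": "Sakai", "IT": "Sakai",
                              "Library": "Moodle", "Admin": "Sakai"},
    "enterprise orientation": {"Academic": "Sakai", "IT": "Sakai",
                               "Library": "Sakai", "Admin": "Sakai"},
    "tertiary-sector strength": {"Academic": "Moodle", "IT": "Sakai",
                                 "Library": "Moodle", "Admin": "Sakai"},
    # ... 10 OACs in total
}

tally = Counter(choice
                for votes in oac_votes.values()
                for choice in votes.values())
print(tally.most_common())  # [('Sakai', 9), ('Moodle', 3)]
```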

21 8. Apply the OACs (cont.)
[Results table shown as an image in the original slides; not captured in the transcript]

22
- 3 of the 4 team members agreed, but consensus could not be reached even after intense debate
- A final Steering Committee vote selected Sakai unanimously

23 Observations
- The deeper the analysis, the more possible solutions you can remove
- Shallow analysis using models such as BRR can be useful in the early stages
- Quantitative comparisons are less meaningful when you can change any aspect of the software; there are many grey areas

24 Observations (cont.)
- The introduction of qualitative measures is unavoidable and should be accepted throughout
- Qualitative comparison can only be accepted in an environment of transparent rigour
- Qualitative measures can only follow quantitative comparison; they lack conviction in isolation

25 Observations (cont.)
- Removing cost from the equation helped compare OSS and commercial options
- Assuming cost is near equal over a period of time removes bias and misconception (i.e. no "free lunch")
- Evaluations require consideration of local needs and politics; highly strategic decisions cannot be based on off-the-shelf comparisons

26 Observations (cont.)
- Requiring consensus was time consuming but gave strength to the results:
  - It forced rigorous debate
  - It ensured transparency throughout
- You cannot rush decisions this large; taking the time allowed a considered decision

27 Observations (cont.)
- The framework would not have worked outside the context of the project management methodology
- A framework needs to be contextualised within the organisational culture and strategies

28 Thank You!
Dr Philip Uys, Manager, Educational Design and Educational Technology, Centre for Enhancing Learning and Teaching
http://www.csu.edu.au/division/celt/exec_staff/philip.uys
Matt Morton-Allen, Teaching, Learning and Community Source Liaison Officer
For more information: http://www.csu.edu.au/division/landt/interact/

