
1 Moving On-Line Assessments Forward Using an Open Source Technology Platform
Diana Cano, ETS; Tom Murdock, Moodlerooms; Heidi Larson, EDC

2 Findings from a study conducted by Education Development Center & Grunwald Associates LLC, sponsored by ETS

3 Purpose of Study
To find out: experiences with, and attitudes toward, Internet-based testing and the potential for an open source delivery platform.
To help: further the conversation on how technology can enhance and support education goals.

4 Study Participants
–Interviews with state assessment and/or ed tech leaders from 27 states
–38 interviews with national education opinion leaders representing both public and private organizations

5 “Open source delivery platform” means, for this talk: a technology platform for the delivery of online assessments, accessible by signing a free licensing agreement, with public collaboration on software development and improvement. Vendor support may be available for purchase.
It does not include:
–Open content or curriculum
–Teacher-developed tests
–Computers or other devices
–Freeware

6 Defining Experience Distinctions: A, B, C States
–A States (5 states; 5 Assessment leaders, 2 Technology leaders): statewide online accountability testing
–B States (11 states; 10 Assessment, 10 Technology): piloting and experimenting with online accountability testing
–C States (11 states; 9 Assessment, 7 Technology): paper and pencil accountability testing
Open source platform accountability testing sits at the far end of this experience continuum; no group of states is there yet.

7 Overall, a positive reaction
Both State and Opinion Leaders are interested in learning more about an internet-based, open source assessment delivery platform.
–Generally more favorable responses came from states with greater experience with online assessment (A & B States) and from Technology and Opinion leaders.
However, moving to an open source platform would be a large step:
–Experience and knowledge of open source is limited overall, and in C states in particular.
–Many states have yet to adopt any significant online testing, for various reasons.

8 Perceived benefits of an open source delivery platform
–Flexibility to customize to state requirements
–Adaptability of the platform (mostly Opinion and State Technology Directors)
–Community: ability to share, build upon, and transfer innovations across states
“It’s an effective use of resources in the sense that if there is an open source large scale assessment platform, then states could pool their resources together similarly as districts have to improve software around Moodle and some of the other open source technologies...” [Opinion Leader]

9 Benefits, and potential benefits, of an Internet-based platform
Nearly all of the key benefits mentioned were related to IBT:
–Faster score reporting: for constructed-response items in particular
–Logistics relief: no more tractor-trailer trucks unloading boxes of testing materials
–Fewer errors, greater integration: demographic data can be pulled in from a district database
–Potential for interactive and multimedia items: to gain a clearer picture of what students know and can do, and where they are getting off track
–Accommodating students with special needs: adaptability of the platform would enable more accessible assessment for all students
–Students like the computer tests: no report of students wanting to return to paper and pencil

10 Other “potential benefits” of an open source platform not embraced
–Cost savings: respondents liked the potential of cost savings, but anticipated a greater burden on staff, infrastructure, and other resources.
–Vendor independence: often vendors were seen as a partner, or at least as necessary and reliable support.

11 Primary concerns about an open source delivery platform include:
–Security: respondents are not comfortable with perceptions of open source and Internet safety, at least not for high-stakes tests
–Infrastructure: availability of computers and servers, Internet reliability, bandwidth capacity
–Expertise: lack of sufficient expertise at the state, district, and school level to support development and delivery
–Going solo: collaborative relationships are generally seen as desirable, and by some as necessary, for such an initiative to be successful

12 Potential for Collaboration? Participants Unsure
Many respondents, in particular those familiar with open source programs and with consortia, could see opportunities for collaboration:
–Sharing the development and maintenance burden
–Sharing quick-turnaround improvements from one state to the next
Others were wary:
–No experience, or negative experience, with multi-state collaboration
–Concern about having to compromise standards
Leadership and/or structured governance was considered key by many respondents.

13 Feasibility of an Open Source, Internet-Based Delivery Platform: A, B, C States
–A States: perceive concrete benefits of internet-based testing; some respondents perceive open source solutions as a strong contribution to Common Core standards.
–B States: many are interested in the potential benefits of an open source platform for high-stakes testing, but would want to learn more or see results in other states before adopting one in their own state. Some saw ways that open source could be used in developing and sharing test items, particularly in relation to the development of Common Core Standards across states.
–C States: using open source would require major infrastructure improvement; many C state respondents do not see their state in a position to make this move.

14 Desired features in an open source platform
–Ability to handle more complex item types to take full advantage of the technology
–User-friendly, intuitive interface with easy end-user support tools
–Item development standards to allow greater sharing across formative tests, states, and vendors
–Enhanced data aggregation and analysis tools
–Flexible reporting options

15 Strategies for Success
From the public report: state officials experienced with internet-based assessments offered the following strategies to ensure successful implementations:
1. Start small: stagger implementation of online assessments
2. Start simple: begin with multiple-choice test items before venturing on to more complex items
3. Stretch the testing window to allow all students access to the test
4. Collaborate: the states that reported the most success with their online initiatives also reported strong working relationships among both assessment and technology experts
5. Safeguard: include redundancies, checks, and safeguards to prevent data loss or corruption

16 To read more about this study: Grunwald_Open_Source_Public_Report_v3.pdf
Or contact: Heidi Larson and Bob Spielvogel of Education Development Center, or Peter Grunwald of Grunwald Associates

17 MORE ANALYSIS OF STATE RESPONSES: OPEN SOURCE, INTERNET-BASED ASSESSMENT PLATFORM STUDY (December 2009)

18 States currently using CBT share their implementation experience
A States (CBT-experienced respondents), experience with online testing:
–All have experience with statewide online testing for accountability assessment, and all are expanding it.
–Because of their online status, A state respondents noted experience resolving issues in their roll-out of online testing, a deeper experience and knowledge base than B and C states.
–Some respondents mentioned performing comparability studies on online and paper and pencil tests.
–All A states reported satisfaction with their move to online testing.
B States (respondents with limited CBT experience), experience with online testing:
–Many noted online pilot programs, some statewide and some at the district level, outside the central office of assessment or technology.
–Respondents indicated they have started to resolve obstacles in moving toward online testing, and in some cases have made significant progress; nonetheless, B state online testing is not as successful or widespread as in A states.
–Respondents noted a few strong instances of major innovations that could promote testing platforms beyond paper and pencil.
–Many had enough experience with online testing to conclude that it may allow a greater variety of test item types and can significantly reduce the time it takes for students and teachers to receive test results.
–B state respondents, more than A and C state respondents, noted multiple uses for online testing other than high-stakes testing.
C States (CBT-inexperienced respondents), experience with online testing:
–Many are in the process of informing practitioners and policy makers of the possibility of implementing and benefiting from online testing.
–All are in early stages of development; most have no online testing.
–Some note an increase in participation in pilot programs.

19 A, B, C State Comparisons: Summary
Potential benefits of open source:
–A state respondents, perhaps because of advanced experience with online testing, showed greater interest in specific areas where an open source platform could be beneficial.
–B state respondents, like C state respondents, expressed a great deal of interest in potential cost savings, but also skepticism about whether an open source platform would actually reduce costs. A few suggested that open source platform testing may create opportunities for integrating personal technologies and/or integrating high-stakes testing with other types of assessment.
–C state respondents, perhaps because of a lack of experience with online testing, were less specific about areas where an open source platform could be beneficial; where they were specific, it was typically about internet-based testing in general. Most were enthused by the idea of cost benefits, but some were not convinced open source would save money.
Decision making:
–A state respondents were able to discuss in greater detail the status of their schools and districts with regard to online testing and other progress in accountability testing. Some noted that decisions to move online (or to advance testing technology in general) were motivated or mandated by legislation. A few indicated internal capacity for developing new testing components. Many noted stronger collaboration between Ed Tech and Assessment leaders than B and C state respondents did, as a result of new testing; some described the transition as challenging.
–About half of B state respondents felt that their state decision-makers would be more likely to consider adopting an open source platform if their vendor, or multiple vendors, were involved in developing the technology. Respondents tended to be specific about the additional information they would need before deciding to consider or implement an open source platform.
–C state respondents indicated an unclear decision-making structure for advancing testing platforms; for many, the issue of online or open source testing and its implementation has not been discussed to the extent it has in most A and B states. Many were unable to discuss in detail the status of their schools and districts with regard to online testing or other progress in accountability testing. As with A state respondents, some believed that progress depended, at least in part, on legislation, usually for securing funds.

20 How open-source LMS became adopted by classroom teachers
–Easier room logistics
–Easier time logistics
–Rich reporting on student activity
–Rich reporting on instructor activity
–Collaborative events
–“Things you can’t do in a brick-and-mortar classroom”

21 How open-source LMS became adopted by institutions
–Use of a common platform
–Functionality built across the platform
–Interoperability between upstream and downstream systems
–Collecting and acting upon longitudinal data
–Understanding transitions
–The transparency of open-source systems
–Adoption levels
–Open-source and closed-source hybrids

22 Use of a common platform
–Higher ed market economics
–Legacy system attrition
–Growth in for-profit online schools
–Open-source adoption and choice
–Pluggable architecture with vendors (a minimal sketch of this idea follows below)
–Market seismic shift
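The “pluggable architecture with vendors” bullet above, echoed on slide 24 (“more than 620 non-core modules”), is easier to picture in code. The sketch below is a hypothetical, minimal illustration in Python, not Moodle’s actual plugin API (Moodle itself is written in PHP): a small platform core discovers vendor- or community-provided modules behind one shared interface, so functionality grows without changes to the core.

```python
from typing import Dict, Protocol


class AssessmentModule(Protocol):
    """Shared interface that every plug-in module implements."""
    name: str

    def render_item(self, item: dict) -> str: ...
    def score_response(self, item: dict, response: str) -> float: ...


class PlatformCore:
    """A small common core; functionality is added by registering modules."""

    def __init__(self) -> None:
        self._modules: Dict[str, AssessmentModule] = {}

    def register(self, module: AssessmentModule) -> None:
        # Vendors or the community contribute modules without touching core code.
        self._modules[module.name] = module

    def deliver(self, item: dict, response: str) -> float:
        module = self._modules[item["type"]]          # pick the plug-in for this item type
        print(module.render_item(item))               # e.g. show the item to the test taker
        return module.score_response(item, response)  # the plug-in owns its scoring rules


class MultipleChoiceModule:
    """One example plug-in: classic selected-response items."""
    name = "multiple_choice"

    def render_item(self, item: dict) -> str:
        options = "\n".join(f"  {key}) {text}" for key, text in item["options"].items())
        return f"{item['stem']}\n{options}"

    def score_response(self, item: dict, response: str) -> float:
        return 1.0 if response == item["key"] else 0.0


if __name__ == "__main__":
    core = PlatformCore()
    core.register(MultipleChoiceModule())
    mc_item = {"type": "multiple_choice", "stem": "2 + 2 = ?",
               "options": {"A": "3", "B": "4"}, "key": "B"}
    print("score:", core.deliver(mc_item, "B"))
```

An interactive-item module or an accommodations module could be registered the same way, by a state team or by a vendor, which is the property these slides are pointing at.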

23 Functionality built across the platform
Provide a platform that delivers:
–Flexible deployment
–Comprehensive features
–Configurable implementations
–Industry-leading reliability
–Security through transparency
–High-quality, community-based development

24 Interoperability between upstream and downstream systems
–Open standards: IMS LTI, SCORM
–Pluggable architecture: more than 620 non-core modules
–Open API: web services (a minimal example follows below)
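To make the “open API / web services” bullet concrete: Moodle exposes its web-service functions over a simple REST endpoint, which is one way upstream systems (e.g. a student information system) and downstream systems (e.g. a reporting warehouse) can exchange data with the LMS. A minimal sketch in Python, assuming a Moodle site with web services enabled; the site URL and token are placeholders.

```python
import requests

MOODLE_URL = "https://moodle.example.edu"   # placeholder site
TOKEN = "REPLACE_WITH_WS_TOKEN"             # web-service token issued by the Moodle admin


def call_moodle(function: str, **params) -> object:
    """Call a Moodle web-service function over the REST protocol and return parsed JSON."""
    payload = {
        "wstoken": TOKEN,
        "wsfunction": function,
        "moodlewsrestformat": "json",
        **params,
    }
    response = requests.post(f"{MOODLE_URL}/webservice/rest/server.php", data=payload)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # List the courses visible to the token's user, e.g. to sync them into
    # a district data warehouse for the longitudinal reporting on slide 25.
    for course in call_moodle("core_course_get_courses"):
        print(course["id"], course["fullname"])
```

The same functions are also exposed over other protocols (SOAP, for example), and IMS LTI covers the complementary case of launching an external tool from inside the LMS.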

25 Collecting and acting upon longitudinal data across the academic lifecycle
–Secondary to post-secondary
–2- and 4-year higher ed
–Post-secondary to workforce

26 Understanding transitions by tracking standards
–A growing number of states have policies to ensure high school students graduate college- and workforce-ready.
–Many states are aligning college- and work-ready high school standards. (Achieve.org survey/research, 2007)

27 The transparency of open source systems
–Core code is available to anyone
–No vendor lock-in
–Schools do not lose IP
–Multiple vendors can write to open standards and open code

28 Adoption Levels (chart: K.C. Green, Campus Computing, 2009)

29 Open-source and closed-source system hybrids
Increase capabilities:
–Focus internal resources on value-added, strategic activities
–Transition to SLA-driven 24x7x365 end-to-end support models
–Leverage cloud modularity to enable rapid introduction of new SaaS applications
Mitigate risks:
–Eliminate technology and vendor lock-in
–Ensure solution scalability from pilot projects through enterprise deployments
–Enable cost-effective disaster recovery and business continuity solutions
–SaaS vendors can provide a wall that protects proprietary elements of a solution

30 Watching for the adoption of open source for assessment
Open source is now mainstream: “Adoption of open-source software (OSS) is becoming pervasive, with 85% of companies surveyed currently using OSS in their enterprises and the remaining 15% expecting to in the next 12 months.” (Gartner, Inc., “User Survey Analysis: Open-Source Software, Worldwide, 2008”)

31 Using Open Source Software for Innovative Assessments (June 2010)

32 What are “innovative” assessments?
–Exercise types that allow us to expand measurement beyond constructs possible with largely multiple-choice testing: these include open-ended and performance testing, and using technology to expand constructs (to test things we wished we could test on paper, and to test skills and knowledge that do not exist absent the technology itself)
–Use of these items in high-stakes assessments

33 Delivering by computer has operational and measurement advantages
Item-level computer adaptive testing:
–Reduces the testing time required of examinees
–Greatly increases the number of items that must be authored, pretested, scaled, etc.
–Unique aspects of the automated item selection algorithm can be exploited (see the sketch below)
Item types are becoming more complex and technology-rich:
–Test development and software development are becoming more closely tied
–Opportunity for IT to directly impact assessment
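The “automated item selection algorithm” mentioned above can be illustrated with a small sketch. The Python below is an illustrative toy, not ETS’s operational algorithm: an item-level adaptive loop under a 2-parameter logistic IRT model that, after each response, re-estimates the examinee’s ability and administers the unused item with maximum Fisher information at that estimate.

```python
import math
import random

# A toy item bank: (discrimination a, difficulty b) pairs for a 2PL IRT model.
ITEM_BANK = [(random.uniform(0.8, 2.0), random.uniform(-2.5, 2.5)) for _ in range(100)]


def p_correct(theta: float, a: float, b: float) -> float:
    """2PL probability that an examinee at ability theta answers the item correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))


def information(theta: float, a: float, b: float) -> float:
    """Fisher information contributed by one item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)


def estimate_theta(responses: list) -> float:
    """Crude maximum-likelihood ability estimate by grid search over [-4, 4]."""
    grid = [g / 10.0 for g in range(-40, 41)]

    def log_likelihood(theta: float) -> float:
        return sum(
            math.log(p_correct(theta, a, b)) if correct else math.log(1.0 - p_correct(theta, a, b))
            for correct, a, b in responses
        )

    return max(grid, key=log_likelihood)


def run_cat(true_theta: float, test_length: int = 20) -> float:
    """Administer a fixed-length adaptive test against a simulated examinee."""
    theta, used, responses = 0.0, set(), []
    for _ in range(test_length):
        # Select the not-yet-used item that is most informative at the current estimate.
        idx = max(
            (i for i in range(len(ITEM_BANK)) if i not in used),
            key=lambda i: information(theta, *ITEM_BANK[i]),
        )
        used.add(idx)
        a, b = ITEM_BANK[idx]
        correct = random.random() < p_correct(true_theta, a, b)  # simulate the response
        responses.append((correct, a, b))
        theta = estimate_theta(responses)  # update the ability estimate after each item
    return theta


if __name__ == "__main__":
    print("estimated ability:", run_cat(true_theta=1.2))
```

The sketch also makes the slide’s trade-off visible: selecting for maximum information spends the best items quickly, which is one reason adaptive delivery greatly increases the number of items that must be authored, pretested, and scaled.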

34 Challenge #1: Cost
–Some of these sorts of items can be expensive and time-consuming to develop; this is particularly a problem for exercises for which we lack a set of “operational norms and practices” (e.g. simulations)
–Many of these exercises require human scoring, which adds cost: unlike development or analysis costs, scoring costs go up on a per-student basis, and they also add schedule time. There are electronic scoring options, although these carry limitations as well
–Computer-based testing can add hidden costs (machines, development of larger item pools) even if tests use traditional items; school computer labs need to keep pace and offer access for test takers

35 Challenge #2: Test development know-how
–For some item types, little “operational knowledge” and few templates for development exist in the industry
–We understand how to produce large numbers of multiple-choice items with known performance characteristics; this knowledge also exists for some types of constructed-response items, although the cost of a mistake in development is far higher
–We need improved cognitive models, since we cannot rely on the brute force of multiple-choice tests

36 Challenge #3: Technology
–Some types of items need computers and possibly longer testing windows
–We need to ensure that interfaces and activities are not so complex that it takes weeks for someone to learn to take the test
–Advanced item types may yield different group patterns, which may be an equity issue depending on construct definition
–We must make sure we remember it is an assessment

37 Working with open source software for delivering assessments
ETS’ Cognitively-Based Assessments of, for, and as Learning (CBAL) research initiative will use Moodle this year:
–handling the tasks and activities associated with that research project
–integrating a proprietary testing engine for the assessment delivery
OECD’s Programme for the International Assessment of Adult Competencies (PIAAC) uses Testing Assisté par Ordinateur (TAO):
–interviewing adults aged 16 to 65 in their homes, 5,000 in each participating country
–assessing their literacy and numeracy skills and their ability to solve problems in technology-rich environments

38 For more information on this open source study, contact: Heidi Larson, Bob Spielvogel, Peter Grunwald, Tom Murdock, Diana Cano

