
CRESST ONR/NETC Meetings, 17-18 July 2003, v1
18 July 2003
The Workforce Learning Community
Bill Bewley and Roxanne Sylvester
UCLA / CRESST
The ONR Workplace Education Project
© 2003 Regents of the University of California

The Workplace Education Project
– Funded by an ONR grant to UCLA/CRESST: "Research-Based Distance Learning to Support Navy Workplace Education"

Guidelines Framework
The goal
– Create a framework for applying guidelines to courseware evaluation by real users
The process
– Expert review
– Select guidelines and recast them as questions
– Test, revise, test: an iterative process
– Base technical quality judgments on instrument and rater reliability
– Report and deliver
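Rater reliability here means agreement between independent raters scoring the same courseware against the same questions. As a minimal illustrative sketch (not part of the project's actual tooling), Cohen's kappa corrects raw percent agreement for the agreement expected by chance:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items scored identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: chance overlap of each rater's category rates
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

# Two raters answering the same five yes/no guideline questions
kappa = cohens_kappa([1, 1, 0, 1, 0], [1, 1, 0, 0, 0])
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance, which is why kappa-style statistics are more informative than raw agreement for judging rater reliability.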

Guidelines Framework
Web-based instruction context
– Accuracy of course content, instructional design, operations, format, technical standards
Other guidelines
– FSU "Evaluating the Course": 27 categories, 119 questions
– ASTD "Courseware Certification": 19 categories, 78 questions
– USMC "Interactive Multimedia Instruction Style Guide": 12 categories, 50 pages

Guidelines Framework
The challenges
– Balancing technical quality with usability
– Providing enough user support to ensure rater reliability without overburdening raters with information

Guidelines Framework
KMT Guidelines: Value Added
– Based on rigorous, up-to-date research
– Examination of the technical quality of the instrument; it may be possible to evaluate courseware well with fewer questions
– Examination of rater reliability
– Examination of scoring training issues related to calibration of rater scores

Guidelines Framework
How technical quality is established
– Representative sample population and various course types; variability of raters and courses
– Qualitative data on rater interactions with the framework and course components
– Quantitative data on instrument reliability and rater reliability
– Development of rater scoring training materials
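The slides do not specify which statistic quantifies instrument reliability; a common choice, shown here purely as a hedged sketch, is Cronbach's alpha computed over question-level scores (raters as rows, questions as columns):

```python
from statistics import variance

def cronbach_alpha(scores):
    """Internal-consistency estimate for a multi-question instrument.

    scores: list of rows, one per rater, each a list of per-question ratings.
    """
    k = len(scores[0])  # number of questions in the instrument
    # Variance of each question's scores across raters
    item_vars = [variance(col) for col in zip(*scores)]
    # Variance of each rater's total score
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Three raters, two questions: perfectly consistent questions give alpha = 1
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

An alpha well below conventional thresholds (around 0.7-0.8) would suggest the questions are not measuring a coherent construct, which connects to the earlier point that the instrument might work as well with fewer questions.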

Guidelines Framework
Initial work on KMT guidelines
– Reviewed guidelines in relation to formative evaluation of course components: originally 58 guidelines, reduced to 52
– Selected guidelines supporting formative evaluation of course components: from 52 to 14 guidelines
– Turned guidelines into questions: from 14 guidelines to 45 questions
– Applied the guidelines questions to three courses, reviewed results, and revised the guideline selections and questions
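The recasting step maps each retained guideline to one or more rater-facing questions. The sketch below only illustrates the shape of that mapping; the guideline ID, text, and questions are hypothetical placeholders, since the actual KMT guidelines are not reproduced in these slides:

```python
from dataclasses import dataclass, field

@dataclass
class Guideline:
    gid: str                      # hypothetical guideline identifier
    text: str                     # the guideline as originally stated
    questions: list = field(default_factory=list)  # rater questions derived from it

# One guideline recast as two scorable questions (illustrative only)
g = Guideline(
    gid="G-07",
    text="Course content is accurate and current.",
    questions=[
        "Is the course content factually accurate? (1-5)",
        "Does the content reflect current practice? (1-5)",
    ],
)
```

Keeping the guideline-to-question links explicit makes it easy to trace a low-scoring question back to the guideline it operationalizes when revising the instrument.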

Guidelines Framework
Usability Study 1: Courseware Evaluation Questions
– Videotaped think-aloud
– Review of Systems Thinking lesson
– Use of paper-based Guidelines questions
– Participants: graduate students and temporary workers
– August-December 2003
– Revision of questions; addition of definitions and examples to enhance rater agreement

Guidelines Framework
Usability Study 2: Prototype E-Tool
– Think-aloud screen and audio capture
– Examine e-tool functionality
– Examine the interplay between the e-tool and courseware
– Begin to examine rater reliability
– Participants: graduate students, instructional designers, SMEs
– January to April 2004
– Determine all features required
– Use results to program the final e-tool

Guidelines Framework
Usability Study 3: Final E-Tool
– Think-aloud screen and audio capture
– Refine the e-tool
– Examine rater reliability
– Develop rater training procedures
– Participants: graduate students, instructional designers, SMEs
– June to September 2004
– Deliver October 2004

04/05 Tasks
Continue formative evaluation
– Evaluate additional WLC courses
– Continue research on validity of the framework and rater reliability
– Disseminate the framework for review
– Develop worked examples
– Develop rater scoring training materials
– Conduct rater and trainer training

For More Information
Bill Bewley –
Roxanne Sylvester –