Test and Evaluation
Kristen Barrera
711 HPW/RHA
Kristen.barrera.4@us.af.mil
MLS GOALS
- Identify the "best" methods for providing "distance" education in order to prepare international military students for a resident training experience in the United States
- Develop informational content from the IMSPDB course in multiple formats to support individual learning approaches
- Assess the interoperability between U.S. and foreign partners' learning management systems and their ability to meet Joint Security Cooperation Education and Training (JSCET) regulation training, information sharing, collaboration, and coalition interoperability requirements (US Government Team only)
MLS OBJECTIVES
- Develop a capability that ensures IMSPDB students have a positive and successful experience in the US along with their US counterparts
- Conduct research and collaborate with international partners to identify the "best" methods for providing "distance" education for international military students
- Evaluate the learning effectiveness of using multiple learning approaches (web-based e-learning, mobile applications, e-publications, etc.) to support Security Cooperation Education and Training Program (SCEPT) requirements
- Ensure that international partners have access to courses via DoD e-learning systems in accordance with Security Cooperation requirements (US Government Team only)
MLS DELIVERABLES
- A final report that evaluates the results of the MLS Concept Evaluation and makes recommendations on the "best approach" for meeting IMSPDB Indoctrination Training Requirements
- A Strategy Document, in consonance with the Security Cooperation Education and Training Program (SCEPT) Community, which focuses on the out-years (i.e., Phase III and Phase IV)
- A legally compliant and integrated tracking capability with partner Learning Management Systems (LMS) and JKO
- Mobile courses/content required to support the US and its partners that comply with Security Cooperation requirements (US Only)
MLS TESTING & EVALUATION OBJECTIVES
- Conduct an evaluation of a multichannel learning approach (e.g., web-based, mobile applications, etc.) to assess the effectiveness of providing multiple channels
- Make recommendations on how the lessons learned may be used to support DoD and other USG International Military Student Predeparture Training Requirements
ADAPT MoLE T&E PROCESS
- Adapt MoLE T&E process
- Adapt MoLE PoC Data Source Matrix
- Adapt MoLE PoC Data Analysis and Collection Plan
EXAMPLE COI / MOE / MOP STRUCTURE
- Critical Operational Issue:
- Measure of Effectiveness:
- Measure of Performance:
- Evaluation Criteria:
- Data Required:
- Other:
- Data Collection Methods:
- Data Analysis Methods:
- Final Data Product:
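For illustration, the COI/MOE/MOP hierarchy above can be captured as a small data structure in which each measure carries its criteria, required data, and planned analysis. This is a minimal Python sketch; the field names and the example entry are assumptions for illustration, not taken from the actual MLS evaluation plan.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MeasureOfPerformance:
    """One MOP: what is measured, how, and what success looks like."""
    statement: str
    evaluation_criteria: str                 # threshold that defines success
    data_required: str                       # what must be collected
    data_collection_methods: List[str] = field(default_factory=list)
    data_analysis_methods: List[str] = field(default_factory=list)
    final_data_product: str = ""             # e.g., a table or chart in the final report

@dataclass
class MeasureOfEffectiveness:
    statement: str
    mops: List[MeasureOfPerformance] = field(default_factory=list)

@dataclass
class CriticalOperationalIssue:
    statement: str
    moes: List[MeasureOfEffectiveness] = field(default_factory=list)

# Hypothetical example entry (not from the MLS evaluation plan):
coi = CriticalOperationalIssue(
    statement="Does multichannel delivery prepare IMSPDB students for resident training?",
    moes=[MeasureOfEffectiveness(
        statement="Learners rate the mobile content as useful for pre-departure preparation.",
        mops=[MeasureOfPerformance(
            statement="Mean Likert rating of the usefulness questions",
            evaluation_criteria="Mean rating of 4 or higher on a 5-point scale",
            data_required="Post-use questionnaire responses",
            data_collection_methods=["Web survey"],
            data_analysis_methods=["Descriptive statistics"],
            final_data_product="Summary table in the final report")])])
```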
PRE-CONCEPT EVALUATION ACTIVITIES
- Acknowledge acceptance of the MLS Concept Evaluation
- Informed Consent
- Demographic questionnaire from MoLE (examples):
  - Age: less than 20, 20-29, 30-39, 40-49, and 50+
  - Gender
  - How proficient are you in English? (Likert scale)
  - Have you been given a device for the purpose of this trial?
  - How comfortable are you with using the mobile device that is running the MLS app? (Likert scale)
  - What is your professional expertise? (i.e., Medical, E-learning, Other)
  - Have you viewed the Culture portion of the IMSPDB within the last two years?
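As a sketch, the demographic items above could be encoded as structured questionnaire records and collected against each participant's research ID. The field names, the 1-5 scale labels, and the example response below are assumptions for illustration, not the actual MoLE/MLS instrument.

```python
# Illustrative encoding of the demographic questionnaire; the real MoLE/MLS
# instrument defines the exact wording and scales, so treat this as a sketch.
LIKERT_1_TO_5 = {1: "Not at all", 2: "Slightly", 3: "Moderately", 4: "Very", 5: "Extremely"}

DEMOGRAPHIC_QUESTIONNAIRE = [
    {"id": "age_group",       "text": "Age",
     "choices": ["<20", "20-29", "30-39", "40-49", "50+"]},
    {"id": "gender",          "text": "Gender"},
    {"id": "english_prof",    "text": "How proficient are you in English?",
     "scale": LIKERT_1_TO_5},
    {"id": "device_provided", "text": "Have you been given a device for the purpose of this trial?",
     "choices": ["Yes", "No"]},
    {"id": "device_comfort",  "text": "How comfortable are you with using the mobile device "
                                      "that is running the MLS app?",
     "scale": LIKERT_1_TO_5},
    {"id": "expertise",       "text": "What is your professional expertise?",
     "choices": ["Medical", "E-learning", "Other"]},
    {"id": "culture_viewed",  "text": "Have you viewed the Culture portion of the IMSPDB "
                                      "within the last two years?",
     "choices": ["Yes", "No"]},
]

# Answers are recorded against the participant's research ID; any item may be skipped.
example_response = {"rid": "R001", "age_group": "20-29", "english_prof": 4, "device_provided": "Yes"}
```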
EVALUATION CRITERIA
- Utility: What is the effectiveness or practicality of using mobile technologies to provide training?
- Usefulness: What is the benefit or availability of using mobile devices in providing training?
- Self-efficacy: Do persons who have used the training provided on the mobile device believe they are capable of performing a desired outcome?
- Accessibility: What is the degree to which a mobile training application is available in austere environments to as many participants as possible?
EXAMPLE: MoLE FINAL QUESTIONS
- How useful are mobile devices for training?
- How useful are mobile devices for refresher training?
- How useful are mobile videos to enhance understanding?
- How useful are digital reference materials on mobile devices?
- How useful are collaborative contact pages to access real-time information?
- How easy was the Global MedAid application to navigate?
- Please write up to five single words which best describe your overall experience of using the mobile device as a tool for learning
- What did you like the most about the mobile device as a tool for learning?
- What did you like the least about the mobile device as a tool for learning?
- Insert any additional comments.
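Likert-scale responses to questions like these are typically summarized per question. The following minimal sketch shows one such roll-up, assuming 1-5 ratings; the question IDs and sample data are hypothetical.

```python
from collections import Counter
from statistics import mean, median

# Hypothetical 1-5 Likert ratings keyed by question ID; unanswered items are
# simply absent because participants may skip any question.
responses = {
    "devices_useful_for_training":  [5, 4, 4, 3, 5],
    "devices_useful_for_refresher": [4, 4, 5],
    "videos_enhance_understanding": [5, 5, 4, 4],
}

for question, ratings in responses.items():
    distribution = dict(sorted(Counter(ratings).items()))
    print(f"{question}: n={len(ratings)}, mean={mean(ratings):.2f}, "
          f"median={median(ratings)}, distribution={distribution}")
```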
TEST LIMITATIONS AND UNCONTROLLED VARIABLES
- Network connectivity. Although MLS is designed to help mitigate the long-standing challenge of delivering low-bandwidth, on-demand training by using mobile devices where there is limited internet connectivity and limited infrastructure, network connectivity will not be a controlled variable.
- Mobile devices. There is a strong possibility that not all participants will have their own mobile devices; therefore, the testing period may run up to the last day, since some may have to borrow mobile devices. To accommodate this issue, each person will be assigned a unique research ID.
- Responses to questions. In accordance with the Human Research Protection Program, no participant can be compelled to answer any question they do not wish to; therefore, there may be questions to which participants do not respond.
DATA ANALYSIS
- Assigned Research Identification (RID)
- Data collected through the warehouse:
  - Page visits
  - Time spent on each page
  - Percentage of modules accessed
  - Completion of modules
- Demographics associated with different metrics
- Frequency of access by:
  - Device type
  - Connectivity type
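A minimal sketch of how warehouse records might be rolled up into the metrics listed above; the record layout (RID, page, seconds, device, connectivity), the sample events, and the module count are assumptions for illustration, not the study's actual data schema.

```python
from collections import defaultdict

# Hypothetical page-view records pulled from the data warehouse (one per visit).
events = [
    {"rid": "R001", "page": "culture/intro", "seconds": 95,  "device": "smartphone", "connectivity": "WiFi"},
    {"rid": "R001", "page": "culture/food",  "seconds": 210, "device": "smartphone", "connectivity": "cellular"},
    {"rid": "R002", "page": "culture/intro", "seconds": 40,  "device": "tablet",     "connectivity": "WiFi"},
]
TOTAL_MODULES = 12  # assumed module count for the percentage-accessed metric

page_visits = defaultdict(int)       # page -> number of visits
time_on_page = defaultdict(int)      # page -> total seconds viewed
modules_seen = defaultdict(set)      # RID -> modules/pages accessed
access_by_device = defaultdict(int)  # device type -> access count
access_by_conn = defaultdict(int)    # connectivity type -> access count

for e in events:
    page_visits[e["page"]] += 1
    time_on_page[e["page"]] += e["seconds"]
    modules_seen[e["rid"]].add(e["page"])
    access_by_device[e["device"]] += 1
    access_by_conn[e["connectivity"]] += 1

for rid, pages in modules_seen.items():
    print(f"{rid}: accessed {100 * len(pages) / TOTAL_MODULES:.0f}% of modules")
print("Page visits:", dict(page_visits))
print("Time on page (s):", dict(time_on_page))
print("Access frequency by device type:", dict(access_by_device))
print("Access frequency by connectivity type:", dict(access_by_conn))
```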
MISCELLANEOUS QUESTIONS
- Individual country desired outcomes from the study
- Individual country research objectives
- Individual country research questions
- Country-specific demographics (what do they want/need?)
- List of common goals across the effort
- What technology are they developing?
- In what language(s) is the system available?
- Measurement of items (based on what you want to know)
- What technology types are the users familiar with (e-books, audio, video)?
MISCELLANEOUS QUESTIONS
- What types of technologies do users prefer/use?
- How tech-savvy are the participants?
- What type of learners are they? What category do they fall in (digital native, etc.)?
- How do the users classify themselves (this could differ from the pre-defined groupings)?
- Standard demographics: age, gender, level of education, profession/occupation, country, self-classification
- Device type (tablet, laptop, desktop, smartphone, other mobile device)
- Types of connectivity used (WiFi, cellular, etc.)
MISCELLANEOUS QUESTIONS
- Access to tool (device type, internet, WiFi)
- Usability survey
- Utility of system
- What are the commonalities of the different systems?
- How do you track completion or flow through the system?
How should content be delivered?
- Population aspects:
  - College educated (20-30 yrs.)
  - Military/MoD background
  - English language
- When the target audience is provided with a learning channel option, which preference will they select for distance learning?