Andrianna Jobin, K-6 Software Evaluation: Evaluating software for young learners

Agenda
1. Overview of selection process
2. Evaluation criteria
3. Justification of criteria
4. Critical analysis
5. References

Evaluation Cycle
Cycle diagram: Needs Analysis, Site Survey, Develop Evaluation Criteria, Critical Evaluation, Formative Evaluation, Summative Evaluation. A "We are here" marker indicates the current stage, Develop Evaluation Criteria / Critical Evaluation.

Current Evaluation Process
Phase 1: Site Survey & Basic Criteria
Phase 2: Detailed Evaluation Criteria & Ratings
Phase 3: Critical Analysis

Basic Criteria
AGE: Could it be used by K-6 aged students?
CLASSROOM: Could it be used in a classroom setting?
EASE OF USE: Could it be used by students on their own after the first time?
CURRICULAR VALUE: Does it teach or reinforce one of the standards or something in the curriculum?

Site Survey
Candidate sites (URLs truncated in the transcript):
metry _ht_a/iongoal/index.htm
ndexpages/elementgames.html
processing.com/gamegoo/gooey.html
According to our basic criteria, we selected this one.

Categories for Detailed Evaluation
- Structure
- Goals
- Pedagogical approaches & Learning styles
- Feedback & Interaction
- Motivational elements
- Ease of Use
- Personalization
- Relevance
- Curriculum and Content matter
- Visual Design & Technical Consideration (??)
- Interface Design (do we have/need this??)

Rating system
5 = Very strong in this area
4 = Good in this area
3 = Not especially good or weak in this area
2 = Weak in this area
1 = Very weak or totally lacking in this area
The rating scale is a classic 5-level rating scale, including a middle/neutral rating, so there is no forced positive or negative decision.

Overall ratings per category (star ratings not reproduced in this transcript)
I. Structure
II. Goals
III. Pedagogy & Behaviorism
IV. Feedback & Interaction
V. Motivational elements
VI. Ease of Use
VII. Personalization
VIII. Relevance
IX. Curriculum and Content matter
X. Visual Design
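The slides do not state how the detailed item ratings in each category roll up into the single star rating shown above. A minimal sketch of one plausible approach, assuming each item is scored on the 1-5 scale and the category rating is simply the rounded average (the function and the sample scores below are illustrative, not from the presentation):

    # Hypothetical roll-up of detailed item ratings (1-5) into a category rating.
    # The averaging rule is an assumption; the presentation does not specify one.
    def category_rating(item_ratings):
        """Return a 1-5 category rating as the rounded mean of the item ratings."""
        if not item_ratings:
            raise ValueError("a category needs at least one rated item")
        return round(sum(item_ratings) / len(item_ratings))

    # Example: ten STRUCTURE items scored on the 5-level scale.
    structure_items = [5, 4, 4, 5, 3, 4, 2, 5, 5, 4]
    print(category_rating(structure_items))        # 4, i.e. "Good in this area"
    print("*" * category_rating(structure_items))  # **** (star display)

A weighted average, or taking the lowest item as the category score, would work just as well; the point is only that the detailed checklist feeds the category-level stars on the summary slide.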

Detailed ratings in each category
I. STRUCTURE
1. Does the program have correct answers?
2. Does the program have easy-to-follow solutions?
3. Are the rules easy for the learner to follow?
4. In addition, does the program use a limited number of rules?
5. Is the program organized so the learner can anticipate what is going to happen?
6. Is the program organized prescriptively?
7. Does the learner need to come up with alternative solutions?
8. Does the program have clear boundaries?
9. Is the problem clearly stated?
10. Are the objectives clear to the learner?

Detailed ratings in each category
II. GOALS
1. Are the goals for the program well defined?
2. Have the goals of the program been attained by the user?
3. Does the learner understand the goals the program wants them to reach?
4. Does the program give the learner support if he or she needs help?
5. Is the learner acquiring new skills while using the program?
6. Is the learner being challenged to think critically about the goals?
7. How does the learner demonstrate understanding of the goals?
8. What resources are needed to help the learner obtain the goals?

Detailed ratings in each category
III. PEDAGOGY & BEHAVIORISM
1. Is the program directing the learner's learning process?
2. What desired behaviors is the learner expected to demonstrate?
3. Does the feedback reinforce the behavior the program intends the learner to perform?
4. Can the learner's desired behaviors be observed?
5. What interventions does the program have in place?
6. Does the program offer suggestions on how the learner can improve?
7. In what ways does the program offer the learner time to practice the desired result?
8. Do the program's objectives fit the desired behaviors of the learner?
9. Is there any instruction that leads the learner to the desired outcome?
10. Is feedback given throughout the program or only at the end?

Detailed ratings in each category
IV. FEEDBACK & INTERACTION
1. Do users easily know if they have made a mistake? Is the signal for an error (wrong answer) clear to users?
2. Are correct answers reinforced by positive feedback?
3. Based upon observation, do the children appear to enjoy the positive feedback? Does it build confidence and a feeling of success?
4. Does the feedback reinforce content?
5. Does the feedback employ meaningful graphic and sound capabilities?
6. Is the correct response provided? Does the program efficiently explain why the user's answer was incorrect?
7. Does the program recommend remediation to users? Does the feedback adjust according to the child's input?
8. Do students have a chance to correct errors?
9. Is the program forgiving of input errors such as format, capitalization, etc.?
10. When students successfully complete a challenging activity, is it followed by a "fun" activity?

Detailed ratings in each category
V. MOTIVATIONAL ELEMENTS
1. Is this program enjoyable to use?
2. Based upon observation, are the graphics appealing to the children?
3. Is the theme of the program meaningful and attractive to users?
4. Do the children return to this program time after time?
5. Can users select their own level?
6. Does it encourage users to obtain the correct answer?
7. Is it responsive to a user's actions?
8. Do the program elements match users' direct experiences?
9. Does the program provide opportunities to explore and arouse curiosity?
10. Does the duration of each activity match student attention spans?

Detailed ratings in each category
VI. EASE OF USE
1. Based upon observation, can children use the program independently after the initial use?
2. Are the skills needed to operate the program within the range of the child's ability level?
3. Are key menus easy to find? Is getting to the first menu quick and easy?
4. Is reading ability a prerequisite for using the program?
5. Is it easy to print?
6. Is it easy to enter or exit any activity at any point?
7. Are the written materials helpful for doing the activities?
8. Are users given enough opportunities to review instructions on the screen, if necessary?
9. Are icons or menu bars large and easy to select with a moving cursor?
10. Do learners feel at home with the program interface? Does it have an intuitive metaphor so the learner knows how to use the interface?

Detailed ratings in each category
VII. PERSONALIZATION
Responsiveness to user preferences
1. Does the program adjust the difficulty of tasks or information according to the children's responses, giving more or less complicated tasks as appropriate?
2. Can the interface of the program be customized by user preferences?
Learner control
3. Do the children feel like they have control and interesting choices?
4. Does the program allow the children to make choices, and does it adjust subsequent choices accordingly?
5. Does the program allow students an active role in developing personal knowledge? Does it help students explore ideas and develop their own personal knowledge?
6. To what extent are learners guided in creating any content of their own?
User tracking
7. Does the program track and record student progress?
8. While using the program, can the children see which activities they have already completed and which ones are still to be done?
9. Does the program provide a periodic indication of how well the child is meeting the goals?
10. When exiting the program, does it automatically save student progress? When returning to the program, can children carry on where they left off? Are they shown an overview of what has been completed and what has not?

Detailed ratings in each category
VIII. RELEVANCE
Authenticity
1. Does the program provide authentic situations and rich contexts in which to explore the subject matter?
2. Based on observation, do children think the program feels "real" and interesting? How well can students relate to it?
Practicality
3. Can children use what they learn in real-life contexts?
4. Does the program show students how the learning is useful?
5. Does the program encourage children to imagine themselves in a context where they can use the information they are learning?
6. Does the program involve real-life situations or problems which children this age would encounter?
7. Does the program help students apply their learning to their own lives?

Detailed ratings in each category
IX. CURRICULUM AND CONTENT
Instructions
1. Are clear instructions available?
2. How easily can instructions be bypassed?
New terms, concepts, and vocabulary
3. Are new terms defined in words understandable to a young learner?
Challenge
4. Does the content follow an age-appropriate progression of skills?
5. Does the level of difficulty increase with progress, gradually building on prior knowledge?
6. Is the pacing of the activities age-appropriate and challenging?
7. Based on observation, does the pacing of the activities maximize children's attention spans?

Detailed ratings in each category
X. VISUAL DESIGN (the individual items for this category were not captured in the transcript)

Critical Evaluation Phase
- What age group or grade level is this most appropriate for?
- What is the intended purpose of the software?
- Where is the software intended to be used?
- Which learning theories can be found in this program?
- What do you like about the software?
- What makes this software Way Cool?
- What don't you like about the software?
- Is this software appropriate for classroom use? How can it best be used in the classroom?

Critical Evaluation Phase
What age group or grade level is this most appropriate for?
Ages 5-9 (grades K-4).

Critical Evaluation Phase
What is the intended purpose of the software?
To help children practice basic skills such as letter recognition, computer motor skills, and typing, as well as basic concepts such as fact vs. fiction.

Critical Evaluation Phase Where is the software intended to be used? It seems intended for independent use at home, but could be used as a reward or supplement in the classroom.

Critical Evaluation Phase
Which learning theories can be found in this program?
Behaviorism features strongly in this site, with all of its instant audio-visual feedback and progress ratings. However, some games do suggest extension activities which would fit a constructivist model.

Critical Evaluation Phase
What do you like about the software? What makes this software Way Cool?
It is very attractive and uses metaphors kids are familiar with, such as skateboarding jumps, to make succeeding on learning tasks seem equally cool. In addition, it is cheerful and colorful, with cute sounds.

Critical Evaluation Phase
What don't you like about the software?
It may be over-stimulating for some children, and it may distract other children in the classroom who are working on other tasks.

Critical Evaluation Phase
Is this software appropriate for classroom use? How can it best be used in the classroom?
It is appropriate for an adjunct role in the classroom, for example as a reward for quick workers. It would also be an excellent site for teachers to introduce to their students for extracurricular use.

Marketing Diagram
Very good software for supplemental use!
- Excellent visuals
- Age-appropriate skills practice
- Engaging audio
- Use of behaviorist principles