Assessment Strategies for Inquiry Units
Delwyn L. Harnisch, University of Nebraska–Lincoln, Teachers College, National Center for Information Technology in Education (NCITE)


Initial partners: UNL Teachers College, NET, and CSE
Planning participants: UNL, UNO, UNK, NDE, Aim Institute
Where, why, and how does educational technology apply to learning?
Basic research, applied research, technology transfer
National Center for Information Technology in Education... bringing together researchers and practitioners to appropriately and effectively apply technology to learning.

NCITE Major Themes
R&D: The role of technology in teaching and learning
–Supporting learners with technology
–Supporting teaching with technology
–Technology-supported assessment/evaluation
R&D: Interrelationship of technology, education, and society
–Social and cultural issues and technologies
Moving research into practice: people and tools
–Creation/development of educational technology leaders
–Develop, adapt, and evaluate educational technology tools

Initial Goals
Create initial organization, governance, and staffing
“Put NCITE on the map”
–Take steps to ensure a respected, sustainable role in educational technology R&D and its transition to practice
–Establish educational, commercial, and government collaborations and contacts

Build Upon Existing Expertise
Centers and Institutes
–Center for Innovative Instruction
–Buros Institute of Mental Measurements
–Nebraska Educational Telecommunications
–Office of Internet Studies (UNO)
Current Research
–Affinity Learning
–Tools Promoting Deeper Online Learning
–Cognitive Diagnostic Assessment
–Measuring Technology Competencies

Build New Opportunities
Seed Grants
–$75K to $100K per year set aside
–Grants will most likely be $15K to $30K
–Selected based on probability of outside funding and relevance to NCITE
Potential areas – Funded Proposal Writing/Review Teams
–Research into the role of human teachers/coaches in computer-based learning (at the current state of technology, having humans “in the loop” is vital)
–Programmatic, educational, and technical look at Blackboard and WebCT
–Assessment and effectiveness of visualization tools
–Intelligent agents for “leveling the peer playing field”
–Literacy/writing assessment
–Measurement of educator technical competency

Create Collaborations
Extend partnerships
–Faculty/student affiliates
–Nebraska Department of Education, K-12
–National linkages (e.g., Higher Education, ADL Initiative, NCSA)
Nebraska Symposium on Information Technology in Education (2002 and beyond)

NCSA and NCITE
Current initiatives
–Joint proposals and projects
–Faculty affiliation
–Seminars with discipline leaders
–Research linkages and support
–Research resources and infrastructure
–Publication and dissemination

Evaluation of Inquiry Units Requires
–Reasoning Strategies
–Problem-Based Approach
–Real-Life Connections

Reasoning in Areas of Academic Work
–Where students pose questions or identify issues and, through research, search for answers
–Where students interpret texts and data
–Where students search for ways to reach goals and overcome obstacles
–Where students use imagination and discipline to design novel and expressive products

What is Evaluation?
Evaluation addresses a range of questions about quality and effectiveness (both implementation and outcome) and uses a range of methods, qualitative and quantitative, to gather and analyze data.
Evaluation explicitly uses a range of criteria upon which to judge program success, e.g., student interest and motivation, parent satisfaction, teacher enthusiasm, standards of the profession, and student learning as measured in multiple ways.
Handbook for Mixed Method Evaluations

Types of Evaluations: Planning
Assesses understanding of project goals, objectives, timelines, and strategies
Key questions:
–Who are the people involved?
–Who are the students?
–What are the activities that will involve the students?
–What is the cost?
–What are the measurable outcomes?
–How will the data be collected?
–How long is the program?

Types of Evaluations: Formative
Assesses ongoing project activities
–Implementation evaluation
–Progress evaluation

Formative: Implementation Evaluation
Assesses whether the project is being conducted as planned.
Key questions:
–Were appropriate participants selected?
–Do activities and strategies match those in the plan?
–Were appropriate staff members hired, trained, and working according to the plan?
–Were activities conducted according to plan?
–Was the management plan followed?

Formative: Progress Evaluation
Assesses the progress made by the participants toward meeting the project goals
Key questions:
–Are participants moving toward the anticipated project goals?
–Which activities and strategies are aiding the participants toward the goals?
Mid-year GK12 Evaluation Form

Summative Evaluation
Assesses whether the project was successful in meeting its goals
Key questions:
–Was the project successful?
–Did the project meet its overall goals?
–Did participants benefit from the project?
–What components were most effective?
–Were the results worth the project’s cost?
–Is the project replicable and transportable?

Stages in Conducting an Evaluation
–Develop evaluation questions
–Match questions with appropriate data collection methods
–Collect data
–Analyze data
–Report information to interested audiences

Developing Evaluation Questions
–Clarify goals of the evaluation
–Identify and involve key stakeholders
–Describe the activity or intervention to be evaluated
–Formulate potentially relevant evaluation questions
–Determine available resources
–Prioritize and eliminate questions if necessary

Match Questions with Data Collection Methods
Select an appropriate methodological approach
–Quantitative
–Qualitative
–Mixed method
Determine data sources
Select appropriate data collection methods to collect the data and build the database.

Collecting Data
–Obtain necessary clearances
–Consider the needs and sensitivities of respondents
–Train data collectors
–Avoid causing disruption to the data collection site

Analyze Data
–Inspect raw data and clean if necessary
–Conduct the initial analysis based on the evaluation plan
–Conduct additional analyses based on initial results
–Integrate and synthesize findings
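The "inspect, clean, analyze" sequence can be sketched in a few lines of code. This is a minimal illustration, not part of the original presentation: the field names and the 1–5 Likert scale are hypothetical stand-ins for whatever instrument an evaluation actually uses.

```python
# Minimal sketch of cleaning and summarizing evaluation survey data,
# using only the Python standard library. Assumes hypothetical 1-5
# Likert-scale ratings collected as strings (e.g., from a CSV export).
from statistics import mean, stdev

def clean(responses):
    """Drop blank, non-numeric, or out-of-range ratings before analysis."""
    valid = []
    for r in responses:
        try:
            score = int(r)
        except (TypeError, ValueError):
            continue  # skip blank or non-numeric entries
        if 1 <= score <= 5:  # keep only in-scale Likert ratings
            valid.append(score)
    return valid

def summarize(responses):
    """Initial analysis: response count, mean, and spread."""
    scores = clean(responses)
    return {
        "n": len(scores),
        "mean": round(mean(scores), 2),
        "stdev": round(stdev(scores), 2) if len(scores) > 1 else 0.0,
    }

raw = ["4", "5", "", "3", "six", "4", "2"]
print(summarize(raw))  # {'n': 5, 'mean': 3.6, 'stdev': 1.14}
```

Additional analyses (e.g., per-item or per-cohort breakdowns) would follow the same pattern on subsets of the cleaned data.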

Reporting Information
–Report information to targeted audiences
–Deliver reports in a timely manner
–Customize reports and presentations

Sample Assessment Strategies for Inquiry Units
Adapted from Teaching, Learning, & Assessment Together by Arthur K. Ellis

Assessment Possibilities: Jigsaw
–Groups of students are given different parts of an event or problem
–Groups research their part
–Groups come together and “teach” the others about their part
–The entire group draws conclusions
Promotes collaborative learning
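The jigsaw procedure is essentially a two-view grouping: students who share a topic form an "expert" group, and each "home" group contains one student per topic. A small sketch of that assignment logic, with invented student names and topics (not from the original presentation):

```python
# Hedged sketch of jigsaw group formation: round-robin students across
# topics, then read the assignment two ways -- by expert group (who
# researches each part) and by home group (one student per topic).
from collections import defaultdict

def jigsaw(students, topics):
    """Assign students to topics round-robin and build both group views."""
    expert = defaultdict(list)  # topic -> students researching that part
    home = defaultdict(list)    # home-group index -> (student, topic) pairs
    for i, student in enumerate(students):
        topic = topics[i % len(topics)]
        expert[topic].append(student)
        home[i // len(topics)].append((student, topic))
    return dict(expert), dict(home)

students = ["Ana", "Ben", "Cam", "Dee", "Eli", "Fay"]
experts, homes = jigsaw(students, ["erosion", "deposition", "weathering"])
# experts["erosion"] -> ["Ana", "Dee"]; each home group covers all three topics
```

After the expert groups research their parts, the home groups reconvene so each member can "teach" their topic, matching the steps on the slide.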

Question Authoring
–At the beginning of a lesson, students write down questions on content or concepts they don’t know, or on areas of interest
–Students compare questions in dyads
–The teacher posts the questions
–The teacher suggests readings or activities that help students answer their questions

Problem Solving: Talk About It
Students engage in self-talk or talk with a friend while solving a problem
Goals:
–Provide a self-feedback mechanism
–Test ideas in public
–Make the thought process deliberate
Students share strategies with the class
The teacher comments on the strategies

Creative Work: Learning Illustrated
Ask students to visualize the concept as a:
–Concept map
–Picture
–Flow chart
–Diagram
–Map
Promotes visualization

Examples (slide images): Science Review; Math Concept Map

Analytic Work: Key Idea
–At the end of a lesson or unit, ask what the “key” idea is
–Ask students to explain why this is a “key idea”
–Can be done in groups
Goals:
–Students see learning as composed of ideas
–The teacher sees what students have valued

Analytic Work: I Can Teach
–Students write, diagram, or map concepts that they could teach to other students
–Shows how well students know the idea
–Helps students who have not grasped concepts the first time
–Allows teachers to evaluate their teaching methods

Real-Life Connections: Authentic Applications
–Have students see how an idea is applied in real life
–Students visualize this idea in posters, projects, or diagrams
–Students reflect on the concept and how it is applied in their visual presentation
Example evaluation form

Example: Roller coasters

Real-Life Connections: Getting a Job
–Role-play activity
–Students imagine they hold a particular occupation
–Students see how the work they are doing in school is related to that occupation
–Could be used as a problem-based exercise

Summary Activities: I Learned Statements
–Statements of personal learning at the end of a lesson or unit
–Can be done in writing or orally
–Teachers see what was really important to students
Example evaluation form

Summary Activities: Clear and Unclear Windows
–Students sort out on paper what is clear or unclear to them
–Students who have a clear understanding can help those who don’t
Example evaluation form

Need for Careful Planning
The evaluation plan should be guided by a theory that takes into account:
–the type of information needed to make decisions
–the suitability of different kinds of methods for learning about different kinds of phenomena
–the technical limitations and biases of various methods
–the importance of the purpose of the evaluation in mixed-method evaluation
–that different purposes invoke different designed mixes of methods

Conclusion
–A mixed-method framework for evaluation invites the "uncertainty and openness needed for evaluators to use their findings for action and change"
–The ability to examine multiple questions about a program offers "exciting and potentially meaningful opportunities for connectedness and solidarity" for evaluators, program managers, and program participants
–The "practical possibilities" of mixing inquiry methodologies contribute to, and reflect, the pluralistic nature of modern society

NCITE