What Makes a Good Teaching Activity? Best Practices in Teaching— Lessons Learned from Experience and Research on Learning David Mogk Montana State University.


Wisdom from Experience
- Give students ownership of the problem/data
- Engage students in experiments
- Involve every student
  - Know their learning styles, interests, experience…
- One learning goal per activity
- Student-centered, not content-centered
  - What should every student know and be able to do?
- Faculty role: mentor, advisor, co-learner
  - Inquiry, discovery, exploration: "I don't know, but let's find out!"

Wisdom from Learning Science (How People Learn, NRC, 1999)
- Learning is additive; it builds on current understanding
  - Be aware of preconceptions and misconceptions
- Understanding is actively constructed
  - This requires an engaged learner
- Different people construct/learn most easily in different ways
- Learning to learn (metacognition) is an important part of becoming an expert
  - Metacognition is context specific
- The cognitive and affective domains are both important in learning
  - Learning can't occur if the affective domain is not engaged (motivation to learn, anxiety, values…)

Wisdom from Curriculum Design
- Goals, assessments, activities (Wiggins and McTighe, 1998): teaching for understanding
  - "Backward design": start with learning goals/outcomes, align performance assessments with those goals, and make all activities and materials support the learning goals
- Scaffolding
  - Gives clear direction, clarifies purpose, keeps students on task, assesses progress, demonstrates expectations, and minimizes uncertainty
  - Successively remove support as students progress
- Guided discovery
  - Provide overall structure, context, resources, examples…
  - …but not too much!

Principles of Design
1) Students must be engaged to learn.
   - How does the activity engage them?
2) Students must construct new knowledge incrementally as a result of experience.
   - What experiences will they have in this activity?
3) Students must refine and connect their knowledge to be able to use it further.
   - How will the activity promote reflection on and application of the new knowledge?

Edelson, 2001, Learning for Use: A Framework for the Design of Technology-Supported Inquiry Activities: Journal of Research in Science Teaching, v. 38, no. 3.

Is It Good?
- Will the activity lead to the desired learning?
- Will I be able to tell?
- Does the pedagogy promote learning?
- Are the materials I provide for students complete and helpful?
- Could someone else implement this from the information I provide?

Does the Pedagogy Promote Learning?
- Does the activity motivate and engage students?
- Does it build on what they know and address their initial beliefs?
- Is it appropriate for the variety of students expected in the class?
- Are students engaged in independent thinking and problem solving?
- Are there opportunities for students to integrate and improve their understanding incrementally?
- Is there an appropriate balance of guidance vs. exploration?
- Does it include opportunities for reflection, discussion, and synthesis?
- Does it provide opportunities for students to assess their learning and confirm they are on the right track?

Learning Goals
- Content/concept mastery
- Skill mastery
  - Technical: use of instruments, software…
  - Lifelong: communication, quantitative, graphical, information access and vetting, interpersonal skills
- Affective aspects
  - Engagement with science; valuing science; personal growth (self-confidence)
  - Motivation? Ownership? Responsibility?
- Motivation
  - Need (for personal/societal applications)
  - Curiosity ("this is so cool I have to learn more")…

Geoscience Expertise
- What defines a "master" geoscientist?
- How can we help students (novices) become masters? How can we bridge the gap?
- Thinking about your own learning
  - Self-monitoring, self-regulation, critical thinking
  - What am I doing? Why? Is this purposeful? Is it consistent with other knowledge I have?

ASSESSMENT: What is it?
- Collection of evidence to answer a question or solve a problem (Diane Ebert-May)
- I use "evaluation" to assign "value": good/bad, final grade, …
- Assessments must be well aligned with project goals
  - How will you know if you've achieved your goals?
- Resources (assessing student learning)
  - Starting Point module on Assessment
  - Cutting Edge: Observing and Assessing Student Learning

How Will the Assessment Be Used (and by whom, and for what purpose)?
- Formative: "road checks"; are we on course? (feedback for instructors)
- Summative: measures of success (for project leaders, administrators, benefactors)
- Longitudinal: long-term impacts (faculty, institutions, community)

What Type of Evidence "Counts"?
- Approaches (multiple, independent lines of evidence, just like your research!)
  - Self-reporting (journals, interviews…)
  - External observers (surveys, evaluation forms)
  - Automated (e.g., web statistics)
- Quantitative (rubrics; how many…?)
- Narrative/anecdotal
- Pre-/post-activity to measure Δ (a baseline is needed)
- Comparative: demonstrating changes pre- and post-project
- Outcomes-based: demonstration of the quality of products/results
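The pre-/post-activity Δ can be quantified in more than one way; one common choice (not named in the slides) is the normalized gain, which scales the raw score change by the headroom available at pretest. A minimal sketch, using hypothetical scores:

```python
def score_change(pre, post, max_score=100.0):
    """Raw change and normalized gain for one student.

    Normalized gain = (post - pre) / (max_score - pre):
    the fraction of the available headroom actually gained.
    """
    raw = post - pre
    gain = raw / (max_score - pre) if pre < max_score else 0.0
    return raw, gain

# Hypothetical pre/post scores for a small class (percent correct).
pre_scores = [40, 55, 70]
post_scores = [70, 75, 85]

results = [score_change(p, q) for p, q in zip(pre_scores, post_scores)]
mean_gain = sum(g for _, g in results) / len(results)
```

A class-average normalized gain is often more informative than raw averages, because students who start near the ceiling cannot show a large raw Δ.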

Hard-Earned Advice
- Make sure that the expectations of all parties are aligned!
  - Your standards are clearly articulated
  - Students know what to expect (uncertainty is the great killer of these types of projects)
  - What will the product look like, and how will it be evaluated? Provide examples!
  - No surprises!
- Assessments
  - Must be built into the plan from the beginning, not as an afterthought
  - Can be embedded, and can be done continuously through the life cycle of the project
  - Intervene early if things go awry!

Collaborative/Cooperative Learning
- Increasingly, students must work in diverse groups
- Equitable participation is different from equal participation
- Play to students' strengths
  - Consider using profiles of student learning preferences (e.g., VARK)
- Resources
  - Using Cooperative Learning to Teach Mineralogy (and Other Courses Too!), Srogi and Baloche
  - Starting Point module on Cooperative Learning
  - Preparing Students for Collaborative Case Studies

Charge for the Weekend
- Continue to work on your own activity development
- Post topics for group input via the threaded discussions
  - Where do you need help, advice, resources, …?
- Work in your small groups
  - Schedule conference times
  - Use the web workspace
- Post your completed activity by 9:00 AM Central