
Evidence-Based Best Practices for Interactive and Motivational Online Learning Environments. Dr. Curtis J. Bonk, Associate Professor, Indiana University; President, CourseShare.com

Are you ready???

Brains Before and After E-learning (images: before, after), and when synchronous and asynchronous tools are used.

Tons of Recent Research Not much of it...is any good...

Problems and Solutions (Bonk, Wisher, & Lee, in review)
1. Tasks overwhelm → Train and be clear
2. Confused on the Web → Structure time/dates due
3. Too nice due to limited shared history → Develop roles and controversies
4. Lack justification → Train to back up claims
5. Hard not to preach → Students take lead role
6. Too much data → Use pals
7. Communities not easy to form → Embed informal/social interaction

Benefits and Implications (Bonk, Wisher, & Lee, in review)
1. Shy students open up online → Use asynchronous conferencing
2. Minimal off-task behavior → Create social tasks
3. Delayed collaboration richer than real time → Use async for debates; sync for help and office hours
4. Students can generate lots of info → Structure generation and force reflection/comment
5. Minimal disruptions → Foster debates/critique
6. Extensive e-advice → Find experts or practitioners
7. Excited to publish → Ask permission

Basic Distance Learning Finding? Research since 1928 shows that DL students perform as well as their counterparts in a traditional classroom setting. Per: Russell, 1999, The No Significant Difference Phenomenon (5th Edition), NCSU, based on 355 research reports.

Online Learning Research Problems (National Center for Education Statistics, 1999; Phipps & Merisotis, 1999; Wisher et al., 1999). Anecdotal evidence; minimal theory. Questionable validity of tests. Lack of control groups. Hard to compare given different assessment tools and domains. Fails to explain why the drop-out rates of distance learners are higher. Does not relate learning styles to different technologies or focus on the interaction of multiple technologies.

Online Learning Research Problems (Bonk & Wisher, 2001)
For different purposes or domains: in our study, 13% concern training, 87% education
Flaws in research designs: only 36% have objective learning measures; only 45% have comparison groups
When effective, it is difficult to know why: course design? instructional methods? technology?

Evaluating Web-Based Instruction: Methods and Findings (41 studies) (Olson & Wisher, in review)

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review) “…there is little consensus as to what variables should be examined and what measures of learning are most appropriate, making comparisons between studies difficult and inconclusive.” E.g., demographics (age, gender), previous experience, course design, instructor effectiveness, technical issues, levels of participation and collaboration, recommendation of course, desire to take additional online courses.

Evaluating Web-Based Instruction: Methods and Findings (Olson & Wisher, in review) Variables Studied:
1. Type of course: graduate (18%) vs. undergraduate (81%) courses
2. Level of Web use: all-online (64%) vs. blended/mixed (34%) courses
3. Content area (e.g., math/engineering (27%), science/medicine (24%), distance ed (15%), social science/educ (12%), business (10%), etc.)
4. Attrition data (34%)
5. Comparison group (59%)

Different Goals… Making connections Appreciating different perspectives Students as teachers Greater depth of discussion Fostering critical thinking online Interactivity online

Electronic Conferencing: Quantitative Analyses Usage patterns, # of messages, cases, responses Length of case, thread, response Average number of responses Timing of cases, commenting, responses, etc. Types of interactions (1:1; 1: many) Data mining (logins, peak usage, location, session length, paths taken, messages/day/week), Time-Series Analyses (trends)
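A minimal sketch of how a few of these quantitative measures might be computed from an exported discussion log, assuming one record per message with author, timestamp, and an optional parent-message ID (the field names and data here are hypothetical, not taken from any particular conferencing system):

    from collections import Counter
    from datetime import datetime

    # Hypothetical exported log: one record per message (field names are assumptions).
    messages = [
        {"id": 1, "author": "kim", "parent": None, "sent": "2002-03-04 09:15"},
        {"id": 2, "author": "lee", "parent": 1, "sent": "2002-03-04 21:40"},
        {"id": 3, "author": "kim", "parent": 1, "sent": "2002-03-05 08:02"},
    ]

    sent_times = [datetime.strptime(m["sent"], "%Y-%m-%d %H:%M") for m in messages]

    # Usage patterns: number of messages per author and per day.
    per_author = Counter(m["author"] for m in messages)
    per_day = Counter(t.date() for t in sent_times)

    # Average number of responses per starter message (messages with no parent).
    starters = [m for m in messages if m["parent"] is None]
    responses = [m for m in messages if m["parent"] is not None]
    avg_responses = len(responses) / len(starters) if starters else 0.0

    # Peak usage: the hour of day with the most postings.
    peak_hour = Counter(t.hour for t in sent_times).most_common(1)[0][0]

    print(per_author, per_day, avg_responses, peak_hour)

The same log, grouped by week or by session, supports the data-mining and time-series analyses listed above.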

Electronic Conferencing: Qualitative Analyses General: Observation Logs, Reflective interviews, Retrospective Analyses, Focus Groups Specific: Semantic Trace Analyses, Talk/Dialogue Categories (Content talk, questioning, peer feedback, social acknowledgments, off task) Emergent: Forms of Learning Assistance, Levels of Questioning, Degree of Perspective Taking, Case Quality, Participant Categories

Overall frequency of interactions across chat categories (6,601 chats).

Research on Instructors Online If teacher-centered, less exploration, engagement, and interaction (Peck & Laycock, 1992) Informal, exploratory conversation fosters risk-taking & knowledge sharing (Weedman, 1999) Four key acts of instructors: pedagogical, managerial, technical, social (Ashton, Roberts, & Teles, 1999) Instructors tend to rely on simple tools (Peffers & Bloom, 1999) Job varies--planning, interaction, administration, teaching (McIsaac, Blocher, Mahes, & Vrasidas, 1999)

Study of Four Classes (Bonk, Kirkley, Hara, & Dennen, 2001) Technical—Train, early tasks, be flexible, orientation task Managerial—Initial meeting, FAQs, detailed syllabus, calendar, post administrivia, assign pals, gradebooks, updates Pedagogical—Peer feedback, debates, PBL, cases, structured controversy, field reflections, portfolios, teams, inquiry Social—Café, humor, interactivity, profiles, foreign guests, digital pics, conversations, guests

Network Conferencing Interactivity (Rafaeli & Sudweeks, 1997) 1. More than 50 percent of messages were reactive. 2. Only around 10 percent were truly interactive. 3. Most messages were factual statements or opinions. 4. Many also contained questions or requests. 5. Frequent participators were more reactive than infrequent ones. 6. Interactive messages contained more opinions & humor. 7. More self-disclosure, involvement, & belonging. 8. Attracted to fun, open, frank, helpful, supportive environments.

Week 4 Scattered Interaction (no starter):

Collaborative Behaviors (Curtis & Lawson, 1997) Most common were: (1) Planning, (2) Contributing, and (3) Seeking Input. Other common events were: (4) Initiating activities, (5) Providing feedback, (6) Sharing knowledge Few students challenge others or attempt to explain or elaborate Recommend: using debates and modeling appropriate ways to challenge others

Online Collaboration Behaviors by Categories (US and Finland): table of conference percentages (Finland, U.S., average) for Planning, Contributing, Seeking Input, Reflection/Monitoring, and Social Interaction.

Dimensions of Learning Process (Henri, 1992) 1. Participation (rate, timing, duration of messages) 2. Interactivity (explicit interaction, implicit interaction, & independent comment) 3. Social Events (statements unrelated to content) 4. Cognitive Events (e.g., clarifications, inferencing, judgment, and strategies) 5. Metacognitive Events (e.g., metacognitive knowledge (person, task, and strategy) as well as metacognitive skill (evaluation, planning, regulation, and self-awareness))

Some Findings (see Hara, Bonk, & Angeli, 2000) Social (in 26.7% of units coded): social cues decreased as the semester progressed; messages gradually became less formal; cues became more embedded within statements. Cognitive (in 81.7% of units): more inferences & judgments than elementary clarifications and in-depth clarifications. Metacognitive (in 56% of units): more reflections on experience & self-awareness; some planning, evaluation, regulation, & self-questioning.

Surface vs. Deep Posts (Henri, 1992)
Surface processing: making judgments without justification, stating that one shares ideas or opinions already stated, repeating what has been said, asking irrelevant questions; i.e., fragmented, narrow, and somewhat trite.
In-depth processing: linked facts and ideas, offered new elements of information, discussed advantages and disadvantages of a situation, made judgments supported by examples and/or justification; i.e., more integrated, weighty, and refreshing.

Critical Thinking (Newman, Johnson, Webb & Cochrane, 1997) Used Garrison's five-stage critical thinking model. Critical thinking occurred in both CMC and FTF environments. Depth of critical thinking was higher in the CMC environment: more likely to bring in outside information, link ideas and offer interpretations, and generate important ideas and solutions. FTF settings were better for generating new ideas and creatively exploring problems.

Unjustified Statements (US)
24. Author: Katherine, Date: Apr. 27 3:12 AM 1998: I agree with you that technology is definitely taking a large part in the classroom and will more so in the future…
25. Author: Jason, Date: Apr. 28 1:47 PM 1998: I feel technology will never over take the role of the teacher...I feel however, this is just help us teachers…
Author: Daniel, Date: Apr. 30 0:11 AM 1998: I believe that the role of the teacher is being changed by computers, but the computer will never totally replace the teacher... I believe that the computers will eventually make teaching easier for us and that most of the children's work will be done on computers. But I believe that there…

Indicators for the Quality of Students’ Dialogue (Angeli, Valanides, & Bonk, in review)
1. Social acknowledgement/sharing/feedback: "Hello, good to hear from you"; "I agree, good point, great idea"
2. Unsupported statements (advice): "I think you should try this…"; "This is what I would do…"
3. Questioning for clarification and to extend dialogue: "Could you give us more info?"; "…explain what you mean by…?"
4. Critical thinking, reasoned thinking/judgment: "I disagree with X, because in class we discussed…"; "I see the following disadvantages to this approach…"
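One way to operationalize such a coding scheme is a first-pass, keyword-based tagger that routes each post to a tentative indicator for a human coder to confirm. The cue phrases and function below are illustrative assumptions, not the instrument used by Angeli, Valanides, and Bonk:

    # Illustrative first-pass tagger; the cue phrases are assumptions, and a human
    # coder would still verify every assignment against the full coding scheme.
    INDICATOR_CUES = {
        "social acknowledgement/sharing/feedback": ["good to hear", "i agree", "great idea"],
        "unsupported statement (advice)": ["i think you should", "this is what i would do"],
        "questioning for clarification": ["could you give us more info", "what you mean by"],
        "critical thinking/reasoned judgment": ["i disagree", "because", "disadvantage"],
    }

    def tag_post(text):
        """Return every indicator whose cue phrases appear in the post."""
        lowered = text.lower()
        return [name for name, cues in INDICATOR_CUES.items()
                if any(cue in lowered for cue in cues)] or ["uncoded"]

    print(tag_post("I disagree with X, because in class we discussed the disadvantages."))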

Social Construction of Knowledge (Gunawardena, Lowe, & Anderson, 1997) Five Stage Model 1. Share ideas, 2. Discovery of Idea Inconsistencies, 3. Negotiate Meaning/Areas Agree, 4. Test and Modify, 5. Phrase Agreements In global debate, very task driven. Dialogue remained at Phase I: sharing info

Social Constructivism and Learning Communities Online (SCALCO) Scale. (Bonk & Wisher, 2000) ___ 1. The topics discussed online had real world relevance. ___ 2. The online environment encouraged me to question ideas and perspectives. ___ 3. I received useful feedback and mentoring from others. ___ 4. There was a sense of membership in the learning here. ___ 5. Instructors provided useful advice and feedback online. ___ 6. I had some personal control over course activities and discussion.
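A scoring sketch for the scale, assuming the blanks are filled with 1-5 Likert ratings (the response format is not specified on the slide):

    # Hypothetical responses: one list of ratings per student for the six SCALCO
    # items, assuming a 1-5 Likert scale.
    responses = [
        [4, 5, 3, 4, 5, 4],
        [3, 4, 4, 3, 4, 3],
    ]

    # Mean rating per item (across students) and one overall score per student.
    item_means = [sum(r[i] for r in responses) / len(responses)
                  for i in range(len(responses[0]))]
    student_scores = [sum(r) / len(r) for r in responses]

    print(item_means, student_scores)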

Evaluation…

Kirkpatrick’s 4 Levels Reaction Learning Behavior Results

My Evaluation Plan…

1. Measures of Student Success (focus groups, interviews, observations, surveys, exams, records): Positive Feedback, Recommendations; Increased Comprehension, Achievement; High Retention in Program; Completion Rates or Course Attrition; Jobs Obtained, Internships; Enrollment Trends for Next Semester

1. Student Basic Quantitative Grades, Achievement Number of Posts Participation Computer Log Activity—peak usage, messages/day, time of task or in system Attitude Surveys

1. Student High-End Success Message complexity, depth, interactivity, questioning; Collaboration skills; Problem finding/solving and critical thinking; Challenging and debating others; Case-based reasoning, critical thinking measures; Portfolios, performances, PBL activities

2. Instructor Success High student evals; more signing up High student completion rates Utilize Web to share teaching Course recognized in tenure decisions Varies online feedback and assistance techniques

3. Training Outside Support Training (FacultyTraining.net) Courses & Certificates (JIU, e-education) Reports, Newsletters, & Pubs Aggregators of Info (CourseShare, Merlot) Global Forums (FacultyOnline.com; GEN) Resources, Guides/Tips, Link Collections, Online Journals, Library Resources

3. Training Inside Support… Instructional Consulting Mentoring (strategic planning $) Small Pots of Funding Facilities Summer and Year Round Workshops Office of Distributed Learning Colloquiums, Tech Showcases, Guest Speakers Newsletters, guides, active learning grants, annual reports, faculty development, brown bags

RIDIC⁵-ULO³US Model of Technology Use 4. Tasks (RIDIC): Relevance, Individualization, Depth of Discussion, Interactivity, Collaboration-Control-Choice-Constructivistic-Community

RIDIC⁵-ULO³US Model of Technology Use 5. Tech Tools (ULOUS): Utility/Usable, Learner-Centeredness, Opportunities with Outsiders Online, Ultra Friendly, Supportive

6. Course Success Few technological glitches/bugs Adequate online support Increasing enrollment trends Course quality (interactivity rating) Monies paid Accepted by other programs

7. Online Program or Course Budget (i.e., how pay, how large is course, tech fees charged, # of courses, tuition rate, etc.) Indirect Costs: learner disk space, phone, accreditation, integration with existing technology, library resources, on site orientation & tech training, faculty training, office space Direct Costs: courseware, instructor, help desk, books, seat time, bandwidth and data communications, server, server back-up, course developers, postage

8. Institutional Success E-Enrollments from new students, alumni, existing students Additional grants Press, publication, partners, attention Orientations, training, support materials Faculty attitudes Acceptable policies (ADA compliant)

But how to determine the pedagogical quality of courses and course materials you develop?

Just a Lot of Bonk Variety: tasks, topics, participants, accomplishments, etc. Interaction extends beyond class Learners are also teachers Multiple ways to succeed Personalization and choice Clarity and easy to navigate course

Wisher’s Wish List: An effect size of .5 or higher in comparison to traditional classroom instruction. (Chart: average effect size and number of studies for Web-Based Instruction vs. the CBI meta-analyses of Kulik [8] and Liao [18].)
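For reference, the effect size being wished for here is typically computed as Cohen's d: the difference between the online and classroom group means divided by their pooled standard deviation. A minimal sketch, with made-up score lists purely for illustration:

    from statistics import mean, stdev

    def cohens_d(online, classroom):
        """Cohen's d: difference in means divided by the pooled standard deviation."""
        n1, n2 = len(online), len(classroom)
        s1, s2 = stdev(online), stdev(classroom)
        pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
        return (mean(online) - mean(classroom)) / pooled_sd

    # Hypothetical exam scores for a web-based section and a classroom section.
    print(cohens_d([78, 85, 90, 74, 88], [70, 80, 84, 69, 77]))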

Quality on the Line: Benchmarks for Success in Internet-Based Distance Ed (Blackboard & NEA, 2000) Teaching/Learning Process Student interaction with faculty is facilitated through a variety of ways. Feedback to student assignments and questions is provided in a timely manner. Each module requires students to engage themselves in analysis, synthesis, and evaluation as part of their course assignments. Course materials promote collaboration among students.

Quality on the Line: Benchmarks for Success in Internet-Based Distance Ed (Blackboard & NEA, 2000) Other Benchmark Categories: Institutional Support: incentive, rewards, plans Course Development: processes, guidelines, teams, structures, standards, learning styles Course Structure: expectations, resources Student Support: training, assistance, info Faculty Support: mentoring, tech support Evaluation and Assessment: review process, multiple methods, specific standards

The Sharp Edge of the Cube: Pedagogically Driven Instructional Design for Online Education (Syllabus Magazine, Dec. 2001, Nishikant Sonwalkar). Five functional learning styles: apprenticeship, incidental, inductive, deductive, discovery.

New Methodology for Evaluation: The Pedagogical Rating of Online Courses (Syllabus Magazine, Jan. 2002, Nishikant Sonwalkar). The Pedagogical Effectiveness Index: (1) Learning Styles: (see previous page) (2) Media Elements: text, graphics, audio, video, animation, simulation (3) Interaction Elements: feedback, revision, e-mail, discussion, bulletin. For more info: /article.asp?id=5914

New Methodology for Evaluation: The Pedagogical Rating of Online Courses (Syllabus Magazine, Jan. 2002, Nishikant Sonwalkar). Summative evaluation instrument for rating online courses: (1) Content Factors: quality, media, authentic (2) Learning Factors: interactivity, testing & feedback, collaboration, ped styles (3) Delivery Support Factors: accessible, reporting, user management, content (4) Usability Factors: clarity, chunk size, layout (5) Technological Factors: bandwidth, database connectivity, server capacity, browser
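As an illustration only (this is not Sonwalkar's published index; the slide does not give his weighting scheme), a summative course rating could be aggregated by averaging the criterion scores within each factor and then averaging across factors:

    # Hypothetical ratings on a 0-5 scale; the criteria shown, the scale, and the
    # equal weighting are illustrative assumptions, not the published index.
    ratings = {
        "content":    {"quality": 4, "media": 3, "authentic": 5},
        "learning":   {"interactivity": 3, "testing_feedback": 4, "collaboration": 2},
        "delivery":   {"accessible": 5, "reporting": 3},
        "usability":  {"clarity": 4, "chunk_size": 4, "layout": 3},
        "technology": {"bandwidth": 4, "server_capacity": 5},
    }

    factor_scores = {f: sum(c.values()) / len(c) for f, c in ratings.items()}
    summative = sum(factor_scores.values()) / len(factor_scores)

    print(factor_scores, round(summative, 2))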

Best of Online Pedagogical Strategies…

Changing Role of the Teacher The Online Teacher, TAFE, Guy Kemshal-Bell (April, 2001) From oracle to guide and resource provider From providers of answers to expert questioners From solitary teacher to member of team From total control of teaching environment to sharing as a fellow student From provider of content to designer of learning experiences.

Dennen’s Research on Nine Online Courses (sociology, history, communications, writing, library science, technology, counseling)
Poor Instructors: Little or no feedback given; Always authoritative; Kept narrow focus of what was relevant; Created tangential discussions; Only used “ultimate” deadlines
Good Instructors: Provided regular qual/quant feedback; Participated as peer; Allowed perspective sharing; Tied discussion to grades, other assessments; Used incremental deadlines

What do we need??? FRAMEWORKS!!!

1. Reflect on Extent of Integration: The Web Integration Continuum (Bonk et al., 2001) Level 1: Course Marketing/Syllabi via the Web Level 2: Web Resource for Student Exploration Level 3: Publish Student-Gen Web Resources Level 4: Course Resources on the Web Level 5: Repurpose Web Resources for Others ================================ Level 6: Web Component is Substantive & Graded Level 7: Graded Activities Extend Beyond Class Level 8: Entire Web Course for Resident Students Level 9: Entire Web Course for Offsite Students Level 10: Course within Programmatic Initiative

2. Reflect on Interactions: Matrix of Web Interactions (Cummings, Bonk, & Jacobs, 2002)
Instructor to Student: syllabus, notes, feedback
Instructor to Instructor: course resources, syllabi, notes
Instructor to Practitioner: tutorials, articles, listservs
Student to Student: intros, sample work, debates
Student to Instructor: voting, tests, papers, evals
Student to Practitioner: Web links, resumes
Practitioner to Student: internships, jobs, fieldtrips
Practitioner to Instructor: opinion surveys, fdbk, listservs
Practitioner to Practitioner: forums, listservs

3.

But there is a Problem…

How Bad Is It? “Some frustrated Blackboard users who say the company is too slow in responding to technical problems with its course- management software have formed an independent users’ group to help one another and to press the company to improve.” (Jeffrey Young, Nov. 2, 2001, Chronicle of Higher Ed)

Must Online Learning be Boring? What Motivates Adult Learners to Participate?

Motivational Terms? See Johnmarshall Reeve (1996). Motivating Others: Nurturing inner motivational resources. Boston: Allyn & Bacon. (UW-Milwaukee) 1. Tone/Climate: Psych Safety, Comfort, Belonging 2. Feedback: Responsive, Supports, Encouragement 3. Engagement: Effort, Involvement, Excitement 4. Meaningfulness: Interesting, Relevant, Authentic 5. Choice: Flexibility, Opportunities, Autonomy 6. Variety: Novelty, Intrigue, Unknowns 7. Curiosity: Fun, Fantasy, Control 8. Tension: Challenge, Dissonance, Controversy 9. Interactive: Collaborative, Team-Based, Community 10. Goal Driven: Product-Based, Success, Ownership

1. Tone/Climate: Ice Breakers Eight Nouns Activity: 1. Introduce yourself using eight nouns 2. Explain why you chose each noun 3. Comment on 1-2 peer postings Coffee House Expectations: 1. Have everyone post 2-3 course expectations 2. Instructor summarizes and comments on how they might be met (or has students make public commitments about how they will fit the course into busy schedules!)

2. Feedback Requiring Peer Feedback Alternatives: 1. Require minimum # of peer comments and give guidance (e.g., they should do…) 2. Peer Feedback Through Templates—give templates to complete peer evaluations. 3. Have e-papers contest(s)

3. Engagement: Electronic Voting and Polling 1. Ask students to vote on an issue before class (anonymously or sent directly to the instructor) 2. Instructor pulls out the minority point of view 3. Discuss with the majority point of view 4. Repoll students after class (Note: Delphi or Timed Disclosure Technique: anonymous input until a due date, then post results and reconsider until consensus; Rick Kulp, IBM, 1999)

4. Meaningfulness: Job or Field Reflections 1. Field Definition Activity: Have students interview (via e-mail, if necessary) someone working in the field of study and share their results. As a class, pool interview results and develop a group description of what it means to be a professional in the field.

5. Choice: Discussion: Starter-Wrapper 1. The starter reads ahead and starts the discussion, others participate, and the wrapper summarizes what was discussed. 2. Starter-wrapper with roles: same as #1 but include roles for debate (optimist, pessimist, devil's advocate). Alternative: Facilitator-Starter-Wrapper: instead of starting the discussion, a student acts as moderator or questioner to push student thinking and give feedback.

6. Variety: Just-In-Time-Teaching Gregor Novak, IUPUI Physics Professor (teaches teamwork, collaboration, and effective communication): 1. Lectures are built around student answers to short quizzes that have an electronic due date just hours before class. 2. Instructor reads and summarizes responses before class and weaves them into discussion and changes the lecture as appropriate.

7. Curiosity: Electronic Seance Students read books by famous dead people. Convene when dark (synchronous or asynchronous). Present a present-day problem for them to solve. Participate from within those characters (e.g., read direct quotes from books or articles). Invite expert guests from other campuses. Keep the chat open for a set time period. Debrief.

8. Tension: Role Play A. Role Play Personalities: List possible roles or personalities (e.g., coach, optimist, devil’s advocate, etc.) Sign up for a different role every week (or for 5-6 key roles) Perform within roles, referring to the different personalities B. Assume Persona of Scholar: Enroll famous people in your course Students assume the voice of that person for one or more sessions Enter a debate topic, respond to a debate topic, or respond to reading reflections

9. Interactive: Critical/Constructive Friends, Pals, Web Buddies 1. Assign a critical friend (perhaps based on commonalities). 2. Post weekly updates of projects, send reminders of due dates, help where needed. 3. Provide criticism to peer (i.e., what is strong and weak, what’s missing, what hits the mark) as well as suggestions for strengthening. 4. Reflect on experience.

10. Goal Driven: Gallery Tours Assign Topic or Project (e.g., Team or Class White Paper, Business Plan, Study Guide, Glossary, Journal, Model Exam Answers). Students Post to Web. Experts Review and Rate. Try to Combine Projects.

How will you design courses that motivate thinking? (Sheinberg, April 2000, Learning Circuits)

Some Final Advice… Or Maybe Some Questions???