Presentation on theme: "Helen Beetham, Consultant in Pedagogy, JISC e-learning programme" — Presentation transcript:
1 Pedagogic Evaluation
Helen Beetham, Consultant in Pedagogy
JISC e-learning programme
Colloquia
2 Activities for this session
- Discuss what is meant by 'pedagogic evaluation'
- Identify project aims and rephrase them as evaluation questions
- Identify stakeholders in the evaluation
- Consider appropriate means of data collection and analysis
- Establish contact with 'peer' projects and begin sharing ideas/expertise
3 Lab testing: Does it work?
- Functionality test
- Compatibility test
- Destruction test…
4 Usability testing: Can other people make it work?
- Are the menus clearly designed?
- Is there a logical page structure?
- …
5 Pedagogic evaluation: Does anyone care if it works?
- i.e. is it useful in learning and teaching contexts?
- How is it useful?
- Who needs software anyway?…
6 Relating the three phases of evaluation
- Software may progress from lab testing through usability testing to contextual evaluation…
- … or (e.g. in RAP) through many iterative cycles, with users involved at each stage
- Increasing authenticity of context: evaluation moves from simple, lab-based settings to complex, authentic contexts of use
- Different questions are asked, different kinds of data are collected, and different issues arise
- Complex, authentic contexts rarely provide yes/no answers to development questions
- Causative factors may be difficult to untangle
- Findings may be highly context-related (so several different contexts are better for evaluation than one)
7 Three approaches to learning and teaching
- There are basically three ways of understanding how people learn:
  - Associative
  - Constructive (individual/cognitivist; social constructivist)
  - Situative
- These lead to different pedagogic strategies and approaches
- Any of these approaches may be appropriate, depending on the priority outcomes and the needs of learners
8 Associative approach
In learning:
- Routines of organised activity
- Progression through component concepts or skills
- Clear goals and feedback
- Individualised pathways matched to prior performance
In teaching:
- Analysis into component units
- Progressive sequences of component-to-composite skills or concepts
- Clear instructional approach for each unit
- Highly focused objectives
In assessment:
- Accurate reproduction of knowledge or skill
- Component performance
- Clear criteria; rapid, reliable feedback
Examples:
- Guided instruction
- Drill and practice
- Traditional ISD (e.g. Gagné)
- Socratic dialogue
9 Constructive approach (cognitivist)
In learning:
- Active construction and integration of concepts
- Ill-structured problems
- Opportunities for reflection
- Ownership of the task
In teaching:
- Interactive environments and appropriate challenges
- Encourage experimentation and the discovery of principles
- Coach and model skills
- Include meta-cognitive outcomes
In assessment:
- Conceptual understanding (applied knowledge and skills)
- Extended performance
- Processes as well as outcomes
- Crediting varieties of excellence
- Developing self-evaluation and autonomy in learning
Examples:
- Cognitive scaffolding (e.g. Piaget)
- Experiential learning (based on Kolb's learning cycle)
- Experimental learning
- Constructivist learning environments
- Problem-based learning
- Research-based or exploratory learning
10 Constructive approach (social)
In learning:
- Conceptual development through collaborative activity
- Ill-structured problems
- Opportunities for discussion and reflection
- Shared ownership of the task
In teaching:
- Collaborative environments and appropriate challenges
- Encourage experimentation, discussion and collaboration
- Coach and model skills, including social skills
- Learning outcomes may be collectively negotiated
In assessment:
- Conceptual understanding (applied knowledge and skills)
- Extended performance
- Process and participation as well as outcomes
- Crediting varieties of excellence
- Developing peer-evaluation and shared responsibility
Examples:
- Reciprocal teaching
- Conversational model (Laurillard/Pask)
- (Computer-supported) collaborative learning
- Dialogue/argumentation
11 Situative approach
In learning:
- Participation in social practices of enquiry and learning
- Acquiring skills in contexts of use
- Developing identity as a learner
- Developing learning and professional relationships
In teaching:
- Creating safe environments for participation
- Supporting development of identities
- Facilitating learning dialogues and relationships
- Elaborating authentic opportunities for learning
In assessment:
- Crediting participation
- Extended performance, including variety of contexts
- Authenticity of practice (values, beliefs, competencies)
- Involving peers
Examples:
- Apprenticeship
- Cognitive apprenticeship
- Situated learning
- (Legitimate peripheral) participation
- (Continuing) professional development
- Work-based learning
12 Three ways of learning about (educational) software
Does it work?
- Routines of organised activity
- Analysis into component units
- Component performance
- Highly focused objectives
Can other people make it work?
- Ill-structured problems
- Opportunities for reflection
- Ownership of the task
- Extended performance
- Processes as well as outcomes
Is it useful in authentic (educational) contexts?
- Creating supportive environments for use
- Supporting development of users' skills
- Facilitating dialogues and relationships
- Elaborating authentic opportunities
- Extended performance, including variety of contexts
- Authenticity of practice
13 Evaluation is learning!
- Evaluation for development, not accountability:
  - sharing lessons (including failures)
  - sharing concepts and approaches
- Moving software into more authentic contexts of use, in order to find out how it is useful and how it should be supported and embedded for effective use
- Using the outcomes of evaluation to inform development and take-up
- Learning across peer projects
- Learning across different strands of the e-learning programme (and beyond)
- Learning about your own software:
  - how to have effective dialogues with users
  - range and/or specificity of application
  - usability implications?
14 Principles of evaluation
- Ask the right questions
- Involve the right people
- Collect useful and reliable data
- Analyse and draw conclusions appropriately
15 1. Asking the right questions
How does the use of this e-tool support effective learning, teaching and assessment (LTA)?
- What LTA activities?
  - Be specific and pragmatic
  - Understand how the e-tool fits with existing LTA practice
  - But expect it to alter practice, sometimes unpredictably
- Which users?
  - Range of user needs, roles and preferences
  - Consider stakeholders who are not direct users
- What counts as 'effective'?
  - Enhanced outcomes for learners? Enhanced experience of learning?
  - Enhanced experience for teachers/support staff? Greater organisational efficiency?
  - Consider what claims you made in your bid, your big vision
- Effective in what LTA contexts?
  - Does the e-tool support a particular pedagogic approach?
  - Does it require a particular organisational context?
  - Consider pragmatics of interoperability, sustainability and re-use
  - Are you aiming for specificity or breadth of application?
16 Example: Interactive Logbook project
Identify how the IL supports learners with respect to 'access, communication, planning & recording':
- How is access to learning resources improved?
- How is communication for learning improved?
- Does the IL provide a useful tool for planning and recording learning in the pilot programme? In what ways?
- Does the IL support planning and recording outside of the pilot programme, and does it provide a durable basis for future planning and recording? In what ways?
- How does it compare with other systems offering similar benefits?
Identify how the IL supports teachers, programme developers and organisations respectively, and understand how best 'to implement and embed the Interactive Logbook… as distributed e-learning becomes more mainstream':
- What skills do learners and tutors need to make effective use of the IL?
- What features of the programme support integration and use of the IL (assessment strategies, support available, mode of access)?
- What technical systems and support are needed for the IL to be integrated effectively?
- What features of the organisation support effective use of the IL by learners and teachers?
17 Over to you (1)
Evaluation should be interesting, so:
- What do you really want to find out about your software?
  - What is the most important lesson your project could pass on to others?
  - Don't set out to prove what we already know!
- Look back at your project aims: what claims are you making for impact on LTA?
  - Translate these aims/claims into questions. Is there evidence of this impact? How does it happen?
  - Good claims are achievable but also challenging
  - Good questions are tractable but also interesting
- Do your original aims fit with what interests you now?
  - Prioritise the issues that seem important now, with the benefit of insights from the development process
  - But use this as an opportunity to revisit and review
- What are other projects investigating?
  - Do you have any questions in common in your peer group? Or questions that complement each other?
  - What could you usefully share of the evaluation process?
18 2. Involving the right people
Who are your users?
- What activities will they carry out with the system?
- What functions of the system are important to them?
- What roles do they play in those activities?
- What are the important differences between them? Significant for sampling (e.g. dyslexic learners for V-MAP)
- Walk-throughs and use cases (user testing and evaluation design)
- Real groups of learners and teachers (pedagogic evaluation)
Who are your other stakeholders?
- Non-users whose work or learning may be impacted by use of the system in context, e.g. administrators
- Other 'interested' parties, e.g. institutional managers, project funders, researchers/developers, potential users…
19 Over to you (2)
Who are your important stakeholder groups?
- Distinguish your user group in ways that are significant for your evaluation questions (e.g. learners with or without an existing e-portfolio)
- Consider non-users as stakeholders and as potential sources of information
- Share your outcomes with your peer group
- What types of user do you need to include? (e.g. model users you have developed for walk-throughs)
- How will you identify real user groups for evaluation?
- How will you ensure all your significant types are included?
(NB: including different types of learner is much easier than finding a 'statistically representative' sample, for which the proportions of different kinds of learner must match those in the target population.)
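The 'statistically representative' sample mentioned in the NB above can be sketched as a proportional quota calculation: each learner type is recruited in the same proportion as it holds in the target population. This is an illustrative sketch only; the learner types and population figures are invented, not taken from the workshop materials.

```python
# Sketch of proportional (stratified) sampling quotas.
# All figures below are invented for illustration.

def stratified_quotas(population: dict[str, int], sample_size: int) -> dict[str, int]:
    """Number of participants to recruit from each learner type so that
    sample proportions match population proportions (rounded)."""
    total = sum(population.values())
    return {group: round(sample_size * count / total)
            for group, count in population.items()}

# Hypothetical target population for a pilot programme
population = {"full-time": 600, "part-time": 300, "distance": 100}
print(stratified_quotas(population, sample_size=50))
# {'full-time': 30, 'part-time': 15, 'distance': 5}
```

In practice, as the slide notes, hitting these proportions exactly is much harder than simply making sure each learner type is represented at all.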
20 3. Collecting useful data
- Data collected should be directly relevant to the questions!
- Data should show triangulation:
  - using a variety of methods (e.g. focus group and questionnaire)
  - from a range of stakeholders (e.g. learners, teaching staff)
  - over a span of time (e.g. 'before' and 'after')
- Quantitative data = How much? How often? How many?
  - Also providing yes/no answers to simple questions
  - Generalising from instances to rules
  - Converting opinions into data for analysis (Likert scales)
- Qualitative data = explanatory, narrative
  - What happened? What was it like for you? Why?
  - Identifying themes and providing local evidence
  - Preserving the voices of participants
- How authentic is the context? If authentic, how will you support embedding and use of the software?
- What are the feasibility and costs of data collection?
- What skills and costs does analysis require?
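The slide's point about 'converting opinions into data' with Likert scales can be sketched as below. The five-point wording and the example responses are invented for illustration; any real questionnaire would define its own scale.

```python
# Sketch: turning verbal Likert responses into numbers for analysis.
# Scale wording and responses are hypothetical examples.

LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def summarise(responses: list[str]) -> dict:
    """Convert verbal responses to scores and summarise them."""
    scores = [LIKERT[r] for r in responses]
    return {"n": len(scores),
            "mean": sum(scores) / len(scores),
            "agreeing": sum(s >= 4 for s in scores)}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
print(summarise(responses))
# {'n': 5, 'mean': 3.6, 'agreeing': 3}
```

Note that this is exactly the move the slide describes: opinions become quantitative data, at the cost of losing the participants' own voices, which is why qualitative methods are listed alongside.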
21 Over to you (3)
Use the matrix to plan what data you will collect:
- Data should be designed to answer specific questions (left column) and should be collected from specific stakeholder groups (top row)
- Add details if possible, e.g. when (time) and how (method) the data could be collected
- You need not fill all the boxes, but try to have something in each row and column
- You can merge boxes!
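The planning matrix described above can be sketched as a simple data structure: evaluation questions down the side, stakeholder groups across the top, and each cell recording method and timing. The questions, groups and cell entries here are invented examples, not the workshop's own matrix.

```python
# Sketch of the evaluation planning matrix as nested dictionaries.
# Questions, stakeholder groups and cell contents are hypothetical.

questions = ["Is access to resources improved?", "Do tutors find it usable?"]
stakeholders = ["learners", "tutors"]

# Start with an empty grid: every cell is None (an unfilled box)
matrix = {q: {s: None for s in stakeholders} for q in questions}

# Fill in some cells with method and timing details
matrix["Is access to resources improved?"]["learners"] = {
    "method": "questionnaire", "when": "end of pilot"}
matrix["Do tutors find it usable?"]["tutors"] = {
    "method": "focus group", "when": "mid-pilot"}

# Check coverage: something in each row and in each column,
# even though not every box needs filling
rows_ok = all(any(matrix[q].values()) for q in questions)
cols_ok = all(any(matrix[q][s] for q in questions) for s in stakeholders)
print(rows_ok and cols_ok)
# True
```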
22 Final discussion: analysing and drawing conclusions
Basic choices for data analysis:
- Quantitative analysis: statistical data that may be presented as pie charts, graphs etc.; Likert scales
- Qualitative analysis within a given analytical framework: comparison, correlation, explanation
- Qualitative analysis without a given analytical framework: case histories, narratives, themes
Outcomes need to be useful to different audiences:
- Your project, and other development projects
- Implementers and users of your software
How will we draw conclusions across the different projects?
- Peer review groups
- Links to other projects in the e-learning programme:
  - Pedagogy strand (refer to previous workshop)
  - DeL regional pilots
  - ELF reference models (also standards community)
- Sharing scenarios, roles and walk-throughs (model users)?
- Mapping activities to a common framework (model uses)?
23 Learner differences ('model users')
- Refer to hand-out
- But note that in many situations these differences will not be educationally significant
- And in your context of use, there may be other important differences to consider (see your 'stakeholder' activity)
- Could we develop a databank of 'scenarios' and 'roles'?
- See Peter Rees-Jones' work on scenarios for e-portfolios
24 Outline of a learning activity
[Diagram: a model of a learning activity, with annotated elements]
- Learner(s): identities (preferences, needs, motivations); competences (skills, knowledge, abilities); roles (approaches and modes of participating)
- Learning activity: specific interaction of learner(s) with other(s), using specific tools and resources, oriented towards specific outcomes
- Learning environment: tools, resources, artefacts; affordances of the physical and virtual environment for learning
- Learning outcome(s): new knowledge, skills and abilities on the part of the learner(s); artefacts of the activity process
- Other(s): other people involved and the specific role they play in the interaction, e.g. support, mediate, challenge, guide
25 Developmental processes
[Diagram: the learning-activity model extended with developmental cycles; learners do, reflect, adapt, differentiate, share and respond]
- The environment (tools, resources) can be adapted to meet the needs of learners, or provides a range of options for differentiation
- Learning outcomes are captured for reflection, planning and review
- Outcomes can also be shared with others for formal assessment, informal feedback and peer review
26 Next steps
- Appoint evaluators (if not already in place)
- Finalise the evaluation plan:
  - based on the phase 2 bid
  - using a pro-forma (optional)
- Identify opportunities to liaise with other projects, e.g. to share:
  - evaluation questions and approaches
  - actual data (comparative analysis?)
  - the process of analysis and drawing conclusions
- I will be in touch to discuss these (or contact me at any time)