Helen Jefferis, Soraya Kouadri & Elaine Thomas


Visualising the Code
An investigation of student engagement with programming in TU100
Helen Jefferis, Soraya Kouadri & Elaine Thomas
Computing & Communication Department

Aim of the project
To investigate the impact of using a graphical programming environment on student engagement with programming.

Background
- There is a strong need for the teaching of introductory programming at level 1 of the Computing and IT degree programme.
- The majority of new OU students will not have experienced the new National Curriculum.
- Previous teaching of programming at level 1 (M150) used a text-based language, JavaScript; over half of students avoided answering the programming question in the EMA.
- TU100 'My digital life' uses Sense, a graphical programming environment based on Scratch.

What is visual programming? Why is it important? Literature says….

The project seeks to address the fundamental question of whether a visual programming environment actually engages novice programmers in TU100.

Methodology
Three stages to the project:
1. Identification of the Sense programming questions in each TMA and in the EMA.
2. Identification and collection of data on the numbers of students who completed these questions and on their overall performance.
3. Analysis of textual comments relating to students' experience of programming in a selection of SEaM surveys of TU100.

Comparison of OES Scores
- The red line shows the number of students (right-hand axis); there were fewer on B presentations.
- Green is the OES mean, yellow the Sense mean, blue the mean for the rest of the EMA.
- In 13J and 14B fewer marks were allocated to Sense, but the average scores for the programming and non-programming elements were about the same. (The marks have been normalised so that these figures show the percentage of the available marks for each element.)
- In more recent presentations, students seem to have gained more marks, on average, in the Sense/programming elements of the EMA.
- NB: each J presentation and the following B presentation have very similar questions, which tends to suggest that perhaps Sense was 'easier' on 14J/15B.
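The normalisation described above can be sketched in Python. The scores and mark allocations below are purely illustrative, not taken from the actual TU100 data:

```python
# Sketch of the normalisation used for the chart: raw Sense and non-Sense
# ("rest") EMA scores are expressed as a percentage of the marks available
# for each element, so presentations with different mark allocations for
# Sense can be compared on the same scale. Illustrative numbers only.

def normalise(score, available):
    """Express a raw score as a percentage of the marks available."""
    return 100.0 * score / available

# e.g. a hypothetical presentation where Sense carried 20 of the 100 EMA marks:
sense_pct = normalise(14, 20)  # Sense element: 14/20 -> 70.0%
rest_pct = normalise(56, 80)   # non-Sense element: 56/80 -> 70.0%
```

On this scale, equal percentages for the two elements mean students did about as well on the programming questions as on the rest, regardless of how many marks each carried.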

Are students passing without passing Sense?
- Fewer than 7.5% of students (460 out of 6,159) failed Sense yet passed the OES.
- This chart shows the students who scored less than 40% on Sense; the red column is those who also failed the OES.
- Out of 6,159 students in total, only 460 (181 + 161 + 88 + 30) failed Sense but still passed the OES, which is less than 7.5% of students.
- So the simple answer is no, as can be seen from the line indicating the overall percentage of students.

Small percentage passed module without passing Sense
Putting it more clearly: only a tiny proportion of students (7.5%) passed the module without passing Sense; the remaining 92.5% did not.

Analysis of SEaM comments
SEaM is the OU's Student Experience on a Module survey. 325 students made comments on one or more of the three questions:
- What aspects of teaching materials, learning activities or assessment did you find particularly helpful to your learning?
- We would welcome any further suggestions or comments to consider for future editions of the module.
- Do you have any other comments to add about your study experience on this module?
These were general comments, not just about programming.

Sense & Programming Comments
79 students made a comment about Sense, and a further 18 students commented about programming more generally. Of these, 73% were positive comments (60 students commenting about Sense; 11 about programming) and 27% were negative (19 commenting about Sense; 7 about programming).
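The percentages follow directly from the comment counts on the slide:

```python
# Reproducing the 73% / 27% split from the counts given:
# 60 positive and 19 negative comments about Sense,
# 11 positive and 7 negative comments about programming.
positive = 60 + 11  # 71 students
negative = 19 + 7   # 26 students
total = positive + negative  # 97 = 79 Sense + 18 programming commenters

positive_pct = round(100 * positive / total)  # 73
negative_pct = round(100 * negative / total)  # 27
```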

Positive comment examples
- "SENSE was very good too as it enabled me to focus on logic and programming structure, rather than language"
- "The Sense board, tasks and associated manual were excellent and this was certainly the high point of the course for me"
- "Furthermore, it introduced me to programming in a way that was very easy to understand and to follow with lots of activities and programming examples and which led me to a new hobby."
- "I particularly liked the Sense Programming learning activities. This is something that I had never done before and found it very engaging and easy to understand. I found my competence levels rising throughout the course, especially for Sense."
- "One aspect that was really good was the Sense programming guide and the activities for it through the blocks. It provided a brilliant base for learning programming and advancing skills. It was easy to follow and enjoyable to do. It also stuck in your head for other languages because of the idea of blocks being used for sections of programs"

Negative comment examples
- "I felt the assignment programming tasks were very basic"
- "The Programming guide is awful, I have worked thought it more than once and I still cant grasp most of it. I have had to look elsewhere to help develop these skills"
- "The Sense programming environment is obviously designed for children, and I cant understand why the OU couldn’t have based the course on a real-world app programming language that is instantly applicable in the real-world... Android for example or C++"
Two students also commented (counted here as negative) that they did not want to use Sense all year, e.g.:
- ".. A 15-point course on Sense would have been nice, followed by another 15- or 30-point course on some proper object-oriented programming."
- "Python is very simple and introducing some Python after making students comfortable with Sense would definitely work."

Summary & future work

Summary
- More students have engaged with the programming element than on previous modules.
- There is a strong correlation between the scores that students achieved in the programming and non-programming elements of the EMA.
- There is little or no difference in student performance between the programming and non-programming elements.
- The work has informed thinking around the new level 1 module.
- It provides a reference point for future studies of level 1 programming.

Future work
- A similar study for TM111 after its first presentation.
- Evaluate student performance in text-based programming on TM112.