Teaching for Fluency with Information Technology: The Role of Feedback in Instructional Design and Student Assessment Explorations in Instructional Technology.


1 Teaching for Fluency with Information Technology: The Role of Feedback in Instructional Design and Student Assessment. Explorations in Instructional Technology. Mark Urban-Lurain, Don Weinshank. October 27, 2000.

2 Overview
- Context: Teaching non-CS majors
- Instructional Design
- Uses of technology
- Results
- Implications

3 Fluency with Information Technology
- What does it mean to be a literate college graduate in an information age?
- Information technology is ubiquitous
- Computer literacy associated with training
- Being Fluent with Information Technology (FIT)
  - Committee on Information Technology Literacy, 1999
- CSE 101
  - MSU introductory course for non-CS majors

4 Instructional Principles
1. Concepts and principles promote transfer within domain
   - Necessary for solving new problems
2. Assessment drives instruction (Yelon)
   - Write the final exam first
3. Move focus from what is taught to what is learned
   - Student-centered
4. Formative evaluation improves student performance
   - Study, test, restudy, retest
5. Performance assessments evaluate mastery of concepts
   - High inter-rater reliability critical
6. Mastery-model learning ensures objectives met
   - What students can do, not what they can say

5 Uses of Technology in Instruction
- Delivery of content
  - Television
  - CBI / CBT
  - Web-based
- Communication
  - E-mail
  - Discussion groups
  - Real-time chat
- Feedback and monitoring
  - Formative evaluation
  - Iterative course development and improvement

6 Course Design & Implementation (diagram)
- Design phase: Design Inputs → Instructional Goals → Instructional Design
- Implementation phase: Incoming Students → Instruction → Assessment → Outcomes

7 Discriminant Analysis
- Multivariate statistical classification procedure
- Dependent variable: final course grade
- Independent variables
  - Incoming student data
  - Classroom / instructional data
  - Assessment performance data
- Each student classified in group with highest probability
- Evaluate classification accuracy
- Interpret discriminant functions
  - Independent variable correlations with functions
  - Similar to interpreting loadings in factor analysis
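The classification step on this slide can be sketched in code. This is not the authors' analysis (which fit discriminant functions over many incoming, instructional, and assessment variables); it is a toy nearest-centroid classifier, which is equivalent to a linear discriminant under an equal, spherical covariance assumption. Feature names and data are hypothetical stand-ins.

```python
import math

# Hypothetical training data: (GPA, attendance rate, bridge-task
# attempts) -> final grade group. Invented for illustration only.
training = {
    "4.0": [(3.8, 0.95, 1), (3.6, 0.90, 2)],
    "3.0": [(3.0, 0.80, 3), (2.9, 0.75, 3)],
    "2.0": [(2.2, 0.60, 5), (2.5, 0.55, 4)],
}

def centroid(rows):
    """Mean vector of a list of equal-length feature tuples."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

centroids = {grade: centroid(rows) for grade, rows in training.items()}

def classify(student):
    """Assign the group whose centroid is nearest -- i.e., the group
    with the highest posterior probability under the equal-covariance
    assumption. (A real analysis would standardize features first.)"""
    return min(centroids, key=lambda g: math.dist(student, centroids[g]))

print(classify((3.5, 0.92, 2)))  # -> "4.0"
```

In the real procedure, classification accuracy is then evaluated against the actual final grades, and the variable-function correlations are interpreted much like factor loadings.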

8 Skill to Schema Map (diagram mapping assessed skills to schemata: footnote, modify style, Web format, private folder, public folder, new SS, update SS, charts, functions, path, find application, TOC, URL, Boolean search, link, Web search, find/rename file, computer specs)

9 SIRS: Course, TA, ATA
- End-of-semester student survey about course
- Three factors
  - Fairness
  - Student preparation and participation
  - Course resources
- SIRS for Lead TA
  - One factor
- SIRS for Assistant TA
  - One factor

10 Fairness Factor
- 35.3% of variance on this factor accounted for by:
  - Final grade in course
  - TA SIRS
  - Number of BT attempts
  - ATA SIRS
  - Cumulative GPA
  - ACT Mathematics
  - Computer specifications
  - Incoming computer communication experience

11 Participation Factor
- 19.8% of variance on this factor accounted for by:
  - TA SIRS
  - Attendance
  - ATA SIRS
  - ACT Social Science
  - Number of BT attempts
  - Create chart
  - Incoming knowledge of computer terms
  - ACT Mathematics
  - Find / rename file
  - Path
  - TOC
  - NO course grade

12 Course Resources Factor
- 11.3% of variance on this factor accounted for by:
  - TA SIRS
  - Attendance
  - ATA SIRS
  - Extension task: backgrounds
  - Web pages in Web folder
  - Number of BT attempts
  - NO course grade

13 Lead TA SIRS
- 27.8% of variance on Lead TA SIRS accounted for by:
  - Fairness factor
  - Preparation and participation factor
  - TA experience
  - Course resources factor
  - ATA SIRS
  - Attendance
  - Private folder
  - Extension task: Excel function
  - Number of BT attempts
  - NO course grade

14 Assistant TA SIRS
- 13.4% of variance on ATA SIRS accounted for by:
  - Fairness factor
  - Preparation and participation factor
  - Student factor
  - TA SIRS
  - Course resources factor
  - Attendance
  - Path
  - TA experience
  - NO course grade

15 Technology in Instructional Design and Student Assessment
- Data-rich instructional system
  - Detailed information about each student
  - CQI for all aspects of instructional system
- Performance-based assessments
  - Labor intensive
  - Inter-rater reliability
- Analyzing student conceptual frameworks
- Intervention strategies
  - Early identification
  - Targeted for schematic structures

16 Implications
- Instructional design process can be used in any discipline
  - Accreditation Board for Engineering and Technology
  - CQI
- Distance education
  - Demonstrates instruction at needed scale
  - Online assessments
- How to provide active, constructivist learning online?

17 Questions? CSE 101 Web site


20 Instructional design detail slides

21 Design Inputs
- Literature
- CS0 learning assessment
- Client department needs
- Design team experience

23 Instructional Goals
- FITness
- Problem solving
- Transfer
- Retention
- No programming

24 Deductive Instruction (diagram: Concept → Skill 1, Skill 2, Skill 3)

25 Inductive Instruction (diagram: Skill 1, Skill 2, Skill 3 → Concept, building Schema 1, Schema 2, Schema 3)

27 Instructional Design
- 1950 students / semester
- Multiple tracks
  - Common first half
  - Diverge for focal problems
- All lab-based classes
  - 65 sections
  - No lectures
- Problem-based, collaborative learning
- Performance-based assessments

29 Incoming Students
- Undergraduates in non-technical majors
- GPA
- ACT scores
- Class standing
- Major
- Gender
- Ethnicity
- Computing experience

31 Instruction
- Classroom staff
  - Lead Teaching Assistant
  - Assistant Teaching Assistant
- Lesson plans
  - Problem-based learning
  - Series of exercises
  - Homework
- Instructional resources
  - Web, textbook

33 Assessment
- Performance-based
- Modified mastery model
- Bridge Tasks
  - Determine grade through 3.0
  - Formative
  - Summative
- Final project
  - May increase 3.0 to 3.5 or 4.0
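The grading rule above can be sketched as a small function. The function name and inputs are hypothetical, but the logic follows the slide: Bridge Tasks determine the grade only up through 3.0, and the final project can raise a 3.0 to 3.5 or 4.0.

```python
# Toy sketch (assumed details) of the slide's grading rule.
def course_grade(highest_bt_level, project_level=None):
    """highest_bt_level: best Bridge Task level passed (e.g. 1.0 .. 3.0).
    project_level: final-project result (3.5 or 4.0), or None if not done."""
    grade = min(highest_bt_level, 3.0)      # BTs cap the grade at 3.0
    if grade == 3.0 and project_level in (3.5, 4.0):
        grade = project_level               # project can raise a 3.0
    return grade

print(course_grade(3.0, 4.0))   # -> 4.0
print(course_grade(2.5, 4.0))   # -> 2.5: the project cannot rescue missed BTs
```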

34 Bridge Task Competencies in CSE 101
- 1.0: E-mail; Web; Distributed network file systems; Help
- 1.5: Bibliographic databases; Creating Web pages
- 2.0: Advanced word processing
- 2.5: Spreadsheets (functions, charts); Hardware; Software
- 3.0 Track A: Advanced Web site creation; Java applets; Object embedding
- 3.0 Track C: Advanced spreadsheets; Importing; Data analysis; Add-on tools
- 3.0 Track D: Advanced spreadsheets; Fiscal analysis; Add-on tools

36 Bridge Task Detail Drilldown 1

37 Creating Assessments
Each Bridge Task (BT) in the Bridge Task database has dimensions (M) that define the skills and concepts being evaluated. Within each dimension are some number of instances (n) of text describing tasks for that dimension. A bridge task consists of one randomly selected instance from each dimension for that bridge task.
(Diagram: Dim 1 through Dim M, each holding instances i through i+n.)
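The random-assembly scheme can be sketched directly. The dictionary layout and instance strings below are invented stand-ins for the BT database tables; the point is that drawing one instance per dimension gives every student an equivalent but distinct task.

```python
import random

# Hypothetical stand-in for the Bridge Task (BT) database:
# each dimension holds several interchangeable task instances.
bt_database = {
    "Dim 1": ["instance 1a", "instance 1b", "instance 1c"],
    "Dim 2": ["instance 2a", "instance 2b"],
    "Dim 3": ["instance 3a", "instance 3b", "instance 3c"],
}

def assemble_bridge_task(database, rng=random):
    """Randomly select one instance from each of the M dimensions."""
    return {dim: rng.choice(instances) for dim, instances in database.items()}

task = assemble_bridge_task(bt_database)
# task maps every dimension to exactly one randomly chosen instance
```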

38 Evaluation Criteria
(Diagram: each instance selected for the student's bridge task carries its own list of criteria; the student's work is evaluated against the criteria for the chosen instance in Dim 1 through Dim M, and each is marked PASS or FAIL.)

40 Bridge Task Detail Drilldown 2

41 Delivering Assessments (diagram)
- Student enters Pilot ID, PID, PW on the Web server
- The Web server submits to the Student Records Database, which creates a query requesting a new BT from the Bridge Task (BT) Database
- One instance is randomly selected from each of the M dimensions for the desired BT
- The selected instance texts (Dim 1 through Dim M) are assembled
- The Web server returns an individual student BT Web page

43 Evaluating Assessments (diagram)
- Grader queuing: the Student Records Database creates a query requesting criteria from the Bridge Task (BT) Database
- The database provides the criteria for the instances used to construct the student's BT
- An individual student BT checklist (Dim 1 criteria through Dim M criteria) is returned
- The grader evaluates each criterion PASS or FAIL against the student's bridge task
- The PASS / FAIL result for each criterion is recorded in the Student Records Database
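The mastery check behind the checklist can be sketched as follows. The criterion labels are hypothetical, and the all-criteria-must-pass rule is an assumption consistent with the modified mastery model described earlier.

```python
# Hypothetical grader checklist: criterion -> PASS (True) / FAIL (False).
checklist = {
    "Dim 1: create chart from data": True,
    "Dim 2: write Boolean search":   True,
    "Dim 3: publish to Web folder":  False,  # one FAIL ...
}

def bridge_task_passed(criteria):
    """Assumed mastery rule: the BT passes only if every criterion passed."""
    return all(criteria.values())

print(bridge_task_passed(checklist))  # -> False: ... so the whole BT fails
```

Under a mastery model, a failed attempt leads to restudy and a new, randomly assembled BT attempt rather than partial credit.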

46 Outcomes
- Student final grades
- SIRS
- TA feedback
- Oversight Committee: Associate Deans
- Client department feedback

