Comparing Generic Student Learning Outcomes among Fresh Graduates in the Workplace

Presentation transcript:

Comparing Generic Student Learning Outcomes among Fresh Graduates in the Workplace: Focusing on the Faculty of Medicine's Graduates

This report includes relevant sections of a research report submitted to the Senate Committee on Teaching & Learning for all of CUHK by Hazlett, Clarke; Cheng Chun Yiu, Jack; Cheung Wai Hung, Gordon; Kwong Kai Sun, Sunny. It also includes further data from a 2nd, similar survey completed only by Medicine, i.e., medical alumni from its new curriculum. As the surveys were done within one year of graduation, the medical alumni were interns.

CUHK's Objectives
- Determine whether recent (within 1 yr of graduation) CUHK alumni concluded that their major undergraduate programme had developed their generic skills so that they could effectively meet job requirements in their place of employment
- Determine whether these opinions of the alumni were consistent with those of:
  - students enrolled in all CUHK programmes during the 1st & last yrs of the programme (as assessed by the Student Engagement Questionnaire)
  - Hong Kong employers (as assessed by the Hong Kong Education and Manpower Bureau)

Faculty of Medicine's Added Objective
- Determine whether alumni from the old and new curricula differed in reporting how well their programme had developed these generic skills

Methods: Summary
- Surveyed all alumni graduating in 2004/05 (N = 2,646) and then repeated the survey on only medical graduates in 2005/06 (N = 142)
- Focus of survey instrument: "During my employment since graduation, the academic programme that I took in CUHK has enabled me to... e.g., ...speak more effectively in Cantonese when communicating with clients and colleagues."
- Instrument and survey method pilot tested for content & face validity
- Instrument's construct validity established (construct validity = 0.91)
- Generalizability of results estimated in the initial survey (no evidence found that non-respondents held any different opinions)
- Methodology details and relevant analyses available

Results

Response Rates
- Overall response rate (CUHK) in 1st survey: 1,356 (51%)
- Faculty of Medicine response rate in 1st survey: 78 (54%)
- Faculty of Medicine response rate in 2nd survey: 73 (51%)

Construct Validity Estimate
Results of the validation studies were consistent with (and somewhat better than) the findings when the survey instrument was first pilot tested. The instrument met (indeed exceeded) acceptable standards for construct validity:
- an oblique factor structure with nine correlated factors was readily interpretable
- the 9 factors corresponded to the a priori defined constructs, i.e., those which the SEQ & EMB surveys were assumed to have in common
- the solution met the criteria for a simple factor structure
- the nine factors represented over a 53% reduction from the number of items used within the instrument but still accounted for > 84% of the total variance (approximating a construct validity coefficient of 0.91)
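As a hedged illustration of how such a nine-factor oblique solution might be reproduced, the sketch below applies Python's factor_analyzer package to item-level responses. The file name, column layout, and choice of library are assumptions; the slides do not say what software was actually used.

```python
# Illustrative sketch only: the slides do not specify the analysis software.
# Assumes item-level survey responses (one column per questionnaire item)
# sit in a CSV file; the file name and columns are hypothetical.
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("fresh_graduates_items.csv")  # hypothetical path

# An oblique rotation ("oblimin") lets the nine factors correlate,
# matching the "nine correlated factors" structure reported above.
fa = FactorAnalyzer(n_factors=9, rotation="oblimin")
fa.fit(items)

# Inspect loadings for a simple structure (each item loading mainly on one factor).
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))

# Cumulative variance explained by the nine factors; the slide reports > 84%.
_, _, cumulative = fa.get_factor_variance()
print(f"Variance explained by 9 factors: {cumulative[-1]:.0%}")
```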

Generalizability
Comparisons between those who responded only after 1 to 3 follow-up reminders and those who needed no reminders were used to estimate whether the collected data were plausibly representative, i.e., to establish whether it was clearly inadvisable to generalize the findings to those who never responded to the survey:
- no differences in opinions were detected between those who responded to the 1st contact and those who were contacted a 2nd or 3rd time
- although there is no evidence to indicate that one cannot generalize the results to non-respondents, it remains possible that those who did not answer were systematically different
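The wave comparison itself can be done with a simple two-sample test per competency rating. The following is a minimal sketch assuming a contact_wave column records how many mailings a respondent needed; all column names, and the choice of test, are assumptions rather than the study's own procedure.

```python
# Sketch of the early- vs late-responder check described above; column
# names and the choice of test are assumptions, not the study's own code.
import pandas as pd
from scipy import stats

df = pd.read_csv("fresh_graduates_survey.csv")  # hypothetical path
early = df[df["contact_wave"] == 1]   # responded to the first contact
late = df[df["contact_wave"] > 1]     # responded only after reminders

for item in [c for c in df.columns if c.startswith("competency_")]:
    # Mann-Whitney U is a reasonable default for ordinal 1-5 ratings.
    stat, p = stats.mannwhitneyu(early[item].dropna(), late[item].dropna())
    verdict = "differs" if p < 0.05 else "no detectable difference"
    print(f"{item}: p = {p:.3f} ({verdict})")
```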

Faculty of Medicine's Within-Programme Comparisons, with reference to CUHK's average for all respondents (i.e., including those who did and did not require the skill in their job)

CUHK & Nursing Alumni: Comparison of the Level to Which Their Programmes Developed Generic Competencies

CUHK & Pharmacy Alumni: Comparison of the Level to Which Their Programmes Developed Generic Competencies

CUHK & Medical Alumni: Comparison of the Level to Which Their Programme Developed Generic Competencies

CUHK & Medical Alumni: Comparison of the Level to Which Their Programmes Developed Generic Competencies (old vs new curriculum)
* statistically significant at 0.05 (2-tailed) between old & new curriculum cohorts
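For the starred comparison, all that is claimed is a two-tailed test at 0.05 between the two cohorts. The sketch below assumes an independent-samples t-test and hypothetical column names, since the slides do not name the test used.

```python
# Sketch of the old- vs new-curriculum comparison; the slides report a
# two-tailed test at 0.05 but not which test, so a t-test is assumed here.
import pandas as pd
from scipy import stats

df = pd.read_csv("medical_alumni_surveys.csv")  # hypothetical path
old = df[df["curriculum"] == "old"]["computer_literacy"].dropna()
new = df[df["curriculum"] == "new"]["computer_literacy"].dropna()

# "Use of computers" was the one competency with a significant difference.
t, p = stats.ttest_ind(new, old)
print(f"t = {t:.2f}, two-tailed p = {p:.3f}, significant at 0.05: {p < 0.05}")
```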

Summary
- Expected differences among CUHK Faculties & programmes were often found, adding evidence for the validity of the measured outcomes
- These results imply that caution is needed when attempting to make any direct comparisons among Faculties or among major programmes
- To evaluate the relative differences, note that on a scale from 1 to 5 the absolute differences among Faculties & programmes were not large; thus few differences are educationally meaningful
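One way to make "educationally meaningful" concrete is a standardized effect size. The sketch below computes Cohen's d on toy 1-to-5 ratings; the toy data and the 0.5 "medium" cut-off are common conventions assumed for illustration, not anything the slides specify.

```python
# Sketch: standardized effect size for a Faculty-vs-CUHK gap on a 1-5 scale.
# The toy data and the 0.5 threshold are illustrative assumptions.
import numpy as np

def cohens_d(a, b):
    """Difference in means divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

faculty = np.array([3.8, 4.1, 3.9, 4.0, 3.7])       # toy ratings
cuhk_avg = np.array([3.6, 3.9, 3.5, 3.8, 3.7, 4.0])  # toy ratings
d = cohens_d(faculty, cuhk_avg)
print(f"Cohen's d = {d:.2f} ({'medium or larger' if abs(d) >= 0.5 else 'small'})")
```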

Comparative Findings for Faculties & Programmes
- Variations between programmes were more pronounced than those among Faculties
- Caution should be exercised, as smaller programmes with fewer respondents can more easily exhibit wider variations than larger programmes with a larger number of respondents

Conclusions
- Recent alumni from the 3 programmes in the Faculty of Medicine are quite similar to other CUHK alumni in how they report that their respective programmes have developed the graduates' generic competencies
- The medical programme's alumni who graduated under the new curriculum reported one statistically significant improvement in a competency (use of computers), and the other 8 generic competencies were directionally better than the parallel measures reported by alumni from the old curriculum

Summary
Although the reported results are based on a survey instrument that met acceptable standards for construct validity, there were reasonable consistencies between CUHK's SEQ and this Fresh Graduates Survey's measures only for the following constructs:
a. Language proficiency
b. Computer literacy
c. Problem-solving ability
Thus, comparative alumni ratings are likely to be most accurately informative for these 3 competencies.
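A hedged sketch of how cross-instrument consistency of this kind could be checked: correlate programme-level mean scores for a shared construct across the SEQ and the Fresh Graduates Survey. The report's actual procedure is not described on the slides, so the programme-level approach and all names here are assumptions.

```python
# Sketch only: the report's actual consistency analysis is not described.
# Assumes per-programme mean scores on a shared construct from each instrument.
import numpy as np
from scipy import stats

# Toy programme-level means for one construct (e.g., computer literacy)
seq_means = np.array([3.5, 3.8, 3.2, 4.0, 3.6])   # Student Engagement Questionnaire
fgs_means = np.array([3.4, 3.9, 3.3, 3.8, 3.7])   # Fresh Graduates Survey

r, p = stats.pearsonr(seq_means, fgs_means)
print(f"Cross-instrument correlation: r = {r:.2f} (p = {p:.3f})")
```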

Notwithstanding the Afore-Noted Cautions
- Most constructs measured in this survey are generic to all programmes; thus, most should be included in the student learning outcomes of any major programme or Faculty
- These generic skills are those that both employers and the Education and Manpower Bureau regard as relevant skills for any university graduate entering the Hong Kong workforce
- These skills are those which CUHK has operationally defined as appropriate for its students to acquire while they pursue a chosen discipline-based or profession-based education
- The investigators hope that this feedback from new alumni will prove useful to the Faculty in its continuing efforts to enhance its curricula, instructional designs, learning activities and student assessments in order to achieve the desired student learning outcomes