FAEIS Project User Opinion Survey 2005. Thursday, June 23, 2005, Washington, D.C. H. Dean Sutphin, Professor, Agriculture & Extension Education, Virginia Tech; Yasamin Miller, Director, SRI, Cornell University.

Presentation transcript:

Food and Agricultural Education Information System (FAEIS) Project User Opinion Survey 2005
Thursday, June 23, 2005, Washington, D.C.
H. Dean Sutphin, Professor, Agriculture & Extension Education, Virginia Tech
Yasamin Miller, Director, SRI, Cornell University

FAEIS Project User Opinion Survey 2005: Purpose
To understand FAEIS users' experiences with the system and its usefulness
To gather input from FAEIS users in higher education on improvements and changes needed in the future

FAEIS Project User Opinion Survey 2005: Objectives
1. Solicit input about the user friendliness of FAEIS
2. Determine how well FAEIS is meeting its goal of supporting higher education in the food, agriculture, and natural resources sciences
3. Determine how well FAEIS is meeting its goal of supporting the national network of universities
4. Determine a comprehensive view of current and future use of FAEIS, as reported by experts in the field

FAEIS Project User Opinion Survey 2005: Objectives
5. Determine the effectiveness of FAEIS and its future direction in terms of context, input, process, and product
5.1 CONTEXT: Includes the elements of the university system in which FAEIS operates, including alignment with the needs of the national network of universities, the reliability of FAEIS, and the interactions that university personnel have with FAEIS

FAEIS Project User Opinion Survey 2005: Objectives
5.2 INPUT: Asks the user to consider the human, fiscal, and other resources required for FAEIS to operate within the university system, such as the commitment of staff time and variables related to data entry

FAEIS Project User Opinion Survey 2005: Objectives
5.3 PROCESS: The user is asked to evaluate process variables (e.g., timing of data entry and help desk activity) required for development, maintenance, and continuous improvement

FAEIS Project User Opinion Survey 2005: Objectives
5.4 PRODUCT: Evaluates characteristics of project outcomes and uses of the system, such as the quality of the report generators, the reliability of system components, and the need for further refinements

FAEIS Project User Opinion Survey 2005: Methodology
Survey research using case study methodology
Administer the survey instrument to the population of the FAEIS user group
Focus group using an expert panel

FAEIS Project User Opinion Survey 2005: Methodology
Structure of the evaluation distributed to the FAEIS panel of experts in Fall 2004 for review and reaction
Review and refinement of the structure based on input
Decision to delay the evaluation until Spring 2005
The survey instrument was developed by SRI in collaboration with the FAEIS central project staff and the USDA Higher Education administration responsible for the project

FAEIS Project User Opinion Survey 2005: Population and Sample
1862 Land-Grant
1890 Land-Grant
1994 Land-Grant: 26
Non-Land-Grant: 419
Other (US, USDA, Priv, Org): 57

FAEIS Project User Opinion Survey 2005: Instrument Development
1. Outlining overall survey objectives
2. Development of survey dimensions
3. Development of items within dimensions
4. Internal testing: administration of the survey to individuals on the FAEIS administrative team
5. Revision of the instrument based upon internal testing

FAEIS Project User Opinion Survey 2005: Instrument Development
Pilot test, via web, N = 35 individuals (10 pre-selected due to qualifications, 25 randomly selected)
Personalized invitations sent with unique URL links on 5/26/05
Reminder sent to non-respondents on 5/31/05
Total: 17/35 completions; response rate: 48.6%
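The pilot response rate quoted above is simple arithmetic: completions divided by invitations. A minimal sketch, not taken from the project's own tooling, that reproduces the 48.6% figure:

```python
# Minimal sketch (not the project's actual code): response-rate arithmetic
# for the pilot described above (35 invitations, 17 completions).
def response_rate(completions: int, invited: int) -> float:
    """Response rate as a percentage of those invited."""
    return 100.0 * completions / invited

print(f"Pilot response rate: {response_rate(17, 35):.1f}%")  # 48.6%
```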

FAEIS Project User Opinion Survey 2005: Instrument Development
Revision of the questionnaire based upon pilot results
Panel of experts used to determine instrument validity
Variance in responses
Not all questions applicable to all respondent types
Additional response categories added; a few minor changes made
Conclusion: the revised instrument was valid and reliable and able to capture the desired information

FAEIS Project User Opinion Survey 2005: Data Collection
Launch of the tested instrument
Personalized invitations with unique URL links sent on June 6, 2005
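The slides do not describe how the unique URL links were produced. The following is a hypothetical sketch of one way to pair each invitee with a one-off survey link; the base URL, addresses, and token scheme are illustrative assumptions, not the SRI implementation.

```python
# Hypothetical sketch: one unique survey link per invitee.
# BASE_URL and the token scheme are assumptions, not the real FAEIS/SRI setup.
import csv
import uuid

BASE_URL = "https://survey.example.edu/faeis2005"  # placeholder URL

def build_invitations(emails):
    """Pair each invitee e-mail address with a unique, hard-to-guess survey URL."""
    return [(email, f"{BASE_URL}?token={uuid.uuid4().hex}") for email in emails]

if __name__ == "__main__":
    invitees = ["user1@example.edu", "user2@example.edu"]  # illustrative addresses
    with open("invitations.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["email", "survey_url"])
        writer.writerows(build_invitations(invitees))
```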

FAEIS Project User Opinion Survey 2005: Response Rate
[Table: Number of responses by institution type as of June 17, with totals, completions, refusals, and response rates for 1862 Land-Grant, 1890 Land-Grant, 1994 Land-Grant, Non-Land-Grant, Other (US, USDA, Priv, Org), and the overall total.]

FAEIS Project User Opinion Survey 2005: Follow-Up of Non-Respondents
Send up to five reminders to non-respondents
Identify institution types with low response rates and follow up with reminder phone calls
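A sketch of the bookkeeping this follow-up implies, assuming a simple list of invitees with institution types and a set of completed IDs; the field names and the 25% threshold are assumptions for illustration only.

```python
# Illustrative sketch only: flag non-respondents for reminders and list
# institution types whose completion rate falls below an assumed threshold.
def nonrespondents(invitees, completed_ids):
    """Invitees who have not yet completed the survey."""
    return [p for p in invitees if p["id"] not in completed_ids]

def low_response_types(invitees, completed_ids, threshold=0.25):
    """Institution types whose completion rate is below the threshold."""
    totals, done = {}, {}
    for p in invitees:
        t = p["institution_type"]
        totals[t] = totals.get(t, 0) + 1
        if p["id"] in completed_ids:
            done[t] = done.get(t, 0) + 1
    return [t for t, n in totals.items() if done.get(t, 0) / n < threshold]

invitees = [
    {"id": 1, "institution_type": "1862 Land-Grant"},
    {"id": 2, "institution_type": "Non-Land-Grant"},
    {"id": 3, "institution_type": "Non-Land-Grant"},
]
completed = {1}
print(nonrespondents(invitees, completed))      # ids 2 and 3 get a reminder
print(low_response_types(invitees, completed))  # ['Non-Land-Grant']
```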

FAEIS Project User Opinion Survey 2005: Control for Non-Respondent Error
Comparison of non-respondent data to respondent data
Determine reasons for lack of participation in the survey
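One conventional way to make the respondent versus non-respondent comparison, sketched here under the assumption that institution type is the dimension of interest and that scipy is available; the counts are invented for illustration.

```python
# Sketch: chi-square test of whether respondents and non-respondents have a
# similar institution-type mix. Counts below are invented for illustration.
from scipy.stats import chi2_contingency

# rows: respondents, non-respondents
# columns: 1862 LG, 1890 LG, 1994 LG, Non-LG, Other (illustrative counts)
counts = [
    [40, 10, 5, 120, 20],
    [60, 15, 20, 300, 35],
]
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
# A small p-value would indicate the two groups differ by institution type,
# i.e. potential non-response bias on that dimension.
```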

FAEIS Project User Opinion Survey 2005: In-Depth Analysis and Probing
Use the FAEIS expert panel and focus group
Data transcribed and reported internally
Anonymity of responses

FAEIS Project User Opinion Survey 2005: Final Report
Analyses to portray variables and comparisons on dimensions of particular interest
Conclusions
Recommendations
Executive Summary
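A minimal sketch of one such comparison for the report stage, assuming the responses are exported to a CSV with an institution-type column and a numeric usefulness rating; the file name and column names are hypothetical.

```python
# Hypothetical sketch: mean usefulness rating by institution type, one of the
# comparisons on dimensions of interest. File and column names are assumed.
import pandas as pd

df = pd.read_csv("faeis_user_survey_2005.csv")  # hypothetical export

summary = (
    df.groupby("institution_type")["usefulness_rating"]
      .agg(["count", "mean"])
      .sort_values("mean", ascending=False)
)
print(summary)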

FAEIS Project User Opinion Survey 2005: Preliminary Findings
Who are the respondents? 352 respondents out of 1,416 as of June
% use the FAEIS system and enter data
4.4% do not enter data, but use the system
13.2% do not use the system at all (of these, significantly more NLG (20.2%) and Other (62.5%) respondents are non-users)
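The breakdown above amounts to a cross-tabulation of usage status, overall and by institution type. A sketch assuming the same hypothetical CSV export and column names as before:

```python
# Sketch only: share of respondents in each usage category, overall and by
# institution type. Column names and the export file are assumptions.
import pandas as pd

df = pd.read_csv("faeis_user_survey_2005.csv")  # hypothetical export

overall = df["usage_status"].value_counts(normalize=True) * 100
by_type = pd.crosstab(df["institution_type"], df["usage_status"], normalize="index") * 100

print(overall.round(1))
print(by_type.round(1))
```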

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Context
Q: Can you obtain student placement data? (Significantly more NLG respondents (30%) cannot get student placement data.)
Q: Are we collecting data from the institutions which you consider your peers?
Q: Do a sufficient number of institutions report data to allow for meaningful and helpful analysis?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Input
Q: How would you rate your ability to obtain the following data in the format required by FAEIS?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Process
Q: Are the instructions in the FAEIS system clear?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Process (continued)
Q: Is the support offered by the Help Desk adequate?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Product
Q: How useful is the information collected in FAEIS?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Product (continued)
Q: To what extent does FAEIS add value to other information sources that are available?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Product (continued)
% do not use the Custom Report Builder

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Product (continued)
Q: Who are the users of FAEIS data at your institution?

FAEIS Project User Opinion Survey 2005: Preliminary Findings, Product (continued)
Q: In general, to what extent is it useful for your institution to participate in the FAEIS database?

FAEIS Project User Opinion Survey 2005: Preliminary Thoughts on Conclusions and Recommendations
Further analyses are needed to explain the high rate of "don't know" responses. Is this response isolated to a particular population? How can we correct this lack of information?
Preliminary data show a positive view of FAEIS on most variables
Need to determine why it is more difficult for non-land-grant institutions to access data
The Custom Report Builder is in the early stages of release, and current data may not reflect its full potential

FAEIS Project User Opinion Survey 2005: Preliminary Thoughts on Conclusions and Recommendations
FAEIS appears satisfactory in terms of the comprehensive elements of context, input, process, and product
Respondents were not forthcoming with significant new and innovative recommendations for future development
FAEIS appears to be meeting its goals in terms of user friendliness and meeting the needs of the university community

FAEIS Project User Opinion Survey 2005: Final Report Format
Are there recommendations for the format, style, or types of analyses?
Are there recommendations on developing the web report or the written report?
Any suggestions on dissemination of findings?

FAEIS Project User Opinion Survey 2005: Questions
Your thoughts and opinions?

FAEIS Project User Opinion Survey 2005 Thank you.