
Collecting Longitudinal Evaluation Data in a College Setting: Strategies for Managing Mountains of Data Jennifer Ann Morrow, Ph.D. 1 Erin Mehalic Burr, M.S. 1 Marcia Cianfrani, B.S. 2 Susanne Kaesbauer 2 Margot E. Ackermann, Ph.D. 3 University of Tennessee 1 Old Dominion University 2 Homeward 3

Overview of Presentation Description of Project Writing Research Team Evaluation Methodology Data Management What worked What did not work Suggestions for Evaluators

Description of Project Writing Goal: to reduce high-risk drinking and stress in first-year college students. 231 students were randomly assigned to one of three online interventions. Online Interventions Expressive Writing Behavioral Monitoring Expressive Writing and Behavioral Monitoring Participants received payment and other incentives (e.g., gift certificates).

Research Team Jennifer Ann Morrow (project PI). Robin Lewis (project Co-PI). Margot Ackermann (evaluation team leader). Erin Burr (project manager; assisted with evaluation). Undergraduate assistants: Marcia Cianfrani, Susanne Kaesbauer, Nicholas Paulson.

Evaluation Methodology Comprehensive formative and summative evaluation. Online data collection method: Participants were e-mailed a link to the intervention/survey each week. Timing of data collection: Pretest survey, midpoint surveys (weeks 3 and 6), posttest survey. Types of data collected: Qualitative: online journals, open-ended questions. Quantitative: numerous standardized measures.

Online Data Collection We utilized Inquisite® software to collect all data. Features of software: A variety of question types available. Survey templates are available, or you can customize your own. Allows you to upload e-mail lists. Can create e-mail invitations and reminders. Enables you to track participation using authorization keys. You can download data in a variety of formats (e.g., SPSS, Excel, RTF files). Performs frequency analyses and produces graphs.
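The authorization-key tracking described above was handled inside the survey software itself; as a rough illustration of the idea, the Python sketch below (with made-up participant IDs, not the project's data) issues one key per participant and lists who still needs a reminder.

```python
import secrets

def issue_keys(participant_ids):
    """Issue one unique authorization key per participant
    (a stand-in for the survey software's built-in key tracking)."""
    return {pid: secrets.token_hex(8) for pid in participant_ids}

def pending_reminders(keys, completed_keys):
    """Participants whose keys have not come back with a completed survey."""
    done = set(completed_keys)
    return sorted(pid for pid, key in keys.items() if key not in done)

keys = issue_keys(["P001", "P002", "P003"])
# Suppose only P002's key has been used this week:
print(pending_reminders(keys, [keys["P002"]]))  # ['P001', 'P003']
```

Because each key is unique to a participant, reminder e-mails can be targeted at non-responders only, which is how automated weekly follow-up stays manageable.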

Evaluation Tools Used Project-specific e-mail account: Multiple researchers had access and were able to send, read, and reply to participants' e-mails. Allowed us to keep all correspondence with participants in one location. Project notebook: Copies of all project materials (e.g., surveys, invitations). Contained instructions on how to manage surveys and download data. Included descriptions of problems encountered and how they were addressed.

Evaluation Tools Used Weekly status reports: Project manager created weekly reports that were sent to all team members each Sunday. Contained: summary of work completed the previous week, key tasks for each member to complete that week, any issues that needed to be addressed. Reports were discussed in weekly meetings. Project data codebook: Contained complete list of variable names/labels, value labels, and syntax for creating composites/reverse scoring. Listed name and description of every dataset for the project.
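The codebook's composite-creation and reverse-scoring rules were written as SPSS syntax; the Python sketch below illustrates the same two operations with hypothetical item names and a hypothetical 1-5 Likert scale.

```python
def reverse_score(value, scale_max, scale_min=1):
    """Reverse-score a Likert item (on a 1-5 scale, a 5 becomes a 1)."""
    return scale_max + scale_min - value

def composite(responses, items, reversed_items=(), scale_max=5):
    """Mean composite across items, reverse-scoring where the codebook says so."""
    scored = [reverse_score(responses[i], scale_max) if i in reversed_items
              else responses[i] for i in items]
    return sum(scored) / len(scored)

# Hypothetical three-item stress scale; stress2 is reverse-keyed:
responses = {"stress1": 4, "stress2": 2, "stress3": 5}
score = composite(responses, ["stress1", "stress2", "stress3"],
                  reversed_items={"stress2"})
# (4 + (5 + 1 - 2) + 5) / 3 = 13 / 3
```

Writing these rules down once in the codebook, rather than re-deriving them per analyst, is what keeps composites consistent across the many datasets a longitudinal project accumulates.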

Evaluation Tools Used Data analysis plans: Created detailed analysis plans (using SPSS syntax) for each dataset. Included: data cleaning (e.g., composite creation, addressing assumptions), qualitative coding, descriptive and inferential statistics. Master participant list: Contained complete list of all participants. Included: contact information, participant ID, list of weeks participated, list of payments and incentives received.
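A master participant list of the kind described above can be thought of as one record per participant; the Python sketch below uses illustrative field names and amounts, not the project's actual data.

```python
from dataclasses import dataclass, field

@dataclass
class Participant:
    """One row of a master participant list (illustrative fields only)."""
    pid: str
    email: str
    weeks_completed: list = field(default_factory=list)
    payments: list = field(default_factory=list)  # (week, amount) pairs

    def record_week(self, week, payment=0):
        """Log a completed survey week and any payment issued for it."""
        self.weeks_completed.append(week)
        if payment:
            self.payments.append((week, payment))

    def total_paid(self):
        return sum(amount for _, amount in self.payments)

p = Participant("P001", "student@example.edu")
p.record_week(1, payment=10)   # completed the pretest
p.record_week(3, payment=10)   # completed the week-3 midpoint survey
print(p.total_paid())          # 20
```

Keeping contact details, participation history, and incentive payments in a single record avoids the reconciliation errors that arise when each is tracked in a separate spreadsheet.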

What Worked Online data collection: Enabled us to collect a large amount of data in a short period of time and with little effort. Automated system for contacting and tracking participants. We could download the data multiple times each week and in multiple formats. Project-specific e-mail: Enabled us to split the work of responding to participants among all researchers. Each researcher had access to every e-mail that was received or sent.

What Worked Project notebook: All project materials were located in one larger document. Enabled us to train new research assistants easily. Allowed us to keep track of problems that we encountered. Weekly status reports: Could track the number of person hours each week. Each team member could see what they and others were responsible for completing.

What Worked Other tools/activities that were useful: Project data codebook. Data analysis plans. Master participant list. Analysis teams. Cross-training of all researchers. Weekly meetings and specific team meetings. Student research assistants (inexpensive labor).

What Did Not Work Technology issues: We had various technical glitches with the survey software. Mac versus PC differences among research assistants. We had not involved the school's technology staff in our project until problems arose. Inflexibility of data collection schedule: We created a rigid schedule, and when problems occurred it was difficult and stressful to modify our plans.

What Did Not Work Too many "cooks": We had multiple people working on the same dataset. They did not always keep accurate track of what modifications were made. Too much data, not enough person hours: We underestimated how much time it would take to manage all of the data we collected.
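One low-cost remedy for the "too many cooks" problem is a shared change log attached to each dataset, so every modification is attributed to a team member. The sketch below is a hypothetical illustration, not something the project used.

```python
import datetime

class DatasetLog:
    """Lightweight change log for a shared dataset: who changed what, and when."""
    def __init__(self):
        self.entries = []

    def record(self, who, what):
        self.entries.append((datetime.datetime.now().isoformat(), who, what))

    def history(self):
        return [f"{when} {who}: {what}" for when, who, what in self.entries]

log = DatasetLog()
log.record("EB", "Recoded missing values in week-3 dataset")
log.record("MC", "Created stress composite")
for line in log.history():
    print(line)
```

Even a plain text file maintained this way lets the team reconstruct which version of a dataset a given analysis was run against.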

Suggestions for Evaluators Use standardized tools (e.g., project notebook, data analysis plans) to manage your data. Use online tools to recruit and manage college student participants: Project e-mail, project website, social networking sites. Involve students/interns: Offer course credit or internship hours instead of salary.

Contact Information If you would like more information regarding this project, please contact: Jennifer Ann Morrow, Ph.D. Asst. Professor in Assessment and Evaluation Dept. of Educational Psychology and Counseling University of Tennessee Knoxville, TN Office Phone: (865)