Assessing Undergraduate Sustainability Knowledge Campus Wide: Adam Zwickle - OSU, Tomas Koontz - OSU, Andrew Bodine - OSU, Mark Stewart - UMD, Nicole Horvath - UMD

Environmental and Social Sustainability Lab
From design to implementation and analysis

Overview
- How we developed our Assessment of Sustainability Knowledge (ASK)
- Why an ASK is important and how it can help
- An aside on Knowledge and Literacy
- Conducting an assessment
- Thinking long term...

Developing an assessment
- Built upon the "triple bottom line" (the "three-legged stool," the "3 Ps"):
  - Environmental (planet)
  - Economic (prosperity)
  - Social (people)

Developing an assessment
- Replicated questions used in the past (Coyle, "Environmental Literacy in America")
- Solicited topics and questions from experts
- Held expert focus groups
- Pilot tested among professors, graduate students, and undergraduate students
- Narrowed down to 30 questions

Developing an assessment
- Distributed those 30 questions to OSU students
- Used item response theory (IRT) to throw out 14
- Added UMD's 16 questions
- Distributed those to OSU and UMD students
- Used IRT to throw out 2 more
- The current ASK has 28 items: ess.osu.edu
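The IRT screening described above can be approximated with classical item analysis. The sketch below is not the authors' actual IRT procedure; it uses item difficulty (proportion correct) and item-rest discrimination as simple stand-ins for IRT parameters, on a tiny synthetic response matrix:

```python
# Simplified classical item analysis, a stand-in for the IRT screening
# described in the slides. Rows = students, columns = items (1 = correct).
# Items whose scores barely correlate with the rest of the test are
# flagged as candidates for removal.

def item_stats(responses):
    """Return (difficulty, discrimination) per item.
    difficulty = proportion correct; discrimination = correlation between
    the item score and the rest-of-test total."""
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    stats = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        rest = [totals[i] - item[i] for i in range(n_students)]
        stats.append((sum(item) / n_students, _corr(item, rest)))
    return stats

def _corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def flag_weak_items(responses, min_discrimination=0.2):
    """Indices of items whose discrimination falls below the cutoff."""
    return [j for j, (p, d) in enumerate(item_stats(responses))
            if d < min_discrimination]

# Synthetic data: items 0 and 1 track overall ability; item 2 is noise.
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [1, 0, 0],
    [0, 0, 1],
    [0, 0, 0],
]
weak = flag_weak_items(responses)
```

A real screening would fit an IRT model (e.g., 2PL) with specialized software; this classical version is just the cheapest way to see the same idea at work.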

Conceptualizing sustainability knowledge
[Venn diagram: Sustainability at the intersection of Social, Economic, and Environmental]
Sample items:
- Which of the following is the most commonly used definition of sustainable development? Meeting the needs of the present without compromising the ability of future generations to meet their own needs.
- What is the most common cause of pollution of streams and rivers? Surface water running off yards, city streets, paved lots, and farm fields.
- Many economists argue that electricity prices in the U.S. are too low because... they do not reflect the costs of pollution from generating the electricity.

How is this helpful?

Need for measuring knowledge
- University goals tend to be more along the lines of "Become carbon neutral by 2050" and less commonly "Create sustainably minded citizens of tomorrow"
- An assessment can help track improvement over time

Need for measuring knowledge
- Serves as an evaluation for specific academic efforts: interdisciplinary programs, sustainability majors/minors
- STARS credit: ER 6 (STARS 2.0), 3 points available

"Knowledge" vs. "Literacy"
- Knowledge
  - Can be objectively measured
  - Can be used to evaluate academic programs
- Literacy ≠ Knowledge
  - Literacy = knowledge + values, attitudes, and behaviors
  - Can be used to evaluate outreach efforts and sustainability campaigns

Measuring Behaviors, Values, and Attitudes
- Make sure you are asking the right questions
- Collaborate with sustainability departments to target specific behaviors (e.g., leaving lights on)
- Include questions on behavioral barriers
- Include other behaviors that may get at the same concept

Measuring Behaviors, Values, and Attitudes
- Collaborate with academic departments to develop a good survey design
- No need to reinvent the wheel: each survey could be a Master's thesis

Conducting an Assessment

Conducting an Assessment
- Find your partners!
- IRB approval: required for publication (exempt status)
- Registrar approval*: students' emails, majors, and demographics
- Survey software*: we use Qualtrics, but there are others
*If a large-scale assessment is planned

Maximizing Response Rates
- Maximizing response rates is important to reduce uncertainty about how well your completed sample matches the population of interest
- Dillman (2008) and others have long studied how to maximize response rates for telephone, mailed, and in-person surveys
- There is growing research on electronic survey response rates

Research Methods
- We compared response rates across five survey implementations:
  1. Spring 2012, OSU (n=10,000)
  2. Spring 2013, OSU sample A (n=10,000)
  3. Spring 2013, OSU sample B (n=10,000)
  4. Spring 2013, OSU School of ENR (n=538)
  5. Spring 2013, UMD (n=10,000)
- We tried different treatments and tracked responses with survey software (SurveyMonkey and Qualtrics)

Key Variables Affecting Response Rates among College Students
- Timing: when to send the invitation and reminders
- Incentives
- Email text: who it is from
- Questionnaire format: long list of questions vs. more page clicks

Timing Matters
- Time of semester: the last week of the semester and into finals week vs. the middle of the semester
- Time of day: 6:00 am vs. 6:00 pm
- Reminders are critical: a big spike in completed surveys after each reminder, with a fast decay

Timing Matters
- Overall, you want students sitting at their computers... but wanting to be distracted

Incentives
- There is some disagreement in the research on the best way to provide incentives:
  - Ahead of time (Dillman 2008)
  - Randomly select the winner of one big prize
  - Give more/all respondents smaller prizes
- Incentives may impact the validity of the data

Survey Responses
[Chart: completed surveys by day; UMD n=1,556, OSU n=2,621]

Email text
- How the invitation emails are phrased affects response rates
- Besides making the invitation personal, clear, and as short as possible, prior research has found that who the invitation comes from matters
- An appeal from a trusted authority increases response rates

Email text: Appeal from Authority
- We compared an appeal from authority with an appeal from a peer (student)
- Two surveys, A and B, had an appeal from a higher authority (a University VP) for the first 3 contacts
- For the 4th contact, A kept the higher authority, but B switched to an appeal from a grad student

Email text: Appeal from Authority
- Results: A gained an additional 538 responses; B gained an additional 348 over the next 65 hours
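Whether a gap like 538 vs. 348 responses is statistically meaningful can be checked with a two-proportion z-test. The counts come from the slide, but the denominators (10,000 invitations per arm) are an assumption for illustration, so treat this as a sketch rather than the study's actual analysis:

```python
# Two-proportion z-test sketch for comparing two email treatments.
# Counts (538 vs. 348) are from the slide; the 10,000-invitation
# denominators are ASSUMED for illustration only.
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, pval

z, p = two_prop_z(538, 10_000, 348, 10_000)
```

With sample sizes this large, even a two-percentage-point difference in response rate is far outside what chance alone would produce.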

Questionnaire Format
- Trade-off: a long list of questions to scroll down vs. shorter lists with a page click to get to the next page
- We analyzed respondent drop-out spots
- The highest drop-out spots were just after clicking to the next page
- We recommend finding a balance...
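The drop-out analysis above amounts to tallying where partial respondents stopped. A minimal sketch, on synthetic data with a hypothetical three-page layout (none of these numbers come from the study):

```python
# Sketch: locating drop-out spots in a paged questionnaire.
# `last_answered` holds, for each partial respondent, the index of the
# last question they completed; `page_ends` is the last question on each
# page (hypothetical 5-questions-per-page layout). A spike at a page end
# means respondents quit right after clicking to the next page.
from collections import Counter

def dropout_spots(last_answered, page_ends):
    """Count partial responses that ended exactly at each page boundary."""
    counts = Counter(last_answered)
    return {q: counts[q] for q in page_ends}

last_answered = [5, 10, 10, 10, 7, 15, 10, 5, 10]  # synthetic partials
page_ends = [5, 10, 15]                            # pages: Q1-5, Q6-10, Q11-15
spots = dropout_spots(last_answered, page_ends)
```

In this toy data most partials end at question 10, i.e., after the click from page 2 to page 3, which is the pattern the slide describes.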

Non-Respondent Bias
- Even after efforts to maximize response rates, are the non-respondents different from the respondents?
- We conducted a short non-respondent survey (5 questions plus some demographics)
- Results indicate that non-respondents are slightly but significantly less knowledgeable about sustainability topics, with no difference in GPA or pro-environmental behaviors
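A comparison like the one above (respondent vs. non-respondent knowledge scores) is commonly run as a two-sample t-test. The sketch below uses Welch's t-test on synthetic scores; it is not the study's data or necessarily its exact test:

```python
# Welch's t-test sketch for comparing respondents vs. non-respondents on
# a knowledge score. All scores here are SYNTHETIC illustration data.
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """t statistic and Welch-Satterthwaite degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)   # sample variances (n-1 denominator)
    se2 = va / na + vb / nb
    t = (mean(a) - mean(b)) / sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

respondents = [22, 24, 25, 21, 23, 26, 24, 22]     # synthetic ASK scores
nonrespondents = [20, 21, 19, 22, 20, 18, 21, 19]
t, df = welch_t(respondents, nonrespondents)
```

Welch's version is the safer default here because there is no reason to assume the two groups have equal variances.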

Planning ahead...
- For longitudinal studies:
  - Write a multiple-year IRB protocol
  - Let the registrar know this is a yearly survey
- Finding partners:
  - Sustainability office
  - Academic departments with survey expertise (interdisciplinary social science, communications, environmental studies, sociology, political science, psychology)

Acknowledgements
Funded by:
- Office of Energy Services and Sustainability
- OSU's School of Environment & Natural Resources

Thank You!
- Environmental and Social Sustainability Lab: ess.osu.edu
- Contains: this presentation, the 28-question ASK, and a forthcoming article
- Questions?