Building a Culture of Assessment

Presentation on theme: "Building a Culture of Assessment"— Presentation transcript:

1 Building a Culture of Assessment
Monday Afternoon (3:30) AOA Annual Conference 2017

2 Your Facilitators Debra Hammond Barnaby Peake Kingson Leung
Debra Hammond, Executive Director of the University Student Union, Cal State Northridge
Barnaby Peake, Director of the Bronco Student Center, Associated Students, Inc. – Cal Poly Pomona
Kingson Leung, Coordinator of Assessment and Special Initiatives, USU, Cal State Northridge
Barny to welcome and intro first, then Kingson to intro himself and lead in to the Pollev

3 Who’s in the room? Poll Everywhere questions
Barny: just make sure the Pollev app is on your laptop so we don’t need to click back and forth. It takes a minute to load: pollev.com/app

4

5

6 The Current Culture of Assessment
WASC requiring evidence of student learning
Intentional outcomes-based learning philosophy
CSU focus on student success
Students are more savvy “shoppers”
Associations are responding: ACUI, NASPA, AOA
KL: The question that led to the creation of the sub-committee on assessment was “How can we be better at assessment?” and the desire to know what other Auxiliaries are doing to get the data they need to demonstrate their impact on student success. How can we do it better?

7 AS/Union/Rec Sub-committee on Assessment
September 2015 AS/Union/Recreation meeting at Cal State Fullerton
Creation of a sub-committee on assessment practices
Desire to share expertise and resources, help one another, collaborate more
Opportunities for benchmarking across the CSU
Development of a survey to understand the current assessment practices of Auxiliaries
Barny to give overview and background

8 Methodology
37-question survey
CampusLabs link sent via email
Sent to all Executive Directors in the AS/Union/Rec membership of AOA
One response per Auxiliary requested
Barny to share results of the survey tool, slides 8-13

9 16 Auxiliaries Responded
Dominguez Hills, East Bay, Fullerton, Long Beach, LA, Northridge, Chico, Pomona, Sacramento, San Diego, San Francisco, San Jose, San Marcos
Barny

10 Current Assessment Practices
From the 2016 AOA survey BP to go through this section

11 Assessment Methods
93%: Surveys
71%: Focus groups
57%: Document analysis, Observations, Interviews

12 Utilization of National Surveys
36%: CAS Standards
29%: NASPA survey for Student Unions and Rec Centers (CampusLabs)
14%: EBI / Skyfactor
2 questions from the survey (captured into 1)

13 Auxiliary’s Use of Other Campus Data Sets
[Chart: 29% / 86% / 43%]

14 Auxiliary’s Use of Data Collected

15 Where Auxiliaries are Struggling
KEY FINDINGS:
Don’t have staff who can dedicate time to assessment
Lack of training to design, conduct, and analyze survey data
[Chart: 43% / 50% / 71% / 79%]

16 Addressing the lack of staff
“We Need Help”

17 Auxiliary struggles:
79% reported struggling the most with limited staff resources
50% reported struggling with the ability to analyze data
Barny
From the 2016 AOA survey

18 Assessment Takes Time and Expertise
Staff time = cost to the organization
Survey development
Promotion and administration of the instrument
Data analysis
Reporting
Implementation of findings
Reviewing and revising the assessment tool for future use
Barny to present. Can we come up with actual numbers and a cost analysis?
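One way to put a number on "staff time = cost": tally hours per task and multiply by a loaded hourly rate. A minimal sketch follows; every hour figure and the hourly rate are illustrative assumptions for discussion, not figures from the 2016 AOA survey.

```python
# Hypothetical staff-time cost of one assessment cycle.
# All task hours and the hourly rate are illustrative assumptions,
# not data from the 2016 AOA survey.

HOURLY_RATE = 40.0  # assumed fully loaded staff cost, in dollars per hour

task_hours = {
    "survey development": 20,
    "promotion and administration": 15,
    "data analysis": 25,
    "reporting": 10,
    "implementation of findings": 30,
    "review and revision of the tool": 8,
}

total_hours = sum(task_hours.values())      # 108 hours under these assumptions
total_cost = total_hours * HOURLY_RATE      # $4,320 under these assumptions

print(f"Total staff time: {total_hours} hours")
print(f"Estimated cost: ${total_cost:,.0f}")
```

Swapping in an Auxiliary's own hours and rate gives a rough baseline to weigh against the purchase price of the national instruments discussed in this session.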

19 Staffing for Assessment Work
47% have a staff member assigned in part or entirely to assessment
2 Auxiliaries have a full-time dedicated staff member for assessment
5 Auxiliaries have assessment responsibilities included in multiple job descriptions
Assessment work coordinated by each department through the Director
Some collaboration between Auxiliary departments or Student Affairs for annual surveys
Barny
From the 2016 AOA survey

20 “Status of Assessment Staff in Higher Education”
Association for the Assessment of Learning in Higher Education (AALHE) 2016 Conference
Not sure if these slides work here. The rest of their presentation wasn’t that relevant to AOA, as they are talking mostly about IR at the University level. What do you think?
KL to present these slides

21 KL

22 KL

23 KL

24 If you don’t have dedicated staff, what can you do?
1) Use existing national surveys and benchmarking tools
Tested and vetted surveys
Opportunities for benchmarking and cross-campus comparison
Technical support
Additional campus-specific questions
KL to present

25 NASPA Survey
29% of Auxiliaries have participated in this survey
Cost:
$1,500 - less than 5,000 enrollment
$2,___ - 5,000 or more enrollment
$2,250 - non-member school
Provides benchmarking information on:
Recruitment/retention impacts
Connection to the campus
Levels of involvement
Operations, services, program impacts/satisfaction
Skills gained
Learning outcomes
Offers College Union and Recreation & Wellness surveys
Administered by CampusLabs
Can be done at any time before the end of May
Can add campus-specific questions
RESOURCE:

26 NASPA Results (selected)
Impact: KL Conducted in

27 NASPA Results (selected)
Learning Outcomes: KL Conducted in

28 NASPA Results (selected)
Persistence to Graduation Conducted in KL

29 EBI / Skyfactor Considerations
14% of Auxiliaries have participated
Assessment tools for Student Union, Student Activities, and more
Survey tool vetted by ACUI; ___ man-hours to develop
Cost: $2,500-3,000 per survey
Some customization of questions possible, but limited
BP: Skyfactor invests on average ___ man-hours to develop a national benchmarked assessment. The assessments, where appropriate, are developed in partnership with professional associations. Whether developed in partnership with a professional association or not, the assessments are vetted by campus-based professionals known to have experience and expertise in their field. Once the initial version of the assessment is developed, the tool is piloted and statistically tested for factor reliability. The results are then reviewed by the Skyfactor Analytics and Research Team. Those aggregate results are again shared and reviewed with the campus-based professionals and, where appropriate, the professional association. The assessments are then annually tested for statistical reliability and face validity. So when you think of limited staff resources, part of the reason assessment may not be taking place is the sheer time it takes to develop a quality assessment tool. And an in-house assessment tool is typically not validated; in most cases there is no recognition that the assessment design has a problem until the assessment has been deployed and data has been collected. I recently had a conversation with an Ivy League institution about utilizing our housing assessment, and the person shared that she was looking at a self-designed assessment tool for their campus master plan and was not sure if the data was even usable, due to some key questions being left off and the wording of some of the questions. With a predesigned assessment tool, a campus can put its resources into action plans and focus action in ways that truly improve the student experience.
While the assessments are standardized, they can be customized by the campus by hiding areas that may not fit a particular campus, and campuses can add up to 20 institution-specific questions on campus-specific topics. RESOURCE:

30 EBI / Skyfactor Considerations
Validity:
National benchmark assessment; ___ hours to develop
Involvement of professional associations
Tested for reliability
Help, Training, Support, Ease of Use:
Consultation with Client Services
Online reports with multiple views
Can download to Excel
Readily available 1 day after assessment closes
Training webinars, blog, online resources
Custom statistical analysis report
Identify greatest impacts on student learning, satisfaction, and overall program effectiveness
These seem more like our talking points to explain the survey. Can we cover these points while on slide 29?

31 EBI / Skyfactor Considerations
Assessments offered: College Union / Student Center, Event Services, Student Activities, Student Organizations, Recreation
Learning Outcomes: Self-knowledge, Practical competencies, Principled dissent, Cognitive complexity, Personal competencies, Healthy behaviors, Enhanced relationships, Appreciation for diversity, Sense of belonging, Interpersonal competence, Collaboration, Leadership
Impacts: Customer service, Satisfaction, Effectiveness, Enriched campus life
BP

32 If you don’t have dedicated staff, what can you do?
2) Consider Graduate Assistants
Grad programs often require field work and assistantships
Assessment work is part of the curriculum
Win-win for the GA and the Auxiliary
Programs have some requirements and expectations of the host campus and supervisory staff
BP

33 List of Graduate Programs in California
Alliant International University: EdD in Educational Leadership and Management (EdD)
Azusa Pacific University: College Counseling and Student Development (MS)
California Lutheran University: College Counseling and Guidance Programs (MS)
California Polytechnic State University-San Luis Obispo: Counseling & Guidance; Higher Education Counseling/Student Affairs (MA)
California State University-Fullerton: Higher Education (MS)
California State University-Long Beach: Student Development in Higher Education (MS)
California State University-Northridge: College Counseling / Student Services (MS)
California State University-Sacramento: Higher Education Leadership (MA)
Drexel University - Sacramento Center for Graduate Studies: Master of Science in Higher Education (MS)
San Diego State University: Educational Leadership (EdD); Master of Arts in Education with a Concentration in Educational Leadership and a Specialization in Student Affairs (MA)
Stanford University: Policy, Organization and Leadership Studies, Higher Education Concentration (MA)
University of California-Los Angeles: Student Affairs (MEd); UCLA Educational Leadership Program (EdD)
University of Redlands: Doctorate in Leadership for Educational Justice (EdD); Higher Education (MA); School and College Counseling (MA)
University of San Diego: Higher Education Leadership/Student Affairs (MA); Leadership Studies, specialization in Higher Education Leadership (PhD)
University of San Francisco: Higher Education and Student Affairs (HESA) (MA)
University of Southern California: Postsecondary Administration and Student Affairs (PASA) (MEd)
University of the Pacific: MA Education with a Specialization in Student Affairs (students also qualify for a specialization in Educational & Organizational Leadership) (MA)
BP: List of Graduate Programs, 17 campuses (23 programs)

34 “We Need Training” Addressing staff training
KL to present this section

35 Auxiliary struggles:
71% reported a lack of staff training to conduct assessments
50% reported struggling with the ability to analyze data
KL
From the 2016 AOA survey

36 If you don’t have assessment expertise, what can you do?
1) Reach out to your partners for help
Faculty
Other departments (Institutional Research)
Other campuses (AOA colleagues)
KL to present. Share your work with SD State

37 If you don’t have assessment expertise, what can you do?
2) Training sessions available
National associations like ACUI, NASPA, NIRSA, ACPA
Education Week has a list of webinars
InsideHigherEd.com has a list of webinars and articles
CampusLabs
Skyfactor
Sightlines (for facilities)
KL to present. Share your work with SD State

38 If you don’t have dedicated staff, what can you do?
3) Use existing assessment services and web applications
Campus Labs: shared surveys through the online community
Survey Monkey: offers survey templates
Qualtrics
Orgsync
CAS Standards
KL to present

39 Assessment Types Being Used
Experienced campuses listed below chart:
(1) Dominguez Hills, Fullerton, Northridge, Pomona
(2) Dominguez Hills, Fullerton, Long Beach, Los Angeles, Northridge, Pomona, Sacramento, San Jose
(3) Dominguez Hills, Northridge, Pomona
(4) Dominguez Hills, Fullerton, Los Angeles, Northridge, Sacramento
(5) Northridge

40 Assessment Types Being Used (Cont.)
Experienced campuses listed below chart:
(1) Dominguez Hills, Fullerton, Northridge, Sacramento
(2) Dominguez Hills, Fullerton, Northridge, San Marcos
(3) None
(4) None
(5) San Marcos

41 Side-by-side comparison to see where we are, overall, in what we are comfortable with and what we are not

42 Paying for Assessment Tools
Tools: Qualtrics, Campus Labs, Orgsync, Survey Monkey, Self-Created
Funding source: Auxiliary, Campus, Other, Shared Cost
Some services provide consultation to assist with survey development
Templates to use
Able to view other campuses’ surveys
Pricing varies greatly across services
From the 2016 AOA survey

43 Survey Monkey: Free Version vs. Select ($26/mo.; $18/mo. Edu)
Free: Limit 10 questions per survey; limit 3 collectors per survey; limit 1,000 responses per month
Select: Page skip logic; progress bar; unlimited questions; add logo, colors, themes, custom URL, thank-you page; download results
Maybe we save this info for the end and share it if there are questions. Since we don’t have pricing on other survey tools, I don’t want us to sound like we’re pitching SurveyMonkey over the others. What do you think?

44 Survey Monkey: Gold $300/yr ($25/mo.; $204 Edu) and Platinum $1,020/yr ($85/mo.)
Unlimited responses and collectors
Add additional users to build/share/analyze
File upload capability
Question piping
Randomization options
Filter & cross-tab data
Export to SPSS
Text analysis
HIPAA-compliant features
Phone support
Remove SurveyMonkey footer
Advanced logic

45 Closing Thoughts Reviewing the findings and next steps

46 Graduation Initiative 2025

47

48 Developing your culture of assessment
Get started! It all begins with a question: what questions do you have? What questions are being asked of your Auxiliary?
Identify your resources
Report assessment results
Tell your story
Make data-informed decisions

49 Auxiliary struggles:
36% reported struggling the most because data is not shared among departments
From the 2016 AOA survey

50 Sharing Our Data
Staff meetings
Board meetings
Annual reports
Auxiliary staff
Campus partners
However… most of us are NOT sharing results with students
Celebrate results

51 “With whom are results actively shared?”
[Chart: 43% / 79% / 100%]

52 “How are results shared?”

53 What’s next from the committee…
Recommendations from the survey of AOA members
Webinars and conference sessions
Sharing best practices, survey instruments, and data
Creating resources for AOA members; make them easily accessible on the website
Resource guide for assessment practices
List of Auxiliaries and their current assessment work and level of expertise
Share about committee work. Next steps? Conference sessions, resource guide, gather and share information (online), AOA site update?

54

55

56 Questions, thoughts, comments?

