MHSA Data Collection

Kara Taguchi, Psy.D., Mental Health Clinical Program Head, MHSA Implementation and Outcomes Division. RAND/SRI Data Systems Workgroup.


- 2010: Community-driven input for evaluation and reporting requirements on PEI activities
- 2011: Narrowing of reporting requirements; first version of documentation created
- Spring: Created a model consistent across all MHSA activities
  - Ease administrative burden
  - Consistent reporting requirements

- Initially: Word documents
- Current system: Excel spreadsheets
  - Used in all MHSA components
  - Data is copied into spreadsheets per activity, then compiled into a master for demographics
- Future system:
  - Standardized satisfaction surveys
  - Master sheet for like measures (YOQ, Eyberg, etc.)
  - Expectation of validated, culturally driven tools
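The "copied into spreadsheets per activity and then compiled into a master" step can be sketched in code. This is a hypothetical illustration, not the county's actual workflow: the function name, column names, and activity names are invented, and in practice each table would be read from a per-activity Excel workbook (e.g., with pd.read_excel) rather than built inline.

```python
import pandas as pd

def compile_master(activity_sheets):
    """Stack per-activity demographic sheets into one master table.

    activity_sheets: dict mapping activity name -> DataFrame
    (illustrative stand-in for per-activity Excel spreadsheets).
    """
    frames = []
    for activity, df in activity_sheets.items():
        df = df.copy()
        df["activity"] = activity  # keep the source activity for reporting
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

# Toy stand-ins for two activity spreadsheets (invented columns/values)
outreach = pd.DataFrame({"client_id": [1, 2], "age_group": ["0-15", "26-59"]})
screening = pd.DataFrame({"client_id": [2, 3], "age_group": ["26-59", "60+"]})

master = compile_master({"outreach": outreach, "screening": screening})
print(len(master))  # 4 rows: one per client-activity record
```

The master keeps one row per client-activity record, so demographic breakdowns can still be filtered back to the originating activity.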

Process of data collection:
- Completed by provider (county or contractor)
- Sent electronically to the county liaison
- Merged into the master spreadsheet

Uses of data:
- County evaluation team
- Community planning process
- EQRO and other audits

Based on a three-quadrant model:
- What/how much do we do?
- How well do we do it?
- Is anyone better off? Did we make a difference?

Quadrant 1: Demographics and contract requirements
- Unduplicated count of clients served
- Age groups
- Ethnicity
- Primary language
- Gender
- Veteran/active military status
- Contract requirements

Quadrant 2:
- Satisfaction survey
- Completion of services

Quadrant 3:
- At least one measurable outcome
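The Quadrant 1 "unduplicated served" count can be illustrated with a short sketch. All field names and values here are hypothetical; the point is that a client with contacts in several activities is counted once, both overall and in each demographic breakdown.

```python
# Each record is one service contact; a client may appear under several activities.
# Field names and values are illustrative, not the county's actual schema.
records = [
    {"client_id": 101, "activity": "outreach",  "ethnicity": "Latino"},
    {"client_id": 101, "activity": "screening", "ethnicity": "Latino"},
    {"client_id": 102, "activity": "outreach",  "ethnicity": "White"},
    {"client_id": 103, "activity": "screening", "ethnicity": "Asian"},
]

# Unduplicated count: each client counted once regardless of number of contacts
unduplicated_served = len({r["client_id"] for r in records})

# Unduplicated breakdown by ethnicity: dedupe on client first, then tally
seen = {}
for r in records:
    seen.setdefault(r["client_id"], r["ethnicity"])
by_ethnicity = {}
for eth in seen.values():
    by_ethnicity[eth] = by_ethnicity.get(eth, 0) + 1

print(unduplicated_served)  # 3
print(by_ethnicity)         # {'Latino': 1, 'White': 1, 'Asian': 1}
```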

Quadrant 3: Is anyone better off? Did we make a difference?
- Customized to each activity
- Fidelity expected with evidence-based practices (EBPs)
- Provider holds the hard/individual data
- Provider evaluates the data and reports change scores/data

Limitations:
- Lack of funding for evaluation
- Providers' lack of research and evaluation experience
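A minimal sketch of the change-score reporting described above. The scoring convention used here (pre minus post, so a positive number means improvement on a lower-is-better measure such as the YOQ) is an assumption for illustration, not the county's actual rule, and the scores are invented.

```python
# Toy pre/post scores on a lower-is-better outcome measure (e.g., YOQ-style).
# Convention assumed for illustration: change = pre - post, positive = improvement.
scores = [
    {"client": 1, "pre": 68, "post": 50},
    {"client": 2, "pre": 45, "post": 47},
    {"client": 3, "pre": 80, "post": 62},
]

changes = [s["pre"] - s["post"] for s in scores]
avg_change = sum(changes) / len(changes)

print(changes)  # [18, -2, 18]
```

In the workflow described on the earlier slides, the provider would compute figures like these and report only the aggregate change scores, while retaining the individual-level data.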