FORUM GUIDE TO SUPPORTING DATA ACCESS FOR RESEARCHERS A STATE EDUCATION AGENCY PERSPECTIVE Kathy Gosa, Kansas State Department of Education

Introduction  Data use should drive data collection  The “research community” is an important user of education data  Developing mutually beneficial relationships between education agencies and the research community makes sense

Data Partnerships  Data are an integral component of our education system  Most SEAs view responding to requests for data as a major responsibility to their stakeholders and a wise investment in education  Partnerships with researchers can lead to numerous tangible benefits to education agencies, such as:  Encouraging research projects that reflect an education agency’s information needs and priorities  Supporting data-driven decisionmaking by educators and policymakers  Providing an SEA with access to additional research, statistical, and program expertise  Building the research skills of SEA staff who will work with members of the research community while reviewing and servicing data requests

Foundations for Data Sharing
 Data governance
 Data sharing infrastructure
 Priorities and goals of the agency
 Recognize the numerous dimensions of a research proposal:
  Which data are being requested, and for what purpose?
  Are the requested data appropriate to address the research question(s)?
  Which data are actually available from the agency?
  Which data can be shared?
  Which data need to be masked, de-identified, or otherwise altered to protect individual privacy?

Challenges to Sharing Data
 Supplying data to the research community comes at a cost to the SEA
  Responding to the growing volume of data requests can become a full-time job for one or more staff members
 There are also practical concerns about sharing education data:
  Confidentiality and security issues
  Data ownership conflicts
  Concern regarding the potential for misusing or misinterpreting data
 Resource allocation
  Staff time required to establish an infrastructure, implement core data sharing practices, and manage and monitor requests can create a significant resource burden for an SEA
 Data limitations
  Data collected by SEAs are intended for specific purposes and may not necessarily meet the precise needs of research projects

Effectively Managing Data Requests (Chapter 2: Core Practices)
1. Help Researchers Understand Agency Data and the Data Request Process
2. Create Effective Data Request Forms for Researchers
3. Review Data Requests Strategically
4. Manage the Data Request Process Efficiently
5. Release Data Appropriately
6. Monitor Data Use

Core Practice 1: Help Researchers Understand Agency Data and the Data Request Process
 Develop training materials about an agency’s use of data terms, definitions, coding instructions, and metadata
 Disseminate accurate information about relevant data sharing procedures and request forms
 Provide a detailed description of all expectations to ensure that researchers are aware of their responsibilities for complying with policies for protecting, managing, and using data
 Establish training topics, such as:
  SEA research priorities
  Data governance and privacy policies
  Data sources
  Metadata
  Data management expectations
  Ethical and legal responsibilities
  Communications responsibilities
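
The metadata item above is easiest to see with an example. Below is a minimal sketch of a single data-dictionary entry an SEA might publish for researchers; the element name, codes, and notes are hypothetical, not drawn from any actual agency catalog.

```python
# Minimal sketch of a data-dictionary entry an SEA might publish for
# researchers. The element name, codes, and notes are hypothetical.
exit_code_element = {
    "element": "student_exit_code",
    "definition": "Reason a student left the reporting school",
    "collection": "End-of-year enrollment file",
    "valid_codes": {
        "01": "Transferred within the state",
        "02": "Transferred out of state",
        "03": "Graduated with a regular diploma",
    },
    "first_year_collected": 2006,
    "notes": "Code definitions changed in 2006; see the exit code taxonomy.",
}

for code, meaning in exit_code_element["valid_codes"].items():
    print(code, "=", meaning)
```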

Core Practice 2: Create Effective Data Request Forms for Researchers
 Creating standardized forms will streamline the request and evaluation process:
  Preliminary Research/Data Request Form
  Full Research/Data Request Form
  Data sharing agreements
  Agreement modification forms
  Personal access agreement
  Certification of data destruction
 Standardized forms should:
  Help researchers accurately identify the data they are requesting
  Concisely, yet comprehensively, describe the proposed research plan
  Accurately capture all information needed by SEA staff to evaluate a request
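
One way to keep a request form standardized is to treat it as a fixed set of fields. The sketch below models such a form in Python; every field name and example value is illustrative, an assumption for the sake of the example rather than a prescribed format.

```python
from dataclasses import dataclass, field

# Sketch of the fields a standardized request form might capture, mirroring
# the items above. All field names and example values are illustrative.
@dataclass
class DataRequestForm:
    researcher_name: str
    organization: str
    research_plan: str            # concise description of the proposed study
    data_elements: list[str]      # the specific data being requested
    school_years: list[str]       # scope of the request
    irb_approval_id: str | None = None
    requested_linkages: list[str] = field(default_factory=list)

request = DataRequestForm(
    researcher_name="Jane Doe",
    organization="Example University",
    research_plan="Estimate the effect of program X on graduation rates",
    data_elements=["graduation_status", "program_x_participation"],
    school_years=["2010-11", "2011-12"],
)
print(request)
```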

Core Practice 3: Review Data Requests Strategically
 Staff review
 Data steward review
 Review boards (including IRBs)
 Legal counsel review
 Working with outside agencies
  By linking education data with data from outside agencies, researchers can answer questions about education that go far beyond the classroom
  Memoranda of understanding (MOU)
 When beginning the review, consider:
  Does the SEA have the requested data?
  Can the SEA legally provide the data?
  Has the researcher completed all training?
  If this researcher has previously been granted access to data, did he or she adhere to all agency requirements? Were data managed and used appropriately?
  Has the destruction of previously accessed datasets been certified?
  If the researcher is affiliated with a research organization, such as a university, does the researcher have approval for the project from the organization’s IRB?
  Will any fees be required? If so, have they been paid?
  Does the SEA have the available resources to assemble the data?
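
Because the initial screening questions form a fixed checklist, they translate naturally into code. The sketch below is one illustrative way to run a request against them; the wording mirrors the bullets, but the structure itself is an assumption, not an agency standard.

```python
# Sketch of the screening questions above expressed as a reusable checklist.
# The wording mirrors the bullets; the structure itself is illustrative.
SCREENING_QUESTIONS = [
    "Does the SEA have the requested data?",
    "Can the SEA legally provide the data?",
    "Has the researcher completed all training?",
    "Did the researcher meet all requirements on prior requests?",
    "Has destruction of previously accessed datasets been certified?",
    "Is the organization's IRB approval in place, if applicable?",
    "Have all required fees been paid?",
    "Does the SEA have the resources available to assemble the data?",
]

def blocking_issues(answers: dict[str, bool]) -> list[str]:
    """Return every screening question not answered 'yes'."""
    return [q for q in SCREENING_QUESTIONS if not answers.get(q, False)]

answers = {q: True for q in SCREENING_QUESTIONS}
answers["Have all required fees been paid?"] = False
print(blocking_issues(answers))  # -> ['Have all required fees been paid?']
```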

Core Practice 4: Manage the Data Request Process Efficiently
 Establish expectations for researchers:
  Eligibility to request data
  Timelines for data use
  Fee structures
  Expectations for protecting confidentiality and security

Core Practice 4: Manage the Data Request Process Efficiently (continued)
 Track data requests and use:
  From the point at which a request is received
  Through its review, rejection, or approval
  The delivery and receipt of data
  The publication of research findings (e.g., articles and reports)
  The certification of data destruction
 Communicate with researchers
  Timely communication with the researcher regarding the status of the request is appropriate until the request has been either refused or approved and fulfilled
  SEAs may also find it advisable to provide researchers with predicted timelines for the data sharing process
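
The lifecycle just described, from receipt through destruction certification, can be tracked with a simple dated status log. The sketch below is illustrative: the stage names follow the bullets, but the tracking structure is not taken from any particular system.

```python
from datetime import date
from enum import Enum, auto

# Sketch of the request lifecycle above as a simple status log. The stage
# names follow the bullets; the tracking structure is illustrative only.
class Stage(Enum):
    RECEIVED = auto()
    UNDER_REVIEW = auto()
    REJECTED = auto()
    APPROVED = auto()
    DATA_DELIVERED = auto()
    FINDINGS_PUBLISHED = auto()
    DESTRUCTION_CERTIFIED = auto()

history: list[tuple[date, Stage]] = []

def record(stage: Stage) -> None:
    """Append a dated status change so every request has an audit trail."""
    history.append((date.today(), stage))

record(Stage.RECEIVED)
record(Stage.UNDER_REVIEW)
record(Stage.APPROVED)
print([stage.name for _, stage in history])
```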

Core Practice 5: Release Data Appropriately
 Technical and statistical tools
  Suppression, masking, de-identification, and anonymization
 Media type
  E-mail is considered secure only when data are appropriately encrypted and otherwise protected prior to attachment and delivery
  Physical media, such as discs and tapes, require transport by entities that can guarantee safe and secure delivery to authenticated recipients
 Physical access
  Safe, highly monitored locations such as research data centers, secure facilities in businesses or universities, or similar locations
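
Of the statistical tools named above, small-cell suppression is the simplest to illustrate: counts below a minimum cell size are withheld before release so that small groups cannot be identified. The sketch below assumes a hypothetical threshold of 10 and made-up counts; actual thresholds come from an agency's disclosure policy.

```python
# Minimal sketch of small-cell suppression, one common technique behind the
# "suppression/masking" bullet: counts below a minimum cell size are replaced
# before release. The threshold and the counts are illustrative only.
MIN_CELL_SIZE = 10  # hypothetical minimum reportable cell size

counts = {"Group A": 142, "Group B": 7, "Group C": 58, "Group D": 3}

released = {
    group: n if n >= MIN_CELL_SIZE else f"<{MIN_CELL_SIZE}"
    for group, n in counts.items()
}
print(released)  # {'Group A': 142, 'Group B': '<10', 'Group C': 58, 'Group D': '<10'}
```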

Core Practice 6: Monitor Data Use
 Confirm adherence to agreements
  Data use should be limited to the purposes stated in the Data Sharing Agreement; data should not be used for other research without explicit approval
 Review research outcomes
  At the conclusion of a research project, an agency may wish to review the findings and proposed publications prior to public release in order to prevent the unintended disclosure of personally identifiable information

Core Practice 6: Monitor Data Use (continued)
 Confirm project completion and data destruction
  Researchers should be informed of appropriate data destruction procedures during data use training
  Destruction should be consistent with all procedures described in the Data Sharing Agreement
 Use research findings
  Research results may be adapted or adopted by an agency for policy development, program review and improvement, or the resolution of technical and operational issues
 Build partnerships with researchers
  Ongoing collaboration can benefit both the SEA and the researcher, and lead to better research
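
A certification of data destruction is, at bottom, a small structured record. The sketch below shows the kind of information such a record might capture; all field names and values are illustrative assumptions, not a required format.

```python
from dataclasses import dataclass
from datetime import date

# Sketch of the information a certification-of-data-destruction record might
# capture, per the bullets above. Field names and values are illustrative.
@dataclass
class DestructionCertificate:
    request_id: str
    researcher_name: str
    destruction_date: date
    method: str
    certified_by: str

cert = DestructionCertificate(
    request_id="REQ-2012-042",
    researcher_name="Jane Doe",
    destruction_date=date(2013, 6, 30),
    method="Secure deletion of all copies, per the Data Sharing Agreement",
    certified_by="Jane Doe",
)
print(cert)
```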