Using QDAS in the production of policy evidence by non-researchers: strengths, pitfalls and implications for consumers of research
Dr Chih Hoong Sin, Head of Information and Research, Disability Rights Commission

Presentation from the perspectives of a research commissioner and a research provider.
Three key developments in the UK:
–evidence-based policy and practice
–a utilitarian view of research
–effective dissemination

Implications:
–'marketised' research relationships
–increasing heterogeneity of 'providers' and 'clients'
–different skill sets required
–'quality guarantee' in doubt, or not a primary concern?
–different 'normative worlds' in collision

The example of consultancies:
–cross-pollinators: reduce 'silos' and enhance transferability
–match-makers: more effective partnership working
–translators and processors: make information usable and relevant
–multiple dissemination routes and formative techniques: reach a wider audience in a timely way

Company X:
–SME research and consultancy company
–works solely with public sector clients (i.e. national, regional, local government, public bodies)
–six employees use QDAS

Prior experience:
–4 had general undergraduate social research training
–1 had done qualitative postgraduate research
–1 had no background in qualitative research at all
–none had used any QDAS before

Training (categories not mutually exclusive):
–1 had formal external training by a specialist
–4 had 'on the job' training
–3 had formal internal training by a colleague (implication?)
–1 asked a colleague
–1 read a manual

Type of research QDAS was used on:
–all were large-scale, mixed-method national policy evaluations
–mostly semi-structured interviews; one structured focus group
–volume of data: from around 30 to more than 100 documents
–all individuals used QDAS on actual projects immediately after training

Perceived adequacy of training:
–all felt the training was adequate, irrespective of:
  - background in qualitative research/data
  - experience in using QDAS
  - mode of training
  - timing of training

Functions used:
–all used QDAS to prepare and upload documents, to code, and to perform matrix node searches (illustrated below)
–fewer used it to design the coding structure, define codes, generate reports, or create memos
–2 used the Merge function
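For readers unfamiliar with matrix node searches, the following is a minimal sketch of the operation such a search performs: cross-tabulating codes against document or respondent attributes. This is not N6's actual interface; the pandas-based code and all document, code and attribute names are invented for illustration.

```python
# Illustrative sketch only: QDAS packages such as N6/NVivo expose matrix
# searches through their own interfaces, not this API. This shows the
# underlying operation a 'matrix node search' performs: a code-by-attribute
# cross-tabulation. All names below are hypothetical.
import pandas as pd

# One row per coded passage: source document, the code applied, and an
# attribute of the respondent (here, an invented 'region').
codings = pd.DataFrame({
    "document": ["int_01", "int_01", "int_02", "int_03", "int_03"],
    "code":     ["barriers", "funding", "barriers", "funding", "outcomes"],
    "region":   ["North", "North", "South", "South", "South"],
})

# How often each code co-occurs with each attribute value.
matrix = pd.crosstab(codings["code"], codings["region"])
print(matrix)
```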

Confidence and weaknesses:
–all were confident in functions they used regularly
–less confident in functions used sporadically or never
–aware that more 'sophisticated functions' existed that they had never used, but gave no indication of knowing what these functions actually do

Project management:
–all were trained within specific project teams
–division of labour: data management and data analysis
–'need to know' and consistency

Data analysis:
–a 'core' analysis team
–structured coding design (see the sketch below)
–descriptive or topic codes
–largely descriptive analysis; a lack of theorising
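As a loose illustration of what a structured coding design with descriptive topic codes might look like, here is a minimal sketch; the frame, the code names and the validate_code helper are all hypothetical, not drawn from the study.

```python
# A hypothetical structured coding frame of the kind the slide describes:
# descriptive topic codes fixed in advance by the 'core' analysis team,
# so coders apply codes consistently on a 'need to know' basis.
# All code names are invented for illustration.
coding_frame = {
    "barriers": ["funding", "staffing", "awareness"],
    "enablers": ["partnerships", "training"],
    "outcomes": ["service_change", "policy_change"],
}

def validate_code(parent: str, child: str) -> bool:
    """Return True only if the proposed sub-code belongs to the agreed frame."""
    return child in coding_frame.get(parent, [])

assert validate_code("barriers", "funding")           # within the frame
assert not validate_code("barriers", "partnerships")  # not allowed here
```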

Discussion:
–the need to engage: a pragmatic rather than idealistic response; this use of QDAS cannot be ignored or shunned as 'wrong' or 'unorthodox'
–QDAS can offer some tools to help mitigate the worst 'bad practice', depending on:
  - the type of research
  - the type of team management
  - the type of outputs, and hence the analysis required

Discussion:
–QDAS allows things that can be systematised to be systematised
–it makes checking easy
–it avoids overwhelming individuals, e.g. the pressure to 'need to know everything'

What to look for:
–good guidance exists, but it tends to target people with some understanding of research
–consumers need to know what to look for, and what to ask for when it is not there; an inability to articulate this causes frustration on both sides and fuels continued misunderstanding
–QDAS is not the only way, but it can help; it also carries some risks (e.g. the 'wow' factor)

What to look for:
–samples of documents
–numbers of documents, and whether all were 'analysed'
–codes
–use of codes
–…and, dare we hope, a theoretical 'model'?

Thank you for your attention, and enjoy the rest of the conference!
Dr Chih Hoong Sin