ESRA 6th Annual Conference, Reykjavik, July 17, 2015
Spoken language versus written language: A challenge for the linguistic validation of data collection instruments for international surveys
www.capstan.be

Linguistic validation in 3MC
A set of processes that aims to ensure that the same questions are being asked, or the same constructs are being measured, via translated data collection instruments. It comprises a number of linguistic quality assurance (LQA) and linguistic quality control (LQC) steps implemented both upstream and downstream.

Quality assurance
- Translatability assessment: early resolution of potential equivalence issues
- Source version optimization
- Imposing a robust translation model, e.g. TRAPD
- Comprehensive translation and adaptation guidelines
- Assistance to national/local teams of translators

Quality control
- Documentation of all steps in the translation process
- Verification of locally produced translations
- Managing errata, updates, trends
- Controlled review following the Field Trial or Piloting
- Reporting on compliance

Spoken language adds a new twist…
- When CAPI/CATI systems are used, the interviewer follows a script and reads the questions out to the respondent. This raises questions and challenges for the linguistic validation of materials that will not be seen in written form but only heard by respondents.
- cApStAn has 15 years of experience in LQA/LQC, starting with PISA, and is active in CSDI, WAPOR, ESRA, 3MC, ITC… but this is a practitioner's viewpoint.
- Drawing on that experience, we will try to define conceptual and procedural issues, present real examples of difficult situations, propose solutions, and flag open challenges.
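As a concrete illustration of the CAPI/CATI setting, here is a minimal Python sketch of how a spoken questionnaire item could be represented. The class and field names (SpokenItem, script, interviewer_notes) are illustrative assumptions, not an actual CAPI/CATI schema; the point is simply that only part of the text ever reaches the respondent's ear, and it is exactly that part for which "spoken language" considerations apply.

from dataclasses import dataclass, field

@dataclass
class SpokenItem:
    # Hypothetical item model: names are assumptions, not a real CAPI/CATI schema.
    item_id: str
    script: str                    # read aloud verbatim by the interviewer
    interviewer_notes: str = ""    # shown on screen, never read to the respondent
    response_options: list = field(default_factory=list)  # usually read aloud

    def audible_text(self) -> str:
        # Everything the respondent will actually hear (and never see).
        return " ".join([self.script] + self.response_options)

item = SpokenItem(
    item_id="Q01",
    script="In the last 12 months, how often did you feel rushed?",
    interviewer_notes="Read the answer categories slowly; do not probe.",
    response_options=["Always", "Often", "Sometimes", "Never"],
)
print(item.audible_text())

In such a model, register, contractions and idiom matter for script and response_options, whereas written-only conventions (punctuation, bold, italics) matter mainly for what appears on the interviewer's screen.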

What accommodations in the source itself?
- It seems appropriate and useful to relax the conventions of standard written English and use contractions: "don't", "can't", "you're", etc.
- This gives translators an indication of the desired register (to the extent that it applies to their language).
- But let's not go overboard: "Whatcha gonna do?"

What accommodations in the source itself?
- Idiomatic expressions may be used more frequently in spoken than in written English.
- This is fine, but it is good practice to provide a translation/adaptation guideline.

What accommodations in the target languages?
- In general, should the criterion of "equivalence to source" also cover equivalence of register (spoken versus written)?
- Yes, with care: this will mean different things in different languages.
- Example 1: less subjunctive in Italian (spoken Italian often uses the indicative where written Italian would require the subjunctive)
- Example 2: negation in French (spoken French typically drops the "ne" of "ne … pas")

What accommodations in the target languages?
- In general, can or should (some of) the criteria of "linguistic correctness" in the target language be relaxed for texts that will never be seen in written form?
- In general, no: not much beyond the accommodations for spoken-language register.
- Examples: punctuation, spellchecking
- What about formatting (bold, underline, italics)?

What accommodations in the target languages?
- Example 3: gender marks (three alternative solutions, Solution 1 to Solution 3, were shown on the slide)
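One common way of handling gender marks in CAPI/CATI instruments is to make the gendered word a text fill that depends on data already recorded about the person referred to. The sketch below illustrates this for the German possessive "seine"/"ihre" in the SHARE question quoted on the next slide; it shows one possible approach only, not necessarily one of the three solutions from the slide, and the function and parameter names (gender_fill, referent_gender) are assumptions.

def gender_fill(template: str, referent_gender: str) -> str:
    # Replace the [his/her] placeholder with the form matching the recorded gender
    # of the person the question refers to; fall back to a combined form if unknown.
    forms = {"male": "seine", "female": "ihre"}
    return template.replace("[his/her]", forms.get(referent_gender, "seine/ihre"))

# "What was his/her relationship to you?"
question = "Was war [his/her] Beziehung zu Ihnen?"
print(gender_fill(question, "female"))   # -> Was war ihre Beziehung zu Ihnen?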

Special cases
Diglossia: 'everyday' variety versus 'high' variety
- Example 1: Arabic (EU-MIDIS II)
  Target group: immigrants from North Africa and their descendants, in BE, ES, FR, IT
  Team translation: a Tunisian linguist, an Algerian linguist and a Moroccan linguist produced a 'passe-partout' Maghrebi Arabic version
- Example 2: Swiss German (SHARE)
  SOURCE: What was his relationship to you?
  GERMAN: Was war seine Beziehung zu Ihnen?
  SWISS GERMAN: Was ist seine Beziehung zu Ihnen gewesen?

Special cases  Leading from the previous point  How to check “linguistic correctness” in the target language when the standards for the latter are not clearly defined?  Example 1: PIAAC, Serbo-Croatian-Bosniac version for Austria…  … with some words in German

Special cases  One last example: PIAAC, Spanish version for the USA  Varieties of Spanish spoken in the USA: - Mexican - Caribbean - Central American - Colonial Most widespread, and becoming the ‘standardized dialect’ of Spanish in the continental United States But if more time/money: - interviewer ‘help’ text fields and/or - dynamic text fields

(Tentative) Conclusions
- From Prof. Lyberg's keynote speech: "A good 3MC design is a mixture of standardization and flexibility."
- Accommodations for colloquial register: yes. But "the text will not be seen in written form" is not a valid excuse for a general relaxation of linguistic standards.
- A 'good' translation is one that is fit for purpose. The purpose here is to help the interviewer carry out a successful interview and collect valid, comparable data.
- What helps? What hinders? Spoken language is likely to have even more variants than written language; interviewer 'help' texts and dynamic text fields can help.

Questions? THANK YOU