Testing the Test: Validation, Litigation & Risk Management Kathleen K. Lundquist, Ph.D. John C. Scott, Ph.D.

Presentation transcript:

1 Testing the Test: Validation, Litigation & Risk Management Kathleen K. Lundquist, Ph.D. John C. Scott, Ph.D.

1 About APTMetrics
Global talent management solutions provider, comprised of:
- Ph.D. industrial/organizational psychologists
- Human resource consultants
- Information technology specialists
What sets APTMetrics apart: professional integrity, evidence-based approach, technical expertise, customer service
Diversity supplier: certified as a women-owned business by WBENC; certified as a women-owned small business by the US SBA
Global Strategies for Talent Management.

2 Our Areas of Expertise
Leader assessment, employee selection, litigation support, diversity strategy & measurement, job analysis, competency modeling, performance management, staffing for mergers & acquisitions, organizational surveys

3 Our Web-Based Solutions Platform (APTMetrics®)
- SelectionMetrics® Employee Selection System
- LeadIN(sm) Leadership Assessment Suite
- JobMetrics® Job Analysis System
- 360Metrics® 360-Degree Feedback System
- SurveyMetrics® Organizational Survey System

4 APTMetrics’ U.S. Offices

5 Our Background with Testing and Litigation
- Expert witness testimony; court-appointed expert in settlements: Ford, Abercrombie & Fitch, Morgan Stanley, Sodexo, The Coca-Cola Company
- Invited testimony on testing before the EEOC
- Negotiations with the OFCCP
- Proactive HR process audits
- Development and validation of new selection systems

6 Polling Question #1
Are you concerned about a legal challenge to your tests or interviews? Yes / No / I don’t know

7 Definition
The federal Uniform Guidelines on Employee Selection Procedures (1978) define a test as: “Any measure…used as a basis for any employment decision.”

8 What Is a Test?
Work simulation exercises, cognitive ability tests, job knowledge tests, personality and interest inventories, honesty/integrity tests, assessment centers, physical abilities tests, application form data, psychological assessments, education/experience requirements, reference checks, performance evaluations, interviews

9 Polling Question #2
Is your company currently using written tests? No, not at all / No, but considering using in the future / Yes, for a small number of jobs / Yes, for a wide range of jobs / I have no idea!

10 Validity… and the Law

11 Why Are Tests Challenged?
- Adverse impact
- Validity: none, inadequate, dated, or reliance on studies conducted outside the company
- Less adverse alternatives
- Inconsistent administration

12 Adverse Impact Definition
Disproportionately fewer protected-group applicants pass the test than majority-group applicants. Usually determined by the 4/5ths (80%) rule or the standard deviation test. The presence of adverse impact may be a given for many types of tests; HOWEVER, tests with adverse impact can be successfully defended.

13 4/5ths Rule Illustration
3 out of 5 African-Americans pass the test: a 60% pass rate. 4 out of 5 Whites pass the test: an 80% pass rate. The ratio, 60% / 80% = 75%, falls below the 80% threshold, so adverse impact is indicated.
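The 4/5ths-rule arithmetic on this slide is simple to automate. A minimal sketch in Python (the function name and return shape are my own, not from the presentation):

```python
def adverse_impact_ratio(minority_pass, minority_total, majority_pass, majority_total):
    """Selection-rate ratio used by the 4/5ths (80%) rule.

    Returns the ratio and whether it falls below 0.80, the point at
    which the Uniform Guidelines treat adverse impact as indicated.
    """
    minority_rate = minority_pass / minority_total
    majority_rate = majority_pass / majority_total
    ratio = minority_rate / majority_rate
    return ratio, ratio < 0.80

# Slide example: 3 of 5 African-American applicants pass, 4 of 5 White applicants pass
ratio, flagged = adverse_impact_ratio(3, 5, 4, 5)
# ratio is 75%, below the 80% threshold, so adverse impact is flagged
```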

14 Trends in Testing Case Law
- More specific evidence of adverse impact required: beyond the 80% rule; practical and statistical significance; sample-size issues
- Disparate treatment testing cases: inconsistent administration; selective use of tests; jury trials and punitive damages
- Greater emphasis on the “validity” of passing scores: a rationale set at acceptable performance, not applicant flow
- Plaintiffs’ burden to identify less adverse alternatives: must be knowable, must be demonstrated to be less adverse, and must be demonstrated to have substantially the same validity
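The “standard deviation test” mentioned in the adverse impact definition, and the statistical-significance emphasis noted above, usually come down to a pooled two-proportion z statistic. A sketch under that assumption (helper name and sample counts are illustrative, not from the slides):

```python
import math

def two_proportion_z(pass1, n1, pass2, n2):
    """Pooled two-proportion z statistic comparing two pass rates.

    A |z| of roughly 2 or more (about two standard deviations) is the
    conventional marker of a statistically significant difference.
    """
    p1, p2 = pass1 / n1, pass2 / n2
    pooled = (pass1 + pass2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 60 of 100 minority vs. 80 of 100 majority applicants pass
z = two_proportion_z(60, 100, 80, 100)
# |z| > 1.96 indicates significance at the 5% level
```

Note the sample-size issue the slide flags: the same 60% vs. 80% pass rates drawn from 5 applicants per group, as in the 4/5ths illustration, would not reach statistical significance.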

15 Employers Tend to Be Successful When…
- Job analyses are conducted
- Procedures have face validity
- They document a search for less adverse alternatives

16 Plaintiffs Tend to Be Successful When…
- Valid procedures are administered inconsistently
- Cut scores are set too high

17 What Is the “Shelf Life” of a Validation Study?
249 industrial and organizational psychologists answered this and related questions in a recent APT survey (July 2008). Given a competently conducted criterion-related or content validation study, how old would the study need to be to necessitate a new one?
- Average: 5 years (blue-collar, supervisory, managerial, and executive tests)
- Technical tests, judged to have a shorter shelf life: 3 to 3.5 years
- Administrative/customer service/sales: 4 to 4.5 years

18 Other Survey Results
Assuming a job’s tasks or work behaviors do not change, or change very little, what is the shelf life of a job analysis? That is, after how much time would one need to update the job analysis even with little or no change to the major tasks or work behaviors? Average: 5 to 6 years.
What is the shelf life of a cut score? That is, how long would you recommend using a cut score before conducting additional research to determine whether it needs to change? Average: 3 to 3.5 years.

19 Other Survey Results (continued)
Conditions that shorten the shelf life of a validation study:
- Changes in the nature of the job duties and/or KSAs
- Conditions that result in the emergence of adverse impact or legal challenges
- Changes in the applicant population
- Changes in administration mode
- Organizational changes (e.g., merger, downsizing)

20 And the Survey Says…
These are professionally derived “rules of thumb.” Use these findings together with your professional judgment, seek additional professional judgment, and seek legal input.

21 What must you do to validate a test?

22 Test Validity Looks Like…
People who score high on the test are also high performers on the job; people who score low on the test are also low performers on the job. [Scatterplot: test scores (x-axis) vs. job performance (y-axis)]

23 The Validity of Selection Procedure Inferences
Selection procedures provide samples of behavior that allow us to make inferences about:
- What basic abilities a person possesses
- What the person knows
- What the person can do
- What the person is willing to do
- How the person will behave in the future

24 Validation Defined
Validity refers to the degree to which test scores are job related. The process of validation involves accumulating evidence to provide a sound scientific basis for the proposed use of the test.

25 Sources of Validity Evidence
- Evidence based on test content (content validity): demonstration that the content of a test is representative of important aspects of performance on the job
- Evidence based on relations to other variables (criterion validity): statistical demonstration of a relationship between scores on a test and the job performance of a sample of employees
- Evidence based on internal structure (construct validity): demonstration that the test measures a construct (something believed to be an underlying human trait or characteristic, such as conscientiousness) and that the construct is important for successful job performance

26 Content Validity Study
1. Job analysis: profile the job by identifying essential functions (WABs), knowledge, skills, abilities, and performance standards
2. Test development: develop representative samples of the performance domains, such as job sample tests, job skill tests, or job knowledge tests
3. SME validation: use subject-matter-expert judgment to document relationships between test content and the performance domains
4. Set passing scores: use incumbents and subject matter experts to establish passing scores

27 Key Issues in Content Validity
- A comprehensive job analysis
- Competence in test construction
- Test content related to the job’s content
- Test content representative of the job’s content
- Examination of less adverse alternatives
- A passing score that selects those who can better perform the job

28 Criterion-Related Validity Study (process flow)
Job analysis → develop or acquire tests → try out/pilot → collect test data (applicants or employees). In parallel: develop performance measures → try out/pilot → collect performance data. Then relate test scores to performance measures and establish administrative use.
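The “relate test scores and performance measures” step typically comes down to a Pearson correlation, the validity coefficient. A self-contained sketch (the helper and the pilot data are hypothetical, not from the presentation):

```python
import math

def validity_coefficient(test_scores, performance):
    """Pearson r between test scores and job-performance ratings:
    the basic statistic of a criterion-related validity study."""
    n = len(test_scores)
    mx = sum(test_scores) / n
    my = sum(performance) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(test_scores, performance))
    sx = math.sqrt(sum((x - mx) ** 2 for x in test_scores))
    sy = math.sqrt(sum((y - my) ** 2 for y in performance))
    return cov / (sx * sy)

# Hypothetical pilot sample: higher test scores track higher performance ratings
scores = [55, 60, 72, 80, 91]
ratings = [2.1, 2.4, 3.0, 3.6, 4.2]
r = validity_coefficient(scores, ratings)
# r near +1 matches the "Test Validity Looks Like…" scatterplot:
# high scorers are high performers, low scorers are low performers
```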

29 Key Issues in Criterion-Related Validity
- Adequacy of the job performance criteria
- The psychometric quality of the test and criterion measures
- The degree of correlation necessary to establish validity
- Examination of less adverse alternatives
- Appropriateness of the passing score

30 When Can You NOT Validate?
- No adverse impact
- Transporting validity from another job or location
- Generalizing validity from other studies of similar jobs

31 Responsibility for Validation
Validation is the joint responsibility of the test developer and the test user. When the use of a test differs from that supported by the test developer, the test user bears special responsibility for validation.

32 What Can You Test For?
Job analysis will identify what to assess. It is not necessary to measure every important KSAO, but it is necessary that every KSAO measured be important!

33 Job Analysis Is the Foundation
Job analysis feeds: work activities performed, scope and effect of work, technical skills required, competencies required, education requirements, experience needed, minimum & preferred qualifications, test specifications, uniform standards for promotions, structured interviews, performance standards

34 Polling Question #3
Does your company use formal job analyses as the basis for selection procedures? No / Yes

35 Example of Test Specification Matrix

36 Types of Tests to Consider
Entry level: cognitive ability testing, non-cognitive measures, interview
Promotional: knowledge testing, performance assessment, interview

37 Face Validity

38 Selecting a Test
“High fidelity” measurement tools (work samples, video, assessment centers) are more acceptable to candidates, but higher fidelity means higher cost. Once test specifications have been developed, decide: custom design, or identify a commercially available test? What is the appropriate testing medium? Whether custom or commercially available, the test must be validated for your jobs.

39 Combine Assessments
[Diagram: cognitive measures, non-cognitive measures, and structured interviews combine to predict job performance]
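Combining assessments is commonly done as a weighted composite of standardized component scores. A minimal sketch; the component names echo the slide, but the weights and scores are invented for illustration:

```python
def composite_score(components, weights):
    """Weighted composite of standardized assessment scores.

    `weights` must sum to 1.0; the specific values here are
    illustrative, not recommendations from the presentation.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(components[name] * w for name, w in weights.items())

candidate = {"cognitive": 0.8, "non_cognitive": 0.6, "structured_interview": 0.9}
weights = {"cognitive": 0.4, "non_cognitive": 0.3, "structured_interview": 0.3}
score = composite_score(candidate, weights)
# 0.8*0.4 + 0.6*0.3 + 0.9*0.3 = 0.77
```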

40 Criteria for Establishing Appropriate Cut-off Scores
Cut-off scores should:
- Be consistent with normal expectations of proficiency within the workforce
- Permit the selection of qualified applicants
- Allow the organization to meet affirmative action goals
- Have a documented rationale
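The criteria above interact: the earlier slides note that plaintiffs tend to succeed when cut scores are set too high, partly because raising a cut can push group selection rates below the 80% threshold. A small sketch of that trade-off (group labels and score data are entirely hypothetical):

```python
def pass_rates_by_cut(scores_by_group, cut):
    """Selection rate for each applicant group at a given cut-off score."""
    return {group: sum(s >= cut for s in scores) / len(scores)
            for group, scores in scores_by_group.items()}

# Made-up applicant test scores for two groups
applicants = {
    "group_a": [48, 55, 62, 70, 77, 84, 90, 95],
    "group_b": [45, 50, 58, 64, 72, 78, 86, 92],
}

low = pass_rates_by_cut(applicants, 60)   # moderate cut
high = pass_rates_by_cut(applicants, 80)  # stricter cut
# At the stricter cut, group_b's rate falls further relative to group_a's,
# dropping the 4/5ths ratio below 0.80
```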

41 What Makes Tests Fair and Defensible?
Validity: based on job analysis; standardized; consistent; validated.
Implementation: training; ongoing monitoring; appeals process; communication.

42 Ongoing Monitoring
- Adverse impact
- Test content
- Administration issues

43 Top Five Questions About Testing
1. “Aren’t tests harder to defend than other selection procedures?”
2. “The vendor says: ‘The test has no adverse impact.’ I have no problems, right?”
3. “The vendor says: ‘The test is validated – Trust us!’ I have no problems, right?”
4. “Candidates can take this new test at home over the Internet…so we can reduce overhead. I have no problems, right?”
5. “Once the test is validated, we’re done, right?”

44 Test Validity Survey
For a copy of the results of APT’s recent Test Validity Shelf Life Survey, email your request to:

45 Contact Information
APTMetrics, Inc., One Thorndal Circle, Second Floor, Darien, CT 06820, 203.655.7779
