©2011 Cengage Learning. All Rights Reserved. May not be scanned, copied or duplicated, or posted to a publicly accessible website, in whole or in part.


Part 2: Foundations of Measurement for Human Resource Selection
Chapter 6: Strategies for Selection Decision Making

Characteristics of Selection Decisions

Simple selection decisions
- Involve one position with several applicants
- Which person for a specific job?

Complex selection decisions
- Involve several applicants and several positions
- Which persons for which jobs?

Selection information processing demands = number of applicants × amount of selection data collected

Characteristics of Selection Decisions (cont’d)

Selection decision errors
- False positives (erroneous acceptances): appear acceptable but, once hired, perform poorly
- False negatives (erroneous rejections): initially appear unacceptable but would have performed successfully if hired
- True positives: hired and perform successfully
- True negatives: rejected and would not have performed successfully

Improving selection decisions
- Use high-validity standardized selection procedures
- Employ proven decision-making strategies
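The four outcomes above form a simple decision table. A minimal Python sketch (the hire/perform labels are invented for illustration, not taken from the text):

```python
def classify(hired, successful):
    """Label a past selection decision as one of the four outcomes."""
    if hired and successful:
        return "true positive"
    if hired and not successful:
        return "false positive"   # erroneous acceptance
    if not hired and successful:
        return "false negative"   # erroneous rejection
    return "true negative"

# Walk through all four combinations of decision and later performance.
for hired in (True, False):
    for successful in (True, False):
        print(hired, successful, "->", classify(hired, successful))
```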

Enhancing Employment Decisions

Questions to be answered:
- For a specific selection situation, what are the best methods for collecting predictor information on job applicants?
- How should scores from multiple predictors be combined to give applicants an overall or total score for selection decision-making purposes?
- Once a total score on two or more predictors is obtained, how should this overall score be used to make selection decisions?

TABLE 6.1 Modes of Collecting Predictor Information from Job Applicants

TABLE 6.2 Modes of Combining Predictor Information from Job Applicants

TABLE 6.3 Methods of Collecting and Combining Predictor Information from Job Applicants

NOTE: Judgmental: involves collecting/combining predictor information using human judgment. Mechanical: involves collecting/combining predictor information without using human judgment.
a. Indicates (usually) superior methods for collecting and combining predictor information.

Combining Selection Data

Mechanical versus judgmental combination
- Mechanical combination increases the accuracy of prediction when weighting predictor scores.
- The accuracy of statistical models increases as data on more applicants are added to the available data.
- Even systematic and thorough judgmental decision makers can do only about as well as a statistical model.
- Decision makers add error when judgmentally combining subjective and objective data.
- Statistical models make an allowance for overconfidence errors and reduce the impact of individual biases on decision outcomes.

Improving Selection Decisions

1. Be careful about relying too heavily on resumes and other initial information collected early in the selection process.
2. Use standardized selection procedures that are reliable, valid, and suitable for the specific selection purpose.
3. When feasible, use selection procedures that minimize the role of selection decision-maker judgment in collecting information.
4. The judgment of selection decision makers should not play a major role in combining data collected and used for determining applicants’ overall scores.

Strategies for Combining Predictor Scores

Mechanical methods for combining predictor information
- Multiple regression
- Multiple cutoffs
- Multiple hurdles
- Combination method

TABLE 6.4 Patient Account Representative Selection Procedure and Application Score Data

NOTE: The higher an applicant’s score on a predictor, the better the applicant’s performance on the predictor.

Strategy One: Multiple Regression

Assumptions
- The predictors are linearly related to the criterion.
- The predictors are additive and can compensate for one another.

Advantages
- Minimizes errors in prediction and combines the predictors to yield the best estimate of criterion status.
- A very flexible method that can be modified to handle nominal data, nonlinear relationships, and both linear and nonlinear interactions.

Strategy One: Multiple Regression (cont’d)

Disadvantages
- Assumed substitutability of predictor values
- Instability of weights when using small samples
- Requires assessing all applicants on all predictors

Best use
- When predictor scores are interchangeable
- When sample size is large
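A minimal sketch of the multiple-regression strategy: fit the equation y = a + b1·x1 + b2·x2 on past employees by ordinary least squares, then use the fitted weights to compute predicted criterion scores for new applicants. All scores below are invented for illustration.

```python
import numpy as np

# Hypothetical past-employee data: two predictor scores per person
# (say, a test score and an interview rating) and a criterion score.
X = np.array([[70, 3], [85, 4], [60, 2], [90, 5], [75, 3], [80, 4]], dtype=float)
y = np.array([55, 72, 48, 80, 60, 70], dtype=float)

# Fit y = a + b1*x1 + b2*x2 by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b1, b2 = coef

# Predict an overall criterion score for each new applicant,
# then rank applicants from highest to lowest predicted score.
applicants = np.array([[78, 4], [88, 3]], dtype=float)
predicted = a + applicants @ np.array([b1, b2])
ranked = applicants[np.argsort(-predicted)]
```

Because the model is compensatory, a low score on one predictor can be offset by a high score on the other, which is exactly the substitutability assumption noted above.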

Strategy Two: Multiple Cutoffs

Assumptions
- A nonlinear relationship exists between the predictors and the criterion: a minimum amount of each predictor attribute is necessary.
- Predictors are not compensatory: a lack or deficiency in any one predictor attribute cannot be compensated for by having more of another.

Advantages
- All candidates in the pool are minimally qualified.
- Easy to explain to managers.

Strategy Two: Multiple Cutoffs (cont’d)

Disadvantages
- Requires assessing all applicants on all predictors
- Identifies only the minimally qualified candidates

Best uses
- When physical abilities are essential for performance
- When a minimum level of performance on each predictor is necessary to perform the job safely (e.g., aircraft takeoffs and landings)
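The non-compensatory rule is easy to express in code. A minimal sketch (cutoff values and applicant scores are invented, not from the text):

```python
# Minimum required score on each predictor (illustrative values).
cutoffs = {"work_sample": 60, "interview": 3, "ability_test": 50}

applicants = {
    "A": {"work_sample": 75, "interview": 4, "ability_test": 55},
    "B": {"work_sample": 90, "interview": 2, "ability_test": 70},  # fails the interview cutoff
}

def passes_all(scores, cutoffs):
    """Non-compensatory rule: every predictor must meet its minimum."""
    return all(scores[p] >= c for p, c in cutoffs.items())

qualified = [name for name, scores in applicants.items()
             if passes_all(scores, cutoffs)]
# B's very high work-sample score cannot offset the low interview score.
```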

Strategy Three: Multiple Hurdles

Assumptions
- Each applicant must sequentially meet the minimum cutoff, or hurdle, for each predictor before going on to the next predictor. Failure to pass a hurdle eliminates the applicant from further consideration.
- An alternative procedure is to calculate a composite multiple regression score for each applicant at each successive hurdle. Whenever the probability of success for an applicant drops below some set value, the applicant is rejected.

FIGURE 6.1 Double-Stage Multiple Hurdle Strategy

Strategy Three: Multiple Hurdles (cont’d)

Advantages
- All surviving candidates are minimally qualified on the predictors applied
- Not all predictors have to be applied to all candidates

Disadvantages
- Validity must be established for each predictor
- Time is required to apply the predictors sequentially

Best uses
- When extensive post-hiring training is required
- When an essential KSA must be present at hiring
- With numerous applicants and expensive procedures
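The sequential logic can be sketched as a loop over hurdles ordered by cost: applicants who fail an early, cheap hurdle are never assessed on the later, expensive ones. Hurdle names, cutoffs, and scores are invented for illustration.

```python
# Hurdles applied in order, cheapest first (illustrative values).
hurdles = [("application_screen", 50), ("ability_test", 60), ("work_sample", 70)]

# Each applicant only has scores for the hurdles they actually reached.
applicants = {
    "A": {"application_screen": 80, "ability_test": 65, "work_sample": 75},
    "B": {"application_screen": 55, "ability_test": 40},  # eliminated at hurdle 2
    "C": {"application_screen": 45},                      # eliminated at hurdle 1
}

def survives(scores, hurdles):
    """Sequentially apply each hurdle; a single failure eliminates the applicant."""
    for predictor, cutoff in hurdles:
        if scores.get(predictor, float("-inf")) < cutoff:
            return False
    return True

finalists = [name for name, scores in applicants.items()
             if survives(scores, hurdles)]
```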

Strategy Four: Combination Method

Assumptions
- There are minimum cutoffs for each predictor; multiple regression then creates an overall score.

Advantage
- Identifies and rank-orders the acceptable candidates

Disadvantage
- All predictors must be applied to all applicants

Best use
- When multiple cutoffs are acceptable, predictors can compensate for one another, the applicant pool is small, and predictors have about equal costs
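The combination method chains the two earlier ideas: screen on minimum cutoffs first, then rank the survivors by a regression-style composite. The cutoffs, weights, and scores below are invented for illustration, not taken from the text.

```python
# Step 1 inputs: minimum cutoffs; Step 2 inputs: assumed regression weights.
cutoffs = {"test": 50, "interview": 3}
weights = {"test": 0.6, "interview": 8.0}
intercept = 10.0

applicants = {
    "A": {"test": 70, "interview": 4},
    "B": {"test": 55, "interview": 5},
    "C": {"test": 45, "interview": 5},  # fails the test cutoff, so never ranked
}

def composite(scores):
    """Regression-style overall score for one applicant."""
    return intercept + sum(weights[p] * scores[p] for p in weights)

# Step 1: keep only applicants who meet every cutoff (non-compensatory screen).
eligible = {n: s for n, s in applicants.items()
            if all(s[p] >= c for p, c in cutoffs.items())}

# Step 2: rank the remaining (compensatory) composite scores, highest first.
ranked = sorted(eligible, key=lambda n: composite(eligible[n]), reverse=True)
```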

Approaches for Making Employment Decisions
- Cutoff scores
- Top-down selection
- Banding

TABLE 6.5 The Ebel Method for Setting Cutoff Scores

TABLE 6.6 The Angoff Method for Setting Cutoff Scores

a. SEM = standard error of measurement.
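A sketch of the Angoff approach as it is commonly described: each subject matter expert estimates, for every test item, the probability that a minimally qualified applicant would answer correctly; summing one judge's probabilities gives that judge's suggested cutoff, and the cutoff is the average across judges, sometimes lowered by one standard error of measurement (the "modified" Angoff, per the table's SEM footnote). All numbers below are invented.

```python
# Per-item probability estimates from three hypothetical SMEs
# for a five-item test.
sme_ratings = [
    [0.90, 0.70, 0.60, 0.80, 0.50],   # SME 1
    [0.80, 0.60, 0.70, 0.90, 0.60],   # SME 2
    [0.85, 0.65, 0.55, 0.85, 0.55],   # SME 3
]
sem = 1.2  # assumed standard error of measurement for the test

# Each judge's suggested cutoff is the sum of their item probabilities.
judge_cutoffs = [sum(ratings) for ratings in sme_ratings]

# Average across judges, then optionally subtract one SEM (modified Angoff).
raw_cutoff = sum(judge_cutoffs) / len(judge_cutoffs)
adjusted_cutoff = raw_cutoff - sem
```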

TABLE 6.7 Selected Guidelines for Using Cutoff Scores in Selection Procedures

- Cutoff scores are not required by legal or professional guidelines; thus, first decide whether a cutoff score is necessary.
- There is no one best method of setting cutoff scores for all situations.
- If a cutoff score is to be used for setting a minimum score requirement, begin with a job analysis that identifies levels of proficiency on essential knowledge, skills, abilities, and other characteristics. (The Angoff method is one that can be used.)
- If judgmental methods are used (such as the Angoff method), include a 10 to 20 percent sample of subject matter experts (SMEs) representative of the race, gender, shift, etc., of the employee (or supervisor) group. Representative experience of SMEs on the job under study is a most critical consideration in choosing SMEs.
- If job incumbents are used to develop the cutoff score to be used with job applicants, consider setting the cutoff score one standard error of measurement below incumbents’ average score on the selection procedure.
- Set cutoff scores high enough to ensure that at least minimum standards of job performance are achieved.

NOTE: The guidelines are based on Wayne F. Cascio and Herman Aguinis, “Test Development and Use: New Twists on Old Questions,” Human Resource Management 44 (2005): 227; and Wayne F. Cascio, Ralph A. Alexander, and Gerald V. Barrett, “Setting Cutoff Scores: Legal, Psychometric, and Professional Issues and Guidelines,” Personnel Psychology 41 (1989): 1–24.
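One guideline from the table, sketched in code: when incumbents' scores are used to set a cutoff for applicants, place it one standard error of measurement below the incumbents' mean. The incumbent scores and SEM are invented for illustration.

```python
# Hypothetical scores of current job incumbents on the selection procedure.
incumbent_scores = [72, 78, 81, 69, 75, 77]
sem = 3.0  # assumed standard error of measurement

# Cutoff = incumbents' mean score minus one SEM.
mean_score = sum(incumbent_scores) / len(incumbent_scores)
cutoff = mean_score - sem
```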

FIGURE 6.2 Illustration of Fixed and Sliding Bands for Comparing Test Scores

Advantages of Banding
- More flexibility to pursue workforce diversity when making hiring decisions
- Increased opportunity to emphasize secondary hiring factors not measured by traditional selection methods
- Broadening of the selection perspective beyond a focus on maximum economic gain
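A sketch of a fixed band based on the standard error of measurement, following the approach associated with Cascio and colleagues in the banding literature: two scores that differ by less than 1.96 × SEM × √2 (the standard error of the difference at the .05 level) are not treated as reliably different, so all applicants within that distance of the top score fall in one band. The SEM and scores are invented for illustration.

```python
import math

sem = 2.0  # assumed standard error of measurement of the test
scores = {"A": 95, "B": 93, "C": 91, "D": 84}

# Band width: scores closer together than this are considered equivalent.
band_width = 1.96 * sem * math.sqrt(2)   # about 5.5 points here

top = max(scores.values())
band = [name for name, s in scores.items() if top - s <= band_width]
# Within the band, secondary factors (e.g., workforce diversity) may guide
# the final choice; D falls outside the band and is not considered equivalent.
```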

Issues in Selection Decisions

Adopting a procedure
- Should the selection procedure be sequential or nonsequential?
- Should the decision be compensatory or noncompensatory?
- Should the decision be based on ranking applicants, or on banding acceptable applicants?

Issues in Selection Decisions (cont’d)

Assessing job performance
- What determines success on the job?
- What factors contribute to success, and how are they related?

Administrative considerations
- How many applicants are expected?
- What is the cost of the selection devices to be used?
- How much time must be devoted to the selection decision?

Developing a Selection Decision-Making Strategy

An objective decision-making strategy model is appropriate when:
- The same decision is made repeatedly
- Data on past decision outcomes are available
- The future will resemble the past
- Sound validation strategies are used

Bootstrapping
- Building a standardized, mechanical selection decision-making model by using regression analysis to infer the weights a decision maker applied to factors in past selection decisions.
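Bootstrapping, as defined above, can be sketched as a regression of past overall ratings on the predictor information the decision maker saw at the time; the recovered weights are then applied mechanically to new applicants. All data below are invented for illustration.

```python
import numpy as np

# Predictor scores the decision maker saw for six past applicants
# (e.g., test score, interview rating, years of experience)...
X = np.array([[70, 3, 2], [85, 4, 4], [60, 2, 3],
              [90, 5, 5], [75, 3, 4], [80, 4, 2]], dtype=float)
# ...and the overall ratings the decision maker actually assigned.
ratings = np.array([6, 8, 4, 9, 7, 7], dtype=float)

# Infer the implicit weights (plus an intercept) by least squares.
A = np.column_stack([np.ones(len(X)), X])
weights, *_ = np.linalg.lstsq(A, ratings, rcond=None)

# Apply the bootstrapped model mechanically to a new applicant.
new_applicant = np.array([1, 82, 4, 3], dtype=float)  # leading 1 = intercept term
predicted_rating = new_applicant @ weights
```

A model of the judge built this way is typically more consistent than the judge it was built from, because the random component of the judgments is averaged out.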

Enhancing Selection Decisions
- Use reliable and valid selection procedures
- Do not make selection decisions based on intuition or gut feelings
- If possible, use an objective (mechanical) decision-making strategy
- Use specific procedures, weightings, and rating standards to create overall applicant scores
- Create cutoff scores using the modified Angoff procedure
- Use top-down selection to maximize job performance
- Use court-supported forms of banding to increase diversity
- Monitor post-selection successes and failures
- Periodically audit selection decisions to identify areas for improvement

Key Terms and Concepts
- Simple selection decisions
- Complex selection decisions
- False positives
- False negatives
- Mechanical information collection
- Judgmental information collection
- Multiple regression
- Multiple cutoffs
- Multiple hurdles
- Combination method
- Top-down selection
- Cutoff scores
- Banding
- Bootstrapping