
1 Chapter 6: Strategies for Selection Decision Making
Part 2: Foundations of Measurement for Human Resource Selection
©2011 Cengage Learning. All Rights Reserved.

2 Characteristics of Selection Decisions
Simple Selection Decisions
- Involve one position with several applicants
- Which person for a specific job?
Complex Selection Decisions
- Involve several applicants and several positions
- Which persons for which jobs?
Selection information processing demands = number of applicants × amount of selection data collected. For example, screening 50 applicants on 4 predictors yields 200 pieces of selection information to process.

3 Characteristics of Selection Decisions (cont'd)
Selection Decision Errors
- False positives (erroneous acceptances): applicants who appear acceptable but, once hired, perform poorly
- False negatives (erroneous rejections): applicants who initially appear unacceptable but would have performed successfully if hired
- True positives: applicants correctly predicted to succeed
- True negatives: applicants correctly predicted to fail
Improving Selection Decisions
- Use high-validity standardized selection procedures
- Employ proven decision-making strategies
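As a minimal illustration (not from the slides), the four decision outcomes can be tabulated by pairing each hiring decision with later job performance. All names and data below are hypothetical.

```python
# Hypothetical sketch: classifying selection decision outcomes.
# Each record pairs the hiring decision with observed (or counterfactual)
# job performance; the lookup table encodes the four outcome types.
outcomes = {
    ("hired", "succeeded"): "true positive",
    ("hired", "failed"): "false positive (erroneous acceptance)",
    ("rejected", "succeeded"): "false negative (erroneous rejection)",
    ("rejected", "failed"): "true negative",
}

applicants = [
    ("Avery", "hired", "succeeded"),
    ("Blake", "hired", "failed"),
    ("Casey", "rejected", "succeeded"),
]

for name, decision, performance in applicants:
    print(f"{name}: {outcomes[(decision, performance)]}")
```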

4 Enhancing Employment Decisions
Questions to be answered:
- For a specific selection situation, what are the best methods for collecting predictor information on job applicants?
- How should scores from multiple predictors be combined to give each applicant an overall or total score for selection decision-making purposes?
- Once an overall score on two or more predictors is obtained, how should it be used to make selection decisions?

5 TABLE 6.1 Modes of Collecting Predictor Information from Job Applicants

6 TABLE 6.2 Modes of Combining Predictor Information from Job Applicants

7 TABLE 6.3 Methods of Collecting and Combining Predictor Information from Job Applicants
NOTE: Judgmental: involves collecting/combining predictor information using human judgment. Mechanical: involves collecting/combining predictor information without using human judgment.
a Indicates (usually) superior methods for collecting and combining predictor information.

8 Combining Selection Data
Mechanical versus Judgmental Combination
- Mechanical combination increases predictive accuracy by applying consistent weights to predictor scores.
- The accuracy of statistical models increases as data on more applicants are added to the available data.
- Even systematic and thorough judgmental decision makers can do only as well as a statistical model.
- Decision makers add error when judgmentally combining subjective and objective data.
- Statistical models make an allowance for overconfidence errors and reduce the impact of individual biases on decision outcomes.
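A minimal sketch of mechanical combination (not from the text): predictor scores are standardized and combined with fixed weights, so every applicant is scored by exactly the same rule. The weights and scores below are hypothetical.

```python
import statistics

# Hypothetical predictor scores for three applicants
# (columns: interview, ability test, work sample).
applicants = {
    "A": [70, 85, 60],
    "B": [90, 75, 80],
    "C": [60, 95, 70],
}
weights = [0.3, 0.5, 0.2]  # fixed weights (illustrative, e.g., from job analysis)

# Standardize each predictor across applicants, then form the weighted composite.
cols = list(zip(*applicants.values()))
means = [statistics.mean(c) for c in cols]
sds = [statistics.stdev(c) for c in cols]

def composite(scores):
    z = [(s - m) / sd for s, m, sd in zip(scores, means, sds)]
    return sum(w * zi for w, zi in zip(weights, z))

for name, scores in sorted(applicants.items(), key=lambda kv: -composite(kv[1])):
    print(f"{name}: composite = {composite(scores):.2f}")
```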

9 Improving Selection Decisions
1. Be careful about relying too heavily on resumes and other information collected early in the selection process.
2. Use standardized selection procedures that are reliable, valid, and suitable for the specific selection purpose.
3. When feasible, use selection procedures that minimize the role of decision-maker judgment in collecting information.
4. Decision-maker judgment should not play a major role in combining the collected data into applicants' overall scores.

10 Strategies for Combining Predictor Scores
Mechanical Methods for Combining Predictor Information
- Multiple regression
- Multiple cutoffs
- Multiple hurdles
- Combination method

11 TABLE 6.4 Patient Account Representative Selection Procedure and Applicant Score Data
NOTE: The higher an applicant's score on a predictor, the better the applicant's performance on the predictor.

12 Strategy One: Multiple Regression
Assumptions
- The predictors are linearly related to the criterion.
- The predictors are additive and can compensate for one another.
Advantages
- Minimizes errors in prediction and combines the predictors to yield the best estimate of criterion status.
- A very flexible method that can be modified to handle nominal data, nonlinear relationships, and both linear and nonlinear interactions.

13 Strategy One: Multiple Regression (cont'd)
Disadvantages
- Assumed substitutability of predictor values
- Instability of weights when using small samples
- Requires assessing all applicants on all predictors
Best Use
- When predictor scores are interchangeable
- When sample size is large
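A minimal sketch of the regression strategy (assumptions: numpy is available, and the weights would normally be estimated on a proper validation sample with known criterion scores; all data here are hypothetical).

```python
import numpy as np

# Hypothetical validation sample: rows = past hires, columns = predictors
# (e.g., ability test, interview); y = observed job performance criterion.
X = np.array([[78, 62], [85, 70], [60, 55], [90, 80], [72, 65]], dtype=float)
y = np.array([3.2, 4.1, 2.5, 4.6, 3.4])

# Estimate least-squares weights (with intercept) from the validation sample.
X1 = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Predicted criterion score for a new applicant. Note the compensatory
# property: a low score on one predictor can be offset by a high score
# on another, since the weighted scores are simply added.
applicant = np.array([1.0, 65, 85])
print("predicted performance:", applicant @ b)
```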

14 Strategy Two: Multiple Cutoffs
Assumptions
- A nonlinear relationship exists between the predictors and the criterion: a minimum amount of each predictor attribute is necessary.
- Predictors are not compensatory: a lack or deficiency in any one predictor attribute cannot be compensated for by having more of another.
Advantages
- All candidates in the pool are at least minimally qualified.
- Easy to explain to managers

15 Strategy Two: Multiple Cutoffs (cont'd)
Disadvantages
- Requires assessing all applicants on all predictors
- Identifies only the minimally qualified candidates; it does not rank them
Best Uses
- When physical abilities are essential for performance
- When a minimum level of performance on each predictor is necessary to perform the job safely (e.g., aircraft takeoffs and landings)
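A minimal multiple-cutoffs sketch (cutoffs and scores are hypothetical): an applicant passes only if every predictor score meets its minimum, so a high score on one predictor cannot rescue a deficient score on another.

```python
# Hypothetical minimum cutoffs for each predictor (noncompensatory screen).
cutoffs = {"strength_test": 60, "vision_test": 70, "job_knowledge": 50}

applicants = {
    "A": {"strength_test": 75, "vision_test": 72, "job_knowledge": 55},
    "B": {"strength_test": 95, "vision_test": 65, "job_knowledge": 90},
}

for name, scores in applicants.items():
    # Pass only if every predictor meets its cutoff; B is rejected on
    # vision despite very high scores elsewhere.
    passed = all(scores[p] >= c for p, c in cutoffs.items())
    print(name, "passes" if passed else "rejected")
```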

16 Strategy Three: Multiple Hurdles
Assumptions
- Each applicant must meet the minimum cutoff (hurdle) for each predictor, applied sequentially, before going on to the next predictor. Failing any hurdle eliminates the applicant from consideration.
- An alternative procedure is to calculate a composite multiple regression score for each applicant at each successive hurdle; whenever an applicant's probability of success drops below some set value, the applicant is rejected.

17 FIGURE 6.1 Double-Stage Multiple Hurdle Strategy

18 Strategy Three: Multiple Hurdles (cont'd)
Advantages
- All surviving candidates are minimally qualified on the predictors applied so far
- Not all predictors have to be applied to all candidates
Disadvantages
- Validity must be established for each predictor
- Time required to apply predictors sequentially
Best Uses
- When extensive post-hiring training is required
- When an essential KSA must be present at hiring
- When there are numerous applicants and expensive selection procedures
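A minimal multiple-hurdles sketch (stages, cutoffs, and scores are hypothetical): predictors are applied in sequence, typically cheapest first, and an applicant who fails a hurdle never takes the later, more expensive procedures.

```python
# Hypothetical hurdles, ordered from cheapest to most expensive to administer.
hurdles = [("application_screen", 50), ("ability_test", 60), ("work_sample", 70)]

def evaluate(scores):
    """Apply hurdles in order; stop at the first failure."""
    for predictor, minimum in hurdles:
        if scores[predictor] < minimum:
            return f"rejected at {predictor}"
    return "passed all hurdles"

# The function returns at the ability test, so in practice the costly
# work sample would never be administered to this applicant.
print(evaluate({"application_screen": 80, "ability_test": 55, "work_sample": 90}))
```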

19 Strategy Four: Combination Method
Assumptions
- There are minimum cutoffs for each predictor; multiple regression then creates an overall score.
Advantage
- Identifies and rank orders the acceptable candidates
Disadvantage
- All predictors must be applied to all applicants
Best Use
- When multiple cutoffs are acceptable, predictors can compensate for one another, the applicant pool is small, and the predictors cost about the same
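A minimal combination-method sketch (cutoffs, weights, and scores are hypothetical): applicants are first screened against cutoffs on every predictor, and the survivors are then rank ordered by a regression-style composite.

```python
# Hypothetical cutoffs and previously estimated regression weights.
cutoffs = {"test": 50, "interview": 40}
weights = {"test": 0.6, "interview": 0.4}

applicants = {
    "A": {"test": 55, "interview": 80},
    "B": {"test": 45, "interview": 95},  # screened out: below the test cutoff
    "C": {"test": 70, "interview": 60},
}

# Step 1: multiple-cutoff screen on all predictors.
qualified = {n: s for n, s in applicants.items()
             if all(s[p] >= c for p, c in cutoffs.items())}

# Step 2: rank order the qualified applicants by the weighted composite.
ranked = sorted(qualified,
                key=lambda n: -sum(weights[p] * qualified[n][p] for p in weights))
print("rank order:", ranked)  # -> ['C', 'A']
```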

20 Approaches for Making Employment Decisions
Employment Decisions
- Cutoff scores
- Top-down selection
- Banding

21 TABLE 6.5 The Ebel Method for Setting Cutoff Scores
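The table itself is not reproduced here, but as a rough sketch of the standard Ebel logic (an illustration under stated assumptions, not the book's worked example): SMEs sort test items into relevance-by-difficulty categories and judge, for each category, what proportion of items a borderline test taker would answer correctly; the cutoff is the sum across categories of items × judged proportion.

```python
# Hypothetical Ebel-method judgments: (number of items in the category,
# SME-judged proportion a minimally competent examinee answers correctly).
categories = {
    ("essential", "easy"): (10, 0.90),
    ("essential", "hard"): (8, 0.60),
    ("important", "easy"): (12, 0.80),
    ("important", "hard"): (10, 0.50),
}

# Cutoff = sum over categories of items x judged proportion correct.
cutoff = sum(n_items * p for n_items, p in categories.values())
total = sum(n for n, _ in categories.values())
print(f"cutoff score: {cutoff:.1f} of {total} items")  # -> 28.4 of 40 items
```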

22 TABLE 6.6 The Angoff Method for Setting Cutoff Scores
a SEM = Standard Error of Measurement.
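Again the table is not reproduced, but a minimal Angoff sketch under the standard assumptions (hypothetical judgments): each SME estimates, for every item, the probability that a minimally competent applicant answers it correctly; the estimates are averaged across SMEs and summed across items to give the cutoff.

```python
# Hypothetical Angoff judgments: rows = SMEs, columns = items; each value is
# the judged probability that a minimally competent applicant answers the
# item correctly.
judgments = [
    [0.9, 0.7, 0.5, 0.8],  # SME 1
    [0.8, 0.6, 0.6, 0.7],  # SME 2
    [0.9, 0.8, 0.4, 0.9],  # SME 3
]

# Average each item's estimates across SMEs, then sum across items.
n_smes = len(judgments)
item_means = [sum(col) / n_smes for col in zip(*judgments)]
cutoff = sum(item_means)
print(f"Angoff cutoff: {cutoff:.2f} of {len(item_means)} items")  # -> 2.87 of 4
```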

23 TABLE 6.7 Selected Guidelines for Using Cutoff Scores in Selection Procedures
- Cutoff scores are not required by legal or professional guidelines; first decide whether a cutoff score is necessary.
- There is no one best method of setting cutoff scores for all situations.
- If a cutoff score is to be used to set a minimum score requirement, begin with a job analysis that identifies levels of proficiency on essential knowledge, skills, abilities, and other characteristics. (The Angoff method is one that can be used.)
- If judgmental methods (such as the Angoff method) are used, include a 10 to 20 percent sample of subject matter experts (SMEs) representative of the race, gender, shift, etc., of the employee (or supervisor) group. Representative experience on the job under study is the most critical consideration in choosing SMEs.
- If job incumbents are used to develop the cutoff score applied to job applicants, consider setting the cutoff score one standard error of measurement below the incumbents' average score on the selection procedure.
- Set cutoff scores high enough to ensure that at least minimum standards of job performance are achieved.
NOTE: The guidelines are based on Wayne F. Cascio and Herman Aguinis, "Test Development and Use: New Twists on Old Questions," Human Resource Management 44 (2005): 227; and Wayne F. Cascio, Ralph A. Alexander, and Gerald V. Barrett, "Setting Cutoff Scores: Legal, Psychometric, and Professional Issues and Guidelines," Personnel Psychology 41 (1988): 1–24.
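A minimal sketch of the incumbent-based guideline above (reliability and scores are hypothetical): the standard error of measurement is SEM = SD × sqrt(1 − reliability), and the cutoff is set one SEM below the incumbents' mean score.

```python
import math
import statistics

# Hypothetical incumbent scores on the selection procedure and its reliability.
incumbent_scores = [72, 80, 68, 75, 77, 70, 83, 74]
reliability = 0.85  # e.g., a coefficient alpha estimate (illustrative)

mean = statistics.mean(incumbent_scores)
sd = statistics.stdev(incumbent_scores)
sem = sd * math.sqrt(1 - reliability)  # standard error of measurement

cutoff = mean - sem  # one SEM below the incumbent average
print(f"mean = {mean:.1f}, SEM = {sem:.2f}, cutoff = {cutoff:.1f}")
```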

24 FIGURE 6.2 Illustration of Fixed and Sliding Bands for Comparing Test Scores
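The figure is not reproduced, but a minimal sliding-band sketch under the standard SED-banding assumptions (all numbers hypothetical): the band width is C × SEM × √2, where C is a confidence multiplier such as 1.96, and scores within one band width of the current top score are treated as statistically indistinguishable.

```python
import math

# Hypothetical test scores (descending) and measurement statistics.
scores = [98, 96, 95, 92, 91, 88, 85]
sem = 2.5  # standard error of measurement of the test (illustrative)
C = 1.96   # 95% confidence multiplier

# Band width based on the standard error of the difference between two scores.
band_width = C * sem * math.sqrt(2)

# Scores within one band width of the top score fall inside the current band.
top = scores[0]
band = [s for s in scores if s > top - band_width]
print(f"band width = {band_width:.2f}; equivalent scores: {band}")
# -> band width = 6.93; equivalent scores: [98, 96, 95, 92]
```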

25 Advantages of Banding
- More flexibility to pursue workforce diversity when making hiring decisions
- Increased opportunity to emphasize secondary hiring factors not measured by traditional selection methods
- Broadens the selection perspective beyond a focus on maximum economic gain

26 Issues in Selection Decisions
Adopting a Procedure
- Should the selection procedure be sequential or nonsequential?
- Should the decision be compensatory or noncompensatory?
- Should the decision be based on ranking applicants or on banding acceptable applicants?

27 Issues in Selection Decisions (cont'd)
Assessing Job Performance
- What determines success on the job?
- What factors contribute to success, and how are they related?
Administrative Considerations
- How many applicants are expected?
- What is the cost of the selection devices to be used?
- How much time must be devoted to the selection decision?

28 Developing a Selection Decision-Making Strategy
Objective Decision-Making Strategy Model: appropriate when
- The same decision is made repeatedly
- Data on past decision outcomes are available
- The future will resemble the past
- Sound validation strategies are used
Bootstrapping
- Building a standardized, mechanical selection decision-making model by using regression analysis to infer the weights a decision maker applied to factors in past selection decisions (see the sketch below)
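A minimal bootstrapping sketch (numpy assumed; all data hypothetical): past judgmental ratings are regressed on the predictor factors the decision maker saw, recovering the implicit weights, which can then score future applicants mechanically and consistently.

```python
import numpy as np

# Hypothetical history: predictor factors seen for past applicants and the
# overall ratings the decision maker assigned to them.
factors = np.array([[80, 60, 70], [65, 90, 75], [70, 70, 85],
                    [90, 55, 60], [60, 80, 90]], dtype=float)
ratings = np.array([7.5, 7.0, 7.8, 7.2, 7.6])

# Infer the decision maker's implicit weights (with intercept) via least squares.
X = np.column_stack([np.ones(len(factors)), factors])
weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)

# The bootstrapped model now scores a new applicant with the same weights
# every time, removing the judge's occasion-to-occasion inconsistency.
new_applicant = np.array([1.0, 75, 72, 80])
print("modeled rating:", new_applicant @ weights)
```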

29 Enhancing Selection Decisions
- Use reliable and valid selection procedures
- Do not make selection decisions based on intuition or gut feelings
- If possible, use an objective (mechanical) decision-making strategy
- Use specific procedures, weightings, and rating standards to create overall applicant scores
- Create cutoff scores using the modified Angoff procedure
- Use top-down selection to maximize job performance
- Use court-supported forms of banding to increase diversity
- Monitor post-selection successes and failures
- Periodically audit selection decisions to identify areas for improvement

30 Key Terms and Concepts
Simple selection decisions, complex selection decisions, false positives, false negatives, mechanical information collection, judgmental information collection, multiple regression, multiple cutoffs, multiple hurdles, combination method, top-down selection, cutoff scores, banding, bootstrapping

