Social Media and Selection

1 Social Media and Selection
Frequency of use:
- 18% indicated that they have used social networking websites to screen applicants, while 11% planned on using such sites in the future (survey of over 400 organizations by the Society for Human Resource Management, 2011)
- 45% of employers used social networking sites to investigate job applicants (survey of over 2,600 hiring managers by Harris Interactive for CareerBuilder.com)
Consequences?
- 35% of organizations in the Harris survey said they did not hire candidates due to content available on social networking sites. The most common examples of negative information included:
  - Provocative attire
  - Images of drug or alcohol use
  - Complaints about previous employers
(TIP article on social media & selection)

2 Password Protection Act
- Introduced in March 2012
- Prohibits an employer from forcing prospective or current employees to provide access to their own private accounts as a condition of employment
- Prohibits employers from discriminating or retaliating against a prospective or current employee because that employee refuses to provide access to a password-protected account

3 EEOC Statements on Pre-Employment Inquiries
“Although Title VII does not make pre-employment inquiries concerning race, color, religion or national origin per se violations of the law, the Commission’s responsibility to equal employment opportunity compels it to regard such inquiries with extreme disfavor.”
“… in the investigation of charges alleging the commission of unlawful employment practices, the Commission will pay particular attention to the use by the party against whom charges have been made of pre-employment inquiries concerning race, religion, color, or national origin, or other inquiries which tend directly or indirectly to disclose such information. The fact that such questions are asked may, unless otherwise explained, constitute evidence of discrimination, and will weigh significantly in the Commission’s decision as to whether or not Title VII has been violated.”

4 Application Blanks
- Content of items (use of job analysis)
- Number of application blanks (one for each position or job category)
- Legal issues
- Image of organization (e.g., format, recruitment issue, perceived fairness)
- Accuracy of data
  - Applicants overstated length of employment and past salary on application forms (Goldstein, 1971)
  - One serious lie on 25% of application forms and resumes (LoPresto, Mitcham, & Ripley, 1986)
  - 40% to 60% of candidates overstated their qualifications on resumes (George & Marett, 2005)
  - The most frequent falsifications are job history, job duties, educational record, position title, and previous salary (Broussard & Brannen, 1986)

5

6 Increasing Application Form Accuracy
- Inform applicants, verbally and in writing, that the information they furnish will affect their employability
- Inform applicants that the data they provide will be thoroughly checked
- Require applicants to sign a statement certifying the accuracy of the information they provided on the form
- Include warnings of penalties (not being hired, or termination upon discovery) for deliberate falsification
- Include a statement that the application does not create a binding obligation of employment for any specific period of time

7 Previous research studies examining employment applications
by date of study
- Wallace & Vodanovich (2002): Fortune 500 sample = 2.99 inappropriate items; customer service sample = 5.35 inappropriate items. Sample: 191 Fortune 500 finance/accounting applications; 109 customer service applications (e.g., retail, food service)
- Wallace, Tye, & Vodanovich (2000): Average of 4.2 inappropriate items; most problematic: salary, age, driver's license. Sample: 42 online state general employment applications
- Vodanovich & Lowe (1992): Average of 7.4 inappropriate items; most problematic: age, convictions, & salary. Sample: retail; 46 categories
- Jolly & Frierson (1989): 25% of 20 categories were problematic (e.g., salary). Sample: 283 random applications from American Society of Public Administration members; 20 categories
- Coady (1986): Most problematic: improper use of EEO worksheets. Sample: 50 state libraries; 25 categories
- Lowell & DeLoach (1982): Most problematic: military service & age. Sample: 50 US firms; 17 categories
- Burrington: Average of 7.7 inappropriate items. Sample: 50 general state applications; 30 categories
- Miller (1980): Average of 9.74 inappropriate items. Sample: 151 of the Fortune 500; 72 categories
Note: Adapted from Wallace, Tye, and Vodanovich (2000).

8 Frequency of Common Inappropriate Application Blank Questions
Percentages by item (response categories in the original table: not appropriate, worded appropriately, not asked):
- Past salary: not appropriate 98.9, worded appropriately 1.1
- Minimum salary: not appropriate 72.7, worded appropriately 27.2
- Age: not appropriate 54.5, worded appropriately 37.5, not asked 8.0
- Information about relatives: not appropriate 50.0, worded appropriately 10.2, not asked 39.8
- Conviction records: not appropriate 43.2, worded appropriately 28.4
- Health: not appropriate 40.9, worded appropriately 2.3, not asked 56.8
- Military service: not appropriate 30.7, worded appropriately 38.6
- Marital status: not appropriate 27.3
- Emergency contact: not appropriate 25.0, worded appropriately 31.8
Years of experience and previous salary are the strongest predictors of starting salary, and starting salary is the greatest predictor of current salary. (Mickey Silberman, Jackson Lewis LLP, Industry Liaison Conference, 2011)
From: Vodanovich & Lowe (1992), Public Personnel Management

9 Percentage of most commonly identified inadvisable application blank items, by sample
[From Wallace & Vodanovich (2004), Public Personnel Management]
Columns in the source table: inadvisable and legitimate percentages for the customer service and Fortune 500 samples. Values are listed in the order given:
- Desired Salary: 66.1, 15.0, 20.8, 5.2
- Personal Address*: 49.5, 0.0, 88.0, 2.5
- Lowest Acceptable Salary: 46.8, 19.2, 25.0, 1.2
- Graduation Date: 33.9, 3.1, 54.2, 3.5
- Work Schedule: 33.0, 36.2, 1.6, 0.6
- Conviction (w/o disclaimer): 26.6, 56.8, 3.6, 38.2
- References: 42.4, 23.6
- Gender (w/o EEO disclaimer): 25.7, 38.0, 20.3, 19.5
- Race (w/o EEO disclaimer): 24.8, 25.1, 15.1, 18.9
- Driver's License: 22.9, 16.2, 0.8
- Relatives: 21.1, 5.1, 2.6
- EEO Worksheet: 16.5, 8.1, 14.1, 25.6
- Handicap (w/o EEO disclaimer): 14.7, 3.2, 15.6
- Age (w/o EEO disclaimer): 15.8
- Language Fluency: 11.9, 1.5, 0.7
- Emergency Contact: 7.3, 0.5
- Marital Status: 3.7, 5.7, 0.3
- Personal Web Page Address: 16.7
- National Origin (w/o EEO disclaimer): 2.8, 1.8, 23.1

10 Effect of Name on Resumes and Interview Rates
The original table crossed name type ("White"-sounding vs. "Black"-sounding) with resume quality (low vs. high). Key result: applicants with "Black"-sounding names had a 50% lower chance of being invited for an interview compared with "Whites" with high qualifications.

11 Research on court cases: Most common application blank challenges were based on questions about sex (28%), age (25%), and race (12%). Source: Kethley & Terpstra, 2005

12 2012 EEOC Guidance on Arrest and Conviction Records
In 2012, the EEOC issued updated guidance on the use of arrest and conviction records in selection decisions, its first update since 1990.
- Refusing to hire those with arrest records is not considered justifiable. However, if the conduct underlying an arrest makes the person unfit for a given job, a decision not to hire may be legitimate.
- Conviction records are generally easier to defend from a legal perspective, but the EEOC stresses consideration of three factors: 1) the nature and severity of the offense, 2) the amount of time that has passed since the conviction (or completion of one's sentence), and 3) the nature and type of job sought.
- The EEOC suggests not asking about conviction records on application forms. If such questions are included, they ought to be restricted to convictions "for which exclusion would be job related for the position in question and consistent with business necessity."

13

14 SHRM Credit Background Check Survey Results
2004 survey: In general, how frequently does your organization, or an agency hired by your organization, check any of the following references for its job candidates?
Credit checks: Always 19%, Sometimes 24%, Rarely 18%, Never 39%
Survey margin of error: +/- 5%. Note: n = 296; excludes respondents who answered "Don't know."
Source: SHRM Reference and Background Checking Survey (2004)
2010 survey: Does your organization, or an agency hired by your organization, conduct credit background checks for any job candidates by reviewing the candidates' consumer reports?
Credit checks: All job candidates 13%, Select job candidates 47%, No 40%
Survey margin of error: +/- 5%. Note: n = 343; excludes respondents who answered "Not sure."
Source: SHRM Background Checking Survey (2010)

15 SHRM Survey on Use of Credit Background Checks (2010)
On which categories of job candidates does your organization conduct credit background checks?

16 SHRM Survey (cont.) When does your organization, or any agency hired by your organization, initiate credit background checks on job candidates?

17 SHRM Survey (cont.) Does your organization allow job candidates, in certain circumstances, the opportunity to explain the results (e.g., high debt, bankruptcy, etc.) of their consumer report that might have an adverse effect on an employment decision?

18 SHRM Credit Check Survey Research Summary
- The use of credit background checks in employment decisions has not changed in any discernible way over the past 6 years.
- Most organizations do not conduct credit background checks on all job candidates.
- Organizations conduct credit background checks for those positions where this information is most job-relevant.
- Employers place lower relative importance on credit background checks than on other job-related factors in making hiring decisions.
- Employers do not use credit background checks to screen out large numbers of candidates in the early phases of the application process.
- Credit background check results are seldom used as a definitive hiring criterion.

19 Two large studies, by the Federal Reserve System in 2003 and Freddie Mac in 2000, concluded that Asians and Whites have higher credit scores than Hispanics and African Americans.
Meta-analysis of credit history and work criteria (k = number of studies, N = total sample size, r = sample-size weighted uncorrected average correlation):
- Work problems: k = 10, N = 7,464, r = .149
- Discipline: k = 5, N = 5,946, r = .131
- Absenteeism: k = 6, N = 1,678, r = .211
- Performance ratings: k = 3, N = 561, r = .069
Credit score: a number that provides a "snapshot" over a certain period of time (not shown to employers)
Credit report: information about an individual's debt over a longer time frame than a credit score
From: Statement of Michael Aamodt, Ph.D., Principal Consultant, DCI Consulting Group, Inc., EEOC Meeting of October 20, Employer Use of Credit History as a Screening Tool
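The r values in the table above are sample-size weighted averages. As a quick illustration of how such a weighted mean correlation is computed, here is a minimal Python sketch; the study-level (N, r) pairs are hypothetical and are not the actual studies behind the table.

```python
# Minimal sketch (not from the source): sample-size weighted average correlation,
# the statistic reported as r in the meta-analysis summary above.
# The study-level values below are hypothetical, for illustration only.

def weighted_mean_r(studies):
    """studies: list of (sample_size, correlation) tuples."""
    total_n = sum(n for n, _ in studies)
    return sum(n * r for n, r in studies) / total_n

# Three hypothetical absenteeism studies (not the k = 6 studies summarized above)
absenteeism_studies = [(400, 0.25), (800, 0.20), (300, 0.18)]
print(round(weighted_mean_r(absenteeism_studies), 3))  # about 0.21
```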

20 A recent study found credit scores to be predictive of certain work-related outcomes and of Big Five personality scores.
- Credit scores were significantly and negatively related to supervisor ratings of task performance and of employee engagement in OCBs.
- Credit scores were also predictive of Big Five personality scores: greater conscientiousness and lower agreeableness.
- Credit scores did not predict supervisor ratings of workplace deviance (e.g., theft, aggressiveness).
- However, the authors caution against using credit scores absent data demonstrating their job relatedness for particular jobs, and note the potential for adverse impact.

21 Example: Scoring resume data for sales and accounting jobs
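The original slide's scoring example is not reproduced in this transcript, so the sketch below is only a generic illustration of how resume or application data might be scored with job-specific point weights; all item names and weights are hypothetical.

```python
# Illustrative sketch only (hypothetical items and weights): job-specific point
# weights applied to resume/application items, with separate keys per job family.

SALES_WEIGHTS = {
    "years_sales_experience": 3,    # points per year of relevant experience
    "bachelors_degree": 1,          # 1 if held, 0 otherwise
    "met_sales_quota_last_year": 4,
}

ACCOUNTING_WEIGHTS = {
    "years_accounting_experience": 3,
    "bachelors_degree": 2,
    "cpa_license": 4,
}

def score_resume(resume, weights):
    """Sum of weight * applicant value for each scored item (missing items count as 0)."""
    return sum(weight * resume.get(item, 0) for item, weight in weights.items())

applicant = {"years_sales_experience": 5, "bachelors_degree": 1, "met_sales_quota_last_year": 1}
print(score_resume(applicant, SALES_WEIGHTS))       # 5*3 + 1*1 + 1*4 = 20
print(score_resume(applicant, ACCOUNTING_WEIGHTS))  # 0*3 + 1*2 + 0*4 = 2
```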

22 Characteristics of T&E Evaluations (e.g., information from application blanks, resumes)
- A listing or description of tasks, KSAs, or other job-relevant content areas
- A means by which applicants can describe, indicate, or rate the extent of their training or experience with these job content areas
- A basis for evaluating or scoring applicants' self-reported training, experience, or education (a minimal data sketch of these components appears below)
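A minimal sketch, assuming a hypothetical accounting task list and a 0-3 experience scale, of how the three components above can be represented and scored:

```python
# Sketch only: hypothetical task list, self-rating scale, and scoring basis for a
# T&E evaluation. Task wording and scale anchors are illustrative assumptions.

EXPERIENCE_SCALE = {
    0: "No training or experience",
    1: "Completed training only",
    2: "Performed under supervision",
    3: "Performed independently",
}

TASKS = [
    "Prepare monthly financial statements",
    "Reconcile general ledger accounts",
    "Respond to external audit requests",
]

def score_te_responses(ratings):
    """ratings: dict mapping task -> scale value (0-3); returns total self-rated points."""
    return sum(ratings.get(task, 0) for task in TASKS)

applicant_ratings = {
    "Prepare monthly financial statements": 3,
    "Reconcile general ledger accounts": 2,
}
print(score_te_responses(applicant_ratings))  # 3 + 2 + 0 = 5
```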

23 Some Uses of Training and Experience Evaluations (e.g., gleaned from application blank information)
- As the sole basis for deciding whether an individual is or is not minimally qualified
- As a means for rank-ordering individuals from high to low based on a T&E score
- As a basis for prescreening applicants prior to administering more expensive, time-consuming predictors (for example, an interview)
- In combination with other predictors used for making an employment decision

24 Sample T&E Evaluation

25 Decision-Making Methods for T&E Data
Holistic Judgment
- An informal, unstructured approach that an individual takes when reviewing an application or T&E form
- The individual makes a cursory review of the information and arrives at a broad, general judgment of the applicant's suitability
- Because of its unstandardized nature and unknown reliability and validity, it should be avoided as an approach to T&E evaluations

26 Decision-Making Methods for T&E Data (cont.)
Point Method
- A pre-established rating system for crediting applicants' prior training, education, and experience considered relevant to the job
- Points are assigned based on the recency, type, and amount of training, job experience, and education received
- Analysts using the point method make their ratings and then sum the credited points assigned (a brief scoring sketch follows below)
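A minimal sketch of a point-method scoring rule; the point values, degree categories, and recency adjustment are illustrative assumptions rather than an established key, which in practice would be pre-established from a job analysis:

```python
# Hedged sketch of a point-method rating: credit for amount, type, and recency of
# relevant experience and education, summed into a total score. Values are illustrative.

def point_method_score(years_relevant_experience, highest_degree, years_since_last_relevant_job):
    points = 0
    # Amount of relevant experience (capped so very long tenure does not dominate)
    points += min(years_relevant_experience, 10)
    # Type of education
    points += {"none": 0, "associate": 2, "bachelor": 4, "graduate": 6}.get(highest_degree, 0)
    # Recency: small penalty if the relevant experience is more than five years old
    if years_since_last_relevant_job > 5:
        points -= 2
    return max(points, 0)

print(point_method_score(7, "bachelor", 1))   # 7 + 4 = 11
print(point_method_score(12, "graduate", 8))  # 10 + 6 - 2 = 14
```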

27 Decision-Making Methods for T&E Data (cont.)
Grouping Method
- Divides applicants into groups that best represent each applicant's level of qualifications; the number of groups used depends on the particular situation
- High group: suitable applicants well qualified for the job
- Middle group: applicants not fitting in either the high or low group
- Low group: applicants with minimum qualifications but poorly suited because of limited experience or training
- Unqualified group: applicants lacking minimum qualifications

28 Decision-Making Methods for T&E Data (cont.)
Behavioral Consistency Method
- Applicant descriptions of achievements related to key job requirements or competencies are formally scored using scales derived from subject matter experts (SMEs)
Principles of the method:
- Behaviors evaluated have been identified by SMEs as showing differences between superior and minimally acceptable workers
- Applicants' past accomplishments can be reliably rated by SMEs
- Past accomplishments are considered predictive of future behaviors

29 Sample of the Behavioral Consistency Method
Conducting Empirical Research: concerns the conduct of research activities, including designing a research study, collecting and analyzing data to test specific research hypotheses or answer research questions, and writing up research results in the form of a formal report.
For the behavior Conducting Empirical Research defined above, think about your past activities and accomplishments. Then write a narrative description of your activities and accomplishments in the space below. In your description, be sure to answer the following questions:
1. What specifically did you do? When did you do it?
2. Give examples of what you did that illustrate how you accomplished the above behavior.
3. What percentage of credit do you claim for your work in this area?
Description: During my senior year (2005–2006), I wrote a senior research thesis as a partial requirement for graduation with honors in psychology. I designed a research study to investigate the effects of interviewer race on interviewee performance in a structured interview. I personally designed the research study and conducted it in a metropolitan police department. White and African-American applicants for the job of patrol police officer were randomly assigned to White and African-American interviewers. After conducting an analysis of the patrol police job, a structured interview schedule was developed. The various interviewee-interviewer racial combinations were then compared in terms of their performance in the structured interview. I consider the vast majority of the work (80 percent) to be my own. My major professor accounted for about 20 percent of the work. Her work consisted of helping to obtain site approval for the research, helping to design the study, and reviewing my work products.
Name and address of an individual who can verify the work you described above:
Name: Dr. Amy Prewett
Address: Department of Psychology, Pascal Univ., State College, ID
Phone:

30 Decision-Making Methods for T&E Data (cont.)
Task-Based Method
- Critical job tasks identified from a comprehensive job analysis serve as the basis for the method
- Applicants indicate on a list of tasks whether they have performed each task and, if so, how often
- Applicants furnish specific information so that their self-ratings can be verified
KSA-Based Method
- Similar to the task-based method, except that applicants self-rate KSAs rather than tasks on the questionnaire

31 Psychometrics of T&E Evaluations
Reliability
- T&E evaluations show high inter-rater reliability estimates (.80s), with the task-based method producing the highest reliability coefficients and the grouping method the lowest
Validity
- Validity of T&E ratings varies with the type of procedure used
- The behavioral consistency method has demonstrated the highest validity
- The point- and task-based methods show useful validities for applicant groups with low levels of job experience

32 Research Findings on T&E Evaluations
- Consistently predict important work outcomes
- Vary significantly in the strength of their predictive validity
- Some methods of evaluating experience and training exhibit substantial correlations with success (e.g., the behavioral consistency method, GPA)
- Other methods reflect low validities (e.g., the point method)
- Are particularly valuable for the first three to five years on the job

33 T&E Recommendations
- Use T&E evaluations to set specific minimum job qualifications (KSAs) rather than as a general selection standard
- Replace holistic methods with competency-based approaches (behavioral consistency and grouping methods)
- T&E evaluations are subject to the Uniform Guidelines
- Use T&E evaluations only as rough screening procedures for positions where previous experience and training are necessary
- Standardize the forms and procedures for collecting and scoring T&E evaluations as much as possible
- Verify self-reported data, particularly data provided by applicants who are going to be offered a job
- Base final hiring decisions on other selection measures when distortion of self-evaluation information is likely to be a problem

34 Reference Checks
(An exceptionally common technique; e.g., 95% usage by organizations)
Basic purposes:
- Verify information provided by the applicant (check for inconsistencies)
- Uncover unreported or additional information (done by over half of organizations)
- Predict job performance (pass-or-fail decisions made by 52%)

35 Types of Information Collected
- Employment dates
- Rehire?
- Salary history
- Absenteeism, tardiness
- Qualifications for a certain type of job or work

36 Sources of Reference Data
- Supervisor (most common and most useful)
- Personal references
- Agencies (e.g., credit ratings)
- Public records (criminal background, driving records, court records, workers' compensation)
- Educational background (verification)

37 Reference Check Methods
In-Person (e.g., interview)
- Costly and time consuming
- Used for jobs that involve concern about risk (e.g., security, $)
- Can elicit different types of information (differences exist between in-person and written reference information)
Mail (or e-mail)
- Low return rate with "snail" mail (e.g., 56-64%); many references instead choose to set up phone reference checks via e-mail
- Standardized questions and format
- Written record of responses
- Ensure confidentiality of responses (signed statement by applicant)

38 Telephone Checks
(More frequently used than written references)
- Allow follow-up or clarification of answers given
- Less resistance to providing certain types of information
- Relatively quick process
- Important data can be gleaned from verbal cues (e.g., pauses, hesitations, voice inflections, voice level, intonation)
- Relatively high return rate (especially if a time was set up earlier)
- Better responsiveness due to the more interactive nature of the method
- More confidence in the identity of the responder

39 Reference Check Recommendations
- Use job-related questions (e.g., KSAs from a job analysis)
- Use multiple reference check forms (job specificity)
- Follow provisions contained in the Uniform Guidelines (e.g., regarding fairness, validity)
- Use a behaviorally focused and objective set of questions
- Get written permission from applicants
- Train interviewers (phone, in-person) and keep records
- Ask for additional references if the ones submitted are not available
- Verify the information that is collected!

40 Usefulness of Reference Information
- Relatively low validity; modest relationships with performance measures (e.g., .14, .16)
- Relatively low interrater reliability (e.g., .40, though sometimes the raters are from different sources)
Most useful if:
- Data are collected from the immediate supervisor
- The referee knows the applicant well (has had a chance to observe job behavior) and has similar demographic characteristics
- The prior job is similar to the one being applied for

41 Reference Checks --- Legal Issues
Negligent Hiring
- An injury to a third party is caused by an employee
- The employee is shown to be unfit for the job that he or she holds
- The employer knew, or should have known (e.g., had a background or criminal check been conducted), that the employee was unfit
- The injury to the third party was a foreseeable outcome of hiring the unfit employee
- The injury is a reasonable and probable outcome of what the employer did or did not do in hiring the individual

42 Letters of Recommendation
(Mainly used for highly skilled or professional jobs)
Some generic indicators:
- Meaning of certain adjectives (e.g., mental-ability adjectives are related to performance; cooperativeness/personality adjectives are not)
- Number of words used or length of letter (longer letters tend to be better)
Concerns:
- Pre-selection of referees (often only positive information is included)
- Verbal and organizational skill of the writer
- Unstructured content
- Omissions
- Time availability
- Subjective scoring (e.g., focus on irrelevant information, status of the writer)

43 Biographical Data (Bio-Data) Process
Developing biodata items:
- Task/job-based approach: to assess safety orientation, ask "How often have you been a participant in safety meetings?" Assumes prior experience in the job domain.
- KSAO approach: assesses a job-related construct such as initiative, e.g., "On past jobs, how often did you volunteer to work on specific projects?" More generic, but not explicitly related to job duties.

44 Item Screening
- Relevance of the item to the intended construct or to the job/tasks
- Potential overlap with other constructs
- Social desirability and/or faking potential
- Potential for bias against protected groups
- Privacy concerns

45 Classification of Biographical Items
- Historical: How old were you when you got your first job?
- Hypothetical: What job do you think you'll have in 10 years?
- External: Did you ever get fired from a job?
- Internal: What is your attitude toward friends who smoke marijuana?
- Objective: How many hours did you study for your math tests?
- Subjective: Would you describe yourself as shy?
- First-hand: How punctual are you about coming to work?
- Second-hand: How would your teachers describe your punctuality?
- Discrete: At what age did you receive your driver's license?
- Summative: How many hours do you study during an average week?
- Verifiable: What was your GPA in college?
- Non-verifiable: How many fresh vegetables do you eat daily?
- Controllable: How many times did you drop classes in college?
- Non-controllable: How many siblings do you have?
Best results are obtained when items are historical, objective, discrete, job-relevant, and external.

46 1. Yes-No Response: Are you satisfied with your life?
   a. Yes   b. No
2. Continuum, Single-Choice Response: About how many fiction books have you read in the past year?
   a. None   b. 1 or 2   c. 3 or 4   d. 5 or 6   e. More than 6
3. Noncontinuum, Single-Choice Response: Which one of the following would you most prefer to do in your leisure time?
   a. Read a book   b. Work crossword puzzles   c. Attend a party   d. Play golf, tennis, or softball   e. Repair a broken appliance or make minor home repairs
4. Noncontinuum, Multiple-Choice Response: Check each of the following activities you had participated in by the time you were 18.
   a. Shot a rifle   b. Driven a car   c. Worked a full-time job   d. Traveled alone more than 500 miles from home   e. Repaired an electrical appliance

47 5. Continuum, Plus Escape Option: When you were a teenager, how often did your father help you with your schoolwork?
   a. Very often   b. Often   c. Sometimes   d. Seldom   e. Never   f. Father was not at home
6. Noncontinuum, Plus Escape Option: In what branch of the military did you serve?
   a. Army   b. Air Force   c. Navy   d. Marines   e. Never served in the military
7. Common Stem, Multiple Continuum: In the last 5 years, how much have you enjoyed each of the following? (Use a rating scale of 1 to 4: (1) Very much, (2) Some, (3) Very little, (4) Not at all.)
   a. Reading books   b. Watching TV   c. Working at your job   d. Traveling   e. Outdoor recreation

48 Classification of Biographical Items (cont.)
1. Verifiable: Did you graduate from college? / Unverifiable: How much did you enjoy high school?
2. Historical: How many jobs have you held in the past five years? / Futuristic: What job would you like to hold five years from now?
3. Actual Behavior: Have you ever repaired a broken radio? / Hypothetical Behavior: If you had your choice, what job would you like to hold now?
4. Memory: How would you describe your life at home while growing up? / Conjecture: If you were to go through college again, what would you choose as a major?
5. Factual: How many hours do you spend at work in a typical week? / Interpretive: If you could choose your supervisor, what characteristic would you want him or her to have?
6. Specific: While growing up, did you collect coins? / General: While growing up, what activities did you enjoy most?
7. Response: Which of the following hobbies do you enjoy? / Response Tendency: When you have a problem at work, to whom do you turn for assistance?
8. External Event: When you were a teenager, how much time did your father spend with you? / Internal Event: Which best describes the feelings you had when you last worked with a computer?

49 Biodata Scoring: Empirical Keying Approach
- Items are keyed by their correlation with a criterion, typically by contrasting item responses of high and low performance groups (e.g., median split, upper vs. lower thirds)
- Issues:
  - Validity of the criterion measure
  - Contamination/bias in the criterion measure
  - Items (and their weights) are specific to the criterion
(A brief keying sketch follows below.)
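A minimal sketch of one simple empirical keying scheme (a percentage-difference key contrasting high and low performance groups); the endorsement rates, weighting threshold, and item names are hypothetical, and other weighting schemes exist:

```python
# Minimal sketch of empirical keying (illustrative data only): an item earns a
# positive, negative, or zero weight depending on how strongly its endorsement
# rate differs between high and low performance groups.

def item_weight(p_high, p_low, threshold=0.10):
    """p_high / p_low: proportion of each criterion group endorsing the item."""
    diff = p_high - p_low
    if diff >= threshold:
        return 1
    if diff <= -threshold:
        return -1
    return 0

# Hypothetical endorsement rates for three biodata items
key = {
    "held_part_time_job_in_school": item_weight(0.70, 0.40),  # +1
    "prefers_working_alone": item_weight(0.30, 0.55),         # -1
    "owns_a_pet": item_weight(0.50, 0.48),                    #  0
}

def biodata_score(responses, key):
    """responses: dict item -> 1/0 endorsement; returns sum of weighted responses."""
    return sum(weight * responses.get(item, 0) for item, weight in key.items())

applicant = {"held_part_time_job_in_school": 1, "prefers_working_alone": 1, "owns_a_pet": 1}
print(biodata_score(applicant, key))  # 1 - 1 + 0 = 0
```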

50 Rational Scoring Approach (assessment of how well items measure the intended construct, typically based on expert judgment)

51 Summary of Bio-Data Validity Studies

52 Bio-Data (cont.)
- Reliability: .60 to .80 across several studies (higher for more verifiable items)
- Validity: many validity coefficients above .30
- Accuracy: some distortion exists, mainly on unverifiable items (e.g., interests, preferences) and more so when the desirability of answers is apparent (i.e., faking can occur)

53 Bio-Data: Why Does It Work?
- Use of life history items, e.g., personal background, life experiences, interests (past behavior is the best predictor of future behavior)
- Only relevant (empirically significant) items/constructs are selected, based on the correlation between BIB content and the criterion
- Wide range of information (many different questions and item types)

54 Some Bio-Data Issues
- Situational specificity
- Need for a large sample to construct the key properly
- Assumption of a "correct" life history
- Purely empirical approach (e.g., versus a content-based approach)
- Legal issues (e.g., adverse impact, validity, reliability)

55 Eliminate an item from the Bio-data inventory if the item:
- Exhibits little response variance
- Has a skewed response distribution
- Is correlated with protected-group characteristics such as ethnicity
- Has no correlation with other items thought to be measuring the same life history construct
- Has no correlation with the criterion (no item validity)
(A screening sketch follows below.)
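A hedged Python sketch of these screening rules; the numeric cutoffs are illustrative assumptions rather than prescribed standards, and in practice they would be set by the test developer:

```python
# Hedged sketch of the item-screening rules above (all cutoffs are illustrative).
import numpy as np
from scipy.stats import skew

def flag_item(item, same_construct_total, criterion, protected_group):
    """Each argument is a 1-D array of applicant-level values: responses to one
    biodata item, a total score on other same-construct items, a criterion
    measure, and a protected-group indicator (0/1).
    Returns a list of reasons to drop the item; an empty list means retain it."""
    reasons = []
    if np.var(item) < 0.05:                                   # little response variance
        reasons.append("low variance")
    if abs(skew(item)) > 2.0:                                 # badly skewed distribution
        reasons.append("skewed response distribution")
    if abs(np.corrcoef(item, protected_group)[0, 1]) > 0.30:  # relates to protected status
        reasons.append("correlated with protected-group membership")
    if np.corrcoef(item, same_construct_total)[0, 1] < 0.20:  # no convergence with construct
        reasons.append("no correlation with same-construct items")
    if abs(np.corrcoef(item, criterion)[0, 1]) < 0.05:        # no item validity
        reasons.append("no correlation with the criterion")
    return reasons
```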

