
1 Dr. Frank Olmos and Joshua Kahn, Los Angeles County Office of Education. PTC-SC Luncheon Presentation, Monterey Park, CA, May 29, 2013

2 Let’s do the math…
High unemployment
+ Reduced public-sector budgets
+ Internet recruiting and broad outreach
+ Online applications: low-effort applying and reduced self-filtering
= Large applicant pools: a workload problem for many merit systems

3 Strategies to reduce the pool
• Limit exposure
▪ Limit posting sites (no Monster.com)
▪ Paper bulletins only
▪ Shorten the posting period
▪ Do not accept “interest cards”
• Limit the application window
▪ Open for one day or one hour
▪ Cut off the posting at x number of applicants (first come, first served for testing slots; self-schedule)
• Increase the burden to apply
▪ Paper application, hand pickup and delivery
▪ Charge per application ($5 is legal)
▪ Increase the requirements for a complete application (proof of diploma, degree, transcripts)
• Random selection
▪ Lottery
These will reduce your pool, but…
• At best, they will not improve the average level of ability
• They may skew the pool toward the lower end
• None are merit based
• They damage your image as a public employer, affecting other recruitments
• They increase protests and appeals
How will you defend this?

4 The opportunity
• Large applicant pools increase the potential utility of a valid selection procedure.
• Utility: the expected organizational gain from your selection procedure
▪ Gain #1: success rate of those hired
▪ Gain #2: monetary value of higher performance
• Validity: the ability of your selection procedure to identify top talent
• Selection ratio: job openings (n) divided by the number of job applicants (N). Lower ratio = higher selectivity (see the sketch below).
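
To make these definitions concrete, here is a minimal Python sketch, not from the presentation, that computes a selection ratio and estimates the dollar gain using the standard Brogden-Cronbach-Gleser utility formula, which formalizes the slide's "expected organizational gain." The openings, validity, and dollar figures are invented assumptions.

```python
from statistics import NormalDist

def selection_ratio(openings: int, applicants: int) -> float:
    """Selection ratio: job openings (n) / job applicants (N)."""
    return openings / applicants

def mean_z_of_selected(sr: float) -> float:
    """Mean standardized exam score of those hired under top-down
    selection from a normal pool: E[Z | Z > c] = pdf(c) / SR,
    where c is the cut score with P(Z > c) = SR."""
    cut = NormalDist().inv_cdf(1.0 - sr)
    return NormalDist().pdf(cut) / sr

# Illustrative assumptions: 5 openings, 500 applicants, a test with
# validity 0.35, and performance worth SD_y = $12,000/year per SD.
n, N = 5, 500
sr = selection_ratio(n, N)                      # 0.01 -> very selective
gain = n * 0.35 * 12_000 * mean_z_of_selected(sr)
print(f"SR = {sr:.1%}; expected annual gain ~ ${gain:,.0f}")
```

The point of the formula is exactly the slide's: holding validity constant, a lower selection ratio (a bigger pool for the same openings) raises the mean score of the selected group and therefore the expected gain.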

5 Large versus small pools
The likelihood of recruiting exceptional candidates increases with large pools.
[Figure: overlaid exam-score distributions for a large and a small pool; x-axis: Exam Score, y-axis: Number of candidates]

6 Large versus small pools
Large pools make it possible to identify meaningful pass points.

7 Put test validity to work
Large pools enable high pass points.
[Figure: scatter plot of Job Performance vs. Exam Score for current employees; mean score of selected group = 36, overall mean = 26]
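
A minimal simulation of the idea behind this figure (the 36 and 26 means are the presenters' data; the validity of 0.40 and pool size here are invented for illustration): with a valid exam, raising the pass point raises the expected performance of the selected group, and only a large pool leaves enough candidates above a high pass point.

```python
import random

random.seed(0)
r = 0.40                                   # assumed exam validity
pool = []
for _ in range(2000):                      # a large applicant pool
    z = random.gauss(0, 1)                 # standardized exam score
    # Performance correlated r with the exam score, plus noise.
    perf = r * z + random.gauss(0, (1 - r**2) ** 0.5)
    pool.append((z, perf))

def mean_performance_above(cut: float) -> float:
    """Mean standardized performance of applicants at or above the pass point."""
    selected = [perf for z, perf in pool if z >= cut]
    return sum(selected) / len(selected)

for cut in (0.0, 1.0, 2.0):                # progressively higher pass points
    print(f"pass point z >= {cut:.1f}: "
          f"mean performance {mean_performance_above(cut):+.2f}")
```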

8 Challenge: How can you be efficient without sacrificing quality?
1. Analyze where large pools take the most effort.
2. Use the most efficient job-related screening tools first.

9 Job-Related Strategies
1. Self-selection
2. Efficient first-stage screening
3. Broad-based testing

10 Self-selection
1. Focused Recruitment
2. Clear Job Descriptions and Requirements
3. Realistic Job Previews

11 Self-selection: Focused Recruitment
• Occupation-related websites
▪ Dice.com, OhSoEZ.com
• Social networking websites
▪ Professional bulletin boards and discussion groups
▪ LinkedIn Groups
• Regular community-oriented networking
• Old-fashioned paper bulletins and local media distribution
• Career preparatory schools and training organizations
• PTA and booster organizations
• Internal email, intranet, and word-of-mouth by employees

12 Self-selection: Clear Requirements and Expectations
• Clear duty statements and minimum qualifications
▪ “Any combination of education and experience…”??
• Realistic job preview
• A clear statement of the rewards and challenges of the job in the bulletin and other job-description information
• Web-based preview/orientation
▪ Example: Paraeducator, Special Education
▪ A 3-hour online workshop that must be completed as part of the application process
▪ Uses Adobe Connect

13 Efficient first-stage testing
1. Automated objective written tests
2. Un-proctored Internet testing
3. Auto-scored supplemental questionnaires

14 Efficient first-stage testing: Objectively Scored Testing
• Scantron (mark-sense answer sheets) or NCS automated scoring and score upload
• The workhorse of selection procedures, and still among the most efficient
• Test BEFORE screening for minimum qualifications
▪ Review minimums only for those who pass the test

15 Efficient first-stage testing: Un-proctored Internet Testing (UIT)
• Basic skills testing (reading, writing, math, reasoning, etc.)
▪ Offered by all major test publishing and consulting firms
▪ Candidates register or apply online
▪ Computer administered and timed
▪ May use computer-adaptive testing
▪ Candidate cheating and impersonation remain issues
▪ Those who pass are invited in for verification testing and subsequent exam parts (performance, orals, etc.)
• Completion of an online workshop, with testing afterward
▪ Pair the online preview/orientation with a multiple-choice exam
• Online empirically keyed personality testing, situational judgment, biodata, and occupational preference/suitability measures
▪ Less prone to faking and cheating

16 Efficient first-stage testing: Supplemental Questionnaires
1. As part of the application process
▪ Could be just a few questions focused on MQs
▪ Could be an automated tally of years of experience and level of education
▪ Potentially useful, but a blunt instrument
▪ Prone to candidate inflation due to “interpretation”
2. Based on a job analysis that identifies the areas to assess
▪ Type and level of duties and related tasks
▪ Skills and facets of those skills
▪ Each area is broken down into individual questionnaire sections and items
▪ Should require some form of verification

17 Example Items: Skills (MS Office)
To what extent have you performed the following EXCEL/spreadsheet tasks?
1. Set up rows and columns of data
2. Cut and paste data from one spreadsheet into another
3. Write formulas in cells that reference more than one worksheet
4. Create pivot tables
5. Create graphs
6. Use built-in statistical functions
7. Write macros to automate routine functions
To what extent have you performed the following WORD/word processing tasks?
1. Formatted documents with section breaks
2. Formatted documents in “column” format
3. Created and formatted tables
4. Inserted pictures and graphs
5. Created locked forms with "form fields"
6. Used the "track changes" feature in document editing
7. Used mail merge to insert Excel source data into templates

18 Supplemental questionnaires: Points per Response (example)
0 = I have no background in this
0 = I know what this is, but I have not done it
1 = I have assisted others (or received training) in this, but have not done it independently
2 = I have done this independently, but not frequently
4 = I have done this as a regular part of my job
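
A minimal sketch of auto-scoring with the scale above. The item names and candidate responses are hypothetical, echoing the Excel items on slide 17; only the point values come from the slide.

```python
# Points keyed to the response scale above; two responses both earn 0.
SCALE = {
    "no background": 0,
    "know what it is, have not done it": 0,
    "assisted others or received training only": 1,
    "done independently, but not frequently": 2,
    "regular part of my job": 4,
}

def score_questionnaire(responses: dict[str, str]) -> int:
    """Sum the scale points for each item's selected response."""
    return sum(SCALE[answer] for answer in responses.values())

# Hypothetical candidate responses to three spreadsheet items:
candidate = {
    "set up rows and columns of data": "regular part of my job",
    "create pivot tables": "assisted others or received training only",
    "write macros to automate routine functions": "no background",
}
print(score_questionnaire(candidate))  # 4 + 1 + 0 = 5
```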

19 Supplemental questionnaires: Supporting Examples
• Describe an accomplishment that illustrates your skill level in conducting job classification work. (Please limit your response to 200 words.)
• Describe an example of a task you performed that illustrates your skill level with Excel/spreadsheets. (Please limit your response to 100 words.)
• Or, for skills, actual testing to verify (similar to UIT)

20 Supplemental questionnaires: Two-Stage Pass Points
Stage 1
1. Calculate the MAR: the sum of the minimally acceptable responses for each item.
2. Set the pass point based on self-rated scores. (If based on applicant-flow management, pass 10-20% more than you actually need.)
Stage 2
1. For those who exceed the pass point, have SMEs compare the supporting examples to the self-ratings.
2. SMEs apply a weight of 0, 0.5, or 1.0 to each self-rated area, where:
▪ 1.0 = the example is consistent with the self-ratings
▪ 0.5 = questionable support for the self-ratings
▪ 0.0 = lacking support for the self-ratings
3. Apply the corrected scores to the pass point to determine who has actually passed the hurdle. (A sketch of this correction follows.)
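
A minimal sketch of the two-stage correction described above. The area names, point totals, and pass point are invented for illustration; the 0 / 0.5 / 1.0 weights are the slide's.

```python
# Stage 1: self-rated points summed by area, screened against a pass point
# (e.g., the MAR sum, possibly raised to manage applicant flow).
self_ratings = {"job classification work": 10, "Excel/spreadsheets": 8}
pass_point = 12

if sum(self_ratings.values()) >= pass_point:
    # Stage 2: SMEs read the supporting examples and weight each area:
    # 1.0 = consistent with the self-rating, 0.5 = questionable, 0.0 = no support.
    sme_weights = {"job classification work": 1.0, "Excel/spreadsheets": 0.5}
    corrected = sum(points * sme_weights[area]
                    for area, points in self_ratings.items())
    verdict = "passes" if corrected >= pass_point else "fails"
    print(f"corrected score {corrected:.1f} {verdict} the hurdle")
```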

21 Broad-based testing
• A core test battery for as many classifications as possible
• Score banking and certification to different exam plans
• Different weights and cut-off scores for different classifications, based on relevancy and the level needed (see the sketch below)
• The College Board model (SAT, GRE, etc.)
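
A minimal sketch of score banking with per-classification weights and cut-offs. The classification titles echo slide 27, but every test name, score, weight, and cut-off here is a made-up illustration of the mechanism, not the presenters' actual exam plans.

```python
# One candidate's banked core-battery scores (0-100 per test).
banked = {"checking": 82, "english": 74, "data_entry": 90}

# Each exam plan weights the same banked tests differently and sets its
# own cut-off, per the slide's "different weights and cut-off scores".
EXAM_PLANS = {
    "Typist Clerk": ({"checking": 0.2, "english": 0.3, "data_entry": 0.5}, 75.0),
    "Senior Clerk": ({"checking": 0.4, "english": 0.4, "data_entry": 0.2}, 80.0),
}

for job, (weights, cutoff) in EXAM_PLANS.items():
    score = sum(banked[test] * w for test, w in weights.items())
    status = "certifiable" if score >= cutoff else "below cut-off"
    print(f"{job}: weighted score {score:.1f} -> {status}")
```

The candidate tests once; each new exam plan reuses the banked scores, which is what eliminates the per-classification testing cycle described on the next slide.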

22 Broad-based testing: What problem are we solving?
• Many related classifications
• Classifications differ on specific job content and levels of responsibility
• Many obstacles to class consolidation: logistical, political, fiscal
• Each class has a unique examination
• Variable content and quality, not consistently updated
• Many of the same candidates
• 2-3 month recruitment and examination cycle time
• No overall competency structure across the classes
• Lack of core competencies and competency progression
• Pass-point paradoxes (lower pass points for higher-level jobs based on selection ratios)

23 Broad-based testing: Purpose and Objectives
• Improve testing reliability and validity
• Improve the coherence of clerical examination plans
• Reduce redundant testing
• Increase recruitment efficiency
• Reduce the time to place candidates on eligibility lists

24 Broad-based testing: Recruitment, Method 1
• Recruit for the broad-based test
• Annual calendar of assessment (e.g., quarterly)
• Establishes a pre-tested pool of potential candidates for the relevant classes
• For most job classifications, limited or no public posting for job-specific recruitments
• Invitations to apply for specific jobs are sent to the pre-tested pool based on the test cut-off score

25 Broad-based testing: Recruitment, Method 2
• Recruit for a classification or class job family
• Build a pre-tested pool from each recruitment process
• Candidates apply for each recruitment process
• Allow non-tested candidates to participate in each process and its exams
• Pre-tested candidates are informed of their status based on test cut-off scores

26 Broad-based testing, Clerical & Secretarial Classifications: Competencies Assessed – Tests
1. Sequencing and ordering speed and accuracy (timed)
2. Checking and comparing speed and accuracy (timed)
3. Computational speed and accuracy (timed)
4. Following instructions (multiple choice)
5. English usage and grammar (multiple choice)
6. Data entry speed and accuracy (timed, OPAC)
7. Microsoft Word skills (performance, OPAC)

27
1. Clerk
2. Intermediate Clerk
3. Senior Clerk
4. School Clerk
5. Senior School Clerk
6. Data Control Clerk
7. Senior Data Control Clerk
8. Typist Clerk
9. Intermediate Typist Clerk
10. Temporary Office Worker
11. Senior Typist Clerk
12. Department Assistant, Dance
13. Department Assistant, Music
14. Department Assistant, Theater
15. Department Assistant, Visual Arts
16. Receptionist
17. Reader
18. Information Resources Specialist
19. Media Dispatching Clerk
20. Secretary
21. Division Secretary
22. Legal Secretary
23. School Administrative Secretary
24. Senior Division Secretary
25. Executive Legal Secretary
26. Executive Assistant

28
• ADMINISTRATIVE ANALYST
• ADMINISTRATIVE ASSISTANT
• ADMINISTRATIVE AIDE
• ASSISTANT ADMINISTRATIVE ANALYST
• RESOURCE & DEVELOPMENT ANALYST
• PROJECT COORDINATOR
• COMPENSATION ANALYST
• RESEARCH ANALYST
• LEGISLATIVE ANALYST
• BUDGET ANALYST
• HEAD START PROGRAM RESULTS SPECIALIST
• HUMAN RESOURCES AIDE
• HUMAN RESOURCES ANALYST
• HS PROGRAM DEVELOPMENT SPECIALIST
• ASSISTANT HUMAN RESOURCES ANALYST
• LABOR RELATIONS SPECIALIST

29
• CUSTODIAN
• SENIOR CUSTODIAN
• MAINTENANCE WORKER
• SENIOR MAINTENANCE WORKER
• DELIVERY DRIVER
• REPROGRAPHICS WORKER
• UTILITY WORKER

30 The Take-Back Message

31 Job-Related Strategies
• Self-Selection (Pre-Application Stage)
▪ Focused recruitment
▪ Clear job descriptions and requirements
▪ Realistic job previews
• Efficient First-Stage Screening (Pre-Invite Stage)
▪ Automated objective written tests
▪ Un-proctored Internet testing
▪ Auto-scored supplemental questionnaires
• Broad-Based Testing (Efficient Testing Administration)
▪ One test for multiple classifications
▪ Bank candidate scores
▪ Improve reliability, validity, and efficiency

32 What are you doing?

