1 PSYCHOMETRIC TESTING WITHIN THE SANDF PRESENTATION TO THE PORTFOLIO COMMITTEE ON DEFENCE 19 JUNE 2007

2 SCOPE Statutory Control of the use of Psychological Assessment Measures within the SANDF. The Psychometric Tests used within the SANDF. The Role and Function of Assessment Centres within the SANDF. The Application of Specialist Psychological Measures in the Recruitment and Selection of Pilots within the SANDF.

3 STATUTORY CONTROL OF THE USE OF PSYCHOLOGICAL MEASURES WITHIN THE SANDF

4 SCOPE (1) Terminology Characteristics of Assessment Measures
The Need for Control of Assessment Measures within the SANDF Control of Psychological Assessment Measures within the SANDF Psychological Assessment within the Democratic South Africa Fair and Ethical Practices in the Use of Psychological Measures within the SANDF

5 SCOPE (2) Factors Affecting Psychological Assessment Results
Professional Practices that Assessment Practitioners within the SANDF should follow Basic Statistical Concepts: Reliability, Validity and Norms Challenges faced by Psychological Assessment Practitioners within the SANDF

6 TERMINOLOGY (1) Important Terms
Confusing and overlapping terms are used in the field of psychological assessment. Understand the more important terms and how they are interlinked. Tools are available to make it possible for us to assess (measure) human behaviour. Various names are used to refer to these tools such as tests, measures, assessment measures, instruments, scales, procedures, and techniques.

7 TERMINOLOGY (2) Psychometrics
To ensure that psychological measurement is valid and reliable, a body of theory and research regarding the scientific measurement principles that are applied to the measurement of psychological characteristics has evolved over time. Psychometrics Refers to the systematic and scientific way in which psychological measures are developed and the technical measurement standards (e.g. validity and reliability) required of measures.

8 TERMINOLOGY (3) Psychological Assessment Testing
A process-orientated activity aimed at gathering a wide array of information by using assessment measures (tests) and information from many other sources (e.g. interviews, a person’s history, collateral sources). Evaluate and integrate all information to reach a conclusion or make a decision. Testing The use of tests, measures, etc. which involves the measurement of behaviour, is one of the key elements of the much broader evaluative process known as psychological assessment.

9 TERMINOLOGY (4) Assessment Measure
In the SANDF preference is given to the term assessment measure as it has a broader connotation than the term test, which mainly refers to an objective, standardised measure that is used to gather data for a specific purpose (e.g. to determine what a person’s intellectual capacity is).

10 CHARACTERISTICS OF ASSESSMENT MEASURES (1)
Different Procedures Assessment measures include many different procedures that can be used in psychological assessment and can be administered to individuals, groups and organisations. Domains of Functioning Specific domains of functioning (e.g. intellectual ability, personality, organisational climate) are sampled by assessment measures. Standardised Conditions Assessment measures are administered under carefully controlled (standardised) conditions.

11 CHARACTERISTICS OF ASSESSMENT MEASURES (2)
Systematic Methods Systematic methods are applied to score or evaluate assessment protocols. Guidelines Guidelines are available to understand and interpret the results of an assessment measure. Such guidelines may make provision for the comparison of an individual’s performance to that of an appropriate norm group or to a criterion (e.g. competency profile for a job).

12 CHARACTERISTICS OF ASSESSMENT MEASURES (3)
Evidence Based Assessment measures should be supported by evidence that they are valid and reliable for the intended purpose. The evidence is usually provided in the form of a technical test manual. Context Assessment measures are usually developed in a certain context (society or culture) for a specific purpose and the normative information used to interpret test performance is limited to the characteristics of the normative sample.

13 CHARACTERISTICS OF ASSESSMENT MEASURES (4)
Test Bias The appropriateness of an assessment measure for an individual, group, or organisation from another context, culture, or society cannot be assumed without an investigation into possible test bias (i.e. whether a measure is differentially valid for different subgroups) and without strong consideration being given to adapting and re-norming the measure. Multidimensional The assessment process is multidimensional in nature. It entails the gathering and synthesising of information as a means of describing and understanding functioning. This can inform appropriate decision-making and intervention.
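To illustrate what an investigation into differential validity can involve, the sketch below computes the test-criterion correlation separately for two subgroups and compares the coefficients. All scores, criterion ratings and group labels are hypothetical; this is not the SANDF's procedure.

# Illustrative sketch of a differential-validity check: the test-criterion
# correlation is computed separately per subgroup and the coefficients are
# compared. A marked difference would prompt a formal bias investigation.
from statistics import correlation  # Python 3.10+

data = {  # hypothetical test scores and job-performance criterion ratings
    "group_a": {"test": [12, 18, 25, 30, 22, 15],
                "criterion": [2.1, 2.9, 3.8, 4.2, 3.5, 2.4]},
    "group_b": {"test": [14, 20, 26, 31, 19, 16],
                "criterion": [2.0, 3.4, 3.6, 4.5, 2.8, 2.2]},
}

for group, scores in data.items():
    r = correlation(scores["test"], scores["criterion"])
    print(f"{group}: validity coefficient r = {r:.2f}")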

14 CHARACTERISTICS OF ASSESSMENT MEASURES (5)
Limits of Human Wisdom Recognise the limits of human wisdom when reaching opinions based on assessment information.

15 THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (1)
Sensitive Item Content In view of the potentially sensitive nature of some of the item content and the feedback, and given that assessment measures can be misused, the use of assessment measures needs to be controlled so that the public can be protected. Trained Professionals Controlling the use of psychological measures by restricting them to appropriately trained professionals.

16 THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (2)
Practitioner Competency Ensuring that measures are administered by a qualified, competent assessment practitioner and that assessment results are correctly interpreted and used. Conveying the Results The outcome of the assessment is conveyed in a sensitive, empowering manner rather than in a harmful way. Psychometry Procurement The purchasing of psychological assessment measures is restricted to those who may use them, and test materials are kept securely (as it is unethical for assessment practitioners to leave tests lying around) – this will prevent unqualified people from gaining access to and using them.

17 THE NEED FOR CONTROL OF ASSESSMENT MEASURES WITHIN THE SANDF (3)
Release of Assessment Materials Test developers do not prematurely release assessment materials (e.g. before validity and reliability have been adequately established), as it is unethical for assessment practitioners to use measures for which appropriate validity and reliability data have not been established. Public Familiarity The general public does not become familiar with the test content, as this would invalidate the measure.

18 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (1)
Statutory Control in RSA In South Africa the use of psychological assessment measures is under statutory control. A law (statute) has been promulgated that restricts the use of psychological assessment measures to appropriately registered psychology professionals. Health Professions Act Act 56 of 1974 defines acts “specially pertaining to the profession of a psychologist”.

19 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (2)
Diagnosis The evaluation of behaviour or mental processes or personality adjustments or adjustments of individuals or groups of persons, through the interpretation of tests for the determination of intellectual abilities, aptitude, interests, personality make-up or personality functioning, and the diagnosis of personality and emotional functions and mental functioning deficiencies according to a recognised scientific system for the classification of mental deficiencies.

20 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (3)
Method and Practice The use of any method or practice aimed at aiding persons or groups of persons in the adjustment of personality, emotional or behavioural problems or at the promotion of positive personality change, growth and development, and the identification and the evaluation of personality dynamics and personality functioning according to psychological scientific methods.

21 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (4)
Evaluation The evaluation of emotional, behavioural and cognitive processes or adjustment of personality of individuals or groups of persons by the usage and interpretation of questionnaires, tests, projections or other techniques or any apparatus, whether of South African origin or imported, for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology.

22 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (5)
Exercising of Control The exercising of control over prescribed questionnaires or tests or prescribed techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology. Development The development of and control over the development of questionnaires, tests, techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psychophysiological functioning or psychopathology.

23 CONTROL OF PSYCHOLOGICAL ASSESSMENT MEASURES WITHIN THE SANDF (6)
Domain of Psychology According to Act 56 of 1974, the use of measures to assess mental, cognitive, or behavioural processes and functioning, intellectual or cognitive ability or functioning, aptitude, interest, emotions, personality, psychophysiological functioning or psychopathology (abnormal behaviour), constitutes an act that falls within the domain of the psychology profession.

24 PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (1)
Post 1994 Since 1994 and the election of South Africa’s first democratic government, the application, control, and development of assessment measures have become contested terrain. Constitution and Labour Relations Act With the adoption of the new Constitution and the Labour Relations Act in 1996, worker unions and individuals now have the support of legislation that specifically forbids any discriminatory practices in the workplace and includes protection for applicants as they have all the rights of current employees in this regard.

25 PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (2)
Employment Equity Act To ensure that discrimination is addressed within the testing arena, the Employment Equity Act No. 55 of 1998 (section 8) refers to psychological tests and assessments specifically and states that: Psychological testing and other similar forms of assessment of an employee are prohibited unless the test or assessment being used: has been scientifically shown to be valid and reliable; can be applied fairly to all employees; is not biased against any employee or group.

26 PSYCHOLOGICAL ASSESSMENT WITHIN THE DEMOCRATIC SOUTH AFRICA (3)
Impact of Employment Equity Act The impact of this Act on the conceptualisation and professional practice of assessment in South Africa in general is far-reaching as assessment practitioners and test publishers are increasingly being called upon to demonstrate, or prove in court, that a particular assessment measure does not discriminate against certain groups of people. Despite the fact that the Employment Equity Act is not binding on Defence Act Personnel, Directorate Psychology is still obliged to ensure that its practices are fair and equitable.

27 FAIR AND ETHICAL PRACTICES IN THE USE OF PSYCHOLOGICAL MEASURES WITHIN THE SANDF
International Guidelines on Test Use (Version 2000) Fair Assessment Practices The appropriate, fair, professional, and ethical use of assessment measures and assessment results. Taking into account the needs and rights of those involved in the assessment process. Ensuring that the assessment conducted closely matches the purpose to which the assessment results will be put. Taking into account the broader social, cultural, and political context in which assessment is used and the ways in which such factors might affect assessment results, their interpretation, and the use to which they are put.

28 FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (1)
Viewing Assessment Results in Context. A test score is only one piece of information about how a person performs or behaves. Therefore, if we look at an individual in terms of a test score only, we will have a very limited understanding of that person. A test score can never be interpreted without taking note of and understanding the context in which the score was obtained. In addition to the test score, the information in which we are interested can be obtained by examining the context in which a person lives. When you think about it, you will realise that people actually function in several different contexts concurrently.

29 FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (2)
At the lowest level there is the biological context, referring to physical bodily structures and functions, which are the substrata for human behaviour and experiences. Then there is the intrapsychic context which comprises abilities, emotions, and personal dispositions. Biological and intrapsychic processes are regarded as interdependent components of the individual as a psychobiological entity. In addition, because people do not live in a vacuum, we need to consider a third and very important context which is the social context.

30 FACTORS AFFECTING PSYCHOLOGICAL ASSESSMENT RESULTS (3)
The social context refers to aspects of the environment in which we live such as our homes and communities, people with whom we interact, work experiences, as well as cultural and socio-political considerations. Methodological Considerations In addition to looking at the effects of the different contexts within which people function, we also need to examine methodological considerations such as test administration, which may also influence test performance and therefore have a bearing on the interpretation of a test score.

31 Professional Practices that Assessment Practitioners within the SANDF should follow (1)
Rights of Test-takers Informing test-takers about their rights and the use to which the assessment information will be put. Informed Consent Obtaining the consent of test-takers to assess them, to use the results for selection, placement, or training decisions and, if needs be, to report the results to relevant third parties. Treatment Treating test-takers courteously, respectfully, and in an impartial manner, regardless of culture, language, gender, age, disability, and so on.

32 Professional Practices that Assessment Practitioners within the SANDF should follow (2)
Preparation Being thoroughly prepared for the assessment session. Confidentiality Maintaining confidentiality to the extent that it is appropriate for fair assessment practices. Language Establishing what language would be appropriate and fair to use during the assessment and making use of bilingual assessment where appropriate. Training Only using measures that they have been trained to use.

33 Professional Practices that Assessment Practitioners within the SANDF should follow (3)
Administration Administering measures properly. Scoring Scoring the measures correctly and using appropriate norms or cutpoints or comparative profiles. Background Information Taking background factors into account when interpreting test performance and when forming an overall picture of the test-taker’s performance (profile).

34 Professional Practices that Assessment Practitioners within the SANDF should follow (4)
Communication Communicating the assessment results clearly to appropriate parties. Subjectivity Acknowledging the subjective nature of the assessment process by realising that the final decision that they reach, while based at times on quantitative test information, reflects their “best guess estimate”. Utilisation of Assessment Information Using assessment information in a fair, unbiased manner and ensuring that anyone else who has access to this information also does so.

35 Professional Practices that Assessment Practitioners within the SANDF should follow (5)
Research Researching the appropriateness of the measures that they use and refining, adapting, or replacing them where necessary. Storage Securely storing and controlling access to assessment materials so that the integrity of the measures cannot be threatened in any way.

36 BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (1)
Psychological assessment measures often produce data in the form of numbers. We need to be able to make sense of these numbers. Basic statistical concepts can help us here, as well as when it comes to establishing and interpreting norm scores. Statistical concepts and techniques can also help us to understand and establish basic psychometric properties of measures such as validity and reliability.

37 BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (2)
Reliability This refers to the degree to which a psychometric test consistently produces the same results when completed by the same candidates. Validity This refers to the degree to which the psychometric test measures what it claims to measure.
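As a worked illustration of the reliability concept (not taken from the presentation), test-retest reliability is commonly estimated as the correlation between two administrations of the same test to the same candidates. The scores below are hypothetical.

# Illustrative sketch: test-retest reliability estimated as the correlation
# between two administrations of the same test to the same candidates.
from statistics import correlation  # Python 3.10+

first_administration = [24, 31, 18, 27, 35, 22, 29]   # hypothetical raw scores
second_administration = [26, 30, 20, 25, 36, 21, 28]  # same candidates, later date

reliability = correlation(first_administration, second_administration)
print(f"Estimated test-retest reliability: r = {reliability:.2f}")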

38 BASIC STATISTICAL CONCEPTS: RELIABILITY, VALIDITY AND NORMS (3)
Norms refer to the records of performance by other candidates who have previously been assessed using the same test. A candidate must be measured against norms taken from the context and population group to which that candidate belongs, i.e. candidates are measured against other South African candidates who have previously undergone assessment on a specific test. As a database of results is built, SANDF-specific norms are developed and used. The current SANDF database consists primarily of Black candidates.

39 NORMAL DISTRIBUTION: RAW SCORES TRANSLATED INTO NORMED SCORE

40 CLASSIFICATION OF STANINE SCALE
STANINE   DESCRIPTION
1         Very Poor
2-3       Poor
4-6       Average
7-8       Good
9         Very Good
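The sketch below illustrates how a raw score can be translated into a normed stanine score, as depicted on the preceding normal-distribution slide. It assumes the conventional stanine cut points (z = ±0.25, ±0.75, ±1.25, ±1.75) and a hypothetical norm group; in practice the SANDF norm tables would be used.

# Illustrative sketch: translating a raw score into a stanine via a z-score,
# assuming the conventional stanine cut points and a hypothetical norm group.
from statistics import mean, stdev

norm_group_scores = [18, 22, 25, 27, 29, 30, 31, 33, 35, 38, 40, 44]  # hypothetical
mu, sigma = mean(norm_group_scores), stdev(norm_group_scores)

STANINE_CUTS = [-1.75, -1.25, -0.75, -0.25, 0.25, 0.75, 1.25, 1.75]
DESCRIPTIONS = {1: "Very Poor", 2: "Poor", 3: "Poor", 4: "Average", 5: "Average",
                6: "Average", 7: "Good", 8: "Good", 9: "Very Good"}

def stanine(raw_score):
    z = (raw_score - mu) / sigma
    return 1 + sum(z > cut for cut in STANINE_CUTS)  # number of cut points exceeded

raw = 36
s = stanine(raw)
print(f"Raw score {raw} -> stanine {s} ({DESCRIPTIONS[s]})")  # stanine 6 (Average) with these norms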

41 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (1)
Influence of Multiculturalism In the latter part of the twentieth century and at the start of the twenty-first century, multiculturalism has become the norm in many countries. As a result, attempts were made to develop tests that were “culture-free”. It soon became clear that it was not possible to develop a test that is free of any cultural influences. Consequently, test developers focused more on “culture-reduced” or “culture-common” tests in which the aim was to remove as much cultural bias as possible from the test by including only behaviour that was common across cultures.

42 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (2)
For example, a number of non-verbal intelligence tests were developed (e.g. Raven Progressive Matrices) where the focus was on novel problem-solving tasks and in which language use, which is often a stumbling block in cross-cultural tests, was minimised. In an attempt to address issues of fairness and bias in test use, the need arose to develop standards for the professional practice of testing and assessment.

43 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (3)
Representivity of Assessors Legitimate concern is sometimes expressed regarding the representivity of psychologists in the SANDF. This is a challenge that the organisation is currently striving to meet and some degree of progress has already been made. Directorate Psychology conducts targeted recruitment in order to recruit Black psychologists, and regularly engages the Professional Board for Psychology and academic institutions in this regard.

44 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (4)
However, the lack of availability of Black psychologists in South Africa remains a challenge. The Professional Board for Psychology’s official registration statistics reflect that 11% (known disclosures) of South African psychologists are Black.

45 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (5)
Language Language is generally regarded as the most important single moderator of performance on assessment measures. This is because performance on assessment measures could be the product of language difficulties and not ability factors if a measure is administered in a language other than the test-taker’s home language. When a test is written in a different language, it may present a range of concepts that are not accessible in the test-taker’s home language.

46 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (6)
Current Dilemma regarding Psychometric Tests Historically the Human Sciences Research Council (HSRC) was mandated to provide cost-effective psychometric tests that had been proven to be valid within the South African population. After the advent of democracy in the Republic of South Africa, the HSRC underwent transformation. The HSRC redefined its role regarding psychometric tests and surrendered the licences for most of these tests to the private sector. This led to the current dilemma where there is a shortage of cost-effective psychometric instruments that are approved for use in the Republic of South Africa.

47 CHALLENGES FACED BY PSYCHOLOGY ASSESSMENT PRACTITIONERS WITHIN THE SANDF (7)
The situation has reached critical proportions within the broader industry sector. Consequently, members of the Professional Board for Psychology have indicated that the HSRC will be requested to provide this essential service to the nation. Due to the scarcity of validated psychometry for the South African context, the South African National Defence Force has been obliged to develop or validate some psychometric tests for use within the organisation. This is done in consultation with the Psychometric Committee of the Professional Board for Psychology.

48 PSYCHOMETRIC TESTS USED IN THE SANDF

49 SCOPE Academic Aptitude Test (AAT). Blox Test.
Differential Aptitude Test (DAT). Raven’s Progressive Matrices (RPM). Potential Index Batteries (PIB). Psychological Risk Inventory (PRI). Vienna Test System (VTS).

50 ACADEMIC APTITUDE TEST (1)
Origin South Africa. Human Sciences Research Council. Representative sample. Different languages: Northern Sotho, Zulu, Southern Sotho, Afrikaans, Tswana, English, Tsonga, Venda, Xhosa, Other.

51 ACADEMIC APTITUDE TEST (1)
Aim To serve as an objective, reliable and valid aid in the guidance of candidates in respect of subject and occupational choice. Provides an indication of a candidate’s: General intellectual ability (intelligence). Verbal ability and the level achieved in the official languages. Mathematical ability. Level of spatial ability.

52 ACADEMIC APTITUDE TEST (2)
Description Consists of 9 tests, with 37 items in the first and 33 items in each of the other tests. All items are of the multiple-choice type. The correct answer, which the candidate chooses from five possibilities, is indicated on a separate answer sheet.

53 ACADEMIC APTITUDE TEST (3)
AAT 1: Non-verbal reasoning Measures the ability to reason inductively. Consists of two parts, viz. Figure series. Pattern completion. Figure series: Four figures are given and the fifth figure in the series must be selected from the given possibilities. Pattern completion: A total picture of the matrix must be formed, a rule deduced and the matrix completed accordingly. The candidate is consequently expected to deduce and apply a general principle. The test should, in conjunction with the verbal score, provide a good indication of general intellectual ability.

54 ACADEMIC APTITUDE TEST (4)
AAT 2: Verbal reasoning Candidates are required to grasp verbal concepts and their relationships. Inductive as well as deductive reasoning is required. Items include analogies, letter codes and logical deductions.

55 ACADEMIC APTITUDE TEST (5)
AAT 5: Number comprehension Ability to manipulate and apply fundamental principles and operations. Items include, inter alia, percentages, fractions, exponents and basic sets. Reliability Degree of accuracy and consistency. Reliability coefficients: Vary from 0.69 to 0.90 for the individual tests. Reliability in the SANDF: 0.74

56 ACADEMIC APTITUDE TEST: NORMS (6)
STANINE   AAT 1 NVR   AAT 2 VR   AAT 5
1         0-8                    0-3
2         9-12        9-10       4
3         13-16       11-12      5-6
4         17-20       13-14      7-8
5         21-23       15-17
6         24-26       18-19      11-13
7         27-28       20-22      14-16
8         29-31       23-25      17-19
9         32          26-30      20-26

57 ACADEMIC APTITUDE TEST: NORMS (7)
Implementation Military Skills Development. Youth Foundation Programme. Nursing Study Scheme. Other Study Schemes. Pilot Selection.

58 BLOX TEST (1) ORIGIN South Africa. Human Sciences Research Council.
Previously known as the Perceptual Battery. Own norms determined for the SANDF population.

59 BLOX TEST (2) AIM Measures visual orientation.
Ability to comprehend the nature of arrangements within a visual stimulus pattern, primarily with respect to the candidate’s body or frame of reference. Ability to recognise spatial arrangements from different orientations without the benefit of physical shifts of the body. Recognise the same visual stimulus pattern from different angles. Ability to manipulate (rotate, twist) one or two parts of a visual stimulus pattern in the candidate’s imagination in order to recognise the changed appearance of the object.

60 BLOX TEST (3) Description
Test format: Paper-and-pencil test. Consists of 6 practice items and 45 test items. Non-verbal test. Rationale: Spatial ability consists of spatial relations and orientation; the ability to comprehend the nature of arrangements within a visual stimulus pattern primarily with respect to the examiner’s body or frame of reference.

61 BLOX TEST (4) Item format Time required
Isometric drawings of different combinations of two, three, four, five or six cubes. Each set of cubes must be compared to similar arrangements of cubes viewed from other angles. Each page is divided into two sections by a heavy black line. Above the line are five sets of cubes which are the responses, and below the line are nine sets of cubes which form the stimuli. The candidate must analyse each stimulus set and choose, from the five possible responses, the corresponding set seen from a different angle. Time required The time limit for the test is 30 minutes.

62 BLOX TEST (5) Reliability Reliability in the SANDF: 0.72

63 BLOX TEST (6)
STANINE   NORM
1         0-13
2         14-17
3         18-21
4         22-26
5         27-30
6         31-32
7         33-36
8         37-39
9         40-45
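Once a stanine norm table such as the one above has been established, scoring a candidate reduces to a table lookup. The minimal sketch below (illustrative only) uses the Blox norms above.

# Illustrative sketch: looking up a Blox Test raw score in the stanine norm
# table above (stanine: lowest and highest raw score in the band).
BLOX_NORMS = {
    1: (0, 13), 2: (14, 17), 3: (18, 21), 4: (22, 26), 5: (27, 30),
    6: (31, 32), 7: (33, 36), 8: (37, 39), 9: (40, 45),
}

def blox_stanine(raw_score):
    for band, (low, high) in BLOX_NORMS.items():
        if low <= raw_score <= high:
            return band
    raise ValueError("Raw score outside the 0-45 range of the Blox Test")

print(blox_stanine(34))  # -> 7 ("Good" on the stanine classification)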

64 BLOX TEST (7) Implementation Apprentices. Youth Foundation Training.
Explosive Device Disposal Operator. VIP Protector. Pilot Selection.

65 DIFFERENTIAL APTITUDE TEST (DAT) (1)
ORIGIN South Africa. Human Sciences Research Council. Standardised: Blacks Coloureds Whites Indians

66 DIFFERENTIAL APTITUDE TEST (DAT) (2)
AIM Provide information on candidates who want to undergo tertiary training or gain entry to particular high-level occupations, especially with a view to the provision of counselling, and to placement in and selection for tertiary or other post-school training and specific occupations.

67 DIFFERENTIAL APTITUDE TEST (DAT) (3)
Rationale Aptitude is the potential a candidate has which will enable him/her to achieve a certain level of ability with a given amount of training and/or practice. Aptitude, together with interest, attitude, motivation and other personality characteristics, will to a large extent determine the ultimate success of a candidate. Aptitude, together with other information, predicts possible success in a specific field of study/training programme/occupation should a candidate make a particular choice, or should the employer wish to make a particular appointment.

68 DIFFERENTIAL APTITUDE TEST (DAT) (4)
Description of tests Vocabulary. Verbal reasoning. Non-verbal reasoning. Calculations. Reading comprehension. Comparison. Price controlling. Spatial visualisation. Mechanical insight. Memory.

69 DIFFERENTIAL APTITUDE TEST (DAT) (5)
Test 1: Vocabulary Aim: To measure Verbal Comprehension, which can be defined as knowledge of words and their meaning, as well as the application of this knowledge in spoken and written language. Rationale: The ability of a learner to recognise a word and to choose a synonymous word is regarded as a valid indication of his/her knowledge of the meaning of words and as a valid criterion for the verbal comprehension factor.

70 DIFFERENTIAL APTITUDE TEST (DAT) (6)
Test 2: Verbal Reasoning Aim: To measure an aspect of general reasoning on the basis of verbal material. Rationale: The assumption that the ability to determine relationships, to complete word analogies, to solve general problems requiring logical thought, as well as a candidate’s vocabulary, is a valid indication of an aspect of general reasoning.

71 DIFFERENTIAL APTITUDE TEST (DAT) (7)
Test 3: Non-verbal Reasoning: Figures Aim: To measure an aspect of general reasoning on the basis of non-verbal material. Rationale: Assumption that the ability to see relationships between figures and, by analogy, to identify an appropriate missing figure, as well as, by following the changes that the figures of a figure series undergo, to deduce the working principle and to apply it again, is a valid indication of an aspect of non-verbal reasoning ability.

72 DIFFERENTIAL APTITUDE TEST (DAT) (8)
Test 4: Calculations Aim: To measure arithmetical ability. Rationale: Assumption that the candidate’s ability to do mechanical calculations and to solve arithmetical problems with the help of four basic arithmetic operations, namely adding, subtracting, dividing and multiplying, provides a valid indication of his/her arithmetical ability.

73 DIFFERENTIAL APTITUDE TEST (DAT) (9)
Test 5: Reading Comprehension Aim: To measure the ability to comprehend what the candidate is reading. Rationale: Assumption that the candidate’s ability to choose the right answers to questions on prose passages is a valid indication of reading comprehension.

74 DIFFERENTIAL APTITUDE TEST (DAT) (10)
Test 6: Comparison Aim: To measure visual perceptual speed as a certain aspect of clerical ability, which consists mainly of the quick and accurate perception of differences and similarities between visual configurations. Rationale: Assumption that the ability to quickly and accurately indicate from five symbol groups the one that corresponds precisely with a given symbol group, is a valid indication of visual perceptual speed.

75 DIFFERENTIAL APTITUDE TEST (DAT) (11)
Test 7: Price Controlling Aim: To measure a general speed of clerical ability, namely the ability to look up data quickly and accurately. Rationale: Assumption that the ability to look up the prices of articles in a table quickly and accurately is a valid indication of success in numerous clerical tasks.

76 DIFFERENTIAL APTITUDE TEST (DAT) (12)
Test 8: Spatial Visualisation 3-D Aim: To measure three-dimensional spatial perceptual ability. Rationale: Assumption that the ability to: mentally manipulate a cube whose sides are marked in a certain way and which is presented three-dimensionally, so that the relative position of a certain cube to that of a given cube can be determined; recognise and indicate certain sides of a flat figure that has been folded to make a three-dimensional figure; and visualise what the three-dimensional result will be if a flat figure is rolled up or folded, is a valid criterion of three-dimensional spatial perceptual ability.

77 DIFFERENTIAL APTITUDE TEST (DAT) (13)
Test 9: Mechanical Insight Aim: To measure mechanical ability (insight). Rationale: Assumption that the ability to make correct visual representation of the result of the operation of a mechanical apparatus or a physical principle depicted in a drawing, is a valid criterion for the measurement of mechanical ability.

78 DIFFERENTIAL APTITUDE TEST (DAT) (14)
Test 10: Memory Aim: To measure an aspect of the memory factor by using meaningful material. Rationale: Assumption that the ability to memorise meaningful material summarised in written paragraphs and then to correctly answer questions on the content of the paragraphs, is a valid criterion for measuring an aspect of memory.

79 DIFFERENTIAL APTITUDE TEST (DAT) (15)
Reliability Overall reliability in the SANDF:

80 DIFFERENTIAL APTITUDE TEST NORMS (DAT) (16)
T8 M T8 F T9 M T9 F T10 1 0-8 0-5 0-4 0-9 0-12 0-3 15-25 2 9-10 6-7 5-6 10-12 13-16 4-5 13-14 3 11-2 8-9 7-8 13-15 17-19 12 4 10-11 16-18 20-22 5 15-17 12-13 11-13 11-12 19-21 23-25 12-14 9 6 18-20 14-15 14-16 22-23 26-27 15-16 8 7 21-22 16-17 24-25 28 17-18 18-19 20-21 29 19-20 26-30 20-25 22-25 28-30 30 21-25

81 DIFFERENTIAL APTITUDE TEST (DAT) (17)
Implementation Army Musterings. Navy Musterings. Explosive Device Disposal Operator.

82 RAVENS PROGRESSIVE MATRICES (1)
ORIGIN United Kingdom. International application. Minimise cultural influences. Non-verbal test.

83 RAVENS PROGRESSIVE MATRICES (2)
AIM To measure the candidate’s capacity to apprehend meaningless figures presented for his/her observation, see the relations between them, conceive the nature of the figure completing each system of relations presented, and by so doing, develop a systematic method of reasoning. Suitable for comparing candidates with respect to their immediate capacities for observation and clear thinking.

84 RAVENS PROGRESSIVE MATRICES (3)
DESCRIPTION Consists of 60 problems, which are divided into 5 sets of 12 each. In each set the first problem is as nearly as possible self-evident. The problems which follow become progressively more difficult. The order of the items provides standard training in the method of working. The five sets provide five opportunities for grasping the method and five progressive assessments of a candidate’s capacity for intellectual activity.

85 RAVENS PROGRESSIVE MATRICES (4)
DESCRIPTION (cont) Test is developed to evaluate the full spectrum of a candidate’s intellectual development. Test can be applied to any age group. Scale is intended to cover the whole range of intellectual development from the time a child is able to grasp the idea of finding a missing piece to complete a pattern to the stage of intellectual maturity, through a process of comparison and reasoning. The scores for adults tend to be above average, but the scale provides sufficient discriminating value. Where more differentiation is needed, the Advanced Raven’s must be used.

86 RAVENS PROGRESSIVE MATRICES (5)
IMPLEMENTATION Test is included in some selection batteries and provides a basis for evaluation of general abilities. No time limits for administration of the test. Time taken to complete the test must be indicated. Reliability: Reliability in the SANDF: 0.80

87 RAVENS PROGRESSIVE MATRICES NORMS (6)
STANINE   RAVENS (RPM)
1         0-13
2         14-26
3         27-37
4         38-43
5         44-47
6         48-51
7         52-54
8         55-56
9         57-60

88 RAVENS PROGRESSIVE MATRICES (7)
IMPLEMENTATION Special Forces Selection. Apprentices. Explosive Device Disposal Operator. VIP Protector.

89 POTENTIAL INDEX BATTERIES (1)
Potential Index Batteries (PIB) Job Profiling Expert. Comprehensive Structured Interviewing for Potential. Situation Specific Evaluation Expert. Performance Appraisal Scoring Scale.

90 POTENTIAL INDEX BATTERIES (2)
ORIGIN South Africa. Based on ongoing research that dates back to 1964. Applied research done by reputable, independent institutions. Situation-specific norms and state-of-the-art, computerised standardisation procedure. Generic standardisation done on a population of approximately respondents.

91 POTENTIAL INDEX BATTERIES (3)
PROCESS Job profiling. Determining job-related competencies. Determining NQF level and job grade. Job description. Critical crossfield education and training outcomes. Comprehensive structured interviewing for potential. Job profiling expert (basic competencies). Performance appraisal scoring scale. Ongoing feedback on workers’ performance. Ongoing identification of training and development needs. Career pathing.

92 POTENTIAL INDEX BATTERIES:CSIP COMPETENCIES (4)
General knowledge Competing Driver Innovation Creativity Collaborating Analyst Feedback Reading comprehension Compromising Risk-taking Presentation Calculations Avoidance Integrity Negotiation Mental alertness Accommodating Empathy Liaison Listening skills Time management Emotional sensitivity Analytical skills Abstract reasoning Stress management Tact Judgement Adaptability/Flexibility Short-term memory People development Organisational alertness Interpersonal relations Type A/B personality Coaching Nonverbal perception Self-image Assertiveness Interpersonal Objectivity Personal development Clerical skills Spelling Social insight Written communication Vocabulary Problem-solving Diversity facilitation Potential to assemble Typing skills Coping skills Leadership Potential to classify Filing ability Expressive Excellence orientation Environmental pressure Comprehension Spatial reasoning Hand-eye coordination Self-motivation Supporter Customer orientation Frustration tolerance

93 POTENTIAL INDEX BATTERIES: JP EXPERT COMPETENCIES (5)
Conceptualisation Reading comprehension Listening potential Self-actualisation Memory Demonstrative Diversity facilitation Visioning Basic calculations Samaritan Excellence orientation Effort focusing Advanced calculations Evaluating Customer orientation Transparency Observance Persevering Innovation Empowerment Assembling (Basic) Risk-taking Feedback Big picture Assembling (Advanced) Conformity Presentation Goal setting Clerical Non-conformity Negotiation Motivation Comparison Empathy Liaison Decisiveness Perception Emotional sensitivity Analytical thinking Strategy application Environmental exposure Tact Judgement Action planning Insight People development Organisational alertness Organising Self-acceptance Mental stress Nonverbal perception Basic linguistic proficiency Socialising Interpersonal objectivity Personal development Advanced linguistic proficiency Adaptability Physical stress Written communication Hand-eye coordination

94 POTENTIAL INDEX BATTERIES: JP EXPERT (6)
IMPLEMENTATION Post Profiling for Specific Musterings. Explosive Device Disposal Operator. VIP Protector.

95 PSYCHOLOGICAL RISK INVENTORY (1)
ORIGIN South Africa. Developed by SANDF Psychologists.

96 PSYCHOLOGICAL RISK INVENTORY (2)
AIM To scan for self-reported symptoms of psychopathology. To determine the need for an interview. To recommend the candidate for deployment or not. Utilised for concurrent health assessment processes. To confirm the mental health status in adhering to set standards for deployment.

97 PSYCHOLOGICAL RISK INVENTORY (3)
DESCRIPTION Consists of 92 multiple-choice items. Each item consists of a short statement with three possible answers. Screening is focused on the identification of psychopathology. Psychological fitness The concurrent health assessment defines psychological fitness as the absence of diagnosable psychopathology.

98 PSYCHOLOGICAL RISK INVENTORY (4)
Psychopathology SANDF mental health standards are based on the Diagnostic and Statistical Manual of Mental Disorders (Revised) (DSM-IV-R). The United Nations (UN) indicates that members should not deploy if they have a history of substance dependence, situational maladjustment, anxiety disorder or are on chronic medication.

99 PSYCHOLOGICAL RISK INVENTORY (5)
CATEGORIES OF SCALES C Coping scales: Less serious pathology scales, consisting of - C1-Stress indicator: The experience of pressure from the environment, including work pressure, work environment pressure, financial pressure, family problems and interpersonal pressure. C2-Coping indicator: Reflects the subjective experience of negative emotions indicating that the candidate is not coping well emotionally.

100 PSYCHOLOGICAL RISK INVENTORY (6)
C3-Ego strength: Provides an indication of the candidate’s stress tolerance and inner resources to deal with daily challenges. D-Pathology scales: More serious pathology scales. D1-Mood disorder: Indicates symptoms of depression. D2-Anxiety: Indicates symptoms of anxiety. D3-Psychotic features: Indicates thought process and content disorder and other symptoms related to psychosis.

101 PSYCHOLOGICAL RISK INVENTORY (7)
D4-Somatic disorder: Indicates pre-occupation with symptoms of a physical nature. P-Interpersonal scale: P1-Interpersonal conflict: Indicates symptoms of interpersonal conflict, lack of interpersonal trust and unstable relationships. R-Psychological risk scales: Indicate specific risks that must be noted w.r.t. deployment. R1-Control risk: Indicates tendencies to be unstable or impulsive and not well controlled by self and authority. R2-Suicide risk: Indicates suicidal ideation and negativity about life in general.

102 PSYCHOLOGICAL RISK INVENTORY (8)
R3-PTSD risk: Indicates that the candidate has been exposed to one or more traumatic event(s) which has not been resolved. R4-Substance abuse risk: Indicates self-reported excessive drinking over a recent period of time. R5-Aggression risk: Indicates tendencies to express aggressive behaviour due to frustration or interpersonal conflict.

103 PSYCHOLOGICAL RISK INVENTORY (9)
SCALE                                      RELIABILITY
COPING SCALES
C1: STRESS                                 0.82
C2: COPING INDICATOR                       0.86
C3: EGO STRENGTH                           0.50
DISORDERS
D1: MOOD DISORDER                          0.83
D2: ANXIETY DISORDER                       0.80
D3: PSYCHOTIC FEATURES                     0.66
D4: SOMATIC DISORDER                       0.67
INTERPERSONAL FUNCTIONING
P1: INTERPERSONAL CONFLICT                 0.77
RISK INDICATORS
R1: CONTROL RISK                           0.73
R2: SUICIDE RISK                           0.52
R3: POST TRAUMATIC STRESS DISORDER RISK
R4: SUBSTANCE ABUSE RISK                   0.72
R5: AGGRESSION RISK                        0.70
OVERALL RELIABILITY INDEX                  0.94
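The presentation does not state which reliability coefficient the figures above represent. Assuming they are internal-consistency estimates, a coefficient such as Cronbach's alpha can be computed from item-level responses, as in the hypothetical sketch below.

# Illustrative sketch: Cronbach's alpha as one common internal-consistency
# coefficient, computed from hypothetical item-level responses to one scale.
from statistics import pvariance

def cronbach_alpha(item_responses):
    # item_responses: one list of scores per item, all for the same candidates
    k = len(item_responses)
    item_variances = sum(pvariance(item) for item in item_responses)
    totals = [sum(scores) for scores in zip(*item_responses)]
    return (k / (k - 1)) * (1 - item_variances / pvariance(totals))

items = [[0, 1, 2, 2, 1],   # hypothetical responses (0-2) of five candidates
         [1, 1, 2, 2, 0],   # to three items of a single PRI-style scale
         [0, 2, 2, 1, 1]]
print(f"alpha = {cronbach_alpha(items):.2f}")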

104 PSYCHOLOGICAL RISK INVENTORY (10)
IMPLEMENTATION Pre-deployment mental health assessments.

105 CHALLENGES (1) Development of a SANDF Competency Assessment Test
Explore the feasibility and the requirements of instituting a competency assessment test for enlisted soldiers. Develop and sustain a competency assessment program for evaluating soldiers’ technical and tactical proficiency in the military occupational specialty and leadership skills for their rank. Include situational judgment test items.

106 CHALLENGES (2) Sample item for assessing performance dimensions (Benchmarking): Problem Solving and Decision Making Skills. Motivating, Leading and Supporting Subordinates. Directing, Monitoring and Supervising Work. Training Others. Relating to and Supporting Peers. Team Leadership. Concern for Soldier Quality of Life. Cultural Tolerance. Computer-based testing.
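To make the benchmarking idea concrete, the sketch below shows how a single situational judgement test item could be stored and scored against a key of subject-matter-expert effectiveness ratings. The item text, options and ratings are invented for illustration and are not drawn from the presentation.

# Illustrative sketch: one situational judgement test item scored against a
# benchmark key of subject-matter-expert effectiveness ratings (all invented).
SJT_ITEM = {
    "dimension": "Problem Solving and Decision Making Skills",
    "stem": "A vehicle breaks down during a convoy escort. What do you do first?",
    "options": {"A": "Halt the convoy and secure the area",
                "B": "Continue and report at the destination",
                "C": "Attempt repairs immediately",
                "D": "Request new orders before acting"},
    "key": {"A": 3, "B": 0, "C": 1, "D": 2},  # hypothetical SME ratings, 0-3
}

def score_response(item, chosen_option):
    return item["key"][chosen_option]

print(score_response(SJT_ITEM, "A"))  # -> 3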

107 ROLE AND FUNCTION OF ASSESSMENT CENTRES IN THE SANDF

108 SCOPE ASSESSMENT CENTRE DEFINED HISTORY AND BACKGROUND
LEGAL AND STATUTORY REQUIREMENTS METHODOLOGY VALIDITY AND RELIABILITY DIVERSE APPLICATIONS APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN MILITARY COUNCIL DECISIONS: ROLE OF SAMHS

109 ASSESSMENT CENTRE DEFINED
An Assessment Centre consists of a standardised and validated evaluation of behaviour and competencies based on multiple inputs. Multiple trained observers and techniques are used. Judgments about behaviour and competencies are made from specifically developed assessment simulations. Judgments are pooled in a meeting among assessors or by a statistical integration process. Integration discussions result in evaluations of the performance of the assessees on the competencies and/or dimensions or other variables which the assessment centre is designed to measure.
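Where judgments are pooled by a statistical integration process, the mechanics can be as simple as averaging assessor ratings per competency across exercises. The sketch below is illustrative only; the assessor names, exercises, competencies and 1-5 rating scale are assumptions, not SANDF practice.

# Illustrative sketch: pooling multiple assessors' competency ratings across
# exercises by simple averaging (hypothetical names, exercises and 1-5 ratings).
from statistics import mean

ratings = {  # (assessor, exercise) -> {competency: rating}
    ("Assessor 1", "In-basket"): {"Planning": 4, "Judgement": 3},
    ("Assessor 2", "In-basket"): {"Planning": 3, "Judgement": 3},
    ("Assessor 1", "Group discussion"): {"Planning": 4, "Judgement": 4},
    ("Assessor 3", "Group discussion"): {"Planning": 5, "Judgement": 4},
}

for competency in ("Planning", "Judgement"):
    pooled = mean(r[competency] for r in ratings.values())
    print(f"{competency}: pooled rating {pooled:.2f}")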

110 HISTORY AND BACKGROUND (1)
First conceptualised on a large scale by the German High Command in World War I to select officers with exceptional command or military abilities. British Army War Office Selection Board (WOSB) developed a similar process. During World War II it was used by the Office of Strategic Services (OSS) to select spies. OSS 3 ½ day assessment centre involved an intensive evaluation: Sentence completion test, health questionnaire, work conditions survey, vocabulary test, personal history evaluation, a projective questionnaire and various simulations.

111 HISTORY AND BACKGROUND (2)
In the early 1950s, the American Telephone and Telegraph Company adapted the OSS concept to the selection and identification of management personnel. By the late 1960s, a number of major corporations were using the AC for selecting managers. In the early 1970s, law enforcement agencies began experimenting with ACs, with the fire service following shortly thereafter. ACs are now used internationally.

112 LEGAL AND STATUTORY REQUIREMENTS
White Paper on Public Service Training and Education Government Gazette Employment Equity Act Health Professions Act Military Council Decision

113 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (1)
All public service institutions will be required to conduct job evaluations or re-evaluations of all posts, with the purpose of ensuring that they are expressed in terms of the essential competencies required for effective job performance. This will involve both functional or sector-specific competencies and core transversal competencies.

114 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (2)
In the case of transversal competencies, the definition of competence will encompass a broad range of skills, knowledge and attitudes, including: The ability to carry out effectively the routine tasks of the job. The ability to transfer skills, knowledge and attitudes to new situations within the same occupational area. The ability to reflect on one’s work, learn from one’s actions, and innovate and cope with non-routine activities. The personal effectiveness to deal effectively with co-workers, managers and customers.

115 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (3)
The introduction of a competency-based approach will assist the development of an outcomes-led model of training and education in a number of important ways. This will include forming an effective and measurable basis: For the objective evaluation of current performance, and the effective assessment of current and future needs. For the design and delivery of training programmes and courses, as well as other staff development interventions, targeted at the achievement of specific and meaningful competencies.

116 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (4)
- For the standardisation and accreditation of such programmes and courses through the NQF framework. - For the subsequent evaluation of the effectiveness of such programmes and courses. The introduction of a competency-based approach will also form the basis for improvements in the current systems of performance appraisal, recruitment and selection, and promotion.

117 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (5)
COMPETENCIES: Knowledge Skills Abilities Attributes That employees develop through formal, informal and on-the-job training, continuing education, details and other employee development opportunities.

118 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (6)
LINK: COMPETENCIES – JOB PERFORMANCE. Competencies (knowledge, skills, abilities, attributes), combined with experience, are expressed as observable behaviour, which underpins job performance.

119 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (7)
COMPETENCIES: SANDF PERFORMANCE APPRAISAL Visioning Conceptualisation Insight Judgement Analytical Thinking Strategic Planning Leadership Evaluating

120 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (8)
COMPETENCIES: SAMHS Problem Solving Planning and Organisation Delegation Control Sensitivity Negotiation Leadership Assertiveness Communication

121 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (8)
COMPETENCIES: SA NAVY Communication Reading Writing Oral Non-verbal Formal Research

122 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (9)
COMPETENCIES: SA NAVY (cont) Management Planning Effective Thinking Quantitative Problem Solving Qualitative Problem Solving Directing Organising Control

123 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (10)
COMPETENCIES: SA ARMY (SHL) Planning Reviewing/Evaluating Deciding Implementing/Coordinating Interpreting Controlling/Directing Motivating Supervising/Directing Investigating/Observing/Searching Informing/Discussing/Interviewing Problem Solving/Designing Assessing/Evaluating

124 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (11)
COMPETENCIES: JSCSP Cognitive Problem Solving & Analysis Planning & Organising Leadership/Coordinating Decisive/Action Orientated Affective Integrity Persuasiveness Self-confidence Personal Motivation Resilience Flexibility Interpersonal Sensitivity

125 WHITE PAPER ON PUBLIC SERVICE TRAINING AND EDUCATION (12)
COMPETENCIES: SENIOR MANAGEMENT SYSTEM (SMS) Strategic Capability & Leadership Programme & Project Management Financial Management Change Management Knowledge Management Service Delivery & Innovation Problem Solving & Analysis People Management & Empowerment Client Orientation & Customer Focus Communication Honesty & Integrity

126 GOVERNMENT GAZETTE (1) Establishment of category of fitness:
The Surgeon General or a medical officer designated by him or her for that purpose shall, from time to time, in consultation with the Chief of the Service or Corporate Division concerned, determine the standard of physical and mental fitness required in peace or war time for the efficient work performance of a member in every Service or Corporate Division in each branch, corps, or unit thereof and in each mustering, appointment, post or job classification in the SANDF, taking into account requirements laid down by the relevant Code of Remuneration or Personnel Management Code and the Chief of the SANDF.

127 GOVERNMENT GAZETTE (2) - No member shall be appointed, enrolled, mustered or employed in any post or mustering of the SANDF or be required to serve or to undergo training in such post or mustering unless the allotted fitness category of such member equals or exceeds the category designated to such post or mustering.

128 EMPLOYMENT EQUITY ACT (1)
Addresses the elimination and prohibition of unfair discrimination in employment. Prohibition of various forms of employment testing. Elimination of unfair discrimination in the policies and practices of the organisation. Scrutinise policies, practices and procedures, particularly, pre-employment, job assignments, training and development and promotional selection.

129 EMPLOYMENT EQUITY ACT (2)
Psychological testing and other similar assessments of an employee are prohibited unless the test or assessment being used: Has been scientifically shown to be valid and reliable; Can be applied fairly to all employees; and Is not biased against any employee or group. Psychological testing and other similar assessment of an employee must only be done on the basis of the inherent requirements of the job.

130 METHODOLOGY (1) OBSERVING AND RECORDING BEHAVIOUR. ASSESSOR DISCUSSION.
SITUATIONAL EXERCISES. PSYCHOMETRIC TESTING.

131 METHODOLOGY (2) OBSERVING AND RECORDING BEHAVIOUR:
Observing and recording behaviour exhibited by participants during simulation exercises. Behaviour is observed and recorded during oral presentations, group discussions, interviews, etc. Behaviour is recorded according to dimension and/or competency. Recorded behaviour is transferred onto a rating form.

132 METHODOLOGY (3) ASSESSOR DISCUSSION:
Each participant’s performance is discussed by exercise, dimension and competency. Assessors may ask for clarification or additional recorded examples of behaviour or competency. Attempts must be made to reach consensus.

133 METHODOLOGY (4) SITUATIONAL EXERCISES:
Theoretical exercises (Case Studies). Practical or dynamic exercises. In basket. Written case analysis. Interview. Leaderless group discussion. Assigned leader group task. Fact finding. Oral presentation. Integrated exercises.

134 METHODOLOGY (5) PSYCHOMETRIC TESTING:
The evaluation of behaviour or mental processes or personality adjustments or adjustments of individuals or groups of persons, through the interpretation of tests for the determination of intellectual abilities, aptitude, interests, personality make-up or personality functioning. The development and control over the development of questionnaires, tests, techniques, apparatus or instruments for the determination of intellectual abilities, aptitude, personality make-up, personality functioning, psycho-physiological functioning or psychopathology.

135 VALIDITY AND RELIABILITY (1)
Validity: The degree to which a method results in a measure that accurately reflects the concept it is intended to measure. A synonym for validity is accuracy. Reliability: The degree to which different measurements of the same concept yield the same results. A synonym for reliability is consistency. NB: Low reliability limits validity; a measure can, however, be reliable without being valid.

136 VALIDITY AND RELIABILITY (2)
Three types of reliability: Inter-rater reliability. Test-retest reliability. Internal consistency. NB: Research indicates high correlation coefficients for these three methods. Strong indications that the AC is a valid predictive instrument for leadership and management potential.
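As a small illustration of the inter-rater type listed above (hypothetical ratings, not SANDF data), a first rough check is the proportion of identical ratings given by two assessors to the same candidates; correlation-based indices are used for formal estimates.

# Illustrative sketch: rough inter-rater consistency check with hypothetical
# ratings of the same candidates by two assessors on a 1-5 scale.
assessor_1 = [3, 4, 2, 5, 4, 3, 4]
assessor_2 = [3, 4, 3, 5, 4, 2, 4]
agreements = sum(a == b for a, b in zip(assessor_1, assessor_2))
print(f"Exact agreement: {agreements}/{len(assessor_1)} ratings")  # 5/7 here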

137 VALIDITY AND RELIABILITY (3)
FAILURES OF ASSESSMENT CENTRES: Never implemented. Results misused. Failure to predict success. Lack of support from the top. Lack of efficiency.

138 VALIDITY AND RELIABILITY (4)
ASSESSOR TRAINING: Thorough knowledge of the organisation and job being assessed. Thorough knowledge and understanding of the assessment techniques, relevant dimensions, etc., to be observed, expected or typical behaviours, examples or samples of actual behaviour, etc. Thorough knowledge and understanding of the assessment dimensions, etc., definitions of dimensions, relationship to job performance.

139 VALIDITY AND RELIABILITY (5)
Demonstrated ability to record and classify behaviour in dimensions, including knowledge of forms used by AC. Thorough knowledge and understanding of evaluation and rating procedures, including how data are integrated. Thorough knowledge and understanding of assessment policies and practices of the organisation, including restrictions on how assessment data are to be used.

140 VALIDITY AND RELIABILITY (6)
Thorough knowledge and understanding of feedback procedures, where appropriate. Demonstrated ability to give accurate oral and written feedback. Demonstrated knowledge and ability to play objectively and consistently the role called for in the interactive exercises.

141 DIVERSE APPLICATIONS OF ASSESSMENT CENTRES
Recruitment Selection Placement Performance appraisal Training and development Organisational development Human resource planning Promotion and transfer Separation and layoffs (Exit)

142 APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (1)
A job analysis of relevant competencies must be conducted to determine the dimensions, attributes, characteristics, qualities, skills, motivation, knowledge, or tasks that are necessary for effective job performance and to identify what should be evaluated by the assessment centre. Competency observations must be classified into some meaningful and relevant categories, such as dimensions, attributes, characteristics, aptitudes, qualities, skills, abilities, knowledge, or tasks.

143 APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (2)
The techniques used in the assessment centre must be designed to provide information for evaluating the dimensions, etc. previously determined by job analysis. Multiple assessment techniques must be used. The assessment techniques must include sufficient job-related simulations to allow multiple opportunities to observe the candidate’s competencies related to each dimension, etc. being assessed. Multiple assessors must be used for each assessee.

144 APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (3)
Assessors must receive thorough training and meet assessor performance guidelines. Some systematic procedure must be used by assessors to record accurately specific competency observations at the time of their occurrence; this might involve handwritten notes, competency observation scales, competency checklists, etc. Assessors must prepare some report or record of the observations made in each exercise in preparation for the integration discussion.

145 APPLICATION OF PRINCIPLES TO ASSESSMENT CENTRE DESIGN (4)
The integration of competencies must be based on a pooling of information from assessors and techniques at a meeting among the assessors or through a statistical integration process validated in accord with professionally accepted standards.

146 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (1)
MC DECISION JUNE 2001: Establishment of the Defence Institute for Assessment and Development. Transfer of the SANDF Assessment and Development Services currently vested at C Joint Training Formation (as agreed during transformation principle decisions) to the SAMHS for functional control purposes and integration into the Defence Institute for Assessment and Development. Assessment and development services and personnel structure of the Defence Institute for Assessment and Development. Phases of implementation for establishing the Defence Institute for Assessment and Development.

147 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (2)
Surgeon General is mandated and designated to be the controlling authority for all statutory psychological and other similar assessment processes to ensure compliance of all policies and practices with legal and statutory acts and regulations. HPCSA’s Professional Board of Psychology is the controlling statutory body with the authority to classify and legalise the use of psychological tests, prescribed questionnaires, apparatus and instruments for the determination of intellectual ability, aptitude, personality functioning and the like.

148 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (3)
Employers who make use of psychological tests, in terms of the EE Act, may be required to ensure not only that the tests meet the standards of the Professional Board of Psychology, but also that the tests and testers meet the requirements of the Health Professions Act 56 of 1974. In accordance with SAQA it has become imperative to apply the guidelines of the National Qualifications Framework wrt outcomes-based education/performance. The AC service must support the DOD Human Resource Strategy 2010.

149 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (4)
AC will aspire towards maintaining competency levels and enhancing productivity across and within the DOD by identifying strengths and potential as well as gaps in competencies. AC will make an essential contribution to the human resource management function in the DOD and will assist the DOD in its objective to develop and sustain optimal skills and competencies in all its members, to entrench a learning culture and to enhance equal opportunity.

150 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (5)
Scientifically and legally acceptable to relevant stakeholders. Acceptable to relevant stakeholders in the DOD as an effective and applicable tool in their respective environment. Encompass the entire scope of assessment/development service from entry to exit level.

151 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (6)
Rendered on the premise of an integrated human resource management system based on a competency framework. Functional control should be vested with the Surgeon General, as mandated, in order to comply with statutory regulations and acts. Affordable (no duplications), sustainable, easily accessible to its users and able to satisfy the needs of its clients.

152 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (7)
Defence Assessment and Development Advisory Board, chaired by Director Psychology, must determine policy and guidelines for all kinds of Assessment and Development in the SANDF. Advisory Board will consist of all relevant stakeholders from all arms of services and divisions in the SANDF and will primarily be responsible for controlling all assessments in the SANDF in line with legal, statutory and scientific requirements.

153 MILITARY COUNCIL DECISIONS: ROLE OF SAMHS (8)
MIGRATION OF ARMY ASSESSMENT CENTRE TO SAMHS (MPI): An Assessment Centre is a specialised function and remains the responsibility of D Psychology. The function should be with MPI (DOD Assessment Centre), with a Service Agreement between MPI and the SA Army to deliver an on-site service to the SA Army according to requirements. The current structure of the SA Army should be translated to a structure as proposed by D Psychology, to be aligned with the DOD Assessment Centre. The SA Army will remain responsible for providing the facilities and infrastructure.

154 THE APPLICATION OF SPECIALIST PSYCHOLOGICAL MEASURES IN THE RECRUITMENT AND SELECTION OF PILOTS IN THE SANDF

155 SCOPE (1) Pilot Selection Process Flow.
Recruitment and Selection Process Task Organisation. Academic Requirements. Physical Requirements. Medical Requirements. Psychological Requirements.

156 SCOPE (2) Potential Index Battery Functional Analysis (Competencies).
Final Selection. Dover Systems Computerised Skills Assessment. Benchmarking. Vienna Research.

157 SELECTION PROCESS FLOW
Job Classification → Recruitment → Screening → Selection → Final Selection → Selection Board → Foundation Training → Basic Military Training → Officer Forming → Military Academy

158 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (1)
PHASE 1: ADVERTISING. CHRS (SAAF): Internally in DOD. CHRS (D PACQ): National and local newspapers. PHASE 2: PAPER SELECTION. CHRS (D PACQ): Minimum academic screening. CHRS (D PACQ): Provide SAAF with list of applicants who meet the minimum academic requirements. CHRS (SAAF): Divide applicants into groups for assessment and selection process.

159 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (2)
PHASE 3: CALL-UP. DHRS (D PACQ): Notify qualified candidates. DHRS (SAAF): Provide specified arrival times of applicants. DHRS (SAAF): Responsible for transport, accommodation, meals and other logistic requirements of the applicants in Pretoria from the time they arrive.

160 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (3)
PHASE 4: ORIENTATION. DHRS (SAAF): In conjunction with SAAF's Dir Education, Training and Development; Dir Combat Systems Group; Dir Heli Systems Group and Dir Air Transport and Maritime Systems Group, responsible for the orientation programme for applicants. DHRS (SAAF): Responsible for briefing all applicants on the assessment and selection process.

161 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (4)
PHASE 4: ORIENTATION (cont). OC MPI: Responsible for briefing applicants on psychometric assessments. OC IAM: Responsible for briefing applicants on aviation medical examination.

162 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (5)
PHASE 5: PSYCHOMETRIC ASSESSMENTS. SAMHS (D PSYCH/MPI): Responsible for conducting, administering and interpreting the results of all psychometric assessments. External Service Providers: MPI executes these responsibilities in cooperation with Dr Landman (16PF) and Ms Coetzee (Vienna), as agreed upon with SAAF.

163 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (6)
PHASE 5: PSYCHOMETRIC ASSESSMENTS (cont). DHRS (SAAF): Responsible for financial compensation and logistic support for external service providers. DHRS (SAAF): Responsible for ensuring that applicants are available at assessment locations.

164 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (7)
PHASE 5: PANEL INTERVIEW. CHRS (D PACQ): Chair panel interview for all candidates from each group who meet the set requirements for psychometric assessments. Other members of panel: CHRS (SAAF), DETD (SAAF), qualified pilot (SAAF), CHRS (D PACQ) and MPI.

165 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (8)
PHASE 6: AVIATION MEDICAL EXAMINATION. D MEDICINE/IAM (SAMHS): Responsible for all administration, data collection and record keeping of aviation medical examinations. D MEDICINE/IAM (SAMHS): Responsible for all arrangements wrt comprehensive aviation medical examinations. DHRS (SAAF): Ensure transport for candidates to IAM premises.

166 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (9)
PHASE 7: COLLATION OF MEDICAL RECORDS. IAM (SAMHS): Provide DHRS (SAAF) with relevant medical reports of all candidates. DHRS (SAAF): Arrange meetings with D Medicine (SAMHS) and D Psychology (SAMHS) to discuss borderline health reports.

167 RECRUITMENT AND SELECTION PROCESS TASK ORGANISATION (10)
PHASE 8: CONSOLIDATED SELECTION BOARD. Consolidated Selection Board: Identify the most suitable applicants according to SAAF requirements. Members of Consolidated Selection Board: CHRS (SAAF); CHRS (D PACQ); ETD (SAAF); Qualified Pilot (SAAF); D Med/IAM (SAMHS) and D Psych (MPI). DHRS (SAAF): Submit final selected list to CHRS (D PACQ).

168 ACADEMIC REQUIREMENTS
Maths – D (HG). Science – D (HG). English – Passed. Matric – University Exemption. M-Score – Rank Order.

169 PHYSICAL REQUIREMENTS
Height – 192 cm. Mass – 100 kg.

170 BASIC MEDICAL Vision. Colour blindness. Hearing. Balance/Coordination.
Physical condition (Fat content). SAMHS Comprehensive Medical.

171 MEDICAL REQUIREMENTS Normal physical condition – all limbs and all senses functioning normally. No spectacles. No speech impairment. No colour blindness. No hearing aids, pacemakers, etc. No diabetes. No epilepsy. No diseases of the lungs, heart, kidneys, etc. No stuttering. No drug usage/dependency. No HIV/AIDS.

172 PSYCHOLOGICAL REQUIREMENTS
Goal-specific selection. High-risk environment with little space or time for error. Determine baseline aptitude and profile. Minimise risk: 80% of fatal aircraft accidents are caused by human error. High cost of flying training. Screen out high-risk candidates. Imperative that the right candidates are selected to be trained as military pilots.

173 POTENTIAL INDEX BATTERY FUNCTIONAL ANALYSIS
Hand-eye co-ordination. Conceptualisation (spatial orientation). Judgement. Decisiveness. Observance. Adaptability. Calculations. Memory. Linguistic proficiency. Analytical thinking.

174 APTITUDE TESTS
Intellectual ability: Non-verbal reasoning. Verbal reasoning. Mathematical ability. Spatial orientation. Language Test. Psycho-motor Test (Dover Vienna System): Determination Test. Mental Health Screening Test.

175 FINAL SELECTION Flight Medical Fitness. Leadership. Personality.
Motivation for Flying. Selection Board Criteria. Career Orientation. Foundation Training Course. Scholastic Achievement. Psycho-motor abilities.

176 DOVER SYSTEMS COMPUTERISED SKILLS ASSESSMENTS (VIENNA)
Flight Crew Selection. Military Officer/Driver. Air Traffic/Combat Control. Naval Operations. Weapon Delivery Skills.

177 SKILLS TESTING PROPER SELECTION
Provides most suitable candidates. Identifies candidates with the required skills. Lessens the learning time. Reduces expenditure on poorly trainable candidates.

178 TEST ADMINISTRATION Test Administration Evaluation Interpretation
Test Administration: Computer-aided implementation of tests guarantees that all instructions as well as the presentation of stimuli are equal for all subjects and independent of the test administrator. Evaluation: Registration of data and comparison with norm samples is carried out automatically by the computer, thus the possibility of miscalculation is eliminated. Interpretation: As the test is a standardised performance test, interpretation objectivity is evident.

179 APPLICATION OBJECTIVE EVALUATIONS
Reaction Times. Perception Skills (both visual and auditory). Stress Coping. Decision-making Abilities. Communication Skills. Short- and Long-term Memory. Co-ordination Skills. Learning Curve / Trainability. Attitudes. Levels of Aggression.

180 VALIDITY: STANDARDISATION
GROUP            NUMBER   PERCENTAGE
African          1232     61.8%
Indian           54       12.7%
White/Coloured   577      29%
Male             1622     81%
Female           373      19%
TOTAL            1995     100%

181 VALIDITY Culture-free and low-cost
Highly valid and reliable assessment of piloting skills, driver skills as well as other military-related skills. Training (military) and development (civil) can create an ideal pool of candidates as well as identify special needs of candidates. The Dover system has 20 years' experience in psychomotor selection in developing countries. Based in South Africa, with assessments done worldwide by arrangement.

182 VALIDITY (cont.) The system has been widely used within the African context for both military and civil purposes. Lesotho, Rwanda and Malawi are among the countries that have used the system in the selection of flight and other military personnel. The system is also employed in conjunction with premier air schools such as 43 Air School and the Flight Training Centre. These schools select and train candidates from SADC countries as well as other African countries such as Egypt and Kenya.

183 VALIDITY (cont.) The system has been involved in the selection of previously disadvantaged candidates for pilot training, as sponsored by the Aviation Training and Development Foundation (ATDF). This was done on a nationwide basis and candidates were selected for the feeder airlines SA Express and Airlink. The system was also used successfully in a selection programme for previously disadvantaged community members and in the British Airways Pilot Development Programme.

184 BENCHMARKING (1) UNITED KINGDOM Officers and Aircrew Selection Centre.
Part 1 Selection Procedure: Aptitude Testing. Verbal Reasoning: Interpretation and use of written or spoken information. Numerical Reasoning: Interpretation and use of numerical information. Capacity: Dealing with multiple tasks involving aural and/or visual information, concentrating, noting changes, paying attention to detail. Spatial Ability: Mental visualisation and orientation.

185 BENCHMARKING (2) Medical Examination: Determine fitness. Interview:
Work Rate: Performing tasks quickly and accurately. Psychomotor: Co-ordination of eye, hand and foot. English Test: Written English language skills. Medical Examination: Determine fitness. Interview: Appearance and bearing. Manner. Speech and Powers of Expression. Activities and Interests. Academic Level/Potential. Physical Level/Potential. Awareness. Motivation. Overall Impact.

186 BENCHMARKING (3) Part 2 Selection Procedure:
Discussion Exercise. “Leaderless Exercise”. Group Planning Exercise. Individual Problem Exercise. Command Situation Exercise. Part 3 Selection Procedure: Debrief. Part 4 Selection Procedure: Integration of rating scores. Feedback.

187 BENCHMARKING (4) INDIA Non-verbal test. Personality test.
Clinical Screening. Pilot Aptitude Test. Simulation Exercises. Physical Fitness Test. Group Exercises. Interview.

188 BENCHMARKING (5) ISRAEL Pilot Evaluation System (PES).
Standardised mass screening of pilot training candidates. Simulated “real” conditions. Identify “pilot training” candidates who possess the ability to focus their attention on a multitude of competing tasks while prioritising a steady stream of incoming data. Scientifically based quantitative evaluation of performance of pilot candidates. Simulated cockpit generates flight scenarios.

189 BENCHMARKING (6) PAKISTAN Selection System Psychological Dimensions
Intelligence Test. Academic Test. Psychological Test. Physical Test. Interview. Flying Aptitude Test. Psychological Dimensions: Sentence completion. Word association. Thematic Apperception. Self Description.

190 BENCHMARKING (7) Group Exercises: Group discussion. Group planning.
Half group task. Command task. Progressive group task. Individual obstacles.

191 BENCHMARKING (8) Flying Aptitude Test (FAT) Cognitive component
Instrument Comprehension (Spatial Orientation). Vigilance (Cognition, Concentration, Attention). Digit recall (Mental sharpness, Memory, Speed). Attention Diagnostic Method. Defence Mechanism Test. Psychomotor component: Fly through (Sharpness, Eye-hand coordination). Target flying (Reflex action, Eye-hand-foot coordination).

192 BENCHMARKING (9) GERMANY Comprehensive Test Battery Flight-simulator
Inductive reasoning. Spatial ability. Attention. Reactive capacity. Verbal and visual memory. Sensorimotor coordination. Flight-simulator.

193 BENCHMARKING (10) SINGAPORE Computerised Aptitude Test Battery
Psychomotor tasks: Hand-eye-foot coordination. Pursuit tracking. Cognitive tasks: Numerical and mechanical reasoning. System operation.

194 CHALLENGES Targeted recruitment
Identification of candidates at earlier stages

195 PSYCHOMOTOR ABILITY: VIENNA TEST SYSTEM (VTS)
Research Overview on the use of the VTS in the SAAF Pilot Selection Test Battery

196 PRESENTATION AGENDA Background
SAAF Pilot Selection Battery – The Role of MPI. Vienna Test System. Practical Issues wrt SAAF Pilot Selection: Recruitment. Representation. High failure rate on VTS. Practical Demonstration of the VTS. Interpreting Results.

197 PRESENTATION AGENDA Exploratory Research
The Three Clusters of Applicants. Information Processing: Coping Strategy, Audio Deficits and Concerns. Age Differences in VTS Performance. Other Findings of Interest.

198 THE ROLE OF MPI To make recommendations to the SAAF as to the psychological status of potential SAAF pilots in terms of: Basic aviation-related aptitudes. Mental status of applicants.

199 SAAF PILOT PROFILE Based on a scientific Job Profile Analysis (JPI)
Professional Pilot Profile: Intellect, aptitude, language proficiency, cognitive functioning under different STRESS situations. Professional Soldier Profile: Officer in the SANDF. Needs to be updated. Leadership, endurance (concentration ability) and realistic perceptions of flying.

200 THE PROCESS Seven different psychological tests administered over two days. Aptitude: Intellectual ability (Non-verbal, Verbal). Mathematical Ability. Spatial Orientation Ability.

201 THE PROCESS (2) Language Proficiency. Personality Test.
Psychomotor Test (Computerised): Cognitive Functioning under Stress. Time and movement anticipation. Two-Hand Coordination. Biographical Questionnaire. Clinical (psychopathology) Screening Test. Structured Clinical Interview (IAM).

202 CUT-OFF STAGES Stage 1: Aptitudes and Language Proficiency
Stage 2: Psychomotor Test (Dover-Vienna Tests). Stage 3: Clinical Assessment.
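
The staged cut-offs work as a sequential filter: only candidates who clear a stage are carried forward to the next. The sketch below illustrates this flow in Python with placeholder candidate records and pass criteria; the actual SANDF cut-off scores and data fields are not reproduced here.

```python
# Schematic sketch of the three cut-off stages as a sequential filter.
# Candidate records and pass criteria below are placeholder assumptions,
# not the SANDF's actual cut-off scores.
candidates = [
    {"name": "A", "aptitude": 72, "language": 65, "psychomotor": 58, "clinical_clear": True},
    {"name": "B", "aptitude": 48, "language": 70, "psychomotor": 61, "clinical_clear": True},
    {"name": "C", "aptitude": 75, "language": 68, "psychomotor": 39, "clinical_clear": True},
]

# Stage 1: aptitudes and language proficiency
stage1 = [c for c in candidates if c["aptitude"] >= 60 and c["language"] >= 60]
# Stage 2: psychomotor test (Dover-Vienna)
stage2 = [c for c in stage1 if c["psychomotor"] >= 40]
# Stage 3: clinical assessment
stage3 = [c for c in stage2 if c["clinical_clear"]]

print([c["name"] for c in stage3])  # only candidate "A" survives all three stages
```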

203 THE VIENNA TEST SYSTEM (VTS)

204 THE VIENNA TEST SYSTEM (VTS)
Consists of numerous subtests (27+). For selection purposes: Determination Unit (DT). For research purposes: Cognitrone Test. Two-Hand Coordination Test. Pilot Spatial Test. Time-Movement Anticipation Test. Need to include Multi-tasking subtest.

205 DESCRIPTION OF THE TEST (DT)
Definition 1: The Determination Test measures behaviour under different levels of psychological and physiological stress, since the high frequency of signals puts almost everyone into an overcharge situation (Kisser, 1986:226).

206 DESCRIPTION OF THE TEST (DT)
Definition 2: Hoyos (1969) defines stress as the incapacity of a highly motivated individual to find correct responses in a situation of extreme stimulus constellation (sic)

207 DESCRIPTION OF THE TEST (DT)
Definition 3: Stress tolerance is the capacity of a person to resist a stimulus, i.e. to activate reactions in a certain situation in order to cope with it in the best way possible (Kisser et al., 1986, p. 226). The key issue: coping strategies for the information overload created in a stress situation, whether natural or artificially induced.

208 APPLICATION OF DETERMINATION TEST
Measurement of reactive stress tolerance. Ability to give sustained multiple-choice reactions to rapidly changing stimuli. Detection of attention deficit disorders and colour blindness.

209 THEORETICAL BACKGROUND
The DT measures reactive stress tolerance and related reaction speed in terms of: discrimination of colours and acoustic signals; memorisation of the relevant characteristics of stimulus configurations, response buttons and assignment rules; selection of the relevant reactions according to the assignment rules; and continuous, sustained, rapid and varied reactions to rapidly changing stimuli.

210 OBJECTIVITY Test Administration
The computer-aided implementation of tests guarantees that the instructions as well as the presentation of stimuli are equal for all subjects and independent of the test administrator. Evaluation: Registration of data and comparison with norm samples is carried out automatically by the computer, thus the possibility of miscalculation is eliminated. Interpretation: As the test is a standardised performance test, interpretation objectivity is evident (Lienert, 1961).

211 EVALUATION Reliability
Reliability: Internal consistency (Cronbach's alpha) ranges between 0.86 and 0.99; split-half reliability between 0.86 and 0.99. Validity: Construct validity – correlations between 0.3 and 0.8 with similar tests. Predictive validity – correlates with SAAF pupil pilot flying scores. Norms: International and local (pilots).
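
To make the reliability figures concrete, the following is a minimal sketch of how an internal-consistency coefficient such as Cronbach's alpha can be computed from item-level scores. The function and the data are illustrative only; they are not VTS items or SANDF results.

```python
# Illustrative sketch: computing Cronbach's alpha for a set of test items.
# The item scores below are made-up example data, not SANDF or VTS results.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = test takers, columns = items."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 6 test takers, 4 items (hypothetical scores)
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```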

212 INTERPRETATION OF TEST RESULTS
Correct reactions on time: This indicates how well a subject adapts to a pre-set presentation duration of stimuli. This ability to adapt depends on two factors: subjects have to pace their reaction time in such a way that they do not get off track, and they have to make sure that there is enough time between stimuli to make the right decision. T-scores above 60 and below 40 (or percentiles above 84 and below 16) indicate development of this ability above or below average, respectively. Poor performance is indicated by: a low score of correct responses on time (compared to the norm sample), and a proportional decrease in the number of correct responses on time when the presentation time of the stimuli is diminished.
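
The T-score cut-offs quoted above follow from the scale on which T-scores are defined (mean 50, standard deviation 10, assuming a normal distribution). The short sketch below, using purely illustrative values, shows why T = 60 and T = 40 correspond roughly to the 84th and 16th percentiles.

```python
# Sketch: converting T-scores to percentiles under a normal model (T = 50 + 10z).
# Illustrates why T > 60 and T < 40 correspond roughly to percentiles above 84
# and below 16.
from scipy.stats import norm

def t_to_percentile(t_score: float) -> float:
    z = (t_score - 50) / 10        # T-scores have mean 50 and SD 10
    return norm.cdf(z) * 100       # percentage of the norm group scoring below

for t in (40, 50, 60):
    print(f"T = {t} -> {t_to_percentile(t):.0f}th percentile")
# T = 40 -> 16th percentile, T = 50 -> 50th, T = 60 -> 84th
```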

213 INTERPRETATION ….. Delayed and Omitted reactions
Usually, when the presentation time of the stimuli is decreased, a growing number of reactions are first delayed, then omitted. This results from the fact that the speed at which the stimuli are presented accounts for the most difficult condition of the test. The initial increase in the number of delayed reactions versus omitted reactions is a normal function of our attention. This function guarantees that a reaction is screened from external distractions (in this case the interruption of the stimulus presentation) and thus is carried out even though a new stimulus appears. A high number of omitted reactions (T-score below 40 due to the reversed scale) combined with a low number of delayed reactions (T-score under 40) would therefore indicate attention deficits.
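
As an illustration of how such reactions might be classified during scoring, here is a hypothetical sketch in Python. The record format, the one-second presentation time and the classification rules are assumptions made for the example; this is not the actual VTS scoring algorithm.

```python
# Hypothetical sketch of classifying Determination Test reactions.
# Thresholds and the record format are illustrative assumptions, not the VTS algorithm.
from dataclasses import dataclass
from typing import Optional

PRESENTATION_TIME = 1.0   # seconds a stimulus stays on screen (assumed value)

@dataclass
class Reaction:
    stimulus: str                 # e.g. "red", "blue", "tone"
    response: Optional[str]       # button pressed, or None if nothing was pressed
    latency: Optional[float]      # seconds from stimulus onset, None if omitted

def classify(r: Reaction, expected: str) -> str:
    if r.response is None:
        return "omitted"                      # no reaction before the next stimulus
    if r.response != expected:
        return "incorrect"                    # wrong button: possible stimulus confusion
    if r.latency <= PRESENTATION_TIME:
        return "correct on time"
    return "delayed"                          # right button, but after the stimulus changed

print(classify(Reaction("red", "red", 0.6), expected="red"))    # correct on time
print(classify(Reaction("red", "red", 1.4), expected="red"))    # delayed
print(classify(Reaction("red", None, None), expected="red"))    # omitted
```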

214 INTERPRETATION … Incorrect Reactions
Incorrect reactions indicate the tendency to confuse stimuli. The Response Matrix can locate where such confusions accumulate. In contrast to delayed and omitted reactions, incorrect reactions are not so much an indicator of the difficulty of the test. Usually, the number of incorrect responses increases only slightly when the presentation time of the stimuli is decreased. Incorrect reactions occur mainly because the subjects are unable to screen appropriate responses from concurrent and irrelevant external distractions. Thus, the variable “Incorrect Reactions” is closely linked to any attention deficits. The number of incorrect reactions indicates the subject’s tendency to give a rapid response at the very last moment under the pressure of limited presentation time.
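
A response matrix of this kind can be pictured as a simple tally of stimulus-response pairs, so that systematic confusions stand out. The stimulus names and counts below are invented for demonstration only.

```python
# Sketch of a response matrix: tally which response followed which stimulus,
# so that systematic confusions (e.g. red -> green) stand out. Data are made up.
from collections import Counter

pairs = [("red", "red"), ("red", "green"), ("green", "green"),
         ("blue", "blue"), ("red", "green"), ("blue", "red")]

matrix = Counter(pairs)  # (stimulus, response) -> count
for (stimulus, response), count in sorted(matrix.items()):
    note = "" if stimulus == response else "  <- possible confusion"
    print(f"{stimulus:>5} -> {response:<5} {count}{note}")
```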

215 Interpretation: The Three Phases
Less than a Second. Overall. Potential Stress. Recovery.

216 CORRECT REACTIONS ON TIME
[Chart: number of correct reactions on time (Overall, Potential Stress and Recovery phases) per interval for the Accepted, Reservation and Not Accepted groups.]

217 OMITTED RESPONSES: A FLIGHT SAFETY RISK
[Chart: omitted responses (number of stimuli) across the Overall, Potential Stress and Recovery phases.]

218 DELAYED RESPONSES [Chart: delayed responses (number of stimuli) across the Overall, Potential Stress and Recovery phases.]

219 INCORRECT RESPONSES
[Chart: incorrect responses (number of stimuli) across the Overall, Potential Stress and Recovery phases.]

220 REACTION TIMES AND AGE

