
How Umalusi adjusted the 2010 matric results

Text of briefing by Chairperson Sizwe Mabizela, February 23 2011

Presentation by the Chairperson of Umalusi, Sizwe Mabizela, on the release of the 2010 Matric Standardisation Decision, February 23 2011

Following the release of the 2010 matric results, there has been an unprecedented interest in the role of Umalusi in quality assuring the results of the class of 2010. In the quest to understand the improvement in the performance of the 2010 cohort of learners, there has been a call on Umalusi to disclose the standardisation decisions made in respect of each subject.

In the absence of authoritative information from Umalusi, some so-called experts have taken the gap to peddle all manner of uninformed and ill-considered claims and comments to the general public. Some of these are reckless, irresponsible and injurious to the credibility and integrity of Umalusi. To illustrate this point let me refer to an article that appeared in the Cape Times of 11 January 2011.

Top matrics robbed of As - teachers

The process of adjusting matric exam marks has robbed pupils of A's in their final history exams, say teachers at some top Cape schools. Low marks in business studies are also being questioned. But quality assurance body Umalusi has defended the adjustment process.

Wynberg Girls High senior history teacher Mario Fernandez said he had done a spot survey of the history results of seven excellent Cape Town schools, and had found that each of them had achieved "a mere handful of A symbols, far fewer than they believe, in their professional opinion, their candidates deserved".

He queried Umalusi's adjustment process and asked why Umalusi refused to release the lists of standardised subjects.

Westerford history teacher James Bissett agreed with Fernandez that there had been "artificial manipulation of the top end of the history results".

He said the number of pupils with an A in history had dramatically dropped from more than 55 percent in 2009 to 13 percent in 2010.

After speaking to a number of history teachers at different schools, Bissett said a bell curve appeared to have been applied to the raw marks, with the number of pupils at the high and low end reduced.

"It is really the bright and talented students who are being affected."

Sacs teacher Simon Perkins, with 40 years experience, said only six history pupils at the school had achieved more than 80 percent. He had expected between 20 and 30 pupils to achieve those results.

As you will learn later, History marks were not adjusted at all!

While we have consistently upheld the confidentiality of the standardisation process, we have realised that failure to share with the South African public the quality assurance processes and decisions of Umalusi will potentially cause irreparable damage to Umalusi and the qualifications that it quality assures. It is with this consideration in mind that we have taken the decision to disclose the information on the standardisation of the 2010 NSC results.

Umalusi's mandate in respect of quality assurance of assessments

Chapter 2A of GENFETQA Act 58 of 2001 (as amended in 2008) stipulates the following.

Internal assessment that forms part of final assessment

17. (1) The Council may issue directives for internal assessment to ensure the reliability of assessment outcomes;

(2) The directives contemplated in subsection (1) must include measures for verification.

(3) Assessment bodies must monitor the implementation of the Council's directives and report any irregularity without delay to the Council in writing, as well as the steps taken to deal with the irregularity.

External assessment

17A. (1) The Council must assure the quality of assessment at exit points.

(2) (a) The Council must develop policy for the accreditation of assessment bodies other than departments of education and must submit it to the Minister for approval.

(b) The Minister must make regulations in which the policy for accreditation is set out.

(c) The Council must accredit an assessment body in accordance with the regulations  contemplated in paragraph (b).

(3) The Council must perform the external moderation of assessment of all assessment bodies and education institutions.

(4) The Council may adjust raw marks during the standardisation process.

(5) The Council must, with the concurrence of the Director-General and after consultation with the relevant assessment body or education institution, approve the publication of the results of learners if the Council is satisfied that the assessment body or education institution has-

(i) conducted the assessment free from any irregularity that may jeopardise the integrity of the assessment or its outcomes;

(ii) complied with the requirements prescribed by the Council for conducting assessments;

(iii) applied the standards prescribed by the Council with which a learner is required to comply in order to obtain a certificate; and

(iv) complied with every other condition determined by the Council.

(6) The Council must issue certificates to learners who have achieved qualifications or part qualifications.

In line with its mandate, Umalusi follows these processes and procedures in quality assuring the National Senior Certificate examinations.

1. Setting of examination question papers

All assessment bodies (DBE, IEB, ERCO, DHET) set most of their examination question papers. In respect of some subjects, one assessment body may make arrangements with another assessment body for the sharing of question papers. For example, DBE candidates write IEB papers in Maritime Economics, Equine Studies, and Nautical Sciences and ERCO candidates write DBE papers in Geography.

2. Internal Moderation

Once a question paper has been set for a subject, the assessment body passes it on to its internal moderator.

3. External Moderation

After the examiner(s) and the internal moderator of the assessment body have approved the question paper, it is sent to Umalusi's external moderator. The external moderators are subject experts with many years of experience in teaching and assessing the subject. They are also well versed in the National Curriculum Statement (NCS) and the Subject Assessment Guidelines (SAGs) of the subject. Some are affiliated with South African universities and others are retired teachers. The external moderator assesses the question paper on, among other things, syllabus coverage in terms of breadth and depth, the weighting and spread of the various learning outcomes, and its cognitive demand, using appropriate classification schemes or rubrics such as the revised Bloom's taxonomy. The interaction between the assessment body, its examiner(s) and Umalusi's external moderator is an iterative process which is concluded only when the external moderator approves the question paper.

The main challenge Umalusi experiences at this stage of its quality assurance exercise is that examination question papers are usually submitted to it too late, so that the external moderators work under immense pressure to finalise them. Several years ago Umalusi proposed that assessment bodies adhere to an 18-month cycle for the setting of examination papers. This remains one of its major challenges to this day.

4. Monitoring the "State of readiness"

This phase of the monitoring is to determine the state of readiness of the assessment bodies and Provincial Departments of Education (PDEs) to administer the examinations. This monitoring usually takes place during July, August and September of each year. Prior to 2010, it required assessment bodies and PDEs to submit completed self-evaluation instruments to Umalusi. These reports were then evaluated and assessment bodies were informed of any shortcomings. This process was supplemented by the deployment of Umalusi monitors to verify the information and evidence submitted in the self-evaluations. The approach in 2010 was different: Umalusi staff members (teams of 2-3) accompanied the DBE teams on their monitoring visits of 3-4 days per PDE. The reason for this approach was twofold: firstly, to avoid duplication in the system and, more importantly, to verify the veracity of the DBE monitoring process. Understanding the breadth and depth of the DBE processes helped Umalusi to better understand and authenticate the DBE monitoring reports submitted to it. Among other things, the monitoring of the state of readiness evaluates the following aspects of the examination processes: availability of policies and regulations on assessment processes; registration of candidates; where applicable, the appointment and training of examiners and internal moderators; facilities for printing and storage of question papers; security systems for examination materials; arrangements for the distribution of question papers; appointment and training of invigilators and chief invigilators; the plan for invigilation; preparation for marking processes; appointment and training of markers and examination assistants; and the plan for monitoring.

5. Monitoring of the administration and conduct of examinations

The Umalusi quality assurance processes are designed to determine the degree of compliance of the assessment bodies and PDEs with the policies regulating the administration, conduct and management of examinations. Both the writing and the marking phases of the examinations are monitored. The monitoring of writing is conducted in a sample of centres across the nine provincial departments of education, by Umalusi monitors located within the nine provinces. Umalusi staff also conduct unannounced "shadow" visits to determine the veracity with which the Umalusi monitors execute this function. Aspects monitored during the writing phase include: the general management of the examination with respect to the provision of adequate and suitable facilities; the processes followed before, during and after the examination session; and all aspects of security relating to question papers and examination material. Very much the same aspects are monitored at the marking centres.

6. Verification of marking

Quality assurance (verification) of marking comprises two processes, namely the approval of the final marking guidelines at the marking guideline discussion meetings, and the centralised verification of marking. Marking guideline discussion meetings, hosted by the Department of Basic Education (DBE), serve to ensure that all possible responses are accommodated in the final approved marking guidelines. Final marking guidelines must be approved by the Umalusi external moderators. Verification of marking checks the correct and consistent application of the marking guidelines across the provinces, which in turn attests to the consistency and accuracy of marking. PDEs submit to Umalusi a specified sample of scripts. These are then moderated to determine compliance with the approved marking guidelines.

The quality of marking sometimes leaves a lot to be desired. This is closely linked to teachers' knowledge of the subject content.

7. Capturing of marks

This is done by the assessment body. Umalusi monitors the capturing of both the School Based Assessment (SBA) and the examination marks. The monitoring is to ensure that the assessment body has the necessary infrastructure and capacity as well as the required security measures in place.

8. Standardisation

An examination process invariably has many and varied sources of variability. Most of these are unplanned, unintended and undesirable. They range from mistakes in a question paper, to subtle issues of the level of difficulty and cognitive demand of the paper, to the possibility of multiple and differing interpretations of its questions. It is universally accepted that judging the level of difficulty of a question paper is a difficult exercise. It is only after it has been written and marked that one is able to determine whether it was of an appropriate level or not.

Standardisation, commonly known as statistical moderation of raw scores, is a process used the world over to mitigate the effects on learner performance caused by factors other than the learners' subject knowledge, abilities and aptitude. There are therefore two main objectives for standardisation:

i) To ensure that a cohort of learners is not advantaged or disadvantaged by extraneous factors other than their knowledge of the subject, their abilities and their aptitude.

ii) To achieve comparability and consistency from one year to the next. For example, it should not be easier or harder to obtain a distinction in one year than it is in another year.

If a cohort of learners in one year writes a paper that is easier than the one written by the previous year's cohort, comparing the quality of passes of these two cohorts would pose a major problem. The cohort that wrote the harder paper would be at a distinct disadvantage in many respects, from competing for bursaries for further study to gaining acceptance at further and higher education institutions.

Standardisation of matric results has been used since 1918 by Umalusi's predecessors, the Joint Matriculation Board (JMB) (1918-1992) and the South African Certification Council (SAFCERT) (1992-2001). Anyone who holds a matric certificate had his or her subject marks standardised before the results were confirmed.

Umalusi's standardisation is done by the Assessment Standards Committee. This is a committee of Council which comprises men and women of impeccable credentials, integrity and credibility. They are independent people who are not in the employ of Umalusi. They are appointed by Umalusi Council based on their extensive knowledge of, experience and expertise in statistical moderation, statistics, assessment, curriculum, and educational matters. Some of them are affiliated with our universities and others serve on national and international bodies that deal with education and assessments. Most of them have been doing this work for many years and have seen this process mature over time.

Prior to the standardisation meeting, the Assessment Standards Committee receives extensive and detailed qualitative reports on each subject from external moderators, chief markers and researchers. Among other things, independent subject experts and researchers carry out detailed comparative analyses of the examination question papers against those of previous years. After all the qualitative reports have been analysed and discussed, the Committee receives booklets which contain the marks for each subject from the assessment body. Each booklet contains all the essential statistics and information for the subject: the number of candidates who wrote it, the subject percentage capture, the mark distribution, the performance of previous years' cohorts in the subject, the historical means, the ogive curves, and the pairs analysis. The Committee does not receive the overall pass rate of the cohort based on raw marks. This is not important for the work of the Committee, given that the raw marks are not necessarily unproblematic in light of all the potential sources of variability mentioned earlier. It is worth mentioning that the Committee standardises only the written component (75%) of each subject. The Continuous Assessment mark (CASS mark) of each subject is standardised only in relation to its written component, after the written component has been standardised by Umalusi. The cohort pass rate is calculated only after standardisation has been completed, using the marks for the written components of the subjects, the CASS marks and language compensation.

The Committee standardises the subjects individually and in a linear and non-iterative manner. That is, once a subject has been standardised, the Committee proceeds to the next subject.

How is standardisation carried out?

The Committee looks at the learner performance in subject X. A glance at the ogive curves gives one a quick sense of the learner performance in the subject. If the performance is in line with that of previous cohorts, and the qualitative reports do not suggest anything untoward in the question papers or in the administration, management and conduct of the examination, the Committee will accept the raw examination marks. The Committee then proceeds to the next subject, Y (say).

If the ogive curve for subject Y is far to the right of all previous curves and the historical mean ogive curve in the subject, this may suggest that the question paper was easy. In all likelihood the qualitative information would have suggested that as well. The Committee then looks at the pairs analysis - comparing the performance of the learners in subject Y to their performance in the other subjects that they have taken (together with subject Y). If the pairs analysis corroborates the other pieces of information indicating that the question paper was easy, the Committee will consider a downward adjustment of marks to bring the learner performance closer to the historical average.

After the Committee has standardised subject Y, it then proceeds to subject Z. If its ogive curve is far to the left of all previous curves and the historical mean ogive curve in the subject, this may suggest that the question paper was too difficult. Again, in all likelihood the qualitative information and pairs analysis will corroborate the fact that the question paper was difficult. In this case the Committee will consider an upward adjustment of raw marks to bring the learner performance closer to, but possibly slightly below, the historical norm.
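To make the idea of a pairs analysis concrete, the short sketch below (in Python, using entirely invented marks; it illustrates the principle only and is not Umalusi's actual procedure or software) compares each candidate's mark in subject Y with that candidate's average across his or her other subjects. A large average gap in favour of subject Y, taken together with the ogive and the qualitative reports, would support the view that the subject Y paper was easier than usual.

import numpy as np

# Hypothetical marks (%) in subject Y for five candidates.
subject_y = np.array([72.0, 65.0, 80.0, 55.0, 90.0])
# Each candidate's average mark (%) across the other subjects he or she wrote.
other_subjects_mean = np.array([60.0, 58.0, 66.0, 50.0, 75.0])

# Average gap between performance in subject Y and performance elsewhere.
gap = np.mean(subject_y - other_subjects_mean)
print(f"Mean difference (subject Y minus other subjects): {gap:+.1f} percentage points")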

Suppose that in subject S learner performance is in line with the performance of previous cohorts of learners except that this cohort has four or five times as many distinctions as has been the case in the past. This may suggest that the paper may have failed to include sufficiently challenging questions for those learners at the top end. Again, this is usually corroborated by independent sources of information. In this case the Committee may effect a downward adjustment at the top end while leaving the rest of the marks as raw to bring down the distinction rate closer to the historical norm.  An appropriate scaling is effected so as to maintain the rank order of candidates. A few interesting questions immediately arise: How does one report this adjustment?  A downward adjustment?  By how much?

Adjustments of raw examination marks are based on a set of principles that guide the standardisation process. Here I will mention just two of these:

(i) No adjustment should exceed 10% of the historical average. (The adjustments do not exceed 10% - for a 300-mark paper, adjustments should not be more than 30 marks.)

(ii) In the case of an individual candidate, the adjustment effected should not double or halve the raw mark obtained by the candidate.

It should be obvious from (i) and (ii) above that one cannot have a fixed level of adjustment - appropriate scaling must be used to be consistent with the principles of standardisation. For example, if one added 12 marks to all the raw marks of candidates, those who obtained 11 marks or below would have their marks more than doubled, and those who obtained 289 marks or more would end up with more than 300 marks. In this example, one would have to scale the adjustment down to 0 at 0 and also down to 0 at 300. Accepting this as an upward adjustment, the question then is: by how much? By the way, in this example, those candidates who obtained 0 or 300 did not have their marks adjusted at all! Is it then fair to those candidates for one to claim that their marks were adjusted upwards?

In some instances, different levels of adjustment are effected in different parts of the mark distribution. For example, based on all the evidence available for subject K, one can scale from 0 at 0 to -12 at 163, then scale to -3 at 190, then scale to 0 at 240, and then accept raw marks from 240 to 300. Would one report this as a downward adjustment? By how much? Note that candidates whose raw marks for this subject are 240 or above have not been adjusted at all.
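To illustrate the arithmetic of such a piecewise adjustment, the sketch below (Python; an illustration only, not Umalusi's software) applies the subject K schedule just described - 0 at 0, -12 at 163, -3 at 190, 0 at 240 and raw marks from 240 to 300 - to a handful of invented raw marks, using straight-line interpolation between the stated points and keeping every adjusted mark within 0 and 300.

import numpy as np

def adjust(raw_marks, knots):
    # Interpolate the adjustment at each raw mark from the (raw mark, adjustment)
    # knots, add it to the raw mark, and keep the result within 0 and 300.
    xs, ys = zip(*knots)
    return np.clip(raw_marks + np.interp(raw_marks, xs, ys), 0, 300)

# The subject K schedule described above.
subject_k = [(0, 0), (163, -12), (190, -3), (240, 0), (300, 0)]

raw = np.array([0, 80, 163, 190, 240, 270, 300])
print(adjust(raw, subject_k))  # candidates at 240 and above keep their raw marks

Note how the size of the adjustment varies with the raw mark, which is precisely why a single figure such as "-12" would misreport what was actually done.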

After all subjects have been standardised, the Assessment Body then implements the standardisation decisions in respect of the written component of each subject and the associated CASS mark. At this stage Umalusi has concluded its standardisation exercise. The Assessment Body can then work out the pass rates per subject and also the cohort overall pass rate. Umalusi only gets to know the pass rates after it has concluded its standardisation exercise. No further changes or adjustments are done to the learner marks after the standardisation process has been completed.

I hope this gives you some sense of how standardisation is done. In conclusion, let me address some specific questions which have been raised in the public debate regarding matric results.

(a) Does Umalusi meet with government and look at the pass rates on raw examination marks?

No. Umalusi does not meet with any of the Assessment Bodies (including DBE) to look at pass rates on raw scores. As mentioned above, pass rates on raw examination marks are not calculated before the standardisation has been done.

(b) Does Umalusi adjust marks to achieve a particular pass rate?

No. Standardisation is used to achieve comparability and consistency from one year to the next. It is meant to ensure that a cohort of learners is not advantaged or disadvantaged by factors other than their knowledge of the subject and their aptitude. When Umalusi standardises learner performance in each subject, it does not concern itself with the overall pass rate. Pass rates are not even part of the deliberations in the Committee's work.

(c) Is Umalusi subjected to political pressure to achieve a particular pass rate?

No. Umalusi is a fiercely independent statutory body which discharges its mandate in terms of GENFETQA Act 58 of 2001 (as amended in 2008). The members of the Assessment Standards Committee are independent professionals who would never play into any political games. Their own personal integrity would be severely compromised if they pandered to political whims or pressures. Umalusi standardises results of the Independent Examination Board (IEB) and those of the Eksamen Raad vir Christelike Onderwys (ERCO) in exactly the same way it standardises DBE or DHET examination results.

9. General principles applied in the standardisation of the examination marks

(Extract from the Umalusi document, "Requirements and Specifications for the Standardisation, Statistical Moderation and Resulting of the NSC and NC(V)", page 19)

These principles are applied in order to achieve the purpose of standardisation (a simple illustrative check of some of these limits follows the extract):

5.4.1 In general no adjustment should exceed 10% of the Historical Average (Norm)

5.4.2 In the case of the individual candidate, the adjustment effected should not exceed 50% of the mark obtained by the candidate

5.4.3 If the distribution of the raw marks is below the Historical Average, the marks may be adjusted upwards, subject to the limitations in 5.4.1 to 5.4.3

5.4.4  If the distribution of the raw marks is above the Historical Average, the marks may be adjusted downwards, subject to the limitations in 5.4.1 to 5.4.3

5.4.5 In all the above cases 5.4.1 - 5.4.4, the result of the adjustments may not exceed the maximum mark or less than zero of a subject or subject component

5.4.6  The computer adjusted mark is calculated based on the above principles

5.4.7 Raw marks would generally be accepted for subjects with small enrolments

5.4.8 Umalusi, however retains the right to amend these principles where and when deemed to be necessary based on sound educational principles
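As a simple illustration of principles 5.4.1, 5.4.2 and 5.4.5, the sketch below (Python; an illustration only, not Umalusi's implementation) checks a single candidate's proposed adjustment on a 300-mark paper, reading the 10% limit as 30 marks in line with the parenthetical example given earlier, and the 50% limit as half of the candidate's own raw mark.

MAX_MARK = 300

def within_principles(raw_mark, adjusted_mark):
    change = adjusted_mark - raw_mark
    if abs(change) > 0.10 * MAX_MARK:      # 5.4.1: at most 10% (30 marks here)
        return False
    if abs(change) > 0.50 * raw_mark:      # 5.4.2: at most 50% of the candidate's raw mark
        return False
    return 0 <= adjusted_mark <= MAX_MARK  # 5.4.5: stay between zero and the maximum mark

print(within_principles(80, 92))  # +12 on a raw mark of 80: allowed
print(within_principles(11, 23))  # +12 would more than double a raw mark of 11: not allowed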

There are three important pieces of information that inform the standardisation of each subject. These are

(i) Qualitative information provided by external moderators, chief markers and our research team that does comparative analyses of question papers of different years.

(ii) Pairs analysis - comparing learner performance in the subject under consideration to their performance in other subjects.

(iii) Historical performance in the subject as captured in the historical means of the previous cohorts, distinction rates and failure rates in the subject.

The standardisation process is not a purely statistical process; educational considerations play a significant role in arriving at the final decision.

In instances where raw examination marks have been accepted, it is because all these pieces of information confirm that the learner performance in that subject is valid, credible and in line with the historical performance in the subject. In subjects with low enrolments, we generally accept raw marks.

For the standardisation of the National Senior Certificate, we use the ogive (or cumulative frequency) curves to represent the cumulative frequency of learner marks. In constructing the ogive curve, we use the 'less than or equal to' method. It is the ogive curve that gives one a quick impression of the cohort's performance in relation to the historical learner performance in the subject. For the NATED subjects (N1-N3) we generally use the Kolmogorov-Smirnov (KS) goodness-of-fit test, and for the NC(V) we use the Moon Method, proposed by Prof Moon Moodley, a member of Umalusi's Assessment Standards Committee.
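As an illustration of the 'less than or equal to' method, the sketch below (Python, with a handful of invented marks for a 300-mark paper; Umalusi's own systems are naturally more elaborate) computes the cumulative percentage of candidates at or below each mark - the ogive. A curve lying to the right of the historical curve means that more candidates obtained high marks (better performance, possibly an easier paper); a curve to the left suggests a harder paper.

import numpy as np

# Hypothetical raw marks for a 300-mark paper (illustration only).
marks = np.array([98, 112, 145, 160, 175, 187, 203, 220, 230, 260])

mark_points = np.arange(0, 301)  # every possible mark: 0, 1, ..., 300
cumulative_counts = np.array([(marks <= m).sum() for m in mark_points])
ogive = 100.0 * cumulative_counts / marks.size  # cumulative % of candidates at or below each mark

print(ogive[150], ogive[300])  # 30.0 and 100.0 for the marks above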

Let me reiterate the point that subjects are standardised individually, in a linear and non-iterative manner.

The questions that have sustained the public debate on the 2010 NSC results are: which subjects were adjusted upwards, and which were adjusted downwards? As you will discover shortly, the answers to these questions are not as simple as one might think.

10. We now report on the standardisation decisions of the 2010 NSC results

ERCO

SUBJECT

COMMENT

ADJUSTMENT

1

Accounting

The 2010 raw mean of 32.9% indicates poor performance compared to the 2009 adjusted mean of 45.48%. There was a problem with the question paper: the paper written by candidates was not the one approved by Umalusi - it was an earlier version. In order not to disadvantage candidates, an upward adjustment was effected, bringing the mean to 38.47%.

 0 at 0 scale to +18 at 72, block 18 at 222 and scale to 0 at 300

2

Afrikaans First Additional Lang

Question papers approved - compliant with SAGs. 2010 raw mean of 49.78% accepted. Raw mean of 46.32% accepted in 2009.

Raw

3

Afrikaans Home Language

Question papers approved - compliant with SAGs. 2010 raw mean of 52.31% accepted. Raw mean of 52.85% accepted in 2009.

Raw

4

Agricultural Management Practice

Question papers approved - compliant with SAGs. 2010 raw mean of 22.98% accepted. Raw mean of 22.51% accepted in 2009.

Raw

5

Agricultural Sciences

Question papers approved - compliant with SAGs. 2010 raw mean of 58.00% accepted. Raw mean of 54.00% accepted in 2009.

Raw

6

Agricultural Technology

no candidates

*****

7

Business Studies

Question papers approved - compliant with SAGs. The 2010 raw mean of 36.41% accepted. In 2009 the raw mean of 30.61% was adjusted to 32.61%. Good learner performance.

Raw

8

Civil Technology

Candidates wrote DBE paper therefore DBE adjustment applied. (11 candidates)

 0 at 0; scale to -15 at 30, scale to -8 at 248, scale to 0 at 300

9

Computer Application Tech

Question papers approved - compliant with SAGs. 2010 raw mean of 48.07% accepted. Raw mean of 41.55% accepted in 2009.

Raw

10

Consumer Studies

Candidates wrote DBE paper therefore DBE adjustment applied. (19 candidates)

Block +4  

11

Dramatic Arts

Candidates wrote DBE paper therefore DBE adjustment applied. (5 candidates)

Raw

12

Economics

Question papers approved - compliant with SAGs. 2010 raw mean of 36.95% accepted. Raw mean of 44.88% accepted in 2009. The difference in means is attributed to a 100% increase in enrolment.

Raw

13

Electrical Technology

Question papers approved - compliant with SAGs. 2010 raw mean of 25.70% accepted. Raw mean of 38.83% accepted in 2009. (20 candidates)

Raw

14

Engineering Graphics and Design

Question papers approved - compliant with SAGs. 2010 raw mean of 23.66% accepted. Raw mean of 24.44% accepted in 2009.

Raw

15

English First Additional Lang

The 2010 raw mean of 53.22% is in line with the accepted 2009 raw mean of 53.01%. However, there were 0 distinctions in 2010, an indication that the paper did not differentiate sufficiently at the top end. The adjustment resulted in a mean of 54.90%.

0 at 0, scale to +6 at 234 and scale back to 0 at 300

16

English Home Language

The 2010 raw mean is 47.02%, down from the accepted 2009 raw mean of 49.19%. This could be attributed to the steep increase in numbers from 84 to 176. The scaled adjustment moves the mean to 49.74%.

0 at 0 to scale up to +9 at 111, scale down to 0 at 300

17

French 2nd Additional Language

Candidates wrote IEB  paper therefore IEB adjustment applied. 

0 at 0 , +20 at 70 scale back to 0 at 100

18

Geography

Candidates wrote DBE paper therefore DBE adjustment applied. 

0 at 0, scale to +8 at 41, block +8 at 86 ; scale to 0 at 154, and then raw.                                          

19

History

Candidates wrote DBE paper therefore DBE adjustment applied. 

Raw

20

Hospitality Studies

Question papers approved - compliant with SAGs. 2010 raw mean of 53.27% accepted. Raw mean of 45.31% accepted in 2009. Good performance.

Raw

21

Information Technology

Candidates wrote DBE paper therefore DBE adjustment applied. 

Raw

22

IsiZulu First Additional

Candidates wrote DBE paper therefore DBE adjustment applied. 

Raw

23

Life Orientation

Life Orientation poses a particular challenge as there is no examination. The Committee considered the moderation reports and, in the absence of any compelling reason suggesting the need for an adjustment, accepted the raw mean of 59.60%. The raw mean accepted in 2009 was 68.54%.

Raw

24

Life Sciences

Question papers approved - compliant with SAGs. 2010 raw mean of 43.94% accepted. Adjusted mean of 38.65% accepted in 2009. Good performance.

Raw

25

Mathematics

With Mathematics becoming compulsory, all candidates are expected to take either Mathematics or Mathematical Literacy, hence the lower performance than in the previous SC higher grade. The need for admission to higher education also urges the choice of Mathematics as a subject. This, together with other systemic issues, results in performance in Mathematics that is consistently poor. In 2009 the adjustment resulted in an increase in the mean from 31.59% to 33.94%. The 2010 raw mean is 28.24%; the adjustment moves the mean to 34.59%.

0 at 0, scale to +27 at 63, scale to +12 at 228 and scale to 0 at 300

26

Mathematical Literacy

Question papers approved - compliant with SAGs. 2010 raw mean of 54.95% accepted. Adjusted mean of 54.60% accepted in 2009. Good performance.

Raw

27

Mathematics P3

Question papers approved - compliant with SAGs. 2010 raw mean of 36.59% accepted - in line with the accepted 2009 raw mean of 35.60%.

Raw

28

Mechanical Technology

Candidates wrote DBE paper therefore DBE adjustment applied. 

Raw

29

Physical Science

The 2010 raw mean of 33.58% is in line with the 2009 raw mean of 33.62%. The adjustment brings the mean to 36.50%, which is slightly higher than the 2009 adjusted mean of 35.91%.

0 at 0 to +12 at 78 and scale to +2 at 238 and 0 at 300

30

Portuguese 1st Additional Language

Candidates wrote IEB  paper therefore IEB adjustment applied. 

Raw

31

Religion Studies

Question papers approved - compliant with SAGs. 2010 raw mean of 48.23% accepted. Raw mean of 36.06% accepted in 2009. (26 candidates)

Raw

32

Sesotho 1st Additional Language

Candidates wrote DBE paper therefore DBE adjustment applied. (2 candidates)

Raw

33

Tourism

Question papers approved - compliant with SAGs. 2010 raw mean of 51.65% accepted. The 2009 raw mean was 49.38%.

Raw

34

Visual Arts

Candidates wrote DBE paper therefore DBE adjustment applied. 

Raw

IEB

IEB NSC 2010

Greater than 200 Candidates

SUBJECT

COMMENT

ADJUSTMENT

Accounting

Based on the learner performance, the paper was not sufficiently challenging at the top end. The 2010 raw mean of 60.21% shows a sharp increase when compared to the accepted 2009 raw mean of 57.01% and the accepted 2008 raw mean of 57.69%. The downward adjustment moves the mean to 58.98%. Modest downward adjustment at the middle to upper levels.

0 at 90, scale to -7 at 247; scale back to 0 at 300

Afrikaans First Additional Lang

The paper was more cognitively challenging than in the past. The 2010 raw mean of 54.29% is lower than the 2009 adjusted mean of 55.43% and the 2008 mean of 57.10%. The upward adjustment moves the mean to 56.24%, which is in line with the 2008 and 2009 means. High-enrolment subject.

0 at 0, scale to +6 at 84 , block +6 from 84 to 234, scale down to 0 at 300

Afrikaans Home Language

The 2010 raw mean of 63.90% was accepted as it is in line with the accepted 2009 raw mean of 63.44% and the 2008 raw mean of 66.67%.

Raw

Business Studies

Erratic performance due to the subject being quite different from the old Business Economics. In 2008 the mean was adjusted upwards from 48.12% to 50.35%. In 2009 the mean was adjusted downwards from 58.59% to 55.45%. The 2010 raw mean of 52.67% was adjusted to 55.06%.

0 at 0 scale up +6 at 84, scale to +9 at 231 , scale back to 0 at 300

Computer Application Technology

The 2010 raw mean of 59.47% was accepted as it is in line with the accepted 2009 raw mean of 59.00% and the 2008 raw mean of 60.39%.

Raw

Consumer Studies

The 2010 raw mean of 67.80% was accepted as it is in line with the accepted 2009 raw mean of 66.33% and the 2008 adjusted mean of 65.04%.

Raw

Design

The 2010 raw mean of 65.03% was accepted. The accepted 2009 raw mean was 62.89% and the 2008 raw mean 62.51%.

Raw

Dramatic Arts

The 2010 raw mean of 70.65% was adjusted to 70.15%. This is in line with the adjusted 2009 mean of 70.33%. The raw mean of 68.28% was accepted in 2008.

0 at 200 scale -4 at 244 and scale back to 0 at 300

Economics

The 2010 raw mean of 53.84% was accepted. The adjusted 2009 mean was 56.80% and the adjusted 2008 mean 60.59%.

Raw

Engineering Graphics and Design

Although the 2010 raw mean of 64.16% is similar to the 2009 and 2008 means, the question paper did not differentiate at either the bottom or the top end.

0 at 0, +12 at 78, scale to 0 at 200, scale to -10 at 250, scale back to 0 at 300

English First Additional Lang

The 2010 raw mean of 65.02% was accepted as it is in line with the accepted 2009 raw mean of 66.88% and the 2008 adjusted mean of 66.07%.

Raw

English Home Language

The 2010 raw mean of 65.73% was accepted as it is in line with the adjusted 2009 mean of 65.28% and the adjusted 2008 mean of 66.11%. The improved performance is due to specific interventions by the assessment body - workshops around the country.

Raw

Geography

The 2010 raw mean of 65.17% was accepted as it is in line with the adjusted 2009 mean of 65.95% and the accepted 2008 raw mean of 64.91%.

Raw

History

The 2010 raw mean of 61.94% was accepted as it is in line with the 2009 raw mean of 61.64%.

Raw

Information Technology

The 2010 raw mean of 65.84% was accepted as it is in line with the adjusted 2009 mean of 66.54% and the adjusted 2008 mean of 64.77%.

Raw

IsiZulu First Additional Language

The 2010 raw mean of 66.89% was accepted as it is in line with the adjusted 2009 mean of 67.69%. The raw mean of 70.35% was accepted in 2008. Good performance - the subject is taken mainly by IsiZulu mother-tongue candidates.

Raw

Life Orientation

The 2010 raw mean of 72.60% accepted

Raw

Life Sciences

The 2010 raw mean of 66.45% was accepted as it is in line with the 2009 raw mean of 66.23%. In 2009 that mean was adjusted to 63.25%. Performance stabilising.

Raw

Mathematical Literacy

It is difficult to detect a trend in this subject; performance is affected by strong candidates electing to take, or switch to, Maths Lit. The 2010 raw mean of 78.64% was adjusted to 68.74%. The 2009 raw mean of 73.82% was adjusted to 66.98%.

0 at 0, scale to -30 at 60, block of -30 from 60 to 281. Scale to 0 at 300

Mathematics

The 2010 raw mean of 59.80% was adjusted to 61.15%. In 2009 the raw mean of 63.33% was accepted.

0 at 0 to +6 at 105 , block +6 from 105 to 144, scale back to 0 at 300

Mathematics P3

The 2010 raw mean of 65.75% was accepted as it is in line with the accepted 2009 raw mean of 64.71%. In 2008 the raw mean of 62.00% was accepted. Steady improvement in performance.

Raw

Physical Science

Pairs analysis suggests that performance is low as compared to other subjects. The 2010 raw mean of 57.98% was adjusted to 59.69%. In 2009 the raw mean of 53.04% was adjusted to 58.56%.

0 at 90 scale to +6 at 144, block  of +6 from 144 to 234, scale back to 0 at 300

Tourism

The 2010 raw mean of 54.50% was accepted. The improved performance brings the mean in line with the 2008 adjusted mean of 55.94%.

Raw

Visual Arts

The 2010 raw mean of 60.75% adjusted to 60.88%

0 at 200 + 2 at 238, scale back to 0 at 300

Less than 200 Candidates

Dance Studies

The 2010 raw mean of 68.55% accepted. In 2009 the raw mean of 70.02% was  accepted.   In 2008 the raw mean of 73.65% accepted. 

Raw

Hospitality Studies

The 2010 raw mean of 55.32% accepted. In 2009 the raw mean of 58.63% was  accepted.   In 2008 adjusted  mean of 55.01% accepted. 

Raw

Music

The 2010 raw mean of 69.85% accepted. In 2009 the raw mean of 69.10% was accepted. In 2008 the raw mean of 67.00% was accepted.

Raw

Sepedi First Additional Language

The 2010 raw mean of 62.26% accepted. In 2009 the raw mean of 65.38% was  accepted.   In 2008 the raw mean of 69.94% accepted. 

Raw

Sesotho First Additional Language

The 2010 raw mean of 61.72% accepted. In 2009 the raw mean of 63.30% was  accepted.   In 2008 the raw mean of 61.34% accepted. 

Raw

Setswana First Additional Language

The 2010 raw mean of 61.95% accepted. In 2009 the raw mean of 63.50% was accepted. In 2008 the adjusted mean of 65.05% was accepted.

Raw

Siswati First Additional Language

The 2010 raw mean of 73.72% accepted. In 2009 the raw mean of 77.47% was  accepted.   In 2008 the raw mean of 78.28% accepted. 

Raw

Siswati Home Language

The 2010 raw mean of 76.00% accepted. In 2009 the raw mean of 75.30% was  accepted.   In 2008 the raw mean of 74.71% accepted. 

Raw

IsiXhosa First Additional language

The 2010 raw mean of 71.61% accepted. In 2009 the raw mean of 73.99% was  accepted.   In 2008 the raw mean of 76.17% accepted. 

Raw

IsiZulu Home Language

The 2010 raw mean of 69.39% accepted. In 2009 the raw mean of 77.22% was  accepted.   In 2008 adjusted  mean of 84.08% accepted. 

Raw

Non-Official Languages and Other

Arabic 2nd Additional Lang

Question paper did not differentiate adequately at the bottom end. The 2010 raw mean was adjusted from 49.68% to 53.66%

0 at 0 , +20 at 70 scale back to 0 at 300

French 2nd Additional Lang

Question paper did not differentiate adequately at the bottom end.  The 2010 raw mean was adjusted from 62.17% to 65.38%

0 at 0 , +20 at 70 scale back to 0 at 300

German Home Language

The 2010 raw mean of 62.65% accepted. In 2009 the raw mean of 60.71% was  accepted.   In 2008 the raw mean of 64.88% accepted. 

Raw

German 2nd Additional Language

Question paper did not differentiate adequately at the top  end- extremely high distinction rate. The 2010 raw mean was adjusted from 68.27% to 67.47%

0 at 200,-7 at 247, scale back to 0 at 300

Gujarati 2nd Additional Lang

The 2010 raw mean of 89.44% accepted. In 2009 the raw mean of 90.00% was accepted. In 2008 the raw mean of 71.56% was accepted.

Raw

Hebrew 2nd Additional Lang

Question paper did not differentiate adequately at the top  end- extremely high distinction rate. The 2010 raw mean was adjusted from 73.97% to 72.84%

0 at 200, -7 at 247, scale to 0 at 300

Hindi 2nd Additional Lang

The 2010 raw mean of 57.16% accepted. In 2009 the raw mean of 53.22% was  accepted.   In 2008 the raw mean of 66.96% accepted. 

Raw

Italian 2nd Additional Lang

The 2010 raw mean of 57.87% accepted. In 2009 the raw mean of 58.44% was  accepted.   In 2008 the raw mean of 58.05% accepted. 

Raw

Latin 2nd Additional Lang

The 2010 raw mean of 64.35% accepted. In 2009 the raw mean of 55.82% was  accepted.   In 2008 the raw mean of 55.80% accepted. 

Raw

Modern Greek 2nd Additional Lang

The 2010 raw mean of 71.15% accepted. In 2009 the raw mean of 72.55% was  accepted. 

Raw

Portuguese 1st Additional Lang

The 2010 raw mean of 34.24% accepted. In 2009 the raw mean of 20.43% was  accepted.   In 2008 the raw mean of 26.16% accepted. 

Raw

Portuguese 2nd Additional Lang

The 2010 raw mean of 54.50% accepted. In 2009 the raw mean of 55.81% was  accepted.   In 2008 the raw mean of 54.99% accepted. 

Raw

Spanish 2nd Additional Lang

The 2010 raw mean of 64.89% accepted. In 2009 the raw mean of 64.64% was  accepted.   In 2008 the raw mean of 73.46% accepted. 

Raw

Tamil 2nd Additional Lang

The 2010 raw mean of 59.35% accepted. In 2009 the raw mean of 62.02% was  accepted.   In 2008 the raw mean of 74.06% accepted. 

Raw

Telugu 2nd Additional Lang

The 2010 raw mean of 88.50% accepted. In 2009 the raw mean of 82.00% was  accepted.   In 2008 the raw mean of 89.00% accepted. 

Raw

Urdu Home Language

The 2010 raw mean of 15.00% accepted. In 2009 the raw mean of 10.33% was  accepted. 

Raw

Advanced Programme Mathematics

The 2010 raw mean of 59.94% was adjusted to 64.45%. The lower 2010 mean is attributed to the steep increase in enrolments (817 to 1067).

0 at 0, +15 at 150, block +15 to 240, scale to 0 at 300

Equine Studies

The 2010 raw mean of 75.56% accepted ( 6 candidates enrolled)

Raw

Sport and Exercise Science

Subject examined for the first time. Poor performance. The 2010 raw mean of 36.67% was adjusted to 52.55%.

0 at 0, +15 at 30, block +15 at 200, scale to 0 at 300

DBE

DBE NSC 2010

SUBJECT

Justification of adjustment

ADJUSTMENT

Accounting

The 2010 paper was cognitively more challenging than in previous years. Managerial accounting, managing resources, auditing and ethics required more focus on application and problem-solving questions as a result of the gradual implementation of the NCS / new curriculum content. The scaled adjustment increased the 2010 mean from 27.92% to 33%, which is consistent with the 32% in 2008 and 33% in 2009.

0 at 0,  scale to +23 at 67, scale to +6 at 234, scale to 0 at 300            

Afrikaans FAL

The 2010 raw mean of 51.16% is in line with the accepted raw mean of 50.33% in 2009.  Raw mean of 47.78% accepted in 2008.  Performance stabilising well.

Raw

Afrikaans HL

The 2010 raw mean of 57.79%  shows an increase from the 2009 mean of 56.42% and the 2008 mean of 53.77%. Performance stabilising well.

Raw

Afrikaans SAL

The 2010 raw mean of 46.79% shows an increase from the accepted 2009 raw mean of 42.66%. The question paper complied with the requirements of the SAGs and the internal moderator reports do not suggest that the paper was easier.

Raw

Agricultural Management Practices

The 2010 raw mean of 49.15% is in line with the adjusted 2009 mean of 51.36%. Adjustments not significant - small enrolment of 1127.

Raw

Agricultural Sciences

The paper was found to meet the requirements of the SAGs, but learner performance indicates that it was easier than in previous years (the 2010 raw mean of 34.10% is significantly higher than past performance). A raw mean of 28.43% was accepted in 2009. A scaled downward adjustment was therefore effected, which moved the mean to 31%. This adjustment keeps the increase in the mean consistent at 3% per year.

0 at 0, scale to -9 at 18, block -9 from 18 to 220, scale down to 0 at 240 and raw to 300

Agricultural Technology

The 2010 raw mean of 49.04% is in line with the accepted 2009 raw mean of 52.52% and the accepted 2008 raw mean of 50.04%. Adjustments not significant - small enrolment of 534.

Raw

Business Studies

The 2010 raw mean of 36.48% is in line with the accepted 2009 raw mean of 36.97% and the accepted 2008 mean of 36.22%. From the means it is evident that the examining panel has succeeded in maintaining a good standard of question paper.

Raw

Civil Technology

In 2010 the paper was approved in terms of the SAG requirements; however, the raw mean of 48.47% indicated a far better performance than in previous years. A raw mean of 41.9% was accepted in 2008 and a raw mean of 41.3% in 2009. The scaled downward adjustment brought the 2010 mean from 48.47% down to 44.7% (approximately 3% higher than the 2009 mean).

 0 at 0; scale to -15 at 30, scale to -8 at 248, scale to 0 at 300

Computer Application Tech

The 2010 raw mean of 45.13% is in line with the accepted 2009 raw mean of 43.90% and the adjusted 2008 mean of 45.49%.

Raw

Consumer Studies

In 2008 a raw mean of 41.28% was accepted; in 2009 a raw mean of 45.87% was accepted. The 2010 raw mean dropped to 42.17%, which can be attributed to the question paper: greater coverage of higher-order questions, and use of technical subject terminology and language above second-language-speaker level. The upward adjustment increases the mean to 43.5%, which is still 2.3% lower than in 2009.

Block +4  

Dance Studies

The 2010 raw mean of 54.67% is in line with the accepted 2009 raw mean of 53.55% and the accepted 2008 raw mean of 53.63%. Adjustments not significant - small enrolment of 430.

Raw

Design

The 2010 raw mean of 58.49% is in line with the accepted 2009 raw mean of 56.29% and the accepted 2008 raw mean of 55.85%. Steady increase in the mean.

Raw

Dramatic Arts

A raw mean of 59.17% was accepted in 2008 and a raw mean of 56.64% in 2009. The 2010 accepted raw mean of 59.31% is in line with the 2008 mean.

Raw

Economics

The 2010 raw mean of 36.76% was accepted. This mean was higher than the adjusted 2009 mean of 33.07% and the adjusted 2008 mean of 34.13%.

Raw

Electrical Technology

In 2009 a raw mean of 38.97% was accepted, whereas in 2010 the raw mean increased to 45.13%. The scaled downward adjustment brings the mean to 43.1%.

0 at 0, scale -6 at 46, block -6 to 226, scale to 0 at 240 and raw up to 300

Engineering Graphics and Design

The 2010 raw mean of 50.73% was accepted. This mean was higher than the adjusted 2009 mean of 47.95% and the accepted raw 2008 mean of 47.91%

Raw

English HL

In 2008 a raw mean of 53.7% was accepted and in 2009 a raw mean of 54.44%; the 2010 raw mean is 53.98%. The slight upward adjustment of +3 brought the mean to 54.32%, in line with the previous means. The higher failure rate is also an indication that candidates are taking English as a home language, although it is not their mother tongue, primarily in order to access higher education.

0 at 0 to +3 at 117,  scale back to 0 at 171, thereafter raw to 300

English First Additional Lang FAL

In 2008 a raw mean of 45.21% was accepted and in 2009 a raw mean of 46.33%; the 2010 raw mean is 44.68%. The lower mean is attributed to the question papers: the qualitative reports indicate a leaning towards moderate and difficult questions, with Paper 2 (the literature paper) too long, containing moderate to difficult questions and limited choice. The block adjustment of +3 resulted in a mean of 45.68%, which is in line with the previous two years.

Block +3  

English Second Additional Lang

Raw mean of 48.27% accepted. Only 6 candidates

Raw

Geography

In 2008 an adjusted mean of 35.8% (from 33.05%) was accepted, and in 2009 an adjusted mean of 35.07% (from 31.96%); the 2010 raw mean is 33%, which is similar to the 2008 raw mean. A scaled adjustment was effected to be consistent with previous years, resulting in a mean of 34.8%.

0 at 0, scale to +8 at 41, block +8 at 86 ; scale to 0 at 154, and then raw.                                          

History

The 2010 raw mean of 38.95% is in line with the accepted 2009 raw mean of 37.40%. The accepted 2008 raw mean was 33.99%. Subject stabilising well.

Raw

Hospitality Studies

In 2008 the raw mean of 44.51% was accepted. The mean rose sharply in 2009 to 53.67% and was therefore adjusted downwards to 49.32%.  The 2010 raw mean of 52.69% was accepted as it is in line with the 2009 performance. In retrospect, the 2009 adjustment could be viewed as harsh.

Raw

Information Technology

The 2010 raw mean of 56.02% was accepted. This mean is higher than the accepted raw 2009 mean of 52.25% and the adjusted 2008 mean of 55.36%

Raw

IsiNdebele HL

The IsiNdebele HL papers (one of four African Home Languages adjusted downwards by -6) were of lower cognitive demand in comparison with the other home languages. The downward adjustment of -6 resulted in the mean decreasing from 65.28% to 63.28%.

Block - 6

IsiNdebele FAL

 

******

IsiNdebele SAL

 

******

IsiZulu HL

The IsiZulu HL papers (one of four African Home Languages adjusted downwards by -6) were of lower cognitive demand in comparison with the other home languages. In 2009 a raw mean of 60.02% was accepted, whereas the 2010 raw mean showed a sharp increase to 65.51%. The downward block adjustment of -6 resulted in the mean decreasing from 65.51% to 63.51%.

Block - 6

IsiZulu FAL

The 2010 raw mean of 74.21% is in line with the accepted 2009 raw mean of 74.27%. The accepted 2008 raw mean was 72.23%. Subject stabilising well.

Raw

IsiZulu SAL

2010 raw mean of 74.20% accepted. 10 candidates enrolled.

Raw

IsiXhosa HL

The 2010 raw mean of 60.24%, although lower than the accepted 2009 raw mean of 62.16%, was accepted. No upward adjustment was considered as post-examination research suggests the 2010 paper was easier than the 2009 paper.

Raw

IsiXhosa FAL

In 2008 the raw mean was adjusted from 59.79% to 60.77%; in 2009 a raw mean of 60.9% was accepted, whereas in 2010 there was a sharp increase in the raw mean to 65.69%. The high mean could be attributed to the concentration of candidates in a particular region and the fact that candidates tend to take English at home language level and their mother tongue at first additional language level. The block adjustment of -6 resulted in a mean of 63.69%.

block -6

IsiXhosa SAL

2010 raw mean of 61.33% accepted. 198 candidates enrolled.

Raw

Life Orientation

Life Orientation poses a particular challenge as there is no examination. The committee considered the moderation reports and in the absence of any compelling reason suggesting the need for an adjustment, accepted the raw mean of 63.24%.

Raw

Life Sciences

In 2008 a raw mean of 35.15% was accepted and in 2009 a raw mean of 34.84%, whereas in 2010 there was a sharp increase in the raw mean to 41.60%. The qualitative evaluation of the papers indicated a leaning towards easier and moderate questions, which meant that the 2010 papers were of a lower cognitive demand than in 2008 and 2009. The papers were less demanding and did not contain sufficient application questions. The scaled downward adjustment of -9 brought the mean to 38%, which is still higher than in previous years.

0 at 0 ; scale to -9 at 23; block -9 from 23 to 286 and scale down to 0 to 300

Mathematics

With Mathematics becoming compulsory, approximately half of the candidate population (296 000) took Mathematics, hence the lower performance than in the previous SC higher grade. The need for admission to higher education also urges the choice of Mathematics as a subject. This, together with other systemic issues, results in performance in Mathematics that is consistently poor. In 2008 the adjustment resulted in an increase in the mean from 22.69% to 28.26%; in 2009 the adjustment brought about an increase from 20.79% to 27.3%. The raw means of 2008 and 2009 confirmed the qualitative reports, which indicated that the 2009 papers were of higher cognitive demand and more compliant with the SAGs than in 2008. The 2010 raw mean of 23.66% is similar to the 2008 raw mean. The 2010 papers were compliant with the SAG requirements. The scaled adjustment in 2010 resulted in a mean of 28.6%, which is aligned with the means of previous years. The pairs analysis reveals that the Mathematics question papers were significantly more difficult than those of other subjects.

 0 at 0, scale to +22 at 68, scale down to +15 at 225, scale to 0 at 300 

Mathematical Literacy

Approximately half of the candidate population (290 000) took Mathematical Literacy. In 2009 the raw mean of 39.59% was accepted. The 2008 raw mean of 45.20% was adjusted to 41.87%. The 2010 raw mean is 43.73% - higher than in 2009 but lower than in 2008. The 2010 raw mean was accepted to lock in the improvement in performance. The increase in performance could be attributed to learners being channelled away from Mathematics and into Maths Lit. Maths Lit is a new subject and will therefore require some time to stabilise - it should not be equated with the previous Maths SG.

Raw

Mathematics P3

The 2010 raw mean of 45.85% was accepted. The adjusted mean for 2009 was 41.06% and for 2008 was 40.77%. The higher mean was accepted due to the significant drop in the number of candidates taking Maths P3 (11 800 to 9 400).

Raw

Mechanical Technology

The 2010 raw mean of 42.07% is in line with the accepted 2009 raw mean of 41.25%. The adjusted 2008 mean was 39.66%. Subject stabilising well.

Raw

Music

The 2010 raw mean of 56.73% is in line with the accepted 2009 raw mean of 55.91%. The adjusted 2008 mean was 55.38%. Subject stabilising well.

Raw

Physical Science

The 2010 raw mean of 30.26% was accepted as this is in line with the adjusted 2008 mean of 30.33%. The adjusted 2009 mean was 25.17%. The 2007 mean (constructed HG-SG norm) is 28.85%.  The improvement from 2009 to 2010 is attributed to the following interventions: The 2010 question papers were much clearer (less wordy) and therefore accessible to candidates at all levels, particularly second language learners. The DBE issued revised examination guidelines  after Umalusi research indicated that the Physical Sciences curriculum  was too broad.

Raw

Religion Studies

In 2008 a raw mean of 47.00% was accepted and in 2009 a raw mean of 49.17%, whereas in 2010 there was a sharp increase in the raw mean to 54.31%. The sharp increase in performance indicated that the papers were less demanding, which is confirmed by the pairs analysis. The scaled downward adjustment brought the mean from 54.31% to 50%, which is aligned with previous years.

0 at 0 scale to -12 at 24; block -12 to 291 and scale down to 0 at 300

Sepedi HL

The 2010 raw mean of 61.23% was accepted. The accepted 2009 raw mean was 57.90% and the accepted 2008 raw mean 59.09%. Subject stabilising well.

Raw

Sepedi FAL

The 2010 raw mean of 60.64% accepted as this is in line with the accepted raw 2009 mean of 61.62% and the accepted raw 2008 mean of 62.57%.  

Raw

Sesotho HL

The qualitative analysis indicated that the papers did not discriminate at the top end of the performance levels. The 2010 raw mean of 57.86%, with a raw distinction rate of 0.28%, compares to the 2009 mean of 54.65%. The scaled adjustment of +4 moves the mean to 57.89%.

 0 at 0, 0 at 210, scale to +4 at 236, scale down to 0 at 300

Sesotho FAL

The 2010 raw mean of 54.11% accepted as this is in line with the accepted raw 2009 mean of 56.57% and the 2008 mean of 56.20%.  

Raw

Sesotho SAL

The 2010 raw mean of 65.25% accepted. Only 165 candidates enrolled. In 2009 the raw mean of 70.36% was accepted and in 2008 the raw mean of 72.15% was accepted.

Raw

Setswana HL

The qualitative analysis indicated that the papers did not discriminate at the top end of the performance levels. The 2010 raw mean of 58.92% compares to the 2009 mean of 58.27%. The scaled adjustment moves the mean to 58.98%.

 Raw up to 210; scale to +7 at 233;  scale to 0 at 300

Setswana FAL

The 2010 raw mean of 58.10% accepted as this is in line with the accepted raw 2009 mean of 58.94% and the accepted 2008 raw mean of 57.98%  

Raw

Setswana SAL

The 2010 raw mean of 59.95% accepted. Only 27 candidates enrolled. In 2009 the raw mean of 61.48% was accepted and in 2008 the raw mean of 65.83% was accepted.

Raw

Siswati HL

In 2008 a raw mean of 57.52% was accepted, whereas the sharp increase in performance in 2009 to a raw mean of 64.75% was adjusted downwards to a mean of 60.75%. The Siswati HL papers (one of four African Home Languages adjusted downwards by -6) were of lower cognitive demand in comparison with the other home languages. The downward adjustment of -6 resulted in the 2010 mean decreasing from 65.07% to 63.07%.

Block - 6

Siswati FAL

The 2010 raw mean of 67.53% accepted. Only 285 candidates enrolled. In 2009 the raw mean of 66.30% was accepted and in 2008 the raw mean of 73.05% was accepted.

Raw

Tshivenda HL

The qualitative analysis indicated that the papers did not discriminate at the top end of the performance levels. In 2008 a raw mean of 61.56% was accepted, whereas the sharp increase in performance in 2009 to a raw mean of 68.04% was adjusted downwards to a mean of 64.04%. The scaled adjustment therefore resulted in the mean moving to 61.48%.

 0 at 0, 0 at 210 scale to +7 at 233 and scale to 0 at 300

Tshivenda FAL

The 2010 raw mean of 73.93% accepted. Only 15 candidates enrolled. In 2009 the raw mean of 75.19% was accepted and in 2008 the raw mean of 60.93% was accepted.

Raw

Tourism

The 2010 raw mean of 46.56% was accepted as it is in line with the accepted 2009 raw mean of 45.20%. The accepted 2008 raw mean was 41.73%. Subject stabilising well.

Raw

Visual Arts

The 2010 raw mean of 62.04% accepted. The  accepted raw 2009 mean was 58.83%. The  accepted raw 2008 mean was 56.50%.  Subject stabilising well.

Raw

Xitsonga HL

The qualitative analysis indicated that the papers did not discriminate at the top end of the performance levels. The 2010 raw mean is 66.92%; the 2009 accepted raw mean was 63.77%. The scaled adjustment resulted in a mean of 64.97%.

0 at 0,  0 at 120 scale to -6  at 148 and block of -6 to 272 and scale to 0 at 300

Xitsonga FAL

The 2010 raw mean of 64.27% accepted. Only 16 candidates enrolled. In 2009 the raw mean of 55.95% was accepted and in 2008 the raw mean of 63.80% was accepted.

Raw

Nautical Sciences

 

Raw

Maritime Economics

 

Raw

Issued by Umalusi, February 23 2011
