Fairness and credibility in high stakes examinations

The assessment of the 11 “home languages” at the end of secondary school in South Africa is patently unfair. That is the finding of a recent investigation that Colleen du Plessis (UFS), Sanet Steyn (NWU) and I report on in an article just published in LitNet Akademies. The Grade 12 exit examinations are high stakes assessments, since the Home Language mark contributes disproportionately to the index on the basis of which access is granted to higher education (or entry into the world of work). They are unfair because they are not equivalent: in some languages one has a much better chance of passing than in others.

The investigation reveals many causes for the unfairness. One is that the final examinations have over time drifted away from the curriculum, and have in fact become the curriculum. Another is that the good intentions of the curriculum have been systematically misinterpreted. A third is that the examinations for some languages are simply much easier than those for others. All of these undermine the credibility of one of the highest stakes examinations in South Africa. The report argues that to achieve equivalence, the construct of what gets measured must be articulated anew before a range of possible new formats for the assessment is adopted. At present, for example, there is an unwillingness to make use of multiple choice formats, even though these would make marking substantially more reliable.

What is interesting to me from a theoretical point of view – since my main interest is in how systematic, foundational or philosophical insights relate to technically qualified design considerations – is that the design principles for a good language test work together: we cannot achieve equivalence without technical imagination and creativity, and we cannot ensure fairness without technically equivalent tests. To support those, we need consistent and adequate (‘valid’) measurement. And when it comes to accountability for our language test designs, a great deal of political will is needed to make fairness happen.

Though the report is in Afrikaans, there is an extended abstract of some 1500 words in English. There is also an interview with the three authors (one of whom was overseas) that may interest you. The project starts off with English, Afrikaans and Sesotho. Colleen’s study is the anchor study; Sanet, who has developed a Test of Advanced Language Ability (TALA) in English, will examine how it can be converted into an equivalent test in Afrikaans; and Johannes Mahlasela (also of NWU) will take the investigation further for Sesotho. It will be an uphill battle, and on many fronts!

4 thoughts on “Fairness and credibility in high stakes examinations”

  1. Andrew D. Cohen

    As someone who has now been working for five years on my 13th language, Mandarin, I can attest that I have had an easier time when assessed in some of my languages than in others. Of course, one issue is how close the language I am being tested on is to my native language – whether, for example, there are cognates. This would undoubtedly influence vocabulary items, where knowing the cognates in the first or dominant language (or some other well-known language) could help in selecting the correct multiple-choice alternative. Thom Upton and I found, in research we published in 2006 on the iBT TOEFL Reading Subtest, that certain reading vocabulary items favored those with a background in Romance languages. ETS acknowledged this but did not feel the need to stop having those items favor speakers of Romance languages. Their attitude was that this was simply a good reason to make sure you, as a test taker, know one or another Romance language!

  2. Albert Weideman Post author

    Wow, Andrew! Go! You’re right about language distance – recent studies of the Dutch national L2 exams, with more than 60,000 test takers, also ascribe differential performance on that exam to distance. And I think it’s a shame that ETS did not take your findings more seriously. In the case I posted about here, however, all secondary school learners are examined in their (or a) single home language. The challenge (which Sanet Steyn and Colleen du Plessis envisage dealing with in their PhDs) is making the equally high stakes Grade 12 exit examinations in South Africa equivalent across 11 different languages. And we all know that the moment you say ‘equivalence’, you’re in for a bumpy ride, conceptually and in test design.

  3. Mbongeni Malaba

    Thank you, Albert, for pointing out the disparities in the current set-up. These provide food for thought as we grapple with issues of language competence, or the lack thereof, in our students.

  4. Albert Weideman Post author

    Thanks, Mbongeni! And yes, the levels of academic literacy that affect us directly in the higher education sector are one result of the inadequacies lower in the system. We’ll talk some more about that on these pages at some stage!

