James Myburgh writes on farm murder fact-checking gone bad
Africa Check vs Politicsweb
In May 2017 Africa Check published a fact sheet on farm attacks and farm murders as well as an accompanying article by Kate Wilkinson on “Why calculating a farm murder rate in SA is near impossible”. In this latter article Wilkinson dismissed claims that the murder rate of (commercial) farmers could be as high as 100 per 100 000. Her argument was that it was impossible to tell given the massive uncertainty around the size of the population affected.
She noted that the total size of the population on commercial farms, according to a 2007 census, was 818,503 (including workers and others). “If this farming population is used, the farm murder rate for 2015/16 would be 5.6 murders per 100,000 people living and/or working on farms registered to pay value-added tax. However, the total population of people living on farms and smallholdings, which do not pay value-added tax, will be larger.”
She then cited the 2016 estimate in StatsSA’s Community Survey that there were 2.3 million households, with a population of 11 million, involved in agriculture. “If this figure is used”, she wrote, “the farm murder rate drops to 0.4 murders per 100,000 people who live on agricultural farms and smallholdings in South Africa.”
This Africa Check article and these possible estimates have, subsequently, been cited by journalists and fact-checkers around the world in rebuttal to those raising concerns (for good reasons or bad) around the continuing farm murder epidemic in South Africa.
In a critique of this and similar arguments, published on Politicsweb in May this year, I pointed out that it was quite possible to produce a reasonable estimate of the murder rate of white farmers and their family members at least (the predominant victims).
The identities of farm murder victims are recorded by various organisations, such as TAU SA, from which one can then work out their race. There had been 51 murders of “white farmers” in 2016/17, according to TAU SA’s data.
Establishing the size of the white population affected by farm attacks was more difficult. However, StatsSA’s 2016 Community Survey had quite fortuitously collected data, which I was able to access, on the number of white headed households involved in agricultural activities on farm land, including farms and smallholdings. This matches the universe of white South Africans affected by farm murders, in terms of the standard SAPS definition. The estimate from the Community Survey was that 47 218 households matched this description.
Using these two numbers I came up with an estimated murder rate for “white farmers” of 108 per 100 000 (3.2 times the national average). The article made further arguments which I will not deal with here.
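The arithmetic behind this headline figure is simple enough to sketch (a Python illustration using the 51 and 47 218 figures given above; the national rate of about 34 per 100 000 is an assumption, inferred from the “3.2 times” comparison rather than stated in the article):

```python
# Estimated murder rate of "white farmers", per 100 000, using the
# article's figures: 51 victims (TAU SA, 2016/17) over a population of
# 47 218 white headed farm households (Community Survey 2016).
victims = 51
households = 47_218

rate_per_100k = victims / households * 100_000
print(round(rate_per_100k, 1))   # 108.0

# Assumed national murder rate of roughly 34 per 100 000 (not stated in
# the article; implied by its "3.2 times the national average" claim).
national_rate = 34.0
print(round(rate_per_100k / national_rate, 1))   # 3.2
```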
The article was published on the 30th May 2018. On Thursday last week, 176 days later, Africa Check finally published their response - headed “Farm murder rate calculations should be transparent – Politicsweb’s isn’t” - again by Kate Wilkinson.
The essential thrust of Africa Check’s article is that Politicsweb’s piece in no way lived up to the high moral and ethical fact-checking standards that Africa Check apparently lives and works by. The core of the complaint is that, in the face of Africa Check’s demands, Politicsweb refused to “name the person” who sent the StatsSA community survey data “or provide evidence of the correspondence”. This was apparently in breach of sacred fact-checking principles, meant that the data could not be verified, and clearly hinted at something deeply nefarious.
Here it is worth noting that Politicsweb was perfectly transparent about where the figures came from: the Community Survey of 2016. The question when it comes to verification, if Africa Check really distrusted them, is a simple one: either the Community Survey’s estimate for white heads of households involved in agricultural activities on farm land is 47 218, or it is not. If it is the latter, StatsSA could easily prove this by providing the alternative, correct figure or table. Once the accuracy of the 47 218 figure is accepted, one could then move on to the entirely proper and bona fide discussion of the reliability of this estimate.
Instead, Wilkinson’s article launches into some Class A hokum. We are told that, according to StatsSA's PR people, “no absolute numbers” were published only “proportions”, the “statistical agency said they had no knowledge of Myburgh requesting the numbers”, and the table “he published is also not included in any of the community survey publications.” Most significantly of all, apparently, “It’s also impossible to extract the numbers Myburgh published with Stats SA’s SuperWEB2 data tool. The applicable options to produce the table are simply unavailable.”
Given all this misdirection the ordinary reader may fail to notice that at no point does StatsSA or Africa Check actually state that the 47 218 figure is wrong. The impression created instead is that extracting this number and a simple table from the CS 2016 data set is far beyond the powers of South Africa’s official statistics body. It is apparently only if Politicsweb names names that this great mystery can be cleared up. As Wilkinson puts it: “I can’t prove that Stats SA didn’t provide Myburgh with the absolute numbers. But if he comes clean on who gave him the data, we can further press the statistical agency on why they claim the numbers can’t be extracted.”
Mixed up with these complaints is the assertion that the estimates are not reliable. Here it is worth making some observations about StatsSA’s Community Survey. This survey is just one down from the census, which is an actual headcount of the population. The overall sample size was a massive 1 370 809 dwelling units. This means that roughly one in twelve households was contacted during the fieldwork for the survey. This sample size is ample for producing reliable estimates of a subpopulation the size of white households involved in agriculture on farm land, even by province.
In the survey respondents were asked whether they were involved in any kind of “agricultural activities”. This question was designed to cast a very wide net and covered everything from commercial agriculture, to subsistence farming, to people who may have kept a chicken or vegetable patch in their back yard. According to the official release this found that the number of households engaged in any kind of “agriculture” was “2,3 million in 2016 compared with 2,9 million in 2011”. Of these, 143 361 were white headed households, down from the 150 874 counted in the 2011 census.
Respondents were then asked about the main “place of agricultural activity”. The first option listed here was “Farm land (including commercial farm land and smallholdings).” They were then asked about the “main purpose of agricultural activity”: whether it was the main source of food or income, and so on.
One of the Community Survey releases contains the following table, cited by Africa Check, with the results in the form of percentages:
This shows that for 37,9% of white headed households involved in “agriculture” who answered the question, the main place of agricultural activity was farm land. Below the table is the note that the “figures above represent the proportions of all households” who responded to the question. In other words, a certain percentage of those who claimed to be involved in some kind of agricultural activity did not answer subsequent questions.
Wilkinson makes a meal of the fact that StatsSA published only proportions here, and that there was a non-response rate, which she suggests was massive. She reports a spokesperson for StatsSA as saying “they had low rates of response to a number of questions and, as a result, decided only to include proportions for the households that did answer.” She adds that “Because of this, Stats SA says you mustn’t take that 37.9%, apply it to the 143,361 white agricultural households and say there are 54,334 of them are on farmland. (Note: This is what BBC Reality Check did and we’ve written to them about it.)”
It is at this apparent high point in her argument that the whole logical and factual basis of Wilkinson’s article starts falling to pieces. A percentage is the ratio of two absolute numbers, a numerator and a denominator. The numerator here is the number of white headed households involved in agricultural activities on farm land (the number at issue in this debate), and the denominator is the number of white headed households involved in agricultural activities, minus a certain number of non-responses. If you can generate a table with percentages you can also generate it with absolute numbers.
Here then is exactly the same table but in the form of absolute numbers:
As can be seen the number of white heads of households in agricultural activities on farm land was estimated at 47 272. (The 47 218 figure we used was based on a table that also incorporated responses to the next question on the main purpose of agricultural activities.) The gap between those who said they were involved in any kind of agricultural activity and those who answered this further question was 18 646 or 13%. The response rate then was 87%.
It seems fairly sensible to assume, given how widely the net had been cast in the initial question on agricultural activities, that these non-responses would have come from those with the most tangential involvement therein. Presumably if you are a StatsSA fieldworker interviewing a (white) farmer on a farm or smallholding you are going to get answers to these questions. More conservatively, if you allocate the non-responses proportionally you get to the same 54 334 figure reached by the BBC’s Reality Check.
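Both routes to an estimate can be checked with a few lines of arithmetic (a Python sketch using only the figures already cited in this article; the small gap between the recovered number and the table’s 47 272 is rounding in the published 37,9%):

```python
# Figures cited in the article (StatsSA Community Survey 2016).
all_households = 143_361   # white headed households reporting any agricultural activity
non_responses = 18_646     # did not answer the "main place of activity" question
published_share = 0.379    # 37,9% of responders: main place was farm land

# Recover the absolute numerator behind the published proportion.
responders = all_households - non_responses
on_farm_land = round(responders * published_share)
print(responders, on_farm_land)   # 124715, ~47 267 (table: 47 272)

# Allocate the non-responses proportionally, as BBC Reality Check did.
proportional_estimate = round(all_households * published_share)
print(proportional_estimate)      # 54334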
For argument’s sake however let us take the most unlikely option and assume that every single non-response on this question was in fact from a farmer. How would this alter Politicsweb’s estimate of the murder rate of “white farmers” in 2016/17?
It would mean that there were an estimated 65 918 (47 272 + 18 646) white heads of households involved in agriculture on “farm land” in 2016. The figure of 51 murders of “white farmers” recorded in 2016/17 by TAU SA, which we used, would mean a murder rate of 77.4 per 100 000. This is still over twice the national murder rate of that year. It is also 14 times the 5.6 per 100 000 figure that Africa Check suggested as one possible farm murder rate, and 193 times the widely cited 0.4 per 100 000 rate the organisation also decided to just throw out there.
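This worst-case sensitivity check can be sketched as follows (Python, using the figures above):

```python
# Worst-case assumption: every single non-response was in fact a farm household.
responders_on_farm = 47_272
non_responses = 18_646
population = responders_on_farm + non_responses   # 65 918

victims = 51   # "white farmer" murders in 2016/17 (TAU SA)
rate = victims / population * 100_000
print(round(rate, 1))        # 77.4

# Even this floor remains a large multiple of the rates Africa Check floated.
print(round(rate / 5.6, 1))  # ~13.8 times the 5.6 per 100 000 figure
print(round(rate / 0.4))     # ~193 times the 0.4 per 100 000 figure
```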
To sum up then, Africa Check’s assertion that they needed to know how I came by the data I used in order to verify it is false. The suggestion that it was not possible to extract it from the CS 2016 data set is false. The claim that the response rate on the key number at issue was “low” is false. The suggestion that this would have a big effect on the estimates is false. The claim that “Myburgh’s calculations are not replicable, meaning you can’t check them and just have to trust his conclusions” is also false. The suggestion that the population estimates of the Community Survey “may not be representative” would be huge, if true.
I didn’t know that it was possible for a fact checking organisation to produce and then squeeze so many different dodgy claims into a 1 300 word article. But then again, as Africa Check notes, I do not know “how fact-checking and verification works.”
Africa Check’s conduct in response to Politicsweb’s criticism of their work raises a further issue. The protection of sources is absolutely fundamental to journalism, wherever this is practiced, and this is set out in innumerable codes of conduct for the profession. Take the Agence France Presse “Editorial Standards and Best Practices” as just one example:
Among the ten guiding principles of AFP is the following, at number three: “AFP journalists must protect the confidentiality of sources and must never knowingly put them in harm’s way.” This is emphasised again in the main body of the text under “Protection of sources”. This states: “Journalists have a duty to protect the identity of confidential sources … and should never knowingly put them at risk.” Furthermore “AFP journalists should never hand over their recordings, notes or images to a third party.”
These rules are particularly relevant here as Africa Check is in large part a project of its founding partner, the AFP Foundation, which is an initiative of AFP. Its Executive Director, Peter Cunliffe-Jones, is a veteran journalist who spent most of his career with AFP, rising to the position of chief editor for Asia and then serving as Deputy Director of the AFP Foundation. Robert Holloway, the chairman of the board, is another AFP veteran and currently director of the AFP Foundation. Another board member with AFP links is Boris Bachorz, currently AFP’s Director for Africa.
In other words, simply by demanding the “name” of my “source” and/or “evidence” of any “correspondence” with them, Africa Check was asking me to flout one of the most fundamental ethical principles of journalism, and to commit offences that would likely have got Cunliffe-Jones, Holloway and Bachorz (deservedly) sacked and drummed out of the profession had they ever committed them while working as reporters for the AFP.
It is strange then that this organisation, headed by such journalists, thought that this was a demand that it was appropriate to press on Politicsweb. Perhaps though the new journalistic enterprise of “fact checking” had somehow superseded the older journalistic code? This certainly was the impression created by the aggressive way in which Africa Check has demanded this information, first in private and then in public (!).
In her article Kate Wilkinson boasted that “Africa Check is a signatory of the International Fact-Checking Network’s Code of Principles. This means, among other things, that we are committed to the transparency of our sources and methods of research. The principles state that ‘signatories want their readers to be able to verify findings themselves. [They] provide all sources in enough detail that readers can replicate their work.’”
The argument is that by refusing to disclose to Africa Check how exactly I came by the data I cited in my “fact check” I was flouting these principles, and failing to act with the requisite “transparency”. If you go through to the original of the IFCN Code of Principles, though, the relevant section reads as follows in full:
“Signatories want their readers to be able to verify findings themselves. Signatories provide all sources in enough detail that readers can replicate their work, except in cases where a source’s personal security could be compromised. In such cases, signatories provide as much detail as possible.” (my emphasis)
Africa Check’s behaviour on this score then – apart from being entirely unnecessary – was completely improper and reckless.