Open data and the politics of scholarly practice

It’s now a good few years since academic publishers put differential pricing arrangements in place for developing economies in Africa. And although a complex transition is still under way, a large number of academic publications are freely available through Creative Commons licences, as pre-prints in online repositories, or through the “gold” publication route, in which all costs and profits are recovered up front. So why, as Laura Czerniewicz asks in a recent and widely read article, does the north-south knowledge divide still persist?

As Laura points out, there are a number of contributing factors.

Firstly, it’s by no means all about access. Universities in the south lack research funds. Bandwidth may be available but it’s often constricted, unreliable and very expensive in comparison to the facilities that northern universities largely take for granted.

Secondly, while the editorial processes that scholarly journals use are central to the machinery of peer review, assurance and ethical practice, journals can also be gatekeepers that police disciplinary boundaries and provide reciprocal endorsements for scholarly cliques.

Thirdly, there is a cluster of epistemological issues: the assumption that “international knowledge” trumps “local” research, and that “grey literature” is of lesser value than papers written in scholarly genres.

Here, I’m interested in this third set of issues that Laura raises. In her words:

Our own perceptions of ‘science’ must be broadened to encompass the social sciences. … Research outputs need to be recognised as existing beyond the boundaries of the formal journal article. Incentives and reward systems need to be adjusted to encourage and legitimise the new, fairer practices that are made possible in a digitally networked world. And finally, the open access movement needs to broaden its focus from access to knowledge to full participation in knowledge creation and in scholarly communication.

To see how these changes could happen – how the long-established subservience of “southern knowledge” to northern paradigms in scholarly publishing can be challenged – it’s necessary to backtrack a little and take account of where digital publication is going.

We all know how online access has transformed the music and film industries. The same is happening with scholarly publishing. All major journals now provide online access and, soon, there will be no need for paper versions. This requires new business models. If journals are to continue as online subscriptions, then publishers must put them behind secure paywalls and charge non-subscribers for individual access, finding ways to stop digital copies being further copied and redistributed. Alternatively, all costs and profit margins must be recovered up front through what are known as “article processing charges”, with either authors or their universities paying to allow free-of-charge distribution after publication; this is what is known as “gold” open access.

Making this jungle still more complex to get through, public interest journals publish articles without subscription fees or article processing charges. And every author holds the copyright to the final version of their work – the paper as it stands just before it is surrendered to a publisher – allowing them to distribute it free of charge online through a “green” open access repository, of which there are now hundreds. Today’s battleground is with those profit-hungry publishers who are “double dipping”: charging high online subscriptions as well as article processing charges for papers based on publicly funded research.

This is all important. But getting to a new equilibrium for the scholarly online business model will not, in itself, solve the imbalance in the north-south distribution of knowledge. Much more interesting in this regard are the possibilities in what can be called “interactive citation”.

How will reforming the citation system work?

By tradition, scholarly publications are grand constructions buttressed by the strength and standing of their citations. These foundations and load-bearing walls are a combination of previous, authoritative publications and references to appropriate sets of data. In turn, these data sets will be in forms recognized and accepted by the traditions and rules of scholarly disciplines. Together, the standing of precedent and associated work and the quality of the appropriate data sets serve to authenticate the knowledge claims of the new work. Knock away the foundations – for example, by showing that the data sets were falsified – and the whole edifice comes crashing down.

So far most – but by no means all – online publications have stayed true to these traditions. Tracking back the citations invariably leads to a PDF file, arguably the most unimaginative digital format available. But this will change. Increasingly, online publications will make use of hyperlinks to dynamic resources. These could be the full and original datasets underpinning the work, allowing the reader to explore different possibilities. They could be links to datasets that, by their nature, change constantly. Digital links could also be to live feeds that allow the “reader” to engage directly with continuing research work as it unfolds.
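As a rough illustration of the idea – a minimal sketch, not any publisher’s actual schema, with placeholder names and a placeholder URL – an interactive citation might be represented as a record that resolves to a living resource rather than a fixed page range:

```python
# A minimal, hypothetical sketch of an "interactive citation": instead of pointing
# at a static PDF, the citation resolves to a dataset or live feed that can be
# re-queried as it changes. Field names and the URL are illustrative only.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class InteractiveCitation:
    title: str
    resource_url: str          # landing page for the dataset, archive or live feed
    resource_type: str         # e.g. "dataset", "photo-archive", "live-feed"
    version_queried: str       # the snapshot on which the paper's claims rest
    accessed: date             # when the authors last queried the resource
    licence: Optional[str] = None

citation = InteractiveCitation(
    title="Excavation finds inventory, Area B (hypothetical example)",
    resource_url="https://repository.example.org/datasets/area-b-finds",
    resource_type="dataset",
    version_queried="v2.1",
    accessed=date(2015, 7, 1),
)

# A reader, or a re-analysis script, can follow the link and compare the snapshot
# the paper used with the dataset as it stands today.
print(f"{citation.title} -> {citation.resource_url} (as of {citation.version_queried})")
```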

Here are a few examples from my own discipline – Archaeology – that illustrate what this could mean.

Most archaeological excavations are publicly funded and generate masses of heavy, dirty stuff. Traditionally, it has only been possible to make the most slender of references to these collections, which will usually be in deep storage, unavailable to those readers who are curious. Now, online publications can include hyperlinks to all the original assemblages, including large photographic archives, inventories and statistical data.

Staying with the example of archaeology, big sites are often excavated over many years, and successively by different teams. This means that hyperlinks embedded in earlier online publications can access datasets that have changed subsequently, and continue to change. This allows the assumptions behind earlier publications to be tested. And, given that research such as this is publicly funded, and often has high levels of public interest, why not embed links in scholarly publications to live resources, such as webcams inside an excavation, or a laboratory?

Back to the issue here – how could such “interactive citations” overturn the current hierarchies of knowledge? Getting to this requires a second digression – the prevalent assumptions about the relative value of “international” versus “local” vehicles for publication.

Here, the most egregious example is the way in which British universities have interpreted the requirements of the UK’s periodic evaluation of research quality – the “Research Excellence Framework”. In the last version of this, subject area panels of senior academics produced a ranking system for “their” journals. 4* and 3* journals had to be “international”, and research by academics who had not published in 4* and 3* journals was not considered “excellent” and was invariably not included in a university’s submission to the funding council. Some universities in Britain are now moving academics who are not “international” into teaching-only positions or onto fixed-term contracts.

Sadly, some universities in the south mimic this hierarchy of value. Here’s a hypothetical case study, this time in public health; a thought experiment that shows what this could lead to.

Let’s say a researcher in Cape Town is passionate about a local issue, such as the high incidence of foetal alcohol syndrome, or drug-resistant TB, or homelessness and child mortality. The results of the work may have immediate and high value to local health authorities and social services; the local professional community will benefit from the rapid availability of research results in a locally published, fully refereed academic journal. This, though, is not an “international” publication and has no place in the dominant hierarchy of academic value.

But then add this twist. An academic in a UK-based university, in Cape Town for an academic conference, hears about the local study. He or she builds the continuing development of this work into a funding proposal and, with all due acknowledgement to the Cape Town team, submits the results to an academic journal in the UK. These research outcomes – in essence the same – are now “international”, excellent and of 4* quality.

The contradiction inherent in this kind of scenario – replicated in a wide range of disciplinary areas – is this. While the intention will invariably be progressive, to open up key areas of research across Africa, one outcome is to reproduce the exploitative structure of nineteenth-century colonialism in the knowledge economy of the twenty-first century.

Then, raw materials were exported from Africa, fashioned into high-value goods in Europe’s factories, and sold back into the colonies. Now, raw local data is exported from Africa, fashioned into high-value knowledge in Europe’s universities, and sold back to universities in Africa as high-cost journal publications.

Back to Laura’s question: how can the open access movement broaden its focus from access to knowledge to full participation in knowledge creation and in scholarly communication? Here, the key opportunity lies in further extending the possibilities inherent in “interactive citation”.

When the journey from the “doing” of research through to the finished product of the final publication is fully mapped, it becomes what Bruno Latour called a system of references. Seen through a digital lens, these route maps are a series of digital assemblages. In my archaeological world, such a system of digital references would include scanned field notes and maps, photographs taken for recording purposes, spreadsheets with lists and indices of excavated finds, digital records of laboratory results, interim reports and conclusions, correspondence and news releases and – eventually – the final pre-publication draft of the journal article. In traditional citation systems, even at their best, very little of this information trail is publicly available; more often than many will care to admit, a good deal is lost along the way.
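To make the shape of such a chain concrete, here is a minimal sketch – stages and file names are invented for illustration, not drawn from any real project – of the kind of digital assemblage described above:

```python
# A hypothetical "chain of references" for a single excavation season: each link
# is a digital artefact, from scanned field notes through to the final draft.
# Stage labels and file paths are illustrative only.
reference_chain = [
    {"stage": "field notes",      "artefact": "notebook_scans/area_b_2014.pdf"},
    {"stage": "site photographs", "artefact": "photos/area_b/"},
    {"stage": "finds inventory",  "artefact": "spreadsheets/finds_index.csv"},
    {"stage": "lab results",      "artefact": "lab/radiocarbon_results.csv"},
    {"stage": "interim report",   "artefact": "reports/interim_2014.pdf"},
    {"stage": "final draft",      "artefact": "drafts/journal_submission_v3.pdf"},
]

# An interactive citation would expose every link in this chain, so a reader can
# follow an interpretation back to the primary records rather than stopping at a PDF.
for link in reference_chain:
    print(f"{link['stage']:>16} -> {link['artefact']}")
```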

Used appropriately, interactive citations embedded in digital publications will open up this rich chain of references. This will be valuable for all research endeavors. But how does it help with the stubborn and persistent north-south issue?

Another example, this time from human palaeontology. Some of our early ancestors left their bones in Kenya’s volatile landscape, resulting in remarkable levels of preservation over hundreds of thousands of years. These traces, from footprints in lakeside mud to human skulls that can tell us about the emerging brain, are part of Kenya’s cultural heritage and belong in the national museum in Nairobi, which is where, for the most part, they are.

These key traces of our common past have been excavated under permits from the Kenyan government and with public funding from North American and European agencies. The results of such fieldwork can make an academic career; this is the kind of research for which the eventual journal article is reported on the cover of Time Magazine.

One could assume that, for research of such widespread interest, the data leading to interpretation and publication would be available for further analysis, whether to ask new and interesting questions or to check whether published claims stand up to scrutiny. Not so. All too often key data sets are reserved as the intellectual property of the researcher. What is made available is metadata – data about data. Access to primary information may be withheld for many years. This means that while the original object may be housed in Africa, in its country of discovery, the gateway to the key information that makes sense of the original object is through a northern research institute, and subject to its permission.

Interactive citation can change this. If the concept of “open data” can be defined as access to the full chain of references that make up the citation, then the political economy of the knowledge landscape can be changed. In this scenario, some of the old shibboleths of academic quality management fall away. The dichotomy between “local” and “international” becomes irrelevant because, in a sense, all knowledge becomes local and all researchers become international. Similarly, the distinction between formal publications and “grey literature” becomes redundant; all reports are electronic files, and what matters is the rigour of their review and verification rather than the status of a journal title.

Changes such as these are difficult, controversial and contested. This is not surprising; it’s what disruptive technologies do. We are in the midst of a major modification to the way knowledge is created and distributed; changes to a system that was invented in the seventeenth century and which formed the basis of the great scientific advances of the following three hundred years. And so it’s appropriate to end with a point of view from the Royal Society, which was founded in 1660:

The changes that are needed go to the heart of the scientific enterprise and are much more than a requirement to publish or disclose more data. Realising the benefits of open data requires effective communication through a more intelligent openness: data must be accessible and readily located; they must be intelligible to those who wish to scrutinise them; data must be assessable so that judgments can be made about their reliability and the competence of those who created them; and they must be usable by others. For data to meet these requirements it must be supported by explanatory metadata (data about data). As a first step towards this intelligent openness, data that underpin a journal article should be made concurrently available in an accessible database. We are now on the brink of an achievable aim: for all science literature to be online, for all of the data to be online and for the two to be interoperable.

**

Laura Czerniewicz: “It’s time to redraw the world’s very unequal knowledge map”. The Conversation Africa, 8 July 2015 https://theconversation.com/its-time-to-redraw-the-worlds-very-unequal-knowledge-map-44206

Royal Society: “Science as an Open Enterprise”. June 2012: https://royalsociety.org/~/media/Royal_Society_Content/policy/projects/sape/2012-06-20-SAOE.pdf


 

Teaching Excellence Framework: regressive or refreshing?

As expected, Britain’s minister responsible for universities has confirmed that he will introduce a Teaching Excellence Framework for all UK universities, matching the long-standing Research Excellence Framework. This, he says, will be light-touch: “I have no intention of replicating the individual and institutional burdens of the REF. I am clear that any external review must be proportionate and light touch, not big, bossy and bureaucratic.” Universities that do well enough in the TEF will be permitted to raise their tuition fees above the current ceiling of £9,000.

The small print of the TEF could be decisive. Calibrated one way, teaching excellence could recognize those universities that do most to change people’s lives and build a highly qualified workforce that is far broader than today’s. But if the development of the technical details goes down a different road, then current trends in inequality of access will be intensified and accelerated, sharpening and shrinking the pool of elite graduates. Here, the devil could come to own the detail.

Derfel Owen, Director of Academic Services at UCL, has suggested that the TEF should be based on existing measures as, indeed, the Minister’s policy steer implies. There will be all the more pressure for this with the linkage between the launch of the TEF and the opportunity for universities to raise their tuition fees.

In his valuable overview, Derfel groups candidate components for the TEF into three categories.

Firstly, input measures. These could include the proportion of a university’s staff with a postgraduate-level teaching qualification, a measure linking an institution’s research strength with its teaching, and the strength of co-curricular opportunities for students.

Secondly, peer review. Here, a future version of the current National Student Survey is an inevitability. This, though, needs to be augmented by broader views, perhaps as an extension of the well-tried mechanisms of external examination.

The third category is output measures. These are by far the most difficult and, I suggest, the key detail to watch for. How learning outputs are understood and measured could well determine the overall shape and character of Britain’s Higher Education system for many years to come.

Defining output measures for universities is invariably complex and contentious. In itself the exercise runs the risk of displacing imagination, flair and innovation in favour of conformity, compliance and regulation. Here is what Derfel Owen suggests: “I think a measure should be developed that looks at the entry qualifications that students arrive with and then how far along they get with their higher education. That way, universities that recruit students with no or few qualifications could be rewarded and could see the recognition increase as students achieve higher-level qualifications”.

Such a measure already exists. Alone among the major newspaper league tables, the Guardian University Guide incorporates “Value-Added Scores” in its ranking algorithm:

“Each full-time student is given a probability of achieving a 1st or 2:1, based on the qualifications that they enter with or, if they are in entry bands 20 and 50, the total percentage of good degrees expected for the student in their department. If they manage to earn a good degree then they score points which reflect how difficult it was to do so (in fact, they score the reciprocal of the probability of getting a 1st or 2:1). Thus an institution that is adept at taking in students with low entry qualifications, which are generally more difficult to convert into a 1st or 2:1, will score highly in the value-added measure if the number of students getting a 1st or 2:1 exceeds expectations”.
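As a rough illustration of the reciprocal-probability scoring described in that passage – a minimal sketch with invented probabilities, not the Guardian’s actual model – the calculation works along these lines:

```python
# Each student who earns a "good degree" (a 1st or 2:1) scores the reciprocal of
# the probability predicted at entry; harder conversions are worth more.
# The probabilities below are invented for illustration.
students = [
    # (predicted probability of a 1st/2:1, achieved a 1st/2:1?)
    (0.85, True),   # well-qualified entrant, expected to do well
    (0.40, True),   # low entry qualifications, beats the odds -> large score
    (0.40, False),  # low entry qualifications, no "good degree" -> no score
]

scores = [1.0 / probability for probability, good_degree in students if good_degree]
value_added = sum(scores) / len(students)   # averaged over the whole cohort

print(f"Average value-added score per student: {value_added:.2f}")
```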

This approach works well enough for those parts of the education system where there are nationally set and moderated examinations at both the entry and exit levels; for Sixth Form Colleges, for example, where students enter with GCSE scores and exit with A levels. And the results are often surprising, with good schools serving low-income areas often demonstrating learning gains that are far greater than those of schools recruiting from middle- and high-income catchments. But universities admit students with a plethora of qualifications (and only about half of all British undergraduates have A levels). And, although universities don’t much like talking about it, the ways in which degrees are graded are not standardized and are not currently benchmarked beyond specific qualifications.

There is, then, nothing easy about setting up a learning output measure for universities. The key issue – and the factor that will determine the politics of the TEF – is that the qualification levels of students at admission will have been strongly modulated by their socio-economic circumstances.

Study after study has shown a strong statistical correlation between household income, school leaving qualifications, entry into higher education and the type of university attended. This, of course, is at the cohort level; there are always smart and dedicated individuals who beat the odds. But overall, in Britain today a postcode predicts educational opportunities from birth, irrespective of innate abilities.

So here are two potential TEF scenarios.

One road could follow Derfel Owen’s line of thought, with an algorithm for learning gain that takes into account both the formal educational achievements of the annual cohort of entering undergraduate students and also – and critically – the social and economic factors that have been demonstrated to influence their subsequent educational journey. This would require some sense of the equivalence of entry qualifications and, in particular, transparency in the ways in which universities calculate degree grades. The resulting measure of learning gain would then be linked to funding. This would be progressive, in that universities that were prepared to put the most effort into adding measurable value for their students would find the considerable costs of doing this matched by funding.
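To make this first road concrete, here is a minimal sketch, under invented weights and cohort data, of what a contextualised learning-gain measure could look like: expected outcomes are set by both entry qualifications and a socio-economic index, and a university is credited for outcomes above expectation.

```python
# A hypothetical learning-gain measure: all weights, indices and cohort data
# below are invented for illustration, not taken from any real framework.

def expected_good_degree_rate(entry_tariff: float, disadvantage_index: float) -> float:
    """Toy expectation model: a higher entry tariff raises the expected outcome,
    greater socio-economic disadvantage lowers it."""
    expected = 0.35 + 0.5 * entry_tariff - 0.2 * disadvantage_index
    return min(max(expected, 0.05), 0.95)   # keep the expectation within sensible bounds

def learning_gain(cohort) -> float:
    """Average of (actual outcome - expected outcome) across the entering cohort."""
    gains = [
        outcome - expected_good_degree_rate(entry, disadvantage)
        for entry, disadvantage, outcome in cohort
    ]
    return sum(gains) / len(gains)

# Cohort rows: (entry tariff 0-1, disadvantage index 0-1, achieved a good degree 0/1)
widening_access_university = [(0.3, 0.8, 1), (0.4, 0.7, 1), (0.3, 0.9, 0)]
highly_selective_university = [(0.9, 0.1, 1), (0.8, 0.2, 1), (0.9, 0.1, 1)]

print(f"Widening access:  {learning_gain(widening_access_university):+.2f}")
print(f"Highly selective: {learning_gain(highly_selective_university):+.2f}")
```

On these invented numbers the widening-access institution records the larger gain, which is the point of the first scenario: funding follows demonstrated value added, rather than the advantages students arrive with.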

The second road would be based on the premise that taking prior circumstances into account is unfair, and that every entering student should be considered an equal competitor at the point of entering university. Output would then be the combination of student retention, progression and grades at completion. A good teaching university would be one that retains almost all its students, almost all of whom complete in three years, with a significant proportion gaining a “good degree”. Such universities would be rewarded by being allowed to put up their fees, allowing them to spend more on the other measures of teaching quality, such as staff development and co-curricular activities.

The problem with this second road is that, in a highly unequal society such as Britain today, it is the equivalent of a regressive tax. Because all students do not have equal opportunities at the point of entry, some will be far more prepared than others. The more advantaged students have been in their prior opportunities, the more likely they will be to stay, complete their degrees in three years and gain a high standard of degree. Shifting funding towards such universities is to further disadvantage those universities that do the heavy lifting.

**

Derfel Owen, “How to build the teaching excellence framework”. Times Higher Education, 10 July 2015: https://www.timeshighereducation.co.uk/blog/how-build-teaching-excellence-framework

Matt Hiely-Rayner, 25 May 2015. “Methodology behind the Guardian University Guide 2016”. http://www.theguardian.com/education/2015/may/25/methodology-behind-the-guardian-university-guide-2016

Guns into Goods

It’s not often that a charity closes because its work is done. But earlier this year a Manchester-based group found itself beached by its own success. And given that its focus has been on reducing gun crime, that’s of more than local interest.

CARISMA – the Community Alliance for Renewal, Inner South Manchester Area – had been founded primarily to tackle gun violence in Moss Side, which once had one of the highest rates of gun crime in Britain. Driven by the ever-inspiring Erinma Bell, CARISMA worked closely with both the police and community organizations in putting in place a number of focussed initiatives. One of these was “Guns to Goods”. In partnership with the design department of a local university and an industrial foundry, guns confiscated by the police were melted down to be recast as fashion items. The aim was to put in place a virtuous circle in which would-be gang members with a flair for design are directed away from the dark side.

CARISMA’s Peace Kit was launched in Manchester Cathedral in March 2013. It’s well worth listening to Erinma setting out the project’s vision in this video.

When I moved back to Cape Town, I brought with me one of the gunmetal ingots that were cast by Guns to Goods from police-confiscated weapons. There’s a power in this quiet, smooth slab of metal that was once several sawn-off shotguns.

Cape Town has a particularly high level of gun crime, part of the gang culture that is a legacy of apartheid-era segregation. Gangs depend on recruiting children who are expected to prove their loyalty through acts of violence. There is a broad consensus that policing alone will be insufficient; turning the tide can only come from a broad-based approach in which communities, police and government agencies can work together. Cape Town also has strong traditions of creativity and a vibrant visitor economy that encourages the creative industries. Last year, the city was the World Design Capital.

There is real potential in a Guns to Goods project for Cape Town. It would be a mark of success if here, as in Moss Side, a community-based charity were to close ten years from now because its job is done.

**

Sarah Austin, “Guns to Goods: Can old firearms forge a new future in Manchester?”. BBC, 23 March 2012: http://www.bbc.com/news/uk-england-manchester-17446832

Erinma Bell, Manchester Cathedral, 1 March 2015:   https://www.youtube.com/watch?v=Mu-QHdT9ns0&feature=youtu.be

Forced Migration: the 60 million question

Last year just under 60 million people across the world were displaced, either by violence within their own countries or as refugees – the highest number ever recorded. The United Nations Refugee Agency reports that this includes nearly 14 million people newly displaced in 2014. Others have been stranded for long, and lengthening, periods: more than 2.5 million in the Darfur region of Sudan; 1.5 million Afghans still living in Pakistan.

On the other side of the equation, fewer than 127,000 refugees returned home last year, the lowest number in more than 30 years. Given that there is little sign of solutions to the primary conflicts driving forced migration, these numbers will continue to escalate. Rather than in far-away places, the front lines are now the beaches of Italy and Greece, and at Calais.

As many experts have pointed out, the scale of this global displacement makes most existing policies and practices redundant. Apart from the numbers of people involved, there are political and economic interconnections that will make it impossible for governments in the West and North to portray this as an issue in distant lands, amenable to lessons in democracy, well-targeted aid and a firm but fair hand.

The current situation in Syria is bound up in US and EU policies and economic interests in the Middle East and in the stand-off with Russia. The roots of the Taliban are in Western support for insurgents resisting the Soviet occupation of Afghanistan.

But for politicians, immigration is a doorstep issue; most people significantly over-estimate the extent of immigration and few politicians have the courage to correct prevalent misinformation.

Consequently, the poorest countries carry a disproportionate burden of response. Lebanon has the highest proportion of refugees, with 232 per thousand residents. Jordan carries the next heaviest burden, with 87 refugees per thousand. One in four refugees now finds shelter in middle- or low-income countries, with Turkey, Iran and Pakistan taking the largest numbers. This, in turn, risks destabilizing already fragile economies, leading to still greater migration.

In comparison, the country with the highest proportion of refugees in Europe is Sweden, with 15 refugees per thousand residents. The United Nations’ special rapporteur for the rights of migrants has called on the European Union to host one million Syrian refugees. While this has been dismissed as an impossibility by some EU governments, the overall population of Europe is 740 million. Were the EU as a whole to offer refuge to a million more, this would increase the overall refugee ratio by little more than one per thousand – a fraction of the burden being carried by poor countries such as Lebanon, Jordan, Kenya and Ethiopia.
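The arithmetic behind that comparison is simple enough to check, using only the figures quoted above:

```python
# Per-thousand ratios based on the figures quoted in the text above.
europe_population = 740_000_000      # "the overall population of Europe"
additional_refugees = 1_000_000      # the special rapporteur's proposed intake

increase_per_thousand = additional_refugees / europe_population * 1000
print(f"Additional refugees per thousand residents: {increase_per_thousand:.2f}")  # ~1.35

# For comparison, the burdens cited above:
# Lebanon 232 per thousand, Jordan 87 per thousand, Sweden 15 per thousand.
```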

The latest European Union response to the current Mediterranean migrant crisis demonstrates starkly the lack of fit between proffered solutions and the extent of the problem. The offer of asylum to 40,000 refugees currently in Italy and Greece will be welcomed by those who benefit, but will leave many times more without resolution. The accompanying resettlement scheme for up to 20,000 Syrians and Eritreans is voluntary; the UK has already declined to take part.

One key deficiency is that approaches are being developed without consultation with refugees. François Crépeau, the UN’s special rapporteur:

Think about trying to make policies about women without consulting women. The policies that we made before women got the right to vote were policies based on fantasies – the fantasies that we men had about women. That’s the problem today: most of our immigration policies are based on this nationalist, populist discourse which has been prevailing in Europe and elsewhere now for the past 30, 40 years.

There is a particular role for education in developing such long-term policies. The United Nations estimates that about half of those displaced are children. Crépeau makes the point that migration is generational; full economic integration may take several generations to achieve and this requires education policies that can stimulate innovation and contribute to the development of the labour market.

The eventual reconstruction of countries damaged by conflicts will require appropriate levels of expertise that must be developed in exile. As Turkey has found, as one of the countries bearing the primary burden of displacement from Syria, universities need to anticipate and plan for the future needs of young adults who have been caught up in the consequences of sustained conflicts.

António Guterres, the United Nations High Commissioner for Refugees, in releasing the latest report on global displacement:

For an age of unprecedented mass displacement, we need an unprecedented humanitarian response and a renewed global commitment to tolerance and protection for people fleeing conflict and persecution.

**

Somini Sengupta, New York Times, June 18 2015. “60 Million People Fleeing Chaotic Lands, U.N. Says”.

Patrick Kingsley and Sam Jones, 26 June 2015. “EU sidestep on migrants will do nothing to curb Mediterranean death toll”. http://www.theguardian.com/world/2015/jun/26/eu-sidestep-migrants-mediterranean-death-toll

Sam Jones, Guardian Global Development, 30 June 2015. “EU needs 25-year plan to deal with migrants, says UN envoy”. http://www.theguardian.com/global-development/2015/jun/30/eu-needs-25-year-plan-deal-with-migrants-says-un-envoy?CMP=EMCGBLEML1625

Marikana

After Scene 1, Brigadier Calitz stopped at the dry river bed to re-organise the operation. He then proceeded in a northerly direction to a position some one hundred and fifty metres north of Koppie 3 to supervise the arrest of strikers fleeing in that direction. At the same time, the NIU under Colonel Modiba approached Koppie 3 from the north east, the TRT under Captain Kidd approached Koppie 3 from the south west and Major General Naidoo with the K9 and other units approached the Koppie from the south. This led to the position where three separate units converged on Koppie 3 without informing either Brigadier Calitz or the JOC.

There was shooting from various members of each of these units in the direction of the koppie where the strikers had gathered. This resulted in 17 strikers being killed. There were 14 bodies found at Scene 2 and three strikers who were wounded subsequently died in hospital. Ten of them were killed in what can be described as a crevice in a rocky area inside the koppie where they appear to have sought refuge during the operation.

Brigadier Calitz has been criticised by the Evidence Leaders for failing to issue any warning to the strikers at the stage when they were surrounded in koppie 3. They argue that those strikers who wished to surrender peacefully were not given an opportunity to do so before steps were taken to disperse them which might include the use of force. Sections 9(2) (a) and (b) of the Regulation of Gatherings Act 205 of 1993 apply. The Commission agrees with this submission.

It is clear from the evidence that the overall commander Major General Mpembe had absolutely no command and control of Scene 2.

These are extracts from Chapter 12 of the Commission of Inquiry report on the violent confrontation between the South African police and striking workers at Lonmin’s Marikana mine in 2012. Long awaited, the Commission’s report has been extensively criticized for exonerating prominent politicians, for passing too lightly over evident failings by Lonmin’s senior management and for failing to find anyone directly culpable for the violent deaths of 44 people as the Marikana tragedy unfolded between August 9th and August 16th.

Most reactions to the report have been shaped around particular aspects of this complex web of labour relations, trade union rivalries, police competence and political allegiances. Marikana unravelled against a broad canvas of South Africa’s history and circumstances: migrant labour; miners working deep underground for unconscionable pay; widening inequality. And Marikana was the worst police shooting of its kind since the Sharpeville killings of 1960, making it a symbolic event that will punctuate this country’s historical narrative. Given that the main report is 565 pages of painstaking legalese, it’s not surprising that commentaries have homed in on headline issues. Those seeking proven culpabilities have been disappointed; this has been particularly distressing for the families of those killed.

But there’s another way of reading the Marikana Commission’s report: from beginning to end, like a screenplay framed by the legal conventions of precedent and evidence and bound by specific rules of logic. This is particularly apt here because of the forensic value of the film archive from these eight days, and because the Commission’s report has been anticipated by Rehad Desai’s extraordinary documentary Miners Shot Down, now freely available online.

Read in this way, the report has two significant outcomes.

First, and because of the peculiar logic and emotions of legal expression, it prevents Marikana from being bracketed off as an unfortunate incident that is now over. This is because the unfolding pages of detail reveal a police service that is comprehensively incompetent, armed with military assault weapons that it is ready to use, and unable to carry out public order policing. Given South Africa’s very high levels of unemployment and widening inequality, Marikana could become a syndrome. Understanding this is the first move towards mitigating future emergencies, by understanding what Hannah Arendt called the “instrumentality” of violence. As always, such understanding is a necessary condition for prevention.

Second, and to a far greater extent than has yet been acknowledged in most media reports, the Commission has laid the ground for the future prosecution of senior police officers for their direct involvement in unlawful killings. A commission of inquiry cannot double up as a law court, but it can establish a prima facie basis for criminal liability and refer the file to the public prosecutor. And this the Marikana Commission has indeed done.

How does the legalese of the Commission report play out as a parallel narrative to Desai’s documentary?

Firstly, both depend on the same film archive. The forcefulness of Miners Shot Down lies in its extensive use of police and journalists’ film sequences and this same material is the bedrock of the Commission’s evidence.

Secondly, where the film compels its viewer by means of its pace, authenticity and narrative voice-over, the Commission report slows everything down, testing the detail, probing the gaps and diverting into legally important side issues. The documentary film runs for 53 tightly edited minutes. The Commission sat, in public, for 300 days and produced 39,719 pages of evidence.

This slow time of the Commission’s public work is augmented by a form of emotional displacement. There’s plenty of described emotion, of course, in the submissions of the advocates leading evidence for the families of those who were killed. But the only time that the Commission itself seems seriously agitated is when it comments on the attempts by the police to obfuscate, mislead and – the Commission implies – actively conceal culpability; actions that disrespect the standing of the Commission and its work. In contrast – and appropriately – the report relies on the cold, analytic language of the law when it describes and analyses the instrumentality of violence. Here’s a passage that demonstrates this. It comes from Chapter 11, which deals with the events on Thursday 16 August, up to the first set of killings at “Scene 1”, and is part of an attempt to establish the intentions of miners who had been encircled by armed police, and whether the police were justified in using lethal force in self-defence:

The evidence indicates that R5 bullets tend to disintegrate when entering the body of a victim. This is what happened at Marikana. As a result it is not possible on the ballistic evidence to connect any member who shot at Marikana with any person who died. In the case of certain shooters there is prima facie evidence that the members concerned may well have been guilty of attempted murder but it cannot be said that any shooter is guilty of murder because it cannot be shown which of the shooters actually killed anyone. In the case of those shooters who exceeded the bounds of self- or private defence, the most they can be convicted of is attempted murder.

Expressed in the parallel genre of the documentary film, the Commission is saying that this group of miners, armed with sticks, spears and – possibly – one or two guns, was shot to death by several hundred rounds of high velocity bullets intended for military combat. These bullets are designed to break up inside the victim’s body causing the maximum possible damage to tissue and organs. The evidence indicates that this police action went well beyond justifiable self-defence but, because the bullets have disintegrated, forensic evidence cannot connect a specific victim to an identified shooter, making conviction impossible. By using the cold language of the law, the Commission teases out the mechanics – the instrumentality – of this incidence of violence, making it all the more stark and horrifying.

The combination of exhaustive detail, slowed down action and the emotional displacement of legal forms of expression serves to present Marikana both as a specific set of events over a defined time period and at a particular place, and also as a form of violent confrontation that could happen again, in a different context, anywhere. For example, there are increasing occurrences of township protests over persistent inadequacies in basic service provision, some of which have had a violent edge. The Commission has shown that, at the least, Marikana was the result of leadership failures on the ground, inappropriate public order policing strategies and the deployment of armaments designed for lethal military combat. What could happen if similar tactics were to be used elsewhere, anywhere? Revealing the detailed instrumentality of the Marikana killings points to ways of preventing future atrocities.

And what of the possibilities for future prosecutions for the Marikana killings? While the evidence may be missing at Scene 1, this is not necessarily the case elsewhere, and initial media reports have passed over the Commission’s list of files that will be referred to the Public Prosecutor to decide whether charges should be laid. These files include the events at Scene 2 later in the day on August 16th, which are set out in detail in Chapter 12 of the report.

Scene 2 is the area a small way north of the first confrontation, to which surviving miners retreated after the first round of killings with the R5 assault rifles. The Commission dissects the comprehensive confusion of the police strategy and the lack of any coordination and control by Major General Mpembe as the officer in command. Several people lost their lives in this mêlée, but the most controversial claim – and a key question raised by Rehad Desai in Miners Shot Down – is whether the ten miners who were killed while hiding in a crevice in Koppie 3 were shot by the police in self-defence or were, in effect, executed. Here, the Commission must be left to speak for itself:

The Evidence Leaders criticised Major General Naidoo for having participated in a chaotic free for all which cost sixteen people their lives without exercising any command and control and without taking any steps to stop the shooting and isolate the problems. We agree with these criticisms….

Apart from the evidence of a reconstruction of the scene by Mr De Rover, the South African Police Service provided no details of what happened with regard to the deaths of most of the deceased at Scene 2. Where it does provide evidence pertaining to the deaths of some of the deceased, their versions do not, in the Commission’s view, bear scrutiny when weighed up against the objective evidence…

Major General Naidoo is criticised for his failure to exercise control at Scene 2. It is submitted by the evidence leaders that as most senior officer at Scene 2, he did nothing to stop the firing of two hundred and ninety five rounds of ammunition at the strikers in the koppie. He failed to ascertain what the problems were and in so doing, completely failed to exercise any command and control at the scene. The Commission agrees….

The Evidence Leaders point out that Major General Naidoo’s version of the shooting in self-defence is contradicted by the statements of the occupants of Papa 11, namely, Warrant Officer Mamabolo and Constables Dzivhani, Zondi, Khosa, Malesa, Mathabha and Mokoyama. The version in these statements is to the effect that Major General Naidoo and other police officers were seen emerging on the top of the boulders from the direction from where the firing came. Warrant Officer Mamabolo says that he shouted at them to cease fire but the shooting continued. None of them noticed any shooting by the strikers. There is also no corroboration for Major General Naidoo’s version from Sergeant Harmse who was very close to him.

They also submit that his description in oral evidence of the shooting is different from the versions in his statement. They attribute this change in version to the belated finding by the ballistics expert that a cartridge case linked to Major General Naidoo’s firearm was found on top of the rocks, because at the time of making his statements he was not aware of this ballistics evidence. They also criticise that he only belatedly submitted his own firearm for investigation by the ballistics experts.

The Commission is satisfied that the anomalies in his evidence as well as the fact that his version is contradicted by other evidence, warrant the circumstances surrounding the shooting to be referred to the Director of Public Prosecution for further investigation.

It’s difficult to see how a report containing these statements can be dismissed as a cover-up. The question should rather be: what action will the Director of Public Prosecutions take?

**

Rehad Desai (2014) Miners Shot Down: https://www.youtube.com/watch?v=fTSHk2LTdtw

Marikana Commission of Inquiry report, released June 2015: http://107.6.66.171/Report%20of%20the%20Marikana%20Commision%20of%20Inquiry.pdf