The Case for Diversity

It shouldn’t be necessary to make the case for diversity. There shouldn’t be the need for an Equality Challenge Unit, or for a kite mark for gender equity. There shouldn’t be an attainment gap for BME students. A postcode should not affect the probability of your attending university or shape the kind of university that will let you in. In an equitable society, a person realizes their capabilities through their own choices – by the objectives they set themselves and the efforts they make.

We are, though, not an equitable society, and Higher Education is not an equitable sector. This is abundantly clear in the Equality Challenge Unit’s two-volume Statistical Report for 2014, which I was obliged to carry on my back on the walk up the hill from Liverpool’s dockside to Lime Street Station at the end of last week’s ECU conference.

Baroness Onora O’Neill, Chair of the Equality and Human Rights Commission, set out the ethical and legal bases for equality with philosophical precision at the conference. Universities, despite the privatization of student fees, are still public bodies and as such have “positive duties” to address unfair discrimination and to minimize disadvantages. This, though, does not include a duty to achieve defined outcomes – quotas. Indeed, positive discrimination is unlawful in Britain, which is why a university cannot require an all-female short-list for a job, or set aside student places for which only ethnic minorities may apply.

What to do, though, when the cumulative, historical effects of unfair discrimination are so extreme that they threaten significant dysfunctionality? The week before the Equality Challenge Unit’s Liverpool conference I was in Johannesburg for a workshop hosted by the global energy company Chevron and True Blue Inclusion, a Washington-based think tank. This was about the dilemma faced by South Africa-based organizations twenty years after the inauguration of an era dedicated to non-racialism, when there are still extreme disparities by race and gender in middle and senior management positions. Here, the law is clear. South Africa’s constitution requires positive discrimination in order to address historically based disadvantage; the problem is that these constitutional requirements have yet to be met.

Britain’s problem, then, is both less acute and more complex than South Africa’s. Clearly, the extent of inequality in British universities is nothing like the levels of persistent inequality across South African Higher Education. But at the same time, British universities are required to exercise a “positive duty” to address imbalances, without using measures that are “positively discriminatory”. As Baroness O’Neill put it, how far can we go to achieve proportionality?

Here, a clear priority is ensuring equity in our processes, whether for staff recruitment and promotion or for student admissions, evaluation and assessment. Evidently, key Higher Education processes are not equitable. If they were, then 79.9% of Vice-Chancellors would not be men, and the unemployment rate six months after qualifying for BME graduates would not be twice that of white graduates.

There will always be some cases of overt discrimination in any organization; these should be comparatively easy to deal with. The more insidious determinants of continuing inequality are the subconscious biases – those aspects of institutional culture that we rarely put into words, but which define the character of an institution: what’s in the air, in gestures, language, customs. And because an institution’s culture is not a person, protected in law against positive discrimination, we are free to identify its characteristics, make them spoken rather than assumed and, through doing this, change our habits.

Chi Onwurah, MP for Newcastle Central, set out an agenda for an attack on institutional culture at the Liverpool conference. An engineer-turned-politician, she based the case for equity across the Engineering disciplines on economic necessity; on the need to draw on the full spectrum of people’s experience and perspectives to drive forward innovation, “from sharpened stones to mobile phones”. And the same argument was made, with compelling force, in Johannesburg, where current cultural preferences in organizations result in companies depending on less than 10% of South Africa’s potential workforce for their recruitment of senior managers. No company, or country, can remain competitive if it self-restricts its talent pool to this extent.

Commitment to the case for diversity is a leadership issue, both at the top of an organization and also distributed across all areas of work. In preparation for this year’s conference, the Equality Challenge Unit commissioned a study of diversity leadership in ten British universities, including the University of Salford. As Chris Brink, Vice-Chancellor of Newcastle University, puts it in this report, equity is about excellence. And it’s hard to see how anything else could be more important.

Will we all be cyborgs?

The convergence of information technologies and bioscience is changing our lives. As bandwidth speeds of a gigabit per second become available and affordable for some, how will this affect who we are as we are able to reconstruct our own bodies and shape the bodies of our children? Will we be cyborgs, part flesh and part machine? What advantages will this bring? And what should we worry about?
The Pew Research Center recently canvassed over a thousand practitioners and experts in information technology. They were asked what they believed were the implications of a gigabit world. Will there be distinctive killer apps, disruptive innovations that will result in significant changes in the ways we live? The Pew survey identified health as one of the areas that will be most affected. Here is Hal Varian, chief economist for Google: “the big story here is continuous health monitoring… It will be much cheaper and more convenient to have that monitoring take place outside the hospital. You will be able to purchase health-monitoring systems just like you purchase home-security systems. Indeed, the home-security system will include health monitoring as a matter of course. Robotic and remote surgery will become commonplace”.
A gigabit connection provides 1,000 megabits of information per second (Mbps). At the beginning of this year, the average connection speed across the world was just under 4 Mbps, across the United States 10.5 Mbps and in South Korea – the country with the highest average connection speed – 23.6 Mbps. A gigabit world, then, would see a more than forty-fold increase in Internet speed in the best-performing country. This may seem unattainable in the near future. But this technology is already with us. Some scientific communities have already had access to very fast networks for several years. Four years ago, Google ran a competition for the first community network running at 1 gigabit per second, a hundred times faster than the average speed for the US as a whole. Kansas City won and residents are now signing up for the service.
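The scale of the jump can be checked with simple arithmetic. A minimal sketch, using only the average speeds quoted above:

```python
# Rough check of the speed gap between today's averages and a gigabit
# connection, using the figures quoted in the paragraph above.
GIGABIT_MBPS = 1000  # 1 gigabit per second = 1,000 Mbps

average_speeds_mbps = {
    "world": 4.0,           # "just under 4 Mbps"
    "United States": 10.5,
    "South Korea": 23.6,    # highest national average
}

for region, speed in average_speeds_mbps.items():
    factor = GIGABIT_MBPS / speed
    print(f"{region}: a gigabit connection is ~{factor:.0f}x faster")
```

For South Korea this works out to roughly 42 times faster (the “forty-fold increase”), and for the United States roughly 95 times (the “hundred times faster” of the Kansas City network).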
The convergence of bioscience and information technology is best represented in the history and triumphs of the Human Genome Project. Launched in 1990 and completed in 2003 with the sequencing of the chemical base pairs that make up our DNA, the results of this extensive collaboration are now transforming medicine and health care. Genome sequencing would not have been possible without high-speed computing, and future developments will depend on almost instant access to massive data sets.
But in addition to new drug development and predictive diagnosis of genetic conditions there are more complicated innovations. There is widening enthusiasm for personal DNA profiles that establish genealogies; the confirmation that the skeleton found under a car park in Leicester was once Richard III is a famous example. But for others, there is a deep suspicion of what this could bring. For example, indigenous communities with hard-won rights to land and resources fear that the misuse of DNA sequencing may strip away these rights. And the extensive and continuing revelations about the misuse of information technologies by state agencies have encouraged a significant backlash against the pooling and use of personal health records. The union of the biological and digital sciences is a complicated marriage that will bring unanticipated consequences.
One area to watch for such surprises is digital implants: the surgical insertion of microprocessors that make us part flesh, part machine.
The specter of the cyborg has been with us since Mary Shelley’s 1818 novel Frankenstein. The early Internet and popularity of personal computers brought flesh, organs, digital processors and information technology together in fiction and theory. Milestones, and still great reading today, were William Gibson’s 1984 classic Neuromancer and Donna Haraway’s prescient essay, A Cyborg Manifesto, published the following year.
But cyborgs were already on the street thirty years ago. The first surgical implantation had been in 1958 (the recipient lived until 2001); today’s microprocessor-controlled pacemakers sense the physical activity of their host and respond by increasing or decreasing their rate. And just as Gibson foresaw a future in which the body could be rebuilt at will – although for nefarious purposes in the dark world of the matrix – remarkable new medical technologies are now transforming the quality of life. Surgical cochlear implants pick up signals from a speech processor and send them to the auditory nerve. Robotic prosthetic limbs receive signals from the nervous and muscular systems and transmit these to an artificial arm or leg. In the near future, implantable artificial kidneys with microelectromechanical membranes will filter blood and excrete toxins while reabsorbing water and salt.
Widely available gigabit connections will enable intelligent, implanted devices such as these to become part of the “internet of things”, much as Gibson imagined in Neuromancer. The Pew Center study predicts personalized digital health within the next ten years. Here is Judith Donath, at Harvard’s Berkman Center for Internet and Society: “telemedicine will be an enormous change in how we think of healthcare. Some will be from home—chronically ill or elderly patients will be released from hospitals with a kit of sensors that a home nurse can use. For others, drugstores (or private clinic chains—fast meds, analogous to fast foods) will have booths that function as remote examining, treatment, and simple surgery rooms. The next big food fad, after hipster locavores, will be individualized scientific diets, based on the theory that each person’s unique genetics, locations, and activities mean that she requires a specific diet, specially formulated each day”.
But medical applications are likely to develop more rapidly than this. Chip implants that yield personal information to a scanner have been around – controversially – for a decade, promoted for monitoring prisoners and hospital patients. And a person offered a surgical implant that could save their life by responding to real-time information is unlikely to decline out of deference to Edward Snowden.
All these developments, whether for lifestyle choices, medical care or lifesaving technologies, will require a significant trade-off between privacy and the sharing of personal digital information. Will this happen? Revelations about the extensive misuse of surveillance by state agencies across the West have resulted in a backlash against sharing. We are becoming aware that our digital traces are everywhere we go, and we don’t like it very much. But despite this, we surrender our personal data every day in return for the conveniences this brings.
Anyone who uses any free Google service pays with the surrender of some personal data, and usually a lot. Google knows where its users are, and what they are interested in, by collecting information on Internet searches, the contents of e-mails sent and received, and from geospatial information transmitted from smartphones and tablets. The payback for the loss of privacy is easier shopping, finding places anywhere and a fast, free and capacious e-mail service. We all want safe cities, with protection from everything from mugging to terrorism. Today’s cities are impossible to police effectively without constant digital surveillance. Every person in Britain is now photographed on average 300 times each day, often without knowing it. In London, more than 16,000 sensors automatically record the location of anyone carrying an Oyster card. Digital number plate recognition systems record the movement of every car across motorway systems, linking back to the identity of the registered owner. We are, to go back to William Gibson’s prescient novel, already in the matrix, and this is a messy and complicated place to be.
And, finally, the engine of most contemporary change – consumerism. From the earliest Apple Mac to the latest iPhone, markets have directed and accelerated the advance towards a gigabit world. This Christmas’s best seller will be the wearable band, which offers a range of digital functions from paying for coffee to monitoring health patterns; market analysts predict that 43 million of these devices will be sold across the world. 28 million of these will be smartbands that connect to tablets, iPhones and other digital devices. And once this market is saturated, as it surely will be, what next? With 1,000 megabits of information available every second, what could be more natural than tucking the microchip away beneath a fold of skin, perhaps along with a tattoo or body piercing?
But not for everyone. Respondents to the Pew Center survey also saw in this future an entrenched digital divide. Rex Troumbley, from the University of Hawaii, commented that “we should not expect these bandwidth increases to be evenly distributed, and many who cannot afford access to increased bandwidth will be left with low-bandwidth options. We may see a new class divergence between those able to access immersive media, online telepathy, human consciousness uploads, and remote computing while the poor will be left with the low-bandwidth experiences we typically use today.”
And so again we are in William Gibson’s dystopian world, or right back to Mary Shelley’s horror of the “miserable monster”, and his reproach to his creator: “I ought to be thy Adam; but I am rather the fallen angel.”
Pew Research Center, September 2014, “Killer Apps in the Gigabit Age”
Available at:

First published in University Business, 29 October 2014: