Learning analytics is the study of student behaviour through patterns in their digital world: how often, when and where they log on; the digital resources that they use; the web sites they visit; the social media platforms that they use. At present, this is a nascent field of interest in most colleges and universities. But over the next few years, the analysis of the digital trails that students leave as they move through the digital world will become central to curriculum design, learning support, assessment and quality assurance. Two recent reports by Jisc – British education’s digital solutions provider – set the stage for these changes, and point to some early work that needs to be prioritized if these developments are to be in the interests of students and to enable better education.
That learning analytics is set to take off is hardly surprising. After all, it is common cause that many of us allow the new virtual behemoths access to our every keystroke in return for the advantages that this brings us. Amazon uses our search and purchasing patterns to suggest what books we might also like to read on the same subject. Google scans the content of every e-mail to select the advertisements that are most appropriate to our needs. Weather apps use our location to tell us whether the sun will shine on us today. Students use all these services, and many more; they are not surprised if their universities use their personal data in similar ways. Indeed, Jisc’s work shows that students are – at present – relaxed about all this.
Until now, these rich troves of data have been used almost entirely as early warning indicators of students at risk. The Jisc project was based on a snapshot of ten British universities, two colleges and the work of the University of London’s Computing Centre, which hosts the virtual learning platforms of more than a hundred organizations. While there was a fair degree of variation in the priorities behind this work, the immediate concern everywhere was identifying students at risk sufficiently early to allow effective intervention. But at the same time, all could see the potential for improving the overall student experience by taking this work further: “many institutions see the data as part of a continuum which can be used by people at every level of the organisation, from individual students and their tutors to educational researchers, to unit heads and to senior management”.
What could this future look like? One way of shaping this would be through an ethnography of students as they engage with the digital resources available to them, and with the digital gateways that they have to pass through on a daily basis. For example, a student’s day might start with logging into their university’s web page to check the timetable. Getting on to campus requires swiping their student card for access; the same card is used for buying a cup of coffee. The lecture may be available online for a second look (or as an alternative to hearing it live). Assignments will require access to a wide range of online resources, either on campus or at home (and the choice will be recorded). Once completed, the assignment has to be run through a plagiarism detection programme (which will track and test the derivation of its content) and then submitted online. Towards the end of the day, our digital student may choose to go to the Sport Centre (another digital access record) or to the Students’ Union, where purchases are debited to their account – and recorded in detail. This digital footprint will be recorded and stored for every student, every day for three years; for a medium-sized university offering programmes in a conventional semester system, this is some four million student-day records each year.
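The scale of that footprint is easy to check with a back-of-envelope calculation. The student count and teaching-calendar length below are illustrative assumptions, not figures from the Jisc reports; they are simply one plausible combination that yields the order of magnitude quoted above.

```python
# Rough estimate of yearly "student-day" record volume.
# Both inputs are assumptions chosen to illustrate a medium-sized
# university on a conventional two-semester calendar.
students = 20_000        # assumed enrolment of a medium-sized university
days_tracked = 200       # assumed days per year a student generates records

records_per_year = students * days_tracked
print(records_per_year)  # → 4000000, i.e. "some four million" per year
```

Different assumptions shift the total, but any realistic combination lands in the millions – comfortably within reach of routine data analysis.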
Data sets of this size can now easily be analysed for patterns, and these patterns tested against propositions. Because every one of these student behaviour profiles will correlate with the student’s prior education record, demographic details and postcode, universities will be able to refine and extend existing areas of interest. For example, there is already a sharp concern with the relationship between prior achievement and graduate attainment; learning analytics will provide a higher degree of predictability. Equality concerns require the demonstration of equitable opportunity, irrespective of gender, race, ethnicity or disability; learning analytics will allow assumptions in this regard to be tested. Librarians have to make key choices about which online resources to prioritize, and space planners need to track student preferences for facilities; learning analytics will provide rich, real-time data about student needs.
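A minimal sketch of the kind of proposition-testing described above might compare a crude engagement proxy – say, weekly logins – across student groups. The records, group labels and threshold here are all invented for illustration; a real analysis would draw on an institution’s platform logs and apply proper statistical tests before acting on any gap.

```python
# Sketch: group a (fabricated) set of student records by a demographic
# attribute and compare mean weekly logins per group. A gap between
# group averages flags a pattern worth testing, not a conclusion.
from collections import defaultdict

records = [  # illustrative data only
    {"student": "A", "group": "commuter", "weekly_logins": 9},
    {"student": "B", "group": "commuter", "weekly_logins": 7},
    {"student": "C", "group": "resident", "weekly_logins": 14},
    {"student": "D", "group": "resident", "weekly_logins": 12},
]

logins_by_group = defaultdict(list)
for r in records:
    logins_by_group[r["group"]].append(r["weekly_logins"])

averages = {g: sum(v) / len(v) for g, v in logins_by_group.items()}
print(averages)  # → {'commuter': 8.0, 'resident': 13.0}
```

The same grouping pattern generalizes to any of the comparisons mentioned above – prior attainment against outcomes, or resource use by facility – once the relevant fields are joined onto the behaviour profile.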
But – of course – these more substantial uses of digital data will raise some tricky ethical issues; students, staff and regulatory bodies will be much less relaxed than they are at this early stage. This is why Jisc’s second report, which surveys ethical concerns and codes of practice across the world, is particularly important. As Niall Sclater, the author of the report, puts it:
“Current legal and ethical guidelines have not caught up with innovations in the identification of patterns and new knowledge emerging from the vast datasets being accumulated by institutions. A code of ethics is essential as the law alone cannot deal with the many different scenarios that will arise. We are in a critical window: whatever gets established today will be ‘sticky’ and will affect public notions of what is acceptable for many years to come. If we fail to assert values such as privacy, transparency and free choice, society will abandon these in favour of technical innovation and commercial pressures”.
Setting standards on the basis of wide consultation is important beyond the justified concerns of individual colleges and universities. As is well known, quality standards in education are relative; a university makes claims for the quality of each of its academic programmes by benchmarking in some way against the sector as a whole. At present this is done by well-tried but comparatively crude methods: the external examination system; the National Student Survey; the new Key Information Set; various external ranking systems. The rise – and rise – of learning analytics will provide a far richer and more nuanced way of benchmarking individual courses against their sector as a whole, and it is likely that both students and quality assurance authorities will require this. This, in turn, will require national databases that are aggregates of individual universities’ records.
If higher education does not get the ethical issues sorted out up-front, it will face the same challenges that are currently bedevilling health records. Niall Sclater again:
“Dealing with the ethical challenges relating to information technology is not simply ‘a good thing to do’ but also helps organizations to develop an ethical environment where bad things are much less likely to occur. Negative publicity about the use of IT almost always results from failures to deal with ethical issues. Business-focussed staff may consider IT to be ethically neutral but they are not necessarily familiar with the many choices IT staff are required to make in the design and deployment of systems which have consequences for individuals. This would seem to be particularly pertinent in learning analytics where the nature of the algorithms and how the results are presented and acted upon could have a significant impact on a student’s academic success”.
Niall Sclater, Code of practice for learning analytics: A literature review of the ethical and legal issues. Jisc, 2014.
Niall Sclater, Learning analytics: The current state of play in UK higher and further education. Jisc, 2014.