Education

Supporting students: the role of data analysis in improving welfare


Jisc says that data can be used to monitor student wellbeing, alerting staff to those students who could be struggling.
Photograph: Johner Images/Getty Images/Johner RF

Students are increasingly concerned about their mental health. One in six people aged 17 to 19 in England has a mental disorder, according to NHS Digital, and students are increasingly reporting mental health conditions to their institutions – a House of Commons Briefing from August 2019 reported a fivefold increase in student mental ill health since 2010.

Dr Dominique Thompson, who worked as a university GP for nearly two decades and now advises universities on mental health, believes that students are under increasing pressure from an overly competitive society and a tyranny of perfectionism, all magnified by social media. “People are accepting that we have to do more for our young people,” she says.

Universities and colleges have been asked to do more to help students in crisis, particularly following 10 student deaths over an 18-month period at the University of Bristol (there have since been reports of further deaths). In December 2018, the then education secretary, Damian Hinds, told institutions they should get in touch with students’ emergency contacts when it is clear they are at risk of a mental health crisis.

But this requires higher education providers to identify such crises. James Murray, the father of Ben, a 19-year-old student at Bristol University who killed himself in May 2018, believes universities and colleges can use data analytics to help, allowing them to support students when they most need it.

Murray found that different university departments and services had known Ben was experiencing problems – he had reported health issues, was not attending lectures and was in the process of leaving his course – but there was no way for the university to join up the dots.

Murray, who has a professional background in data analytics, designed a data “dashboard” that could pull together key information on every student’s wellbeing. At least 95 university students took their own lives in 2016-17, according to Universities UK (pdf), the organisation that represents higher education, and Murray helped it launch new guidance on preventing student suicides. As part of its call for “suicide-safer universities”, with greater awareness and better support for students, the guidance recommends digital data analytics systems that draw on a wide range of data, including academic, financial and disciplinary records, as well as behaviour such as not paying rent or not engaging with other students.

“I think organisations have a duty of care,” Murray says. “It comes down to culture: to what extent do we want to create a compassionate community that’s open about data and wants to disclose as much as possible to keep the community as a whole safe, over and above an attachment to privacy matters that may be well-intentioned but is perhaps taking us down a different path?”

The guidance published by Universities UK recommends drawing on a wide range of data, including factors such as not paying rent or not engaging with other students. Photograph: simonapilolla/Getty Images/iStockphoto

How should such data analysis work? Andy McGregor, deputy chief innovation officer of education technology organisation Jisc, says the focus must be on alerting welfare and tutorial staff about those students who look like they need support, so they can investigate further. “The role of technology is to help those people to be better informed – to prioritise their time and resources to make the most difference,” he says. “It’s about helping the humans on the front line to be as effective as they can be. We know that all university staff are stretched. Analytics should be used in a way that makes their jobs easier.”

Jisc already provides a learning analytics service, used in more than 30 universities and colleges, which can analyse data from attendance, library and online learning systems to raise concerns with support staff if students appear to be disengaging from their studies. “Some of these could be flags for wellbeing issues,” says McGregor, but he adds: “That’s only part of the picture.” Data on accommodation, finances and health may also be relevant, but may be held by other organisations, such as student accommodation providers and NHS organisations.
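McGregor does not spell out how the flags are generated, but a minimal sketch of the kind of rule-based check he describes might look like the following Python, where the WeeklyActivity fields, the thresholds and the flag_disengagement function are all invented for illustration rather than taken from Jisc’s service:

from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    """One student's activity for a week, drawn from attendance,
    library and online learning systems."""
    student_id: str
    lectures_attended: int
    lectures_scheduled: int
    library_visits: int
    vle_logins: int

def flag_disengagement(week: WeeklyActivity,
                       attendance_threshold: float = 0.5,
                       vle_login_threshold: int = 2) -> list:
    """Return human-readable reasons a member of welfare or tutorial
    staff might want to look more closely. An empty list means no
    flag is raised; the decision to act stays with staff."""
    reasons = []
    if week.lectures_scheduled and (
            week.lectures_attended / week.lectures_scheduled < attendance_threshold):
        reasons.append("attendance below half of scheduled lectures")
    if week.vle_logins < vle_login_threshold:
        reasons.append("little or no online learning activity")
    if week.library_visits == 0 and week.vle_logins == 0:
        reasons.append("no library or online activity at all")
    return reasons

# A week of very low engagement returns several reasons for staff to review.
print(flag_disengagement(WeeklyActivity("s123", 1, 8, 0, 0)))

The point of returning plain reasons rather than a verdict is the one McGregor makes: the system informs front-line staff, it does not replace their judgment.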

Dr Thompson thinks analysing data will have some value in supporting student wellbeing, particularly for those already identified as being at risk. But she has doubts about any system that relies only on data that is already being collected, adding that it would make sense to gather opinions, such as concerns raised by lecturers and tutors, as well as hard facts.

McGregor agrees, suggesting that asking staff to record the outcomes of any interventions – including when they turned out not to be needed – will help work out which measures, and which combinations of them, are most useful in indicating problems.

Like any data-driven system, using analytics to identify students experiencing mental health crises relies on good-quality data. Murray says this is another reason to look at a range of information: “I wouldn’t rely on one piece of data to guide a decision.”

An early example of such a system, the University of Manchester’s StudentCRT, generates scores based on a number of factors, including non-attendance and failure to submit work. Following two years of use in Manchester’s school of physics and astronomy, the software’s designer, Dr Andrew Markwick, has set up a company to make it available to other universities.
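How StudentCRT weighs these factors is not described here, so the following is only a sketch of a weighted checklist over the two factors named above, with the weights and the concern_score function invented for illustration:

# Hypothetical weights; StudentCRT's actual scoring is not public.
CONCERN_WEIGHTS = {
    "missed_sessions": 2,      # per unexplained absence
    "missed_submissions": 5,   # per piece of work not handed in
}

def concern_score(missed_sessions: int, missed_submissions: int) -> int:
    """Combine simple counts into a single score that staff can use
    to decide whose circumstances to check on first."""
    return (CONCERN_WEIGHTS["missed_sessions"] * missed_sessions
            + CONCERN_WEIGHTS["missed_submissions"] * missed_submissions)

# Four missed sessions and one missed deadline give a score of 13.
print(concern_score(missed_sessions=4, missed_submissions=1))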

But will students accept such tracking, even if it aims to protect them? The National Union of Students (NUS) accepts there may be a role for technology in reducing harm. “However, many students will be concerned about an approach that ‘monitors’ them so closely, such as the nature of the data collected, who is able to access it, and how it could or will be used for other purposes. Students must be empowered to have a say in whether their data is shared and with whom,” says a spokesperson, adding that data security would be vital and such work should also consider underlying causes and include well-resourced mental health provision.

Jisc has a code of practice, developed with the NUS, to make sure learning analytics is carried out responsibly. It is also about to start looking at how “data trusts”, where personal information is held by a separate organisation set up to share data fairly and safely, could help students to trust such systems. And Murray acknowledges the risks of over-monitoring: “We have to be careful not to be creepy,” he says, adding that universities and colleges must involve students closely in deciding how such systems work.

Institutions could also make significant progress with new uses of the data they already hold, such as the times at which students submit work, he says. “If on the last 10 occasions, the same person has been submitting at four in the morning, that is an indicator, a piece of behaviour that we may possibly want to investigate.”
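A rough illustration of the kind of check Murray describes, in Python: the 10-occasion window and the small-hours cut-off echo his example, while the timestamp list and the late_night_submissions helper are assumptions made for the sketch rather than anything an institution actually runs:

from datetime import datetime

def late_night_submissions(timestamps, window=10, cutoff_hour=5):
    """Return True if every one of the student's last `window`
    submissions was made in the small hours, before `cutoff_hour`.
    On its own this proves nothing; it is one behavioural signal
    to be read alongside other data and human judgment."""
    recent = sorted(timestamps)[-window:]
    if len(recent) < window:
        return False  # not enough history to say anything
    return all(ts.hour < cutoff_hour for ts in recent)

# Ten submissions in a row around 4am would be worth a closer, human look.
history = [datetime(2019, 1, day, 4, 12) for day in range(1, 11)]
print(late_night_submissions(history))  # True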

In the UK, Samaritans can be contacted on 116 123 or email jo@samaritans.org. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org. Dr Dominique Thompson is speaking at Jisc’s Digifest conference on Tuesday 12 March.


