Feature: Jisc reports on its approach to assessing a student wellbeing initiative conducted at Northumbria University
One result of the Covid-19 pandemic for universities was that concerns over the wellbeing of students climbed up their list of priorities.
They became more concerned with how the isolation of lockdown had affected mental health – a University College London report stated that 64% of students reported a negative impact – and are now also looking at how new student cohorts adjust to post-pandemic life, with its more typical concerns such as dealing with independent study, managing finances, and living away from home.
There is also a growing awareness of the potential to support student wellbeing with data – and that realising this potential requires good governance, as outlined by tertiary education technology agency Jisc.
The organisation has recently run an evaluation of a pilot project at Northumbria University to identify students at risk of low wellbeing, and found that this could be predicted with 80% accuracy.
The predictions were generated by a machine learning tool trained on data collected automatically by university systems. These included log-ins to the university virtual learning environment (VLE) and card swipes into departmental libraries.
The project, which began in 2018-19, also involved a dashboard listing the students identified as being at risk of low wellbeing. This was reviewed manually by a dedicated member of the university’s mental health and wellbeing team, who could then send an automated email to the student or arrange a phone call.
The Jisc evaluation focused on three elements – the machine learning model, the model of consent for data governance and the success of the dashboard.
Jim Keane, senior analytics consultant for wellbeing at Jisc, believes the Northumbria project – commissioned by the Office for Students – is unique within a long history of attempts to support students with their wellbeing, attempts that have met with varying success.
“In terms of evaluation, it’s had a narrow and appropriate focus,” he says. “Before asking the question – can we develop a tool to predict the risk of a student experiencing low wellbeing? – you need to ask who reacts to what the analytics are telling you. This is often missed.”
Having spent years working in student engagement at the University of Hull, he says universities often make academic supervisors responsible for pastoral care.
“[Some supervisors] do exactly the right thing at the right time, have good interpersonal skills and knowledge of the support infrastructure – but there aren’t many of those,” he says.
Pressure on tutors
More often, personal tutors are too busy to provide student support, or are too keen to provide counselling services they’re unqualified to deliver. Where universities have asked faculty administrators to make phone calls to students who may have depression, those staff are often scared of saying the wrong thing.
Crucial to the Northumbria project was recruiting an additional staff member to the mental health and wellbeing team to look at the output from the analytics.
“Northumbria had designed into the project from the get-go who was going to respond to the tool and the actions they were going to take – so it didn’t carve into existing resources,” Keane says.
The Northumbria analytical model was built using five separate questionnaires based on the World Health Organisation Five Well-Being Index (WHO-5), a self-reported index of mental wellbeing.
The questionnaires were sent to the approximately 35,000 students attending Northumbria University. Keane’s evaluation found that 75% returned at least one questionnaire.
The questionnaire data was combined with 800 different indicators of student activity and behaviour, such as log-ins to the VLE, along with static indicators related to wellbeing, such as English as a second language.
These were run through a machine learning algorithm that used the random forest technique – which combines multiple decision trees to reach a single result – to predict the likelihood that a particular student was at risk of low wellbeing.
The algorithm has been evaluated on non-training data, Keane says, but no results have yet been released on its benefits – for example, whether identifying students at risk of low wellbeing translates into better mental health outcomes.
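As an illustrative sketch only – not the Northumbria pipeline, and using synthetic data with invented feature dimensions in place of the real activity indicators and WHO-5-derived labels – a random forest classifier of this kind might be trained and then evaluated on held-out data like this:

```python
# Sketch under assumptions: synthetic data stands in for the ~800 activity
# indicators (e.g. VLE log-ins, library card swipes); labels are fabricated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_students, n_features = 1000, 20  # far fewer features than the real project

# Behavioural/static indicators as a numeric feature matrix
X = rng.normal(size=(n_students, n_features))
# Synthetic "low wellbeing" label, loosely tied to low values on a few features
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=n_students) < -1).astype(int)

# Hold out data the model never sees during training, mirroring the
# evaluation on non-training data described above
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Random forest: many decision trees whose predictions are combined
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

acc = accuracy_score(y_test, model.predict(X_test))
print(f"Held-out accuracy: {acc:.2f}")
```

Note that plain accuracy can flatter a model when few students are actually at risk; a real evaluation would also examine precision and recall for the at-risk class.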
Many people regard data use for wellbeing projects as ‘scary’ because it raises privacy and data governance concerns. That’s according to Andrew Cormack, chief regulatory officer at Jisc.
Transformation and governance
Critical to the success of this project, Keane explains, is that the university had already undergone 10 years of digital transformation, including good governance practices and centralising data access.
“The benefit of digital transformation happening first is that the institution doing good things with centralised data is part of the landscape,” Cormack says. “If there isn’t that institutional context, it’s a harder sell.”
The project also relied on secondary data that had a primary use elsewhere – for example, measuring student engagement by whether a student had recently swiped their card to enter a library, data whose primary purpose is university security.
Secondary data use means the data isn’t being collected to track students, but – according to Cormack – it’s still important to ensure consent for that secondary use.
“If you look at newspaper headlines, secondary data use is scary,” he says. If there isn’t consent, the primary use of the data can be compromised – for example, by students holding doors open to avoid using their swipe cards.
Virtue of transparency
He adds that institutions can avoid these problems by being transparent about what data is collected and how it is being used, saying: “Logging onto a VLE or using a swipe card creates an obvious place to inform the individual and community.”
It is also essential to talk to students and staff before embarking on a project. If they don’t seem comfortable with the plans, don’t expend time and effort on data use.
Finally, “don’t let perfect be the enemy of the good,” Keane says. “If you don’t have a sophisticated metric for student engagement, it’s probably enough to notice when you last saw them and follow up.”