The Think Family Education App: A Lifeline or a Line Crossed?

In an era where technology is increasingly a cornerstone of our daily lives, a pioneering safeguarding app has recently come under fire. 
 
The Think Family Education app is a multi-agency safeguarding initiative run in partnership between Bristol City Council and Avon & Somerset Police. It provides pastoral care teams in schools with regularly updated information about a pupil's wellbeing and vulnerability. 

This data is available for all young people attending a school and includes: 

  • their status in the child protection/safeguarding system 

  • details of the lead professionals involved 

  • the services they are open to 

  • police information on domestic incidents 

  • whether they have been reported missing 

  • whether they have entered the criminal justice system. 

The app and associated database have been criticised as a form of intrusive monitoring that critics fear could perpetuate racism and classism. However, the council and police argue that, since its launch, it has 'helped protect hundreds' of children. 
 
While the app offers a promising avenue for child safety, it also raises serious ethical concerns. 

How Does It Work? 

The platform is designed to provide a holistic view of a child's wellbeing. It amalgamates data from various organisations, such as schools, social care and the NHS, to create a comprehensive profile for each child. 

By providing a 360-degree view of a child's life, it enables professionals to identify risks and vulnerabilities at an early stage, which in turn allows for timely interventions and support. 

The app uses a specially designed dashboard that displays various metrics and indicators related to a child's wellbeing. These range from academic performance and attendance records to more sensitive information such as family background and social circumstances. 

The dashboard is powered by data analytics: the app uses algorithms to sift through large amounts of data, flagging potential issues and even suggesting possible interventions. 
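The technical details of the platform have not been made public, but a minimal sketch can make the idea concrete. The PupilRecord structure, the flag_concerns rules and the thresholds below are assumptions invented purely for illustration; the real system's data model and rules are not known.

```python
from dataclasses import dataclass, field

# Illustrative only: the actual system's data model and rules are not public.
@dataclass
class PupilRecord:
    name: str
    attendance_rate: float          # e.g. from the school's records
    domestic_incidents: int         # e.g. from police data
    missing_reports: int            # e.g. from police data
    open_services: list = field(default_factory=list)  # e.g. from social care

def flag_concerns(record: PupilRecord) -> list:
    """Apply simple, hypothetical rules and return a list of flags."""
    flags = []
    if record.attendance_rate < 0.85:
        flags.append("low attendance")
    if record.domestic_incidents > 0:
        flags.append("domestic incident on record")
    if record.missing_reports > 0:
        flags.append("previously reported missing")
    return flags

pupil = PupilRecord("A. Pupil", attendance_rate=0.78,
                    domestic_incidents=1, missing_reports=0)
print(flag_concerns(pupil))  # ['low attendance', 'domestic incident on record']
```

Even a toy version like this shows the core design choice: data from several agencies is merged into a single record, and flags are surfaced to a human professional rather than triggering any automatic action.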

With budget cuts and staff shortages across the board, this is a huge help to those with responsibility for safeguarding.  

Racial Bias 

While the app's capabilities are groundbreaking, the issue of bias is particularly concerning. If the data sources themselves have inherent biases, the app could inadvertently perpetuate systemic issues like racism and classism. 

Machine learning and AI systems are only as unbiased as the data they are trained on. When these technologies learn from data that reflects societal prejudices, they can inadvertently perpetuate existing biases. Unfortunately, much of the data these systems are trained on does contain bias, even if it isn't immediately evident.
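To see how this happens mechanically, consider the deliberately simple, entirely hypothetical simulation below: two groups have identical underlying need, but the historical records apply a harsher flagging threshold to one of them. Any model trained to reproduce those labels inherits the disparity.

```python
import random

random.seed(0)

def historical_flag(group: str, need: float) -> bool:
    # Hypothetical label bias: a harsher threshold was applied to group B,
    # even though both groups have the same underlying level of need.
    threshold = 0.5 if group == "A" else 0.35
    return need > threshold

population = [(random.choice("AB"), random.random()) for _ in range(10_000)]
labels = [historical_flag(g, n) for g, n in population]

# A naive "model" that learns each group's historical flag rate will
# simply reproduce the bias baked into the labels it was trained on.
for group in "AB":
    flagged = sum(l for (g, _), l in zip(population, labels) if g == group)
    total = sum(1 for g, _ in population if g == group)
    print(f"group {group}: learned flag rate = {flagged / total:.2%}")
```

The two groups are identical by construction, yet group B ends up flagged at a markedly higher rate, and nothing in the training step can detect or correct that on its own.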

Griff Ferris, senior legal and policy officer at charity Fair Trials, said:  

“Schoolchildren should not be monitored, profiled and criminalised by secretive police databases. Surveillance is not safeguarding. 

“Systems like this uphold existing discrimination against children and families from minoritised ethnic and more deprived backgrounds. This system is expanding the net of surveillance. It should be shut down.” 

Assistant Chief Constable Will White, who leads on Avon & Somerset Police’s race matters work, said: 

“I recognise the concerns being raised by community and campaign groups about the use of the Think Family Education app and more broadly around our use of data analytics to prevent crime and safeguard the vulnerable. 

“Our motivation for using this app, in partnership with other agencies, is to protect and safeguard the most vulnerable from harm, support them and provide better services. But I understand there are concerns about disproportionality and the impact this might have on people from racially or ethnically minoritised or more disadvantaged backgrounds. 

“Neither the database nor app replace professional judgement or decision-making, and they do not assess the likelihood of an individual to commit a crime. They provide a vulnerability-based risk score based on a number of factors including whether the young person has previously been a victim of crime and whether they have previously been reported missing. 

“This score is designed to help guide and supplement the work of professionals and provide them with information about children at risk that they may not easily see. We do not use ethnicity to assess risk.” 
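The force has not published how the score is calculated, but taking ACC White's description at face value, a vulnerability-based score might in rough outline resemble the hypothetical weighted sum below. The factor names, weights and example are assumptions for illustration only.

```python
# Hypothetical factors and weights; the real model is not public.
VULNERABILITY_WEIGHTS = {
    "prior_victim_of_crime": 3,
    "previously_reported_missing": 2,
    "open_to_social_care": 2,
    "domestic_incident_at_home": 3,
}

def vulnerability_score(factors: dict) -> int:
    """Sum the weights of whichever vulnerability factors are present.

    Ethnicity is deliberately absent, matching the force's statement
    that it is not used to assess risk.
    """
    return sum(w for f, w in VULNERABILITY_WEIGHTS.items() if factors.get(f))

score = vulnerability_score({"prior_victim_of_crime": True,
                             "previously_reported_missing": True})
print(score)  # 5 - shown to the safeguarding lead, not acted on automatically
```

Note that even a score built only from vulnerability factors can still be indirectly skewed if the underlying records, such as missing-person reports, are themselves collected unevenly across communities.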

Data Protection Fears 

There are also fears about how the data is used, despite data protection safeguards being in place. Designated safeguarding leads who spoke to Fair Trials admitted they were concerned about how parents would react on learning that schools had access to this data. 

One commented: 

“They [parents and carers] wouldn’t know about this ... parents will have no kind of sight of it at all ... They just don’t know of its existence.” 
 
“I think there’s a bit of a risk it getting out there that schools hold this kind of central bank of information.” 

However, both the police and the council are determined to prove the app's efficacy. A spokesperson for Bristol City Council said: 

“The introduction of the Think Family Education app means that schools … have access to appropriate information in a secure and restricted way to make decisions about how they support children. There are strict controls in place about who can access this information, how they do this and the reasons why.” 

The council strongly believes that the app and database are fundamental to preventing relevant agencies from working in silos. The initiative enables effective multi-agency collaboration, but the justified fears and criticism suggest that a lot more work needs to be done. A safeguarding tool is not suitable if it directly discriminates. 
