February 17, 2026 (updated 18 February 2026, 12:01pm)

As Keir Starmer weighs curbs on children’s access to AI, experts urge schools to embrace it cautiously

Prime Minister Keir Starmer has moved to limit children’s access to AI chatbots – but how should elite schools respond to the fast-changing technology?

By Christian Maddock

Elite educators need to embrace innovative technology cautiously, experts told Spear’s.

The ‘disgusting’ results of certain generative AI chatbots need to be effectively addressed, Keir Starmer said yesterday morning, adding that he was considering limiting children’s access to the technology.

During a visit to a community hub, Starmer said: ‘We’re looking at access for under 16s. Should it be banned altogether? Should it be much stricter content control? And what goes with that, by the way, is how do we make sure children don’t operate to get around the rules?’

This comes as the Labour government announced plans to bring AI chatbots under the remit of the Online Safety Act, which protects children and adults from accessing harmful material online by placing strict duties of care on social media companies and search engines.

[See also: What is Altruist? The AI start-up shaking up wealth management in Britain]

Private schools are leading the way in tackling the AI conundrum, though there is still progress to be made, according to research from the Sutton Trust.

Private school teachers are twice as likely to have received formal AI training as their state school counterparts, with 45 per cent compared with 21 per cent having been taught how to use LLMs effectively, according to the trust’s data.

However, strategy is lacking across the board. While private schools are three times as likely as state schools to have a school-wide AI strategy in place (27 per cent versus nine per cent), the same figures show that almost three-quarters of independent schools still lack an established AI policy.


Dependence on AI chatbots has increased rapidly since the technology’s relatively recent emergence; ChatGPT launched less than four years ago, in November 2022.

More than a quarter of Gen Z – those born between 1996 and 2012 – use ChatGPT as their search engine of choice, according to a survey by Adobe. The same survey found that 80 per cent of Gen Z respondents had used ChatGPT as a search engine at least once, and 54 per cent said they use the tool to summarise complex topics quickly.

The number one threat AI poses to children is the potential erosion of critical thinking skills, says educational expert Thomas Harley of the private tutoring and home-schooling firm HRB Education.

‘Chatbots can give you the exact answer you are looking for,’ he says. ‘If you ask how something works, it gives you the exact answer that you need. As a student, this means that there is no work involved to get the answer.’

Harley also notes the risk of children treating information provided by AI as well researched and factual, when in reality LLM platforms are still refining the accuracy of the information they deliver.

‘Sometimes AI provides you with information that is clearly not correct. People don’t really question it at a younger age; they view technology as all encompassing and never failing,’ he says.

However, Harley believes there is space for AI chatbots within an educational setting, provided they are carefully monitored.

‘One of the keys is to ensure that the AI chatbot the child is using is safe and is not going to give ridiculous results that you might previously have seen on the X platform, for example,’ he notes.

Harley adds that universities may soon welcome applicants with high levels of AI literacy as they seek to embrace the technology.

‘I can definitely see a situation in the future where applicants to universities are saying “not only can I use AI, but more importantly I understand how the coding elements behind its language model work, and have the ability to truly work with it”,’ he says. ‘This could be a very interesting way to frame yourself as an applicant in the future. Universities, as we know, are desperate to incorporate AI into their operations.’

[See also: Family offices chase AI but governance and infrastructure lag]

Both the University of Oxford and Yale University, two of the world’s leading educational institutions, have publicly endorsed AI chatbots. Students at Oxford have free access to ChatGPT Edu, which includes enhanced data privacy functions to protect research, while Yale students can use an LLM managed by the university itself. Both institutions have been explicit that students are not permitted to use AI to write their work for them.

Applicants to elite universities today should not promote their use of AI, says Stephen Newall of university preparation firm Getting In.

‘The university candidates I work with would not dream of using AI for their applications because it’s a fundamental no-no,’ says Newall. ‘Anyone can spot AI from a mile off. While it may be structured in some ways and fairly well written, it’s always robotic.’

He adds: ‘If they use these tools, it raises questions of: How much has the tool benefited them? How much has it written for them? Is it really the student’s work, or is it the AI’s?’

There are consequences for the misuse of AI at university, with almost 7,000 cases of students using AI tools to cheat between 2023 and 2024, according to research by the Guardian.

With leading independent schools such as Eton College and Harrow School encouraging open debate, and historic organisations such as the Oxford Union producing some of the UK’s most prominent political leaders and thinkers, learning beyond the remit of technology remains highly valued.

Thinking for oneself is central to developing critical thinking skills, and the overuse of AI chatbots risks limiting children in this regard, says Jess Harris, head of Quintessentially Education, an education consultancy and tutoring provider.

‘Learning is also a social activity. Discussion, debate, disagreement, and collaboration in real time helps children develop empathy, confidence, and interpersonal skills,’ she says. ‘Replacing peer interaction with AI responses risks narrowing those experiences.’

‘But, when used responsibly, chatbots can also support struggling learners and those with additional learning needs,’ Harris adds. ‘They can provide supplementary tutoring support, rather than a substitute thinker, and provide practice explanations.’

Echoing this more pragmatic view of AI as an inevitable part of the future, Harley concludes: ‘Everyone’s going to go into a career that involves technology, so there’s no point in hiding it.

‘It just needs to be understood that AI needs to be used as scaffolding to build children’s learning up, rather than as a crutch.’

[See also: Building resilient portfolios in 2026: how investors can navigate AI, commodities and market risks]
