The UK government has partnered with Microsoft to develop a deepfake detection evaluation framework aimed at countering the growing misuse of AI-generated content. The initiative was announced by the Home Office as part of its broader efforts to address digital fraud and online abuse.
The proposed framework will bring together experts from leading technology firms and academia to assess how advanced technologies can be used to identify, analyse, and decode deepfake content more effectively.
A key objective of the project is to establish industry-wide standards for deepfake detection, helping organisations and platforms respond more consistently and reliably to emerging threats.
UK Tech Secretary Liz Kendall warned that deepfakes are increasingly being weaponised by criminals to defraud the public, exploit women and girls, and erode trust in digital media and information.
According to the Home Office, the scale of the problem is escalating rapidly. An estimated eight million deepfake videos were shared in 2025, a sharp rise from just 500,000 in 2023. In 2024, deepfakes impersonating Sir Keir Starmer and Prince William were reportedly used in cryptocurrency scams.
As part of its response, the UK government has made the creation and sharing of non-consensual sexually explicit deepfakes a criminal offence. Andrea Simon, Director of the End Violence Against Women Coalition (EVAW), welcomed the move but stressed that responsibility should not fall solely on victims, urging platforms to take stronger action.
Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls, said the framework aims to expose criminal tactics, close regulatory loopholes, and hold technology companies accountable. The announcement comes amid global concern, with countries such as Singapore also warning businesses about a rise in corporate deepfake scams impersonating senior executives.