UK offers cash for CSAM detection tech targeted at e2e encryption

The UK government is preparing to spend more than half a million dollars to encourage the development of detection technologies for child sexual abuse material (CSAM) that can be bolted onto end-to-end encrypted messaging platforms to scan for illegal content, furthering its ongoing policy push around internet safety for children.

In a joint initiative today, the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) announced the “Safety Tech Challenge Fund” – which will distribute up to £425,000 (~$584k) among five organizations (£85k/~$117k each) to develop “innovative technology to keep children safe in environments such as online messaging platforms with end-to-end encryption”.

A challenge statement for applicants to the program said the focus is on solutions that can be deployed in an e2e encrypted environment “without compromising user privacy”.

“The problem that we’re trying to fix is essentially the blindfolding of law enforcement agencies,” a Home Office spokesperson told us, arguing that if tech platforms “go ahead with their full end-to-end encryption plans, as they are at present… we will be completely hindered in being able to protect our children online”.

Although the announcement did not name any specific platforms of concern, Home Secretary Priti Patel has previously attacked Facebook’s plans to expand its use of e2e encryption – warning in April that the move could jeopardize the ability of law enforcement to investigate child abuse.

Facebook-owned WhatsApp also already uses e2e encryption, so that platform is a clear target for whatever ‘safety’ technologies might emerge from this taxpayer-funded challenge.

Apple’s iMessage and FaceTime are among other current mainstream messaging tools that use e2e encryption.

Therefore any ‘child protection technology’ developed through this government-backed challenge has potential for widespread use. (According to the Home Office, the technologies submitted for the challenge will be evaluated by “independent academic experts.” The department was unable to provide details on who would actually assess the projects.)

In the meantime, Patel continues to exert high-level pressure on the tech sector on the issue – including with the goal of garnering support from G7 counterparts.

Writing in a paywalled op-ed in the Tory-friendly Telegraph newspaper, she trails a meeting she is chairing today at which she says she will push the G7 to collectively pressure social media companies to do more to tackle harmful material on their platforms.

“The introduction of end-to-end encryption must not open the door to even greater levels of child sexual abuse. Hyperbolic accusations from some quarters that this is really about governments wanting to snoop and spy on innocent citizens are simply untrue. It is about keeping the most vulnerable among us safe and preventing truly despicable crimes,” she adds.

“I am calling on our international partners to back the UK’s approach of holding technology companies to account. They must not let harmful content continue to be posted on their platforms or disregard public safety when designing their products. We believe there are alternative solutions, and I know our law enforcement colleagues agree with us.”

In the op-ed, the home secretary also singles out Apple’s recent move to add CSAM detection tools to iOS and macOS, which would scan content on users’ devices before it’s uploaded to iCloud – welcoming the development as a “first step”.

“Apple state their child sexual abuse filtering technology has a false positive rate of 1 in a trillion, meaning the privacy of legitimate users is protected while those building vast collections of extreme child sexual abuse material are caught out. They need to see that project through,” she writes, urging Apple to go ahead with the (currently delayed) rollout.

Last week the iPhone maker said it would delay implementing the CSAM detection system – following a backlash led by security experts and privacy advocates who raised concerns about vulnerabilities in its approach, as well as the contradiction of a ‘privacy-focused’ company carrying out on-device scanning of customer data. They also flagged the wider risk of the scanning infrastructure being seized upon by governments and states, which might order Apple to scan for other types of content, not just CSAM.

Patel’s description of Apple’s move as just a “first step” is unlikely to do anything to assuage those concerns – since, once such scanning infrastructure is baked into e2e encrypted systems, it will surely become a target for governments to widen the scope of what commercial platforms must legally scan for.

However, a Home Office spokeswoman told us that Patel’s comments on Apple’s CSAM technology were intended only to welcome its decision to take action in the area of child safety – rather than to endorse any specific technology or approach. (And Patel also writes: “But this is just one solution, by one company. More investment is needed.”)

The Home Office spokeswoman would not comment on what types of technologies the government is aiming to support through the challenge fund, either, saying it is exploring a range of solutions.

She told us that the broader goal is to support ‘middleground’ solutions – denying that the government is trying to encourage technologists to come up with ways to backdoor e2e encryption.

In the UK in recent years GCHQ has also floated the controversial idea of a so-called ‘ghost protocol’ – which would allow state intelligence or law enforcement agencies to be invisibly CC’d into encrypted communications by service providers on a targeted basis. That proposal drew widespread criticism, including from the tech industry, which warned it would undermine trust and security and threaten fundamental rights.

It is unclear whether the government considers such an approach to be meaningfully different from the ‘middleground’ technologies it is now trying to encourage – which would, albeit with a CSAM focus, specifically be able to scan e2e encrypted content for illegal material.

In another related development, earlier this summer, guidance put out by the DCMS for messaging platforms recommended that they “prevent” the use of e2e encryption for child accounts altogether.

When asked about this, a Home Office spokeswoman told us that the tech fund is not being run in isolation, and that the government is “trying to find a solution in the middle”.

“It’s about working together and bringing academics and NGOs into the field so that we can find a solution that works both for what social media companies want to achieve and also makes sure we are able to help protect children,” she said. “We all need to come together and see what can be done.”

There isn’t much clarity in the Home Office’s guidance for suppliers applying for an opportunity to receive a tranche of funds.

It states that proposals should “make innovative use of technology to enable more effective detection and/or prevention of sexually explicit images or videos of children”.

“Within scope are tools which can identify, block or report either new or previously known child sexual abuse material, based on AI, hash-based detection or other techniques,” it further notes, adding that proposals need to “address the specific challenges posed by e2ee environments, considering opportunities to respond at different levels of the technology stack (including client-side and server-side)”.
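As a rough illustration of the hash-based detection approach the guidance mentions, the sketch below (Python; all names and hash values are hypothetical, not drawn from any real system) checks a file’s digest against a list of known-material hashes on the client, before the content would be encrypted and sent. It is a deliberate simplification: production systems use perceptual hashes (PhotoDNA-style) rather than cryptographic ones, so that resized or re-encoded copies of known images still match.

```python
import hashlib

# Hypothetical set of hex digests of known illegal files, as might be
# distributed to clients by a safety organization (placeholder value:
# this entry is simply sha256(b"test"), used here for illustration).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes) -> bool:
    """Client-side check run before content is encrypted for sending.

    Note: a cryptographic hash only matches byte-identical files; real
    deployments use perceptual hashing so altered copies still match.
    """
    return sha256_digest(data) in KNOWN_HASHES

print(matches_known_material(b"test"))       # True: digest is on the list
print(matches_known_material(b"other"))      # False: unknown content
```

Running such a check client-side, before encryption, is what makes this a ‘middleground’ proposal: the match happens where the plaintext still exists, so the e2e encryption of the transport itself is untouched – which is precisely why critics describe it as scanning infrastructure rather than a privacy-preserving compromise.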

General information about the challenge – which is open to applicants located anywhere, not only in the UK – can be found on the Safety Tech Network website.

The deadline for applications is October 6.

Selected applicants will have five months between November 2021 and March 2022 to complete their projects.

Exactly when any of this technology might be pushed out into the commercial sector isn’t clear – but the government may be hoping that, by keeping up the pressure on the tech sector, platform giants will develop this stuff themselves, as Apple has been doing.

The challenge is just the latest UK government initiative to push platforms to align with its policy priorities – back in 2017, for example, it was pressuring them to build tools to block terrorist content – and you could argue it is a form of progress that ministers are not simply calling for e2e encryption to be outlawed, as they frequently have in the past.

That said, talk of ‘preventing’ the use of e2e encryption – or even vague suggestions of ‘middleground’ solutions – may not be so different in practice.

What is different is the sustained focus on child protection as the political lever to make platforms fall in line. And it appears to be getting results.

The government’s broader plan to regulate platforms – set out in a draft Online Safety Bill published earlier this year – has yet to go through parliamentary scrutiny. But one change is already baked in: the country’s data protection watchdog is now enforcing a children’s design code which stipulates that platforms need to prioritize kids’ privacy by default, among other recommended standards.

The age-appropriate design code was added as an amendment to the UK’s data protection bill – meaning it sits under wider legislation that transposed Europe’s General Data Protection Regulation (GDPR) into law, which brought in supersized penalties for violations such as data breaches. And in recent months several social media giants have announced changes to how they handle children’s accounts and data – changes the ICO has credited to the code.

So the government may be feeling confident that it has finally found a blueprint for bringing tech giants to heel.
