Apple has defended its new system that scans users' phones for child sexual abuse material (CSAM), after a backlash from customers and privacy advocates.
The technology searches for matches of known abuse material before the image is uploaded to its iCloud storage.
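Apple's actual system uses a perceptual hash (NeuralHash) compared against a database via a cryptographic private-set-intersection protocol, so this is only a rough illustration of the idea of matching against known material before upload. The sketch below uses a plain exact hash, and every name and value in it is hypothetical:

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery.
# Real systems use perceptual hashes, not exact SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def upload_to_cloud(image_bytes: bytes) -> str:
    """Check the image on-device before upload; flag a match instead of uploading."""
    if matches_known_material(image_bytes):
        return "flagged"
    return "uploaded"
```

Note that an exact hash only matches bit-identical files; the point of a perceptual hash is to survive resizing and re-encoding, which is also what makes its failure modes (false matches) controversial.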
Critics warned it could be a "backdoor" to spy on people, and more than 5,000 people and organisations have signed an open letter against the technology.
As a result, Apple has pledged not to "expand" the system for any reason.
Digital privacy campaigners warned last week that authoritarian governments could use the technology to bolster anti-LGBT regimes, or crack down on political dissidents in countries where protests are deemed illegal.
This one is a bit of an old chestnut.
On the one side:
No one is opposed to combating child abuse, and measures that protect children are good.
On the other:
Once there is a backdoor, for whatever purpose, it offers governments, corporations and individuals a way to access others' data. There's no point in saying it will be limited to governments or to Apple itself, or citing the layers of legal protection that will exist: what one person can create, another can duplicate.
My take:
The backdoor exists - that bell cannot be unrung. We can either regulate and control its use, or watch it be abused under a variety of convenient fictions about 'tip-offs' and anonymous sources.
So accept the reality, build legal protections around it, and safeguard people's privacy by giving them the power to sue if it is improperly violated.