942 comments
bArray · 23 hours ago
Too right, it was far more problematic than they ever made out.

> The UK government's demand came through a "technical capability notice" under the Investigatory Powers Act (IPA), requiring Apple to create a backdoor that would allow British security officials to access encrypted user data globally. The order would have compromised Apple's Advanced Data Protection feature, which provides end-to-end encryption for iCloud data including Photos, Notes, Messages backups, and device backups.

One scenario: somebody is in an airport and security officials are searching their device under the Counter Terrorism Act (where you don't even have the right to legal advice, or the right to remain silent). You may be a British person, but you could also be a foreign person moving through the airport. There's no time limit on when you may be searched, so anyone who has ever travelled through British territory could be searched by officials.

Let that sink in for a moment. We're talking about the largest back door I've ever heard of.

What concerns me more is that Apple is the only company audibly making a stand. I have an Android device beside me that regularly asks me to back my device up to the cloud (and makes it difficult to opt out); do you think Google didn't already sign up to this? Do you think Microsoft didn't?

Then think for a moment that most 2FA goes directly via a large tech company or your mobile. We're just outright handing over the keys to all of our accounts. Your accounts have never been less protected. The battle for privacy and security is being lost.
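For context on the 2FA point: SMS and push-based codes do route through a carrier or platform, but TOTP (RFC 6238) generates codes entirely on-device from a shared secret, with no third party in the loop at verification time. A minimal stdlib sketch (checked against the RFC 6238 Appendix B test vector):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation offset (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 Appendix B test vector (SHA-1, secret "12345678901234567890"):
secret = base64.b32encode(b"12345678901234567890").decode()
assert totp(secret, t=59, digits=8) == "94287082"
```

The code never leaves the device; only the shared secret, exchanged once at enrollment, has to be protected on both ends.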


ljm · 21 hours ago
Fundamentally, I think the issue is more about technical literacy amongst the political establishment, who consistently rely on the fallacy that having nothing to hide means you have nothing to fear. Especially in the UK, which operates as a paternalistic state and enjoys authoritarian support across all parties.

On the authoritarianism: these laws are always worded in such a way that they can be applied or targeted vaguely, basically to work around other legislation. They will stop thinking of the children as soon as the law is put into play, and it's hardly likely that pedo rings or rape gangs will be top of the list of priorities.

On the technical literacy: the government has the mistaken belief that their back door will know the difference between the good guys (presumably them) and the bad guys, and the bad guys will be locked out. However, the only real protection is security by obscurity: it's illegal to reveal that this backdoor exists or was even requested. Any bad guy can make a reasonable assumption that a multinational tech company offering cloud services has been compromised, so this just paints another target on their backs.

I've said it before, but I guarantee that the monkey's paw has been infinitely curling with this, and it's a dream come true for any black or grey hat hacker who wants to try and compromise the government through a backdoor like this.


AlanYx · 20 hours ago
Many people might not be aware of it, but Apple publishes a breakdown of the number of government requests for data that it receives, broken down by country.

The number of UK requests has ballooned in recent years: https://www.apple.com/legal/transparency/gb.html#:~:text=77%...

Much of this is likely related to the implementation and automation of the US-UK data access agreement pursuant to the CLOUD Act, which has streamlined this type of request by UK law enforcement and national security agencies.


anoncow · 19 hours ago

    >Online privacy expert Caro Robson said she believed it was "unprecedented" for a company "simply to withdraw a product rather than cooperate with a government."
That is such a self-serving comment. If Apple provided the UK a backdoor, it would weaken all users globally. By withdrawing instead, Apple is following local law, and the country gets what its rulers want. These experts are a bit much. In the next paragraph she says something ominous:

    >"It would be a very, very worrying precedent if other communications operators felt they simply could withdraw products and not be held accountable by governments," she told the BBC.


ComputerGuru · 22 hours ago
Note that this doesn’t satisfy the government’s original request, which was for worldwide backdoor access into E2E-encrypted cloud accounts.

But I have a more pertinent question: how can you “pull” E2E encryption without data loss? What happens to those who had it enabled?

Edit:

Part of my concern is that Apple's defense against backdooring E2E rests on the (US) doctrine that work cannot be compelled. Once Apple builds a "disable E2E for this account" capability, it becomes much harder to argue that using it would compel new work (or speech, if you prefer), because the capability already exists.
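One way to read the "pull without data loss" question: since only the client holds the user key, any migration off E2E has to happen on-device, by unwrapping the content key with the user's key and re-wrapping it under a provider-held key. The server alone cannot do it. A toy sketch of that key-rewrap step (XOR stands in for a real key-wrap algorithm like AES-KW; none of the names here reflect Apple's actual scheme):

```python
import os

def xor_wrap(key, kek):
    # Toy stand-in for real key wrapping (e.g. AES-KW, RFC 3394).
    # XOR with a same-length key-encryption key is for illustration only.
    return bytes(a ^ b for a, b in zip(key, kek))

content_key = os.urandom(32)   # hypothetical: encrypts the user's cloud data
user_kek = os.urandom(32)      # derived on-device; server never sees it (E2E)
provider_kek = os.urandom(32)  # held server-side under standard protection

wrapped_e2e = xor_wrap(content_key, user_kek)  # E2E state: server can't unwrap

# Client-side migration: unwrap with the user key, re-wrap for the provider.
unwrapped = xor_wrap(wrapped_e2e, user_kek)
wrapped_standard = xor_wrap(unwrapped, provider_kek)

# The data itself is never re-encrypted, so nothing is lost -- but the
# provider can now recover the content key on its own.
assert xor_wrap(wrapped_standard, provider_kek) == content_key
```

The upshot: there is no data loss, but the migration requires each client to participate, which is presumably why such a change rolls out per-account rather than as a server-side switch.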
