
What is Apple thinking?

Ben Halpern on August 09, 2021

It's been a few days. You've probably read about this: Apple plans to scan US iPhones for child abuse imagery. Apple intends to install...
Renan "Firehawk" Lazarotto

I was completely unaware of this, but I still think Apple's take on this is way more invasive than Google's. According to the link you provided, Google scans images sent or received through Gmail, which technically means they're not actively scanning people's devices. Apple wants to proactively scan your gallery.

While I agree that their intentions seem good, how are they going to rule out false positives, like a picture taken on a hot day at the beach? And even "worse": how can I be sure they won't scan more stuff on my phone without my consent? Whilst I don't have anything to hide, I'd rather keep my stuff private.

leob • Edited

It's not quite like that - they're not actively "scanning" and then using AI or whatever; instead they simply calculate a file hash (checksum) and compare it to a database of hashes of known (confirmed) CP imagery ... that's probably about the most robust and least questionable way of implementing this kind of thing.

Renan "Firehawk" Lazarotto

Hey, thanks for the answer! So this means they won't get the images, just the hashes, right? Is there any place I can read more about how it'll work? I'm really interested in understanding more.

leob

Yup, it's in their FAQ; it explains very clearly how it works:

apple.com/child-safety/pdf/Expande...

What they do is download a small database (list) of those hashes to your iPhone (the list gets updated regularly); then, when a file is uploaded to iCloud, the device simply computes its hash/checksum and compares it to the list.
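
To make that concrete, here's a minimal sketch of the matching flow in Python. It's illustrative only: Apple's real system uses a perceptual "NeuralHash" and a blinded on-device table rather than a plain SHA-256 checksum, so the hash function, the list, and the names here are all assumptions.

```python
import hashlib

# Hypothetical on-device list of known-bad image hashes, refreshed
# periodically. Apple's real database holds blinded perceptual hashes
# (NeuralHash), not plain SHA-256 values; this only illustrates the flow.
KNOWN_BAD_HASHES = {
    "3e5a...",  # placeholder entries, not real hashes
}

def file_hash(path: str) -> str:
    """Compute a checksum of the file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_list(path: str) -> bool:
    """At iCloud-upload time: does this photo match the on-device list?"""
    return file_hash(path) in KNOWN_BAD_HASHES
```

The key property is that only hashes get compared; the photo itself never has to leave the device for the check to happen.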

Ben Halpern

Apple had been positioning itself so distinctly against where Google stood on all of this. This was your option if you weren't in favor of the Google approach.

There's even a universe where Apple outlines specifically how and why this needs to work in this particular way and offers an explicit answer to allay any slippery-slope concerns, but it just doesn't seem that way.

Chris Dodds

John Gruber has some good analysis on this: daringfireball.net/2021/08/apple_c...

Massimo Artizzu

I'm very concerned.

Of course they said a lot of things to reassure people, like:

  • it's only for CSAM images;
  • only for iCloud images;
  • only against a secure database of hashes;
  • they'll refuse to add other kinds of hashes;
  • only in the USA.

I must note that this system is only going to catch dumb criminals; the rest may well opt for other, more private ways to exchange their stuff. So I'm unsure whether this will do any good... But if it catches even one, it's all good, right?

Well, not really. The point is: we must rely on Apple's good will to keep things that way. They can remove any of the above limitations at any moment. The tool can potentially scan anything on the phone.

And I can give Apple the benefit of the doubt - I'm sure they don't intend to stray from their stated intentions - but what if they are forced to? Are they just going to say "no"?

What if country A tells Apple: "Enable your scanning thing here and use this database of hashes instead, or else pay this huge fine and/or be booted from our market"? Easy abuse of a system born with good intentions.

Remember how China has been oppressing the Uyghurs in the Xinjiang region? They installed scanning malware on their phones.

So it's not just a slippery slope, it's a shaky step covered in grease.

Now for the worst-case scenario: Apple developed this technology because they were actually asked to. And we'll probably never know who commissioned it, even if we have our suspicions.

The system might already be active outside the US, for all we know. Or the US version could be a large-scale test prior to a worldwide release.

All this leads me to think that if a company that used to proudly shout "what's in your phone, stays in your phone" ends up dismissing all the controversy as a "screeching minority", it's pretty clear that they don't value privacy because they love human rights, but because it's a flag to wave exclusively for marketing purposes.

We don't need this.

Pablo Tejada

Fuck privacy! This is great. It's for a great cause.

leob

Are you serious, or is this irony? Many things said to be "for a great cause" have turned out to be disastrous.

Pablo Tejada • Edited

Most people that think like yourself live in a bubble. A bubble with a seed that someone else planted. I could care less about your privacy. Nothing in life is perfect. It's all about supporting the lesser evil.

Besides, if you have nothing to hide you have nothing to worry about.

leob • Edited

You're assuming a lot, aren't you? First of all, "most people that think like yourself ..." - so you claim to know exactly what or how I think? Maybe you're clairvoyant; it can't be based on my comment (I didn't even mention privacy).

Another nice one: "a seed that someone else planted" - oh yeah, give me a break LOL ... yes, sure, I'm totally brainwashed and can't think for myself unless someone else "plants the seed". But you are such a critical and independent thinker, right? So the "if you have nothing to hide" argument is your original thought? How come, then, that I've heard it so many times? It sounds like a cliché, and a classic fallacy if there ever was one.

And then, to top it off, a nice put-down: "I could care less about your privacy". Oh, such a nice, friendly and polite way to discuss things - yes, we really need this level of discourse here on dev.to!

Pablo Tejada

Who needs privacy? Privacy is overrated.

rhymes • Edited

"Activist X is an enemy of the state, please check their phone". Said every corrupt head of state ever who jailed people labelled as dissidents (super recent example: the president of Belarus jailing activists and even sports athletes)

If you think privacy is useless because you are a good citizen and you never committed a crime, then you don't really know what privacy is for :)

Massimo Artizzu

Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.

Edward Snowden

Wyatt Blake

Funny how Apple calls it 'Expanded Protections for Children' yet they use children to assemble their phones.

leob

I also wonder why Apple has taken this step, at this moment. Maybe some gentle urging from the direction of US authorities or government?

I can't help but notice how companies like Google, Apple and so on have all these lofty and pompous ideals and principles, only to drop those the moment they think they need to please the authorities.

leob

Ah, really? Yes, well, that is in fact quite lenient ... I agree with you on all counts, also about the media coverage. It's never good to jump to conclusions too fast and vilify a company or an individual before knowing all the facts.

Renan "Firehawk" Lazarotto

Thanks a lot for this. I'm going to read it in its entirety, because just the first page got me even more interested - the idea of matching known hashes on-device is incredible, and now that I've learnt a bit more about it I'm not as concerned about privacy as I was yesterday.

leob

Yes, I agree that this seems like a clean approach - if they did shady things like unleashing AI algorithms on images to guess "what it is" and then framing people based on that, that would definitely be a big no-no. This thing with the hashes, though, seems about the cleanest approach you could come up with.

Karan Gandhi

I liked Dr. Neal Krawetz's take on this issue.
hackerfactor.com/blog/index.php?/a...

Not sure why Apple is doing this, or rather why Apple chose to disclose it now. It's very odd.

Kasey Speakman • Edited

Based on the headlines I was ready to be upset with Apple. But everything is clickbait nowadays. After reading more of the details I don't have as much of a problem with it.

  • On-device content-based scanning is opt-in
    • only for child accounts (12 and under)
    • only parent accounts are notified
  • iCloud scanning is opt-out (by turning off iCloud Photo Library sync)
    • scanning is for fingerprints of known child-explicit images from the CSAM database

I am concerned what doors this opens in the future for privacy invasion. However I think the only comprehensive way to address this concern is with laws which guard digital privacy. Otherwise policy is up to each company's leadership. And even if I believed they were doing things the "right" way for privacy now, leadership eventually changes.

leob • Edited

I tend to agree. I've read Apple's FAQ and their approach does look focused and targeted; it's not a broad-sweep, big-brother kind of privacy invasion (there's also no automatic reporting to law enforcement, which would arguably be a bridge too far).

I'd even go further than this: I'd be fine with them filtering/flagging other horrible stuff (domestic violence, animal abuse, whatever) with this hash technology, if they've got reliable databases of those - but their response should be to warn the user trying to upload it and tell them to stop or risk termination of their iCloud service.

And of course state all of this clearly in their user agreements.

More than happy with ways for them to stop horrible stuff being stored on their cloud (well yeah, it's their cloud alright).

leob • Edited

Hold on, can you substantiate the claim that Google's apps are spyware, or are they rather just collecting anonymous usage data for the benefit of their advertisement business? IMO you can't just equate those two ...

In my mind, companies like Apple and Microsoft, with their obscene profit margins, are just as "evil" as Google (if not more so) - their business model is just different, which allows those two to act sanctimoniously toward Google when it comes to privacy (but Apple and MS are detestable in other ways).

Yeah, maybe we should blame ourselves, but OTOH maybe most of us don't care that much ... and even if everything were open source, someone has to pay for hosting and running the services - or would we all suddenly take out subscriptions instead of enjoying freebies? If you give people a choice, most of them will take the freebies with ads rather than a subscription.

leob • Edited

Ben, is the statement that "Apple intends to install software on American iPhones" factually correct? Is Apple planning to install software on US iPhones, or will they put systems in place to scan images stored in iCloud by American users? Big difference, if you ask me (although even then it's still a form of surveillance and "big brother").

What irks me most is that they would scan for stuff like child porn but not other vile imagery - domestic abuse, animal abuse, and I could go on. Why take measures against one form of abuse but not another? Is there some agreed-upon hierarchy of evilness? It reeks of hypocrisy, and it's a slippery slope; that's why companies should refrain from this.

Jay Jeckel

Yes, they are going to install software on the phones. The FAQ Apple put out, as well as the rest of the released documentation, is very clear that all scanning will be done on-device. See the FAQ PDF for details.

As for your second paragraph: on one side, Apple has hashes for CP images that already exist, where both the acts depicted and the images themselves are illegal. On the other side, while acts of domestic and animal abuse are illegal, images of those acts generally aren't illegal to possess. The stated purpose of Apple's program is to combat possession of illegal CP images, not to stop the perpetration of illegal acts, so there's nothing hypocritical about that aspect of it.

I do agree it's a slippery slope that no company should go down.

leob • Edited

Thanks for clarifying - sound reasoning ... possessing those images is indeed illegal, so I think they have a pretty strong case in saying: we just don't want this stuff in our cloud, ergo we need to block it ... because, well, Apple could even be held liable for storing it on their servers and be complicit in a crime.

And with the hashing technology they arguably have the least questionable approach that you can think of. So yeah slippery slope, still, but there is something to be said for this.

(If they didn't just block it but also reported perpetrators to law enforcement, I'd say "bridge too far", but apparently that's not the case.)

bonespiked

It's possible that since you back up your photos to Apple's servers - and sign a EULA giving them rights over the images (in order for them to possess them) - they have a vested interest in NOT owning images depicting the horrific subject in question... That said, they could have done it on the images as they're uploaded to iCloud; maybe it's just a PR thing where they don't want their brand associated with the topic at all...

Adam Patterson

I'm playing devil's advocate here, but I thought Apple made a big deal about protecting your privacy and going to great lengths not to track or identify you.

This is from the app store: developer.apple.com/app-store/app-...

I suppose their own apps don't need to follow their rules.

Is doing this on iCloud their workaround?

leob • Edited

Fair enough - that's a choice more people should make consciously: either you pay for stuff and retain your privacy, or you accept the freebies and stop whining about it.

Cihat GΓΌndΓΌz • Edited

Isn't it pretty clear? There's a huge difference between detecting child abuse imagery and detecting whether someone is a terrorist: the false positive rate.

First, there's no way to detect whether someone is a terrorist just by analyzing imagery. If I'm a soldier or a cosplayer, I might also hold a weapon in my hand and look like a terrorist in a photo. But who in the world has a valid reason to take or keep child abuse photos on their phone? No one needs to see those photos, not even journalists who write about the subject. A false positive might only arise if someone sent such a photo to an abused child to blackmail them - but in that case I believe flagging it is even a good thing, as it can help the child, so it's not really a false positive.

Second, if they had also looked at messages being sent in order to detect whether someone is a terrorist, chances are high that journalists who write about this topic, and some minorities such as Muslims, would see a much higher false positive rate, because of the way the media connects these two topics and the way our algorithms are trained. If they did that, Apple would be helping discrimination against Muslims.

So, given that machine learning algorithms can nowadays detect such imagery with >99% accuracy, I can understand that Apple doesn't see a privacy issue here, as the usefulness for children is high and the risk of abuse is low.
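
To put rough numbers on the false-positive argument: Apple's own material claims roughly a one-in-a-trillion chance per account per year of a false flag, achieved by requiring a threshold of matches before any human review. Here's a back-of-the-envelope sketch of why thresholding works; every number below is an assumption for illustration, not Apple's.

```python
import math

photos = 2_000    # photos one account uploads per year (assumed)
p = 1e-6          # per-image false-match rate (assumed)
threshold = 30    # matches required before human review (value assumed)

# Expected false matches for one account in a year:
expected_fp = photos * p  # 0.002

# Chance an innocent account reaches the threshold by accident,
# approximated by the dominant binomial term C(photos, t) * p^t:
p_flagged = math.comb(photos, threshold) * p**threshold

print(f"expected false matches/yr: {expected_fp}")
print(f"P(account falsely flagged): {p_flagged:.2e}")  # astronomically small
```

In other words, a single match means little on its own; it's the required pile-up of independent matches that drives the account-level false-positive rate toward zero.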

By the way, the "proactive surveillance functionality" you mentioned that the FBI wanted didn't even include any kind of "turn on only if necessary" feature. It was basically a backdoor for the FBI, and the FBI would then decide whom to use it against. So if someone in the FBI didn't like you, you had no defense. The mechanism used here, though, is reviewed by many people, and there's no way to abuse it in the same way just because you don't like someone.

Jan Dvorak

My cynical take is that this is multiple things coming together. There is growing pressure for monitoring from the current US administration, and long-standing pressure from the spy agencies as well. Then there is the Chinese market. So it's probably best to develop one monitoring app/functionality into which they can load different per-country settings, satisfying the various governments and getting access and money from them. To get ahead of users discovering it, and of the negative press, they announce it publicly and wrap it in something noble that people who don't look too deeply - and the press - won't object to.

Eventually, when it's discovered that they've expanded its usage to other things, they can initially claim an error or something similar, or insist that it's just to get the "bad" people.

Arthur Grishkevich

Why do we hold so tightly to our illusory privacy? I don't agree with every single argument here, but I think Hugh Howey makes some great points: hughhowey.com/the-end-of-privacy-a...

Mainly, that we all come from societies where privacy did not exist (small villages). Privacy is a modern concept that has quickly outlived itself. Right now we really don't have much privacy anyway; there is constant oversight by governments and corporations. It is what it is. Why not admit it and at least be open about the lack of privacy?

webbureaucrat

The Apple detractor case goes something like this:

  • Before this announcement, we trusted Apple to not create this kind of software to enable authoritarians without telling us.
  • Now that Apple has told us exactly what they built and what it's being used for, we no longer trust Apple not to modify this kind of software to enable authoritarians.

Apple is closed source! They already install software on your device on a regular basis, and you don't have any transparency into what it does! This announcement doesn't suddenly enable that. If they weren't pursuing child protection, they could still pursue cooperation with authoritarianism.

Another common argument I hear is that it's unlikely to catch any pedophiles, because pedophiles will simply not use a service that complies with authorities. This is false, and it's easy to see if you've looked into the subject even a little. Facebook submits about 20 million reports to NCMEC (the clearinghouse that forwards cases to law enforcement) every year. That means that every year people are caught uploading child sexual abuse imagery to Facebook, a company that has made perfectly clear many times that it will choose its own interests over privacy every day of the week.

RangerCoder99

I'm more than happy to let Apple scan my images to get rid of child abuse!

Forest Hoffman

It sounds like Apple isn't comfortable with other parties spying on their customers. Only they get to spy on their customers.

leob

There would clearly be a difference between Google or Apple scanning stuff that's being stored in their clouds, and them installing "spyware" ON your personal device.

Michael Currin
A typo fix for the post:

```diff
-iPhone is no proactively
+iPhone is now proactively
```
Jason Glass

I'm sure Getty Images would get turned in based on a similar scan... Context is everything, in all things.

leob

I know, but some people really are stubborn ;)

Jonathan Boudreau

The "position" they've had compared to google's was nothing more than a marketing scheme. Its simple as that. If you want privacy, you will need to rely on end to end encrypted, open source platforms.

Ari Kalfus

Alex Stamos (former Facebook head of security) has a really well-articulated Twitter thread about the nuances involved here: twitter.com/SwiftOnSecurity/status...

Andrew Brown πŸ‡¨πŸ‡¦

The next 20 years are going to be really interesting.

Ali Sherief

Apple backing down on an announced new encryption system under FBI pressure should've raised warning signs.

Scott Beeker

I don't see what's so shocking about this.