On one hand, yes, I totally agree with Alex that "The scale of abuse that happens to kids online and the impact on those families is unfathomable." Furthermore, sex trafficking is another horrible, enormous, and related problem that seemingly gets little attention.
Per Alex's tweet, I am not "verbally rolling my eyes at the invocation of child safety as a reason for these changes." Instead, I simply do not believe that this is the reason Apple is introducing these capabilities. There is something to be said about the reputation of a business: it's hard to gain and easy to lose. Large, near-monopoly platforms such as Apple have lost their reputation in my eyes, and now I always question their motives.
I simply do not believe that Apple, when under serious antitrust pressure, will refuse to extend these capabilities to other areas, political views being one example.
Paraphrasing Benjamin Franklin: "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."
> Instead, I simply do not believe that this is the reason Apple is introducing these capabilities.
Okay, what do you believe the reason is, then? I can understand being skeptical of the claim that it's for child abuse and safety, but if you don't believe that, then you must have something that you believe is the more likely reason. I also want to point out that, yes, the system could be abused in a multitude of ways, but that doesn't mean the actual reason for building the system is different.
Not for "security" but rather for surveillance. The problem is the phones are too secure for governments to get the surveillance they want out of them.
There's a great live conversation between Alex, Riana Pfefferkorn, and Matthew Green here: https://www.youtube.com/watch?v=dbYZVNSOVy4. Has more details on the reasoning and a little debate on the issues from some really good people.
Even pretty non-hacker people intuit quickly that it would be weird (and maybe dystopian) for your laptop to constantly scan all the files on your computer against a government database. Why this should be different for phones seems unclear to me, and none of the pro-scanning takes I've seen have made convincing arguments about it.
Alex and folks in the broader abuse-prevention community don’t really make a principled argument about why phones should try to do this, but not laptops. I think it sounds a bit more obviously like something an authoritarian state would do, so the laptop analogy gets avoided. This isn’t really a theoretical concern; a natural next step for Apple would be to do this on the Mac too.
Why is this a nuanced conversation with no easy answers?
There is literally no debate. Authoritarian elements want to look through your stuff because some people do bad things. I'm at a loss; to me, nobody should even have to justify why the answer is no. It's a principle of natural justice.
It's pretty screwed up when the onus is on people to explain why this is a bad thing. The correct action is really just to walk away.
Yes, you should explain why something is bad, or else we are just doing things based on how we feel… does that sound better to you?
The correct answer should be that the most well-reasoned argument gets supported. And we should not expect anyone to intuit the best answer; repeat it as much as needed.
I find it disturbing that they are seemingly allowed to do things on the devices of their US customers that in most countries are reserved exclusively to law enforcement agencies.
Also, I think they should face serious repercussions in countries with strong data protection laws, should they decide to unilaterally change the terms of service for their users in those jurisdictions to reserve the right to search their devices for illegal images.
For many people heavily invested in their ecosystem, it’s also not that easy to just walk away. For other users, the alternative (Google) is not necessarily more trustworthy.
From what I've read, the scanning only takes place on iCloud images. You can choose not to back up to iCloud, and the images won't be scanned. Other cloud storage providers have done something similar in the past: if the hash of the content matched copyrighted movies or music, the content was deleted.
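For what it's worth, here's a minimal sketch of what that kind of server-side hash matching looks like in its simplest form. The hash list and function name are hypothetical, and real systems (PhotoDNA and the like) use perceptual rather than exact cryptographic hashes, so this only captures the byte-identical case:

```python
# Toy server-side hash check (hypothetical hash list, illustrative only).
# A cryptographic digest of each upload is compared against known-bad
# digests; only byte-identical files match.
import hashlib

# Hypothetical database of known-bad SHA-256 digests. The entry below is
# just the digest of the empty file, included so the demo can run.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def flag_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad digest."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

print(flag_upload(b""))     # True: the empty-file digest is in the toy list
print(flag_upload(b"cat"))  # False: any changed byte changes the digest
```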
I think the most offensive element of Apple's approach is that your device, which you paid for and own [1], is acting in a way that can only be against your own interest.
There is really no good argument to allow that. Most of the other user-hostile features are at least somewhat explained away as also benefiting the user (e.g. claims that targeted ads are preferable to irrelevant ones, telemetry helps cover the users' use cases, buying the subscription service is actually a win-win, etc.) Nobody would choose a feature like this voluntarily. This is simply preinstalled malware.
I really wonder whether Apple realized how badly they just burned down their entire privacy-centric advertising campaign, where the core selling point is that unlike others, they act in your interest.
[1] Spare me the Stallmanism, I'm talking about the device itself.
One theory put out by John Gruber was that the scanning was being done on device so that they could E2E encrypt iCloud in the future. So they satisfy the "think of the children" people while giving you strong encryption elsewhere.
> Because it's a pretense for further surveillance.
Alternatively, it could be a way of deflating future demands for increased surveillance. CSAM is one of the few emotional strings Governments have to insist upon surveillance. If we can mark CSAM as "solved" using basic fingerprinting then it's harder to push for real surveillance.
We're currently seeing the widespread adoption of end-to-end encryption. If we don't have an answer to CSAM, then it will be used as a bludgeon to demand real back doors which can be exploited without manufacturer knowledge.
Appeasing the authoritarians never works. On the contrary, it sets a precedent that is used to make more demands.
The ability to scan for transformation-resistant image fingerprints will be used to identify whistleblowers who disclosed documents that embarrass the powers that be (think Chelsea Manning, Reality Winner, or the Panama Papers). It's a matter of when, not if.
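To make "transformation-resistant" concrete: these fingerprints are perceptual hashes, not cryptographic ones. A toy difference hash (dHash), a classic perceptual hash and emphatically not Apple's actual NeuralHash, shows why a resized or re-encoded copy of a document scan can still match; the hash encodes coarse brightness gradients rather than exact bytes:

```python
# Toy perceptual "difference hash" (dHash), illustrative only.
# Downscale to a tiny grayscale grid and record whether each pixel is
# brighter than its right-hand neighbor; resizing or recompression
# rarely flips many of these coarse comparisons.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())  # row-major, (size + 1) pixels per row
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Usage (hypothetical files): near-duplicates typically land within a few
# bits of each other; unrelated images land near half the bits (~32 of 64).
# print(hamming(dhash("original.png"), dhash("resized.jpg")))
```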
This isn't to appease the authoritarians, it's to diminish the argument which elected authoritarians use to get their electorate to manufacture consent for their authoritarian ideas.
What Apple has done isn't a slip on the slippery slope. There's only one greased up slide in the playground and that's the presence of closed source software which auto-updates. As long as you are willing to accept closed source software which can update itself, the slope is slippery. This very public announcement by Apple doesn't change the angle of the slide, because the risk of someone sneaking in "bad" fingerprinting wasn't any less last year.
Pretty thoughtful take IMO. It is complicated and there isn’t a clear answer.
I liked Alex's suggestion though: when it's a highly complex issue, the best solutions happen when you bring the right experts in their fields together and collaborate, and I think it's a very fair critique of Apple. It's hard for me to imagine a situation where showing up at the conference, sharing a preview of what they were thinking, and then hearing the feedback would not have been beneficial.
(Speaking as a reasonably happy Apple user, I wish openness on some of these things were a little bit more a part of their DNA. When you're as big as Apple, people need to just recognize that the standards are different. It may not be fair. But it's just the way the world works.)
I personally think there are pretty clear answers in terms of the technology. Apple is violating privacy. To what degree is debatable, but I generally disagree with Alex here. He's not making the stronger argument compared to privacy advocates. Yes, there is nuance. Yes, we must operate with a high degree of empathy for victims of abuse. No one doubts or downplays that, IMO. But comments right here in this thread hit the core of it. Would something like this ever be acceptable on a laptop? I get a choice because it's my computer. I don't want this. It's violating my privacy, and Apple has eliminated my ability to decide. They basically positioned this as the obvious and safe choice, and anyone against them as a "screeching minority", even though they knew this would be controversial. Alex is operating in good faith, but I think he's in too deep on trying to bring nuance to an issue where his argument is simply the less principled one. It's becoming a situation where if you care about privacy you simply can't use a smartphone (I know there are options, but none of them are close to an iPhone).
Without addressing any other points you’re making, NCMEC wrote the email to Apple that contained the “screeching minority” phrase. It was not something Apple wrote.
Yes, but Apple coordinated with them and more or less endorsed it, right? I know it's not Apple's position, but they are de facto embracing it, in my opinion. They have structured this to crush dissent through layers of technical speak and policy speak. It is an important distinction, but the whole package and celebration of this system smells bad, and Apple is at the heart of it.
I'm fully aware of how the law technically works. I'm also aware that, generally, the police more often than not have easy access to this stuff, so the result is the same.
The principle/spirit should apply everywhere, especially when it is meaningfully difficult to opt out. Since there is literally only one competitor in this space, this should generally apply.
I do. And I have taken pains to generally keep their pictures in my control, and out of databases that I have no access to, and this measure would make this more difficult.
I acknowledge that the horrible circumstances used for the justification of this bad policy exist, and sometimes are marginally helpful, but the balancing test for me isn't particularly close.
Especially knowing that, at present, facial recognition technology is not good at recognizing people who look like me and my kids, and presently harms us perhaps even more than it helps (yeah, we're black).
I worry less about my kids (because I take precautions, similar to you), but I worry about kids whose parents are not as sophisticated.
And this is why I don't think it's as black and white. An absolutist, deontological view of the right to privacy is something I absolutely empathize with (although the Fourth Amendment does not enshrine this anywhere for companies; the Bill of Rights applies to government action). I'm just saying there is a utilitarian argument as well for doing something like this.
And the dialogue between technologists who understand what technology can and can't do, and those who advocate for exploited children, is what is needed.
Odd, and perhaps telling, that you went abstract and utilitarian, as if that wasn't what I was doing.
Perhaps I should spell it out more clearly. My black child's face, in a pool photo, matches some unfortunate other child's, and now I'm being investigated. It should be more obvious why this concerns me as much as, if not more than, the other thing.
Links to https://storage.courtlistener.com/recap/gov.uscourts.insd.77... (90 pages of harrowing public-domain court documents obtained through RECAP, the system Aaron Swartz first got investigated by the FBI for setting up; they document the misdeeds of a guy named Buster Hernandez who serially raped children over Facebook)
The thread has a point, and there's definitely nuance here, but I think people throughout history have litigated this many times over, and we came to the same conclusion over and over and over again. To argue the contrary is revisionist or simply ignorant. This conclusion is one that has pushed our species to be more charitable, more law-abiding, more free, less tyrannical, more fair, etc. Not that we've reached some kind of utopia, but let's face it: the world is much better off than it was 100, 200, or 1000 years ago.
The conclusion is that freedom (that is: negative, not positive rights) is the main force of progress (societal, technological, scientific). Of course, we have law enforcement, and judges, and lawyers, and an entire legal machinery: but it functions within well-defined boundaries. In the US, we have warrants. We have habeas corpus. We have chain of custody, and so on.
Apple is fundamentally circumventing these norms. And this is wrong, no matter what is on the other side of the equation. I'm not sure what Stamos' counter-argument is here. I don't always agree with the EFF, but I think in this case, they're right on the money.
I agree that this certainly feels like it's a step back in terms of freedom for all users. Introducing these kind of measures and expecting them to be only used to punish morally wrong behavior is a form of exceptionalism. I don't think any institution can, in the long term, consistently have such moral insight and correctness.
Historically, I think those measures almost always ended up being abused at some point. Measuring whether they've been used to do more harm than good is where the question gets a lot more complicated.
However, I'd like to better understand why you believe that freedom is the main force of progress. Although I do feel like our society generally progresses towards more freedom, it isn't always the same kind of freedom that was present in the past.
For example, freedom surrounding gun rights, freedom from taxation, or freedom from government intervention in the economy (or what is perceived as such) have all, arguably, caused a lot of harm in the US. Do you feel these are all justified simply because they are "negative rights"?
By reporting potential criminals to the authorities after scanning what is essentially private data (images, messages), I'd argue they're violating protections against unlawful search and seizure, and I doubt a conviction would stick[1]. It's not like my landlord can barge into my apartment and look under the couch for illegal drugs because he heard a rumor I bought some.
I mean, probable cause is already sketchy enough; what Apple is doing is beyond the pale.
Right; I think it's important to recognize that, obviously, Apple's products are not subject to rights against unlawful search and seizure. But there is absolutely a reason why that is a right the government and law enforcement are forced to uphold.
In other words, especially in this age of trillion-dollar megacorps, when private companies enact policies and procedures which skirt the constitutional rights US citizens have, it's tantamount to these private companies looking at centuries of judicial, legislative, and even human-rights research and development and saying "we know better". Apple is an angry toddler, not a responsible adult.
Apple does not, generally, back down on changes like this. But the chorus against them on this issue is seriously big (I guess having Kelso from That '70s Show on your side helps, though). They need to step back and engage with the broader legal and ethical community. They are absolutely in the wrong here; that doesn't mean the right course of action for the iPhone is to do nothing and E2EE everything. It just means that these actions are not the right ones. They locked themselves in a room, wrote up a slick algorithm, then said "sweet, this is a great solution which balances protecting our children from predators with the human right to privacy"; it's hubris. It's not that easy, and they can't do it alone.
You can argue it's unethical but it's not illegal. The police can often obtain a warrant with just a private tip from an anonymous line. I'm sure a neural network match with access logs proving image upload AND a human reviewer confirming it matches CSAM is more than enough for a search & seizure.
Given that enforcement is only possible on photos uploaded to iCloud, the fact that classification happens on device seems like a distinction without a difference.
How does it shift the Overton window if, for >99% of people, it is an entirely opaque distinction? For most people, the features provided by cloud accounts are seen as features of their physical device. Whether private data is interrogated locally or remotely is, for them, mere technical minutiae.
> By reporting potential criminals to the authorities after scanning what is essentially private data (images, messages), I'd argue they're violating protections against unlawful search and seizure, and I doubt a conviction would stick[1].
Why? Police use informants to obtain probable cause regularly.
You are handing your image off to Apple’s servers, and they are accountable for child porn on those servers, by law.
Keep your photo on your device and the software doesn’t scan it.
So yes, there’s nuance here. It’s very unfortunate that Apple didn’t engage with the community first, but that seems to be an institutional blind spot.
Yes, you own the device, but it comes with Apple's software and services, which you don't own. As long as you are updating the software on the device, you will have to live with how it interacts with Apple's online services.
You can install your own software on Apple's hardware if that is possible and then use the hardware as you feel fit. Or you can take Apple to court and get a judgement to stop them from scanning your data.
You and I both know how incredibly difficult that is. And that it simply doesn’t have to be this way, but Apple wants the control and they are going to fight to keep it. The phone duopoly you must exist within if you want a modern device is untenable for society in my opinion. The few of us, as technocrats, are the ones who have to speak up and advocate for those who can’t go as deep on the technical issues as we can. It’s all spiraling to a user hostile, corporate controlled world.
I hope this is just trolling, but in case it's not, just ask yourself why your landlord or your HOA doesn't have the "freedom" to search your flat/house for that kind of stuff.
It's a nuanced conversation, but there are two fundamental points that Apple totally missed:
1) Don't build a system for reporting what images people make, store, or send privately. That is just begging for someone to force you to abuse it for things that aren't CSAM. You shouldn't expect Apple to be able to resist pressure to abuse such a system: (a) they already caved to China by basically handing iCloud + iMessage key infrastructure to a local company, and (b) the entire reason you have encrypted systems (and make no mistake, this entire scheme only makes sense for scanning encrypted content) is precisely for the day when the social and reputational guarantees of "trust us" fail.
2) We need alternative ways to stop grooming and to effectively get the major players who produce CSAM. And that's not going to come from scanning for CSAM NCMEC already knows about. Yes, resharing images harms the victim, but if you have a finite amount of resources, you go after the original image production and the people literally raping kids. And indeed, that's apparently what the FBI does, and why they mostly ignore reports flagged by these systems.
I spent the weekend (well, Saturday) trying to get rid of an ant problem in the kitchen. They were coming up to the work surfaces through cracks in the grouting.
So I re-grouted the backsplash, the joins in the work surface, etc.
Of course, they started coming up through gaps in between the floor tiles, so I patched those. Then through the window sills. And I noticed they pushed out some of the new grout to make a couple of new runs.
You will never completely stop this problem.
And monitoring every. single. thing. is not the way to tackle it.
Professionals need to target the source of the problem. Who is being abused? Why? Who are the abusers? Why? How and where are the two meeting?
This is work for traditional policing, improved social services, and society in general, not mass surveillance.
I don't see how there's any middle ground here. Anything Apple does to try to prevent sexual abuse will by definition infringe on the privacy of users. And I don't think there's anything Apple can do to prevent CP that doesn't create a slippery slope to preventing other things that governments (potentially authoritarian) deem unsavory.
Frankly, this entire thread seems like a bad attempt to take a centrist position without providing any real justification for the position to have value.
They lost my trust years ago when their software detected “me” in a photo. Why is their software analyzing and detecting things like that? It’s ludicrous.
> They lost my trust years ago when their software detected “me” in a photo.
I don't understand. Their software, running on your phone, without interacting with any servers outside your phone, identified you for your convenience.
Yeah, I'm confused about that statement too. From that point of view, a cron job running daily on a Linux system, updating the "locate" db, is also a breach of trust and privacy...
But what I was getting at is that I’m seeing an analog with this latest scan, in that the scanning software is now on the server where you are electing to store your photos. So, hard drive scan on your local is fine, yet hard drive scan on your remote is outrageous?
This is the case where the scanning is being moved from iCloud to local devices -- still only for photos that are set to be uploaded to iCloud of course.. for now.
I think overall what bothers people is more the anxiety about what comes NEXT. Sure, it's innocuous enough that photos will be scanned locally instead of remotely; in either case they are going to be shared to iCloud and therefore scanned.
What about once they get pressure to expand this to all photos, uploaded to iCloud or not? Other kinds of materials, not just CSAM? Add other capabilities to scan other content, not just photos? etc etc etc...
Even as a long-time fan of, and defender of, Apple, that's a very thorny concern.
In the U.S., we can at least apply pressure to the politicians writing bad laws. In China, can Apple really say no to the government when they want to scan all iPhones for subversive documents?
If Apple built houses, it would secretly install CCTV cameras while the owners were away. There would be a ToS that allows these "home updates". If those cameras spotted furniture of an unknown brand, Apple would send a crew to remove it. If the cameras noticed a visitor who matches a database of untrusted persons, Apple would report the owners to the police. Edit: the houses would have really nice (and patented) rounded corners.
I have some genuine questions regarding Apple's new CSAM detection.
From what I have read:
1) Most cloud providers, including Apple, already scan uploaded photos for child abuse. Photos are scanned server-side.
2) Apple wants to do this on-device instead of server-side so that:
1) photos are not looked up and scanned on the server.
2) the device can generate a cryptographic safety voucher to prevent leaking information about images that do not match the CSAM database.
3) threshold secret sharing can be used to prevent match results and encrypted data about the images from being accessed unless the account exceeds a threshold of matches (see the sketch below).
3) Photos that are not uploaded to iCloud Photos are not affected.
4) Matching is done against an *already known* database of CSAM images.
5) Data is manually reviewed before reporting to NCMEC.
Isn't this better than what Apple and other companies (Google, Microsoft, Facebook, etc.) have been doing server-side? Server-side and on-device CSAM detection both perform the same scan > match > manual review > report process. As far as I can see, the difference is that on-device detection adds more restrictions on Apple and third parties.
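On point 2.3, the threshold property is easier to see with a concrete sketch. This is plain Shamir secret sharing, not Apple's actual safety-voucher construction (which layers blinded hash lookups on top), and all parameters here are illustrative; but it shows the core idea that any k shares reconstruct a secret while fewer than k reveal nothing:

```python
# Minimal Shamir threshold sketch (illustrative parameters, not Apple's
# actual safety-voucher scheme). A secret is split so that any `threshold`
# shares reconstruct it, while fewer reveal nothing.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is mod PRIME

def split(secret: int, threshold: int, num_shares: int):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split(secret=42, threshold=3, num_shares=5)
print(reconstruct(shares[:3]))  # 42: any 3 of 5 shares suffice
print(reconstruct(shares[:2]))  # junk: 2 shares say nothing about the secret
```

If the public description is accurate, each on-device match contributes roughly one such share, so nothing about the matches is decryptable by Apple until an account crosses the threshold.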
There are many claims focused on Apple's decision to move away from server-side to on-device CSAM detection:
1) it is worse for privacy. (How is it worse than server-side detection? It is worse than no detection, but then there is no point in focusing on Apple's decision to move away from server-side detection.)
2) it is bad that other companies will follow this path. (Why is it bad if other companies replace server-side detection with on-device detection?)
3) it will let false positives be triggered. (If so, were false positives previously, server-side, impossible to trigger?)
I'm all for ceasing personal data scanning, both server-side and on-device, altogether. But based on what I have read, I think on-device detection is at least better than server-side detection. On the other hand, the amount of backlash laser-focused on Apple's decision to move away from server-side to on-device CSAM detection has led me to believe that I might be missing something. Am I?
How long until this surveillance attitude leads them to add permanent monitoring into all Apple products, like Notes, iMovie, Messages, etc? That’s one thing this move points to.
One thing I don't understand, which Alex also points out in this thread, is why Apple does not build a tool in iMessage to allow users to report/flag such inappropriate and abusive comms to Apple, which Apple can then report to the authorities. Why does Apple go the CSAM scanning route instead?
> For the last couple of years, our team at @stanfordio has been hosting a series of conferences on how to balance the safety and privacy aspects of E2EE products.
I am not interested in “balancing” anything. This is not a negotiation and it is not open to compromise. The opposite of privacy is not “safety”; it is surveillance. I refuse to use products that actively surveil me behind my back. Consumers deserve products that work for consumer interests, not products that ultimately serve someone else.
Indeed. However, there is no point getting mad at Alex.
He used to be a real hacker, until he sold his soul for mammon at Facebook. Once someone like this makes the transition from hacker to wiping one's ass with the hacker manifesto, there's no turning back, and you just need to accept that their words are hollow.
Nothing against Alex. He's brilliant, and you never know: offered $XMM to compromise your principles, you might well make the same decision; you can't know until you're actually in that position.
I'm not smart enough to ever be in his position, and I'd like to believe that if I were, I would hold true to principle, but who knows? We're all flawed.
Hmmm. Would that be the same Alex Stamos who was Chief Information Security Officer at Yahoo when they had one of the worst hacks ever, then at Facebook when they were used to subvert American democracy? He’s good at failing upwards, I’ll give you that. I don’t know why I waited so long to officially flip the bozo bit on him by adding him to my feed reader’s filter list.
I realize I’m godwin-ing myself with this, but I’m always wondering how these fanatical people view the discrepancy in holocaust victims from countries with active registration of people’s religion, versus countries without religious registration.
I'll be downvoted to hell for this, but I'm gonna put this out here.
I'm a little concerned by the lack of attention to the immense benefit this can have for the terrifying problem of child pornography online.
If Apple wants to use this for the CCP, they will; and there’s already so many companies who are bowing to that regime.
False positives will be pretty obvious in court. I... I get the privacy concerns, but honestly, I'm concerned by this getting an almost universally negative reaction.
There’s nothing you can really say about privacy here I haven’t already read - I’m not interested in that. I know I’m going to get shit on by 99% of this community for this - but someone has to offer the opposing opinion; and I guess I’m that bitch atm.
I'm here to offer an opinion I've surprisingly never read in the maybe dozen comment threads about this I've seen on HN in the past few days.
As a victim of child sexual assault, I highly encourage anything that prevents that from getting worse.
I'm alright with this. I understand the community's concerns, and tbh I don't honestly think I need to read the (what I'm aware will be scathing) responses to this comment. Happy to just see me at -50 karma tomorrow, heh.
Yes, let’s help stop child rape and pedophiles. Yes; sometimes to be serious about that we will have to take some extreme measures.
Thank God those extreme measures exist. May a hell of a lot more pedophiles go to jail from this. Looking hella forward to hearing how many arrests this was responsible for in a year’s time. <3
> I highly encourage anything that prevents that from getting worse.
This is exactly the problem. There is no end to the effective new measures we can introduce. Until everyone is living in a prison cell, there will always be more we can do, that would be demonstrably effective at reducing child abuse. There has to be a line where we say the ends don't justify the means, and for a lot of people, this exceeds that line.
Pedophiles will just simply move to another platform now that this is public knowledge. Why knowingly expose themselves to the risk of using Apple devices when Windows, Android, Linux, etc. exist?
Unfortunately, I doubt the (obviously) good parts of this will really have much impact. But the rest of us will be stuck with the inevitable false positives, increased security attack surface, and potential for govt abuse.
Of course. This is exactly why companies like Google, Microsoft and Apple publicly announce these systems. They know it's impossible to stop people who have sufficient technical knowledge. Given that it can't be stopped, their choice is to have nothing to do with it.
I don’t believe they’ve effectively expanded the CSAM portion at all. Previously they were scanning photos after being shipped to iCloud; now they’re scanning them on the device as they’re prepared to be sent.
So your wish that more criminals would be caught this way is unlikely to be fulfilled.