The news that Apple will scan users' iPhones for child pornography has caused a great deal of consternation among privacy advocates. On the surface it seems they have a point. Closer inspection, however, reveals something else.
Apple's terms and conditions, Section 5B, state clearly that, “You agree that you will NOT use the Service to … upload, download, post, email, transmit, store or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy, hateful, racially or ethnically offensive, or otherwise objectionable.”
Let’s just point out that child pornography is illegal and violates at least seven of those conditions. Let’s also point out that no one at Apple is actually reviewing your content. There are almost a billion iCloud users, and Apple simply doesn’t have the personnel to spend that time scanning. Instead, Apple uses automated matching against a database of known child sexual abuse material (CSAM) maintained by the National Center for Missing and Exploited Children (NCMEC). This is material that is already out on the internet and shared among pedophiles.
One more point: Apple is not scanning your phone, despite what pundits and general news outlets are saying. Apple is scanning photos that pedophiles upload to their iCloud photo library and comparing them to the database. If the number of matches exceeds Apple’s unstated threshold, a human will review the photos and then report them to NCMEC. Privacy advocates think this will open the door to abuses. The Electronic Frontier Foundation believes governments will pressure Apple to scan user devices for outlawed content, like political speech.
It is not unusual
Governments have long known that Apple is capable of monitoring content. They already pressure the company to share that information, to their constant disappointment. There is no reason to believe Apple will change that recalcitrance anytime soon. Legally, Apple may be required to do exactly what it is doing.
Most developed countries, and every state in the US, require mandatory reporting by teachers and medical staff. They also encourage citizens to report potential child abuse. Apple’s decision (like those of Google and Facebook, which remove such content when found) follows from that responsibility. Moreover, the California Consumer Privacy Act (CCPA) and the General Data Protection Regulation (GDPR) allow fines against anyone who distributes pornography without the permission of the subjects. Rather than usurping privacy rights, Apple is following the spirit, if not the exact letter, of existing laws.
What a concept. A technology company is enforcing the terms agreed between the company and its users, even if only a subset of those conditions.
This is not only a reasonable decision for Apple legally, it also makes good business sense. A boycott of Apple products and services by people opposed to child pornography would be far more damaging financially to the company than a boycott by child pornography advocates. There is, however, a workaround, and here is where we get to the point of the headline.
Protecting your pornography from Apple
If seeing children abused, or photographing children being abused, is important to you, don’t store or share it through a private cloud platform. Keep it on your hard drive or on a flash drive. Send it in the mail to your customers and friends (which is also illegal, but the Post Office isn’t looking for it). That way, when the authorities do find out what you are doing, they won’t have to wait for a technology company to rat you out, you sick pile of fetid garbage.