Apple takes time to think about it: Postponement for controversial iOS features

It won't happen for now: when iOS 15 is released, it will include neither the nude-image detection in iMessage nor the photo matching that was originally supposed to compare local photos against a database of digital fingerprints of known abuse images before they are uploaded to Apple's iCloud. Apple is postponing the controversial features.

Update from September 3, 2021: The critical reporting on Apple's controversial abuse-detection features is bearing fruit. In a new statement, the iPhone maker has informed the public that the features are being postponed (source: iphone-ticker.de). Specifically, it says:

"We have already announced plans for features to help protect children from criminals, who use communication tools to recruit and exploit them, and to curb the spread of child sexual abuse material. Based on feedback from customers, stakeholders, researchers and others, we decided to take more time in the coming months to gather ideas and make improvements before we release these very important child protection features. "

Original comment by Frank Ritter:

Apple is taking action against child pornography. That is a laudable goal in principle, but: the opposite of good is well-intentioned. Apple is scanning data on users' devices without any concrete suspicion, including, and above all, the devices of the vast majority who are innocent. Worse still: the systems open the door to abuse, for example by authoritarian regimes. That is fatal.

What this is about

On Thursday, Apple announced its plans to protect children in the Child Safety section of its website. Later this year, content on iPhones, iPads, Mac computers and in iCloud can accordingly be scanned in the name of child protection, its display blocked, and the authorities notified that a user possesses it. The changes initially affect users in the USA.

In a nutshell, this is what Apple announced:

iMessage: If images are sent or received on devices running iOS 15, iPadOS 15, watchOS 8 or macOS Monterey that show, for example, nudity involving minors, they are initially not displayed. The device assesses whether the content shows such material using locally executed machine-learning algorithms. In a warning message, the user can confirm that the content should be displayed or sent anyway. If the user is under 13 years of age and part of a family group, the parents are informed. The feature is opt-in, so parents have to activate it separately.

Photos / iCloud: To curb the spread of child pornography, images uploaded to iCloud are matched against a database of known child-pornographic images maintained by the US non-profit National Center for Missing and Exploited Children (NCMEC). This matching does not take place in the cloud but on the user's device, via so-called "neural hashes" that can identify matches based on characteristic image features without the images themselves being compared. According to Apple, false positives are possible but extremely rare. If a critical number of matching images in an iCloud account is exceeded and Apple determines that the suspicion is justified, the case is reviewed by Apple, the user account is blocked if necessary, and the authorities are informed. Those affected are to be given the opportunity to appeal.

Siri / Search: Users are warned when they search for child-pornographic content.
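To make the threshold-based matching of the Photos / iCloud feature a bit more tangible, here is a minimal, purely illustrative sketch in Swift. All names in it (ImageHash, knownAbuseHashes, reportingThreshold, countMatches) are invented placeholders, not Apple's actual NeuralHash or CSAM-detection API; the sketch only shows the principle: hashes are compared on the device against a list of known hashes, and only when a threshold of matches is exceeded would anything be reported.

```swift
import Foundation

// Illustrative sketch only: all names below are hypothetical placeholders,
// not Apple's real NeuralHash or CSAM-detection implementation.
typealias ImageHash = String  // stand-in for a perceptual hash value

let knownAbuseHashes: Set<ImageHash> = ["a1b2", "c3d4", "e5f6"]  // hypothetical database entries
let reportingThreshold = 2    // number of matches required before an account would be flagged

func countMatches(uploads: [ImageHash]) -> Int {
    // Each image is hashed locally; only hash matches are counted,
    // the images themselves are never compared byte for byte.
    uploads.filter { knownAbuseHashes.contains($0) }.count
}

let pendingUploads: [ImageHash] = ["zzzz", "c3d4", "1234"]
let matches = countMatches(uploads: pendingUploads)
if matches >= reportingThreshold {
    print("Threshold reached: the account would be flagged for human review")
} else {
    print("Below threshold (\(matches) match(es)): nothing is reported")
}
```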

Why the new features go too far

Of course, the creation and distribution of child pornography is abhorrent and should be prosecuted and punished. But how far should you go? In my view, checking all the images that users store on their devices and upload to the cloud goes too far, yes, even if it meant that the material would circulate a little less and some owners of such material would actually be caught as a result.

The first important reason: I want to have control over my device, especially over the data that is stored on it and that is sent or received with it. I do not want to fear that an algorithm checks the communication running through my devices to see whether the content I send is permissible, whether the pictures I have taken or shared are suspect according to some definition. I don't even want to have to think twice about sending a selfie from the nude beach, a photo of the kids splashing in the bathtub or, God forbid, a consensual photo of primary or secondary sexual characteristics to my partner. The moment that thinking sets in, self-censorship has taken hold, and you have lost sovereignty over your own technology.

It is important to understand that the matching of data uploaded to iCloud only takes place on the basis of hashes of known images, and only on the device itself. One's own pictures are only evaluated in iMessage when they are sent, and then only under certain circumstances and with manageable consequences, see above. I believe Apple that the features have been implemented with the best of intentions and a great deal of technical expertise, to be as minimally invasive as possible. But this destroys the trust that Apple has built up in recent years with its focus on privacy and data protection, both in the architecture of its own hardware and software and in its advertising. That is a problem, because the technology industry needs Apple in this role, as a counterweight to the data giants Google and Facebook.

But there is another serious reason to reject Apple's measures. Recent history has shown that the topic of child pornography, as serious and horrific as it is, is repeatedly instrumentalized to push technology or technology regulation that deprives users of freedoms and leads them into a system of surveillance, keywords: data retention, upload filters and the like. The child pornography argument has always been just the beginning of such a development, a convenient door opener in the form of a crime that everyone can agree must be fought. You and I and Aunt Lise have nothing to hide, and surely there is something dubious about anyone who objects to measures against child pornography, right?

The problem with this: Apple is creating an infrastructure that can be misused for censorship, right down to users' devices. Once the tools exist, state actors will inevitably want to expand the list of content to be combated. Next would come political extremism, then copyrighted content, and finally content that is considered disreputable in some parts of the world and an expression of individual freedom in others. Depictions of homosexuality. Pictures of the massacre on Tian'anmen Square. Memes about the leaders of authoritarian states.

The end of this and similar developments can be totalitarian surveillance of all communication, the censorship and suppression of unwelcome topics, and the social ostracism and persecution of those who want to discuss them. No, this is not a dystopia; in China it has long been reality.

Who guarantees that it stops at pictures, and that texts, videos or conversations won't one day be flagged by the phone as well? Who verifies that the hash list really only contains child pornography? Who protects us from these features being abused by totalitarian regimes? Apple alone? That is not enough.

More information

apple.com
appleprivacyletter.com
eff.org

What Apple Says

We also spoke to Apple on the subject. In a personal conversation it was made clear that the child safety features were developed together with experts, are precisely documented technically at apple.com/child-safety, and will initially only be rolled out in the USA. It is not yet clear whether something similar will happen in other countries, or in cooperation with which authorities; the legal framework must first be clarified. Apple also stresses that third parties will not be able to directly access the data that will be encrypted in iCloud in the future; here, too, the comparison takes place on the device, against a downloaded version of the NCMEC database. Regarding child protection in iMessage, Apple emphasizes that it is a purely on-device, opt-in feature that parents must explicitly activate for their family group.
