Confusion at Apple: Is the child abuse scanner not coming after all?
With iOS 15 and iPadOS 15, Apple had planned to introduce an automatic scanning process that analyzes stored images for possible depictions of child abuse. Apple has now removed all references to this from its website. But Apple still believes in the project.
Apple removed notes on photo scanning on iOS
With the announcement that it wanted to automatically compare photos against an abuse database on iOS 15 and iPadOS 15, the company drew a lot of criticism. Apple's original plan was for stored images to be automatically examined for possible depictions of child abuse. If a certain threshold was exceeded, a manual review by a person and a report to the authorities would follow.
Apple's plan was also criticized because the scan was to take place directly on customers' iPhones and iPads. According to the company, only photos stored in iCloud would be checked. Privacy advocates, civil rights activists and users subsequently attacked Apple over the plan.
As it now turns out, Apple removed all references to the controversial scan from its own website a few days ago. Automatic photo comparison is no longer mentioned in the notes on "Communication safety in Messages".
Photo scan: Apple announces another postponement
With iOS 15.2, Apple introduced an iMessage nude filter in the USA, but decided against taking any further measures in this area. At the time of the launch, the company said it wanted to wait before rolling out further child protection features.
Apple has now made a similar statement about the automatic scanning process. Spokesman Shane Bauer states that nothing has changed in Apple's position. The company therefore still plans to introduce the feature, albeit later than originally intended (source: The Verge).