Apple has quietly removed all references to CSAM (Child Sexual Abuse Material) detection from its Child Safety page, suggesting that its plan to detect child sexual abuse images on iPhones and iPads could be scrapped altogether after criticism from much of the tech world.
In August, Apple announced a series of new child-safety features for iOS, macOS, watchOS and iMessage. The most notable was the ability to scan photo libraries before they were uploaded to an iCloud backup, with the aim of detecting CSAM content.
Following this announcement, Apple faced a hurricane of criticism from a slew of organizations, security researchers, pro-privacy groups, politicians, and even some of its own employees. Apple first tried to clear up misunderstandings, publishing FAQs and additional documentation in an effort to ease concerns.
These efforts did not have the desired effect, and Apple ultimately chose to ship only the most useful and least controversial measures while postponing the CSAM detection system indefinitely. Apple said at the time that the decision to delay the feature was made based on feedback from "customers, advocacy groups, researchers and others," so as to have more time to gather suggestions and make improvements before releasing it.
That statement was posted on Apple's child safety page but has now disappeared, along with all other references to CSAM detection, suggesting that Apple may have abandoned the project entirely.
Meanwhile, with the release of iOS 15.2, Apple did ship the other features: a communication safety tool in iMessage that protects minors from receiving sexually explicit images, and expanded guidance in Siri and Search that warns users and offers additional information on how to seek or request help.