Apple is set to launch its nudity-detection feature internationally for child safety. The feature, known as ‘Communication Safety in Messages,’ is designed to warn minors when they receive or send photos containing nudity in Apple’s Messages app. First introduced in the United States in December last year, it will soon be available on devices in the United Kingdom, Canada, Australia, and New Zealand, where it will scan photos sent to and from children.
Last August, Apple proposed a set of tools for detecting child sexual abuse material (CSAM), but the rollout was delayed following widespread criticism. In addition to alerts for explicit images on a child’s phone, Apple planned to scan users’ iCloud Photos for known CSAM and alert law enforcement if such material was found. While Apple said the safeguards would protect users’ privacy, questions were raised about how they might be used to establish a backdoor for mass content tracking and monitoring.
Since then, Apple has made several changes to its image-scanning tool. The initial announcement said that parents of users under 13 would be notified automatically if explicit photos were found, but this is no longer part of the update. The Communication Safety feature is also turned off by default, and parents must opt in to activate it.
Instead, if it detects nudity, Communication Safety blurs the image. Children are shown a warning and directed to resources provided by child-safety organizations. Most importantly, the feature works in both directions: if nudity is detected in a photo a child is about to send, similar safeguards kick in. The child is discouraged from sending the photo and given the option of contacting a trusted adult.
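Apple has not published the implementation inside Messages, but the behavior described above maps onto a simple client-side pattern: classify on the device, blur, warn, and offer a way out. The sketch below is illustrative only; imageContainsNudity is a hypothetical placeholder for an on-device classifier, and the view is an assumption, not Apple’s code.

```swift
import SwiftUI
import UIKit

// Hypothetical placeholder for an on-device nudity classifier; Apple has not
// published the model Messages uses.
func imageContainsNudity(_ image: UIImage) -> Bool {
    // A real implementation would run an on-device ML model here.
    return false
}

// Minimal sketch of the described flow: blur a flagged photo, warn the child,
// and offer a way to reach a trusted adult.
struct IncomingPhotoView: View {
    let photo: UIImage
    @State private var revealed = false

    var body: some View {
        let flagged = imageContainsNudity(photo)

        VStack(spacing: 12) {
            Image(uiImage: photo)
                .resizable()
                .scaledToFit()
                .blur(radius: flagged && !revealed ? 30 : 0)

            if flagged && !revealed {
                Text("This photo may contain nudity. You don't have to view it.")
                    .font(.callout)
                Button("View anyway") { revealed = true }
                Button("Message a trusted adult") {
                    // Hand off to a guardian contact or child-safety resources.
                }
            }
        }
        .padding()
    }
}
```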
Apple has gone to considerable lengths to preserve privacy: all image analysis is done on the device, and Apple learns nothing about the results. Siri, Spotlight, and Safari Search are also getting CSAM-related interventions. If a user searches for terms related to child exploitation, these features will step in.
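Apple has not documented the on-device pipeline Messages uses. For third-party apps, the closest public analogue is the SensitiveContentAnalysis framework Apple later shipped in iOS 17; the sketch below uses it purely to illustrate on-device checking, and assumes the app holds the sensitive-content-analysis entitlement and that the user has enabled the feature.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch of an on-device sensitivity check using the public
// SensitiveContentAnalysis framework (iOS 17+). This is an analogue for
// illustration, not the private pipeline Messages uses; it assumes the
// com.apple.developer.sensitivecontentanalysis.client entitlement.
func imageIsSensitive(at url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The policy stays .disabled unless the user has turned the feature on.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Analysis runs entirely on the device; no image data leaves it.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```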
“These interventions inform users that their interest in this topic is harmful and troublesome, and they provide tools from partners to help them deal with the problem,” Apple explained. Users who ask Siri how to report CSAM content will be directed to resources on how to file a report.