
Apple To Launch Nudity Detection Feature Internationally For Child Safety

By Joshua Egeonu · 3 Mins Read

Apple is set to launch its nudity detection feature internationally for child safety. The feature, known as ‘Communication Safety in Messages,’ warns minors when they receive or send photos containing nudity in Apple’s Messages app. First introduced in the United States in December last year, the feature will soon be available on devices in the United Kingdom, Canada, Australia, and New Zealand, where it will scan photos sent to and from children.

Last August, Apple proposed a set of tools for detecting child sexual abuse material (CSAM), but the rollout was delayed following widespread criticism. In addition to alerts for explicit images on a child’s phone, Apple planned to block searches related to CSAM and to alert law enforcement if CSAM was found in a user’s iCloud Photos. While Apple said the safeguards would protect users’ privacy, critics questioned whether they could become a backdoor for mass content tracking and surveillance.

Must Read: Apple Purchases AI Music, a Startup That Uses Artificial Intelligence to Create Music.

Since then, Apple has made several changes to its image-scanning tools. The initial announcement said that if explicit photos were detected, parents of users under 13 would be notified automatically; that automatic notification is no longer part of the feature. Communication Safety is also turned off by default, and parents must enable it.


Instead, if it detects nudity, Communication Safety blurs the image in the conversation. The child is shown a warning and directed to resources from child safety organizations. Crucially, the feature works in both directions: if nudity is detected in a photo a child is about to send, safeguards are triggered, the child is discouraged from sending it, and they are offered the option to contact a trusted adult.

Must Read: Tech Giant Apple Hits $3 Trillion in Market Value

Apple says it has gone to great lengths to preserve privacy: all image processing happens on the device, and Apple has no visibility into the analysis. Apple apps such as Siri, Spotlight, and Safari Search are also gaining related protections; if a user searches for terms related to child exploitation, these apps will intervene.

“These interventions inform users that their interest in this topic is harmful and troublesome, and they provide tools from partners to help them deal with the problem,” Apple explained. Users who ask Siri how to report CSAM content will be directed to resources on how to file a report.
