Tech News

iOS 18.2 has a child safety feature that can blur nude content and report it to Apple

October 24, 2024 · 3 Mins Read

In iOS 18.2, Apple is adding a new feature that resurrects some of the intent behind its halted CSAM scanning plans — this time, without breaking end-to-end encryption or providing government backdoors. Rolling out first in Australia, the company’s expansion of its Communication Safety feature uses on-device machine learning to detect and blur nude content, adding warnings and requiring confirmation before users can proceed. If the child is under 13, they can’t continue without entering the device’s Screen Time passcode.

If the device’s onboard machine learning detects nude content, the feature automatically blurs the photo or video, displays a warning that the content may be sensitive and offers ways to get help. The choices include leaving the conversation or group thread, blocking the person and accessing online safety resources.

The feature also displays a message that reassures the child that it’s okay not to view the content or leave the chat. There’s also an option to message a parent or guardian. If the child is 13 or older, they can still confirm they want to continue after receiving those warnings — with a repeat of the reminders that it’s okay to opt out and that further help is available. According to The Guardian, it also includes an option to report the images and videos to Apple.
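The age-gated flow described above can be summarized as a decision sketch. This is purely illustrative and not Apple's actual implementation; the function name, the placeholder passcode, and the boolean inputs are assumptions, with only the age-13 threshold and the blur/warn/confirm behavior taken from the article.

```python
# Illustrative sketch (NOT Apple's implementation) of the Communication
# Safety decision flow: detected nude content is blurred and warned
# about, under-13s need the Screen Time passcode to proceed, and 13+
# users can confirm past the warning.

SCREEN_TIME_PASSCODE = "0000"  # placeholder; the real passcode is user-set

def handle_incoming_media(is_nude: bool, child_age: int,
                          passcode_entry: str = "",
                          wants_to_view: bool = False) -> str:
    """Return the display state for one photo or video."""
    if not is_nude:
        return "shown"
    # Detected content is always blurred first, alongside options to
    # leave the chat, block the sender, or reach safety resources.
    if child_age < 13:
        # Under-13s cannot continue without the Screen Time passcode.
        if wants_to_view and passcode_entry == SCREEN_TIME_PASSCODE:
            return "unblurred"
        return "blurred"
    # 13 and older may confirm after the (repeated) warnings.
    return "unblurred" if wants_to_view else "blurred"
```

The key design point from the article is that the check runs entirely on-device; nothing is classified server-side, which is how the feature avoids weakening end-to-end encryption.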

(Image credit: Apple)

The feature analyzes photos and videos on iPhone and iPad in Messages, AirDrop, Contact Posters (in the Phone or Contacts app) and FaceTime video messages. In addition, it will scan “some third-party apps” if the child selects a photo or video to share with them.

The supported apps vary slightly on other devices. On Mac, it scans messages and some third-party apps if users choose content to share through them. On the Apple Watch, it covers Messages, Contact Posters and FaceTime video messages. Finally, on Vision Pro, the feature scans Messages, AirDrop and some third-party apps (under the same conditions mentioned above).

The feature requires iOS 18, iPadOS 18, macOS Sequoia or visionOS 2.

The Guardian reports that Apple plans to expand the feature globally after the Australian trial. The company likely chose the land Down Under for a specific reason: the country is set to roll out new regulations requiring Big Tech to police child abuse and terror content. As part of the new rules, Australia added a clause mandating compliance only "where technically feasible," dropping any requirement to break end-to-end encryption and compromise security. Companies will need to comply by the end of the year.

User privacy and security were at the heart of the controversy over Apple’s infamous attempt to police CSAM. In 2021, it prepared to adopt a system that would scan for images of online sexual abuse, which would then be sent to human reviewers. (It came as something of a shock after Apple’s history of standing up to the FBI over its attempts to unlock an iPhone belonging to a terrorist.) Privacy and security experts argued that the feature would open a backdoor for authoritarian regimes to spy on their citizens in situations without any exploitative material. The following year, Apple abandoned the feature, leading (indirectly) to the more balanced child-safety feature announced today.

Once it rolls out globally, you can activate the feature under Settings > Screen Time > Communication Safety and toggle it on. Communication Safety has been enabled by default since iOS 17.

