Apple unveiled plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.
The tool, called “neuralMatch,” is designed to detect known images of child sexual abuse by scanning photos on the device before they are uploaded to iCloud.
If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
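The flow described above — match an image against a database of known material before upload, then route any hit to human review — can be sketched roughly as follows. This is an illustrative simplification, not Apple's implementation: the real system reportedly uses a perceptual hash (NeuralHash) supplied with hashes from NCMEC, whereas this sketch substitutes a cryptographic SHA-256 digest, and the function names and return values are invented for illustration.

```python
import hashlib


def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest for the image.

    Stand-in only: a real system would use a perceptual hash that
    tolerates resizing/re-encoding, not an exact cryptographic hash.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def check_before_upload(image_bytes: bytes, known_hashes: set) -> str:
    """Mimic the described flow: check for a match before upload.

    A match is flagged for human review rather than acted on
    automatically; only confirmed material leads to account action
    and a report to NCMEC.
    """
    if hash_image(image_bytes) in known_hashes:
        return "flag_for_human_review"
    return "upload"
```

For example, an image whose hash appears in the database would be flagged, while any other image would be uploaded normally.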
Online predators seeking to exploit children remain a persistent and growing problem, and parents are increasingly worried about the rising rate of online child sexual abuse.
Read More: https://www.euronews.com/next/2021/08/06/apple-to-scan-us-iphones-for-images-of-child-sexual-abuse