Apple Inc. said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.
As part of new safeguards involving children, the company also announced a feature that will analyse photos sent and received in the Messages app to or from children to see if they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
If Apple detects a threshold of sexually explicit photos of children in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing & Exploited Children, or NCMEC, which works with law enforcement agencies. Apple said images are analysed on a user’s iPhone and iPad in the U.S. before they are uploaded to the cloud.
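The workflow described above can be sketched in a few lines: an account is queued for human review only once the number of database matches crosses a preset threshold. Note this is a minimal illustration, not Apple's implementation; the `MATCH_THRESHOLD` value is an assumption, as the article does not state what threshold Apple uses.

```python
# Hypothetical sketch of the threshold flow described in the article.
# MATCH_THRESHOLD is an assumed value for illustration only.
MATCH_THRESHOLD = 30

def should_queue_for_review(match_count: int) -> bool:
    """Flag an account for manual review only after enough matches accrue,
    so a single false match cannot trigger a report on its own."""
    return match_count >= MATCH_THRESHOLD

print(should_queue_for_review(3))   # False: below threshold, nothing happens
print(should_queue_for_review(30))  # True: queued for manual review
```

The threshold is what lets Apple claim it "can't learn" about accounts with only incidental matches: no human sees anything until the count is crossed.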
Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC. The company is using a technology called NeuralHash that analyses images and converts them to a hash key, or unique set of numbers. That key is then compared with the database using cryptography. Apple said the process ensures it can’t learn about images that don’t match the database.
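The hash-and-match step can be illustrated with a short sketch. The real NeuralHash is a proprietary perceptual hash, so that visually similar images produce the same key, and the comparison itself is done under cryptographic protection; the stand-in below uses an ordinary SHA-256 digest purely to show the shape of the check, and the function names and sample data are invented for illustration.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for NeuralHash: the real system derives a perceptual hash
    # from image features; SHA-256 here is only a placeholder.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known hashes (per the article, supplied by NCMEC).
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def matches_database(image_bytes: bytes) -> bool:
    # Only the derived key is compared against the database, never the
    # image content itself, which is the basis of Apple's privacy claim.
    return image_hash(image_bytes) in known_hashes

print(matches_database(b"known-image-1"))  # True: key is in the database
print(matches_database(b"holiday-photo"))  # False: no match, nothing learned
```

Because only keys are compared, a non-matching photo reveals nothing to the server, which is how Apple can say it "can't learn about images that don't match the database."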
Apple said its system has an error rate of “less than one in 1 trillion” per year and that it protects user privacy. “Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account,” the company said in a statement. “Even in these cases, Apple only learns about images that match known CSAM.”
Any user who feels their account has been flagged by mistake can file an appeal, the company said.