Apple is attempting to assuage concerns about its new child safety features, which would scan photos uploaded to iCloud and also analyse and blur any sexually explicit images sent or received by the iMessage app on your child’s iPhone. Since Apple announced its Child Sexual Abuse Material (CSAM) detection plans, there have been concerns that this could become a surveillance tool if governments get access to it. Apple has categorically said that it will neither share any information with law enforcement nor allow this to become a backdoor for any government. Just yesterday, the Facebook-owned WhatsApp continued the privacy-focused war of words with Apple, claiming that the tech giant has built software that scans all the private photos on your iPhone.
• Is this enabled on all iPhones? First and foremost, the child safety feature needs to be enabled by parents on a child’s Apple iCloud account for their iPhone, iPad or Mac. Only then will the device evaluate images stored in iCloud, or sent or received on iMessage, for sexually explicit content. “If the feature is enabled for the child account, the device will evaluate images in Messages and present an intervention if the image is determined to be sexually explicit,” says Apple. These features arrive later this year with the new operating systems: iOS 15 for the iPhone, iPadOS 15 for the iPad, watchOS 8 for the Apple Watch, and macOS Monterey for Mac computers.
• Can agencies and governments access your photos? Apple says that the information and any CSAM detections will not be shared with the police or government agencies. These checks work only for photos saved to iCloud and for messages sent and received using iMessage. Apple also says it never gains access to any messages that users send or receive.
• “Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” says Apple.
• It is interesting that Apple says this, because there are two sides to that coin. First is the constant pushback the tech giant has mounted when government agencies come calling for access to user data or to unlock an individual’s iPhone, a case in point being Apple’s tussle with the FBI over the San Bernardino shooter’s iPhone. Apple has also championed data privacy by rolling out severe restrictions on apps that try to track users to serve advertisements; popular apps, including TikTok, have been caught on the wrong side of the fence as a result. Yet Apple also sells iPhones without the FaceTime app in some countries, has removed thousands of apps from the App Store meant for users in China, and is alleged to be storing Chinese users’ data on servers that the Chinese authorities reportedly have access to.
• How does this work? First and foremost, parents and guardians have to opt in and enable communication safety for their child’s account. If a child aged 12 or younger then receives or sends an image that the system detects as sexually explicit, the image will come with a warning asking if they wish to continue viewing it, and the parents or guardians will also be sent a notification. Children between the ages of 13 and 17 get the same warning for any such image or content, but their parents are not notified. (A simplified sketch of these rules appears after this list.)
• Can Apple scan all your iPhone photos? Apple says that CSAM detection only applies to photos that are uploaded to and stored in iCloud. “Even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device,” says Apple.
• How does CSAM know what images to scan or look for? The CSAM detection system scans iCloud Photos and derives its capabilities from image hashes provided by NCMEC and other child safety organizations. “This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos,” says Apple. The reporting to NCMEC applies only in countries, such as the US, where possessing such images is a crime. (A sketch of how this hash matching works in principle appears after this list.)
• What about false positives or targeting? “Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC,” says Apple. (A simplified sketch of this review flow also appears after this list.)
• What about encryption in iMessage? Apple insists that this doesn’t change the privacy of iMessage communications and chats. The important bit to note here is that the child safety feature has to be enabled for an iCloud account before images suspected to be sexually explicit are detected and evaluated. Apple insists that none of the communications, images, image evaluations, interventions for suspected images or warning notifications are shared with Apple or anyone else.
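For the opt-in flow described under “How does this work?”, the rules reduce to a simple age check. The sketch below is purely illustrative and is not Apple’s code; the type and function names (ChildAccount, Intervention, intervention(for:imageIsSexuallyExplicit:)) are hypothetical stand-ins for the behaviour the company has described.

```
// Illustrative sketch only, not Apple's implementation; all names are hypothetical.
struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool  // opted in by a parent or guardian
}

enum Intervention {
    case none
    case warnChild                  // blur the image and ask before viewing or sending
    case warnChildAndNotifyParents  // warning plus a notification to parents or guardians
}

func intervention(for account: ChildAccount, imageIsSexuallyExplicit: Bool) -> Intervention {
    // No opt-in, or no explicit content detected: nothing happens.
    guard account.communicationSafetyEnabled, imageIsSexuallyExplicit else { return .none }
    switch account.age {
    case ...12:   return .warnChildAndNotifyParents  // 12 or younger: parents are also notified
    case 13...17: return .warnChild                  // 13-17: warning only, parents not notified
    default:      return .none                       // adult accounts are not covered by this feature
    }
}
```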
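For “How does CSAM know what images to scan or look for?”, the core idea is comparing a hash derived from each photo against a fixed set of known hashes. The sketch below is only an approximation under stated assumptions: Apple’s actual system uses its perceptual NeuralHash plus cryptographic matching protocols rather than the SHA-256 stand-in shown here, and the hash set is an empty placeholder rather than real NCMEC data.

```
import Foundation
import CryptoKit

// Illustrative sketch only. In the system Apple describes, the set of known CSAM hashes
// ships inside the operating system; here it is just an empty placeholder.
let knownCSAMHashes: Set<String> = []

// SHA-256 is a stand-in used for illustration; Apple's system uses a perceptual hash
// (NeuralHash) so that visually identical images still match after resizing or recompression.
func matchesKnownCSAM(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownCSAMHashes.contains(digest)
}
```

Apple’s published design additionally uses a match threshold and encrypted “safety vouchers” so that the company learns nothing about photos that do not match; none of that machinery is modelled in this sketch.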
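And for “What about false positives or targeting?”, the safeguard Apple describes is a human review gate before any report. Again, this is only an illustrative sketch with hypothetical names, not Apple’s implementation.

```
import Foundation

// Illustrative sketch only; types and names are hypothetical.
enum ReviewOutcome {
    case noAction       // matches are not confirmed as known CSAM: no report, account stays active
    case reportToNCMEC  // a human reviewer confirms the matches before any report is filed
}

func handleFlaggedAccount(matchedImages: [Data],
                          humanReviewConfirms: ([Data]) -> Bool) -> ReviewOutcome {
    guard !matchedImages.isEmpty else { return .noAction }
    // There is no automated reporting to law enforcement; human review gates any report to NCMEC.
    return humanReviewConfirms(matchedImages) ? .reportToNCMEC : .noAction
}
```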