Digital Rights Group Is ‘Pleased’ That Apple Has Put Its Child Safety Features on Hold; Wants Complete Abandonment

Apple recently announced that it is delaying its CSAM detection features, saying it wants to improve them before rolling them out at an unconfirmed date. One digital rights group is pleased with the decision, but according to its latest announcement, it wants the technology giant to abandon those features completely.

Electronic Frontier Foundation (EFF) Claims That Apple’s CSAM Detection Features Will Endanger iPhone Users’ Privacy and Security

Earlier, Apple said in its statement that it was delaying its child safety features because of the feedback it received from various groups.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The EFF is happy with the step Apple has taken, but it is not fully satisfied with the company’s efforts, saying that these child safety features need to be abandoned completely.

“But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely. The features Apple announced a month ago, intending to help protect children, would create an infrastructure that is all too easy to redirect to greater surveillance and censorship.”

The group also states that over 90 organizations across the globe are urging Apple not to implement these features, fearing that their introduction would lead to the censorship of protected speech and threaten the privacy and security of iPhone users. The EFF’s petition demanding that Apple completely abandon its child safety features has reached 25,000 signatures so far.

However, since announcing the delay of its CSAM detection feature, Apple has not made any public statement regarding an update, so we will keep our ear to the ground and let our readers know when the next step is taken.

News Source: EFF
