
Apple’s child protection plans: almost no one seems to like them


Apple unilaterally decided to propose a system to protect children from Child Sexual Abuse Material (CSAM). How? Through automatic scanning of the photos uploaded to iCloud, so that images that harm children can be detected. The news has pleased almost no one: many see it as a violation of user privacy. Does the end justify the means? Not even the EFF wants this Apple idea to go ahead.

Many organizations, including the EFF, do not want Apple’s CSAM idea to go ahead

Apple’s stated goal is to make people feel safe and empowered through technology: “We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).” That is why Apple is introducing new child safety features in three areas:

  1. Expanded parental controls.
  2. On-device machine learning in the Messages app that warns about sensitive content.
  3. Detection of collections of CSAM in iCloud Photos, information that Apple will share with the authorities (see the sketch after this list).
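
To make that third point concrete, here is a minimal, hypothetical sketch in Python of the general idea: compute a fingerprint for each uploaded photo, compare it against a database of hashes of known CSAM, and only escalate an account for human review once a threshold of matches is reached. The function `perceptual_hash`, the helper names, and the threshold value are illustrative assumptions, not Apple’s actual NeuralHash pipeline.

```python
# Sketch of hash matching against a database of known-CSAM fingerprints.
# Names and parameters are illustrative assumptions, not Apple's implementation.

from typing import Iterable, Set


def perceptual_hash(image_bytes: bytes) -> str:
    """Placeholder for a perceptual hash (Apple's system uses NeuralHash).

    A real perceptual hash maps visually similar images to the same value;
    here a cryptographic hash stands in only so the matching logic runs.
    """
    import hashlib  # exact-match stand-in, NOT perceptually robust
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploads: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many uploaded photos match the known-CSAM hash database."""
    return sum(1 for image in uploads if perceptual_hash(image) in known_hashes)


def should_escalate(match_count: int, threshold: int = 30) -> bool:
    """Flag an account for human review only once matches reach a threshold,
    which is meant to limit the impact of occasional false positives."""
    return match_count >= threshold
```

The threshold is the part aimed at the false-positive worry discussed below: a single accidental match is not supposed to be enough, on its own, to expose an account to review.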

This third point is where the biggest problems arise. Apple is supposed to be the paradigm of privacy, both offline and online. With these new methods, however, it seems that for the American company the end does justify the means: user privacy is invaded for the sake of child protection. I don’t know whether it is a good idea or a bad one, but what appears to have a laudable goal is becoming an ordeal for Apple.

For now the initiative is on hold and is supposed to be implemented later, but the deeper problem is that many companies, organizations and private individuals are up in arms over it. We might think that if “we have nothing to hide, we should not fear this technique, because it will never find anything.” The fear, however, is that the system gets confused and flags something that is not what it seems: the machines are good, but not foolproof. And we return to the usual question: does the end justify the means?

One of the organizations that has declared itself against this Apple initiative is the Electronic Frontier Foundation (EFF). It has asked Apple to completely abandon its child safety project. The group says it is pleased that Apple’s move is on hold for now, but it calls the plans, which include scanning user images for child sexual abuse material (CSAM), “a decrease in privacy for all iCloud Photos users.” The EFF petition against Apple’s original announcement now contains more than 25,000 signatures. Another, started by groups such as Fight for the Future and OpenMedia, contains more than 50,000.

Many people have signed against this initiative. The number may not be significant compared to Apple’s device sales, but it is enough for their position and their arguments to be taken into account. Some experts have warned that the feature could later be expanded to find other material if Apple bows to the inevitable pressure from governments.

EFF is pleased that Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups about the dangers posed by its phone scanning tools. But the company must go beyond just listening and abandon its plans to put a backdoor into its encryption altogether. The features Apple announced a month ago, intended to help protect children, would create an infrastructure that is all too easy to redirect toward greater surveillance and censorship. The measures would pose an enormous threat to the privacy and security of Apple users, offering authoritarian governments a new mass surveillance system with which to spy on citizens.

We will stay attentive to this fight between privacy and security, two fundamental values and rights set face to face.
