Google accuses you of child abuse? Impossible! Right?
European lawmakers are working on a proposal that could force companies to scan all the messages we exchange for child sexual abuse. The goal is noble, of course, but it can very easily go wrong. And when it does go wrong, you might suddenly find yourself accused of sexually abusing children.
All photos safe in the cloud
The New York Times reported a month ago on an American father with a sick child. The boy’s penis was swollen and painful. Their general practitioner asked the father to send pictures of the genitals. The father took those pictures with his phone. The phone was set up to automatically upload each photo to Google’s cloud as well. Handy, in case he lost his phone. But Google’s computers run artificial intelligence that analyzes the photos of all users. If the software suspects sexual abuse of children, all alarm bells go off. And that is what happened here, with all the consequences that entails: Google deleted the father’s account and alerted the police.
A human as fail-safe
Whoops. There was, of course, no child abuse at all. But because the photos are evaluated automatically, such errors are unavoidable, partly because computers are far worse than humans at assessing the context in which a photo was taken. That is why policymakers who want to use this type of technology often propose a human fail-safe: before any action is taken, a person of flesh and blood must first have looked at the image. But that is a poor “solution” for fixing a system that we cannot get to work flawlessly. Because when you think about it, that kind of checking means asking ordinary people at ordinary companies to look at images of child abuse. These are not people who are trained for this, or who receive the much-needed psychological support for it, like the people working in the police’s special victims department. And to make matters worse, these companies have a particularly bad reputation when it comes to supporting their moderators.
A major tech company is accusing you (wrongly!) of child sexual abuse. Very troubling.
Mistakes continue to occur
In addition, viewing images of child sexual abuse is punishable by law in the Netherlands. So you would be asking companies to hire employees to do something illegal. In response, policymakers came up with a new proposal: an independent body would be set up to assess such images. Companies would forward your photos to it without looking at them. But that is of course also very problematic, because it puts the confidentiality of your photos in even more peril. What parent would want to see sensitive photos of their child spread even further? Not a single one.
But okay, in the eyes of some policymakers, a human fail-safe could fix the problem of computer errors. The New York Times story shows that human review, at Google at least, did not work so well. Google reported its user to the police after its own check. And even after the police confirmed that nothing illegal had happened, Google stuck to its own verdict.
False accusation
Take a moment to think about that. A major tech company is accusing you (wrongly!) of child sexual abuse. Very troubling. And then it reports you to the police. Even more deeply troubling. The police then start an investigation into you. For the father in the New York Times story, it meant that the police requested all available data from Google: his location data, search history, photos and files. That may be understandable, but it is just as troubling. And that father, ironically, is a developer of the very kind of software that labeled him a criminal, and probably knows relatively well how to fight back. But the vast majority of the population is less resilient and less at home in the fine print…
Lawmakers want technology companies to monitor communications that ought to be protected with end-to-end encryption.
The above may very well be dismissed as a voluntary (and stupid!) action by a single tech company. Or as a one-off incident. But it isn’t. There are more examples, including cases involving Dutch users.
Europe wants to make this the norm
But there is more to consider. European lawmakers are now discussing a bill that could force companies to scan all messages of all users for child sexual abuse. They want to make mandatory a practice that already regularly goes wrong at Google and other technology companies. So we are going to see stories like this more often.
Confidentiality of communication is indispensable for everyone, including children and victims of sexual abuse.
What makes it even more painful is that the European Commission’s proposal would end confidentiality on the internet. It wants technology companies to monitor communications that right now are still protected with end-to-end encryption. But the confidentiality of communication is indispensable for everyone, including children and victims of sexual abuse. The proposal must therefore be dropped. Together with our European partners, we are working very hard to achieve exactly that. Will you help us? Donate now!
This article was translated from Dutch by Celeste Vervoort and Philip Westbroek.