
It sounds like a nightmare, yet it could happen to anybody: In this case, unknown people stole nude pictures and sex videos from a young woman’s private cloud and spread them on internet porn sites. Worse still, the content could be found via a simple Google search of the woman’s name, because her ID had also been stolen from the cloud.
She turned to HateAid, a German non-profit that provides support for people affected by online hate and digital violence. With the organization’s help, she contacted many of the sites involved and reported more than 2,000 URLs that could be found via Google’s image search. Though Google usually removes such content, the images and videos kept reappearing online and in search results, as did deepfakes, which are generated and manipulated by artificial intelligence (AI).
The case has raised a number of crucial questions: How extensive should online data protection be? What are the worst-case scenarios, particularly for women and those perceived to be female? What can be done?
Violation of privacy ‘akin to rape’
The woman, who wishes to remain anonymous and is referred to as Laura on HateAid’s site, found out by chance that her data had been stolen when she one day decided to look up her own name online. She told the German weekly magazine Der Spiegel that discovering intimate pictures and videos of herself online, none of which were ever intended for publication, felt akin to having been raped. Her life has since changed completely. She has moved homes, changed jobs and now suffers from post-traumatic stress disorder.
After failing to get Google to remove the content from its search results out of court, she has now sued the company in Ireland, where it has its European headquarters.
HateAid is supporting her in this endeavor. “We are covering all the costs and future cost risks in this case, because very few people affected [by such a case] can actually imagine taking the risk of suing a corporation like Google,” HateAid CEO Josephine Ballon told DW. She added that she hoped there would be a landmark ruling that would clarify whether search engines are legally obliged to permanently remove images from search results after they are reported, even if they are re-uploaded elsewhere.
How can Google be forced to protect data?
“Eleven years ago, the European Court of Justice made data protection history with its landmark ruling on the ‘right to be forgotten,’” data protection expert and computer scientist Marit Hansen told DW. “The current case aims to build on that.”
The right to be forgotten allows a person to request the removal of their personal data if certain conditions apply.
Hansen, the data protection commissioner for the German state of Schleswig-Holstein, said it made sense that, in accordance with the EU’s General Data Protection Regulation (GDPR) and the fundamental right to data protection, a person’s use of their own data must remain manageable.
“It is not surprising that obligations for global search engine providers will arise. However, the extent of these obligations with regard to images must now be delineated.”
Asked whether it was technically possible to filter out certain results, Hansen said this was relatively easy regarding exact copies, where all the bits in an image file matched. What was more complicated, she explained, was when it came to content containing deviations from the original, for instance, due to cropping or alterations made with the help of AI.
“This issue is related to the possibilities offered by reverse image searches, which various companies, including Google, provide,” she said. A reverse image search involves uploading an image and asking a search engine to look for similar images. It is not 100% reliable and often delivers incorrect results.
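The distinction Hansen draws can be illustrated with a minimal sketch. Exact copies are trivially detectable with a cryptographic hash of the file bytes, but a single changed byte (from cropping or re-encoding) defeats that check; a perceptual hash such as dHash instead fingerprints the image's visual structure, so small alterations change only a few bits. All data below is synthetic and the functions are illustrative, not any search engine's actual method.

```python
import hashlib

# --- Exact-copy detection: identical bytes give identical digests ---
original = b"\x89PNG illustrative fake image bytes"
copy = bytes(original)
altered = original[:-1] + b"\x00"  # e.g. one byte changed by re-encoding

def fingerprint(data: bytes) -> str:
    """SHA-256 over raw file bytes: catches bit-identical copies only."""
    return hashlib.sha256(data).hexdigest()

assert fingerprint(original) == fingerprint(copy)     # exact copy: caught
assert fingerprint(original) != fingerprint(altered)  # tiny edit: missed

# --- Toy perceptual hash (dHash-style) over a grayscale pixel grid ---
# Each bit records whether a pixel is brighter than its right neighbour,
# so uniform brightness tweaks or mild recompression flip few bits.
def dhash(pixels: list) -> int:
    bits = 0
    for row in pixels:
        for a, b in zip(row, row[1:]):
            bits = (bits << 1) | (1 if a > b else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

grid = [[10, 20, 30], [90, 60, 30], [5, 5, 80]]
tweaked = [[11, 21, 31], [91, 61, 31], [6, 6, 81]]  # slightly brightened copy
print(hamming(dhash(grid), dhash(tweaked)))  # → 0 (visually near-identical)
```

A real system would compute such hashes over downscaled images and treat a small Hamming distance as a probable match, which is also why false positives and negatives occur, as Hansen notes.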
Hansen explained that search engine providers could use this to argue that the technical accuracy for filtering identical images simply wasn’t high enough yet. But she said that in principle, Google and others should be held accountable. Google did not respond to a request by DW to answer questions regarding the current case.
Our nudes are #NotYourBusiness
HateAid sees Laura’s case not only as an example of the problems related to data protection and privacy on the internet, but also of image-based sexualized violence against women and those who are perceived as female, and of the profit that companies such as Google make from it. “A search engine makes content accessible to a wide audience and profits from the resulting clicks,” said Ballon.
HateAid is accompanying its lawsuit with a campaign: Our nudes are #NotYourBusiness. “Women and people perceived as female are particularly affected by the misuse of intimate images, or fake images and videos created using AI. These days, all you really need is a LinkedIn profile picture. This is a problem that affects society as a whole, and it is growing, as we see from our counseling services for victims,” said Ballon.
Not only celebrities such as Taylor Swift at risk
Many people probably associate image-based sexual violence with celebrities. In 2014, hundreds of nude photos of mainly female celebrities were leaked after a huge hacking attack that became known as “Celebgate.” The US singer Taylor Swift and the Italian Prime Minister Giorgia Meloni are just two of the many women who have been the victims of deepfake videos.
But Laura’s case shows that such attacks can also target people who are not in the public eye.
The aim of HateAid’s case is to force Google to provide better protection for victims of such attacks, as well as to push for the creation of deepfakes to be made a criminal offense when the person depicted has not given consent.
“Without this, those affected will have to search for images their whole lives and submit manual removal requests,” said Ballon. “This is an incredible psychological burden that should not exist and does not have to exist.”
This article was originally published in German.