The Wikimedia Foundation on Wednesday released its first-ever transparency report, and with it a protest against Europe’s “right to be forgotten” ruling. Wikimedia is the nonprofit owner of Wikipedia and other sites.
“Last week, the Wikimedia Foundation began receiving notices that certain links to Wikipedia content would no longer appear in search results served to people in Europe,” wrote Wikimedia General Counsel Geoff Brigham and Legal Counsel Michelle Paulson.
“Denying people access to relevant and neutral information runs counter to the ethos and values of the Wikimedia movement,” they added. “The Wikimedia Foundation has made a statement opposing the scope of the judgment and its implications for free knowledge.”
‘An Open and Neutral Space’
Wikimedia granted none of the 304 general content removal requests it received over the past two years, Brigham and Paulson noted in the organization’s transparency report.
“The Wikimedia Foundation is deeply committed to supporting an open and neutral space, where the users themselves decide what belongs on the Wikimedia projects,” they said.
Meanwhile, the foundation granted just 14.3 percent of requests for user data.
“As part of our commitment to user privacy, Wikimedia collects little nonpublic user information, and retains that information for a short amount of time,” they explained.
‘Accurate Results Are Vanishing’
At the same time, however, the Wikimedia Foundation has received multiple notices of intent to remove links to Wikipedia content from European search results.
“To date, the notices would affect more than 50 links directing readers to Wikipedia sites,” Brigham and Paulson noted.
With the European Court of Justice’s decision in Google Spain v. AEPD and Mario Costeja González, “the European court abandoned its responsibility to protect one of the most important and universal rights: the right to seek, receive, and impart information,” they charged.
“As a consequence, accurate search results are vanishing in Europe with no public explanation, no real proof, no judicial review, and no appeals process,” Brigham and Paulson said. “The result is an Internet riddled with memory holes — places where inconvenient information simply disappears.”
The Wikimedia Foundation did not respond to our request for further details.
‘It Took Some Effort’
“I think they’re overstating the case,” John Simpson, director of Consumer Watchdog’s Privacy Project, told TechNewsWorld. “I don’t think they understand the privacy issues involved.”
Consumer Watchdog supports the right to be forgotten, Simpson added, but “it has to be a balance between the right to know and the right to be forgotten.”
Before the digital age, “if I did something when I was young and foolish, it went into the files, but over time it was forgotten,” he explained. “If someone was really seriously interested in finding out the details about me and my past, they could go through the public records and find stuff, but it took some effort.”
There was a sort of built-in self-correction, in other words, whereby events and data that became less important and less relevant over time naturally faded out of the limelight.
‘Just a Few Clicks Away’
Today, on the other hand, “privacy by obscurity has vanished, and everything is just a few clicks away,” Simpson said.
It’s not a question of whether factual information should be taken down, he pointed out. Rather, the question is whether that information should continue to be directly accessible by searching the name of the individual to whom it pertains.
“I think the knee-jerk reaction on the part of tech companies is to comply in a way that undermines the effect of what was intended,” he said. “No one is saying Wikipedia has to take down anything — just that the link from anyone’s name would have to be removed. The data would still be there.”
In short, “I think there needs to be clearer guidelines coming from data protection authorities in Europe,” Simpson concluded.
‘The Sky Is Falling’
“Interestingly enough, the Google Spain case doesn’t apply to Wikimedia — it isn’t really a search engine,” John Tomaszewski, an attorney in the international data protection practice of Seyfarth Shaw, told TechNewsWorld.
“If you look at the primary concern the court had with Google, it was with the capability to aggregate data from thousands of different sources together almost instantly,” he explained.
“The case didn’t say the newspaper — which was the original source — had to remove the story, but that the interlinking data points between that story and many other unrelated data points were what had to get removed,” said Tomaszewski.
In other words, Wikimedia was “more like the Spanish newspaper that didn’t have to delete its archive than Google, which had to disable the linkages between data points,” he concluded. “So, I think that there is a little of ‘the sky is falling’ here.”
‘Transparency Is Key’
It’s important to remember that “the information in the European case was true — it was a matter of public record and lawfully posted,” noted Emma Llanso, director of the project on free expression for the Center for Democracy and Technology.
“The court didn’t give a lot of guidance on information that’s in the public domain as to when it’s no longer relevant or excessive,” she told TechNewsWorld.
Nevertheless, “companies are not in a good position to make these decisions — it’s on the courts to provide more guidelines for how to achieve this balance,” Llanso said.
“I think what Wikimedia is pointing out is crucial,” she stressed. “Transparency is a key to preventing abuse. Whatever additional rules they come out with, there’s got to be a way to inform the general public that search results are being altered at someone else’s request. Otherwise, they’re getting manipulated information without a real understanding that this is happening.”