Google has begun the process of removing links to “outdated or irrelevant” content in European search returns.
Since the European Court of Justice ruling in May 2014, Google has established a “removals team”, and private services have even sprung up to offer support in bids to be forgotten.
Widely hailed as the right to be forgotten, the concept is a bit of a misnomer. Sure, search engines in Europe are compelled to forget certain sites as they relate to the applicant, but the sites themselves can retain the offending information. The ruling has simply made retrieving such information more difficult. You know, sort of like finding something before the internet age, when one had to know what one was looking for, not just a name to type into a search box.
For example, a newspaper that covered a crime in the past is not compelled to remove an archived article from its website, even if the person has links to that article removed from Google’s index. It stands to reason, then, that searching the news website itself for mention of a person would still return that article. One would simply need a sense of which sources might have records.

Moreover, with the use of a VPN, a user can appear to be searching from elsewhere in the world and still retrieve such information through Google in the US, for example.
The Big Grey Search Area
The inconsistencies of enforcing virtual forgetting aside, the ruling itself has left me conflicted.
On one hand, everyone is bound to make mistakes at some point in their lives – should they suffer forever, with a trace of some stupid past transgression appearing in search returns for their name?
On the other hand, what if all a person seems to do is make “mistakes” – the sort that harm other people and form a pattern rather than a fluke? A digital trail might prevent others from falling victim.
What happens if the information in question involves more than the applicant? Say it involves a culprit and their victim: does the victim have the right to insist the information remain indexed by Google? Would other parties even be consulted?
The other problem with this new ruling is context. Search engines are now required to forget information that is “inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes”. Yet who determines what is relevant or excessive? For now that seems to be Google, which, as The Guardian reports, “has set up an online form where people can request removal of links, and says people can appeal to data protection authorities if they disagree with its decision.”
While privacy advocates applaud the ruling – and push Google to adopt the outcome internationally – it all seems a bit reactionary, with little thought given to implementation or to how a legitimate claim will be distinguished from an illegitimate one. Just another symptom of society’s inability to catch up with the changes brought by digital technology.
It will be interesting to see how it all unfolds. Given that this is happening through Google, though, I suspect little will be revealed. Unlike a public institution, they will not be compelled to release information – and indeed, given privacy regulations, they might not be at liberty to disclose much at all.
This, of course, is but one interpretation. What are your thoughts? Is this good? What about the moral implications? Is it something that can be reasonably implemented? Care to share your predictions?