"Algorithms can have errors": One man's quest to purge horrific pictures from his Google results
Last week, we covered the story of Camping Alfaques, a Spanish vacation spot whose owner recently sued Google in a local court. His concern: top search results that feature grisly images of dead bodies from an old tragedy. Such cases have so many implications for the future of search engines and the companies that depend on them that we spoke to the owner of Camping Alfaques to learn more about his situation. He told us what led him to sue Google, how much the case matters to him, and why he doesn't want anything "deleted" from the 'Net—just relocated.
Mario Gianni Masià, now the owner of an oceanfront vacation spot called "Camping Alfaques" on Spain's eastern Mediterranean coast, was a child in 1978 when a tanker truck exploded into a fireball on the road just beyond the site. Twenty-three tons of fuel ignited, immediately turning 200 campers to ash and badly burning several hundred more. Safely on the other side of the camp, Mario was unscathed.
Photographers descended, of course; pictures were snapped, graphic shots of bodies stacked like charcoal, carbonized arms rising from the earth. Newspapers covered the deaths. A movie was made. But 30 years is a long time, and while memories of the disaster never vanished, visitors to the campground didn't have the most shocking images shoved in their faces just for planning a trip.
Until two and a half years ago, Mario says, when Google's algorithm changed. Suddenly, right there in the top results for his vacation business, were black-and-white photos of charred human flesh. Mario has been trying ever since to convince Google that it needs to change this. He doesn't suspect any malice on Google's part, but neither is he willing to throw up his hands in the face of Google's ranking equations.
"Algorithms can have errors," he said when we spoke by phone this week. "Significant errors, like this one, have to be addressed."
Years of trying to get them addressed have had little effect. A regional IT consultant told him that the websites hosting the pictures had no interest in making any changes, so Mario decided to try Google. He began reporting the images as offensive using Google's own tools, sometimes flagging each one five times a day, to no effect. He sent a certified letter to Google, begging the company to associate the graphic images with searches for the accident rather than with generic searches for his campground; Google said there was nothing it could do.
Mario refused to accept this. After all, this business meant something to him. Camping Alfaques was a family affair, started by his mother and father in 1956. They built a loyal clientele of tourists from across Europe, many returning year after year. When the disaster happened, clients and locals pulled together to support Camping Alfaques.
Mario grew up and left Spain to do stints at universities like Harvard, then worked at an international law firm based in the UK. Eight years ago, he decided to take over Camping Alfaques when his father retired. He knew—the whole family knew—that the 1978 tragedy would always be a part of their history, and they had no desire to erase it.
"We accept this is a historic event," he said. He also would not change the name of his business to avoid associations with the tragedy, saying that it would be "like trying to erase history."
But the pictures unnerved him—and he couldn't understand why they showed up where they did. For instance, a search for "accidente camping alfaques" brought up mentions of the tragedy but no graphic thumbnails. Yet searching just for the campground ("camping alfaques") provided four terrible images just below the third result.
"We don't want to erase history, we want them [Google] to classify properly the information," Mario told me. He cited other searches, such as the one for the "MGM Grand" in Las Vegas. The hotel suffered a famous fire, but no picture or links to it appear just by searching the hotel name. Only specific searches for the tragedy bring up images of the burning building. Most of the world's accidents and terror attacks don't bring up such "horrific close-ups of people," Mario said.
After taking legal advice on the question and making no progress with Google, he finally sued the company's Spanish subsidiary in a local "court of first instance" in Spain. He didn't want any money; he wanted the images moved to other searches that he argues are more appropriate for the information. Last week, a judge threw the case out, saying that US-based Google Inc. had to be sued instead, as it was the party responsible for the algorithm.
The algorithm has spoken
Google has a famous aversion to hand-tuning anything. The company is all about the automated algorithm; if search results have a problem, it prefers to fix the algorithm itself rather than hand-edit the results. Such a stance has an engineer's logic to it, but it also helps insulate the company from being caught up in millions of disputes over just what should appear with any given search result. (Not that this stops all such attempts; copyright holders are currently arguing that Google has a duty to show legitimate sites ahead of links to pirated content, for instance.)
Mario stresses that his quest isn't quite like the current "right to be forgotten" cases in Spain, in which the Spanish data protection authority is trying to get personal information removed from Google searches and sites. He doesn't want results banned from the search engine, just "disambiguated."
Google has a long history of such complaints. Back in 2004, Google searches for "Jew" returned as their first result a site called "JewWatch"—yes, it's as bad as it sounds. Outrage abounded; why was Google endorsing what many considered hate speech as the very first result? The Anti-Defamation League reassured its members that "the ranking of JewWatch and other hate sites is in no way due to a conscious choice by Google, but solely is a result of this automated system of ranking."
Still, Google took an unusual step—it put up a special page to explain what had happened. "If you recently used Google to search for the word 'Jew,' you may have seen results that were very disturbing," the company wrote. "We assure you that the views expressed by the sites in your results are not in any way endorsed by Google."
And then it explained what it would and would not do to address the issue:
Individual citizens and public interest groups do periodically urge us to remove particular links or otherwise adjust search results. Although Google reserves the right to address such requests individually, Google views the comprehensiveness of our search results as an extremely important priority. Accordingly, we do not remove a page from our search results simply because its content is unpopular or because we receive complaints concerning it. We will, however, remove pages from our results if we believe the page (or its site) violates our Webmaster Guidelines, if we believe we are required to do so by law, or at the request of the webmaster who is responsible for the page.
The company wasn't kidding about being reluctant to intervene manually. Years later, JewWatch remains the second result for "Jew," while a site headlined "THE INTERNATIONAL JEW - THE WORLD'S FOREMOST PROBLEM" comes in at number five. And yet Google does make one concession to manual intervention (though not in the results themselves); it runs an AdWords ad on the page pointing to its own explanation page and saying, "We're disturbed about these results as well. Please read our note here."
Similar high-profile incidents include the George Bush "Google bomb," in which coordinated linking made the president's official biography the top result for searches on "miserable failure." And for years, a (purposely obscene) redefinition of presidential candidate Rick Santorum's last name was the top result for "Santorum."
In both cases, Google refined its underlying algorithm to address the general techniques that allowed such pages to rise, rather than editing the offending results by hand. Its efforts against Google bombs have sanitized George Bush's results, and a recent change bumped the Santorum redefinition down to the fourth spot. Search blogger Danny Sullivan reported that Google had taken no manual action on the Santorum ranking, but it had tweaked its search algorithms to deprioritize "irrelevant adult content" and to better recognize "official" sites about groups and people.
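That distinction, changing a general ranking rule instead of hand-editing one query's results, can be pictured as a re-ranking pass over already-scored pages. The sketch below is purely illustrative: the labels, weights, and example pages are invented here and bear no relation to Google's actual signals.

```python
# Toy re-ranking pass in the spirit of the changes described above: demote
# results labeled as adult content when the query itself isn't an adult
# query, and boost results recognized as "official" pages. Every label,
# weight, and page below is invented for illustration.
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    base_score: float        # relevance score from the main ranking pass
    adult: bool = False      # hypothetical classifier label
    official: bool = False   # hypothetical "official site" label

def rerank(results: list[Result], query_is_adult: bool) -> list[Result]:
    def adjusted(r: Result) -> float:
        score = r.base_score
        if r.adult and not query_is_adult:
            score *= 0.2     # demote adult content the query didn't ask for
        if r.official:
            score *= 1.5     # favor the subject's official page
        return score
    return sorted(results, key=adjusted, reverse=True)

# For a non-adult query, the official page overtakes a higher-scored adult one.
pages = [Result("adult-parody.example", 0.9, adult=True),
         Result("official-site.example", 0.7, official=True)]
print([r.url for r in rerank(pages, query_is_adult=False)])
# -> ['official-site.example', 'adult-parody.example']
```

The appeal of a rule like this, from Google's point of view, is that it applies to every query at once; no individual result ever gets touched by hand.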
Even more disclaimers?
But critics continue to raise questions about Google's level of responsibility for search results. Technology scholar and contrarian Evgeny Morozov recently wrote in Slate about the power of the Web to disseminate every variety of nutball claim and to sustain the communities that believe them. One possible solution, he suggested, might be getting search engines more involved in "curating" their results for accuracy:
Google already has a list of search queries that send most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently than normal queries? Thus, whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google may simply display a huge red banner asking users to exercise caution and check a previously generated list of authoritative resources before making up their minds.
In more than a dozen countries Google already does something similar for users who are searching for terms like "ways to die" or "suicidal thoughts" by placing a prominent red note urging them to call the National Suicide Prevention Hotline. It may seem paternalistic, but this is the kind of nonintrusive paternalism that might be saving lives without interfering with the search results. Of course, such a move might trigger conspiracy theories of its own—e.g. is Google shilling for Big Pharma or for Al Gore?—but this is a risk worth taking as long as it can help thwart the growth of fringe movements.
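What Morozov describes is mechanically simple: match incoming queries against a curated list and, on a hit, layer a warning over the results. Here is a minimal sketch of that idea; the flagged queries and resource links are invented placeholders, not anything Google actually maintains.

```python
# Minimal sketch of the banner mechanism described above. The flagged
# queries and resource URLs are invented placeholders for illustration.
FLAGGED_QUERIES = {
    "moon landing hoax": ["https://www.nasa.gov/"],
    "vaccines cause autism": ["https://www.who.int/", "https://www.cdc.gov/"],
}

def annotate(query: str, results: list[str]) -> dict:
    """Return the results untouched, plus an optional caution banner."""
    resources = FLAGGED_QUERIES.get(query.strip().lower())
    banner = None
    if resources:
        banner = {
            "message": "Exercise caution: check these authoritative "
                       "resources before making up your mind.",
            "resources": resources,
        }
    return {"results": results, "banner": banner}

print(annotate("Moon landing hoax", ["fringe-site.example/the-proof"]))
```

The salient design choice is that the ranked results themselves are never altered; the warning is layered on top of them, which is what makes the paternalism "nonintrusive" in Morozov's terms.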
Such a move would be unavoidably political. Who decides which pseudoscience gets the special treatment? Who decides what counts as pseudoscience in the first place? Law professor Derek Bambauer worries about how far this might be taken:
Should the site implement a disclaimer if you search for “Tommy Lee Pamela Anderson”? (Warning: sex tape.) If you search for “flat earth theory,” should Google tell you that you are potentially a moron? I don’t think so. Disclaimers should be the nuclear option for Google - partly so they continue to attract attention, and partly because they move Google from a primarily passive role as filter to a more active one as commentator. I generally like my Web results without knowing what Google thinks about them.
From Google's perspective, such a move poses more than political problems; it's also revoltingly manual. And manual intervention doesn't scale the way automated solutions do. For Google, scale is everything.
Moving forward
Google trusts that the 500 tweaks it makes each year to its search algorithms—and the Web links that lie behind them—will increasingly point to truth and accuracy. But they haven't yet helped Camping Alfaques.
Mario insists that the algorithm "is not an absolute truth" about the world and that Google has a duty to fix it when presented with situations like his own. "Not to correct significant errors in results is totally unfair," he said.
He's still considering how far to press the issue. Suing US-based Google—and enforcing any judgment against it—could be difficult and expensive. A successful appeal of last week's Spanish court ruling would be simpler, and might allow Camping Alfaques to go after the easier-to-reach Google Spain. Mario will make a decision next week.
Even a total victory would be only partial, however. Microsoft's Bing search engine also includes thumbnails—and the first one under "camping alfaques" is a disturbing photo of an incinerated child crouched in the mud, arms open in supplication, something like a grin stretched across his face.
These aren't the kinds of images Mario believes are appropriate for general searches about his property, but for the moment there's nothing he can do. His case turns on a simple but hugely contentious question, one that has become increasingly important for search engines and the people who rely on them: should there be something he can do?