On January 12th, Google announced that they’d been
the target of a series of cyber attacks aimed at accessing the Gmail accounts of Chinese human rights activists. Their response was to stop censoring Google.cn
effective immediately. In the same week, Google was granted a patent that would enable them to
replace real billboards with virtual ones in Google Street View.
Leaving aside the utter surprise at finding myself on the pro-corporate side of the government v. megacorp sovereignty wars, both of these stories are about the same thing: filtering data streams to match one set of aims or another.
(Photo credit: Barabeke)

The China story is the kind of filtration with which we are most familiar. Censorship is old, well understood, and for the most part opposed by people of conscience. Even so, Google hasn't yet announced a unilateral end to their censoring practices in France and Germany, and there are plenty of things you aren't allowed to say all over the world.
The replaceable ads feel like something else entirely. If they get implemented, it probably won't be a wholesale replacement driven by a political agenda; it's more likely to be a patchwork replacement of the real. There's an appealing elegance to the move: they were ads anyway, now they are just different ads. (Weirdly, many of the real billboards are there illegally. There is a troubling chemistry at work when virtual advertising space is made available on the basis of improperly placed real advertisements.)
So much for the governmental and corporate.
There is a third layer of filtration, the personal layer. By putting up a set of filters, individuals can customize their experiences, cutting away the unwanted and subbing in whatever they please.
Consider the ways that you can remake YouTube in your image: you can make videos
downloadable; you can filter out all the comments that are
algorithmically determined to be stupid; you can even
filter out the whole site. Sites like
Give me back my Google attempt to remove affiliate and other spam-type links that Google hasn't filtered out on its own.
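To make that subtractive layer concrete, here is a minimal sketch of a comment filter. The is_stupid() heuristic is entirely hypothetical, a stand-in for whatever signal a real tool might actually use:

```python
# A toy subtractive filter. The heuristic below is entirely hypothetical,
# a stand-in for whatever signal a real comment filter might use.

def is_stupid(comment: str) -> bool:
    """Flag all-caps shouting and very short one-liners."""
    return comment.isupper() or len(comment.split()) < 3

def filter_comments(comments):
    """Keep only the comments the heuristic doesn't flag."""
    return [c for c in comments if not is_stupid(c)]

print(filter_comments(["FIRST!!!", "lol", "This changed how I think about filters."]))
# -> ['This changed how I think about filters.']
```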
Subtractive filtering isn’t the only approach.
Add Art is a Firefox plugin that replaces online advertising with curated art shows. The shows consist of images formatted to the
standard ad dimensions which replace real advertising on the fly. (I tried Add Art for a while but had to uninstall it when a show consisting of rapidly strobing GIFs rotated in.) If Google implements Street View virtual ads, maybe Add Art can implement Street View art shows on top of that.
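A rough sketch of how that kind of replacement filtering might work, keyed to standard IAB ad sizes. The file paths are hypothetical placeholders, not Add Art's actual mechanism:

```python
# A sketch of replacement filtering in the Add Art style: instead of just
# blocking an ad, swap in an art image of the same standard dimensions.
# The file paths are hypothetical placeholders.

ART_BY_SIZE = {
    (300, 250): "art/medium_rectangle.png",  # IAB medium rectangle
    (728, 90): "art/leaderboard.png",        # IAB leaderboard
    (160, 600): "art/skyscraper.png",        # IAB wide skyscraper
}

def replace_ad(width, height, ad_url):
    """Return an art image that fits the slot, or the original ad if
    the curated show has nothing in those dimensions."""
    return ART_BY_SIZE.get((width, height), ad_url)

print(replace_ad(728, 90, "ads/banner.gif"))  # -> art/leaderboard.png
print(replace_ad(468, 60, "ads/banner.gif"))  # -> ads/banner.gif (no match)
```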
Having a good set of filters is crucial. We are bathed in a sea of insistent data, and augmented reality is only going to make it worse. As
Bruce Sterling points out, an issue with augments is that they don’t scale. If you’re looking at any given square of surface, there’s only room for so much text, video or images. Very, very quickly, too many people will be trying to get your attention at once. You’ll need heavy filters to handle it all. What if they get too powerful?
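A toy model of that scaling problem: a surface has room for only a few augments, so the filter must rank the competing claims on your attention and drop the rest. The relevance scores here are invented for illustration:

```python
# A toy model of the scaling problem: a surface has room for only a few
# augments, so the filter ranks competing claims on your attention and
# drops the rest. Relevance scores are invented for illustration.

def visible_augments(augments, slots=3):
    """Rank augments by relevance and keep only what fits on the surface."""
    ranked = sorted(augments, key=lambda a: a["relevance"], reverse=True)
    return ranked[:slots]

competing = [
    {"label": "friend's note", "relevance": 0.9},
    {"label": "transit alert", "relevance": 0.7},
    {"label": "shoe ad", "relevance": 0.2},
    {"label": "restaurant ad", "relevance": 0.1},
]
print([a["label"] for a in visible_augments(competing, slots=2)])
# -> ["friend's note", 'transit alert']
```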
Overly powerful personal filters are a recurring fear. We worry that people watch the news, go to church, and generally build communities that confirm their opinions. Self-imposed censorship is probably the hardest to fight, as there isn't the curiosity or fascination with the forbidden to turn leaks into torrents.
Take the moment in John Carpenter’s
They Live when Roddy Piper puts on the glasses.
Now, imagine that
this guy got together the tools to make a set of glasses that highlighted occult symbols automatically. Imagine a set of glasses that filtered out objectionable images for whatever value of “objectionable images” you care to substitute.
It's pre-augmented reality, but the 2004 film Epic 2014 presents a compelling vision of a world where the filters have won in a big way. The resulting output is both the best curation of the news-you-need-to-know and the worst set of bias-confirming pablum, depending on the preferences and habits of the user.
The bias confirmation fear has been around for a long time. It was the main theme of my grade 11 “Media Literacy” class. The key has shifted from centralized media control to fears about distributed extremist groups, but the chorus is the same.
It wasn’t until this past week that I worked out the problem with the analysis. The trajectory assumed is of increasingly powerful and impregnable filters. If that trajectory holds, then one expects an increasingly balkanized culture, full of isolated groups that think they have nothing in common. But there’s a second set of actors in play, the ones being filtered out.
As the first group works harder to filter out unwanted messages, the second works harder to break through. We see it in the arms race around advertising. We see it in politicians struggling to find new ways of reaching their audience. We see it in Google's need to constantly update their PageRank algorithms as black hat SEOs learn to game the system.
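A deliberately naive sketch of one round of that arms race: a keyword-based spam filter, and the obfuscation a spammer might use to slip past it. Both sides are simplified; real filters and real spammers iterate far beyond this:

```python
import string

# One naive round of the arms race: a keyword spam filter and the
# obfuscation a spammer might use to slip past it. Both sides are
# deliberately simplistic.

SPAM_WORDS = {"viagra", "cheap", "winner"}

def looks_like_spam(message: str) -> bool:
    """Flag a message if any known spam word appears in it."""
    words = {w.strip(string.punctuation) for w in message.lower().split()}
    return bool(words & SPAM_WORDS)

print(looks_like_spam("You are a winner, claim your prize"))  # True
print(looks_like_spam("You are a w1nner, claim your prize"))  # False: evaded
```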
So long as the arms race continues, the filters will get better without becoming perfect. And in those cracks, reality (or at least an alternate viewpoint) can intrude. Insofar as we believe that people can’t know in advance what is best for them or what information they should receive, we should celebrate inefficiencies in filters.
In every successfully delivered spam message, there is a ray of hope.
Originally published at Quiet Babylon.