We typically think of censorship as the active prohibition of the creation or distribution of speech (or similar content). But today we learn of an insidious method of censorship in current use here in the U.S.: tampering with search engines. POPLINE, the world's largest reproductive health database, recently redefined the word "abortion" as a "stop word" - a word that, like "a", "an", and "the", is ignored by the search engine. The effect, naturally, was to cause searches for the term "abortion" to come up empty. "Stop word" censorship doesn't eliminate the underlying material; it simply makes it impossible to find by rational searching.
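The mechanics are simple enough to sketch. Here is a toy search engine - purely illustrative, not POPLINE's actual code - showing how adding a single word to the stop list makes every query for that word come up empty, while the underlying documents remain untouched:

```python
# Toy illustration of stop-word filtering in a search engine.
# All names and data here are hypothetical.

STOP_WORDS = {"a", "an", "the"}

def tokenize(text, stop_words):
    # Lowercase, split on whitespace, and discard stop words.
    return [w for w in text.lower().split() if w not in stop_words]

def search(documents, query, stop_words=STOP_WORDS):
    # Return documents containing every non-stop term in the query.
    terms = tokenize(query, stop_words)
    if not terms:
        # The entire query was filtered out: the search "comes up empty."
        return []
    return [d for d in documents
            if all(t in tokenize(d, stop_words) for t in terms)]

docs = ["the abortion debate in public health",
        "an article on contraception"]

# Normal behavior: the term is indexed and searchable.
print(search(docs, "abortion"))

# Now treat "abortion" itself as a stop word, as POPLINE did;
# the same query silently returns nothing.
print(search(docs, "abortion", STOP_WORDS | {"abortion"}))
```

Note that nothing is deleted: the documents are still in the collection, but the query term is filtered out before matching ever happens, which is exactly why this form of suppression is so hard to distinguish from an ordinary empty result.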
Why the ban? Michael Klag, dean of the Johns Hopkins School of Public Health, which maintains the database, explains that there was concern that several articles of A, The Abortion Magazine, might contain content that would be considered abortion advocacy material. The US Agency for International Development cannot, under federal law, support abortion activities, and presumably the database's maintainers felt that USAID's support for the database effectively rendered some "abortion" searches a form of abortion support. (USAID denies that it asked POPLINE to take this particular action.)
Let's set aside the politics of this issue, the accuracy of this interpretation of federal law, and the potential finger-pointing. What chills me is watching 21st-century censorship play out - almost in a test run. As yesterday's post suggested, on one hand Google is making more and more personal information available via searches. The dark side is that one of our prime methods of maintaining privacy - the simple difficulty and cost of gathering loads of information about any one person - seems to be increasingly ineffective. But the flip side is that the more we depend on Google (and other free and pay engines) as definitive sources of information, the more vulnerable we are to strategic speech management through search term manipulation. In this case, the censorious act was detected - but largely because the manipulation was done at such a gross level that it was obvious. In the future, with a more savvy censor, we may not detect these forms of suppression at all. (I say in the future, though we really don't know whether such suppression is already happening - it surely must be occurring in China, for instance.) And as search engines subsume more and more content, it will be increasingly difficult to find materials through end-runs: alternative terms that might yield the same result, albeit with more "noise" - more unwanted search results.
There is no crisis at hand, but this should serve as a warning to civil libertarians that we won't discover censorship exclusively in places like library surveillance policies or TV curse-word regulations. Indeed, when it happens, we may not discover it at all.