Versity

In the wake of the Trump election, there has been a lot of hand-wringing and self-flagellation in tech communities about the so-called “filter bubble” created by social media. Was Trump elected by Facebook? Is this “our” Twitter revolution — only in the wrong direction? I wrote about this previously, invoking a strange coinage, versity, as an inversion and mutation of diversity:

I think there is work to be done on collaborative filtering in the context of ideology and identity. Surely this is a type of group interpellation. The technology of collaborative filtering, also called suggestive filtering and included in the growing field of intelligent agents, allows one to predict characteristics (particularly our so-called desires) based on survey data. Identity in this context is formulated on certain hegemonic (negotiated, but never actively negotiated) patterns. In this massive algorithmic collaboration the user is always suggested to be like someone else, who, in order for this to work, is already like the user. As Matt Silvia of Firefly describes: “a user’s ratings are compared to a database full of other member’s ratings. A search is done for the users that rated selections the same way as this user, and then the filter will use the other ratings of this group to build a profile of that person’s tastes.” This type of suggestive identification, requiring a critical mass of identity data, crosses vast distances of information to versify (to make similar) objects.

Firefly was one of the very first companies to deploy collaborative filtering technologies. They were bought by Microsoft (and more or less shelved as far as I can remember) — and the notion of “web 2.0” wouldn’t become a viable category until a few years later. But even in these early days it was clear that algorithms for filtering large databases of users were fundamentally oriented around logics of grouping, clustering, similarity, identity, unity-through-diversity… or, we might say, “versity.” They’re called clustering algorithms, after all.
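To make that concrete, here is a minimal sketch of the kind of user-based filtering Silvia describes: compare one user’s ratings against everyone else’s, weight the neighbors by similarity, and let their tastes stand in for yours. The data, names, and weighting scheme are my own toy illustration, not Firefly’s actual system.

```python
from math import sqrt

# Toy ratings database; purely illustrative, not any company's real data.
ratings = {
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 5, "B": 3, "D": 2},
    "carol": {"A": 1, "C": 5, "D": 4},
}

def similarity(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in shared)
    norm_u = sqrt(sum(ratings[u][i] ** 2 for i in shared))
    norm_v = sqrt(sum(ratings[v][i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Score the items a user has not rated, weighted by how similar the
    other raters are -- "you are like someone who is already like you,"
    turned into arithmetic."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + w * r
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(recommend("alice"))  # alice is nudged toward what her nearest raters liked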

The paragraph quoted above was written in 1998. And it’s still applicable today. I’m not saying I predicted the rise of Trump — ha! But many of us in digital media studies have been talking about this for a long time, even if some journalists have only “discovered” the concept of the filter bubble in the last week. Let’s be clear: these are technologies for the management of diversity, not the facilitation of real, lived difference. Versity doesn’t mean the elimination of difference — social differentiation, identity difference, digitality as difference, etc. — on the contrary, versity uses difference as its real substrate in order to generate systems of organization. In his amazing book The Laws of Cool, Alan Liu described this phenomenon as a “monoculture of diversity” (58).
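As a toy illustration of that last point (and nothing more), here is what a generic clustering pass looks like: the only input is the pairwise difference between users’ feature vectors, and the output is a set of groups. The data and every name here are mine; no actual platform’s pipeline is implied.

```python
import random

def dist2(p, q):
    """Squared Euclidean distance: the measure of 'difference' used here."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def mean(cluster):
    """Centroid of a group of points."""
    n = len(cluster)
    return tuple(sum(coord) / n for coord in zip(*cluster))

def kmeans(points, k, iters=20, seed=0):
    """Group points purely on the basis of their mutual differences."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # each point is assigned to whichever group it differs from least
            nearest = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[nearest].append(p)
        # the resulting groups then redefine their own centers
        centroids = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
    return clusters

users = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9), (0.5, 0.4)]
print(kmeans(users, k=2))  # difference in, groups out
```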

In the past I’ve also connected this effect to the so-called Robustness Principle, defined by Jon Postel in the original specification for IP (Internet Protocol) and in other documents. Here are two versions of the Robustness Principle, with slightly different wording:

In general, an implementation should be conservative in its sending behavior, and liberal in its receiving behavior. (from RFC 760 [1980])

Be conservative in what you do, be liberal in what you accept from others. (from RFC 761 [1980])

Terms like “conservative” and “liberal” don’t often appear in the Internet protocols. So your eyes really light up when you encounter them amid the heap of dry, technical writing. Postel was articulating, if not a political theory proper, then certainly a very particular philosophy of organization. And it’s exactly the same principle that I mentioned at the outset: assume a heterogeneous context, but propagate that reality in a slightly less heterogeneous way. In other words, the world is “liberal,” but our infrastructure is “conservative.” In The Exploit, Eugene Thacker and I expressed this phenomenon in unambiguous language: “communications protocols are technologies of conservative absorption. They are algorithms for translating the liberal into the conservative” (131). (And yes, yes, some of you will point out that the protocol designers were just trying to be practical, just trying to write “code that works.” I fully acknowledge that fact. Still, I’m addressing an entirely different aspect of these technologies.)
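For readers who want to see what “liberal in, conservative out” looks like at the level of code, here is a small hypothetical sketch: a routine that tolerates several spellings of a date on input but only ever emits one canonical spelling on output. The formats and function names are chosen by me for illustration; nothing here comes from the RFCs themselves.

```python
from datetime import datetime

ACCEPTED_FORMATS = [            # be liberal: tolerate several spellings
    "%Y-%m-%d",
    "%d/%m/%Y",
    "%B %d, %Y",
    "%Y-%m-%dT%H:%M:%S",
]

CANONICAL_FORMAT = "%Y-%m-%d"   # be conservative: only ever send this one

def parse_date(text: str) -> datetime:
    """Accept any of the tolerated input forms."""
    for fmt in ACCEPTED_FORMATS:
        try:
            return datetime.strptime(text.strip(), fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {text!r}")

def emit_date(dt: datetime) -> str:
    """Emit only the canonical form, never the variants we tolerate."""
    return dt.strftime(CANONICAL_FORMAT)

for raw in ["2016-11-09", "09/11/2016", "November 9, 2016"]:
    print(emit_date(parse_date(raw)))   # every variant flattens to 2016-11-09
```

The asymmetry is the whole point of the sketch: heterogeneity is welcomed at the boundary and quietly erased before anything is passed along.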

These are some of the things to keep in mind when evaluating social media, Google’s PageRank, even the kinds of clustering algorithms used in digital humanities, rather than lapsing back into platitudes about how digital technologies supposedly corrode power and increase freedom. As we have seen in recent days, the story is much more complex than that.

 

