I think most of the thinking around The Filter Bubble comes from people who are not very procedurally literate to begin with. That is to say, they are not very adept at understanding the rules that govern interactive systems, nor well equipped to reconfigure those systems to suit their ends. I touch on this because the same tired argument was parroted in this Zeit interview with Miriam Meckel, a leading German communication scientist. It starts off with some very sensible sentiments, but then it quickly derails on the topic of algorithms and concludes with several digressions.
There is a clear need for caution when it comes to algorithms, as has also been expressed by algoworld expert Kevin Slavin in his TED talk ‘How algorithms shape our world’, but there is no need for the undue fear being mongered by Eli Pariser and his pack. Meckel says the following (as also remarked by Basti Hirsch):
There would no longer be any critical discourse, and with that our system would fall apart. Information is the glue that holds our society together. In my book I take this idea to its extreme: through the perfection of algorithms, humanity abolishes itself.
With some algorithm-driven advertising offerings, on the other hand, you would not even get to see this article in the first place.
While deploring the extremism prevalent in German discourse on the topic of the internet, she herself takes an extremist and poorly nuanced position. The Filter Bubble argument that is currently in vogue (see this treatment by Alexander) is mostly hollow, and it builds its understanding on the back of fear. I work for the internet and I am sick of hearing this nonsense time and time again.
The Filter Bubble contrasts the previously filtered situation of editorially curated mainstream media with the newly filtered situation of personalized online content, and plays off of people’s fears. There are two main differences in the new situation.
The first difference is that the filters personalize content spheres for each person. I don’t think this is all that problematic. Having trained machine learning algorithms myself, I have seen how coarse they turn out no matter how much training they get; training itself is somewhere between a dark art and an attempt to hit a subjective target. Algorithmic filters resemble fractal surfaces more than they do smooth bubbles, and personalization will never provide a perfectly sealed-off environment. As soon as you get into the technical details, the whole argument quickly falls apart.
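To make that coarseness concrete, here is a toy sketch (all data, topic vectors and item names are invented for illustration, not taken from any real system) of a topic-vector personalization filter. Even for a user heavily weighted toward one topic, cross-topic items score high enough to surface, so the resulting "bubble" is leaky by construction:

```python
import math

def cosine(a, b):
    """Cosine similarity between two topic vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical user profile over three topics: (politics, sports, tech).
# This user is strongly interested in politics, barely in sports.
user = (0.9, 0.1, 0.3)

# Hypothetical articles described in the same topic space.
articles = {
    "election-analysis": (1.0, 0.0, 0.1),
    "match-report":      (0.0, 1.0, 0.0),
    "gadget-review":     (0.1, 0.0, 1.0),
    "sports-politics":   (0.6, 0.7, 0.0),  # a cross-topic item
}

# Rank all articles by similarity to the user profile.
ranked = sorted(articles, key=lambda a: cosine(user, articles[a]), reverse=True)
top3 = ranked[:3]

# The top of the feed is on-profile, but the cross-topic and the
# off-topic tech item still make the cut: the filter is coarse,
# not a sealed bubble.
print(top3)
```

The point of the sketch is not the particular metric but the shape of the scores: a similarity ranking produces a gradient, not a wall, so out-of-profile content keeps leaking in near the threshold.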
The second difference is that filters are now being applied by algorithms instead of editors. Both are enigmatic creatures, but judging from the cold reception algorithms get, it seems that the traditional humanities are better equipped to deal with human entities than with the algorithmic variety. There is nothing new under the sun. Large-scale social segregation and its associated detrimental effects also happened with traditional media, with people locking themselves into their own newspaper or radio station. One of the most visibly polarized societies right now is the USA, where the ‘debate’ between the right and the left is raging on talk radio, 24-hour news networks and, yes, also online. If anything, the filters may help by making the groups of like-minded people too small and too busy to be harmful to society.
My second problem is that while she complains about the lack of technical literacy in the general populace, her discipline and her research do not come across as very technically literate. She says:
Our country tends to be rather hostile toward technology.
The interviewer then adds that she draws from literary and philosophical sources. Those are interesting but hardly enough to thoroughly treat a subject. Serious discussion of information technology should draw from philosophy, but it should also bring a literacy of the field itself: knowledge of its technical workings and affordances, of the design practices inherent in the creation of technical artifacts, and of the procedurality and interaction that are so key to them.
So yes, I very much agree that we need to instill a large-scale procedural, data and media literacy in people, and we may well need to start with the humanities. That may be the only way to fix their relevance problems when it comes to digital things (see also Ian Bogost’s two-part essay ‘Beyond the Elbow-patched Playground’ on that).
So with those skills in hand, we could discuss the filter bubble drawing from applied research. One finding I would like to see is a technical assessment of the feasibility of trapping people in filter bubbles, with measurements of the amount of information isolation that can actually be achieved. Another would be to study real-life internet users and see whether they in fact shut themselves off more from other influences, and how far this affects their world views. Only with a praxis firmly based in reality can we talk about this subject in a way that is not gratuitous.
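The first of those measurements could start from something as simple as a feed-overlap statistic. A hypothetical sketch (the users, sources and the metric are all invented for illustration): score each user by how little their set of consumed sources overlaps with everyone else's, giving an isolation value between 0 (identical feeds) and 1 (nothing in common):

```python
def jaccard(a, b):
    """Jaccard overlap between two sets of sources."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Invented example data: each user's set of consumed sources.
feeds = {
    "alice": {"nyt", "guardian", "hn"},
    "bob":   {"fox", "talkradio", "hn"},
    "carol": {"nyt", "fox", "reddit"},
}

def isolation(user):
    """1.0 = user shares no sources with anyone else; 0.0 = identical feeds."""
    others = set().union(*(f for u, f in feeds.items() if u != user))
    return 1.0 - jaccard(feeds[user], others)

for u in feeds:
    print(u, round(isolation(u), 3))
```

A real study would of course need actual browsing data and a more careful metric, but even a crude number like this would turn "people are trapped in bubbles" from an assertion into a measurable, falsifiable claim.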
Update: This review of The Filter Bubble by Olga Goriunova in Computational Culture mostly vindicates my argument, and I agree that we need more writing, not less, to bridge the literacy gap that lies ahead of us.