
Back in the nineties, when the term “cyberspace” first entered the popular imagination, there was a utopian fervor about this new frontier where individuals could free themselves from the trappings of corporate and government hegemony. Cyberlibertarians taunted the corporations with idealistic manifestos in which encryption keys would protect us all in a disembodied world. “Your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter, and there is no matter here,” claimed EFF founder John Perry Barlow in his Declaration of the Independence of Cyberspace [which was part of the reading I suggested for this week, but put in the wrong category for last week. I think my mind is getting a little mushy this semester.] What these (mostly white, male, and middle-class) overconfident pioneers missed was that this was not a new world. It was a distorted reflection of the old world, a world where, in the words of bell hooks via Christina Boyles’s article we read this week, “imperialist white supremacist capitalist heteropatriarchy” not only dominates but keeps itself in power by controlling others through cultural hegemony and surveillance.

The internet is not a disembodied experience. We exist online, we have lost control over that existence, we are recreated online by forces outside our control, and the ramifications of this are real. Furthermore, the inequalities inherent in a neoliberal marketplace, where profit by any means trumps the collective good, were bound to dominate the space where most of us not only communicate but get our information.

As our readings this week make clear, something seemingly as innocuous as an internet search or database query is not objective but relies on algorithms that subsist on subjective diets. As Safiya Umoja Noble shows in her work on the racial and gender bias of Google’s search engine [ironically, the official link for her book was harder to track down than the preview on Google Books], the results are not simply a reflection of society, as Google and others argue, but a reflection of online activity distorted by neoliberalism. There is no incentive for Google to fix algorithms where pornography is the first thing that comes up when one searches “black girls.” Of course, Google remedied this once it affected their reputation, i.e., the bottom line. Unfortunately, a monopoly like Google will always be reactive [Google may have “fixed” the “black girls” problem, but if you search “Asian girls” right now, one of the first-page hits is the Wikipedia entry for “Asian fetish”]. Yet it is not simply a problem of economics; as I explored way back in my second blog post, it is also a reflection of an industry dominated by ideas of whiteness and masculinity. Nor is this only a problem of popular digital information, as Sharon Block shows in her critical exploration of the JSTOR Thesaurus topic algorithm.
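
To make that feedback loop concrete, here is a deliberately toy sketch in Python. It is emphatically not Google’s actual ranking system, and the titles and numbers are invented; it only illustrates what happens when results are ordered by past engagement: whatever the majority already clicks gets pushed up, and ranking higher earns it still more clicks.

```python
# Toy, hypothetical ranking sketch (not Google's algorithm; all values
# are invented) showing how engagement-weighted scoring lets past click
# behavior swamp topical relevance.

# (title, topical_relevance, past_clicks) for a hypothetical query
results = [
    ("Scholarly article on Black girls' education", 0.9, 1_200),
    ("Pornographic site", 0.4, 95_000),
]

def rank(results):
    # Score each result by relevance weighted by prior engagement.
    # Majority click habits dominate, whatever their origin.
    return sorted(results, key=lambda r: r[1] * r[2], reverse=True)

for title, _, _ in rank(results):
    print(title)
# The exploitative result ranks first, and ranking first then earns it
# even more clicks, reinforcing the bias on the next query.
```

Nothing in that loop requires a malicious engineer; the bias rides in on the engagement signal itself, which is exactly why a reactive, reputation-driven fix never reaches the root.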

Block explores the seemingly strange phenomenon of women’s history articles being categorized not under the topic of “women” but “men.” [Incidentally, this is an example of why I never use the topic searches in databases and rely only on full-text searches.] JSTOR’s taxonomists chalked this up to the word “women” being “noise” that the algorithm could not understand. It is no surprise they would offer such an excuse, given the “black box” nature of algorithms, whose creators usually cannot explain their results. The band-aid applied in reaction to complaints is simply to remove words so that the unruly monster recalibrates. But this does not fix the problem, which goes beyond artificial intelligence. The problem is the way these algorithms are constructed, which most of us do not understand; and even if we did, the algorithms tend to be proprietary. So what do we do?
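
A hypothetical sketch of the kind of band-aid Block describes makes the failure mode visible. This is not JSTOR’s actual system; the TOPIC_TERMS vocabulary, the label function, and the noise parameter are my own toy inventions, far simpler than any real indexer. The point is that declaring a word “noise” and deleting it flips labels rather than fixing the model:

```python
from collections import Counter

# Toy keyword-count topic labeler. Hypothetical and vastly simpler than
# any real indexing system; it exists only to show the failure mode.
TOPIC_TERMS = {
    "women": {"women", "woman", "female"},
    "men": {"men", "man", "male"},
}

def label(text, noise=frozenset()):
    counts = Counter(w.strip(".,;").lower() for w in text.split())
    for word in noise:
        counts.pop(word, None)  # the band-aid: discard "noisy" words entirely
    scores = {t: sum(counts[w] for w in terms) for t, terms in TOPIC_TERMS.items()}
    return max(scores, key=scores.get)

abstract = ("Women reformers, women teachers, and women workers "
            "petitioned the men and male officials in power.")

print(label(abstract))                   # "women" (3 matches vs. 2)
print(label(abstract, noise={"women"}))  # "men": removing the word flips the label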

Along with diversifying the field of programming, which Noble argues is hindered not by people not wanting to be programmers but by the industry not wanting diverse programmers, all three of the works we read this week focus on end-user input. There needs to be more engagement with real people. As Noble states, “There is a missing social and human context in some types of algorithmically driven decision making, and this matters for everyone engaging with these types of technologies in everyday life” (10). This involves educating users. In the case of surveillance (both digital and analog) that discriminates against Black and Indigenous people, Christina Boyles teaches undergraduates to use “speculative thinking” to go beyond academic frameworks like Foucault’s concept of panopticism, in which hegemonic structures create a culture of self-surveillance applied equally across society. We know these surveillance systems single out groups; they are not egalitarian in how they recognize and retain our identities.

This is difficult for people like me, whom the algorithms consider “normal,” to recognize. Our newly imagined future can only be attained if recognition of the inequity is broad. The electronic frontier is a site of settler colonialism where corporations reap the benefits through many unwittingly compliant actors. While, as Noble argues, the ideal would be a shift away from reliance on commercial search engines, our influence can force companies to change their algorithms. We cannot do this as cyber-activists alone, though. According to Noble, “We will not sort out social inequality lying in bed staring at smartphones.”
