Beyond the Reveal: A Metaphor’s Effect


In my last post, I described how the black box emerges historically with the extrication of (at least some) laborers from the machines of industrial labor. The cost of this move is that the laborer, now outside the machine as an operator, must herself operate as a black box. The interface between laborer and machine becomes central to this new relationship, especially as managers and technologists focus on constantly reconfiguring the interactions between and among human-machine pairs.

In recounting this history of a metaphor, I aim toward a critique of how black box metaphors are used today to describe opaque technological processes. And I don’t mean to suggest that any use of a black box metaphor inadvertently invokes a whole history of labor and interface. But I do think we can surmise from this history a dominant narrative that draws heavily from the black box metaphor:

  1. As an “infrastructural inversion,” the black box metaphor creates the possibility, for some, of imagining themselves as outside a system that formerly may not have been visible at all.
  2. Where and when this happens, interfaces emerge and gain prominence as a point of mediation with the formerly invisible system.
  3. Design for interaction between the user and the “black boxed” process tends to imagine the human mind as another form of black box, emphasizing cognitive over manual processes.
  4. The new system composed of this user and her machine then starts the process anew—the user/worker has been incorporated into a new system that she may not actually see unless she names a new “black box.”
  5. This narrative will also depend on the exclusion of some who need to “stay behind” and keep the system going within the “old” forms of labor.

To describe a process as a black box thus potentially sets in motion a whole series of implications for sensation, knowledge, labor, and social organization.

Let’s look at this, for example, in light of new attention brought to the role of algorithms in Facebook use (an effort in which I have been involved as a scholar). How does describing the Facebook algorithm as a black box set us on a particular narrative of analysis and research?

Let’s imagine a Facebook user who is not yet aware of the algorithm at work in her social media platform. The process by which her content appears in others’ feeds, or by which others’ material appears in her own, is opaque to her. Approaching that process as a black box might well situate our naive user as akin to the Taylorist laborer of the pre-computer, pre-war era. Prior to awareness, she blindly accepts input and provides output in the manufacture of Facebook’s product. Upon learning of the algorithm, she experiences the platform’s process as newly mediated. Like the post-war user, she now imagines herself outside the system, or strives to be so. She tweaks settings, probes to see what she has missed, alters activity to test effectiveness. She grasps at a newfound potential to stand outside this system, to command it. We have a tendency to declare this a discovery of agency—a revelation even.

But maybe this grasp toward agency is also the beginning of a new system. The black box metaphor suggests that such providers will also need to design for the user who tweaks. (It may even be that designing for the tweaker is more profitable than designing a “perfect feed.”) As in previous ergonomic problems, this process will begin to imagine and construct a particular kind of mind, a particular kind of body, a particular kind of user. Tweaking to account for black-boxed algorithmic processes could become a new form of labor, one that some would inevitably come to describe as its own black box, and one to escape.

Maybe, by structuring our engagement with the experience of Facebook’s opaque processes through the black box metaphor, we’ve set ourselves up to construct a new black box, and ignored the ways in which our relations to others, within and without the present system, have been changed by our newfound awareness.

I’m struck here, for example, by how well the narrative of the black box I’ve described fits a number of stories we’ve lived and heard regarding privacy and networked media. Whether it’s the Snowden revelations or Facebook’s unauthorized emotion study, the story often plays out the same way for many of us. We realize or remember anew just how much work we’re providing some entity within a current system, and then proceed either to alter our use patterns or to abstain altogether from that system in order to remain outside that work. Debates ensue over who is complicit and who is not, and with the exception of those working in a more organized fashion to pursue prosecution or new legislation, most of us are stuck in an “opt-in or opt-out” scenario that never goes anywhere.

It’s likely only a matter of time before the market for more subtle responses than “opt-in or opt-out” is met with a new set of black box systems. One can imagine, for example, a range of services: free email if you submit to full surveillance and data-trawling, modestly priced email if you submit your data for use via an anonymizer, or premium email at high cost that removes you from all data-harvesting.

Perhaps, even as we remain justifiably critical of the unseen and unknown software processes that govern and regulate a growing number of shared spaces and subjectivities, we might search for another way to live with these processes than hitting the escape button and entering a higher-level routine. More on that in my next posts.

Kevin Hamilton is an artist and researcher at the University of Illinois, Urbana-Champaign, where as an Associate Professor he holds appointments in several academic units across theory, history, and practice of digital media. He is currently at work with Infernal Machine contributor Ned O’Gorman on a history of film in America’s nuclear weapons programs; other recent work includes a collaboration with colleagues at Illinois’ Center for People and Infrastructures on the ethics of algorithms in internet and social media platforms.
