Category Archives: Beyond the Reveal/Black Box

Beyond the Reveal: Opacity in Personal Chrono-tech

 

From Apple II watch instructions on Instructables.com URL: http://www.instructables.com/id/Apple-II-Watch/

Part Four: Opacity in Personal Chrono-tech

As a conclusion to this series on the limits of black box metaphors in critiques of obscured technological systems, I want to offer a brief example of an alternative approach. Earlier this year, I presented this material as a lecture. Since then, a new black box has entered the marketplace—Apple’s Watch. I have not yet interacted with Apple’s “most personal device,” but I expect (largely merited) critiques about how the Watch embeds Apple’s system ever deeper in the daily routines of users. With both fewer buttons and less screen real estate with which to interact, the inputs and outputs for this system will probably be more passive and less obtrusive, even as the background software and hardware processes grow more complex. What new routines and rhythms of attention will the Watch afford, and on what algorithmic processes of surveillance, marketing, or communication will this attention depend?

We will need new audits. We will need to know, as with the iPhone, what information this new device is storing and sharing, and with whom. The Watch’s role in collecting medical data should give us particular pause in this regard. But when considering constraints on agency and freedom, we shouldn’t limit our analysis to revealing the processes at work “inside” this device. The processes by which we live with such devices deserve as much attention as the routines at work in the operating system. And we can learn a great deal about this device’s role in our lives without ever peering inside the system.

As a prompt in this direction, I’ll offer a brief tour of objects that, like the Watch, “want” to be a part of our everyday rhythms of attention, yet make “seamful” rather than seamless opacity a foregrounded aspect of our interaction with them.

 

Vague Clock by Sejoon Kim URL: http://sejoonkim.com/design/vagueclock.html

Take, for example, Sejoon Kim’s Vague Clock. In contrast to Apple’s Watch, it offers the time not “on demand” (with the raise of an arm), but “on exploration” (with the caress of a hand). The clock’s nearly opaque fabric makes reading the time at a glance all but impossible. Instead, the laborer at her desk must get up and not only tap the clock face but explore it, changing a two-dimensional plane into a three-dimensional form.

 

Risk Watch by Anthony Dunne and Fiona Raby URL: http://www.dunneandraby.co.uk/content/projects/75/0

The speculative designs of Anthony Dunne and Fiona Raby are also instructive here. Their 2007/08 series of objects entitled DO YOU WANT TO REPLACE THE EXISTING NORMAL? includes The Risk Watch, a watch whose opaque face carries a small nipple in place of any visible marks of temporal passage. When held to an ear, the nipple activates a small device that speaks a number that “corresponds to the political stability of the country you are in at that time.” Dunne and Raby state about this body of work that “if our desires remain unimaginative and practical, then that is what design will be.” The Risk Watch gives us what we want—a sort of single-app Apple Watch—in a way that invites us to examine both the desires we bring to personal tech and the processes we trust to grant them.

The NoPhone URL: http://nophone.myshopify.com/

Dunne and Raby’s approach to opacity might also call to mind the NoPhone, a project launched last year via Kickstarter that reached some unexpected, if modest, financial success. The NoPhone, billed as “a technology-free alternative to constant hand-to-phone contact,” is simply a brick of black plastic molded in the size and shape of an iPhone. In use as a replacement for one’s phone, the device aspires to deliver a different sort of “reveal,” catching the user in the act of relentless phone-checking. Like Ben Grosser’s Facebook Demetricator, the NoPhone calls to mind counter-addiction regimes, but does so with some humor, and a desire to cast human habits into the spotlight.

The Durr watch, by Skrekkøgle URL: http://skreksto.re/products/durr

Another provocative neighbor to Apple’s Watch is the Durr, a product of the Norwegian studio Skrekkøgle. As with the NoPhone, the Durr’s designers create personal technologies that use opacity to reveal something about the user’s daily activities. In this case, however, the object also introduces a modest new machinic process into the picture. Like the NoPhone or the Vague Clock, the Durr presents a wholly opaque face where a screen or dial might normally reside. Inside the object, though, resides a small vibrating motor that pulses at five-minute intervals.
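The machinic process at issue here is small enough to sketch in a few lines. What follows is only an illustration of the Durr’s behavior as described above, not Skrekkøgle’s actual firmware; the pulse_motor function is a hypothetical stand-in for whatever call drives the vibration motor.

    import time

    INTERVAL_SECONDS = 5 * 60  # the device's single "setting": one pulse every five minutes

    def pulse_motor(duration=0.5):
        """Hypothetical stand-in for the hardware call that spins the vibration motor."""
        print("bzzt")  # on the real object a motor turns; here, a placeholder

    def run():
        # The whole device: wait five minutes, buzz, repeat. No display, no dial,
        # no readout; everything else happens in the wearer's attention.
        while True:
            time.sleep(INTERVAL_SECONDS)
            pulse_motor()

    if __name__ == "__main__":
        run()

The point of the sketch is how little there is to the process: the only output is the interruption itself.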

For a few months now, I’ve been replacing my usual watch with a Durr for a day or two each week, with enlivening effects. The Durr reveals not only my habits of watch-checking but also the relative speed at which time passes in relation to the intensity and direction of my attention. Checking email, I can’t believe how quickly the intervals pass. Traveling across town on foot, the durations seem broad and wide. Five minutes is, in many cases, just long enough to forget the thing, and just too long to be counted by the human attention clock. Its opacity depends as much on me as on the device itself. As such, wearing the Durr casts my other machinic attention regimes into new light and invites me to reorient my body accordingly.

I could go on to mention a dozen different life-management and attention-management tools, simple things like www.donothingfor2minutes.com, or “productivity” apps such as Freedom, which disables a device’s internet for set periods of time. Where such efforts serve behavior-modification regimes, they should surely be set in the historical context of disciplinary, labor, or even religious regimes.

Set next to the growing number of algorithm auditing efforts, however, such attention-modification works serve a different function. They show how, in the quest to understand the influence of machinic processes on human agency, there is much to be learned without ever “unboxing” the technologies at hand. As we move forward with the vital work of monitoring and interpreting the multitude of new processes at work behind our technologies of attention, we should take great care not to stop our efforts at the algorithmic reveal. We should insist on the co-presence of at least two other bodies of work in the growing intellectual spaces devoted to the critique of algorithms: that of critical race, gender, and labor studies, which reveals the differently structured life on which the new algorithms depend, and that of design, art, and play, which casts human action and desire toward interface in new light.

Kevin Hamilton is an artist and researcher at the University of Illinois, Urbana-Champaign, where as an Associate Professor he holds appointments in several academic units across theory, history, and practice of digital media. He is currently at work with Infernal Machine contributor Ned O’Gorman on a history of film in America’s nuclear weapons programs; other recent work includes a collaboration with colleagues at Illinois’ Center for People and Infrastructures on the ethics of algorithms in internet and social media platforms.


Beyond the Reveal: Toward Other Hermeneutics


Part Three: Toward Other Hermeneutics

I want to make clear here that I believe we need to keep pushing for new research—new policies and practices that help ensure just algorithmic processes at work inside our infrastructures. (See posts one and two of “Beyond the Reveal.”) If our search engines, pricing structures, law enforcement or trade practices depend on or enact unlawful, unethical, or unjust algorithmic processes, we need to have ways of stopping them. We need accountability for these processes, and in some cases that will also mean we need transparency.

But, as urban studies scholar Dietmar Offenhuber points out in Accountability Technologies, accountability isn’t inextricably linked to transparency. In fact, some forms of revelation about opaque processes may do more harm than good to the public. If we make information access a priority over “answerability and enforcement” when it comes to just algorithmic infrastructures, Offenhuber warns, we may not achieve our goals.

So there may be times when “opening the box” might not be the best path to dealing with the possibility of unjust systems. And it is almost certainly the case that our black box metaphors aren’t helping us much in research or advocacy when it comes to charting alternatives.

In my own collaborative work on a Facebook user study, my co-authors and I focused primarily on a question directed to users: “Did you know there’s a black box here, and what do you think it’s doing?” The results of this study have set us on a path to at least learning more about how people make sense of these experiences. But in some ways, our work stands to get stuck on the “reveal,” the first encounter with the existence of a black box. Such reveals are appealing for scholars, artists, and activists—we sometimes like nothing better than to pull back a curtain. But because of our collective habit of establishing new systems to extricate ourselves from old ones, that reveal can set us on a path away from deliberative and deliberate shared social spaces that support our fullest goals for human flourishing.

I confess that at this point, I bring more cautions about black box hermeneutics than I bring alternatives. I’ll conclude this post by at least pointing to a path forward and demonstrating one possible angle of approach.

My critique of black box metaphors so far leads me to the following questions about our work with technologies:

  1. How else might we deal with the unknown, the obscured or opaque besides “revealing” it?
  2. Do we have to think of ourselves as outside a system in order to find agency in relation to that system?
  3. Can interface serve to facilitate an experience that is more than cognitive, and a consciousness not ordered by the computational?

As Beth Nowviskie pointed out in a response to this post in its lecture form, we already have at least one rich set of practices for addressing these questions: interpretive archival research. Are not the processes by which a corpus of documents comes to exist in an archive as opaque as any internet search ranking algorithm? Isn’t part of the scholar’s job to account for that process as she interprets the texts, establishing their meaning in light of their corporeal life? And aren’t multiple sensoria at work in such a process, only some of which are anticipated by the systems of storage and retrieval at hand? Epistles, chapbooks, encyclopedias, and libraries were “paper machines,” technologies in their own right, and the histories of how scholars and readers built their lives around them certainly have much to offer our struggles to live with unknown algorithms.

We might also, however, look to the realms of art, design, and play for some productive alternatives. Take, for example, the latest black box to take techno-consumption by storm—Apple’s Watch. This object is almost certainly headed toward integration into users’ lives as a facilitator of new daily routines and systems, especially among the quantified-self set. Other writers on this blog have already helpfully set the new box in the context of its precedents in meditative practices and contemporary tech labor. But as we work to understand how the new systems involve us in new, opaque processes, a glance at some more intentionally opaque neighbors might be of help. In my next post, I’ll set a few recent objects and experiences next to the Watch and compare how they invite distinct incorporation into the rhythms of daily attention, thought, and action.


Beyond the Reveal: A Metaphor’s Effect


In my last post, I described how the black box emerges historically with the extrication of (at least some) laborers from the machines of industrial labor. The cost of this move is that the laborer, now outside the machine as an operator, must herself operate as a black box. The interface between the laborer and machine becomes central to this new relationship, especially as managers and technologists focus on continually reconfiguring the interactions between and among human-machine pairs.

In recounting this history of a metaphor, I aim toward a critique of how black box metaphors are used today to describe opaque technological processes. And I don’t mean to suggest that any use of a black box metaphor inadvertently invokes a whole history of labor and interface. But I do think we can surmise from this history a dominant narrative that draws heavily from the black box metaphor:

  1. As an “infrastructural inversion,” the black box metaphor creates the possibility, for some, of imagining themselves as outside a system that formerly may not have been visible at all.
  2. Where and when this happens, interfaces emerge and gain prominence as a point of mediation with the formerly invisible system.
  3. Design for interaction between the user and the “black boxed” process tends to imagine the human mind as another form of black box, emphasizing cognitive over manual processes.
  4. The new system comprising this user and her machine then starts the process anew—the user/worker has been incorporated into a new system that she may not actually see until she names a new “black box.”
  5. This narrative will also depend on the exclusion of some who need to “stay behind” and keep the system going within the “old” forms of labor.

To describe a process as a black box thus potentially sets in motion a whole series of implications for sensation, knowledge, labor, and social organization.

Let’s look at this, for example, in light of new attention brought to the role of algorithms in Facebook use (an effort in which I have been involved as a scholar). How does describing the Facebook algorithm as a black box set us on a particular narrative of analysis and research?

Let’s imagine a Facebook user who is not yet aware of the algorithm at work in her social media platform. The process by which her content appears in others’ feeds, or by which others’ material appears in her own, is opaque to her. Approaching that process as a black box might well situate our naive user as akin to the Taylorist laborer of the pre-computer, pre-war era. Prior to awareness, she blindly accepts input and provides output in the manufacture of Facebook’s product. Upon learning of the algorithm, she experiences the platform’s process as newly mediated. Like the post-war user, she now imagines herself outside the system, or strives to be so. She tweaks settings, probes to see what she has missed, alters activity to test effectiveness. She grasps at a newly found potential to stand outside this system, to command it. We have a tendency to declare this a discovery of agency—a revelation even.

But maybe this grasp toward agency is also the beginning of a new system. The black box metaphor suggests that platform providers will also need to design for the user who tweaks. (It may even be that designing for the tweaker is more profitable than designing a “perfect feed.”) As in previous ergonomic problems, this process will begin to imagine and construct a particular kind of mind, a particular kind of body, a particular kind of user. Tweaking to account for black-boxed algorithmic processes could become a new form of labor, one that might then inevitably find description by some as its own black box, and one to escape.

Maybe, by structuring our engagement with the experience of Facebook’s opaque processes through the black box metaphor, we’ve set ourselves up to construct a new black box, and ignored the ways in which our relations to others, within and without the present system, have been changed by our newfound awareness.

I’m struck here, for example, by how well the narrative of the black box I’ve described here fits a number of stories we’ve lived and heard regarding privacy and networked media. Whether it’s the Snowden revelations or Facebook’s unauthorized emotion study, the story often plays out the same way for many of us. We realize or remember anew just how much work we’re providing some entity within a current system, and then proceed to either alter our use patterns or abstain altogether from that system in order to remain outside that work. Debates ensue over who is complicit and who is not, and with the exception of those working in a more organized fashion to enact prosecution or new laws, most of us are stuck in an “opt-in or opt-out” scenario that never goes anywhere.

It’s likely only a matter of time before the market for more subtle responses than “opt-in or opt-out” is met with a new set of black box systems. One can imagine, for example, a range of services: free email if you submit to full surveillance and data-trolling, modestly-priced email if you submit your data for use via an anonymizer, or premium email at high costs that removes you from all data-harvesting.

Perhaps, even as we remain justifiably critical of the unseen and unknown software processes that govern and regulate a growing number of shared spaces and subjectivities, we might search for another way to live with these processes than hitting the escape button and entering a higher-level routine. More on that in my next posts.


Beyond the Reveal: Living with Black Boxes


Part One: Histories

Amidst growing attention and calls to action on the role of algorithms in our everyday lives, one idea recurs: “opening the black box.” In such analyses, the “black box” describes a process that happens in secret, for which we know only the inputs and outputs, but not the steps that take place in between. How might this metaphor be structuring our approach to thinking about algorithms and their place in our lives, long before we get to the work of accounting for the social and political work of algorithmic systems?
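In programmers’ terms, a black-boxed process looks like a function we can call but cannot read: its signature is public, its body is not. The sketch below is meant only to make that epistemic position concrete; rank_feed and its inputs are invented for the example.

    def rank_feed(user_profile: dict, candidate_posts: list) -> list:
        """A black box from the outside: the inputs and outputs are observable,
        but the steps in between are not. The body is deliberately withheld here."""
        ...

    # All an outside observer can do is vary inputs and watch outputs,
    # inferring rather than inspecting what happens in between:
    #     feed = rank_feed(my_profile, todays_posts)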

In this first of four posts, I’ll begin an answer to this question by looking at the history of the “black box” as a way of modeling cognitive or computational processes. In the second post, I’ll offer some cautionary words about reliance on this metaphor in the important work of ensuring just systems. Finally, in the last two posts I’ll look to some alternatives to black-box-opening in our relationships to opaque technological processes.

The black box metaphor began to acquire its shape during changes in labor that took place after World War II. Whereas managers before the war had largely treated work as a series of learned behaviors, the designers of work and work environments after the war began to think less about suiting the laborer to the work, and more about suiting the work to the laborer.

More than a mere Taylorist repeater of actions, the new ideal worker of post-war Human Factors research not only acts but perceives, acting according to learned evaluative routines that correlate sensation to action. The ideal post-war laborer is not a person of a particular physical build, conditioned to perform particular motions, but rather a universalized collection of possible movements, curated and selected according to mathematical principles. Human Factors research turned the human laborer into a control for a system, a proper medium for the transfer and transformation of input.

Key to this new approach was the influence of information theory on approaches to both computing and psychology. In computing, the understanding of signals as information paved the way for a mathematics of binary code, in which the course of electrons through physical gates and switches could translate into algorithms and mathematical functions. In psychology, those who had grown weary of behaviorism’s stimulus-response approaches to explaining and modifying human action saw in Claude Shannon’s approach echoes of the structure of the human brain. These early cognitive scientists saw in thought a kind of algorithm performing consistent functions on ever-changing sense data, zipping through the brain’s neural pathways the way electrons travel through the copper of a computer’s circuits.

And so a new understanding of the operator’s actions emerged alongside a new understanding of a computer’s routines. The first software emerged at the same time that psychologists began to analyze human thought and memory as a collection of mathematical functions performed on sense data. In other words, the black box as we know it emerged as a pair of metaphors: one to describe the computational machine, and one to describe the human mind.

Before these developments, systems of manufacture and control were designed to include the human body as a “control” in the operational sense. The control in any function is a limiter, providing brackets to the acceptable inputs and possible outputs. If a laborer slows down his or her work, the entire process slows. In the new post-Taylorist work flow, in contrast, the control is performed by a computational process rather than an embodied human one. The new computers allowed for the programming of internal black boxes within the machine itself. Information from multiple sensors, as it coursed through these machines, would be analyzed and checked for deviation. The result produced from such analyses would set certain mechanical processes in motion in order to produce a desired end.
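The computational control described here can be stated very compactly. The following is a generic, minimal sketch of the idea rather than any historical system; the setpoint, tolerance, and the read_sensors and actuate hooks are all invented for illustration.

    from statistics import mean

    # Illustrative numbers only: the "brackets" the control places on the process.
    SETPOINT = 100.0
    TOLERANCE = 5.0

    def control_step(read_sensors, actuate):
        """One pass of the computational control: sense, check for deviation, correct.

        read_sensors and actuate are hypothetical hooks standing in for whatever
        instruments and mechanisms a real system would provide.
        """
        reading = mean(read_sensors())      # combine information from multiple sensors
        deviation = reading - SETPOINT
        if abs(deviation) > TOLERANCE:      # the reading has left the acceptable brackets
            actuate(-deviation)             # set a corrective mechanical process in motion

The worker described in the next paragraph no longer performs this check herself; she watches for the occasions when a routine like this one fails to hold the process within its brackets.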

Although the worker has been replaced by an algorithm as the system control, she or he is not missing from the scene entirely. Rather, the human operator now performs the function of a control for the control. The machine affords the human operator indications of the proper functioning of the software-based controller. Deviations from designated functions trigger new action from the human operator, according to more advanced algorithms than those required of previous industrial operators. This new human operator must synthesize multiple forms of data—visual, aural, even symbolic—and then decide on a proper course of action, of input to the machine, according to a trained set of decision-making criteria and standards.

Though operating at more of a distance from the phenomena of mechanical system function, this new, error-detecting human operator plays no less critical a role. His or her mental routines must be just as carefully scripted and trained as the Taylorist laborer’s physical actions, often via an emerging understanding of the brain as a computer.

The new operator is thus less a part of the system even as he or she is made more in the image of that system. Formerly one organ within a mechanical body, the operator is now modeled as a discrete body, tethered to a separate mechanical body and modeled after it, for the purposes of safe and consistent system flow. The machine and the operator mirror one another, with the interface as their crucial site of division, the glass of reflection and action.

These changes also effect sociality through the creation of a new entity to include all agents. This new entity—the organization—invites design at a complex level that accounts for multiple machinic and human actors. Where each machine used to come with an operator as controller, the two treated as a single entity, the post-war machine comes with an operator as agent, who is necessary to the proper functioning of the machine. But the human operator is separate from the machine. For large-scale projects, this doubling results in increased complexity, which the organization approaches as yet another information-processing problem.

The organization, this plurality of entities, is coincident with the emergence of the interface. Machines and operators without true interfaces—as in Taylorist scenarios—are not collective in that they are not social. They are merely aggregate. Thus some of the biggest moves in computing research toward the latter half of the twentieth century were those that simultaneously addressed the interface between one operator and her machine, and the structure of all machine-human pairs, organized together into one system—one black box process.
