Revealing Errors

(C) Copyright 2008 -- Benjamin Mako Hill
Distributed under the terms of the
Creative Commons Attribution Share-Alike License

Presentation from Penguicon
Based on a presentation from LUG Radio Live USA 2008

Introduction

Note

Slide: Faked Blue Screen of Death

"How embarrassing..." I have this

Note

Slide: Faked Linux Kernel Oops

"That's better."

Note

Title Slide

What I'm going to do is try to talk about free (and open source) software advocacy and activism. But in a strange way...

We're used to talking about why free and open source software is important by emphasizing all of the ways that it works:

I'm going to make a case for free software and open source advocacy that talks about the way that technology (all technology, not just free software) doesn't work.

First, overview of my goals for the talk:

The Case for Revealing Errors

First, I'll lay out the problem.

Note

Slide: Free Software Definition

And no, Free Software is not the problem.

The Problem

I trust I don't need to give you background on free software, but here is the abbreviated free software definition to make a point.

An enumerated set of freedoms:

The freedom to...

  • ... use software for any purpose.
  • ... study and modify.
  • ... distribute copies to others.
  • ... collaborate and share your changes.

What's important to note in the FSD is that, unlike the work of CC (for example) and unlike the work of the FSF for that matter, free software (as an idea) is highly user centric.

Developers have the same freedom, sure, but they also have responsibilities! To respect their users' freedoms.

But free software's dark (and very poorly kept secret) is that we've done a rotten job of communicating our message to the non-technologists -- the users -- we're trying to free and protect.

EFF, OSI, SPI, etc: The exact same situation. (Sometimes worse!)

In FOSS, developers have everything to lose: their livelihoods, comfortable business models, nice lifestyles, their power, etc.

Users have everything to gain and they are silent in the debate.

The free software and open source message (don't even get me started on the software itself) resonates with technologists and seems to stop there. The important question that we must answer is: Why?

Free Software and Power

I think the first problem is that we need to stop talking about free software as being about software. And we do this a lot. Almost exclusively.

In a certain sense, this is a critique of the whole "open source" message but it's much deeper and is actually equally critical of the FSF message as well.

Note

Slides: power, control, and autonomy

Instead, we need to think about software as being about POWER, CONTROL, and AUTONOMY (to borrow the last from Eben Moglen).

Example: Communications technologies and the who/what/where/how (we still own the "why").

The designers of technologies (software or otherwise) are as powerful as their technologies.

Their technologies are, quite explicitly, mitigating and mediating our lives.

...but we all know that. Because we design technologies and build and support technologies.

And, as people that think about these things...

We understand that technology should be free (or open) because we understand that there is a real dystopian alternative to freedom: a world in which people work and experience on others' terms. We understand that technology is powerful because we have that power in our hands.

This begins to explain why we have such a problem communicating the FSF, OSI, or EFF message to non-technical people:

They do not understand that (much less how or why) technology is powerful.

Invisible Technology

But in fact, as I've worked on this, I've concluded that the problem is even bigger:

The reason most people don't understand the power of technology is that they don't realize technology exists.

I think that last statement needs a little more defense.

Mark Weiser, head of the Computer Science Laboratory at Xerox PARC, whose work is seen as the birth of "Ubiquitous Computing," made a call for invisible computing:

A good tool is an invisible tool. By invisible, I mean that the tool does not intrude on your consciousness; you focus on the task, not the tool. Eyeglasses are a good tool -- you look at the world, not the eyeglasses. The blind man tapping the cane feels the street, not the cane. Of course, tools are not invisible in themselves, but as part of a context of use. With enough practice we can make many apparently difficult things disappear: my fingers know vi editing commands that my conscious mind has long forgotten. But good tools enhance invisibility.

Whether or not ubicomp succeeds, this is the reality of technology already. Most technologies are invisible to most people.

For example, people don't know that switches, communication systems, firewalls, etc., even exist.

And this is a huge problem for free software.

One can't make a call for a free operating system if one doesn't realize what an operating system is. Free firmware? What is firmware? Never heard of it.

One can't talk about the power that filtering software gives ISPs if people haven't imagined that their ISP makes real technological choices and holds real power over communication technologies.

So the most important project of the free software movement will be communicating its message to non-technical people. To do so, we need to reveal technology and its power.

But technology's invisibility is never perfect...

Note

Slide: Broken Eyeglasses

Broken glasses -- even a smudge -- and eyeglasses become very visible indeed!

Errors can be like the shark's dorsal fin. The tip of the iceberg. They are opportunities. And they are largely unexploited as a mechanism to reveal, discuss, and build on technologies.

What I am suggesting is a method of discussion, conversation, and evaluation.

I am talking about "revealing errors" in both of its senses: I am suggesting a process of showing off errors in order to reveal hidden technologies and to demonstrate the power of technology.

ATM and Windows Examples

Introducing the broken eyeglasses of bank operating systems...

Note

Slide: ATM BSOD

Note

Slide: ATM Error Dialog

"This copy of windows must be activated before you can log on. Do you want to activate windows now?"

Note

Slide: Windows Boot Screen

Who has seen one of these?

Seeing one of these introduces the OMG moment:

My ATM runs Windows? My bank runs Windows?

Followed up by:

I am trusting my money to Microsoft Windows?!

You don't have to be a Free Software fanboy or fangirl to be shocked. You might simply say:

I use Windows at home. It crashes. It has viruses. It loses my data. Do I trust my money to it?

And people do exactly this. Flickr is full of these photographs. Hundreds of them.

The error reveals the technology (MS Windows) and, in the process, encourages people to ask questions about their technology, its authors, and issues of control and power.

Note

Slide: New Hardware Dialog

Cute aside about the Hardware Dialog.

Examples

Errors Reveal Hidden Constraints

Note

Slide: T9 errors

Tegic's (now Nuance's) predictive text system allows for certain things but not others.
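The constraint is easy to see in miniature. Here is a sketch, in Python, of how a T9-style lookup works; the keypad mapping is standard, but the word list, its ranking, and the lookup itself are my hypothetical stand-ins, not Tegic's actual algorithm:

```python
# A minimal sketch of a T9-style predictive text lookup (illustrative; not
# Tegic's actual algorithm or dictionary). Words are filed under the digit
# sequence of their letters, and the system "predicts" by rank within that
# bucket -- so some words are easy to type and colliding words are not.
from collections import defaultdict

# Standard phone keypad letter groupings.
KEYPAD = {'2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
          '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz'}
LETTER_TO_KEY = {c: d for d, letters in KEYPAD.items() for c in letters}

def digits(word):
    """Map a word to the keypresses that produce it."""
    return ''.join(LETTER_TO_KEY[c] for c in word.lower())

def build_index(words):
    """Group a word list by key sequence; list order stands in for rank."""
    index = defaultdict(list)
    for w in words:
        index[digits(w)].append(w)
    return index

# A tiny hypothetical word list, in the vendor's preference order.
index = build_index(['good', 'home', 'gone', 'hood'])

# All four words collide on the same keypresses; the top-ranked one wins,
# and the others cost extra keypresses to reach.
print(digits('good'))    # -> '4663'
print(index['4663'])     # -> ['good', 'home', 'gone', 'hood']
```

Whoever orders that list decides which word you get first: a small, invisible editorial choice baked into the phone.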

Note

Slide: Cupertino effect (example text)

Quote:

Within the GEIT BG the Cupertino with our Italian comrades proved to be very fruitful. (NATO Stabilisation Force, "Atlas raises the world," 14 May 2003)

Could you tell us how far such policy can go under the euro zone, and specifically where the limits of this Cupertino would be? (European Central Bank press conference, 3 Nov. 1998)

The source of this strange stuff, as the Language Log blog discovered, was spell-checking software whose dictionary contained the word "Cupertino" but not "cooperation."

Here's a screenshot from Microsoft Outlook Express circa 1996:

Note

Slide: Cupertino

Of course, this is always the case! But it's much more obvious here.

Simply put: Certain messages are easier than others.

And this is always the case.
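The mechanism behind the error can be sketched simply. This Python fragment imitates a spell checker whose dictionary is missing the unhyphenated "cooperation"; the tiny word list, and the use of `difflib` to rank similarity, are my hypothetical stand-ins, not the actual Outlook Express code:

```python
# Sketch of the Cupertino effect: when a typed word is missing from the
# dictionary, the checker falls back to the closest words it does know.
import difflib

# Hypothetical word list -- note the unhyphenated "cooperation" is absent.
DICTIONARY = ['cupertino', 'corporation', 'computer']

def suggest(word):
    """Return the word if known, else the nearest dictionary entries."""
    if word in DICTIONARY:
        return [word]
    return difflib.get_close_matches(word, DICTIONARY, n=3, cutoff=0.6)

# A plausible typo of "cooperation" gets "corrected" toward Cupertino,
# because the right word simply isn't there to suggest.
print(suggest('cooperatino'))   # 'cupertino' is the top suggestion
```

The checker isn't malicious; it just makes the words in its dictionary easy and everything else hard. That is exactly the sense in which certain messages are easier than others.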

Errors encourage us to ask about the who, why, and how.

Errors Reveal "Designed-in" Values

Sometimes, designers try to create hidden constraints because they have a particular value system. In the process, their values are revealed.

Note

Slide: Shiv Video

Spell checkers are the same way.

It's again about making certain messages easier and others more difficult. And it's an explicit prudish project.

Is it horrible and restraining? Maybe not. But limiting our options is, and can be. Errors are a way to begin that analysis and, with non-technical people, to begin that conversation.

Note

Slide: Shitty IE

Sometimes, decoding the values is much less difficult. :)

Errors Reveal "Hidden" Mediators

Note

Slide: Apple XSS Issue

Our technology is always mediated, in ways we don't always see but that are powerful. Those mediators could be modifying our communications, and they are shaping them.

And, sometimes, we actually get a picture of the mediator.

Note

Slide: Google Books Scan

Errors Provide Views into Closed Systems

In a final sense, we often get windows into the normally closed systems that we use. We see pieces and processes that, without the error, we might never see.

Note

Slide: GNU Units

Tell GNU Units anecdote.

Note

Slide: Source Code View

Sometimes this is extraordinarily obvious (i.e., we get to see the code itself).

Conclusions

What I've shown is a methodology.

Errors are useful because they are everywhere and they are, I hope, revealing. By revealing errors we can reveal much more.

Errors, I hope I've shown, can also be funny. And in that sense we can hook people in and get people to listen.

What should you walk away with?

I have three suggestions:

  1. Pay attention to errors where you see them. Think about what they reveal. Think about the power and politics of technology. Think about the impacts of closed and controlled code.
  2. Teach people about technology wherever possible. Reveal technology and its effects. Look at errors to do this.
  3. Selfishly, help out my Revealing Errors project by reading my blog, spreading the word, and more importantly, sending me revealing errors when you see them. I'd like to write a book and I need your help.

What's at stake?

In Code (his first and best book), Lawrence Lessig describes the way that "code is law" -- it controls and frames interaction and work. In a way, his argument is similar to what I'm doing here.

Technology is going to happen: Our choice is how: XOs or mobile phones?

We are, right now, locked in what may turn out to be the most important struggle of our lifetime: who gets to control technology and the terms on which we learn, act, interact, and experience. It is the most fundamental question.

There is power and politics to technological design.

We, as technologists, are the politicians.

We, as technologists, have a responsibility to pay attention to errors, to understand technologies, and to fight for control over technology.

Errors can help reveal the details of technologies and frame political debates. We need to pay attention and lead these debates. We need to use whatever tools we have, including errors, to do so.

More importantly though, we need to communicate and teach about technology to others. We must democratize technological understanding.

Analogy: writing and coding.

We must, and can, build that better, more free, more open world. Errors can be one piece.