From the Philosophy of the Open to the Ideology of the User-Friendly

Since I’ve been posting bits and pieces here from or on my book project, Reading Writing Interfaces, I wanted to also post what I’ve been thinking through in the third chapter, “From the Philosophy of the Open to the Ideology of the User-Friendly.” Below is the introductory section of the chapter, in which I outline my interest in the shift from a philosophy of the open, flexible, and extensible to the closed environment of the “user-friendly” Macintosh, a shift which continues to influence the shape of contemporary computing.

*

“Compared to the phosphorescent garbage heap of DOS – an intimidating jumble of letters and commands – the world one entered into when flicking on a Macintosh was a clean, well-lit room, populated by wry objects, yet none so jarring that it threatened one’s comforting sense of place. It welcomed your work.” (Levy 157)


In the Old Testament there was the first apple, the forbidden fruit of the Tree of Knowledge, which with one taste sent Adam, Eve, and all mankind into the great current of History. The second apple was Isaac Newton’s, the symbol of our entry into the age of modern science. The Apple Computers symbol was not chosen purely at random: it represents the third apple, the one that widens the paths of knowledge leading toward the future. (Gassée 10-11)

The third cut I make into the history of twentieth-century reading/writing interfaces is the era of the personal computer: an era preceded by Douglas Engelbart, Alan Kay, and Seymour Papert’s experiments with (especially educational) computing and interface design from the mid-1960s to the mid-1970s, one that began with expandable homebrew kits in the mid- to late 1970s and was irrevocably transformed into the so-called “user-friendly,” closed workstation with the release of the Apple Macintosh in late January 1984.[1]

This chapter, then, concerns itself with two significant aspects of this roughly ten-year period. The first is the shift from seeing a user-friendly computer as a tool that encourages understanding, tinkering, and creativity to seeing a user-friendly computer as an efficient workstation for productivity and task-management, and the effect of this shift particularly on digital literary production. The second, tightly connected to the first, is the rupture marked by the turn from computer systems based on the command-line interface to those based on “direct manipulation” interfaces that are iconic or graphical (GUI) – a turn driven by rhetoric insisting that the GUI, particularly the one pioneered by the Apple Macintosh design team, was not just different from the command-line interface but naturally better, easier, friendlier. As I outline in the second section of this chapter, the Macintosh was, as Jean-Louis Gassée (who headed up its development after Steve Jobs’ departure in 1985) writes without any hint of irony, “the third apple”: after the first apple in the Old Testament and the second apple that was Isaac Newton’s, it is “the one that widens the paths of knowledge leading toward the future” (11).[2]

Despite studies released since 1985 that clearly demonstrate GUIs are not necessarily better than command-line interfaces in terms of how easy they are to learn and to use, Apple – particularly under Jobs’ leadership – successfully created such a convincing aura of inevitable superiority around the Macintosh GUI that to this day the same “user-friendly” philosophy, paired with a closed architecture that now goes unnoticed, fuels consumers’ religious zeal for Apple products.[3] I should note that I have been an avid consumer of Apple products since I owned my first Macintosh Powerbook in 1995. What concerns me, however, is that “user-friendly” now takes the shape of keeping users steadfastly unaware and uninformed about how their computers, their reading/writing interfaces, work, let alone how these interfaces shape and determine their access to knowledge and their ability to produce knowledge. As Wendy Chun points out, it is a system in which users are, on the one hand, given the ability to “map, to zoom in and out, to manipulate, and to act,” but, she implies, the result is a “seemingly sovereign individual” who is mostly a devoted consumer of ready-made software and ready-made information whose framing and underlying (filtering) mechanisms we are not privy to (8).

Thus, the trajectory of this argument culminates in chapter four, in which I make it clear that the logical conclusion of this shift to the ideology (if not the religion) of the user-friendly via the GUI is, first, expressed in contemporary multi-touch, gestural, and ubiquitous computing devices such as the iPad and the iPhone, whose interfaces are touted as utterly invisible (and so their inner workings are de facto invisible, as they are also inaccessible); and that, second, this full realization of frictionless, interface-free computing born out of the mid-1980s is in turn critiqued by works of activist digital media poetics.[4] From this perspective, it is no coincidence at all that Apple had actually designed something like an iPhone in 1983: at the same time that the Macintosh designers were hard at work, Hartmut Esslinger, the designer of the Apple IIc, built a white landline phone complete with a built-in, stylus-driven touch-screen (“Apple’s First iPhone”). The Apple IIc was in fact a close relative of the Macintosh in terms of portability and its lack of internal expansion slots, which made both machines closed systems; the IIc was also released in 1984, just three months after the Macintosh.

But while this chapter proceeds chronologically from the era of the typewriter, using a media archaeology methodology to understand this particular rupture in media history means recognizing that activist media poetics plays out quite differently in the 1980s, an era newly oriented toward the efficient completion of tasks over and beyond a creative use or mis-use of the computer. Arguably, one reason for the heightened engagement in hacking type(writing) from the mid-1960s to the mid-1970s is that the typewriter had become so ubiquitous in homes and offices that it had also become invisible to its users. It is precisely at the point at which a technology saturates a culture that writers and artists, whose craft is utterly informed by a sensitivity to their tools, begin to break apart that same technology to once again draw attention to the way in which it offers certain limits and possibilities to both thought and expression. There are indeed examples of activist digital media poems that inherit this emphasis on making, doing, and hacking, but, once again, it seems to me that the vast majority of these works do not appear until both the personal computer and the user-friendly computer, whose GUI is designed to keep the user passively consuming technology rather than actively producing it, become practically ubiquitous.

As I discuss in the first section of this chapter, activist media poetics in this particular time period mostly takes the form of experimentation with digital tools that at the time were new to writers – an experimentation that, at least under the terms set by McKenzie Wark’s A Hacker Manifesto, certainly could be framed as hacking (Wark infamously writes that “Hackers create the possibility of new things entering the world” [004] and that “The slogan of the hacker class is not the workers of the world united, but the workings of the world untied” [006]). However, as I will discuss, work by Invisible Seattle, bpNichol, Paul Zelevansky, Geof Huth, and Robert Pinsky does not seek to make the (in this case) command-line interface visible so much as it openly plays with and tentatively tests the parameters of the personal computer as a still-new writing technology. This kind of open experimentation almost entirely disappeared once the Apple Macintosh’s design innovations, as well as Apple’s marketing, made open computer architecture and the command-line interface obsolete and GUIs pervasive.


[1] Related to this shift from the homebrew kit to the user-friendly, GUI-based personal computer is the initial attempt to make computers appear friendly to uncertain, first-time buyers by marketing them as sophisticated typewriters. For example, Don Lancaster declares in the TV Typewriter Cookbook that his 1973 TV Typewriter can “convert an ordinary Selectric office typewriter into a superb hard-copy printer” (218); and a 1979 advertisement in Byte magazine for the word processor AUTOTYPE (produced by Infinity Micro) – “a true processor of words” – oddly includes images of text in the shape of arrows and trees that could easily be mistaken for typewriter-created concrete poetry (“Autotype” 169).

[2] It’s worth noting that, despite Gassée’s hyperbolic rhetoric, which I use to help demonstrate the ideological fervor of those working for Apple in the 1980s, his vision for the Macintosh was quite different from Jobs’ in that Gassée helped shepherd onto the market three models of the Macintosh (the Mac Plus, Mac II, and Mac SE) that were all expandable, unlike the first-generation Macintosh, which actively prevented users from opening up the computer by, as I describe in the body of this chapter, giving the user a small electrical shock if they did not adhere to the warnings. While these later models of the Macintosh included expansion slots, which philosophically returned Apple to the era of Steve Wozniak’s Apple II (whose six expansion slots permitted a whole range of devices for display controllers, memory boards, hard disks, etc.), it seems clear that the return of Jobs to Apple in 1997 meant – and still does mean – a return to keeping the inner workings of Apple computers and computing devices firmly closed off to users.

[3] For example, in 1985 John Whiteside et al. wrote in “User Performance with Command, Menu, and Iconic Interfaces” that “interface style is not related to performance or preference (but careful design is),” and they further concluded that “the care with which an interface is crafted is more important than the style of interface chosen, at least for menu, command, and iconic systems” (185, 190). Such studies have been repeated as recently as 2007 (see Chen et al.).

[4] It is precisely out of this media archaeological impulse that I have created the Archeological Media Lab at the University of Colorado at Boulder – a lab which houses most of the computers I discuss in this chapter, including the Apple II, Apple Lisa, and Apple Macintosh – because their outdatedness very clearly communicates to us now the design ideologies behind their hardware and software, ideologies that delimit what can be written and what can be thought. The key to the lab’s success will be to avoid presenting these machines as novelty or kitsch and instead to approach each of them as a productive field for understanding our computing past and present.

5 thoughts on “From the Philosophy of the Open to the Ideology of the User-Friendly”

  1. Anne McGrail

    This is the most fun I’ve had reading in a long time. I wish I were a coder so I would have the authority to point to all the “consumers” of whom you speak and be immune to the effects of which you speak. Instead, the closest I come to coding is to say that I wrote my dissertation with function keys, little clumps of code that made the limitations only slightly more clunky than Apple’s – a kind of “sensible shoes” computing to Apple’s “high heels.” And I still miss function keys. I’ve been following this (DH) field for only 6 months, but I feel like what you are talking about here is as close to an evolving theory as Baudrillard’s.

  2. Brett Bobley (@brettbobley)

    This book looks really fascinating — I can’t wait to read more!

    This topic of open vs. closed (or hacker box vs. appliance) is interesting to me. When I first started using computers (Commodore PETs, Apple ][, TRS-80, etc.) I really perceived it as a machine to be programmed — writing code was what it was all about. I also perceived it as something that kids were mainly interested in, like comics or rock ’n’ roll. We hung around the computer lab and screwed around with these new machines and read Byte and pranked each other by writing fake BASIC interpreters and leaving them running in the lab. So the open architecture certainly made sense for this culture. I was always saving up to buy some crazy board to put in my Apple or soldering something (like the shift-key mod for the ][).

    I suspect that my perception of PCs as kid/hacker culture was driven in large part by the fact that the business community hadn’t yet embraced personal computers. There were no killer business apps yet — ones that businesses had to have. I remember talking to my dad, telling him about “word processing software” and how we all might, one day, use that instead of typewriters. He probably smiled at his hacker son but never imagined that my “hobby” would one day be something he’d have at his desk at his office in the business world.

    When Lotus 1-2-3 and the other killer business apps came out, that really started us down the road of moving toward the appliance. Businesses purchased computers to do a task — sometimes just one, like spreadsheets. They weren’t buying gear for you to hack on — they wanted productivity. They also wanted machines that had the same kind of reliability as an office phone — you pick it up and you’ve got dial tone. (Still haven’t really achieved that.)

    This really changed things, because once the business world started using computers solely to run off-the-shelf software, that became what computers were all about. That notion moved into the home too — buy a computer and you can run software program X or game Y. It wasn’t about taking it home and hacking anymore. Sometimes I find this troubling — all the kids I knew in the ’70s who had a PC knew how to code and how computers worked. But kids I meet today often have no idea — they only know how to use an appliance. The underlying platform is a mystery. But, on the other hand, I certainly see what Jobs was going for. My dad eventually got a computer and he loves it. He doesn’t code — he can’t even install a software patch himself. But he can use some off-the-shelf applications and finds his Mac useful and fun. So things are different now.

    Thanks!

    1. Lori Emerson

      Thanks so much for the great comment, Brett – your account of this shift to the appliance computer helps me better understand that thread I kept coming across in those old issues of Byte and all that rhetoric about “business solutions” and “task management.” I always liked Matthew Fuller’s essay “Microsoft Word,” but his point about software constructing workers instead of creativity became crystal clear to me looking through Byte. Also, I’ve had a hard time understanding what exactly people meant by “appliance” – or at least I did until I ran across a couple of phrases in, I think, writing by Donald Norman in which he says the computer should be as easy to use and as invisible as a Sunbeam toaster. Like you observing your dad, I really can see the value of this type of user-friendliness. And since I’m not a programmer myself, where on earth would I be without the GUI! So, in this chapter I was going to focus on the shift from the command-line to the GUI, but now it’s mostly about the history of the GUI and how not all philosophies of the user-friendly are created equally. As a non-programmer I think I could be just as comfortable using Smalltalk for the Xerox Star as I am using MacOS, but Smalltalk would make me a whole lot smarter, or more informed about the general shape of the system and its underlying workings … anyways, just some ramblings. Thanks again so much for reading and thoughtfully responding—

  3. Brendon

    I come from the realm of the coder, often a bit scared of GUI design (but recently warming to it). After reading this article (and promptly following), my eyes were opened to how the design impulse to simplify things for the user is an evolution of more than just computers. Your post really got me looking at that evolution in a different way. Thank you.
    I think this can also be seen in the declining sales of desktop PCs; no one wants to tinker with their PC anymore.
    I am looking forward to reading more of your book.
