[The second of a series, the first of which is here.]
Steven Levy’s ground-breaking chronicle, Hackers: Heroes of the Computer Revolution (1984), published four decades ago (before his books on Google, Facebook, the Macintosh, the iPod, artificial life, and the popularization of encryption), offers enduring lessons to pre-law and law students, and to lawyers.
Levy, currently Wired magazine’s Editor at Large, focuses not on the details of programming but instead on the people and the perspectives propelling (in the words of his 2010 afterword) “an often heated battle between geeky idealism and cold-hearted commerce.”
He traces the evolution of hacking’s community and culture from its inception, at MIT in the late 1950s (where, in some respects, as concert promoter Bill Graham famously said of the Grateful Dead, “They’re not the best at what they do. They’re the only ones that do what they do.”); through Palo Alto in the 1970s; and into the mass marketing of video games—especially by the Yosemite-adjacent On-Line Systems (later renamed Sierra On-Line)—in the early 1980s.
From the relatively restricted realms of Cambridge’s computer facilities, Berkeley’s Community Memory project, and Menlo Park (California)’s Homebrew Computer Club emerged issues, tensions, and dynamics of even more critical consequence in today’s pervasively networked, always-online society and economy, where “personal computers” can be pocket-sized.
At MIT, “hacker” appreciatively referred to a participant in “a project undertaken or a product built not solely to fulfill some constructive goal, but with some wild pleasure taken in mere involvement. . . . [T]o qualify as a hack, the feat must be imbued with innovation, style, and technical virtuosity.” (By 2010, a veteran hacker had to acknowledge to Levy that popular culture “stole our word, . . . and it’s irretrievably gone.”)
With warmth, well-crafted portrayals, and a wealth of wonderful quotations, the book summarizes, analyzes, and illustrates what Levy identifies as the elements of “The Hacker Ethic.”
In his analysis, hackers champion, beyond simple elegance in the design and implementation of software and hardware:
● “the Hands-On Imperative” (complete access to any facilities, tools, hardware, and software that could be educational; but not for purposes of illegal profits);
● the free sharing of information, including program code, for review and editing by any interested members of the community;
● the decentralization of authority, and an antipathy towards bureaucracy;
● the assessment of others by their technical skills, rather than by “bogus criteria such as degrees, age, race, or position” (although virtually all of the featured hackers are young—sometimes very young—white males).
Indeed, “some hackers. . . would never graduate, and be too busy hacking to really regret the loss”; on the other hand, the loss would be Hewlett-Packard’s when it denied its employee Steve Wozniak a position as the head of a new division for small computers because he had not officially graduated from Berkeley.
● the belief that computers “can change your life for the better,” in part by enabling users to create “art and beauty,” in the design and/or by the application of their programs;
● the attitude that “no system or program is ever completed,” often coupled with “the illusion that total control [is] just a few features away” (the “Creeping Feature Creature”); and
● the understanding that many hardware and software problems can be resolved by a unique, optimal, graceful, and universally appealing solution, known as “The Right Thing.”
At its extreme, the Hacker Ethic led one of MIT’s most talented and dedicated hackers to threaten to destroy an MIT computer if it were to be operated, as had been announced, on a “time-sharing” basis. (Ultimately, time-sharing was restricted to daytime hours, enabling the hackers to retain, at night, their exclusive control of the system.)
The book describes various specialties of hackers, including: practical coding of software “Tools to Make Tools,” or “pragmatic systems building” (“[T]here was no higher calling in hackerism than quality systems hacking”); “math hacking,” or the surprising and sometimes-surreal enhancement of programs by a master of “the magical connections between things in the vast mandala of numerical relationships on which hacking was ultimately based”; telephone hacking (unauthorized and sometimes-illegal exploration of the phone system); and hardware (particularly lock) hacking.
Though not referred to as such, it also describes a form of managerial or administrative hacking, or what might be called hacker-harnessing. When inspired and (self-)motivated, a group could create, over a weekend, “a program that would have taken the computer industry weeks, maybe even months to pull off”; however, both individually and collectively, the hackers were notoriously resistant to any form of external direction. (Lawyers might especially appreciate Levy’s discussion, later in the book, of the challenges of governing meetings of the Homebrew Computer Club.)
Even an administrator who might himself be considered a hacker “failed ignominiously” at “assigning the hackers specific parts of [a] problem” to solve. “He ultimately accepted the fact that the best way to get hackers to do things was to suggest them, and hope that the hackers would be interested enough.” (Levy notes that “someone like [artificial intelligence pioneer] Marvin Minsky might happen along and say, ‘Here is a robot arm. I am leaving this robot arm by the machine.’ Immediately, nothing in the world is as essential [to a hacker] as making the proper interface between the machine and the robot arm. . . .”)
This accommodation involved two opposed meanings of “oversight” (supervision, and looking the other way) when it came to the hackers’ habitual violations of others’ privacy and property. Another administrator explained to Levy that instead of creating physical or digital barriers, which would only trigger the hackers’ compulsion to defeat technological challenges, “the trick was to sort of have an unspoken agreement. . . . And if someone violated those limits, the violation would be tolerated as long as no one knew about it. Therefore, if you gained something by crawling over the wall to get into my office, you had to never say anything about it.”
Such approaches might succeed within the close confines of MIT’s computer centers, but they were increasingly threatened as innovations in software, and in “personal computers” (a longtime dream of hackers), spawned the worldwide commercialization of both.
The evolution and environments of the real world, unlike those of the hackers’ beloved computer simulation of LIFE (introduced in 1970 by Martin Gardner’s “Mathematical Games” column in Scientific American), could not be carefully contained and controlled. Nor could the hackers remain, despite their intellects and intensity, insulated and isolated.
In fact, the chapter devoted to their near-obsession with LIFE details as well the increasing social concerns that in the late 1960s led MIT to install special security measures in its computer facilities, which had become targets of antiwar demonstrators. Not all of the hackers were opposed to these protections.
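For readers who have never encountered it, LIFE is John Conway’s cellular automaton, the subject of Gardner’s column: a grid of cells that live or die, generation after generation, according to two simple neighbor-counting rules, from which astonishingly complex behavior emerges. The sketch below is not from the book; it is offered only as an illustration (the function name and starting pattern are invented here) of how little code the rules themselves require:

```python
from collections import Counter

def step(live_cells):
    """Advance Conway's LIFE by one generation.

    `live_cells` is a set of (x, y) coordinates of live cells; every
    other cell on the (unbounded) grid is dead.
    """
    # Count live neighbors for every cell adjacent to at least one live cell.
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or if it is alive now and has exactly 2.
    return {
        cell
        for cell, count in neighbor_counts.items()
        if count == 3 or (count == 2 and cell in live_cells)
    }

# A "glider," one of the patterns whose open-ended evolution fascinated
# the hackers: it crawls diagonally across the grid forever.
pattern = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    pattern = step(pattern)
print(sorted(pattern))  # the original glider, translated diagonally by one cell
```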
The activists were not misinformed: “[A]ll of the lab’s activities. . . had been funded by the Department of Defense,” whose Advanced Research Projects Agency’s (ARPA) “money was the lifeblood of the hacking way of life.” (Prefiguring elements of today’s employee and shareholder activism at high-tech companies, in 1975, “a debate was raging within [the Osborne Computer Corporation] as to the propriety of selling [its database and communications] software to anyone who cared to use it, or restricting it so that it would not benefit any military efforts.”)
Levy’s “second wave of hackers” took the Hacker Ethic public by making software and personal computers much more widely available. The Homebrew Computer Club facilitated networking of all types—notably enabling the unauthorized distribution of corporations’ proprietary computer chips, schematics, and software code. In a watershed moment for the community (discussed approximately halfway through Hackers), a twenty-year-old Bill Gates, incensed by the widespread piracy of his and Paul Allen’s BASIC interpreter code for the long-awaited Altair personal computer, published an Open Letter to Hobbyists in the Homebrew Computer Club’s newsletter, in January 1976.
Signing his letter as “General Partner, Micro-Soft,” Gates argued that, “As the majority of hobbyists must be aware, most of you steal your software. . . Who can afford to do professional work for nothing? What hobbyist can put three man-years into programming, finding all the bugs, and distributing for free?”
Although his appeal might not have changed the attitudes, or behavior, of many readers, Gates’ assertion of intellectual property rights in software marked a significant departure from the practices of MIT’s proto-hackers, who had freely shared their code. (Levy’s summary of their position: “As for royalties, wasn’t software more like a gift to the world, something that was reward in itself? . . . When you wrote a fine program you were building a community, not churning out a product.”)
Among the programs they shared was Spacewar, perhaps the first video game. The Digital Equipment Corporation (DEC), which had supplied, at the hackers’ request, “the elaborate sine-cosine routines necessary to plot the [space]ships’ motion,” used the game “as a final diagnostic program” on the computers it manufactured. Casual collaboration on another program, MacLISP, “was all part of the easy arrangement between MIT and DEC, and no one questioned it,” in the interests of promoting the creation and dissemination of the best software possible.
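For non-programmers, those “sine-cosine routines” were simply the trigonometry needed to turn each ship’s heading into motion and into points that could be drawn on the display. A rough modern analogue, purely illustrative (the function names and the wedge-shaped outline are invented here, and Spacewar itself was hand-coded in PDP-1 assembly), might look like this:

```python
import math

def velocity_components(speed, heading):
    """Split a ship's speed into x and y components along its heading (radians)."""
    return speed * math.cos(heading), speed * math.sin(heading)

def ship_outline(x, y, heading, points):
    """Rotate a ship's outline (given in ship-local coordinates) to its heading
    and translate it to its position, ready to plot on the display."""
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    return [
        (x + px * cos_h - py * sin_h, y + px * sin_h + py * cos_h)
        for px, py in points
    ]

# A crude triangular "wedge" ship pointing along its local x axis.
wedge = [(8, 0), (-4, 3), (-4, -3)]
print(velocity_components(2.0, math.pi / 4))       # motion toward the upper right
print(ship_outline(0.0, 0.0, math.pi / 2, wedge))  # the same wedge drawn facing "up"
```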
Gates’ contrast of “professional[s]” with “hobbyists,” though, would blur over time. If, as Levy observed, in 1975 “being a hobbyist in digital electronics meant you were probably a hardware hacker,” the programming and hardware wizardry of Steve Wozniak would bring user-friendly Apple computers to the public, enabling them to more easily create their own software. Indeed, for purchasers of Apple computers, or of their early competitors, “hardware creation was essentially done for you. People bought these machines to hack software.”
Wozniak himself straddled the two categories. Originally “building a computer to have fun with, to show his friends,” he was prevailed upon by his family and friends (who in turn had been persuaded by Steve Jobs) to leave Hewlett-Packard for the nascent Apple, to bring computers to the public.
Wozniak told Levy, “[T]here’s no way I would associate Apple with doing good computer design in my head. . . . The reason for starting Apple after the computer design is there’s something else—to make money.” But, as opposed to Gates’ proprietary approach, “Every twist and turn of [Wozniak’s] design, every coding trick in his BASIC interpreter. . . would be documented and distributed to anyone who wanted to see.”
The third generation of hackers, who designed applications—most notably, games—for personal computers, operated in an unabashedly corporate, commercial, and commodified world, funded in part by venture capital, where collaboration across companies—and even participation in Homebrew Computer Club meetings—declined rapidly.
The head of game-maker Sirius Software concluded, “It’s one thing to see your Apple product on the wall of a computer store, . . . but when you see a rack of your stuff in K-Mart, you know you’ve arrived.”
Discussing the advent of copy-protection technologies, and reproducing an anti-piracy warning distributed by game-maker Atari, Levy examines the ways in which game companies, including Sierra On-Line, reconfigured rivals’ programs to create their own versions, sparking copyright litigation. Despite its success in a pre-trial motion, Sierra settled litigation brought against it by Atari: its leader realized, “If this [win] opens the door to other programmers ripping off my software, . . . what happened here was a bad thing.”
Although programmers increasingly demanded that their names be featured on the boxes containing their works, they were themselves now often seen as interchangeable cogs in the production of games.
In his 2010 afterword, Levy concluded that, “[H]acking’s values aren’t threatened by business—they have conquered business. Seat-of-the-pants problem solving. Decentralized decision making. Emphasizing quality of work over quality of wardrobe. These are all hacker ideals, and they have all infiltrated the working world.”
Beyond the continuing questions of ways in which software can, and should, be subject to intellectual property protection, Hackers raises issues relevant to any current professional or pre-professional, in the law or otherwise:
● Where, today, can one find emerging fields comparable to computing in the late 1950s, and a community within which, and mentors from whom, to learn?
● To what degree, if at all, should top performers, individually or collectively, in corporate settings or otherwise, be indulged, or exempted from general requirements or restrictions, because of their perceived talents?
● What is the proper balance of expediency and elegance, even if, as one software executive acknowledged, “Software always takes longer [to develop] than you expect”?
● What is the proper work-life balance?
● How should the determination of that balance be affected by the level of excitement, challenge, and fun that one finds (or might, or hopes to, find) in one’s work?
● How can an individual, group, or company maintain its original professional intensity, focus, and purpose?
● To what degree can, and should, “the Hacker Ethic” be adopted by different areas of today’s professional and popular culture?
● To what degree, and in what contexts, should “the Hacker Ethic” be overruled by considerations such as national security and personal privacy? To what degree are they consistent? (Richard Stallman, portrayed at the end of the book as the staunchest holdout for the Hacker Ethic, insisted to Levy in 2010, “You have to believe that freedom is important and you deserve it.”)
● Finally, how much of an extraordinarily complex subject, particularly one that has evolved over generations, is it possible for a single person, however talented, to master?
As it has been said, in a much different context:
“Once a young man who wanted to become a Hasid arrived at the court of Isaac Meir, the rebbe of Gur.
“The rabbi asked him if he had learned Torah.
“The young man didn’t know what to answer. He had studied Torah but didn’t want to appear too bold and answer ‘Yes,’ as if he knew all of Torah; nor could he say ‘No,’ for he would then be lying.
“So he responded, ‘I know a little.’
“The rabbi replied, ‘Can anyone know more than a little?’”