Simple Strategies and Secrets for Success in Law School (A Companion to the Book of the Same Name)

Author: waltereffross


     [The previous essays in this series are here, here, here, here, here, here, here, here, here, here, here, here, and here.]

     This month, for the thirtieth year, medical schools’ “white coat ceremonies” formally initiated students into the profession.  Many of those events featured some version of the Hippocratic Oath, which was created about twenty-five centuries ago.

     New law students—preparing for positions at least as complicated as, and much more contentious than, those of physicians—were presented with no special garment or equipment, and might not have recited any pledge (although some law schools have introduced “professionalism oaths”).

     However, during their own orientations these “One Ls” effectively donned an invisible, insulating, and indispensable cloak designed by the American Bar Association.

     Tailored to the complexities of attorney-client relationships, the plain-English provisions of the Model Rules of Professional Conduct (available on the ABA’s website, and valuable reading for any pre-law or law student) are by necessity both roomy and restrictive.

     They enable a lawyer to act simultaneously on behalf of her law firm, its clients, and the legal system; and to serve variously as an advisor, advocate, negotiator, evaluator, and arbitrator.

     Because lawyers lack the distinctive dress of doctors, they are sometimes required to identify their profession and the possibly adversarial nature of their involvement.

     For example, an attorney dealing on a client’s behalf with someone not represented by counsel must “make reasonable efforts to correct [that person’s] misunderstanding” of “the lawyer’s role in the matter.” In particular, an attorney who represents an organization should “explain the identity of the client” when communicating with executives, employees, and shareholders whose personal interests are adverse to the organization’s, but who might mistakenly believe that the attorney also directly represents them.

    The Rules also reflect lawyers’ (as opposed to doctors’) legal status as agents, who owe to each client/principal the fiduciary duties of care and loyalty. They prescribe standards for providing competent, informed, and timely counsel; and for championing a client’s concerns, including confidentiality, ahead of the lawyer’s own. 

    Although the interests of a physician’s different patients might rarely collide, the Rules address in detail how to prevent, and resolve, conflicts among a lawyer’s (or law firm’s) clients. 

    Lawyers’ cloaks do allow them some flexibility and freedom of movement.

    First, alongside the Rules’ mandatory (“shall”) provisions are many discretionary (“may”) provisions, as well as more than a hundred appearances of some form of the qualifier “reasonable”; more than thirty of “substantial”; and more than twenty-five of “material.”

    Second, although doctors aren’t usually perceived by the public as endorsing their patients’ values and actions, lawyers well might be.  Model Rule 1.2(b) declares that representation “does not constitute an endorsement of the client’s political, economic, social or moral views or activities.”

    Third, under Model Rule 1.16(b)(4), an attorney may generally withdraw from a representation if “the client insists upon taking action that the lawyer considers repugnant or with which the lawyer has a fundamental disagreement.”

     Fourth, the lawyer herself isn’t explicitly required to refrain from personal activities that might embarrass or disconcert some clients.

    Fifth, just as doctors might make suggestions that are not technically medical, lawyers may offer perspectives beyond the strictly law-related. Model Rule 2.1 provides that, “In rendering advice, a lawyer may refer not only to law but to other considerations such as moral, economic, social and political factors, that may be relevant to the client’s situation.”

     For example, ESG (Environmental, Social, and Governance) attorneys often review with clients possible public pushback to (or plaudits for) their various legal options.

     The most recent characteristics of law students’ cloaks were added by the ABA’s accreditation standards.  As of February 2022, law schools must “provide substantial opportunities to students for . . . the development of a professional identity,” which “involve[s] an intentional exploration of the values, guiding principles, and well-being practices considered foundational to successful legal practice.”

     One of those fundamental values, principles, and practices—compassion, for oneself and others—is nowhere mentioned in the Model Rules.

    However, it appears in the first of the American Medical Association’s ten Principles of Medical Ethics, as well as in the accounts of physicians like Danielle Ofri (What Patients Say, What Doctors Hear), Suzanne Koven (Letter to a Young Female Physician), and Jerome Groopman (How Doctors Think). 

     Generations of law students have been advised, “If the facts help your client, pound on the facts.  If the law helps your client, pound on the law.  If the facts and the law don’t help your client, pound on the table and yell.”

      In one old story, a courtroom lawyer, finding both the facts and the law unavailing, chooses a less aggressive option.  Humbly pleading for judicial compassion for his client, he happens to resort to Yiddish: “Your Honor, it’s a matter of rachmones.” 

      To which the opposing attorney, accustomed to legal decisions bearing such titles as Matter of Smith and Matter of Jones, responds in confusion, “Counsel, do you have a citation for that reference?”

      The pockets of a law student’s invisible cloak should have at least as much room for compassion as do those of a medical student’s new white coat.

5-7-5-365: Or, Micro-Journaling


     Regularly reading haiku can relax, refresh, and refocus multitasking students and lawyers too busy to parse prolix poetry.

     Moreover, composing a daily haiku is not only a way to quickly capture classroom, campus, and extracurricular moments and moods, but also a mindful method of “microjournaling,” as well as an evocative exercise in stylistic subtlety.  As Harold G. Henderson’s An Introduction to Haiku (1958) observes, “[G]ood haiku are full of overtones.  The elusiveness that is one of their chief charms comes, not from haziness, but from the fact that so much suggestion is put into so few words.”

     Nor should such writing require lawyerly redlining or other formalities.  Clark Strand’s Seeds from a Birch Tree: Writing Haiku and the Spiritual Journey (1997) insists that “[C]omposing a haiku by revision is unlikely to result in the kind of clarity, the feeling of reality that is the unique hallmark of haiku and the true source of its meaning.”

    Simply staying alert for the topic of the day’s poem is a training of its own.  Strand’s chapter, “Taking a Haiku Walk,” discusses how he found poetic inspiration when he “worked on Wall Street as a proofreader for a corporate law firm”; elsewhere in the book, he recommends maintaining a “haiku diary” (“a moderately messy notebook . . . neither cumbersome nor tiny,” in which one “can feel free to scribble notes and random moments of the day”).

     Similarly, Beth Howard, who originally adopted the daily-haiku practice for a year, and then just kept going, is quoted in her teacher Natalie Goldberg’s Three Simple Lines (2021) as advising, in part, “Small memo pads are all you need, nothing fancy. . . . Put down every line that comes. . . . You don’t have to finish the haiku in the moment, but you don’t want to lose it.”

     The Haiku Society of America defines a haiku as “a short poem that uses imagistic language to convey the essence of an experience of nature or the season intuitively linked to the human condition.”

     The HSA’s accompanying Discussion Notes elaborate that

     “Most haiku in English consist of three unrhymed lines of seventeen or fewer syllables, with the middle line longest. . . . In Japanese a typical haiku has seventeen ‘sounds’ (on) arranged five, seven, and five [and] include[s] a ‘season word’ (kigo), a word or phrase that helps identify the season of the experience recorded in the poem, and a ‘cutting word’ (kireji), a sort of spoken punctuation that marks a pause or gives emphasis to one part of the poem. . . . The most common technique is juxtaposing two images or ideas (Japanese rensô). Punctuation, space, a line-break, or a grammatical break may substitute for a cutting word. Most haiku have no titles, and metaphors and similes are commonly avoided.”

     According to Lee Gurga’s Haiku: A Poet’s Guide (2003):

     “The primary poetic technique of the haiku is the placing of two or three images side by side without interpretation.  At least one of the images, or part of it, comes from the natural world.  The second image relates to the first, sometimes closely, sometimes more ambiguously. . . . Traditional Japanese haiku aesthetics recognize several kind[s] of interactions between the images, including echo, contrast, and expansion.”

     Gurga’s chapter “The Craft of Haiku” reviews general techniques, and a subsequent chapter on “Writing and Revising Haiku” provides twenty-six specific guidelines (rather than rules).

     Traditional haiku rely on rustic imagery, and often illustrate the evanescence and transitions of natural phenomena.  Modern authors might choose to highlight such themes as gratitude, resilience, and perseverance.

     To dismiss as foolishness such a promising practice, which involves such a small investment of money, time, and energy, might well be, in the language of literary commentators, a pathetic fallacy.

     The four classic masters of haiku are: Matsuo Basho (1644-1694), Yosa Buson (1716-1784), Kobayashi Issa (1763-1828), and Masaoka Shiki (1867-1902).

      Following are selections from Robert Hass (ed. & trans.), The Essential Haiku: Versions of Basho, Buson & Issa (1994):


                 A crow

            has settled on a bare branch—

                 autumn evening.

                             The old pond—

                        a frog jumps in,

                             sound of water.

                                         Winter solitude—

                                    in a world of one color

                                         the sound of wind.


                                                                 Stillness—

                                                            the cicada’s cry

                                                                 drills into the rocks.


                 Blow of an ax,

            pine scent,

                 the winter woods.

                             Cover my head

                        or my feet?

                           the winter quilt.

                                         In the summer rain

                                    the path

                                         has disappeared.

                                                     Calligraphy of geese

                                                against the sky—

                                                     the moon seals it.


                 Don’t worry, spiders,

            I keep house

                 casually.

                             Climb Mount Fuji,

                        O snail,

                             but slowly, slowly.

                                         The crow

                                    walks along there

                                         as if it were tilling the field.

                                                     Children imitating cormorants

                                                are even more wonderful

                                                     than cormorants.

     Other notable collections of haiku include:

     ● Robert Aitken, The River of Heaven: The Haiku of Basho, Buson, Issa, and Shiki (2011).  This posthumously published work provides a Zen master’s translations of and commentary on selected poems.  Roshi Aitken also wrote A Zen Wave: Basho’s Haiku and Zen (1978).

     ● Issa’s Best: A Translator’s Selection of Master Haiku (trans. David G. Lanoue) (2012) is a “guided tour through the work of Issa, gathering together in one text 1,210 of what I consider to be the master poet’s most effective and evocative verses,” categorized by season (New Year, Spring, Summer, Autumn, Winter, and Non-Seasonal).

     On the translator’s website, 12,440 of Issa’s haiku are available (click its “search” button without specifying a keyword).

     ● Collected Haiku of Yosa Buson (trans. W.S. Merwin & Takako Lento) (2013).  Categorized by season.

     ● Basho: The Complete Haiku (trans. Jane Reichhold) (2008, 2013). 

     The introduction notes that “Becoming familiar with Basho’s single poems, reproduced here in the approximate chronological order in which he wrote them, gives the reader a marvelous overview of the process through which Basho changed as a poet and how he changed the poetry of Japan—and of the rest of the world.  The accompanying notes provide a true sense of the times and culture in which the poems were written.”

     One of the appendices identifies thirty-three “Haiku Techniques” such as association, comparison, and contrast, and illustrates each with an example from Basho’s haiku.



     One of the nine “Core Lists” that I recommend that law students construct is “an ABC list of Arguments, Building Blocks, and Clauses. . . that you encounter in your classes and assigned reading.”  I’ve suggested, “[Y]ou might even keep and organize [it] on index cards.”

     Beyond my book’s initial list of thirteen themes, three authors in particular supply entries for such a “starter deck,” which you can expand and customize during your law school journey.

     If—as the titles of two of their books indicate—these practical and theoretical elements constitute the “tools” of law students and practitioners, then a list, table, diagram, or deck would be a handy pegboard with which to neatly separate and organize them.

      ● Joel P. Trachtman’s The Tools of Argument: How the Best Lawyers Think, Argue, and Win (2013) collects, introduces, and explains dozens of arguments (and their corresponding counter-arguments, and even counter-counter-arguments) in the categories of: procedure; interpretation; use of precedent; facts and evidence; and standards of legal responsibility.

     These tactics, and his examples, range across the law school curriculum, including civil procedure, torts, contracts, evidence, constitutional law, international law, and jurisprudence.

      Trachtman, a professor of international law at The Fletcher School of Law and Diplomacy at Tufts University, carefully states in the book’s introduction that “Lawyers are the modern heirs of the ancient Greek sophists, the worst of whom sought to ‘make the weaker argument appear the stronger.’”

    However, he provides no source for this quotation, despite his declaration that “Citations allow the reader to see, and to evaluate for himself, the quality of the support for the author’s statements.”

     In fact, he ultimately adopts the more pejorative, and “ethically unappealing,” meaning of the word: “This book is not intended to school sophists.  But it does include a taxonomy of the tools of sophistry so they can be identified and countered.”

      Thus, Trachtman might to some degree contradict his book’s emphasis on clarifying the meaning of terms, especially to preclude disputes about the drafter’s “original intent.”

     (The current popular understanding of sophist, like that of stoic, cynic, and epicurean, actually departs considerably from the tenets and techniques of the ancient Greek schools of philosophy known by their respective (capitalized) names.)

     ● Ward Farnsworth’s more complex The Legal Analyst: A Toolkit for Thinking About the Law (2007) focuses on “ideas that can be introduced effectively with a bunch of good examples in a short chapter.”  

     The author, a professor at the University of Texas at Austin School of Law, thus acknowledges “notable omissions [that] include ideas from moral theory, from critical legal studies, and from legal realism [and] theories of constitutional interpretation. . . .”

     Farnsworth’s thirty-one chapters, most of which conclude with a helpfully annotated list of related law review articles and books, are “grouped into five parts”: “incentives: the effects that legal decisions have on the choices people make afterwards”; “trust, cooperation, and other problems that arise when people work together”; “subjects from jurisprudence”; “cognitive psychology” and “the ways that people may act irrationally”; and, “problems of proof.”

     Most useful to the beginner might be the jurisprudence section’s discussion of “Rules and Standards,” “Slippery Slopes,” “Property Rules and Liability Rules,” and “Baselines.”

     Those without a background in economics should appreciate much of the first section, which presents such concepts as efficiency, waste, rent seeking (“wasteful efforts to gain a prize,” or “competition over the way some good thing ought to be distributed”), and Coase’s Theorem (“in a world with no transaction costs. . . . rights naturally flow into the hands of whoever will pay the most for them”). 

     The second section examines game theory situations, like the prisoner’s dilemma and the stag hunt, without demanding mathematical skills or a calculator. 

     Throughout, The Legal Analyst includes examples and illustrations of the concepts, and clearly analyzes their practical import for lawyers, clients, the court, and the public.

     ● Parts of Farnsworth’s fourth section (and some elements of his fifth)—like Trachtman’s discussion of such “Rhetorical Tricks” as misleading logical or causal connections—are  more deeply explored in Nobel Prize laureate Daniel Kahneman’s magisterial Thinking, Fast and Slow (2011).

     Kahneman, an emeritus professor of psychology at Princeton (and also of Psychology and Public Affairs, at The Princeton School of Public and International Affairs), pioneered with Amos Tversky the field of “behavioral economics” (sometimes referred to as “neuroeconomics”).  His longtime best-seller summarizes for non-specialists the findings and implications of his earlier and more technical articles (some of which Farnsworth cites). 

     After reviewing an unsettling and humbling catalog of psychological pitfalls to which even the most (subjectively) careful decision-maker might succumb, the author concludes modestly that, in general, “[M]y intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.  I have improved only in my ability to recognize situations in which errors are likely. . . .” and thus to “slow down, and ask for reinforcement from System 2” (his shorthand reference to one’s non-intuitive, deliberative, and effortful mental processes).

     Lawyers might well educate themselves and their individual and corporate clients about these potential vulnerabilities, and help develop cognitive countermeasures, including decision-making processes.

     In particular, “Organizations are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures.”

      Appendix B of my own book lists, categorizes, and summarizes dozens of such traps, and offers a starter set of “Preventive Procedures, Policies, and Protocols.”    



     Investigative reporter John Carreyrou’s gripping and best-selling account of Theranos Inc., Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018), presents many personal and professional questions for law students to consider.

     Theranos claimed that it had developed a device that could, by miniaturizing and combining existing technologies, conduct many different diagnostic tests on the same small sample of a patient’s blood.  

      However, as detailed by Carreyrou (then reporting for The Wall Street Journal, and now with The New York Times), the company never realized that goal, and instead appears to have supplied dangerously inaccurate results to some people whose blood it tested.

     In January 2022, Theranos founder and CEO Elizabeth Holmes was convicted of three counts of wire fraud, and one count of conspiracy to commit wire fraud, against the company’s investors.  That November, she was sentenced to a prison term of more than eleven years.

     In July 2022, Theranos’s former chief operating officer Ramesh “Sunny” Balwani was convicted of ten counts of wire fraud and two counts of conspiracy to commit wire fraud; five months later, he was sentenced to a term of almost thirteen years.

     In 2010, while discussing with representatives of Walgreens the early stages of an arrangement to install Theranos devices in their drugstores (a parallel partnership was created for Safeway’s grocery stores), Balwani scuttled a suggestion to involve Walgreens’s information technology personnel, declaring, “IT are like lawyers, avoid them as long as possible.”

     Yet Carreyrou’s chronicle contains no shortage of lawyers, some of whom might have been more advantageously consulted earlier:

     ● The company’s in-house and outside counsel threatened and sued its former employees, claiming that they had stolen trade secrets. 

     ● Theranos’s lawyers attempted to compel an employee to surrender his patent for a method of affixing lights to bicycle wheels (an invention for which he had successfully raised funds on Kickstarter), asserting that the company was legally entitled to all intellectual property created by its employees.

     ● After one employee, distraught about developments at the company and about his professional prospects, ended his own life, his widow left a message for Holmes, only to receive that day “an email from a Theranos lawyer requesting that she immediately return [his] company laptop and cell phone and any other confidential information he might have retained.”

     ● When a newly-terminated laboratory director refused to sign any more legal papers, Balwani laughably “offered to hire him an attorney to expedite matters.”  (After retaining a lawyer on his own, the former employee complied with the company’s demands to destroy his copies of company documents.)

     ● An inventor and (former) friend of Holmes’ family engaged a lawyer to file, in 2006, a patent application essential to some planned functions of Theranos’s devices. 

     ● Because the inventor’s son was a partner at Theranos’s regular patent law firm, that firm declined to represent Theranos in challenging the application.  Theranos then sued the father and both of his sons, alleging that the lawyer had conveyed some of the company’s confidential intellectual property to his father.

     ● Because of the U.S. Central Command’s military interest in using Theranos’s device on the battlefield, an Army lawyer met with Holmes.  He “noted that she had brought no regulatory affairs expert to the meeting.  He suspected the company didn’t even employ one.  If he was right about that, it was an incredibly naïve way of operating.  Health care was the most highly regulated industry in the country and for good reason: the lives of patients were at stake.”

     ● The prestigious advertising agency Chiat\Day, concerned about its potential liability for statements made in marketing materials prepared for Theranos (and reviewed by Theranos’s counsel), consulted its own lawyers. 

      ● Most notably, Theranos ultimately retained Boies, Schiller & Flexner, the firm of prominent lawyer David Boies.  Boies subsequently became a director of the company, and the law firm became a shareholder in Theranos.  (Holmes would later claim that she had believed that Boies had represented not only the company but herself personally; in the course of dismissing that contention, a federal magistrate judge noted in June 2021 that there had been no written retention agreement between the law firm and Theranos.)

     ● Lawyers from Boies, Schiller “ambushed” former employee Tyler Shultz at the home of his grandfather, former Secretary of State (and, at the time, Theranos director) George Shultz, accusing Tyler of having contacted the Wall Street Journal.

     ● Tyler, refusing to sign a document that those lawyers pressed on him, observed, “A Theranos lawyer had drafted this with Theranos’ best interests in mind. . . . I think I need a lawyer to look at it with my best interests in mind.”

     ● Tyler later “arranged for [his parents] to have their own legal counsel.  That way he could communicate with them through attorneys and those conversations would be protected by the attorney-client privilege.”

     ● At a meeting in the Journal’s headquarters, Boies and colleagues immediately activated “little tape recorders . . . at each end of the conference table” to capture their conversation with Carreyrou, his editor, and the newspaper’s deputy general counsel.

     They would later send the newspaper a letter that “sternly demanded that the Journal ‘destroy or return’ all Theranos trade secrets and confidential information in its possession.  Even though Boies must have known there was zero chance we would do that, it was a shot across the bow.” 

     ● However, in response, “the Journal’s legal department dispatched a technician to copy the contents of my laptop and phone in preparation for litigation.”

     ● Finally, after (starting in October 2015) Carreyrou’s front-page stories (which the newspaper’s “standards editor and the lawyers would comb through line by line”) were published, field inspectors from the federal Centers for Medicare and Medicaid Services (CMS), “the chief regulator of clinical laboratories,” visited Theranos for four days.  “As soon as she sat down” for an interview, a lab associate who had worked on the Theranos technology “asked for an attorney.  She looked coached and afraid.”

     Among the practical issues that Bad Blood raises are:

     ● What danger signs should compel current and potential employees, directors, investors, and lawyers to investigate, if not reevaluate, their relationship with a company and its leaders?  What disclosure policies, or other protocols or processes, should be instituted to detect, and follow up on, such “red flags”?

     At Theranos, not only were firings apparently frequent, but employee access to information was extremely compartmentalized.  Although Holmes and Balwani explained that internal secrecy was an aspect of the company’s “stealth mode,” one employee privately pondered that other firms had “cross-functional teams with representatives from the chemistry, engineering, manufacturing, quality control, and regulatory departments working toward a common objective.”

     On another level, it was not only some employees who were surprised to learn of the personal relationship between Holmes and Balwani.  Carreyrou asks, “If Holmes wasn’t forthright with her board about her relationship with Balwani, then what else might she be keeping from it?”

     ● By what techniques and processes can individuals and companies, and their counsel, prevent themselves from succumbing to the “almost hypnotic” and “mesmerizing effect” of an Elizabeth Holmes’ “mixture of charm, intelligence, and charisma”?

     Holmes won the support of, among others, the associate dean of Stanford’s School of Engineering (one of her former teachers); venture capitalists; Oracle founder Lawrence Ellison; and board members who included not only Shultz and fellow former Secretary of State Henry Kissinger but also the former head of the U.S. Central Command, a former admiral, a former Secretary of Defense, a former CEO of Wells Fargo, and former Congressmen.

     Carreyrou observes that at an emergency meeting of Theranos’ board, convened after company insiders had reported misleading revenue projections and statements of technical progress, Holmes managed to talk the directors out of removing her as CEO.

      ● To what degree do personal, corporate, and technological interconnections compound, or clarify, complexities?

      Carreyrou, meeting with a source, notes that “As we drove around in her car, I was struck by how small and insular Palo Alto was,” and proceeds to detail the proximity of some of the people and companies involved.

      Holmes “was able to leverage the family connections of a wealthy Mexican student at Stanford” to obtain government authorization to use Theranos devices in Mexico; made her brother the associate director of Theranos’ product management; and later hired some of his friends.

      Of the company’s engineering efforts, Bad Blood notes, “When you packed that many instruments into a small, enclosed space, you introduced unanticipated variations in temperature that could interfere with the chemistry and throw off the performance of the overall system.”

    One component of the product, a cartridge in which the sample of blood was combined with chemicals, “was a complicated, interconnected system compressed into a small space.  One of [the company’s] engineers had an analogy for it: it was like a web of rubber bands.  Pulling on one would inevitably stretch several of the others.”

     ● How can individuals and companies resist the pressure to form agreements out of a Fear of Missing Out (FoMO)?  Carreyrou suggests that such concerns motivated both Walgreens (worried about rival CVS) and Safeway to partner with Theranos.

     ● How do potential whistleblowers document their concerns in the face of company counsel’s demands (excerpts from which are included) that they destroy or return, by a specified deadline, any proprietary information (including, in some cases, emails that employees had forwarded to their own personal accounts)?

     ● How and when, and under what conditions, should current and former employees contact, and/or cooperate with inquiries from, journalists? 

     ● At what point, if any, should counsel for a source, and counsel for the journalist’s organization, become involved?  (In addition, should the journalist ever retain her own personal lawyer in this connection?)

     ● What practices should the journalists and/or lawyers recommend, and what agreements should they adopt, to protect the identities of potential and actual sources?

     Carreyrou himself starts becoming a major part of Bad Blood only in “The Tip,” the nineteenth of the book’s twenty-four chapters.  One potential source “told me he would speak to me only if I promised to keep his identity confidential.  Theranos’s lawyers had been harassing him and he was certain the company would sue him if it found out he was talking with a reporter.  I agreed to grant him anonymity.”

     Another source agreed to talk, “but only off the record.  This was an important journalistic distinction: [other sources] had agreed to speak to me on deep background, which meant I could use what they told me while keeping their identities confidential.  Off the record meant I couldn’t make any use of the information.” 

     Tyler Shultz, rejoining Carreyrou a year after their first meeting (which Tyler had arranged after “calling me from a burner phone that couldn’t be traced back to him”), “didn’t want to discuss the subject in an open place within earshot of other people.” Once in a private setting, he says, “My lawyers forbade me from talking to you, but I can’t keep this bottled up anymore.”  (“I agreed to keep whatever he was about to tell me off the record and only to write about it in the future if he gave me his permission to do so.”)

     ● To what degree do potential sources know about these options, and how standard are they?  Does their meaning depend on the form of media?  The particular publication?  The particular journalist?

     ● How, if at all, are a source’s understandings with a journalist (including with whom, and under what circumstances, the journalist can reveal that source’s identity to one or more of her colleagues) clarified and documented? 

     ● If any form of promised confidentiality is compromised, how can a source demonstrate the terms of the original arrangement if he sues the journalist and her organization?

     ● Finally, under what circumstances should lawyers adopt or adapt for their own purposes one of the Wall Street Journal’s practices, “a cardinal rule called ‘No surprises’”?

      “We never went to press with a story without informing the story subject of every single piece of information we had gathered in our reporting and giving them ample time and opportunity to address and rebut everything.”


     [The previous essays in this series are here, here, here, here, here, here, here, here, and here.]

     Law and pre-law students, and lawyers, might find many valuable perspectives in Philip Delves Broughton’s detailed, critical, and sometimes-irreverent account, Ahead of the Curve: Two Years at Harvard Business School (2008). 

    Although Broughton’s professors appear to have focused on legal issues only sparingly, the book illuminates the education of a lawyer’s potential clients; provides instructive comparisons with, and contrasts to, traditional practices of legal education; and (directly contradicting a classic movie line) offers advice not only for the reader’s professional career but also for her personal life.

    In 1854, Thoreau famously wrote, “I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived.”

    One hundred and fifty years later, Broughton entered “HBS,” about fifteen miles away from Thoreau’s Walden Pond.  After a decade in journalism, he wanted “to learn about business in order to gain control of my own financial fate, and more important, my time,” and to pursue “greater knowledge about the workings of the world and broader choices about the life I might lead.”

    Without revealing the author’s post-graduation opportunities (although his series of job interviews is carefully described), it can be said that the book’s themes and “takeaways” include:

    ● The immersive nature of the business school experience, and its similarities to the process of learning a new language.

    ● Defining and refining one’s short-term and long-term purposes, and maintaining one’s individuality and ethics, while resisting peer pressure and the urge to compare oneself to others.  (Broughton includes a particularly entertaining discussion of the personality tests that he and his classmates were required to take.)

    ● The division, as in law schools, of entering classes into equal sections, the members of which take most, if not all, of their first-year classes together, and gradually develop their own classroom and extracurricular cultures.

    ● The degree to which much of the author’s HBS education came not in the classroom but from discussions—often about defining “selling out” or finding the appropriate work/life balance—with his diverse group of classmates, a number of whom had come to the school from other countries and/or had business or military experience.  (“I was developing a soft spot for the ex-military guys.  They were refreshingly sane.”)

    ● On a different level, the importance to some businesses of a local community of complementary (and in some cases competing) firms, such as in the economic ecosystems of Wall Street and Silicon Valley.  

    ● The business school’s “case studies,” which—though adapted from law schools’  Socratic discussions of court decisions—instead involve faculty-written descriptions, “from a couple of pages to more than thirty,” of “business situations drawn from real life.  The question you are expected to answer in each one is, What would you do?  There are no right or wrong answers to these problems. . . . The only thing that matters is how you think about the problems, how you deal with the paucity of information, the uncertainty.”

     ● The degree—extreme, by law school standards—to which class participation counts in the grading of first-year courses: “Fifty percent of our grade would be determined by. . .  the quality and frequency of our comments,” predictably resulting in a “battle for classroom air time.”

     ● The questions—increasingly being considered by law schools—of whether, and how, leadership, including motivational skills, can be systematically and objectively defined and taught.  (“The word leadership lurked in every corner of HBS.”)

     ● More generally, the extent to which complex concepts and systems can be reduced to formulas or diagrams.

     ● The clarification and codification of decision-making processes; and, the possible perils of prolonged profits-only thinking.  (“This was why people hated MBAs.  Too much cost-benefit analysis, too little humanity. . .  However technically correct an analysis might be, it could also be too rational.  Achieving the balance between reason and emotion would never cease to be a challenge.”)

     ● The crucial, but nonformulaic, role of personal reputation, chemistry, and connections in business success.  (“The financial system turns on personalities as much as anything else.”)

     (In one episode of AMC’s Mad Men, the head of a 1960s-era advertising agency dressed down a young account manager who had alienated a major client: “I don’t know if anyone’s ever told you that half the time, this business comes down to, ‘I don’t like that guy!’”  In another episode, that account manager memorably answered an academic’s condescending query, “What do you do every day?”)

     ● Student stress, including the ever-present FOMO (Fear of Missing Out).  Broughton reports that his class was repeatedly advised by HBS administrators that “You had to choose exactly what you wanted to do and do it without fretting about what else was going on.”

     ● The definition, and process, of valuation—of an asset, a business, an opportunity, and a person (including oneself).

     ● How to define, obtain, and maintain a competitive advantage, as a professional businessperson and for one’s business(es).

     ● The definition, measurement, and appropriate levels of uncertainty and of risk.

     ● The roles of, and requirements for, entrepreneurs.  (“The HBS definition of ‘entrepreneurship’ was ‘the relentless pursuit of opportunity beyond resources currently controlled.’”) 

     Broughton discusses the extracurricular lessons of his own attempt, with a classmate, to obtain venture capital funding to “create a website that would be a supermarket of audio content.”

     ● As recommended by one of the HBS faculty (and in my book on law school), gauging, or demonstrating, one’s mastery of a subject by one’s ability to explain it to someone with no background in the area.

     ● Deep focus and constant attention to details, and a relentless search for innovations.  Broughton identifies the “crux” of the Advanced Competitive Strategy course as, “Most ideas are pretty good ones.  It all comes down to execution, staying alert, paying attention.”  (Or, as poet Jane Hirshfield summarized Zen: “Everything changes.  Everything is connected.  Pay attention.”) 

     ● An early anecdote, presented by one of the faculty:

      “[A]n MBA student [was] arguing with someone in the HBS administration.  As tempers rose, the student burst out, ‘Why are you treating me like this?  I’m the customer, goddamnit.’  ‘No, you’re not,’ said the HBS employee.  ‘You’re the product.’”

      (The speaker concluded, to the class: “I guess you’re somewhere between. . . . Sometimes you’re the customer, other times you’ll feel like the product.”)

     ● Among the most fascinating features of Broughton’s book are his summaries of the variety of life advice (including, “return calls and e-mails in a timely way”) provided to his class by the HBS faculty and by notable (and named) businesspeople; the reflections of some of his classmates, a year after graduation; and his recommendations for improving the school.

     But perhaps the book’s single best line is the author’s response to his wife’s characterization of some of his classmates as “freaks”:

     “I know, I know. . . I’m just worried that if I stop recognizing the freaks, I’ll become one of them.”


     [The previous essays in this series are here, here, here, here, here, here, here, and here.]

    The Clayton M. Christensen Reader (2016) can be read in an afternoon, but its professional and personal lessons might serve well throughout one’s career.

     The book collects eleven Harvard Business Review articles written or co-written by the Harvard Business School professor between 1995 and 2015 on the evolution of industries, companies, markets, executives, and management theories.

     Christensen (1952-2020) is best remembered for his concept of “disruptive innovation.”  His book, The Innovator’s Dilemma (2013), was reported to have “deeply influenced” Apple’s Steve Jobs.

    Using that strategy, less-established competitors can supplant market leaders by developing products—like transistor radios, smaller-than-standard hard drives, and personal computers—that might “perform far worse along one or two dimensions that are particularly important to [mainstream] customers,” but that “introduce a very different package of attributes” valued “in new markets or new applications.”

    ● Thus, in “Disruptive Technologies” (1995), Christensen warned dominant companies to look beyond their valued customers and successful offerings. 

      Managers should themselves pursue potentially disruptive technologies—especially those opportunities identified by “technical personnel,” as opposed to “[m]arketing and financial managers” or “lead customers”—by “creat[ing] organizations [such as ‘skunk works projects’] that are completely independent from the mainstream business.” Executives should research potential markets “by experimenting rapidly, iteratively, and inexpensively with both the product and the market.”

      ● Twenty years later, in “What Is Disruptive Innovation?”, Christensen complained that his theory’s “core concepts have been widely misunderstood and. . . basic tenets frequently misapplied.  Furthermore, essential refinements. . . appear to have been overshadowed by the popularity of the initial formulation.” 

     In particular, he insisted that the term did not apply to every “situation in which an industry is shaken up and previously successful incumbents stumble,” but only to the displacement of established businesses by upstarts that at first serve lower-level customers (and sometimes, as with personal photocopiers, entirely new markets), “frequently at a lower price,” but “then move upmarket, delivering the performance that incumbents’ mainstream customers require, while preserving the advantages that drove their early success.”

     Thus, Uber did not qualify as a disruptive innovation, because it had neither exploited a low-end market that had been overlooked by San Francisco’s established taxi companies nor pursued “people who found the existing [ride-for-hire] alternatives so expensive or inconvenient that they took public transit or drove themselves instead.” 

     On the other hand, Netflix was a disrupter, once it moved from supplying DVDs by mail to “becom[ing] appealing to Blockbuster’s core customers, offering a wider selection of content with an all-you-can-watch, on-demand, low-price, high-quality, highly convenient approach. . . [F]ailing to respond effectively to the trajectory that Netflix was on led Blockbuster to collapse.”

     Christensen also resisted the loose use of “disruptive” to characterize any successful innovation.  Pointing to the large number of failed Web-based stores, he noted, “Not every disruptive path leads to a triumph, and not every triumphant newcomer follows a disruptive path.”

    He concluded by observing that even companies that, as he’d recommended, reevaluate their own operations won’t necessarily be immune to disruptive innovation: in the complex and fluid world of business, “Sometimes this [advice] works—and sometimes it doesn’t.”

    Indeed, his more general essay, “Why Hard-Nosed Business Executives Should Care About Management Theory” (2003), had already observed that, “in business, as in medicine, no single prescription cures all ills”; but added that “Progress comes from refining theories to explain situations in which they previously failed. . . .”

     Law students and lawyers should find much of value in considering the ways in which disruptive innovation (most recently, the changes wrought by ChatGPT and other “generative AI” technology) threatens—and provides opportunities to—not only their current and potential clients, but also their own careers and profession.

     ● Christensen’s most far-reaching and personal article, from 2010, bears the title of his 2012 book: “How Will You Measure Your Life?” 

     He wrote that he concluded each semester by asking his business school classes “to find cogent answers to three questions: First, how can I be sure that I’ll be happy in my career?  Second, how can I be sure that my relationships with my spouse and my family become an enduring source of happiness?  Third, how can I be sure I’ll stay out of jail?  Though the last question sounds lighthearted, it’s not.”

    Christensen cautioned that the temptation to deviate “just this once” from one’s principles “suckers you in, and you don’t ever look at where that path ultimately is headed and at the full costs that the choice entails. . . . You’ve got to define for yourself what you stand for and draw the line in a safe place.”

    This counsel corresponds with part of the “Ten-Step Program to Resist Unwanted Influences” that Stanford psychology professor Philip Zimbardo (1933- )—most memorably connected to the (in)famous Stanford Prison Experiment (1971) and to the broken window theory (1969)—provided in his book, The Lucifer Effect: Understanding How Good People Turn Evil (2007), and on the associated website, as “a starter kit toward building individual resistance and communal resilience against undesirable influences and illegitimate attempts at persuasion.”

     Zimbardo “discourage[s readers] from venial sins and small transgressions, such as cheating, lying, gossiping, spreading rumors, laughing at racist or sexist jokes, teasing, and bullying.  They can become stepping-stones to more serious falls from grace.”

     Both works, in their own ways, present personal morality as something that, like a company’s products, should be carefully researched and developed. 

    Like Christensen’s theory, one’s morality should be defined as clearly as possible, although it might not yet (or ever) be comprehensive.

    But one’s moral line, unlike a line of products, should not be self-disrupted or compromised to head off competitors, and should not be subject to “pivots” or “iterations” induced only by the pressure of peers, employers, or the market.

     Otherwise, as Groucho Marx joked more than eighty years ago, “Those are my principles.  If you don’t like them, I have others.”


     [The previous essays in this series are here, here, here, here, here, here, and here.]

     Law students and lawyers, deluged daily with data, details, and deadlines, are often inclined towards prescriptions and practices for personal and professional productivity.

     David Allen’s classic Getting Things Done: The Art of Stress-Free Productivity (2001), which spawned numerous “GTD”-related websites, begins from the premise that  “Most people walk around with their [short-term memory] bursting at the seams.  They’re constantly distracted, their focus disturbed by their own internal mental overload.”

      Allen identifies, diagrams, and discusses sequentially “five separate stages” of working, which he recommends implementing at least weekly:

      “We (1) collect things that command our attention; (2) process what they mean and what to do about them; and (3) organize the results, which we (4) review as options for what we (5) do.”

      GTD’s reassuring system and structures might not contain many revelations, but could well repay regular rereading.  Allen’s principles remain relevant, and could easily be adopted and/or adapted, in an increasingly digital environment.

      Among Allen’s advice:

     ● “It’s. . . just a great habit to date everything you hand-write. . . The 3 percent of the time that this little piece of information will be extremely useful makes it worth developing the habit.”

     ● “Whenever you come across something [in hard copy that] you want to keep, make a label for it, put it in a file folder, and tuck that into your filing drawer. . . . [I]f you can’t get it into your system immediately, you’re probably not ever going to.”

    ● When processing an item, determine the “next action” related to it. “If the next action can be done in two minutes or less, do it when you first pick the item up. . . . If you have a long open window of time in which to process your in-basket, you can extend the cutoff for each item to five or ten minutes.”

     Getting Things Done frequently features lists (including examples and enumerations of categories of other listable items): current projects; next actions; “someday/maybe” initiatives; action reminders and “incompletion triggers”; elements of a “Weekly Review”; helpful office supplies; physical areas to organize; and, six levels of perspective on one’s work (Current actions; Current projects; Areas of responsibility; One- to two-year goals; Three- to five-year visions; and, Life).

     For a specifically hard-copy-based system of productivity, many have turned to Ryder Carroll’s The Bullet Journal Method (2018), presented as “a way to stem the tide of digital distractions [and] an analog solution that provides the offline space needed to process, to think, and to focus.” 

     Like Allen (who offers detailed instructions for “corralling your ‘stuff’” and then conducting a “mind-sweep”), Carroll invites readers to take an initial inventory (of “all the things you are presently working on”; “all the things you should be working on”; and, “the things you want to be working on”).

     Not everyone might employ his various “signifiers and custom bullets” (symbols such as dashes, < , > , and ◦) to designate and distinguish Tasks, Events, Notes, and other information entered.

     Moreover, the “journal” aspect of Carroll’s method is extremely abbreviated, because “Not having to articulate the complexity of an experience makes it much more likely for us to write it down.  That’s the most important part: to have a record.” (One example: “Signed the lease.  Yay!”)  Similarly short are sample “notes” about aspects of pending, or completed, events.

     The full Bullet Journal system, which some might consider overly elaborate, includes four core “Collections”: the Daily Log (compiled as the day progresses; as with Allen’s approach, “The idea is to be consistently unburdening your mind”); the Monthly Log (prepared before the month begins, and added to during the month); the Future Log (which “stores entries that have specific dates that fall outside of current month”); and the Index, which contains lists of previous page references, organized by topic.

    Whatever one’s preferred mechanics of time management, the expansive—and often counterintuitive—philosophy of Oliver Burkeman’s Four Thousand Weeks: Time Management for Mortals (2021) should be a sobering, but also refreshing, companion.

    Burkeman’s title refers to the human lifespan; and his introduction, “In the Long Run, We’re All Dead,” summarizes the book’s theme:

     “Productivity is a trap.  Becoming more efficient just makes you more rushed, and trying to clear the decks simply makes them fill up again faster.  Nobody in the history of humanity has ever achieved ‘work-life balance,’ whatever that might be, and you certainly won’t get there by copying the ‘six things successful people do before 7:00 a.m.’”

     Burkeman counsels readers to acknowledge their limitations; to disregard FOMO (the Fear Of Missing Out), since no one can travel every road; and to resist the temptation to keep one’s options endlessly open (instead, “deliberately mak[e] big, daunting, irreversible commitments, which you can’t know in advance will turn out for the best, but which reliably prove more fulfilling in the end”).

     He recommends that readers pursue hobbies (“it’s fine, and perhaps preferable, to be mediocre at them”); “develop a taste for having problems” rather than fantasizing about a friction-free future; patiently proceed through “the trial-and-error phase of copying others, learning new skills, and accumulating experience”; work steadily and incrementally on large projects; and, to “strengthen the muscle of patience” and sustain long-term productivity, “be willing to stop when your daily [scheduled] time is up, even when you’re bursting with energy and feel as though you could get much more done.”

    Burkeman’s closing advice is to “Practice doing nothing,” even for only a few minutes at a time, which will help you “begin to regain your autonomy—to stop being motivated by the attempt to evade how reality feels here and now, to calm down, and to make better choices with your brief allotment of life.”

     In those meditative moments, one might recall the (translated) poem of Zen-influenced haiku master Matsuo Basho (1644-1694):

                        Sitting quietly, doing nothing

                        Spring comes

                        And the grass grows by itself.


     [The previous essays in this series are here, here, here, here, here, and here.]

     In his book Making Movies (1995), director Sidney Lumet summarizes the theme of his first movie, Twelve Angry Men (1957), simply as “Listen.”

     The script was written by Reginald Rose, a lawyer’s son inspired by his own jury service (in a manslaughter case in New York City), and by his opposition to McCarthyism.  A shorter form of Rose’s drama had been televised live in 1954 on Studio One; and a version had been performed as a stage play the following year.

    Sequestered in a small and sweltering room as a storm sweeps in, twelve men—portrayed by actors including Henry Fonda (who co-produced the movie with Rose), Lee J. Cobb, E.G. Marshall, Martin Balsam, Ed Begley, and Jack Klugman—attempt to reach a verdict in the case of a young man accused of having stabbed his father. 

     They have been instructed that a guilty or a not-guilty verdict on the single charge, murder in the first degree (i.e., premeditated murder), must be unanimous; and that a verdict of guilt will result in the defendant’s execution.  (The judge and the defendant appear briefly at the beginning of the movie, but in the stage script neither is visible to the audience.)

     To accentuate the jurors’ close quarters, and to heighten the increasing sense of confinement during the ninety real-time minutes of their interaction, Lumet “slowly shift[ed] to longer lenses,” and “shot the first third of the movie above eye level, shot the second third at eye level, and the last third from below eye level.”

     Without including spoilers, it can be said that the drama’s themes include:

     ● The degree to which each juror has in fact listened to the judge’s instructions, to the lawyers, to the witnesses, and, perhaps most importantly, to his colleagues.

     ● The ways in which reasonable people, acting in good faith, can disagree.

     ● Peer pressure and “groupthink”; and the courage, caution, and possible heroism of someone who questions and/or opposes the majority’s opinion.

     ● Depersonalization.  Seated at the jury table in the order of their juror numbers, the men don’t know each other’s names, but occasionally refer to one another as “that gentleman.”

     ● Individuality and Diversity.  Each juror figuratively (and, in one show-stopping moment, literally) brings something to the table, as their varying careers, experiences, prejudices, sympathies, and relative maturity—which are gradually, and sometimes inadvertently, revealed—inform their deliberations.

     ● The respect, or disrespect, shown by the various jurors to each other, to the defendant and to his (unspecified) demographic group, and to the jury process itself.  (One juror mocks another: “What are you being so polite about?”  The response: “For the same reason you’re not—it’s the way I was brought up.”) 

     Justice Oliver Wendell Holmes, Jr. reportedly described the Supreme Court as “nine scorpions in a bottle.”  The dispositions, distractions, engagement, and self-restraint of the dozen jurors vary widely, and change significantly over the course of their work.

     ● Individual jurors’ commitment to continuing deliberations, which are not subject to any deadline or other time constraints, rather than reporting to the judge that the group cannot reach a unanimous verdict.

      ● The willingness and responsibility of jurors to explain their own positions when questioned or challenged, to carefully evaluate the arguments of others, and to vote in good faith rather than for the sake of expediency.

      ● The men’s assessment of the incomplete information presented at trial (whose details they occasionally, sometimes after consulting notes, recall differently), and of the witnesses’ appearance, character, and credibility. 

     ● The deductive prowess of different jurors, and the degree to which a juror can and should independently investigate issues and collect evidence. (The second issue, not presented as particularly controversial or rule-breaking in the movie, is certainly addressed by current jury instructions, especially with regard to jurors’ conducting their own online research.)

     ● The physical evidence requested by the jurors through the court officer, and its relevance to their reconstruction of events.

     ● The all-important question of whether the prosecution has met its burden of proof by demonstrating the defendant’s guilt beyond a “reasonable doubt” (a term that is never detailed or defined in the jurors’ discussions).

     ● The inherent imprecision of the jury’s considerations of possibilities and probabilities:

     -In this context, are objectivity and certainty themselves suspect?  One juror fumes, “I’m sick and tired of ‘the facts.’  You can twist them any way you want!” At another point, recalling one part of the trial presentation, another juror states, “I began to get a peculiar feeling—I mean, nothing is that positive!”       

     -Is it ever appropriate to answer another juror’s question with, “I don’t know”?

     -Might a verdict of guilt condemn to death a man who’s actually innocent?  Alternatively, could a not-guilty verdict free a murderer, who might well kill again?  And (although this is not specifically the jury’s concern), if the defendant didn’t murder the victim, who did, and why?

     ● The jurors’ reflections on the possible unprofessionalism of the prosecutor and of defense counsel (though not of the judge)—and on the degree to which those failings complicate the jury’s job.

     ● The direct and indirect methods of persuasion and leadership exercised by the foreman and the other jurors—including appeals to reason, emotions, and biases—and the sometimes-delicate dances of dominance, deference, and decorum within various permutations of the jurors, and among the group as a whole.

     Sixty-six years ago, a few days before the release of the movie, Rose wrote in The New York Times, “Much of the intricate business of living is singled out and held up for scrutiny in this jury room, I feel.  For instance, the alliances formed for purely intellectual reasons and those formed for emotional reasons alone remind us of larger and more important alliances that we can see at every turn in our newspapers, locally, nationally, internationally.”

     ● The jurors’ reaching agreement on the internal governance of their deliberations, such as their seating, choice of foreman, order and form of discussions, and moments and methods of voting.

     ● The differing roles of voting by secret ballot, by voice, and by a show of hands.

     ● The democratic nature of jury selection and deliberations.

     In 2010, Justice Sonia Sotomayor, invited to select a movie to be screened at the Fordham University School of Law’s Film Festival, chose Twelve Angry Men.

     Justice Sotomayor told the audience that, around the time she began college, she had been impressed by the way in which one juror, presented as having emigrated to the United States, championed the jury system: “It sold me that I was on the right path. . . . This movie continued to ring the chords within me.”

     However, she also noted that earlier in her career, she had cautioned jurors that the movie’s depiction of deliberations was unrealistic: “There was an awful lot of speculation.”

     Justice Sotomayor added that, as reflected by the jurors’ discussions, both the prosecution and the defense had “failed in their duties.”

     According to Joan Biskupic’s Breaking In: The Rise of Sonia Sotomayor and the Politics of Justice (2014), the Justice said, recalling scathing references by one of the movie’s jurors to “’those people,’” “You have to flinch. . . Those [remarks] are personal.  They were personal when I saw it the first time.  I had heard about ‘those people’ in my life so often.”

     The most clear-headed of Reginald Rose’s characters coolly responded to that juror’s eruption of bigotry (the script’s expanded version of which is even more extreme and toxic): “It’s always difficult to keep personal prejudice out of this.  And wherever you run into it, prejudice obscures the truth.”

     So it could be said that Rose’s writing, and Lumet’s staging and photography, of Twelve Angry Men, and of that scene in particular, also dramatically depict what should, if heard, not be listened to.


     [The previous essays in this series are here, here, here, here, and here.]

     Not only students of law and of history, but anyone interested in writing and editing, might derive at least eight valuable lessons—applied even more easily in this digital era—from MIT historian Pauline Maier’s deeply-documented American Scripture: Making the Declaration of Independence (1997).

    ● First, keep careful contemporaneous records of drafts and revisions. 

    At the beginning of the core of her book—its third chapter, “Mr. Jefferson and His Editors”—Maier mentions the fragmentary and conflicting records of the creation of the initial version of the Declaration, by Thomas Jefferson (with limited assistance from fellow attorneys John Adams, Roger Sherman, and Robert R. Livingston, and also from Benjamin Franklin), between June 11 and June 28, 1776.

     That committee “left no minutes of its proceedings, and the account of its work written nearest the event, . . . Jefferson’s ‘Notes of Proceedings in the [Second] Continental Congress,’ is succinct to a fault.”

     ● Second, when possible, produce a first draft quickly.  Jefferson, like his colleagues on the committee, had many other Congressional responsibilities; but, according to Adams’s account, completed the draft in only one or two days.

     ● Third, draw on previous materials, including your own relevant writings. 

     Jefferson made use of “the draft preamble for the Virginia constitution that he had just finished and which was itself based upon the English Declaration of Rights”; of “a set of draft instructions for Virginia’s delegates to the First Continental Congress” that he had unsuccessfully proposed, on his own initiative, in 1774, and that had subsequently been published by his friends; and of George Mason’s “preliminary version of the Virginia Declaration of Rights.” 

     (Maier includes a helpful one-page “Family Tree” of various documents that influenced, or were influenced by, the Declaration.)

     ● Fourth, especially for those not so extraordinarily well-read as Jefferson, and/or without a literary and historical memory as capacious as his appears to have been, maintain a library (possibly online, or on a hard drive or flash drive) of relevant documents, for ready access.  

     Jefferson referred in his diary to having, with colleagues in Virginia’s House of Burgesses in 1774, “rummaged over” a collection of Puritan-era “revolutionary precedents and forms” in the library of that body’s council chamber, after which they “cooked up a resolution, somewhat modernizing [those] phrases,” in response to the imposition of Parliament’s Boston Port Act.  However, Maier indicates that there is no record of Jefferson’s having searched for written sources in Philadelphia in 1776.

    ● Fifth, consider using a preface, as Jefferson most memorably did, to introduce and set the tone of your document. 

     Maier discusses how the Declaration announced and justified Independence to the American people, including the soldiers who would be fighting for it, and how Congress deliberately had the document disseminated “not only to the state assemblies, congresses, and conventions that were its immediate constituents and to their Committees of Safety, but to the commanders of the Continental Army, [directing] that it be proclaimed not only in all the states, but at the head of the army.”  It was also widely printed in colonial newspapers, and separately circulated in the form of broadsides.

     ● Sixth, read your writing aloud, even if you don’t anticipate that it will be publicly proclaimed.  (Jefferson, anticipating exactly that, marked the Declaration’s text to show where those reading it to assemblies should pause; the earliest printed copies of the final document included those marks.)

      “Such attention to the cadences of language was natural for Jefferson, a committed violinist fascinated with music.  He had also studied classical oratory and rhetorical theory. . . .”

     ● Seventh, remember that in preparing (at least some) documents, “Less is more.”

     Jefferson, in his draft, abbreviated some of Mason’s writings.  In turn, the Second Continental Congress, “sitting as the Committee of the Whole, . . . [in] an act of group editing that has to be one of the great marvels of history,” trimmed elements of (particularly the second half of) Jefferson’s work, refining his “overlong attack on the British people to a more lean and constrained statement.”

     ● Eighth, try not to take revisions of your work, especially as part of a collaborative effort, personally. 

      In another illustration of the preceding principle, Maier mentions Jefferson’s account of how Franklin “perceived that I was not insensible to these mutilations,” and tried to cheer him up. 

     Franklin told Jefferson a story about a young hatter who, before opening his own store, asked friends for their advice on his proposed sign: “John Thompson, Hatter, makes and sells hats for ready money,” accompanied by an image of a hat. 

     They helpfully eliminated the redundancies one by one, until all that remained was his name, and the picture of the hat.

     (A possible ninth lesson: Franklin advised Jefferson that one moral of that story was to resist, “whenever in my power. . . becoming the draughtsman of papers to be reviewed by a public body.”)

     Two days before the Declaration of Independence was formally approved by the delegates—and signed by John Hancock and Charles Thomson (as the President and the Secretary, respectively, of the Second Continental Congress)—Congress voted for independence from England.

     One of his biographers reports that John Adams wrote to his wife Abigail:

     “The second day of July 1776 will be the most memorable epocha in the history of America.  I am apt to believe that it will be celebrated by succeeding generations as the great anniversary festival.  It ought to be commemorated as the Day of Deliverance by solemn acts of devotion to God Almighty.  It ought to be solemnized with pomp and parade, with shows, games, sports, guns, bells, bonfires, and illuminations from one end of this continent to the other from this time forward forever more.”

     It could well be quibbling to question whether “solemnize” applies to all of the activities in Adams’s last sentence, and to note that Independence Day is celebrated on July 4th, rather than July 2nd.

      But reading closely, or even reviewing quickly, the style and the substance of American Scripture’s seven-page, redlined Appendix C—“The Declaration of Independence: The Jefferson Draft with Congress’s Editorial Changes”—might, for many, be both humbling and inspiring.


     [The previous essays in this series are here, here, here, and here.]

     Anyone in search of a well-written, informative, friendly, and even entertaining overview of American legal history might appreciate one of Stanford law professor Lawrence M. Friedman’s three different treatments of the topic.

     Law in America: A Short History (2002) is the shortest (192 pages, not counting the index).

    A History of American Law (4th ed. 2019) is the most comprehensive and the longest (at 796 pages).

    But the most rewarding and engaging choice for law and pre-law students could be American Law in the 20th Century (2002) (607 pages). 

     Its major themes are “the rise of the welfare-regulatory state”; the increased use of judicial review; the tensions between federal and state laws and courts; “the shift of power and authority to Washington—to the national government”; the enormous growth in “the size and scale of the legal system” (including in the number, and the diversity, of lawyers); and the dramatic rise in cases concerning products liability, civil rights, and immigration and citizenship. 

     As the book illustrates, law not only permeates American society, but “is a product of society.”  In particular, “The immediate source of law is not social change but what we can call legal culture.  By this I mean people’s ideas, attitudes, values, and expectations with regard to law,” themselves “a complicated, existing system, a system of crosscurrents and interrelations, a web of values and norms.”

      The longest of Friedman’s three histories compares law to “the reflection of a face in a slowly moving river, that is, somewhat refracted and distorted”; the briefest concludes that “Law, in short, is a mirror held up against life.”

     Indeed, each of these books moves beyond courts, commissions, and agencies to illuminate the interdependence and interplay of legal developments with those in global, national and local politics, economics, and demographics; in technology (especially in transportation and communication); and in the public’s perspectives and moods (including popular reactions to rulings and regulations).

     About one-quarter of the main text of American Law in the 20th Century concerns “The Old Order” (from the late 19th century through 1932); the next two-thirds examines the history and implications of “The New Deal and Its Successors” (1933-1979); and the concluding 85 pages summarize “The Way We Live Now: The Reagan and Post-Reagan Years.”

     Particularly useful to students will be the book’s introductions to, and contextualization of, such traditional course areas as contracts and corporate law (Chapters 3 and 12), torts (Chapter 11), criminal law (Chapters 4 and 8), property (Chapter 13), family law (Chapter 14), and constitutional law (throughout).  It includes detailed discussions of legal issues of racial and gender equity (Chapters 5 and 10); as well as “freedom of speech and what we might call political justice,” especially during the era of McCarthyism and the Cold War (Chapter 10).

      Friedman reviews a number of decisions traditionally included in first-year casebooks, and begins one chapter by warning, “Civil procedure is the ugly duckling of law.  It is a field only a lawyer can love; and even most lawyers find loving it a struggle.”  

     His survey of the superstructure and infrastructure of American law captures complexities, rejects casual simplifications, and focuses at times on the history-shaping roles of individual lawyers and litigants.

     The book begins with the notorious 1905 decision of Lochner v. New York, in which the Supreme Court struck down as unconstitutional a state statute that restricted the number of hours that bakery employees could work each week and each day.

     However, Friedman quickly clarifies that in “much more obscure” decisions, the Court actually upheld state legislation, including several laws on workers’ rights.  He concludes that, in the early part of the century, the Justices “were not so much reactionary, as soundly upper middle class”; and that they, “and judges in general, were cautious and incremental.  They did not consistently adhere to any economic philosophy,” although judges “were much more likely to sympathize with professional people, and skilled artisans, than with laborers and unskilled workers, and their unions.”

     After devoting a chapter to the century’s changes in the composition, education, and (bar) organizations of lawyers, Friedman chronicles the rise of the “uniform laws” movement, and changes in bankruptcy, corporate, antitrust, tax, and labor law. 

     “On the whole, regulation that appealed to the middle class was much more likely to get enacted than programs pushed by the labor unions.”  In particular, the 1906 federal Food and Drug Act was sparked in part by the deeply disturbing (and disgusting) depictions of meatpacking plants in Upton Sinclair’s The Jungle, published that year.

     “In an age of mass media, and mass communication, the role of scandal and incident in lawmaking was bound to multiply”—if not always predictably.  Sinclair, who had intended his novel as a sweeping indictment of capitalism, concluded, “I aimed at the public’s heart, and by accident I hit it in the stomach.”

     In 1935, the Supreme Court’s Schechter Poultry Corp. v. United States decision struck down the National Industrial Recovery Act as an unconstitutional Congressional intrusion upon commerce.  Yet only a few years later (after the failure of President Roosevelt’s “court-packing” plan to enlarge the number of Justices), the Court “renounced its economic activism [and] gave its stamp of approval to all the programs of the New Deal.  It simply abandoned the whole line of thought that Lochner v. New York epitomized.”

     Friedman contends that the New Deal, which included the creation of the Securities and Exchange Commission (1934) (among other agencies), as well as the passage of the Social Security Act of 1935, “was not. . . a total revolution. . . ; but it was a dramatic quickening, a ratcheting upward. . . ,” spurred not just by the Great Depression but by such developments as radio, newsreels, and movies “that pulled national attention away from the neighborhoods and into Washington, D.C.” 

     He discusses also the “second wave,” under the “War on Poverty” and “Great Society” initiatives of the Johnson Administration in the 1960s; and a “third wave,” devoted to protecting the environment, and also “social justice, rights, health and safety, [and] style of life.”

     Friedman devotes a later chapter to the changing demographics and dynamics of the legal profession, and of legal education.  Students, especially, might appreciate his brief discussions and assessments of the emergence (in the 1980s) of the critical legal studies (CLS), critical race theory, and “law and economics” movements, and the increasing tendency of legal scholarship to cite non-law sources (including “everything in the current intellectual circus world from diamond-hard to mushy-soft”).  He reports that “In 1900 law reviews were much concerned with expounding the law.  In 2000 they were most concerned with criticizing it—or suggesting changes.”

     Friedman observes that “competition between [big law] firms became more intense in the last part of the twentieth century. . . . They were more ruthless in pruning out deadwood, even when the deadwood held a partnership interest.  A partner’s income depended on how much business he brought in: as the phrase went, you ‘eat what you kill.’ . . . [These] firms now began to hunt business more actively, prowling about the business world like leopards on the plains.  They made presentations to clients like advertising agencies, they concerned themselves with public relations.” 

     The business of practicing law also changed dramatically for smaller firms.  In 1975, the Supreme Court prevented state bars from forcing their members to adhere to minimum fee schedules (thereby accelerating the shift to charging clients by the number of “billable hours” expended on their matters).  Two years later, the Court allowed attorneys to advertise their services.

     Yet the profession was already losing some of its mystique and its popular respect, which some commentators considered to have peaked in the 1950s and 1960s.  Friedman notes the rise of (anti-)lawyer jokes, and the reportage of American Lawyer and National Law Journal, which were “breezy, gossipy, full of inside dope and human interest.”

     As an abbreviated substitute for parts of, or as an adjunct to, American Law in the 20th Century, one might read Law in America: A Short History.

     Its initial chapter introduces and explains federalism, judicial review, and the basic differences between common law and civil law systems (while observing that the roles of their judges have tended to converge); the second chapter summarizes “American Law in the Colonial Period”; and the third reviews “Economy and Law in the Nineteenth Century.” 

     Of the remaining two-thirds of Law in America, parts of the fourth chapter (“Family, Race, and the Law”), fifth chapter (“Crime and Punishment in the Republic”), and much of the sixth (“The Twentieth Century and the Modern Administrative-Welfare State”) and seventh (“American Law at the Dawn of the Twenty-First Century”) chapters recap some of Friedman’s discussions in the 20th Century volume.

     Although the author’s extensive A History of American Law (4th ed. 2019) covers the subject from colonial times onwards, approximately 82 percent of its almost 800 pages are devoted to pre-20th century developments.  In its preface, Friedman notes, “I have expanded somewhat the treatment of the twentieth century, but it is still, I must confess, something of a poor relation” to the book’s examination of earlier eras.

     On a broader level, one might compare the evolution of American law, and of its structures and strictures, to those of the subjects of The Whole Earth Catalog creator Stewart Brand’s How Buildings Learn: What Happens After They’re Built (1994).

     Chronicling the types of transformations in the forms and functions of particular buildings, Brand (who, according to a biography, wrote in his 1986 journal, with regard to the so-called KISS principle, “Keep it simple stupid, is a good way to keep it stupid”) observes that “Buildings keep being pushed around by three irresistible forces—technology, money, and fashion.” 

     He concludes, “an adapted state is not an end state.  A successful building has to be periodically challenged and refreshed. . . . The scaffolding was never taken completely down around Europe’s medieval cathedrals because that would imply that they were finished and perfect, and that would be an insult to God.”