Alex Karp and co-author Nicholas Zamiska are unafraid to offend those among the technocratic elite who have drifted away from vital national questions to instead develop a smug and complacent focus on shopping websites, photo-sharing apps, and other shallow but wildly lucrative endeavors. To them, there is no point in fighting over who gets the most luxurious stateroom on the Titanic. Without a renewed commitment to addressing the most existential national threats we face, serious risks to our country will continue to grow, rendering mere business success quite hollow.
You will never touch the hearts of others if it does not emerge from your own.
The power to hurt is bargaining power. To exploit it is diplomacy: vicious diplomacy, but diplomacy.
A moment of reckoning has arrived for the West. The loss of national ambition and interest in the potential of science and technology, and the resulting decline of government innovation across sectors, from medicine to space travel to military software, have created an innovation gap. The state has retreated from the pursuit of the kind of large-scale breakthroughs that gave rise to the atomic bomb and the internet, ceding the challenge of developing the next wave of pathbreaking technologies to the private sector, a remarkable and near-total placement of faith in the market. Silicon Valley, meanwhile, turned inward, focusing its energy on narrow consumer products rather than on projects that speak to and address our greater security and welfare.
The grandiose rallying cry of a generation of founders in Silicon Valley was simply to build. Few asked what needed to be built, and why. For decades, we have taken the technology industry’s focus on, and in many cases obsession with, consumer culture for granted, hardly questioning the direction, and we think misdirection, of capital and talent toward the trivial and ephemeral. Much of what passes for innovation today, of what attracts enormous amounts of talent and funding, will be forgotten before the decade is out.
This early dependence of Silicon Valley on the nation-state and indeed the U.S. military has for the most part been forgotten, written out of the region’s history as an inconvenient and dissonant fact, one that clashes with the Valley’s conception of itself as indebted only to its capacity to innovate.
The term “scientist” itself was only coined in 1834, to describe Mary Somerville, a Scottish astronomer and mathematician; prior to that, the blending of pursuits across physics and the humanities, for instance, was so commonplace and natural that a more specialized word had not been needed. Many had little regard for the boundary lines between disciplines, moving across areas of study as seemingly unrelated as linguistics and chemistry, or zoology and physics.
The country’s ability to reliably deliver economic and scientific advances for the public, from medical breakthroughs to military capabilities, was essential to its credibility. As Jürgen Habermas has suggested, a failure by leaders to deliver on implied or explicit promises to the public has the potential to provoke a crisis of legitimacy for a government. When emerging technologies that give rise to wealth do not advance the broader public interest, trouble often follows. Put differently, the decadence of a culture or civilization, and indeed its ruling class, will be forgiven only if that culture is capable of delivering economic growth and security for the public.
Many of these engineers have never encountered someone who has served in the military. They exist in a cultural space that enjoys the protection of the American security umbrella, but they are responsible for none of its costs.
This generation knew what it opposed, what it stood against and could not condone, but not what it was for. The earliest technologists who built the personal computer, the graphical user interface, and the mouse, for example, had grown skeptical of advancing the aims of a nation that, many of them believed, did not deserve their allegiance.
“Scientists aren’t responsible for the facts that are in nature. It’s their job to find the facts. There’s no sin connected with it—no morals.” The scientist, in this frame, is not immoral but rather amoral, existing outside or perhaps before the point of moral inquiry. It is a view still held by many young engineers across Silicon Valley today. A generation of programmers remains ready to dedicate its working life to sating the needs of capitalist culture, and to enriching itself, but declines to ask more fundamental questions about what ought to be built and for what purpose.
Nobody’s sense of self, or at least not ours, turns on the ability to find the square root of a number with twelve digits to fourteen decimal places. We were, as a species, content to outsource this work—the mechanical drudgery of mathematics and physics—to the machine. And we didn’t mind. But now the machine has begun to encroach on domains of our intellectual lives that many had thought were essentially immune from competition with computing intelligence.
What does it mean for humanity when AI becomes capable of writing a novel that becomes a bestseller, moving millions? Or of making us laugh out loud? Or of painting a portrait that endures for decades? Or of directing and producing a film that captures the hearts of festival critics? Is the beauty or truth expressed in such works any less powerful or authentic merely because they sprang from the mind of a machine?
Rather than resist, we might see this next era as one of collaboration between two species of intelligence, our own and the synthetic. The relinquishment of control over certain creative endeavors may even relieve us of the need to define our worth and sense of self in this world solely through production and output.
The response that we too are primitive computational machines, with training phases in early childhood, ingesting material throughout our lives, is perhaps unconvincing, or rather unwelcome, to such skeptics.
We might be wary, however, of a certain chauvinism that privileges the experience and capacity of the human mind above all else. Our instinct may be to cling to poorly defined and fundamentally loose conceptions of originality and authenticity in order to defend our place in the creative universe. And the machine may, in the end, simply decline to yield in its continued development as we, its creator, debate the extent of its capabilities.
Our attention should instead be more urgently directed at building the technical architecture and regulatory framework that would create moats and guardrails around the ability of AI programs to autonomously integrate with other systems, such as electrical grids, defense and intelligence networks, and our air traffic control infrastructure. If these technologies are to exist alongside us over the long term, it will be essential to rapidly construct systems that allow more seamless collaboration between human operators and their algorithmic counterparts, but also to ensure that the machine remains subordinate to its creator.
While it is currently fashionable to claim that the strength of our ideas and ideals in the West will inevitably lead to triumph over our adversaries, there are times when resistance, even armed resistance, must precede discourse. Our entire defense establishment and military procurement complex were built to supply soldiers for a type of war, on grand battlefields and with clashes of masses of humans, that may never again be fought. This next era of conflict will be won or lost with software. One age of deterrence, the atomic age, is ending, and a new era of deterrence built on AI is set to begin. The risk, however, is that we think we have already won.
The ability of free and democratic societies to prevail requires something more than moral appeal. It requires hard power, and hard power in this century will be built on software.
“To be coercive, violence has to be anticipated. The power to hurt is bargaining power. To exploit it is diplomacy: vicious diplomacy, but diplomacy.” The virtue of Schelling’s version of realism was its unsentimental disentanglement of the moral from the strategic. As he made clear, “War is always a bargaining process.”
We have seen firsthand the reluctance of young engineers to build the digital equivalent of weapons systems. For some of them, the order of society and the relative safety and comfort in which they live are the inevitable consequence of the justice of the American project, not the result of a concerted and intricate effort to defend a nation and its interests. In this view, such safety and comfort were not fought for or won. For many, the security that we enjoy is a background fact or feature of existence so foundational that it merits no explanation. These engineers inhabit a world without trade-offs, ideological or economic.
Nobel confided in a letter to a friend that more capable weapons, not less, would be the best guarantors of peace. “The only thing that will ever prevent nations from beginning war is terror,” he wrote.
“We hated what we were doing,” a U.S. airman who flew in one of the B-29 bombers over Tokyo in March 1945 later recalled in an interview. “But we thought we had to do it. We thought that raid might cause the Japanese to surrender.”
The culture almost snickers at Musk’s interest in grand narrative, as if billionaires ought to simply stay in their lane of enriching themselves and perhaps providing occasional fodder for celebrity gossip columns. A profile of Musk in the New Yorker published in 2023 suggested that the world would be better off with fewer “mega-rich luxury planet builders,” decrying his “seeming estrangement from humanity itself.”
The American foreign policy establishment has repeatedly miscalculated when dealing with China, Russia, and others, believing that the promise of economic integration alone would be sufficient to undercut their leadership’s support at home and diminish their interest in military escalations abroad. The failure of the Davos consensus, the reigning approach to international relations, was to abandon the stick in favor of the carrot alone. Anne Applebaum rightly reminds us that a “natural liberal world order” does not exist, despite our most fervent aspirations, and that “there are no rules without someone to enforce them.”
The unrelenting scrutiny to which contemporary public figures are now subjected has also had the counterproductive effect of dramatically reducing the ranks of individuals interested in venturing into politics and adjacent domains. Advocates of our current system of ruthless exposure of the private lives of often marginally public figures make the case that transparency, one of those words that has nearly become meaningless from overuse, is our best defense against the abuse of power. But few seem interested in the very real and often perverse incentives, and disincentives, we have constructed for those engaging in public life.
The expectations of disclosure have increased steadily for more than half a century and have brought essential information to the voting public. They have also contorted our relationship with our elected officials and other leaders, requiring an intimacy that is not always related to assessing their ability to deliver outcomes. Americans, in particular, “have overmoralized public office,” as an editorial in Time magazine warned decades ago in 1969, and “tend to equate public greatness with private goodness.”
We think we want and need to know our leaders. But what about results? The likability of our elected leaders is essentially a modern preoccupation and has become a national obsession, yet at what cost?
In that moment, the country was for the first time introduced to a new and striking level of granularity in the disclosures that it required from its politicians, and perhaps to the beginning of a decline in the quality of those willing to come forward and submit to the spectacle. Nixon’s wife reportedly asked him, affecting a certain naïveté, faux or otherwise, “Why do you have to tell people how little we have and how much we owe?” Her husband replied that politicians were destined to “live in a goldfish bowl.” But the systematic elimination of private spaces, even for our public figures, has consequences, and ultimately further incentivizes only those given to theatrics, those who crave a stage, to run for office. The candidates who remain willing to subject themselves to the glare of public service are, of course, often interested more in the power of the platform, with its celebrity and potential to be monetized in other ways, than in the actual work of government.
An entire generation of executives and entrepreneurs that came of age in recent decades was essentially robbed of an opportunity to form actual views about the world, both descriptive (what it is) and normative (what it should be), leaving us with a managerial class whose principal purpose often seems to be little more than ensuring its own survival and re-creation.
The problem is that those who say nothing wrong often say nothing much at all. An overly timid engagement with the debates of our time will rob one of the ferocity of feeling that is necessary to move the world.
When you strike at a king, you must kill him.
The Soviet leadership went to great lengths to document and detail the proscriptions of the day, even publishing “periodic handbooks that listed which specific phrases were out of bounds.” The means by which the Chinese government patrolled the boundaries of speech, however, were far more subversive in Link’s view, and in many ways more closely approximate the contemporary model of attempts to constrain speech in the United States. Link wrote that the Chinese government “rejected these more mechanical methods” of censorship used by the Soviet regime “in favor of an essentially psychological control system,” in which each individual must assess the risk of a statement against what Link describes as “a dull, well-entrenched leeriness” of disapproval by the state.
“If I give my name, I lose my future,” he said. But is a belief that has no cost really a belief? The protective veil of anonymity may instead be robbing this generation of an opportunity to develop an instinct for real ownership over an idea, and of the rewards of victory in the public square as well as the costs of defeat.
She argued that “our primary moral allegiance is to no community,” national or otherwise, but rather “to justice” itself. The ideal at the time, and still for many today, was for a sort of disembodied morality, one unshackled from the inconvenient particularities of actual life. But this move toward the ethereal, the post-national, and the essentially academic has strained the moral capacity of our species. These cosmopolitan and technological elites in the developed world were citizens of no country; their wealth and capacity for innovation had, in their minds, set them free.
Carter noted that the roots of the contemporary skepticism of religion are essentially modern, beginning with Freud perhaps, who viewed religion as a sort of obsessive impulse. In an essay titled “Obsessive Actions and Religious Practices,” published in 1907, Freud wrote that the “formation of a religion,” with its oscillating focus on guilt and atonement from sin, itself “seems to be based on the suppression, the renunciation, of certain instinctual impulses.”
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents finally die.” The miracle of the West is its unrelenting faith in science. That faith, however, has perhaps crowded out something equally important, the encouragement of intellectual courage, which sometimes requires the fostering of belief or conviction in the absence of evidence.
But the unintended consequence of this assault on religion was the eradication of any space for belief at all, any room for the expression of values or normative ideas about who we were, or should become, as a nation. The soul of the country was at stake, having been abandoned in the name of inclusivity. The problem is that tolerance of everything often constitutes belief in nothing.
We unwittingly deprived ourselves of the opportunity to critique any aspects of culture, because all cultures, and by extension all cultural values, were sacred.
A survey conducted in 2023 of graduating seniors at Harvard University, for instance, found that nearly half of the entire class was headed for jobs in finance and consulting. Only 6 percent of graduates of Harvard College in 1971 went into those two professions after graduation, according to an analysis by the Harvard Crimson. That proportion rose steadily in the 1970s and 1980s, peaking at 47 percent in 2007 just before the financial crisis.
An aristocracy driven by talent is an essential feature of any republic. The challenge is ensuring that such aristocracies remain open to new members and do not descend into mere caste structures, which close their ranks along racial or religious lines.
The challenge for any organization, and indeed nation, is finding ways of empowering a group of leaders without incentivizing them to spend more effort guarding the trappings and perquisites of office than advancing the goals of the group. The caste structures that have formed within countless organizations around the world, from federal bureaucracies to international agencies to academic institutions and Silicon Valley technology giants, must be challenged and dismantled if those institutions have any hope of survival over the long term.
The antiseptic nature of modern discourse, dominated by an unwavering commitment to justice but deeply wary when it comes to substantive positions on the good life, is a product of our own reluctance, and indeed fear, to offend, to alienate, and to risk the disapproval of the crowd. Yet there is too much that lies “beyond justice.” Justice is the skeleton; the good life is the flesh and blood.
What began as a noble search for a more inclusive conception of national identity and belonging—and a bid to render the concept of “the West” open to any entrants interested in advancing its ideals—over time expanded into a more far-reaching rejection of collective identity itself. And that rejection of any broader political project, or sense of the community to which one must belong in order to accomplish anything substantial, is what now risks leaving us rudderless and without direction.
The issue was not merely what college students ought to be taught, but rather what the purpose of their education was, beyond merely enriching those fortunate enough to attend the right school. What were the values of our society, beyond tolerance and a respect for the rights of others? What role did higher education have, if any, in articulating a collective sense of identity that was capable of serving as the foundation for a broader sense of cohesion and shared purpose?
The virtue of a core curriculum situated around the Western tradition was that it facilitated and indeed made possible the construction of a national identity in the United States from a fractured and disparate set of cultural experiences, a form of civic religion, tethered largely to truth and history across the centuries but also aspirational in its desire to provide coherence and grounding to a national endeavor.
Appiah, a critic of the entire conception of “the West,” would later argue that “we forged a grand narrative about Athenian democracy, the Magna Carta, the Copernican revolution, and so on,” building to the crescendo of a conclusion, notwithstanding evidence to the contrary, that “Western culture was, at its core, individualistic and democratic and liberty-minded and tolerant and progressive and rational and scientific.” For Appiah and many others, the idealized form of the West was a story, riveting perhaps and compelling at times, but a narrative nonetheless, and one that had been imposed, and awkwardly foisted and fitted, onto the historical record, rather than emerging from it.
There is not a history, but rather many possible histories.
To many critics, the apparent arbitrariness of the editorial process of developing a syllabus for a course as ambitious as the History of Western Civilization, and the selection of only a small handful of works for inclusion from such an enormous list of candidates, was alone reason to abandon the project. “We have Plato, but why not Aristotle? Why not more Euripides? Paradise Lost, but why not Dante? John Stuart Mill, but why not Marx?”
The substantive triumph of Orientalism was that it exposed to a broad audience the extent to which the telling of history, the act of summation and synthesis into narrative from disparate strands of detail and fact, was not itself a neutral, disinterested act, but rather an exercise of power in the world.
His central thesis provides the basis for much of what passes as foundational in the humanities today: that the identity of a speaker is as important as, if not more important than, what he or she has said. The consequences of this reorientation of our understanding of the relationship between speaker and that which is spoken, storyteller and story, and ultimately identity and truth have been profound and lasting.
The book displayed no awareness of the vast archive of Asian, African, and Latin American thought that had preceded it, including discourses devised by non-Western elites, such as the Brahminical theory of caste in India, to make their dominance seem natural and legitimate.
Attempts to construct world history courses had themselves “often been contaminated” by what he regarded “as patently false assertions of the equality of all cultural traditions.”
The thin conception of belonging to the American community consisted of a respect for the rights of others and a broad commitment to neoliberal economic policies of free trade and the power of the market. The thicker conception of belonging required a story of what the American project has been, is, and will be: what it means to participate in this wild and rich experiment in building a republic.
In an essay on human aggression published in 1947, Parsons observed that many men “will inevitably feel they have been unjustly treated, because there is in fact much injustice, much of which is very deeply rooted in the nature of the society, and because many are disposed to be paranoid and see more injustice than actually exists.” And he went further. The feeling of being “unjustly treated,” Parsons noted, is “not only a balm to one’s sense of resentment, it is an alibi for failure.”
Far too much capital, intellectual and otherwise, has been dedicated to sating the often capricious and passing needs of late capitalism’s hordes.
At most human organizations, from government bureaucracies to large corporations, an enormous amount of the energy and talent of individuals is directed at jockeying for position, claiming credit for success, and often desperately avoiding blame for failure. The vital and scarce creative output of those involved in an endeavor is far too often misdirected to crafting self-serving hierarchies and patrolling who reports to whom.
Every human institution, including the technology giants of Silicon Valley, has a means of organizing personnel, and such organization will often require the elevation of certain individuals over others. The difference is the rigidity of those structures, that is, the speed with which they can be dismantled or rearranged, and the proportion of the creative energy of a workforce that goes into maintaining such structures and into self-promotion within them.
The point is only that voids or perceived voids within an organization have, in our experience, repeatedly had more benefits than costs, often being filled by ambitious and talented leaders who see gaps and want to play a role but might otherwise have been cowed into submission for fear of venturing onto somebody else’s turf.
A symphony orchestra, for example, should, based on the prevailing conceptions of how organizations ought to be structured, have “several group vice president conductors and perhaps a half-dozen division VP conductors.” Orchestras, however, had no such layers. As Drucker explained, “There is only the conductor-CEO—and every one of the musicians plays directly to that person without an intermediary. And each is a high-grade specialist, indeed an artist.”
The flaw, and indeed tragedy, of American corporate life is that the vast majority of an individual employee’s energy during their working life is spent merely on survival: navigating among the internal politicians at their organizations, steering clear of threats, and forming alliances with friends, perceived and otherwise. We and other technology startups are the beneficiaries of the sheer exhaustion that many young and talented people either experience or can sense from the American corporate model, which can be an unapologetically extractive enterprise that too often requires a redirection of scarce intellectual and creative energy toward internal struggles for power and access to information.
In this way, the legions who have flocked to Silicon Valley are cultural exiles, many of whom are extraordinarily privileged and empowered, but misfits and thus exiles nonetheless. They have consciously chosen to remove themselves from capitalism’s dominant corporate form and join an alternative model, imperfect and complex, to be sure, but one that at its best suggests a new means of human organization.
The central insight of Silicon Valley was not merely to hire the best and brightest but to treat them as such, to allow them the flexibility and freedom and space to create. The most effective software companies are artist colonies, filled with temperamental and talented souls. And it is their unwillingness to conform, to submit to power, that is often their most valuable instinct.
At many of the most successful technology giants in Silicon Valley, there is a culture of what one might call constructive disobedience. The creative direction that an organization’s most senior leaders provide is internalized but often reshaped, adjusted, and challenged by those charged with executing on their directives in order to produce something even more consequential. A certain antagonism within an organization is vital if it is to build something substantial. An outright dereliction of duty might simply hold an organization back. But the unquestioning implementation of orders from higher up is just as dangerous to an institution’s long-term survival.
The instinct to conform to the behavior of those around us, to the norms that others demonstrate, and to prize the abilities that most around us find second nature, is in the vast majority of cases extraordinarily adaptive and helpful, for both our individual survival and that of the human species. Our desire to conform is immense and yet crippling when it comes to creative output.
In the army’s attempt to build a software system for soldiers in Afghanistan, the reliance on a tangle of contractors and subcontractors, and on a yearslong procurement process that often involved more preparation and planning for the construction of software than actual coding, had deprived Lockheed Martin of any real opportunity to incorporate feedback from its users into its development plans for the system. The military’s software project had devolved into a pursuit of an almost abstract conception of what software should look like, with far less concern for the actual features and capabilities, the workflows and interface, that would make the software valuable, or not, to someone working all night on a laptop in Kandahar to prepare for a special forces operation the next morning.
The organization’s stated goal had been to acquire or build what soldiers needed within three to six months—a radically ambitious timeline in the world of defense contracting, where new weapons systems often languished in development for years and even decades.
The conflict would end up costing $2 trillion over two decades, or $300 million every day for twenty years, according to estimates by a research group at Brown University. It has been more than fifty years since the United States abandoned mandatory conscription in 1973, near the end of the Vietnam War. And since then a generation of political elites has essentially enlisted others to fight their wars abroad.
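A back-of-the-envelope check of that conversion (simple division on the cited totals, not an additional figure from the Brown University study):

\[
\frac{\$2\ \text{trillion}}{20\ \text{years} \times 365\ \text{days/year}} \approx \$274\ \text{million per day},
\]

which rounds to the roughly $300 million per day cited above.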
As of August 2006, there were only three members of Congress, three out of our 535 U.S. representatives and senators, who had a child serving in the American military.
The structural issue was that the procurement bureaucracy within the U.S. government had become so large and so entrenched, wielding enormous power and influence, that it had grown used to ordering custom-built versions of whatever it needed instead of shopping for goods, like everyone else, on the open market. The federal procurement officials responsible for supplying the U.S. military could direct the efforts of thousands of subcontractors and suppliers, essentially dictating that anything they wanted or needed be conjured and created from scratch. The government did not technically employ the product designers or own the factories. But it effectively controlled them, and could also pay any price.
The strategy of public servants, he added, was often “to just not make waves, to not disturb their careers, to not do anything unusual that might get them in trouble.”
The federal procurement machine includes 207,000 federal employees who have been hired to manage government acquisitions and purchases.
The modern enterprise is often too quick to avoid such friction. We have today privileged a kind of ease in corporate life, a culture of agreeableness that can move institutions away from, not toward, creative output. The impulse, indeed the rush, to smooth over any hint of conflict within businesses and government agencies is misguided, leaving many with the misimpression that a life of ease awaits and rewarding those whose principal desire is the approval of others.
This is a grievance industry, and it is at risk of depriving a generation of the fierceness and sense of proportion that are essential to becoming a full participant in this world. A certain psychological resilience and indeed indifference to the opinion of others are required if one is to have any hope of building something substantial and differentiated.
The act of rebellion that involves building something from nothing, whether it is a poem from a blank page, a painting from a canvas, or software code on a screen, by definition requires a rejection of what has come before. It involves the bracing conclusion that something new is necessary. The hubris involved in the act of creation (the determination that all that has been produced to date, the sum product of humanity’s output, is not precisely what ought or need be built at a given moment) is present within every founder or artist.
For Berlin, there was a “great chasm” between the hedgehogs among us in the world, “who relate everything to a single central vision, one system, less or more coherent or articulate, in terms of which they understand, think and feel,” and the foxes, “who pursue many ends, often unrelated and even contradictory, connected, if at all, only in some de facto way.”
After the end of World War II, U.S. defense and intelligence agencies launched a massive and secret effort to recruit Nazi scientists, in order to retain an advantage in the coming years in developing rockets and jet engines. At least sixteen hundred German scientists and their families were relocated to the United States. Some were skeptical about this late embrace of the former enemy. An officer in the U.S. Air Force urged his commander to set aside any distaste for recruiting the German scientists to this new cause, writing in a letter that there was an immense amount to be learned from this “German-born information,” if only “we are not too proud.”
The challenge is fostering a sufficiently gentle and forgiving internal culture that encourages the most talented and high-integrity minds within an organization to come forward and report problems rather than hide them. Most companies are populated with people so fearful of losing their jobs that any hint of dysfunction is quickly covered up. Others are simply trying to make it to their retirement without being discovered as providing little or no value to the organization. Many more are monetizing the decline of empires they had once built.
The drift of the technology world to the concerns of the consumer both reflected and helped reinforce a certain technological escapism: the instinct of Silicon Valley to steer away from the most important problems we face as a society and toward what are essentially the minor and trivial yet solvable inconveniences of everyday consumer life, from online shopping to food delivery. An entire swath of challenges, from national defense to violent crime, education reform to medical research, appeared to many to be too intractable, too thorny, and too politically fraught to address in any real way.
An FBI file on the writer James Baldwin had swelled to 1,884 pages by 1974. Such invasions of personal privacy set the stage for a certain dualism in the debate over the twentieth century: either technological advances, including fingerprints, DNA, and later facial recognition systems, were essential to the difficult and often fruitless task of dismantling violent criminal networks, or they were the tools by which an overreaching state would target the powerless and imprison the innocent.
The use of our platform, known as Gotham, spread quickly across the police department, with the Times-Picayune describing the system as “a one-stop shop for pulling up and cross-referencing information,” and “discovering unseen connections among victims, suspects or witnesses.”
The critics, however, were swift and fierce. The reaction, indeed, was visceral for many. Why should New Orleans permit the deployment of a software system designed for use in a foreign war on the streets of the city at home?
It was a surreal moment. One billionaire asking a multimillionaire whether a salary of less than what a first-year associate would make at an investment bank was appropriate for the chairman of the Federal Reserve, the most powerful and influential central bank on the planet. The decisions that Powell himself makes are easily some of the most consequential in the world. During the course of his tenure, the fates of hundreds of millions of workers in the United States and abroad have hinged on his instincts about the path of inflation, the timing of interest rate increases and potential decreases, and his views about the strength of the American and global economies. Trillions of dollars in stock markets from New York to London, and Sydney to Shanghai, would change hands as a direct result of his thinking and his attempt to steer the U.S. economy, and by extension the world’s, through a historically vulnerable period of inflation and potentially softening growth. And yet Congress has decided to pay him around $190,000 per year.
But we decline to confront the consequence of this approach, which is that we essentially incentivize candidates for public service to become wealthy before entering office, or to monetize their position after their departure. The extent of self-promotion and theater in the U.S. Congress is astounding, with representatives in the lower chamber vying for clicks and social media influence, and, by extension, incomes after they leave office. The quality of candidates is, in part, a function of what we are willing to pay them.
At a parliamentary debate on the matter, Lee responded that politicians “are real men and women, just like you and me, with real families who have real aspirations in life.” He continued: “So when we talk of all these high-falutin, noble, lofty causes, remember at the end of the day, very few people become priests.”
The ascetic streak in American culture is admirable; deprivation, a skepticism of the material, reminds us that a bare and hollow commitment to consumption alone will inevitably lead us astray. But those instincts, the unstated desire that public servants be our priests, are having the unintended and undesirable consequence of depriving vast sectors of the public economy (in government, education, and medicine) of the benefits that the right incentives can create.
In his mind, the rules were for other people. When a deputy arrived in his office with a book of U.S. Navy regulations, Rickover recalled telling the officer to get out and burn the book. “My job was not to work within the system. My job was to get things done.”
The speed with which we have increasingly abandoned the unpopular, the unlikable, and the less than charismatic personalities among us should give us pause. The risk is that we begin to privilege the seemingly unobjectionable goals of transparency and process over what actually matters: building submarines, developing our most elusive cures, preventing terrorist attacks, and advancing our interests. Such a utilitarian calculus is unattractive. But in any struggle, we must sometimes set aside aesthetic distaste. We too often hide behind our piety as a way of avoiding more challenging and indeed uncomfortable questions about outcomes and results.
Our desire for purity is understandable. We cling to the hope that the most noble and pious among us will also have the ambition to seek power. But history tells us that the opposite is far more often the case. The eradication of any space for forgiveness, a jettisoning of any tolerance for the complexities and contradictions of the human psyche, may leave us with a cast of characters at the helm whom we will grow to regret.
Without such belonging, there is nothing for which to fight, nothing to defend, and nothing to work toward. A commitment to capitalism and the rights of the individual, however ardent, will never be sufficient; it is too thin and meager, too narrow, to sustain the human soul and psyche. James K. A. Smith, a philosophy professor at Calvin University, has correctly noted that “Western liberal democracies have lived off the borrowed capital of the church for centuries.”
He compared an African mask, leaving its country of origin on the continent unspecified, with the Apollo Belvedere at the Vatican, concluding with characteristic assuredness that “the Apollo embodies a higher state of civilization than the mask.” Elsewhere he declined, with a bracing dismissiveness, to provide Spain a central role in the history of Western civilization, questioning what of significance the country had done “to enlarge the human mind and pull mankind a few steps up the hill.”
The entrepreneurs of Silicon Valley do not lack idealism; indeed, they often appear to be brimming with it. But it is thin and can wither under even the slightest scrutiny. The legions of young founders have for decades now routinely claimed that they aspire to change the world. Yet such claims have grown meaningless from overuse. This cloak of idealism was put on in order to relieve these young founders of the need to develop anything approaching a more substantial worldview. And the nation-state itself, the most effective means of collective organization in pursuit of a shared purpose that the world has ever known, was cast aside as an obstacle to progress.