Alex Karp and co-author Nicholas Zamiska are unafraid to offend those among the technocratic elite who have drifted away from vital national questions to instead develop a smug and complacent focus on shopping websites, photo-sharing apps, and other shallow but wildly lucrative endeavors. To them, there is no point in fighting over who gets the most luxurious stateroom on the Titanic. Without a renewed commitment to addressing the most existential national threats we face, serious risks to our country will continue to grow, rendering mere business success quite hollow.


You will never touch the hearts of others, if it does not emerge from your own.


The power to hurt is bargaining power. To exploit it is diplomacy, vicious diplomacy, but diplomacy.


A moment of reckoning has arrived for the West. The loss of national ambition and interest in the potential of science and technology, and the resulting decline of government innovation across sectors, from medicine to space travel to military software, have created an innovation gap. The state has retreated from the pursuit of the kind of large-scale breakthroughs that gave rise to the atomic bomb and the internet, ceding the challenge of developing the next wave of pathbreaking technologies to the private sector, a remarkable and near-total placement of faith in the market. Silicon Valley, meanwhile, turned inward, focusing its energy on narrow consumer products rather than projects that speak to and address our greater security and welfare.


The grandiose rallying cry of a generation of founders in Silicon Valley was simply to build. Few asked what needed to be built, and why. For decades, we have taken the technology industry’s focus, indeed its obsession in many cases, on consumer culture for granted, hardly questioning the direction, and we think misdirection, of capital and talent to the trivial and ephemeral. Much of what passes for innovation today, of what attracts enormous amounts of talent and funding, will be forgotten before the decade is out.


This early dependence of Silicon Valley on the nation-state and indeed the U.S. military has for the most part been forgotten, written out of the region’s history as an inconvenient and dissonant fact, one that clashes with the Valley’s conception of itself as indebted only to its capacity to innovate.


The term “scientist” itself was only coined in 1834, to describe Mary Somerville, a Scottish astronomer and mathematician; prior to that, the blending of pursuits across physics and the humanities, for instance, was so commonplace and natural that a more specialized word had not been needed. Many had little regard for the boundary lines between disciplines, moving across areas of study as seemingly unrelated as linguistics and chemistry, or zoology and physics.


The country’s ability to reliably deliver economic and scientific advances for the public, from medical breakthroughs to military capabilities, was essential to its credibility. As Jürgen Habermas has suggested, a failure by leaders to deliver on implied or explicit promises to the public has the potential to provoke a crisis of legitimacy for a government. When emerging technologies that give rise to wealth do not advance the broader public interest, trouble often follows. Put differently, the decadence of a culture or civilization, and indeed its ruling class, will be forgiven only if that culture is capable of delivering economic growth and security for the public.


Many of these engineers have never encountered someone who has served in the military. They exist in a cultural space that enjoys the protection of the American security umbrella but is responsible for none of its costs.


This generation knew what it opposed, what it stood against and could not condone, but not what it was for. The earliest technologists who built the personal computer, the graphical user interface, and the mouse, for example, had grown skeptical of advancing the aims of a nation that, many of them believed, did not deserve their allegiance.


“Scientists aren’t responsible for the facts that are in nature. It’s their job to find the facts. There’s no sin connected with it—no morals.” The scientist, in this frame, is not immoral but rather amoral, existing outside or perhaps before the point of moral inquiry. It is a view still held by many young engineers across Silicon Valley today. A generation of programmers remains ready to dedicate their working lives to sating the needs of capitalist culture, and to enrich themselves, but declines to ask more fundamental questions about what ought to be built and for what purpose.


Nobody’s sense of self, or at least not ours, turns on the ability to find the square root of a number with twelve digits to fourteen decimal places. We were, as a species, content to outsource this work—the mechanical drudgery of mathematics and physics—to the machine. And we didn’t mind. But now the machine has begun to encroach on domains of our intellectual lives that many had thought were essentially immune from competition with computing intelligence.


What does it mean for humanity when AI becomes capable of writing a novel that becomes a bestseller, moving millions? Or makes us laugh out loud? Or paints a portrait that endures for decades? Or directs and produces a film that captures the hearts of festival critics? Is the beauty or truth expressed in such works any less powerful or authentic merely because they sprang from the mind of a machine?


Rather than resist, we might see this next era as one of collaboration, between two species of intelligence, our own and the synthetic. The relinquishment of control over certain creative endeavors may even relieve us of the need to define our worth and sense of self in this world solely through production and output.


The response that we too are primitive computational machines, with training phases in early childhood and material ingested throughout our lives, is perhaps unconvincing, or rather unwelcome, to such skeptics.


We might be wary, however, of a certain chauvinism that privileges the experience and capacity of the human mind above all else. Our instinct may be to cling to poorly defined and fundamentally loose conceptions of originality and authenticity in order to defend our place in the creative universe. And the machine may, in the end, simply decline to yield in its continued development as we, its creator, debate the extent of its capabilities.


Our attention should instead be more urgently directed at building the technical architecture and regulatory framework that would create moats and guardrails around the ability of AI programs to autonomously integrate with other systems, such as electrical grids, defense and intelligence networks, and our air traffic control infrastructure. If these technologies are to exist alongside us over the long term, it will be essential to rapidly construct systems that allow more seamless collaboration between human operators and their algorithmic counterparts, but also to ensure that the machine remains subordinate to its creator.


While it is currently fashionable to claim that the strength of our ideas and ideals in the West will inevitably lead to triumph over our adversaries, there are times when resistance, even armed resistance, must precede discourse. Our entire defense establishment and military procurement complex were built to supply soldiers for a type of war, on grand battlefields and with clashes of masses of humans, that may never again be fought. This next era of conflict will be won or lost with software. One age of deterrence, the atomic age, is ending, and a new era of deterrence built on AI is set to begin. The risk, however, is that we think we have already won.


The ability of free and democratic societies to prevail requires something more than moral appeal. It requires hard power, and hard power in this century will be built on software.


“To be coercive, violence has to be anticipated. The power to hurt is bargaining power. To exploit it is diplomacy, vicious diplomacy, but diplomacy.” The virtue of Schelling’s version of realism was its unsentimental disentanglement of the moral from the strategic. As he made clear, “War is always a bargaining process.”


We have seen firsthand the reluctance of young engineers to build the digital equivalent of weapons systems. For some of them, the order of society and the relative safety and comfort in which they live are the inevitable consequence of the justice of the American project, not the result of a concerted and intricate effort to defend a nation and its interests. Such safety and comfort were not fought for or won. For many, the security that we enjoy is a background fact or feature of existence so foundational that it merits no explanation. These engineers inhabit a world without trade-offs, ideological or economic.


Nobel confided in a letter to a friend that more capable weapons, not less, would be the best guarantors of peace. “The only thing that will ever prevent nations from beginning war is terror,” he wrote.


“We hated what we were doing,” a U.S. airman who flew in one of the B-29 bombers over Tokyo in March 1945 later recalled in an interview. “But we thought we had to do it. We thought that raid might cause the Japanese to surrender.”


The culture almost snickers at Musk’s interest in grand narrative, as if billionaires ought to simply stay in their lane of enriching themselves and perhaps providing occasional fodder for celebrity gossip columns. A profile of Musk in the New Yorker published in 2023 suggested that the world would be better off with fewer “mega-rich luxury planet builders,” decrying his “seeming estrangement from humanity itself.”


The American foreign policy establishment has repeatedly miscalculated when dealing with China, Russia, and others, believing that the promise of economic integration alone will be sufficient to undercut their leadership’s support at home and diminish their interest in military escalations abroad. The failure of the Davos consensus, the reigning approach to international relations, was to abandon the stick in favor of the carrot alone. Anne Applebaum rightly reminds us that a “natural liberal world order” does not exist, despite our most fervent aspirations, and that “there are no rules without someone to enforce them.”


The unrelenting scrutiny to which contemporary public figures are now subjected has also had the counterproductive effect of dramatically reducing the ranks of individuals interested in venturing into politics and adjacent domains. Advocates of our current system of ruthless exposure of the private lives of often marginally public figures make the case that transparency, one of those words that has nearly become meaningless from overuse, is our best defense against the abuse of power. But few seem interested in the very real and often perverse incentives, and disincentives, we have constructed for those engaging in public life.


The expectations of disclosure have increased steadily for more than half a century and have brought essential information to the voting public. They have also contorted our relationship with our elected officials and other leaders, requiring an intimacy that is not always related to assessing their ability to deliver outcomes. Americans, in particular, “have overmoralized public office,” as an editorial in Time magazine warned back in 1969, and “tend to equate public greatness with private goodness.”


We think we want and need to know our leaders. But what about results? The likability of our elected leaders is essentially a modern preoccupation and has become a national obsession, yet at what cost?


In that moment, the country was for the first time introduced to a new and striking level of granularity in the disclosures that it required from its politicians, and perhaps the beginning of a decline in the quality of those willing to come forward and submit to the spectacle. Nixon’s wife reportedly asked him, affecting a certain naïveté, faux or otherwise, “Why do you have to tell people how little we have and how much we owe?” Her husband replied that politicians were destined to “live in a goldfish bowl.” But the systematic elimination of private spaces, even for our public figures, has consequences, and ultimately further incentivizes only those given to theatrics, those who crave a stage, to run for office. The candidates who remain willing to subject themselves to the glare of public service are, of course, often more interested in the power of the platform, with its celebrity and potential to be monetized in other ways, than in the actual work of government.


An entire generation of executives and entrepreneurs that came of age in recent decades was essentially robbed of an opportunity to form actual views about the world, both descriptive (what it is) and normative (what it should be), leaving us with a managerial class whose principal purpose often seems to be little more than ensuring its own survival and re-creation.


The problem is that those who say nothing wrong often say nothing much at all. An overly timid engagement with the debates of our time will rob one of the ferocity of feeling that is necessary to move the world.


When you strike at a king, you must kill him.


The Soviet leadership went to great lengths to document and detail the proscriptions of the day, even publishing “periodic handbooks that listed which specific phrases were out of bounds.” The means by which the Chinese government patrolled the boundaries of speech, however, were far more subversive in Link’s view, and in many ways more closely approximate the contemporary model of attempts to constrain speech in the United States. Link wrote that the Chinese government “rejected these more mechanical methods” of censorship used by the Soviet regime “in favor of an essentially psychological control system,” in which each individual must assess the risk of a statement against what Link describes as “a dull, well-entrenched leeriness” of disapproval by the state.


“If I give my name, I lose my future,” he said. But is a belief that has no cost really a belief? The protective veil of anonymity may instead be robbing this generation of an opportunity to develop an instinct for real ownership over an idea, of the rewards of victory in the public square as well as the costs of defeat.


She argued that “our primary moral allegiance is to no community,” national or otherwise, but rather “to justice” itself. The ideal at the time, and still for many today, was for a sort of disembodied morality, one unshackled from the inconvenient particularities of actual life. But this move toward the ethereal, the post-national, and the essentially academic has strained the moral capacity of our species. These cosmopolitan and technological elites in the developed world were citizens of no country; their wealth and capacity for innovation had, in their minds, set them free.


Carter noted that the roots of the contemporary skepticism of religion are essentially modern, beginning with Freud perhaps, who viewed religion as a sort of obsessive impulse. In an essay titled “Obsessive Actions and Religious Practices,” published in 1907, Freud wrote that the “formation of a religion,” with its oscillating focus on guilt and atonement from sin, itself “seems to be based on the suppression, the renunciation, of certain instinctual impulses.”


“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents finally die.” The miracle of the West is its unrelenting faith in science. That faith, however, has perhaps crowded out something equally important, the encouragement of intellectual courage, which sometimes requires the fostering of belief or conviction in the absence of evidence.


But the unintended consequence of this assault on religion was the eradication of any space for belief at all, any room for the expression of values or normative ideas about who we were, or should become, as a nation. The soul of the country was at stake, having been abandoned in the name of inclusivity. The problem is that tolerance of everything often constitutes belief in nothing.

We unwittingly deprived ourselves of the opportunity to critique any aspects of culture, because all cultures, and by extension all cultural values, were sacred.


A survey conducted in 2023 of graduating seniors at Harvard University, for instance, found that nearly half of the entire class was headed for jobs in finance and consulting. Only 6 percent of graduates of Harvard College in 1971 went into those two professions after graduation, according to an analysis by the Harvard Crimson. That proportion rose steadily in the 1970s and 1980s, peaking at 47 percent in 2007 just before the financial crisis.


An aristocracy driven by talent is an essential feature of any republic. The challenge is ensuring that such aristocracies remain open to new members and do not descend into mere caste structures, which close their ranks along racial or religious lines.


The challenge for any organization, and indeed nation, is finding ways of empowering a group of leaders without incentivizing them to spend more effort guarding the trappings and perquisites of office than advancing the goals of the group. The caste structures that have formed within countless organizations around the world, from federal bureaucracies to international agencies to academic institutions and Silicon Valley technology giants, must be challenged and dismantled if those institutions have any hope of survival over the long term.


The antiseptic nature of modern discourse, dominated by an unwavering commitment to justice but deeply wary when it comes to substantive positions on the good life, is a product of our own reluctance, and indeed fear, to offend, to alienate, and to risk the disapproval of the crowd. Yet there is too much that lies “beyond justice.” Justice is the skeleton: the good life is the flesh and blood.


What began as a noble search for a more inclusive conception of national identity and belonging—and a bid to render the concept of “the West” open to any entrants interested in advancing its ideals—over time expanded into a more far-reaching rejection of collective identity itself. And that rejection of any broader political project, or sense of the community to which one must belong in order to accomplish anything substantial, is what now risks leaving us rudderless and without direction.


The issue was not merely what college students ought to be taught, but rather what the purpose of their education was, beyond merely enriching those fortunate enough to attend the right school. What were the values of our society, beyond tolerance and a respect for the rights of others? What role did higher education have, if any, in articulating a collective sense of identity that was capable of serving as the foundation for a broader sense of cohesion and shared purpose?


The virtue of a core curriculum situated around the Western tradition was that it facilitated and indeed made possible the construction of a national identity in the United States from a fractured and disparate set of cultural experiences: a form of civic religion, tethered largely to truth and history across the centuries but also aspirational in its desire to provide coherence and grounding to a national endeavor.


Appiah, a critic of the entire conception of “the West,” would later argue that “we forged a grand narrative about Athenian democracy, the Magna Carta, Copernican revolution, and so on,” building to the crescendo of a conclusion, notwithstanding evidence to the contrary, that “Western culture was, at its core, individualistic and democratic and liberty-minded and tolerant and progressive and rational and scientific.” For Appiah and many others, the idealized form of the West was a story, riveting perhaps and compelling at times, but a narrative nonetheless, and one that had been imposed, and awkwardly foisted and fitted, onto the historical record, rather than emerging from it.


There is not a history, but rather many possible histories.


To many critics, the apparent arbitrariness of the editorial process of developing a syllabus for a course as ambitious as the History of Western Civilization, and the selection of only a small handful of works for inclusion from such an enormous list of candidates, was alone reason to abandon the project. “We have Plato, but why not Aristotle? Why not more Euripides? Paradise Lost, but why not Dante? John Stuart Mill, but why not Marx?”


The substantive triumph of Orientalism was its exposing to a broad audience the extent to which the telling of history, the act of summation and synthesis into narrative from disparate strands of detail and fact, was not itself a neutral, disinterested act, but rather an exercise of power in the world.


His central thesis provides the basis for much of what passes as foundational in the humanities today: that the identity of a speaker is as important as, if not more important than, what he or she has said. The consequences of this reorientation of our understanding of the relationship between speaker and that which is spoken, storyteller and story, and ultimately identity and truth have been profound and lasting.


The book displayed no awareness of the vast archive of Asian, African, and Latin American thought that had preceded it, including discourses devised by non-Western elites, such as the Brahminical theory of caste in India, to make their dominance seem natural and legitimate.


Attempts to construct world history courses had themselves “often been contaminated” by what he regarded “as patently false assertions of the equality of all cultural traditions.”


The thin conception of belonging to the American community consisted of a respect for the rights of others and a broad commitment to neoliberal economic policies of free trade and the power of the market. The thicker conception of belonging required a story of what the American project has been, is, and will be: what it means to participate in this wild and rich experiment in building a republic.


In an essay on human aggression published in 1947, Parsons observed that many men “will inevitably feel they have been unjustly treated, because there is in fact much injustice, much of which is very deeply rooted in the nature of the society, and because many are disposed to be paranoid and see more injustice than actually exists.” And he went further. The feeling of being “unjustly treated,” Parsons noted, is “not only a balm to one’s sense of resentment, it is an alibi for failure.”


Far too much capital, intellectual and otherwise, has been dedicated to sating the often capricious and passing needs of late capitalism’s hordes.


At most human organizations, from government bureaucracies to large corporations, an enormous amount of the energy and talent of individuals is directed at jockeying for position, claiming credit for success, and often desperately avoiding blame for failure. The vital and scarce creative output of those involved in an endeavor is far too often misdirected to crafting self-serving hierarchies and patrolling who reports to whom.