This article was initially delivered as a speech to the 75th Innovation Command of the US Army Reserve on January 8, 2022.
A growing number of Americans are recognizing that a long era in American history seems to be coming to an end. Slogans like “Make America Great Again” reflect an awareness of this ending, but the actual characteristics of the era that is ending are harder to define. This essay takes a global view to better understand the structural conditions for American supremacy–conditions that are now eroding. It also proposes what might be done–not to recover a far-from-ideal past, but to chart an even better course for the future.
The World’s Largest Creditor
Our story begins at the beginning of the twentieth century, when the United States established itself as the world’s creditor based on its loans and arms sales to European countries fighting the First and Second World Wars. After the First World War, the US insisted that the loans we had made to our Allies be repaid in gold; during the Second World War, we also insisted that our Allies pay for the arms we sold them in gold. As a result, by the end of the Second World War, the United States had become the world’s major gold repository, housing about ¾ of the world’s gold. The Allies, by contrast, had depleted their gold stores.
At first, this seemed great for the United States–until we realized that, if other countries had no money and no assets, there was no one for us to trade with. Not only that, but broke countries were far more likely to go down the paths of fascism and communism–the very forces that had produced the Second World War we had just helped end. In other words, we realized that if we were too isolationist–if we put up too many tariff barriers and kept all the toys for ourselves–we would create an unstable and dangerous world order, with enemies around every corner. Such a world was a powder keg, ready to explode at any moment.
So, in 1944, as the Second World War was coming to a close, the world needed stability, and that meant it also needed money. The United States got its allies together at Bretton Woods, in New Hampshire, to figure out how to create a world system that worked for everyone–but of course, for America most of all. At the conference, 44 allied countries came together to align on the world’s post-war economic order. The resulting Bretton Woods Agreement linked the dollar to gold, and other countries agreed to peg their currencies to the dollar. This meant their central banks were committed to maintaining fixed exchange rates between their own currencies and USD. The US dollar has been the world’s primary reserve currency ever since.
Now, how were we going to solve the problem that these countries were largely broke? How would they get the dollars they needed to restart their economies and trade with the United States? We gave loans to some countries to help them along, but the main way we decided to accomplish this was by running foreign trade deficits with countries around the world. In other words, by exporting more to America than they imported, other countries would earn the dollars they needed to then also buy American goods. Surplus dollars could then be converted by foreign central banks into Treasury bills and Treasury bonds and stored as reserves.
There was also another way we financed the world: military spending and foreign aid, which are often closely connected. Congress has always been reluctant to approve foreign aid programs unless they could be tied to American strategic military interests, and in the second half of the 20th century, that mostly meant fighting communism. The Cold War began, in effect, as a giant program of wealth redistribution from the United States to the countries around the world that we wanted to prevent from becoming communist. The problem with this strategy was that it mired us deeper and deeper in debt to finance our wars.
The World’s Largest Debtor
Only five years after the end of the Second World War, in 1950, the United States ran its first balance-of-payments deficit–due entirely to the costs of the Korean War. But we kept deficit spending to fund that war and then every war after: Vietnam, other anticommunist proxy wars, Desert Shield/Desert Storm, and most recently, the Global War on Terror (Afghanistan, Iraq, etc.).
But we didn’t stop at war. We also began deficit spending to fund major social programs, like LBJ’s Great Society–Medicare, Medicaid, and Social Security, among other programs. And we didn’t stop there either. Spending on Human Services–which includes these programs as well as things like unemployment assistance, food assistance, and education–has steadily grown as a percentage of GDP since the early 1960’s.
How did we achieve this ability to spend without limit? Every other country that has waged war at the scale of the United States has gone bankrupt. This happened to the Allies after WWI and WWII, when the United States–their major creditor–demanded that these countries sell off their gold reserves and international assets to pay their debts to us. But when our turn came to pay our debts, we did something unprecedented–we simply said “no”.
In 1971, as the Vietnam War entered its final stage and countries around the world were rushing to redeem their dollars for gold, we closed the gold window. This meant that foreign countries could no longer redeem their dollars for gold. Instead, the only thing they could buy with their dollars was US debt–T-Bills and T-Bonds.
Let’s pause for a moment to remark on how unprecedented this was. Not only did we refuse to repay our debts in gold, as we had promised at Bretton Woods (and as we had forced our own debtors to do); we told other countries that they had to keep buying more of our debt if they wanted their economies to remain solvent. We did something no other country in the world had ever done before: we created a global empire by becoming the world’s biggest debtor.
In a memo circulated to the British War Cabinet in 1945, economist John Maynard Keynes wrote, “Owe your banker £1,000 and you are at his mercy; owe him £1 million and the position is reversed.” This statement has been restated many times in different ways. The basic thrust is that there is an inflection point when it comes to debt: once it passes a certain threshold, the power dynamic between debtor and creditor reverses. Creditor and debtor are locked in a relationship, and what happens to one of them necessarily impacts what happens to the other. Even a very imbalanced relationship where one or the other seems on top is never as unilateral as it may seem. The power the United States has been able to exercise as the world’s greatest debtor has also become the power other countries hold over us.
So why did countries around the world keep buying US debt, despite this seemingly unfair bargain? Well, countries with currencies pegged to the dollar needed to keep buying T-bills in order to keep their exchange rates stable. But even those who stopped pegging their currencies to the dollar after the gold window closed in 1971 still wanted to keep their currencies competitive to ensure export markets for their goods and services. If countries stopped buying US treasuries, the US dollar would rapidly lose value against their own currencies as dollars flooded the international markets. This would make US exports far cheaper and more competitive, crushing foreign competitors. While only about 9–12% of US GDP has come from exports over the past few decades, many other countries rely on exports for around 25%–sometimes more–of their GDP. Those countries buy up US debt in order to ensure that they don’t lose their export markets–particularly the lucrative US market. Thanks to their competitive exports, many developing countries have realized the international development goals that were conceptualized at Bretton Woods: they have used their trade surpluses to build out their manufacturing bases, often at the expense of US manufacturing.
The United States also used its prominent position–and veto power–in international financial bodies like the IMF to ensure that we could open large streams of credit whenever needed, and never be obligated to repay those loans. This is, in essence, what motivated the creation of the Special Drawing Rights (SDR) reserve asset in 1969. The US needed to continue financing the Vietnam War and social programs as it closed the gold window. It convinced countries that subscribed to this plan to accept SDRs instead of gold, dollars, or more tangible assets. Notably, countries that didn’t subscribe to this plan at the time included a few major oil producing countries: Kuwait, Libya, and Saudi Arabia.
The Petrodollar System
This was a problem for the US, especially after the 1973 Arab-Israeli war drove OPEC to curtail oil supply and drive up prices fourfold to punish the United States for its military support of Israel. In response, the US used the full force of its diplomacy and the lure of military aid to strike a deal with Saudi Arabia to price oil–arguably the world’s most important commodity–in dollars. This created a domino effect whereby most oil-producing nations also started pricing oil in dollars, and the “petrodollar” reserve system was born. This system “shadow backed” the US dollar with another global commodity–oil–after the gold window had closed. The system helped finance US deficits and ensured that oil wealth would be infused into the US economy through “petrodollar recycling.” It also incentivized countries around the world to sell their exports in dollars so that they could buy the oil they critically needed.
However, this handshake agreement with Saudi Arabia has cast a long shadow over the United States. Our military presence in Saudi Arabia, a critical element of the political exchange between the two countries, has been deeply resented by many Saudis. It is by now well-known that fifteen of the nineteen 9/11 hijackers were Saudi citizens. While the Saudi Government has repeatedly denied any involvement in the attacks, attempts to pursue evidence of links, through both Congressional and FBI investigations, have been actively censored or thwarted by US Government officials–even at the highest levels of the Executive Branch. It is remarkable that after an attack on the US mainland that killed more Americans than Pearl Harbor, the US did not reconsider its special relationship with an implicated sovereign nation. Instead, we invaded Iraq–a nation that had no ties to 9/11, but that had recently (in 2000) begun pricing its oil exports in the new “Euro” currency.
There is no way to overstate the damage that 9/11 and the post-9/11 fallout have done to American society and governing institutions. Although the attacks briefly created a feeling of national unity, that unity quickly dissipated as the administration began pursuing wars in Afghanistan and Iraq. These were struggling countries on the other side of the world that most Americans could not connect to their everyday lives–and now they were being asked to fight and die there. Almost immediately (in 2002), the War on Terror resulted in the justification of torture–previously anathema to official US policy, which had adhered to the Geneva Conventions established after the Second World War. The American public did not become aware of this shift until the 2004 Taguba Report, which revealed the widespread torture of Iraqi prisoners by American forces at Abu Ghraib prison. Private contractors made $138 billion on the War in Iraq alone, about $38 billion of which went to Halliburton, a company that had been led by Vice President Dick Cheney as CEO from 1995 until 2000. While Cheney denied conflicts of interest, he held 433,000 Halliburton stock options during his tenure as VP. This led to a widespread sense among the American people that the wars were characterized by corruption and backroom dealing. Finally, a series of whistleblower revelations–especially those provided by Chelsea Manning to Wikileaks–revealed evidence of war crimes perpetrated by American forces. To this day, Wikileaks senior editor Julian Assange is still being prosecuted by the US government and allied governments; under the Trump Administration, the CIA even discussed kidnapping and killing him.
It is these “moral injuries” of war–the public damage to professed American values–that have arguably had the most corrosive effect on American political institutions. If the abuse of power is publicly normalized by American leaders and government institutions, then what does that say about the integrity of our country?
In addition, what was achieved by sacrificing American values? Much like Vietnam, which ended with victory by communist forces (the very thing we ostensibly gave up the gold standard to prevent), the end of the Afghan war last year prompted an immediate takeover of the country by the Taliban, protectors of the very terrorist organization–al-Qaeda–that had orchestrated the 9/11 attacks. The bungled Afghanistan withdrawal quickly became a disaster for the Biden administration, even as most Americans supported a withdrawal in principle. The war in Iraq was officially shorter (2003-2011), and did successfully remove Saddam Hussein from power. But it certainly failed to defeat terrorism, which was its stated goal. Instead, it created the conditions for the rise of the Islamic State, an even more violent offshoot of al-Qaeda. The Islamic State is now at war with its parent terrorist organization in the region, where the US continues to conduct military operations to “contain” it.
The wars in Iraq and Afghanistan sharply polarized and divided the United States, in large part due to their cost, which was never really discussed with Congress or the American people prior to their authorization. The War in Afghanistan alone cost $2.3 trillion, and that does not include future interest payments on the loans we took out to fight that war. Our humiliating withdrawal from that country has only made those costs sting more. As of 2020, the conflicts in Iraq and Syria cost another $2 trillion (including interest payments), with very mixed foreign policy outcomes for the United States. Iraq is in effect a failed state, while Syria is now back under Bashar al-Assad’s despotic rule, thanks to Russia’s support–and increased influence in the region.
Meanwhile, as these wars have raged, most Americans have seen their costs of housing, healthcare, and education rise dramatically. The median American income can no longer provide for these costs. Record numbers of Americans are struggling with lifetime student loans and with medical debt–the nation’s leading cause of personal bankruptcy–in an economy where real wages have not increased since the early 1970’s. To add insult to injury, Americans see wealth inequality in our society equal to or greater than during the Gilded Age. The net worth of the top 1% in America is equal to the net worth of the bottom 90%. Among developed nations, the US has the highest level of income inequality and the lowest level of socioeconomic mobility–which has called into question the idea of the “American dream”. According to the US Census Bureau, wealth inequality hit a 50-year high in 2021, and approximately 33 million people earn poverty-level wages (less than $10 per hour). But during the first eleven months of the Coronavirus Pandemic alone, the wealth of US billionaires grew by 44%–$1.3 trillion–even as millions lost jobs and healthcare coverage.
Political party leaders like to counter the fact of Americans’ declining purchasing power and rising inequality with: “We have a strong economy with high employment.” Indeed, the US Department of Labor puts the official unemployment rate for December 2021 at 3.9%. But that number paints an all-too-rosy picture. The Ludwig Institute for Shared Economic Prosperity has calculated what it calls the functional rate of unemployment, or True Rate of Unemployment. That number includes those looking for full-time work who can only find part-time work, as well as those working full-time but earning too little to climb above the poverty line. That number was over 25% in January 2021–and it had barely changed from the 24% it registered in February 2020, right before the pandemic hit, when the economy was supposedly “hot”. If you add in Americans who have simply given up looking for work, nearly 54% of working-age Americans did not have living-wage full-time jobs as of January 2021, only one year ago.
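The gap between the headline rate and a functional rate is pure arithmetic: the two measures simply count different groups of people against the same labor force. The sketch below uses hypothetical cohort counts chosen only to illustrate the calculation; the Ludwig Institute's actual survey methodology and data differ.

```python
# Illustrative sketch: headline vs. "functional" unemployment.
# All cohort counts are hypothetical, chosen only to show the arithmetic.

labor_force = 160_000_000          # people working or actively seeking work
officially_unemployed = 6_240_000  # jobless and actively searching

# Groups the headline rate counts as "employed" but a functional
# measure would not:
part_time_seeking_full_time = 10_000_000
full_time_below_living_wage = 24_000_000

headline_rate = officially_unemployed / labor_force
functional_rate = (officially_unemployed
                   + part_time_seeking_full_time
                   + full_time_below_living_wage) / labor_force

print(f"headline:   {headline_rate:.2%}")    # 3.90%
print(f"functional: {functional_rate:.2%}")  # 25.15%
```

The point the numbers make: the same economy can simultaneously report "full employment" and leave a quarter of its labor force without living-wage full-time work, depending on which cohorts the statistic chooses to count.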
In other words, many Americans are beginning to wonder whether our status as the world’s sole superpower actually benefits them. It is no accident that President Trump campaigned twice on a populist platform of ending what he called America’s “forever wars”. (Of course, as President, he often behaved very differently.) Some of the biggest promoters of the Wars in Afghanistan and Iraq–Liz and Dick Cheney–are currently in the process of being publicly and dramatically forced out of the Republican Party. They are being embraced, in what at first might seem an unbelievable reversal, by the Democratic Party. But we should remember that Democratic Party leadership not only voted for both the Afghanistan and Iraq wars, but led the pursuit of regime change in countries including Serbia, Libya and Honduras. By the time the Syria war came along, Congress would no longer vote to authorize military force, so the Obama administration went ahead and began striking ISIS in Syria anyway, claiming it was authorized under the 2001 AUMF against al-Qaeda. The broad war-making drive of both Republican and Democratic parties is simply a matter of public record.
Costs of the Petrodollar System
The point of this historical excursion is that American society has paid a steep price for maintaining the dollar as the world’s reserve currency–and for being the world’s military superpower.
In short, the United States has been relying on its “too big to fail” status as the arbiter of the petrodollar world reserve system and as the world’s guarantor of “peace and security” via warmaking. But all “too big to fail” means is that, in this moment, the structural incentives motivate powerful actors to prop something up. The moment conditions change and those incentives are no longer there, whatever was previously too big to fail, fails indeed. And that moment is coming.
As wars become more expensive to fight directly, some in the US take solace in the fact that the petrodollar’s reserve status enables us to project power as the world’s financial police, using our sway over the SWIFT network and international financial institutions to sanction countries and individuals that do things we don’t like by cutting them off from the international payments system. But do we really want to be proud of cutting off Afghanistan–the country we occupied for two decades–from its sovereign bank accounts, plunging an entire population into starvation? Will future generations remember us freezing the assets and confiscating the property of whistleblowers as in line with American values of liberty and justice for all? And if the United States–the supposed bastion of free-market capitalism–doesn’t stand for property rights, what rights do we stand for?
Bitcoin: Sound Money as Structural Diplomacy
It was inevitable that countries like Russia, China, and India would find ways to become increasingly sanctions-resistant by moving away from the dollar to trade in oil and other commodities, and to use alternative payment systems. For example, Russia and India are already trading oil bilaterally in rubles and rupees. India has been paying for Iranian oil in rupees since 2019. And not only has China built an alternative to the SWIFT network, CIPS, but it has used blockchain technology to build its own digital currency that will soon be required to transact in China. This reflects a nascent move toward a more regional reserve currency model–a decentralization of world power in which the US is still a very powerful actor, but one among many. In Russia, President Vladimir Putin and the Ministry of Finance are pushing back against the Central Bank’s proposed ban on Bitcoin, suggesting that regulated mining and use of the stateless cryptocurrency could be a competitive advantage.
The United States will need more than military strength to prevent this growing decentralization from becoming a bloody contest among nations that leads to the next set of World Wars. We will need more than a powerful military to solve the domestic problems caused by taking on the role of the world’s sole superpower. Indeed, for the United States to begin thriving again, we must start rebuilding a net creditor position, and to do this we will need international institutions that are truly neutral arbiters of value and cannot be captured by the political interests of any country or class of people. That is why Bitcoin–the distributed, unforgeably scarce digital currency–represents a critical store of value and medium of exchange for the 21st century. Its political neutrality and censorship-resistance are the very properties that make it ideally suited to mediate value between states, societies, and individuals. In this way, Bitcoin is critical not only for ensuring economic growth, peace, and security for the United States, but for the wider world.
While the Bitcoin protocol prescribes the creation of 21 million bitcoin, the entire world economy could be rebooted from a single coin. In other words, the absolute amount of bitcoin circulating in the world has no impact on the currency’s ability to act as a backstop of value for everything else. This is what makes Bitcoin “sound money”, and sound money constrains spending and debt while providing great elasticity from a credit standpoint. Sound money also restores confidence in the value of assets (like fiat currencies) that may be pegged to or backstopped by it. In addition, as a peer-to-peer digital currency that enables people to transfer value directly without intermediaries, Bitcoin embodies American values of liberty and justice that have been so publicly compromised by political institutions in recent decades. It reminds us that the right to our own property is, in seed, the right to many other things–that it is the basis for inviolability that protects the individual from the state and enables individuals to constitute societies together. Finally, Bitcoin’s energy consumption is already driving a shift to a new energy paradigm by incentivizing decarbonization and the transition to renewable energy. This reduces the need for oil and the global power politics it gives rise to, creating new possibilities for shared prosperity.
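The divisibility claim above rests on simple supply arithmetic: each bitcoin is divisible into 100 million satoshis on-chain (and could in principle be subdivided further by protocol change), so the fixed cap still yields an enormous number of units. The world money-supply figure below is a rough, hypothetical round number used only to illustrate that value is a ratio, not an absolute quantity.

```python
# Bitcoin supply and divisibility arithmetic.
SATOSHIS_PER_BTC = 100_000_000   # smallest on-chain unit today
MAX_SUPPLY_BTC = 21_000_000      # protocol-prescribed supply cap

total_satoshis = MAX_SUPPLY_BTC * SATOSHIS_PER_BTC
print(f"{total_satoshis:,}")     # 2,100,000,000,000,000 (2.1 quadrillion units)

# Because value is a ratio, any fixed supply can denominate an economy of
# any size: shrink the supply and each unit simply represents more value.
world_money_usd = 100e12         # hypothetical round figure for global money
value_per_satoshi = world_money_usd / total_satoshis
print(f"${value_per_satoshi:.4f} per satoshi")  # $0.0476
```

This is why the absolute count of coins in circulation is irrelevant to Bitcoin's capacity to serve as a backstop of value: the unit of account scales; only the scarcity is fixed.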
This is not to suggest that Bitcoin will necessarily become “the” new reserve currency of the world, but rather that it will soften the inevitable transition from a global petrodollar reserve system to a more multipolar system of regional reserves that include the dollar and that are potentially backstopped by, or at the very least in a trade pair with, Bitcoin. This is ultimately in the United States’ interest, as it will help prevent–or at least soften–the economic devastation that would follow if the current system collapsed without an alternative already in place. The end of the petrodollar reserve system is, after all, coming. It is structurally inevitable, but it is within our power to ensure that we not only survive but thrive through it together. For this reason, the United States should immediately begin stockpiling a strategic Bitcoin reserve, as we will need a monetary reset pegged to something sounder than the full faith and credit of the United States when the latter is no longer of sufficient value.
The dollar’s unique global reserve status was a function of specific historical circumstances that no longer obtain. There is no going back to the mid-20th century world that gave rise to that reality. Even if the United States were to somehow acquire ¾ of the world’s Bitcoin, like we did with the world’s gold after the Second World War, and pegged the US dollar to that Bitcoin, there could not be another Bretton Woods in which so many countries agree to peg their currencies to that dollar. Quite simply, after nearly a century of American global dominance, the world’s countries are looking for more equal partnerships not only with the United States, but with other countries as well. And they are in a stronger position than ever to demand that. If we are smart, we will use this reality to our advantage by forming strong trade partnerships and becoming the jurisdiction of choice for the best and brightest from all over the world to come and generate value. We will achieve this as much through our character as through our strength. We can build the next generation of global alliances and American prosperity by increasingly channeling our vast military power toward what the Presbyterian minister and icon of American popular culture, Mr. Rogers, suggested we all become–a Good Neighbor. This is not some kind of Pollyanna naïveté, but what leadership with moral courage looks like.
In summary, by embracing a truly decentralized sound money reserve–Bitcoin–America can once again start running a trade surplus; restore manufacturing power to our country; bolster our national security; become energy self-sufficient; dramatically decrease carbon emissions; kickstart GDP growth; protect property rights; and refrain from the overextension of commitments and resources that eventually destroys all empires.
Perhaps most important of all, we can heal our moral injuries and restore hope by remembering who we are as a nation; what we stand for; and what we fight for: liberty and justice for all.
This essay adapts findings from Michael Hudson’s Super Imperialism: The Economic Strategy of American Empire (Islet, 2021).
I started this blog to share some of my thoughts about value. The transformations we are seeing in the world around us—in systems of government, geopolitical alignments, economic institutions and received wisdom—make now a particularly appropriate time to be asking the question, “What is value?”
It turns out that answering this question is not straightforward. We all have a sense that we know what value means, but we are often surprised by the ways it manifests in everyday life. For example, why do people sometimes seem to act against their own self-interest? Who determines whether something is valuable, and how? What is the nature of the relationship between growth and debt? Ownership and wealth? And how does all of this relate to qualities of character, like trust and integrity?
These are questions that interest me both as a social scientist and a sales executive. Hopefully, they will also be interesting to those in other disciplines—economics, finance, government, computer science, the arts, and the natural sciences. I welcome a dialogue with expert voices across disciplines. Ultimately, truth and value are inextricably connected. Anyone seriously in pursuit of truth is also in pursuit of value.
Every era of major transformation is a time of both crisis and opportunity. How we respond to it determines the trajectory the transformation takes. I believe that humanity can solve some of the most intractable problems—poverty, social inequality, opportunity dead zones, sterile institutions, climate crisis, information warfare—if we put our heads together. Choosing a path forward requires that we make trade-off decisions about what we choose to prioritize and when. The methods we use to achieve the goals we have prioritized will be contested; they will not align neatly with the simple answers offered by ideological teams and tribes.
But conflict isn’t the problem; it’s actually one of the ways humans problem-solve in groups. Anger isn’t a problem either; healthy anger is an expression of personal investment and care. Conflict and anger only become toxic if they calcify into rigidity, cruelty, and lack of imagination. I think we can preclude that by keeping our focus on the goals we are trying to achieve together. The goal of this blog is just a first, small step: let’s think together.
We're living in strange times. Political polarization not only in the US, but in many countries around the world, is at generational highs. Charismatic political leaders and parties exploit social divisions to heighten hatred and mutual suspicion within their populations, propelling themselves to electoral victories.
Since the end of 2019, a global pandemic has swept the globe, killing over a million people and infecting over 40 million as of this writing. The novel coronavirus (SARS-CoV-2, which causes COVID-19) affects the lungs, kidneys, heart, and brain, and may produce a severe illness that often takes much longer to recover from than most flu viruses. Mortality rates for COVID-19 are still being estimated, but scientists at the Johns Hopkins School of Medicine suggest it may be ten times deadlier than most strains of the flu. Reinfection after recovery has been documented, raising doubts about lasting immunity–and about the prospects for herd immunity. The disease has resulted in harrowing, lifelong medical complications for many survivors.
The COVID-19 pandemic has pushed the US economy into a historically unprecedented state. Like a dissociating patient, its real economy and financial economy have split into two completely distinct realities. The US real economy (representing profits, growth, and job creation) contracted at a never-before-recorded annualized rate of 32.9% in Q2 2020. Q2 2020 beat the previous quarterly record for fastest drop in US GDP growth—Q3 1893, which saw a crippling depression with -8.4% growth following a run on the banks. By contrast, the US financial economy—specifically, the equities market—has been skyrocketing to all-time-highs. (The trajectory of the credit market has been much more mixed.)
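A note on reading the 32.9% figure: the BEA headlines quarterly GDP changes at a seasonally adjusted annual rate, meaning the single quarter's change is compounded over four quarters. The conversion is (1 + q)⁴ − 1, sketched below with the roughly −9.5% quarter-over-quarter contraction that produces the headline number:

```python
# Convert a single-quarter GDP change to the annualized rate used in headlines.
def annualize(quarterly_change: float) -> float:
    """Compound one quarter's growth rate over four quarters."""
    return (1 + quarterly_change) ** 4 - 1

# A ~ -9.5% quarter-over-quarter contraction annualizes to about -32.9%:
print(f"{annualize(-0.095):.1%}")   # -32.9%
```

The annualized convention answers "what if this pace continued for a full year?"; the economy did not actually shrink by a third in those three months, though the quarterly drop was still without precedent in the postwar record.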
It is not the pattern of a split between the real and financial economy that is unprecedented. Stock prices rose in seven of the past twelve recessions going back to the Second World War, and the 2008 crisis was no exception. During an economic contraction, money often flees to a conservative position of fiat currency, bonds, and safe equities. In addition, government stimulus money given to corporations is rarely used to keep people employed; instead, temporary furloughs become permanent layoffs, which boost earnings results—raising stock values in turn. Moreover, the particular nature of the COVID-19 crisis has rapidly accelerated a shift to digitization of both consumer and enterprise services and business processes, boosting technology stocks to all-time highs. Indeed, almost all of the gains in the equities market came from the technology sector, whose top companies are seeing record earnings and their highest valuations in history.
What is unprecedented is the magnitude of the split between the contraction of the real economy and growth in the stock market. It reflects the historically high gap between the highest and lowest earners and the fundamentally different ways in which these groups deploy capital. Income inequality in the US is now at the same levels it was just before the Great Depression. Since 1989, the total US wealth controlled by the bottom 50% has been cut nearly in half (from 3.6% to 1.9%). Between 50% and 75% of Americans now live paycheck to paycheck, meaning they rely upon ongoing income from their own labor to make ends meet. About 30% of Americans have no savings at all, and 40% of US adults don’t have a spare $400 in cash, savings, or credit they can quickly pay off. The bottom 90% of Americans own most of their wealth in their homes and are burdened by over 75% of the country’s private debt.
In other words, despite the longest economic expansion in US history, record low unemployment (3.5% by the end of 2019), and wage increases (including minimum wage increases in more than 20 states), most Americans are leading lives characterized by deep economic insecurity. Wage increases at the bottom haven’t come anywhere close to the increases in income and wealth occurring at the top. When adjusted for inflation, average wages for most Americans have barely increased since 1974. This is despite the growing productivity of the American worker: net productivity has increased by over 100% since 1979. While the Fair Labor Standards Act–whose 40-hour workweek provision took full effect in 1940–limited the standard workweek to 40 hours, many Americans across the economic spectrum work significantly more.
Some have argued that this comparison is unfair, because it excludes benefits received by workers and doesn’t take into account the great advances in product quality and pricing that have improved quality of life for everyone. It is undoubtedly true that advances in connectivity and consumer technology have empowered ordinary people and made their lives easier, unlike anything before. Yet such an analysis ignores the fact that the costs of housing, healthcare, and education, the major expenses necessary for lives of productive citizenship, have spiraled upward in recent decades, becoming less affordable than ever, even for families with means. The median home sale price has increased 39% since 1974, healthcare costs have increased by $9,000 per person since 1970, and the cost of education has more than doubled since 1971.
In other words, employer benefits buy less now than they did in the 1970s, even if employers are spending more on them. And while many in the poor and middle classes may enjoy the benefits of smartphones, advanced computing, and near-ubiquitous web connectivity, this doesn’t resolve the economic precarity that keeps them living paycheck to paycheck in tenuous jobs that increasingly lack the benefits offered to full-time employees.
By contrast with the bottom 90%, the share of total private national wealth owned by the top 10% of US households has risen by about 10 percentage points since 1989, reaching 69% of all US wealth in 2020. The top 1% of US households now own 15 times more wealth than the bottom 50% combined. They also own more than half of the national wealth invested in stocks and mutual funds, but only about 5% of private debt in the country. In other words, those at the top of the economic pyramid are virtually unexposed to contractions in the real economy, or to price increases in education, healthcare, and real estate. So long as they steward their asset portfolios well, they can mostly continue making money no matter what happens. It is no surprise that a 2019 Georgetown study found that wealth, not ability, is the biggest predictor of future success in America.
This means there is a profound division between how the top 10% (and especially the top 1%) experience an economic contraction and how everyone else does. And an economic contraction during a pandemic is particularly harsh. Unemployment rates during the COVID-19 crisis have exceeded those during the 2008 Great Recession, reaching levels that last prevailed during the 1929 Great Depression. Nearly 50 million Americans have filed for unemployment at some point since March 2020. About 40% of households earning less than $40,000 per year were laid off or furloughed by early April. Although many of these people have found employment since their initial filing, these jobs are often temporary and precarious.
The vast majority of this group was likely part of the 50%-75% of Americans living paycheck to paycheck before the pandemic. This has resulted in a spike in food insecurity. Before the pandemic, about 10% of US households were food insecure, but the COVID-19 crisis more than doubled this number to 23% of US households. Among households with children, the rate rose to nearly one in three (29.5%). Enrollment in the expanded Supplemental Nutrition Assistance Program (SNAP) grew by 17% from February to May, but it was not sufficient to meet the growing need.
Similarly, when COVID-19 struck, nearly half of all renter households in the US were already rental-cost burdened, meaning they paid over 30% of their income in rent. A quarter of renters, including over half of those below the poverty line, paid more than half of their income in rent. Already facing rent insecurity, up to 40 million Americans are now at risk of eviction. Of those rent-insecure Americans, approximately 11 million were behind on rent as of September 2020, one out of every six adult renters. That number does not include the approximately 10 million homeowners (10% of the homeowning population) who are behind on their mortgage payments.
Finally, in 2019, before the pandemic struck, 30 million Americans already had no health insurance, and the number of uninsured had been rising steadily since 2016 (the reasons for this increase are still disputed). Because most Americans rely on their employer for insurance coverage, an additional 12 million had lost coverage by late August 2020 when they lost their jobs to COVID-19. Their options are grim: to continue their previous employer’s coverage under COBRA, they must pay both the monthly premium they paid while employed and their former employer’s contribution, which is often prohibitively expensive. Alternatively, they may qualify for a plan under the Affordable Care Act, though its premiums may be no cheaper than COBRA. Finally, Medicaid may be an option, but only if they meet their state’s eligibility requirements; collecting unemployment benefits, for example, can render some people ineligible for Medicaid, leaving Americans in a double bind: do I seek unemployment benefits or health insurance?
If it seems particularly harsh to lose healthcare coverage during a global pandemic, that is because it is. The number of Americans regularly facing the decision of whether to buy life-saving medications, like insulin, or groceries was already troubling pre-pandemic, but now has increased dramatically. And while COVID tests at certain locations are now free, they can still cost money at “non-public sites” like emergency rooms—and patients often aren’t told the difference. Independent nonprofit FAIR Health estimates that uninsured patients, or insured patients who receive care deemed out-of-network by their insurance company, can expect to pay anywhere from $42,486 to $74,310 if they are hospitalized with COVID-19. Those who are insured don’t fare much better; insurance just cuts their costs about in half ($21,936 to $38,755). Some patients do pay less than this, but many patients pay much more, especially if they have additional health complications. The fragmented, complex nature of the US healthcare system means that patients often can’t know what their treatment will cost until they receive a bill.
The numbers above paint a striking picture of two Americas, divided by class (which, as we will see in future posts, is also heavily overlapped by race), being asked to choose who will represent them politically during a time of economic precarity and upheaval. But while the coronavirus pandemic has undoubtedly exacerbated existing social and economic divisions within the US, it has also shed light on the instability underlying the US financial system. This has led to profound concerns that any steps taken to ease the devastation of the COVID crisis could hasten the collapse of the system itself.
The instruments typically employed by governments to manage economic crises—fiscal and monetary policy—are proving their limitations in real time. On the fiscal policy side, it is clear that federal government stimulus—via direct payments and loans made to state governments, businesses, and individuals—staved off the very worst of the potential human suffering for at least some of the 90%. It also significantly boosted both credit and equities markets, as well as consumer confidence. However, as shown above, it has been insufficient to remedy the nationwide humanitarian crisis, and the government will likely soon pass another stimulus package. Much like the decade following 2008, it is probable that the US government will continue to deficit spend at high levels, for a long period of time, to stimulate the economy.
On the monetary policy side, the Federal Reserve has deployed a tool honed during the 2008 crisis: quantitative easing (QE). By employing unlimited QE (sometimes colloquially called “money printing”), the Fed has reinvigorated bank liquidity, propped up failing corporations, and promoted consumer spending. This may achieve some of the government’s economic objectives, but at the cost of the overall creditworthiness of the United States: each round of deficit spending monetized through QE significantly increases the US debt-to-GDP ratio, which signals to credit and capital markets how much annual output (GDP) the United States generates relative to the debts it must service. Between Q4 2007 and Q4 2012, the US debt-to-GDP ratio ballooned from 63% to 100% as a result of the QE deployed to recover from the 2008 economic crisis. As of Q1 2020, the ratio stood at 107%; as of this writing (October 2020), it is over 130%.
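The ratio tracked above is simple arithmetic: total sovereign debt divided by one year's output. A minimal sketch, using illustrative round figures (the debt and GDP numbers below are assumptions chosen for the example, not official statistics):

```python
def debt_to_gdp(total_debt: float, annual_gdp: float) -> float:
    """Debt-to-GDP ratio: outstanding sovereign debt divided by one year's output."""
    return total_debt / annual_gdp

# Illustrative round figures in trillions of USD, roughly the 2020 scale;
# assumptions for the example, not official statistics.
ratio = debt_to_gdp(total_debt=27.0, annual_gdp=21.0)
print(f"Debt-to-GDP: {ratio:.0%}")  # → Debt-to-GDP: 129%
```

The same division underlies the 63%, 100%, 107%, and 130% figures cited in the text; only the numerator and denominator change from quarter to quarter.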
In the past century, 51 out of 52 countries that reached a 130% sovereign debt-to-GDP ratio have defaulted. (Japan is the only counterexample; we shall see why in a moment.) Default is the situation in which a country’s treasury can no longer service its debt. If the US were to default, the consequences would be catastrophic not just for the US but for the world. Hundreds of billions of dollars of foreign investment would likely be pulled from the US immediately, crippling the economy. Many countries hold significant amounts of US Treasury reserves and would be unable to collect their value, losing assets that amount to significant percentages of their GDP. International confidence in the dollar would crumble, and the US would no longer be able to borrow on favorable terms (if at all). US assets might be seized around the world. US Treasury rates would spike, and since US mortgage and student loan rates are tied to Treasury rates, most Americans would see interest rates on their private debt shoot upward. This would trigger a wave of private defaults even more significant than what we have already seen in the housing and student loan crises to date. A global depression would likely ensue.
Despite its astronomical debt, however, a default of the United States is impossible. Quite simply, all of the United States’ debt is denominated in dollars, its own sovereign fiat currency. That also means that the United States is the only country or entity in the world that can issue the currency used to repay its own debt. The Federal Reserve can immediately generate all the dollars needed to repay its debt by just creating the money on its digital ledger. This is why Valéry Giscard d'Estaing, French Minister of Finance during the 1960s, famously excoriated the “exorbitant privilege” of the dollar.
Japan is in a similar position to the US; all of its sovereign debt is denominated in its own currency, the yen. In addition, 50% of the debt is held by the Bank of Japan, the country’s central bank, and another 40% is held by Japanese investors. In essence, Japan primarily owes itself. Thus, by keeping interest rates very low, Japan can continue servicing its debt relatively easily. This has allowed Japan’s debt-to-GDP ratio to balloon to over 250% with relatively few negative repercussions for the economy—at least in the near term.
The UK and China also have domestic debt denominated in their own currencies. A default of these countries is therefore also extremely unlikely.
However, default is not the only reason to avoid debt. The main problem with endlessly printing money to pay down debt is that, over time, it devalues the money itself. Conventional wisdom holds that this is because it hyperinflates the currency, but in fact, currency devaluation and hyperinflation are two different things. As investor and strategist Lyn Alden shows, currency devaluation has only been inflationary in the US when federal and private debt is very high as a percentage of GDP and Congress begins deficit spending that gets monetized by the Fed (rather than funded by private lenders). In other words, exactly what is happening now. The resulting devaluation of the currency also devalues everyone's debt, which, in turn, prompts lending and spending. Once this happens, however, only a modest increase in monetary velocity is needed to bring about high inflation, which further decreases purchasing power.
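The link between monetary velocity and inflation described above is conventionally summarized by the quantity-theory identity MV = PQ (money supply times velocity equals price level times real output). A minimal sketch, with all figures purely illustrative, of how even a modest pickup in velocity lifts the price level once the money supply has already grown:

```python
def price_level(money_supply: float, velocity: float, real_output: float) -> float:
    """Quantity-theory identity MV = PQ, solved for the price level P."""
    return money_supply * velocity / real_output

# Illustrative figures: a swollen money supply sitting at depressed velocity.
m, q = 20.0, 20.0
p_now = price_level(m, velocity=1.1, real_output=q)
p_later = price_level(m, velocity=1.3, real_output=q)  # modest pickup in velocity
print(f"Implied inflation: {p_later / p_now - 1:.1%}")  # → Implied inflation: 18.2%
```

Holding output constant, the price level moves in direct proportion to velocity, which is why the text notes that "only a modest increase in monetary velocity" can produce high inflation.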
Alden demonstrates how the US dollar has been systematically devalued over the past century as part of debt “supercycles”. This means that savers and investors with a large amount of currency exposure have lost significant purchasing power, even if they've seen returns in nominal terms. In other words, simply “adjusting for inflation” isn’t enough to understand how much money people are really making over time; the value of the dollar has dropped to the extent that even people seeing returns on investment higher than the rate of inflation over the past several decades have had significant amounts of their wealth destroyed. The statistic cited earlier—that the average, inflation-adjusted wage of most Americans has barely budged since 1974—doesn’t take into account currency devaluation. As a result, we can infer that the average American is actually making less money than they were in 1974.
The only way to beat the dual forces of inflation and currency devaluation and actually grow one’s wealth over time is to invest in assets that appreciate in value so fast that they break out of the combined gravitational pull of inflation and currency devaluation. These tend to be equities, but even such rapidly-growing equities are rare. Nevertheless, the few equities that do grow reliably fast over a long enough period of time can generate net new wealth for entire societies, if enough capital is leveraged to them. And entrepreneurs who build new companies generating new value can create significant personal wealth for themselves, their employees, and for other shareholders via well-timed liquidity events.
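The "gravitational pull" argument above is compound arithmetic: an asset whose nominal return merely matches inflation leaves purchasing power flat, while one that out-compounds it grows real wealth. A minimal sketch with illustrative rates (all numbers are assumptions for the example):

```python
def real_value(principal: float, nominal_rate: float, inflation: float, years: int) -> float:
    """Purchasing power after compounding a nominal return and deflating by inflation."""
    return principal * ((1 + nominal_rate) / (1 + inflation)) ** years

# Illustrative rates over 30 years: matching inflation treads water;
# out-compounding it is what actually grows real wealth.
treading = real_value(100, nominal_rate=0.03, inflation=0.03, years=30)
growing = real_value(100, nominal_rate=0.07, inflation=0.03, years=30)
print(f"Real value: {treading:.0f} vs {growing:.0f}")
```

Note that this sketch deflates only by measured inflation; on the essay's argument, broader currency devaluation would deflate the nominal figure further still.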
Recently, Bitcoin has been showing similar patterns. As the world’s first truly decentralized, non-sovereign currency, Bitcoin unsettles central banks because its fixed, deflationary monetary policy is beyond their control. There will only ever be 21 million Bitcoin, each of which is subdivided into 100 million satoshis. This digital scarcity ensures that it cannot simply be “printed” to pay down debts. In this way, it is more akin to gold than to most currencies, but it also has other features of currencies, like fungibility and portability, that make it useful as a medium of exchange. It is therefore a new kind of money. Bitcoin’s relative novelty in the marketplace has shown us a real-time process of price discovery reflecting a dawning comprehension of its value. This value, expressed in price, has skyrocketed along a parabolic trajectory since it was launched in 2009.
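The 21-million cap mentioned above is not a decree but the sum of a geometric issuance schedule: the block subsidy started at 50 BTC and halves (in integer satoshis) every 210,000 blocks until it reaches zero. A sketch of that arithmetic:

```python
def total_bitcoin_supply() -> float:
    """Sum every block subsidy: 50 BTC initially, halving every 210,000 blocks."""
    SATS_PER_BTC = 100_000_000
    subsidy = 50 * SATS_PER_BTC  # subsidy per block, in satoshis
    total = 0
    while subsidy > 0:
        total += 210_000 * subsidy
        subsidy //= 2  # integer halving, as in the Bitcoin protocol
    return total / SATS_PER_BTC

print(f"{total_bitcoin_supply():,.4f} BTC")  # lands just under the 21 million cap
```

Because the subsidy is halved in whole satoshis, the series terminates, and the total comes out slightly below 21 million BTC rather than exactly at it.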
The US is not the only country where the value of currency has been systematically depreciated. Brazil, China, Japan, the UK, and the Eurozone are all major economies that have dangerous levels of debt and monetary policies that allow for the unlimited creation of money to pay down that debt. This is why the launch of Bitcoin as a global store of value is poised to offer a stable foundation during a time of what may be wild fluctuations in currency values. At the very least, it offers a vehicle for growing and protecting wealth in an era when central banks are growing their balance sheets to eye-watering levels.
The situation described above—political extremism and polarization; a pandemic; declining standards of living; rising wealth inequality; dangerous levels of national debt; and a two-party political system inherited from the mid-20th century—is what ordinary Americans are facing as they move into the presidential election on November 3rd, 2020. They are in a kind of Scylla and Charybdis situation, forced to choose every few years between what they perceive as the "lesser of two evils" to stave off what they experience viscerally as accelerating decline. This routine political cage match has become a major distraction from the work of actually looking at the evidence of what is going on and reasoning about what to do next. People often feel more passionately about whichever "lesser of two evils" they have chosen than they do about positive alternatives to the choice itself. This tribalism, in turn, creates a reactive mindset that is easily captured by various flavors of paranoia and propaganda.
What is our way out of this impasse? I am beginning this blog because I believe that the first step is better understanding what we mean by stores of value, credit, and, of course, debt. We need to take a closer look at these categories because they help us describe the circulation of value in a society and give us insight into how engines of wealth generation may be preserved regardless of the fate of any particular store of value, indebted institution, or defaulting debtor. Macroeconomics has focused on describing the relationships between these categories at scale and then deriving techniques for managing them. But in this blog, I want to take a step back and ask how they are related. This is a sociological question that sheds light on the why behind the how.
The first step to solving seemingly intractable social problems—including an economic crisis or crisis of governance—is to think. This is more difficult than it seems. Contemporary political discourse would have us believe that all the cards are already on the table; that there are well-established “teams” and authorities with points of view that are either right or wrong, and all we need to do is choose which team and authority we will align ourselves with and then ensure they get and keep power.
I aim to show that, rather than already understanding the terms of the debate, the very terrain that generates those terms is shifting under our feet. We get to make this path as we walk it—and we could go in many different directions.
So let’s be vigilant, let’s be courageous, let’s be optimistic—and let’s walk it together.