Today is Black Friday, the day when most citizens of the United States, after a day of gorging on food and throwing much of it into the garbage, get into their automobiles and swarm to department stores to buy things they don’t need, justifying it because the goods are supposedly deeply discounted, marking the beginning of our annual season of mass consumer mania. One 2019 study estimated that 80% of these goods will end up in landfills.
Anticipating this perennial convulsion of consumer behavior, Netflix recently released a documentary called Buy Now: The Shopping Conspiracy. Even from my vantage point as an individual inspired by Duane Elgin and Arnold Mitchell’s ideas on Voluntary Simplicity, I was shocked by how effectively creepy and terrifying this piece of anti-consumer propaganda was. Buy Now isn’t just a sensationalist exposé of modern-day sales tactics; it examines our addiction to buying stuff and cranking out products. Picture CGI scenes of cities drowning in heaps of trash, but what really hits home is the real footage of a beach in Ghana buried under a flood of thrown-away clothes. The film paints a stark picture of how our consumption habits are wreaking havoc and how the rampant tactics of greenwashing and recycling are an illusion designed to make “eco-conscious” consumers feel less guilty about continuing to buy more.
I probably first became aware of this phenomenon in 1989, when the MTV cable network broadcast the documentary film Woodstock, about the landmark 1969 music festival of the same name. As an aspiring musician, I dutifully recorded the film on my hand-me-down Sony Betamax video cassette recorder and watched it repeatedly as I grew into young adulthood. Over the years I’ve come to realize that film’s profound impact on my worldview, and likely on many others in my generation. From my perspective growing up in Reagan’s yuppie suburban America, surrounded by conspicuous aspirational wealth and material abundance, the film depicted the Baby Boom generation as a pseudo-tribal culture, partially clothed in strange attire and participating in rituals of dancing, free love and extensive drug use. That may have been the first time I became aware of an interest in anthropology and the many contradictions in our society.
I had many questions that I’ve grappled with since. I knew the film was a snapshot of a very important cultural event, from around the same time that humans landed on the moon, and that it captured just a sample of one generation’s hippie subculture. I later turned to Durkheim’s theory of social anomie to make sense of how the same culture that produced such technological achievements as space travel, computers, automobiles, telephony and microwave ovens, a culture whose arc of evolution was supposed to point to a future like that depicted in the science fiction of Gene Roddenberry’s Star Trek, could coexist with a “counterculture” engaged in what seemed like spasms of primitive tribal behavior, replete with a hodgepodge of superstitions like vibes, auras, astrology, astral projection and other “experiments in truth” via a range of psychoactive substances.
In 1989, the culture depicted in that documentary was nowhere to be found in my daily life, save for a few winks and nods from older people who wore tie-dyed t-shirts on weekends, or in the Deadhead subculture that followed the rock band The Grateful Dead around the country as they endlessly toured. 1969 saw not only the Apollo 11 mission but also the introduction of the Boeing 747, the Concorde supersonic passenger plane and the first messages on ARPANET, the network that became the Internet. Following that high point of western civilization, the seventies were an era of introspection and stagnation. The eighties were a time of superficial luxury, ostentation and excess. The nineties declared the “end of history” and enjoyed a victory lap in the triumph of individualism over collectivism in the great global battle of economic ideas. 1969 was pregnant with the duality of the future, what Alvin Toffler described in his 1970 book Future Shock: a psychological state of individuals and entire societies, a personal perception of “too much change in too short a period of time.” Ironically, after the book was published, the speed of technological progress slowed considerably.
This technological stagnation seems to have been the outcome of a society that was either too distracted or somehow rendered incapable of sustaining the rate of progress of its forebears. This has puzzled many in my generation, from Objectivist libertarian scions like Elon Musk and Peter Thiel to leftist intellectuals like the late David Graeber, who was closely involved in the anti-globalization WTO protests of the early 2000s and the Occupy movements following the 2008 financial crisis. These three might occupy a Venn diagram of Generation X ideology: despite representing very different traditions, they intersect in interesting ways and have all posed some variation of the question, “why were we promised flying cars but instead got technological stagnation and the world’s greatest distraction system called the Internet?” Graeber and Thiel even participated in a lively debate on the subject in 2014.
Bowling With Nixon
Economist Yanis Varoufakis offers a different explanation in his recent book, Technofeudalism. On August 15, 1971, US President Richard Nixon, faced with rising inflation and the reality that the United States economy had shifted from a net exporter to a net importer, unilaterally suspended the direct international convertibility of the United States dollar to gold, a fundamental pillar of the Bretton Woods system. The fiat dollar became the backbone of the global economy and sent shockwaves through it. Central banks spent the subsequent decades gaming their sovereign currencies against the dollar.
The “dark deal” (as Varoufakis describes it) that the United States made with export economies like Germany, Japan, South Korea and later China was this: the U.S. economy would continue to absorb the massive volume of exported goods through debt spending, and in return those countries’ governments, central banks and capital interests could freely invest and store their wealth on the New York Stock Exchange, as well as in non-strategic real estate and capital assets on the U.S. mainland. This explains why the US economy over the past 55 years has seen a continual transformation from manufacturing to financial, health care, real estate and technology services. Until recently the only sectors that were significantly protected were energy, agriculture, automotive and defense technology. Intellectual property was actively licensed globally or stolen outright by competitors. The U.S. financial sector has seen a mind-boggling expansion of derivative products related to equities, commodities and real estate as the dollars poured in from around the world.
In Varoufakis’ view, this came at the cost of massive wealth inequality, stagnant real wages for workers, massive debt spending, never-ending stimulus and, ultimately, the collapse of the system in 2008, which has since been precariously propped up by the U.S. government and central banking system. Today Wall Street markets continue to balloon as the world’s investment playground while a large portion of the population still has little to no savings and lives paycheck to paycheck.
Varoufakis served as the Finance Minister of Greece in 2015, at the height of the Greek debt crisis that followed the 2008 financial meltdown, where he negotiated with the European Union and the International Monetary Fund over bailouts and economic reforms. He is considered a controversial left-leaning economist who is critical of globalization, but economists of all stripes take him seriously. Globalization has of course sparked diverse reactions throughout history, stemming from philosophical differences regarding its costs and benefits. Supporters argue that economic growth, expansion, and development are essential for human society’s well-being, viewing globalization as a necessary process. Critics counter that these processes can harm social well-being, both globally and locally, pointing to issues such as sustainability, social inequality, and the negative impacts of colonialism, imperialism, cultural assimilation, and appropriation.
The final scenes of the Woodstock film still haunt me. My teenage hero Jimi Hendrix playing a chaotic, electrified rendition of the Star-Spangled Banner at around 9:30 am on August 18, 1969, looking out on miles of New York farmland littered with piles of trash, discarded bottles, cups, coolers, blankets, plates, chairs, clothes and excrement left behind by the idealistic youth who ran out of food, water, alcohol, drugs and good vibes and retreated post haste back to the comforts of their modern air-conditioned homes, frozen foods and television sets. It is those final reels of film that impressed upon me an uneasy sense that their revolution and its vestiges were ideologically incoherent: a sideshow of individualism that would meander through a hapless age of New Age individualism in the 70s and eventually mature politically into the insipid centrist neoliberal policies of Bill Clinton, Tony Blair, G.W. Bush, Barack Obama and Donald Trump, the baby boomers who were handed the keys to the economic machinery with a shared mission to sustain “growth” and aggregate demand at all costs, regardless of externalities.
A few months after the Woodstock festival, psychologist Walter Mischel conducted a fascinating experiment at Stanford University that sought to understand delayed gratification. Subjects aged around four to five years were presented with a single marshmallow sitting on a table. One by one, the subjects were invited into a room and given a choice: they could have one marshmallow immediately, or they could wait a while and receive two marshmallows later. The researcher explained the rules, then left the subject alone with the marshmallow for about 15 minutes, during which the subject had to battle their impulses. When the researcher finally returned, if the subject had resisted the temptation and hadn’t eaten the marshmallow, they were rewarded with the second one. This simple test revealed a lot about self-control and patience, insights that would inform psychology for years to come. The widely publicized study remained noteworthy for decades because multiple follow-up studies continued to find associations between impulse control in childhood and later outcomes such as behavioral problems, test scores and adult success. Follow-up studies using brain scans claimed there were observable differences in the prefrontal cortex (more active in high delayers) and the ventral striatum (more active in low delayers). Another study compared Japanese and American subjects, finding cultural differences depending on whether the context was a wrapped gift or a marshmallow, because social norms involving gifts differ between the two cultures.
One of the recurring criticisms of democratic societies is the problem of the “adequately informed citizen,” which may be used to justify limiting participation in democratic processes. For example, it was the view of the “founding fathers” of the United States that only male landowners should be allowed voting rights. Following the Civil War, it was the view of southern states that the portion of the population that was illiterate, and thus misinformed, should be excluded via literacy tests. Both lines of reasoning were abused to marginalize populations until the progressive reforms of the 20th century. One might be tempted to think, from such elitist logic, that populations conditioned by manipulative advertising, easy access to credit, rampant misinformation and political propaganda are less likely to delay gratification in their personal financial discipline, and thus less likely to vote in the country’s best long-term economic interests. That reasoning leads one down a dangerous slippery slope toward justifying various forms of censorship, or authoritarian policies such as China’s current “people’s democratic dictatorship,” which claims that western-style democracies are too unstable to pursue long-term development targets and programs because of their focus on election campaigns and the frequent changes of government required to please a fickle electorate unwilling to make sacrifices for the common good. Various forms of such authoritarian thinking have been popular throughout history and have recently gained disturbing traction on both the left and right of the political spectrum in the United States.
This brings us to what I hope is an optimistic thesis: that humanity is slowly evolving its economic models toward a material reality in which we do much more with less. This will be a process of technological and economic optimization that results in less material inequality, not only because it’s fair but because it will prove to be more efficient. It’s easy to be pessimistic, to think that we’re nearing the end of a golden era of rapid technological progress, but what if it’s only the beginning? I’ve accepted that this will probably not come to pass within the remainder of my short lifetime, and I doubt it will evolve via any existing trajectory of national policy, because human civilization has only recently passed the point where optimization rendered material scarcity optional. Our material technology has evolved so rapidly over the past century that our social technology hasn’t caught up yet. We have one foot in the technology of the 21st century and the other in the sociopolitical traditions of the 18th century. Perhaps economic inequality is nothing more than the long half-life of mercantilism and colonialism, as observed by 2024 Nobel laureate economists Daron Acemoglu and James A. Robinson. The question is not whether we will eventually adapt and reach phases of equilibrium. We will. The question is how much destruction will be required along the way.
I no longer find the conceptual heuristics of modern political language particularly useful. The various “isms” are either outdated, politicized, co-opted, misunderstood or reductive and too often elevated as dogma. After all, appeal to tradition is simultaneously necessary for social institutions and a type of logical fallacy. It’s easy to take for granted that democracy is a proven solution to that dilemma. All political economic systems are ongoing policy experiments that either improve the quality of life of a society or not. Utopian stories are inspirational fictions — shining cities on an Arcadian hill, or magical castles in the clouds, but in reality, we’ve always just been making it up as we go along.
Who’s Keeping Score?
In modern society, we’re always talking about debt. Personal debt, credit card debt, student loan debt, debt spending, national debt, state debt, debt debt debt. Get comfortable with it, because debt is practically synonymous with human society. In fact, despite the nonsense we’re often taught in school about “barter societies,” there is no evidence, historical or contemporary, of a society that didn’t operate along the principles of gifts and debts. Money and currency are just technologies for portability and keeping score. All debt liabilities are simultaneously assets in a ledger somewhere, and those ledgers are only as resilient as the social institutions that create and enforce them. We see massive quantities of money magically appear and disappear every year, to and from the human collective imagination. Clinging to a material foundation of rare metals, or to a computerized abstraction of scarcity, may influence collective beliefs about supply elasticity, but it’s still an abstraction that is subject to the exact same mathematical manipulation the moment it is converted to fractional notation, and it always will be. Evidence of fractional notation dates back thousands of years, appearing in ancient civilizations like Babylonia, China and India, and probably long before. A mountain of gold is a useless heap of metal if nobody wants it. A GUID in a blockchain is just an extremely expensive and inefficient way of maintaining a financial ledger, a practice we find evidence of in Sumerian cultures thousands of years ago. If there’s widespread institutional collapse, will we have adequate computer systems to verify the blockchain? And if we do, without social stability to protect markets from plunder, what will it be good for? Critics argue its value to society thus far has mostly been rampant speculation, various Ponzi schemes and financial transactions that bypass the laws, rules and norms of the institutions that hold the whole game together.
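To make the scorekeeping point concrete, here is a minimal sketch of a double-entry ledger. It is my own toy illustration, not anything from Graeber or Varoufakis, and the names and amounts are hypothetical; it simply shows that every liability recorded against one party is, at the same moment, an asset for another, so the system as a whole always nets to zero.

```python
from collections import defaultdict

class Ledger:
    """A toy double-entry ledger: every debt is simultaneously someone else's asset."""

    def __init__(self):
        # positive balance = net creditor, negative balance = net debtor
        self.balances = defaultdict(float)

    def record_debt(self, debtor: str, creditor: str, amount: float) -> None:
        """Record that `debtor` owes `creditor` `amount`."""
        self.balances[debtor] -= amount    # a liability for the debtor...
        self.balances[creditor] += amount  # ...is an asset for the creditor

    def net(self) -> float:
        """The whole system always sums to zero: debts and assets cancel."""
        return sum(self.balances.values())


ledger = Ledger()
ledger.record_debt("household", "bank", 10_000)        # e.g. a mortgage
ledger.record_debt("government", "bondholder", 5_000)  # e.g. a treasury bond
print(dict(ledger.balances))  # who owes whom
print(ledger.net())           # 0.0 -- the score always balances
```

Whether the entries live on a clay tablet, in a bank database or on a blockchain, the structure is the same; what differs is which institution everyone trusts to enforce it.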
Inclusive Institutions
As previously mentioned, Daron Acemoglu and James A. Robinson received the 2024 Nobel Prize in Economics for their contributions to the comparative study of prosperity between nations. They pondered why some nation states with obvious advantages in natural resources and access to vital trade routes remain poor, while other countries with extreme resource constraints and no obvious geographic advantages are extremely rich. After spending years poring over the world’s economic data, they came to a rather simple conclusion: the decisive role in the development of countries is played by strong inclusive institutions, a set of formal and informal rules and mechanisms for coercing individuals to comply with social norms. Inclusive economic institutions protect the property rights of wide sections of society, do not allow unjustified alienation of property, and allow all citizens to participate in economic relations in order to make a profit. The opposite of inclusive institutions are various forms of highly unequal and stratified extractive regimes that limit the rights of citizens and extract resources and labor for the benefit of the ruling elite.
Monopoly money is real when everyone is playing the game, and all games have beginnings and ends. Extrapolate that into the world of institutions, enforcement and the ubiquitous, subtle, implicit threat of negative outcomes, and we understand the foundations of modern economics. Human societies both great and small have commanded the allocation of resources for hundreds of thousands of years, since long before we invented the modern variants of state industrial capitalism (Soviet, Keynesian and neoliberal) that the world population’s limited imagination currently agrees are the tool set to draw upon. While I won’t venture to guess what we’ll come up with, there’s no reason to assume that what we’ve tried so far are the only ways to organize the human societies of the future.
Imagine for a moment the potential of a technological innovation like nuclear fusion. It would dramatically change the dynamics of society over a hundred, fifty or even twenty-five years. Yes, resources are always going to be limited, but they are still abundant. When we solve the energy equation, it won’t necessarily change everything, but it will change quite a lot. It won’t eliminate humanity’s social-psychological need for hierarchy, but there would be much less of an excuse for artificial material scarcity and inequality.
The immediate challenge is how to optimize and reduce what economists call negative externalities. That’s all the crap we left at Woodstock that Jimi Hendrix had to look at as he played our national anthem on his guitar. That’s all the crap we leave in our wake while we consume on Black Friday, because we are consumers who must consume. That’s the sewage from our industrial plants that we drink in our water supply. That’s the petrochemical plastics and chemical compounds in practically every bite of food that the animal kingdom, humans included, puts into its body every day. That’s the carbon and methane that spew out of all our fossil energy systems so we can heat our homes, process metals, manufacture plastics, drive our cars, charge the batteries for our electric vehicles, watch cute cat videos, mine bitcoin and interact with the fancy pattern auto-completion algorithms we currently call Artificial Intelligence.
We are consumers, so we must consume. We are what Gilles Lipovetsky calls Homo consumericus, or what Erich Fromm described as “the man whose main goal is not primarily to own things, but to consume more and more, and thus to compensate for his inner vacuity, passivity, loneliness, and anxiety.”
Or not. We have the power to choose. We can step off the hedonic treadmill any time.
The mistake we seem to make is that we often conflate our identities with what we consume. Thorstein Veblen would likely agree. We’re bombarded with a flood of scientifically optimized information that reinforces this idea, but of course, none of us actually thinks that’s true. I suspect that we all will readily admit that we’re just bored, looking for pleasure and trying to fit in.
Anthropologist David Graeber, who sadly and permanently dropped the mic on September 2, 2020, asked in his 2011 paper “Consumption” why it is that most forms of human self-expression or enjoyment are now called consumption, as if they were analogous to eating food. He traces the origins of this idea to medieval European perspectives on desire and argues that it ultimately evolved to resolve conceptual problems in possessive individualism: property, desire, and social relations.
Graeber notes that in early French and English usage, consumption in the context of material goods was consistent with the original Latin verb consumere, which implied to overwhelm, waste, destroy or use up something, like fire. By the 13th century, the disease we now call tuberculosis even carried the same name.
He observes the contemporary sense of the verb “first appears in the writings of economists Adam Smith and David Ricardo to refer to the opposite of production. The contemporary usage is relatively recent. If we were still talking the language of the fourteenth or even seventeenth centuries, a ‘consumer society’ would have meant a society of wastrels and destroyers.”
He posits that inherent in the founding theories of Capitalism is a relentless drive for production and growth paired with constant destruction. To make way for new products, outdated items must be discarded, destroyed, or deemed irrelevant. This cycle defines consumer society, where ephemeral goods take precedence over enduring values. However, early economic thinkers did not focus on consumption as a primary concept. Even Adam Smith, who first used “consumption” in its modern sense in The Wealth of Nations, approached the theory of desire differently in The Theory of Moral Sentiments. Smith believed that humans primarily seek the sympathetic attention of others. It was only with the expansion of economic theory into various disciplines that the notion of desire began to be equated with the desire to consume.
In scholarly discourse, the concept of “consumption” emerged in the North Atlantic during the Industrial Revolution — the notion that human activities outside of work are primarily characterized by the destruction or utilization of resources that are produced at work. There’s our ledger again. This shift is marked by a distinct impoverishment in traditional discussions on the fundamental sources of human desire and gratification, which can be contrasted with the perspectives of earlier Western philosophers. For instance, St. Augustine and Thomas Hobbes both perceived human beings as entities driven by insatiable desires, inevitably leading to perpetual competition. This view closely prefigured the assumptions of subsequent economic theories, especially the Austrian tradition that emphasized an idealized formula of rational self-interest. However, neither St. Augustine nor Hobbes focused on the modern notion of consumption; instead, they identified human desires as comprising sensual pleasures, signifiers of social status and the quest for power.
No doubt humans will continue to seek pleasure, status and power, but there’s a very important difference between the world of Augustine, Hobbes and Smith and the world we live in today. It’s unlikely that these earlier thinkers could have conceived of the material abundance that is possible in the 21st century, or that people would live healthy lives well into their 70s and 80s. The latter point matters because frivolous consumer spending usually decreases with age. Older people either achieve or accept a level of status and simultaneously realize that material possessions don’t make them happy.
Selling Happiness
By the mid-20th century, there was a concern among business leaders and economists in advanced industrial societies that we were rapidly approaching a point where human needs and desires would be largely satisfied and we’d hit a ceiling of growth where people no longer needed to buy more things. There was a terrifying prospect of market saturation and recession. The solution was twofold: convince people through advertising that they need things they don’t, the dawn of what we’d now call “lifestyle branding,” and design products to wear out, what we now call planned obsolescence. This era of manufactured desires is personified in the character of Don Draper in Matthew Weiner’s television series Mad Men, who said, “Advertising is based on one thing: Happiness. And you know what happiness is? Happiness is the smell of a new car. It’s freedom from fear. It’s a billboard on the side of the road that screams reassurance that whatever you are doing is okay. You are okay. What is happiness? It’s a moment before you need more happiness.”
This consumer propaganda machine exploits a fundamental vulnerability in human psychology known as the hedonic treadmill, or hedonic adaptation: humans tend to return to a stable baseline level of happiness or sadness despite significant life changes. As people earn more money or acquire more material possessions and status, their expectations and desires increase proportionally, leading to no lasting increase in happiness. Philip Brickman and Donald T. Campbell introduced the concept in their 1971 essay, suggesting that wealth doesn’t boost overall happiness. While it is a powerful system of psychological manipulation, many people can and do resist its pull as they age.
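As a rough illustration of the adaptation mechanism, here is a toy model of my own, not anything proposed by Brickman and Campbell: each windfall briefly lifts reported happiness above a personal baseline, and then the gain decays away, so even progressively larger windfalls leave you roughly where you started. The parameter names and values are invented for the sketch.

```python
def simulate_hedonic_adaptation(windfalls, baseline=5.0, sensitivity=0.5,
                                adaptation_rate=0.3, steps_between=10):
    """Toy model: each windfall bumps happiness, which then drifts back to baseline."""
    happiness = baseline
    history = []
    for windfall in windfalls:
        happiness += sensitivity * windfall  # the initial thrill of the purchase or raise
        for _ in range(steps_between):
            # expectations adjust, pulling happiness back toward the baseline
            happiness += adaptation_rate * (baseline - happiness)
            history.append(round(happiness, 2))
    return history

# Three progressively larger windfalls still end up near the same baseline of 5.0.
print(simulate_hedonic_adaptation([1.0, 2.0, 4.0]))
```

The treadmill lives in that inner loop: no matter how big the bump, the drift back toward baseline does most of the work.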
Planned obsolescence, however, is a problem of serious concern. One of the positive stereotypes of Americans in the twentieth century was the practical “Yankee ingenuity” and thrift born of traditional agricultural sensibilities, where a capital expense for something like a tractor, a truck, a tool, a washing machine or even the clothing we wore was a big deal. People built things themselves and made purchase decisions frugally, using money they had saved, choosing items that were made to last and could be repaired. Over time, greed led to intentionally manufacturing consumer goods designed to stop working or become obsolete in a matter of months or a few years. The common sense of the modern American is now often the opposite: we think there’s something normal about buying cheap disposable things and throwing them “away” to get the bright shiny new version. This perverse attitude was accelerated by years of cheap debt and low inflation. Now that we’re seeing price inflation, the proposition doesn’t seem as attractive, yet the addiction continues despite flat real wages. The financial sector is yet again “at risk” because consumer debt has climbed to levels financial institutions find uncomfortable: these financial “products” carry higher rates of interest on their balance sheets, and many of the debtors neither earn the discretionary income nor exhibit the spending patterns that suggest they will be able to repay. This is high-risk debt. The solution? Credit default swaps. Yes, a variant of the same financial mechanism that contributed to the 2008 collapse of the global financial system.
One of the more powerful points in the “Buy Now” documentary was that the average consumer of today is constantly reassured that there exists a magical place vaguely called “away” where things go when we discard them, and that we’re coaxed into believing in this magical place through greenwashing and recycling labels. It doesn’t exist. Very little if anything is carbon neutral or recycled. We bury it in the ground, burn it, or put it on cargo ships that dump it into the global south.