Fermi Filters

Anthony Repetto
6 min read · Mar 29, 2018

~why the stars are silent~

There was only a brief and fortuitous period between cheap guns and cheap surveillance when democracy could rise out from under monarchy and feudalism, here on Earth. Our first attempt at independence was aided significantly by a vast sea separating king from colony. It is not clear that other worlds would have such distinct land masses dividing peoples to favor democracy and liberty, or that democracy would make significant gains before surveillance technologies cemented every state's leadership. Perhaps, on many worlds much like ours, monarchy is the norm. Would that explain why we see no other civilizations bursting forth from their parent stars? Where did our neighbors go?

This is the Fermi paradox: among the billions of stars in our galaxy, among hundreds of billions of galaxies, and over a span of billions of years, we see no telltale signs of alien civilizations. Given so many opportunities for life, civilization, and technology, we would expect hundreds or thousands of alien races to surround us. Presuming that our technology and economy continue to advance apace, we will be harvesting almost all the energy of our sun in three thousand years or less! Over billions of years of existence, surely someone else must have reached the technology we will eventually attain. We would see them swallowing stars. We don't see that.
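The "three thousand years" figure follows from simple compound growth. As a rough sketch (the starting power figure of ~18 TW and the Sun's total output are my own assumed numbers, not from the text):

```python
import math

# Rough check of the "three thousand years or less" claim.
# Assumed figures: current world power use ~2e13 W (about 18 TW),
# total solar output ~3.8e26 W.
CURRENT_POWER_W = 2e13
SOLAR_OUTPUT_W = 3.8e26

def years_to_match_sun(annual_growth: float) -> float:
    """Years of compound growth until energy use equals the Sun's total output."""
    return math.log(SOLAR_OUTPUT_W / CURRENT_POWER_W) / math.log(1 + annual_growth)

# At a modest 1% annual growth, the gap closes in roughly 3,000 years;
# at 2%, in only about 1,500.
print(round(years_to_match_sun(0.01)))  # ~3073
print(round(years_to_match_sun(0.02)))  # ~1544
```

The point is that even very slow exponential growth reaches star-swallowing scale on a timescale that is an eyeblink compared to the age of the galaxy.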

So, theorists propose, there must be Great Filters: conditions that prevent civilizations from forming, from advancing, or from avoiding self-destruction. Nuclear weapons are definitely a Great Filter. If a civilization develops nukes, an arms race may tempt it into mutual annihilation. Nuclear bombs are provocative; they incite reactions that escalate tension and risk. Perhaps most advanced civilizations detonate, which would explain the silence of the stars.

A Thousand Deaths

I am of the belief that there are simply many Great Filters, and we haven't made it past some of the worst ones. If we want a chance at star-scale society, we must be wise about these crises: environmental destruction, inequality and injustice, longevity, and warfare. I do not include artificial intelligence as a Great Filter, despite frequent fear-mongering, because intelligent machines are best used for relatively narrow sets of tasks.

We only need machine intelligences that each know a few tasks better than humans do; we have no incentive to design a single machine capable of the full range of human thought and action. The incredible difficulty and ongoing cost of operation make human-scale artificial intelligence unlikely among all the civilizations in the cosmos. Nuclear war is much more likely. Narrow machine intelligences will, however, take over the tasks of surveillance and deployment of forces; AI will make it easy for despots to seize and patrol large areas. That is AI's real risk.

Warfare grows in ferocity with each wave of technology. Empowerment, even with the best intentions, is turned into a tool for oppression and destruction. Humanity suffers from this shortsightedness, and it will not be cured by more technology. Other peoples, on other worlds, may have succumbed to nukes, or bio-terrorism, or despotism. Continued warfare thins the pool of successful civilizations: the most warlike may have a 50% chance of surviving each additional 100 years, while peaceful races face those same 50% odds only every 1,000 years. As a result, most warlike peoples will be younger, with lower technology, than those less vicious. Peace preserves itself. War devours itself.

Minimal Empathy

I suspect that the growth of altruism and empathy occurs too slowly in most intelligent organisms as they evolve toward larger societies. Once we develop a little empathy, it glues us together into a cohesive whole. Not much empathy is necessary for a web of bonds to knit together large populations; this was popularized through Stanley Milgram's small-world experiments as "six degrees of separation." We developed only the minimal level of empathy needed to knit together. That empathy-glue allowed collaboration and skill specialization, with rapid technological shifts, resulting in an increased capacity for war.

If our earliest ancestors had kept fewer tools, and shared more, we might have entered the age of agriculture with enough compassion to avoid conflict. Did Prometheus give us fire too soon? This, it seems, is a Great Filter: altruism involves greater developmental complexity than the crafting of simple weapons. The first growth of compassion knit us into opposing teams that could more easily destroy themselves, outpacing that compassion.

This seems to be the reason behind the other Great Filters mentioned: environment, injustice, longevity. Pollution and the destruction of biomes are well-documented risks to civilization, and they loom over us because of our lack of a coherent response. We have the tools, yet we don't work together to use them. Injustice persists because we march behind sociopaths instead of exiling them. Our empathy extended far enough to encompass our clan, or our religion, or our country, but no further. Altruism hadn't the time to mend our schisms before technology made our differences more deadly.

Currency of Concern

While our own dominance over the Earth may end soon (on the galactic timescale), these Great Filters suggest the kind of being which might survive: one that avoids the risks of resource depletion and habitat loss, and focuses research on the public good. If some Earth-like world evolved such people, they may inherit the cosmos. All our monarchies and despots would be dead branches on the tree of life, while the living bough is some better kind of creature.

Look back to our early economy, before currency, before even barter. You are sitting among the village elders and children, grinding grain, when another member of your tribe bursts from the trees, shouting and pointing. You all rush to gather the children and key supplies, knowing from his tone and a few syllables that a forest fire is coming.

Capitalism requires payment for labor. This shouting relative, however, recruited the whole tribe into action without any payment for their efforts. Each member of the tribe relies upon the division of labor within the tribe; everyone is at a loss when anyone is lost. So, we cared for each other. That achieves a better result than capitalism: it is a holistic evaluation of circumstances.

In parallel, how much is that shouting tribesman rewarded for ensuring others' safety by warning them? A capitalist would bargain for the highest price the others are willing to pay. That's how markets set prices, after all, so it's "fair." And, with that large reward, the shouter commands greater demand for goods, greater power. Yet that price might impoverish other tribe members. So a tribe's holistic evaluation would be to praise and give attention to the relative who rushed to warn everyone, but the shouter would not be given special power over others, or be allowed to take more than their share of food.

When one of us is frightened or in pain, we reach out to them, and we feel their tension and their wound. That was our earliest currency. It paid for goods and services. It orchestrated hundreds of people toward common goals. Consider a civilization that dodged monarchy and capitalism: each person's efforts flow along a network of empathy, so necessary resources move to the places that need them. A civilization that maintains compassion as currency would react with all its effort in response to a threat, as soon as it identified one. (If our civilization were that responsive, that responsible, we would have made a serious effort to avert climate change starting in the 1970s, for example.) When a civilization moves quickly and coherently to thwart global risks, it is more likely to survive. Empathy brings longevity.

A Second Curse

Yet that longevity is a temptation that may cripple many civilizations. If members of a society live long, they are more likely to respond to long-range threats; that seems to favor long-lived aliens. However, if longevity were achieved unequally, a society could plunge into a race among despots to consolidate power, each knowing that they will reign forever.

A more serious risk from longevity technology is the possibility of being tortured for longer periods of time. If a mobster can threaten to torture deserters forever, none will squeal merely to avoid prison time. Despots benefit from longevity's ability to amplify threats: an unending, living hell. Eventually, the cost of maintaining all those torture-chambers encumbers a civilization, leaving its people fewer resources to deal with external threats (asteroids, for example). These mobster-torture-oligarchy civilizations are fragile, and they implode after a disaster or power vacuum. These are risks yet to come, and I estimate that they are Great Filters which destroyed many thousands of civilizations before us. We may even find the radioactive, barren worlds where they fought to be more numerous than planets with fungi.

These are the Dangers Ahead. The odds are against us. How long before someone who cares only for themselves, or only for their team, unleashes a virus that topples us? The risk hasn't lessened, but we waltz on, oblivious. A more firmly empathetic culture might have responded faster and done more to fix problems, instead of plastering over risks or covering its behind. If any one of them suffered, all would feel the echo. When the universe births a people like that, I bet they will grow vast and live long and well. Us?
