The big project. We need one. We've almost eradicated war, famine, and disease. But humans are never satisfied. What's next? (Relevance: don't focus on the current project so much that you're caught off guard by the next one.)
- Near-complete eradication of war, famine, and plague. (Check this.) Cyber war is a new risk; some weapons are left unused.
- Finding a key to human happiness
- Enhancing humans. Genetic engineering
(Immortality, bliss, and divinity)
Likely to be worked on by the end of the 21st century, though not necessarily fully achieved.
This prediction is a point of discussion. A prediction that cannot be changed is pointless; a prediction that changes our course of action is short-lived. Karl Marx was relevant, but his predictions failed to materialize precisely because people took him seriously. The point, then, is to spare humanity from forcing ourselves into a corner.
Studying history is useful because it shows us possibilities we wouldn't normally consider. We're all in bubbles shaped by circumstance, and we forget about other possibilities. The goal is not to predict the future but to free ourselves from the past and imagine alternative futures.
Future of the past. Humanist ideals. Old people. Pitiful state. Bringing ideals to a breaking point.
Animals. Humanism. Animal emotions. Agriculture-born religions. Super intelligence and lower life forms
Psychology once used steam-engine analogies in its jargon. Today that seems childish, so we compare the mind to computers instead. But that might be just as naive.
What makes humans dominant is not intelligence (we were just as intelligent, if not more so, 20,000 years ago), but our unique ability to organize ourselves flexibly.
We create fictions: intersubjective realities. Democracy, corporations, and money are all figments of our collective imagination.
Humanism and capitalism are religions. Capitalism brought stability thanks to its belief in growth, and the growth cannot stop: when threatened with ecological collapse, the response is more stuff.
A world devoid of meaning. Capitalism has so far been a success.
Humanism is the new religion. Human feelings are the source of authority in politics, aesthetics, etc., so it doesn't matter whether you're an atheist. In the Middle Ages, the question was: without God, how can you live?
Experiences × sensitivity: the humanist formula for knowledge, and for resolving ethical dilemmas.
Liberalism and its dilemmas, e.g. letting refugees in. Liberal nationalism. Socialism.
Why socialism failed and why traditional religions are not relevant
Humanism/liberalism might drive us to extinction.
Free will is an illusion. Transcranial stimulation experiments can manipulate our desires.
We're dividuals. Split-brain experiments; the narrating self. We invent stories to make sense of the past and to give meaning to our past suffering ("our boys didn't die in vain"). The more you sacrifice for a story, the more you believe in it, just so you don't have to face the truth that there's no meaning and you were stupid.
Threats on the horizon
Humans aren't going to be useful economically, militarily, or politically (automation, machine intelligence). What will people do? Algorithms could end up owning most of the planet. Creating new jobs isn't the problem; creating jobs humans are better at than machines is.
Mortal blow to liberalism. What’s so sacred about useless bums consumed by drugs and computer games?
(Humans are still useful as a collective but lose their individual authority.) We outsource critical decisions to computers. Today, computers recommend movies we'll like; Facebook can know us better than our friends or spouses with roughly 100 or 300 likes, respectively. Facebook can predict elections and find the exact voters who need to be persuaded to swing the vote. Medical info, DNA testing. All it takes is for the algorithm to know us better than we know ourselves, and that's not difficult given that most people don't know themselves very well. We give away information about ourselves, perhaps the most useful thing we have left, for free.
From oracles to agents to sovereigns.
3. Some humans will still be unique and indispensable, but they'll be a new elite of upgraded superhumans. Liberalism doesn't try to create the same experience for everyone, but it values everyone's experiences the same.
Achieving immortality, divinity, and bliss aims to surpass, not safeguard, normal conditions. It might therefore create a caste of rich superhumans. Making everyone equal might not make economic sense once humans aren't useful economically or militarily.
Techno-humanism: Homo sapiens has run its course, so we must create Homo deus. If relatively small changes in DNA and a re-wiring of the brain could launch the cognitive revolution, what reason is there to think it couldn't be done again? But playing with minds is dangerous. We've just invented a boat, and we're sailing without a map or a destination.
We don’t have a map of normal mental states. We’ve lost some abilities. Smell. Dream. Pay attention.
By combining the ability to change humans with our ignorance about mental states and the narrow interests of governments and corporations, we might upgrade brains while losing our minds: ultimately, a downgrade. Some governments might prefer it this way. The agricultural revolution downgraded animals because smart animals were trouble.
What if we develop an ability to change and modify our desires? How can humanism survive that?
If lives, minds, and emotions are just biochemical algorithms for survival and reproduction, we can think of everything as just data flows. Data processing algorithms.
The belief is that we humans are just the most efficient data-processing algorithms. The stock market and the economy are extremely complex data processors; no one can fully understand them, they are simply too complex. In this view, what makes a life valuable is the data it contributes to the vast network.
Record, upload, share.
What made capitalism win and communism fail wasn't that one was morally superior to the other; it was that capitalism is a more efficient data-processing network: distributed, not centralized. The amount of data that must be processed to allocate capital efficiently is too great for communism to work. Hence capitalism and democracy made sense.
But what if today we have so much information that governments are no longer capable of producing grand visions? Stagnant bureaucracies. Maybe that's a good thing: grand visions gave us communism and the Nazis, but different grand visions also defeated them. A vacuum of power never lasts long. Hence the new religion of Dataism: processing is what matters, so let's replace human processing with more efficient non-human processing.
In the past, power was access to information. Today, power is knowing what to focus on. (Censorship used to mean denying access to information; now it means flooding people with so much of it that they don't know what to focus on.)
Bigger problems in the short term. But in the long term, we have science converging on the dogma of dataism. Intelligence is decoupling from consciousness. Highly intelligent algorithms may soon know us better than we know ourselves.
Three big questions:
- Is life really just an algorithm, with everything reducible to data flows?
- What's more valuable: consciousness or intelligence?
- What will life look like for humans once unconscious but highly intelligent computers make us economically useless?