sol2070 reviewed More Everything Forever by Adam Becker
Awesome nonfiction on technology
5 stars
(In Portuguese: sol2070.in/2025/06/livro-more-everything-forever/)
More Everything Forever (2025, 384 pages), by Adam Becker, was one of the best nonfiction books on technology I have ever read. The subtitle: “AI overlords, space empires, and Silicon Valley's quest to control humanity's destiny.”
Given the lack of discussion and interest in the ideologies that drive the world, this work is essential. It deals with the ideas defended by tech billionaires about what really matters today, to the detriment of everything else. In their view, these are:
- Achieving AGI, artificial general intelligence. The addition of “general” implies an AI capable of doing anything a person does, including self-improvement.
- Preventing such an AGI from turning against humanity.
- Achieving the technological singularity, the point at which inconceivable advances would be possible with the help of AGI.
- Uploading copies of the mind into a virtual world beyond death.
- Colonizing all of space and converting it into a cosmic computer, where transhumans would live forever (including resurrected copies of the dead).
It is no coincidence that in this science fiction scenario — utopian or dystopian, depending on your bank account — those who have all the power and money are the owners of corporations.
Sam Altman, of OpenAI (ChatGPT), even proposed that all money be replaced by shares in his company (obviously, with only a minority portion divided among ordinary people). His justification is the democratization of advances in AI and alignment with the interests of society. He just didn't mention that this would perpetuate the corporation as the most powerful entity in the world and, in practice, transform it into a techno-emperor.
Much of the book focuses on the ideas of the Effective Altruism, Longtermism, and rationalist movements.
Longtermism
Longtermism is about aiming for hypothetical, immense benefits in the very distant future while disregarding current problems. A central argument is that a hypothetical future in which quadrillions of posthumans live in paradise, even with only a 0.01% chance of becoming reality, would be morally preferable to ensuring the well-being of everyone alive today. It sounds completely absurd, but the logic is that multiplying the good of quadrillions of future beings by a 0.01% chance still yields a number far higher than the good of everyone alive today multiplied by a chance of success of, say, 50% or even 100%.
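To see why the arithmetic “works,” here is a rough sketch with round, illustrative numbers of my own (not figures from the book):

10^15 hypothetical future beings × 0.0001 (a 0.01% chance) = 10^11 expected “units of well-being”
8 × 10^9 people alive today × 1.0 (certainty) = 8 × 10^9

The first figure is more than ten times the second, so on this expected-value accounting the speculative future “outweighs” everyone currently alive.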
The flaw in the argument is that human well-being, especially in an unlikely future, cannot be converted into a number. But since treating people like numbers is a specialty of the current system, billionaires love this kind of idea, which, crucially, also underpins their attitudes. If it weren't for them, movements like Longtermism would remain restricted to eccentric fantasies. With millions of dollars directed to their organizations, this ideology is already influencing even governments, in addition to the general public.
Although tech billionaires like Musk and Altman profess these beliefs, there is doubt as to whether they really believe in them or just use such ideas to apply a philosophical veneer to their insatiable greed and thirst for power.
As the author holds a doctorate in astrophysics and works as a science journalist, the book presents rich and profound refutations of the pseudoscience and cult-like fanaticism behind these ideas, without compromising its fluidity. It is also a work of reportage, presenting the positions of the various experts Becker interviewed, as well as of representatives of the movements.
Singularity
However, the pace is somewhat compromised by the long and detailed analyses of space colonization and of the ideas of Ray Kurzweil, the prophet of the singularity. But this is justified: the claims come so heavily loaded that the refutation could hardly be any shorter.
For example, Kurzweil imagines space nanobots capable of assembling Dyson spheres (structures built around a star to capture a gigantic amount of its energy) before spreading out into the universe, multiplying and converting all matter into an immeasurable digital network that would house and process a virtual reality “better than reality,” where quadrillions of beings would live forever. Although he does not mention it, this conversion implies the destruction of everything that exists!
Not only is it something worthy of a Marvel movie villain; in Becker's words, it is also “just the latest entry in the annals of the oldest human fantasy”: living forever.
More Everything Forever is a treasure not only for those who follow technology and its directions, but also for those who love science fiction.
The section on how tech billionaires seem to misunderstand science fiction stories — seeing dystopian warnings as roadmaps to follow — would be amusing if it weren't so tragic.
Mental bug
The author raises the possibility that the predominance of science and math degrees among big tech leaders (accompanied, of course, by plenty of narrow-mindedness) may have something to do with this misunderstanding. It is as if there were an invisible bug in how their minds operate. Here are some excerpts on this topic:
This homogeneous intellectual background focused on STEM [science, technology, engineering, and mathematics] in the culture of tech startups and, more generally, in Silicon Valley, generates a denial of the humanities, a systematic and sometimes deliberate ignorance of the arts and humanities.
Nowhere is the denial of the humanities more evident in the tech industry than in its attitude toward history. “I don't even know why we study history,” says Anthony Levandowski, co-founder of Google's self-driving car division, now known as Waymo. “It's fun, I guess — dinosaurs, Neanderthals, the Industrial Revolution, and stuff like that. But what has already happened doesn't really matter. You don't need to know that history to build on what they did. In technology, all that matters is tomorrow.”
The denial of the humanities is enabled—and enables—another endemic affliction in the tech industry: “engineer disease,” the belief that specializing in one area (usually STEM) makes you an expert in all others as well. (…) Or, put another way, there is only one thing that is really difficult, and you [the engineer] already know what it is, so everything else must be easy.
People who excel at software design become convinced that they have a unique ability to understand any type of system, from the ground up, without prior training, thanks to their superior analytical powers. Success in the artificially constructed world of software design promotes dangerous confidence.
For example, Elon Musk, in his cosmic hubris, completely ignored the celebrated image of the “pale blue dot,” Earth, which astronomer Carl Sagan requested and then evoked: “There is nowhere else, at least in the near future, to which our species could migrate.” An interviewer asked the entrepreneur to read the longer passage, and when Musk reached that sentence, he complained: “That's not true, that's false. Mars!”
Solution?
In the final chapter, Becker advocates a solution: billionaire fortunes should not be allowed. There is an urgent need for taxation that prevents the accumulation of more than, say, $500 million — which would still allow for immense wealth. Economist Paul Krugman is quoted as saying that something like this has actually already been done. In 1953 in the US, annual incomes above $300,000 were taxed at 92%.
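For a sense of scale (my own illustrative numbers, and assuming the 92% applied as a top marginal rate, the way US income-tax brackets do): someone earning $1,000,000 in 1953 would have paid the 92% only on the $700,000 above the $300,000 threshold, keeping just $56,000 of that portion; every additional dollar earned beyond the threshold was worth eight cents. A rate like that still permits large incomes while making the accumulation of billionaire-scale fortunes from income practically impossible, which is roughly the effect Becker's proposed cap aims for.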
Personally, I imagine that such regulation — in the very unlikely event that laws were passed — would be short-lived. Very wealthy people always end up hijacking politics simply because their fortunes allow them to, just as they did to overturn this 1950s law and many other regulations on corporate power.
Something much more radical would be necessary.