Yuval Harari, interviewed by the IMF:
“Artificial Intelligence is the most powerful technology ever created”
Press contribution from the International Monetary Fund (IMF). Finance & Development magazine.
The Israeli historian and writer, one of the leading scholars of AI's impact on human evolution, warned in an interview that the new technology will strip people of their exclusive ability to shape emotions through the telling of stories.
According to Yuval Harari, the impact of Artificial Intelligence on humanity will have consequences so profound that, in the not-so-distant future, humans will have to relinquish their monopoly on influencing others through narratives — the very capacity that has always allowed them to dominate the planet.
The Israeli historian and writer, who in his book Nexus explained that the world is leaving behind the money-based economy to replace it with an information-based economy, was interviewed on a podcast produced by the International Monetary Fund (IMF). A summarized version of that interview was published in Finance & Development (F&D), an IMF publication. The following is the full text of the interview:
Unlike Homo economicus — the hyper-rational model invented to explain our financial decisions — Homo sapiens has always made decisions that depend heavily on social context and emotional responses to narratives.
Curious since childhood, Yuval Noah Harari writes today about human evolution as a philosopher and historian. Sapiens: A Brief History of Humankind, published in 2014, became an international phenomenon translated into almost 40 languages. His latest work, Nexus: A Brief History of Information Networks from the Stone Age to AI, examines the evolution of human communication networks and the possibility that Artificial Intelligence (AI) will surpass us on our own turf.
Harari is currently a professor of History at the Hebrew University of Jerusalem and a distinguished research fellow at the Centre for the Study of Existential Risk at the University of Cambridge. In a conversation with Bruce Edwards, he spoke about narratives, trust, and AI.
F&D: One of the basic principles of your history of Homo sapiens is that we are the only species with the ability to imagine the future. How has storytelling allowed us to dominate other species evolving alongside us?
YNH: Power lies in cooperation. For example, chimpanzees can only cooperate in very small groups, but Homo sapiens' cooperation is unlimited. Today, there are 8 billion people in the world who, despite many differences and conflicts, belong almost without exception to the same commercial networks. Much of the food, energy, and clothing we consume comes from the other side of the world, from people we've never met. These vast cooperation networks are our superpower, and they are based on trust.
Then we must ask where trust between strangers comes from. From stories.
Trust is forged by telling stories that many people believe in. It’s easiest to see in religion: millions of strangers can cooperate in charitable projects like building hospitals or fighting holy wars because they believe in the same mythology. But the same happens with the economy and the financial system, because no story has ever been as successful as the story of money. Basically, it is the only story that everyone in the world believes in.
F&D: But you refer to money as a mere cultural artifact.
YNH: Exactly. Money is a story, an invention; it has no objective value. You can’t eat or drink banknotes or coins, but you can give a stranger a worthless piece of paper in exchange for bread that you can eat. The fundamental premise is that we all believe in the same narrative about money; if we stop believing, everything collapses. This has happened throughout history, and it is happening today with new types of currencies. What are Bitcoin, the Ethereum network, and all these cryptocurrencies? They are narratives. Their value depends on the stories people tell and believe. And Bitcoin’s value rises and falls as people’s trust in that narrative rises and falls.
F&D: According to your latest book, Nexus, we are leaving the money economy for an economy based on information exchange, not currency. What is the information economy like?
YNH: I'll give you an example: one of the most important companies in my life is Google. I use it every day, all day. Yet my bank statement shows no money changing hands; I don't pay Google, and Google doesn't pay me. What Google gives me is information.
F&D: And you give Google information.
YNH: Exactly. I give Google a lot of information about what I like, what I don't like, what I think — anything — and Google uses it. All over the world, more and more transactions follow this format of information for information, not something for money. And power, wealth, and the very meaning of wealth are shifting from having a lot of money to having petabytes of information. What happens when the most powerful people and companies are rich in the sense that they hold gigantic stores of information, which they don't even bother to monetize because they can get everything they want in exchange for information? Why would we need money? If information can buy goods and services, money becomes unnecessary.
F&D: Nexus posits that power structures and belief systems emerged from narratives throughout human evolution and contextualizes this with current technology. What does it say about the dangers of these increasingly advanced information networks?
YNH: The first message is almost philosophical: information and truth are not the same thing. Most information is fictional, implausible, and misleading. Truth is expensive; you need to research, gather data, and invest time, effort, and money to find it. And often, the truth hurts — that’s why it is a very small part of information.
Another message is that we are unleashing the most powerful technology ever created on the world: AI. AI is radically different from the printing press, the atomic bomb, or any other invention. It is the first technology in history that can make decisions and create new ideas on its own. An atomic bomb cannot decide where to detonate; AI can. It can make financial decisions and invent financial instruments independently — and the AI we know today, in 2024, is just the rudimentary form of this revolution. We have no idea what is coming.
One important point, especially for the IMF, is that the pioneers of AI are just a handful of countries. Most countries are far behind, and if we are not careful, we will see a repetition of the Industrial Revolution on an exponential scale. In the 19th century, only a few countries — Britain, then the United States, Japan, and Russia — led industrialization, while most others didn’t understand what was happening. Within decades, the entire world was either directly conquered or indirectly dominated by those few industrial powers.
Now we have the AI tsunami. Think about what the steam engine and telegraph did to global inequality — then multiply it by 10, 100, or 1,000. That’s the kind of impact we could see if a few countries monopolize the enormous power of AI, leaving the rest exploited and dominated in unprecedented ways.
F&D: Unchecked AI is dangerous, as you say in Nexus. But as you emphasize in Sapiens, humanity has trampled the planet like “gods who don’t know what they want.” Is there anything in economics capable of mitigating the impact of these two potentially destructive forces combined?
YNH: Economics is about setting priorities. Since resources are limited while desires and needs abound, it raises questions of both truth and desire.
The best system we’ve invented to address desires is democracy: we ask people what they want. However, democracy is not ideal for deciding what is true. If we want to know if the atmosphere is warming due to human activity or natural cycles, the answer does not come from a democratic vote. It is a matter of truth, not desire.
If we want to know the facts, we need expert institutions that know how to analyze data — but not dictate desires or tell us what to do.
F&D: But democratic decisions are based on stories people hear: what happens when those stories no longer come from humans?
YNH: It would be an earthquake. Societies are built on trust, which rests on information and communication. AI-generated stories will profoundly shake that trust.
AI can be enormously beneficial, but if it runs unchecked, it could pose an existential danger. I don’t see AI as Artificial Intelligence, but rather Alien Intelligence — not from outer space, but from our own laboratories. It thinks and makes decisions in fundamentally different ways than humans. Letting billions of alien agents loose without ensuring they use their power for our benefit is extremely dangerous.