Artificial Intelligence XIV: A Philosophical Vision


June 17, 2023


A philosophical vision of artificial intelligence: "Weak democracies, capitalism, and artificial intelligence are a dangerous combination"

Mark Coeckelbergh: “Weak democracies, capitalism, and artificial intelligence are a dangerous combination.” The philosopher points out that institutions need to rely on experts to regulate technology, but without forgetting the citizens.

Mark Coeckelbergh has captured the attention of an audience unaccustomed to philosophical debates: engineering students filled a room to listen to this expert in technology ethics, invited by the Robotics and Industrial Informatics Institute of the Universitat Politècnica de Catalunya. Coeckelbergh, a prolific author (two of his books are published in Spanish by Cátedra: Ethics of Artificial Intelligence, 2021, and Political Philosophy of Artificial Intelligence, 2023), knows how important it is to build bridges between those who develop technologies and those who must think about how to use them.

Question: Do you think that students, engineers, and major tech companies take the ethical aspects of artificial intelligence (AI) into account?

Answer: People are aware that this technology will affect our lives because it’s everywhere, but at the same time, we are confused because the changes are very fast and complex. That’s why I think it’s important that education and research try to find an interdisciplinary path between philosophy, programming, and robotics to address these ethical issues.

Question: And what about politics?

Answer: Yes, we need to create more links between experts and politicians, but technical opinions should not be the only ones that matter. We need to figure out how to organize our democracy so that it takes the experts' vision into account while still letting us make the decisions ourselves. Tech companies are gaining more and more power, and this is a problem because the sovereignty of nations and cities is diminishing. How much of our technological future should be left in the hands of private initiative, and how much should be public and controlled by democracies?

Question: Is artificial intelligence a threat to democracy, or are democracies already weakened?

Answer: Democracy is already vulnerable because we don't really have complete democracies. It's like when Gandhi was asked what he thought of Western civilization and he said it would be a good idea. The same goes for democracy: it's a good idea, but we don't have it fully. For me, voting and majority rule are not enough: that model is too vulnerable to populism, not participatory enough, and it doesn't take citizens seriously. There is a lack of education and knowledge to achieve real democracy, and the same is true for technology. People have to understand that technology is also political, and we need to ask ourselves whether it's good for democracy that communication infrastructures like Twitter are in private hands.

We use technology uncritically, and while a few benefit, the rest of us are exploited for our data.

Question: In what way does artificial intelligence threaten democracy?

Answer: We deal with technology without thinking; we use it uncritically, but it shapes us and uses us as instruments for power, control, and the exploitation of our data. And while a few benefit, the rest of us are exploited for our data. This affects democracies: because they are not very resilient, technology polarizes political trends even further. This combination of weak democracies, capitalism, and artificial intelligence is dangerous. But I do believe AI can be used in a more constructive way, to improve life for everyone and not just a few.

Question: Some see artificial intelligence as a way to work less and have more freedom, while others see it as a threat to their jobs.

Answer: I think AI right now empowers those who already have a privileged position or a good education: they can use it to start a company, for example. But there will be changes in employment and some transformation of the economy, so we need to be prepared. As for the argument that technology makes things easier: until now it has led to precarious jobs, like driving for Uber, and to jobs that may be good but are stressful. Email, for example, arrived as a solution, and now we are all slaves to it.

Question: So, the problem is not so much the technology but the system.

Answer: It’s a combination of both things, but indeed, these new technological possibilities force us to question the system more than ever. Today, the political conflict is played out in the realm of technology.

Question: What impact does it have on the media?

Answer: In this environment, the problem isn't that people believe a lie, but that they don't know what is a lie and what is the truth. Quality journalism is very important to provide context and to try to understand the world. I think it can help people gain more knowledge, even if artificial intelligence is used for some tasks in the profession. Philosophers, journalists, educators: we have to provide the tools to interpret the world, because when knowledge is lacking and confusion reigns, it's easier for a leader to come along with a simple, populist solution, as has already happened in some countries in Europe.

Question: Can technology make governments more technocratic?

Answer: Politicians are confused; they feel pressure from lobbies and create regulatory frameworks, but at no point have citizens had a say. States are becoming more and more bureaucratic because they hand power to those who control artificial intelligence. So, who is responsible? This kind of system, as Hannah Arendt said, leads to horrors. We must fight against it with regulations that let us see why algorithms make the decisions they do and that let us know who is responsible.


Author: Laboratory of the Future analysis team. Article by Josep Cata Figuls.

