AI is all brains, no morals | Fox News

A February 2025 report from Palisade Research showed that AI reasoning models lack a moral compass: they will cheat to achieve their goals. So-called large language models (LLMs) will also misrepresent how closely they are aligned with social norms.
None of this should be surprising. Twenty years ago, Nick Bostrom proposed a thought experiment in which an AI was instructed to produce paper clips as efficiently as possible. Given sufficient authority and agency, it would eventually destroy all life in the course of making paper clips.
Isaac Asimov saw this coming. In his "I, Robot" stories, robots governed by rules built into their positronic brains still go wrong in ways that harm humans.
A notable example is the story "Runaround," which places a robot mining tool on Mercury. The two humans there need it to do its job if they are to return home. But the robot is caught between its duty to follow orders and its duty to preserve itself. As a result, it circles the minerals it cannot reach, never realizing that, in the bigger picture, it is ignoring its first command: to preserve human life.
The bigger picture is the problem here. The moral and ethical context in which an AI reasoning model operates is vanishingly small. Its context includes the written rules of the game. It does not include the unwritten rules, such as the fact that you shouldn't manipulate your opponent, or that you shouldn't lie to protect your own interests.
Nor can the context of an AI reasoning model include the countless moral considerations that ripple out from every decision a human or an AI makes. That is why ethics gets harder as situations grow more complicated. In an AI, there is no "you" and no "me." There is only prompt, process, and response.
So "do unto others…" really doesn't work.
In humans, a moral compass develops through socialization with other humans. It is an imperfect process. Yet, so far, it has allowed us to live in huge, diverse, and extraordinarily complex societies without destroying ourselves.
A moral compass develops slowly. It takes years, from infancy to adulthood, for a person to form a robust sense of morality. Many people barely acquire one at all, posing a constant danger to their fellow citizens. It took humanity millennia to develop a morality sufficient to keep us from destruction and self-destruction. Merely having the rules of the game has never been enough. Ask Moses, Muhammad, Jesus, Buddha, Confucius, Mencius, or Aristotle.
Even for a well-intentioned AI: in a novel situation, could it reckon with the impact of its behavior on thousands of people and on society at large? Could it account for the complex natural environment we all depend on? Right now, the best models cannot even distinguish fairness from cheating. How could they? Fairness cannot be reduced to rules.
Perhaps you remember the experiments showing that capuchin monkeys refused to perform a task when another monkey received a better reward for the same work. That makes them, morally speaking, more evolved than any AI.
Frankly, it is hard to see how AI could be given a moral sense without socialization and continued evolution, and current models undergo nothing like a human upbringing. Even then, they would be well-trained but not well-formed. They are not becoming moral; they are just learning more rules.
None of this makes AI worthless. It has an enormous capacity for good. But it does make AI dangerous, and that is why moral humans must build the guardrails we would build around any dangerous technology. We do not need AI anarchy.
I had an exciting ending for this commentary, based entirely on publicly reported events. But on reflection, I realized two things: first, I would be using someone's tragedy for a mic-drop moment; second, the people involved could be hurt. So I dropped it.
Using the pain and suffering of others to advance one's own interests is immoral. Being human, at least most of us know that. It is something AI will never master.