
Where humans exploit artificial assistants


A new LMU study shows: In Japan, people treat robots and AI helpers with more respect than in the West.

Munich/Germany, March 26, 2025. Imagine an automated delivery vehicle delivering groceries while you are running late for a long-awaited dinner with your friends. You both arrive at a busy junction at the same time. Do you slow down to make way for the self-driving delivery van? Or do you expect it to stop and let you pass, even if it actually has the right of way?

‘As autonomous driving technology becomes more and more of a reality, everyday encounters like this will determine how we share the world with intelligent machines in the future,’ says Dr Jurgis Karpus from the Chair of Philosophy of Mind at LMU. The introduction of fully automated cars signals a shift from merely using intelligent machines – such as Google Translate or ChatGPT – to actively interacting with them. The key difference: in heavy traffic, our interests do not always align with those of the self-driving cars we encounter. We have to interact with them even when we are not using them ourselves.


In a study recently published in the journal Scientific Reports, researchers from LMU and Waseda University in Tokyo found that humans are much more likely to exploit cooperative artificial intelligence than similarly cooperative fellow humans. ‘After all, if you cut a robot off in traffic, you don’t hurt its feelings,’ says Karpus, lead author of the study. The team used methods from classical behavioural economics: in various game-theory experiments, Japanese and US test subjects were given a choice between cheating their opponent and behaving cooperatively. The result: if the opponent was a machine rather than a human, the test subjects were much more likely to act selfishly.
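The incentive structure behind such experiments can be illustrated with a one-shot game of the Prisoner's Dilemma type. The payoff values below are purely hypothetical for illustration; the article does not specify which games or payoffs the study actually used.

```python
# Illustrative sketch of a one-shot cooperate-or-cheat game.
# Payoff values are hypothetical, not taken from the study.

PAYOFFS = {  # (my choice, opponent's choice) -> my payoff
    ("cooperate", "cooperate"): 3,
    ("cooperate", "cheat"): 0,
    ("cheat", "cooperate"): 5,
    ("cheat", "cheat"): 1,
}

def payoff(mine: str, theirs: str) -> int:
    """Return the payoff for my choice given the opponent's choice."""
    return PAYOFFS[(mine, theirs)]

# Against an opponent trained to cooperate, cheating yields the higher
# individual payoff -- which is precisely the temptation the study probes.
print(payoff("cheat", "cooperate"))      # 5
print(payoff("cooperate", "cooperate"))  # 3
```

With these payoffs, cheating a reliably cooperative opponent is individually profitable; the study's finding is that whether people resist that temptation depends on who, or what, the opponent is.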

However, as the results of the study show, our tendency to exploit a machine trained to behave cooperatively is not universal: people in the USA and Europe exploit robots significantly more often than people in Japan. The research team suspects that this difference comes down to guilt: in the West, people feel remorse when they betray another person, but not when they betray a machine. In Japan, by contrast, people feel equally guilty regardless of whether they mistreat a person or a well-meaning robot.

These cultural differences could influence the future of automation. ‘If people in Japan treat robots with the same respect as humans, fully autonomous taxis could be in use in Tokyo long before Berlin, London or New York,’ surmises Jurgis Karpus.


Original publication:

Karpus, J., Shirai, R., Verba, J.T. et al. Human cooperation with artificial agents varies across countries. Scientific Reports 15, 10000 (2025).
(https://doi.org/10.1038/s41598-025-92977-8)


