I, Robot, Empathise
Robots may soon be able to “feel”, if scientists at Columbia University in New York City can build on their current research.
In order for automatons to interact with humans on a social level, they must first be able to empathise with others – a capacity psychologists call “theory of mind”, in which the brain analyses what it observes in order to predict the future actions of other people.
While artificial intelligence with such capacity is still in the realm of science fiction, the team at Columbia have successfully created a machine that has “visual theory of behaviour” – in other words, it visually predicted how another machine would behave.
The team programmed one robot to move towards green circles in a small controlled environment. If its view of a green circle was blocked by an obstacle, the robot would either stay still or make its way to a different circle.
Another robot observed the first’s behaviour for several hours, then began predicting its future movements. By the end, the second robot could determine the first’s goal and path with considerable accuracy.
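The observe-then-predict loop described above can be sketched as a toy simulation. This is purely illustrative and is not the Columbia team’s actual method or code: here an “actor” simply heads for the nearest unblocked circle, while an “observer” learns to anticipate the actor’s choice by counting the scene/action pairs it has watched.

```python
# Toy sketch (an assumption, not the study's implementation): an actor with a
# fixed goal-seeking policy, and an observer that predicts the actor's choice
# purely from previously watched scene/action pairs.
from collections import Counter, defaultdict
import random

def actor_choice(circles, blocked):
    """Actor policy: head for the nearest circle not blocked; if all are
    blocked, stay put (return None)."""
    open_circles = [c for c in circles if c not in blocked]
    if not open_circles:
        return None              # every goal is blocked: the actor stays still
    return min(open_circles)     # "nearest" = smallest distance value

class Observer:
    """Learns a scene -> action lookup table by counting observed outcomes,
    then predicts the most frequently seen action for a scene."""
    def __init__(self):
        self.memory = defaultdict(Counter)

    def watch(self, scene, action):
        self.memory[scene][action] += 1

    def predict(self, scene):
        if scene not in self.memory:
            return None          # never seen this scene: no prediction
        return self.memory[scene].most_common(1)[0][0]

# Training phase: the observer watches the actor over many random trials.
random.seed(0)
obs = Observer()
for _ in range(200):
    circles = tuple(sorted(random.sample(range(10), 3)))
    blocked = frozenset(random.sample(circles, random.randint(0, 3)))
    obs.watch((circles, blocked), actor_choice(circles, blocked))
```

In the real study the observer worked from raw camera images rather than a symbolic scene description, which is what makes the result a *visual* theory of behaviour; the counting table above stands in for that learned visual model.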
Boyuan Chen, co-lead on the study, said: “Our findings begin to demonstrate how robots can see the world from another robot’s perspective. The ability of the observer to put itself in its partner’s shoes, so to speak, and understand, without being guided … is perhaps a primitive form of empathy.”