Kansei Robot


In recent years, robots have become an everyday reality. These include humanoid robots that walk on two legs, guidance robots that look just like human beings and pet robots that act like their animal counterparts.

The level of contact between robots and people is gradually increasing. For instance, AIBO, a typical pet robot developed by Sony, was at one time widely discussed as a potential domestic robot.


Why is Kansei necessary for robots?

So, what are the main problems that have prevented partner robots from becoming popular? One of our answers is that robot intelligence is still too mechanical or simplistic, which means that users quickly tire of interacting with the robot. Essentially, users feel little sympathy for a robot whose actions are simple, routine or uncomplicated.

It is our opinion that Kansei will play an important role in solving this problem. Although people often tire of pet robots quickly, some reports indicate that expenditure on pets is increasing in Japan, suggesting that people remain interested in keeping real cats or dogs.

So, what is the difference between pets and pet robots?

Of course, one major difference is their appearance. But if we focus for instance on a phenomenon like “getting tired,” a more important difference may be that robots do not behave like living creatures. Because people may have more sympathy towards robots with Kansei, we are trying to develop an artificial Kansei model for a partner robot.

Robot with emotions

To develop such an artificial Kansei model, our research team has focused on the keyword “emotion” as an important element that humans are born with. Emotion also plays a very important role in communication between humans, which is why any robot that can get along with humans and understand their sensibility will require a function called “emotion.”

The presence or absence of emotion defines a fundamental difference between people and robots. To become deeply familiar to humans (and familiar with them), robots must therefore be provided with such a function.

Do you think robots will grow?

The emotion-generating model is a system for determining the emotion a robot will generate when stimulated. While many researchers around the world have studied such emotion-generating models, most have focused on how robots equipped with a large number of complex emotions select those appropriate to a given stimulus.

In our laboratory, on the other hand, we are building an emotion-generating model that focuses on a new point in the “development” of creatures. Specifically, we are aiming to develop a robot that generates emotion in the same way as living creatures by giving the robot functions that allow emotions to develop gradually, as in the process of human growth.
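The idea of emotions that develop gradually can be sketched in code. The following is an illustrative toy model, not the laboratory's published one: all class names, stage lists, and thresholds are assumptions. The robot starts with only basic emotions and "unlocks" more complex ones as it accumulates experience, mimicking the process of human growth.

```python
import random

class GrowingEmotionModel:
    """Toy sketch of an emotion-generating model with a growth function:
    the repertoire of expressible emotions expands with experience."""

    # emotion stages ordered roughly from basic to complex (illustrative)
    STAGES = [
        ["pleasure", "displeasure"],           # infant-like stage
        ["joy", "anger", "fear"],              # intermediate stage
        ["surprise", "sadness", "affection"],  # mature stage
    ]

    def __init__(self, stage_threshold=10):
        self.experience = 0                    # number of stimuli received
        self.stage_threshold = stage_threshold # experience needed per stage

    def available_emotions(self):
        # higher stages become available as experience accumulates
        stage = min(self.experience // self.stage_threshold, len(self.STAGES) - 1)
        emotions = []
        for s in self.STAGES[: stage + 1]:
            emotions.extend(s)
        return emotions

    def react(self, stimulus_valence):
        """Select an emotion appropriate to the stimulus from those
        developed so far. stimulus_valence: positive for pleasant
        stimuli, negative for unpleasant ones."""
        self.experience += 1
        pool = self.available_emotions()
        positive = {"pleasure", "joy", "surprise", "affection"}
        candidates = [e for e in pool
                      if (e in positive) == (stimulus_valence >= 0)]
        return random.choice(candidates)
```

A newly created model can express only pleasure and displeasure; after enough interactions, it reacts with differentiated emotions such as joy or fear, which is the growth behavior described above in miniature.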

Emotion development of a robot

Robot in the presence of desire

Our research also focuses on "desire" and on "physiological" factors such as sleepiness and hunger. Living creatures feel strong emotions such as frustration and anger when desires like hunger go unsatisfied, confirming that these elements are deeply connected with emotion.

Becoming friends with the robot!

We are also developing an emotion-generating model that can react differently to more than one person. The robot has evaluation criteria for each human, referred to as “degree of intimacy,” enabling it to make a corresponding response in each case.

Because the degree of intimacy rises when a user satisfies the robot's desires, the robot gradually exhibits a friendlier reaction as the user communicates with it in a friendly way.
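A per-person "degree of intimacy" can be sketched as a score maintained for each user, as below. This is a hypothetical illustration rather than the published model: the update amounts and reaction thresholds are assumptions. Interactions that satisfy the robot's desires raise the score, unfriendly ones lower it, and the robot's reaction is selected from the score of whoever it is facing.

```python
class IntimacyModel:
    """Sketch of per-person reaction selection via a degree-of-intimacy
    score maintained separately for each user."""

    def __init__(self):
        self.intimacy = {}  # user name -> degree of intimacy (0.0 .. 1.0)

    def interact(self, user, satisfied_desire):
        """Friendly interactions that satisfy the robot's desires raise
        intimacy; unfriendly ones lower it."""
        score = self.intimacy.get(user, 0.0)
        score += 0.2 if satisfied_desire else -0.1
        self.intimacy[user] = max(0.0, min(1.0, score))

    def reaction_to(self, user):
        # unknown users start at zero intimacy and get a wary reaction
        score = self.intimacy.get(user, 0.0)
        if score > 0.6:
            return "affectionate"
        if score > 0.2:
            return "friendly"
        return "wary"
```

Because each user has an independent score, the robot can greet a long-time companion affectionately while remaining wary of a stranger in the same room, which is the "react differently to more than one person" behavior described above.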

Robot with degree of intimacy


For the emotion-generating model proposed in our laboratory, we continue to analyze and improve the design through repeated numerical simulations. In the future, we hope to implement the model in a robot and conduct experiments on robot-human communication. We expect to be able to generate creature-like emotional expression in the robot.


M. Harata, M. Tokumaru, "An emotion generation model with growth functions for robots", Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 17, No. 2, pp. 335-342, 2013-03.

A. Fukumoto, H. Takenouchi, M. Tokumaru, "Verification of Differences in Growth due to Mutual Influence of Parent and Child Robots with Emotional Growth Functions", The 1st International Symposium on Affective Science and Engineering (ISASE2015), D2-2, 2015-03 (Tokyo, Japan).

S. Kinoshita, H. Takenouchi, M. Tokumaru, "An Emotion-Generation Model for a Robot that Reacts after Considering the Dialogist", 7th IEEE International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM2014), IS-09, 2014-11 (Puerto Princesa, Palawan, Philippines).

A. Fukumoto, H. Nagano, M. Tokumaru, "Care Action Generation Model for Robot as Parent Child Interaction with Emotion Growth Functions", 14th International Symposium on Advanced Intelligent Systems (ISIS2013), T1c-4, 2013-11 (Daejeon, Korea).

H. Nagano, M. Harata, M. Tokumaru, "Developing sophisticated robot reactions by long-term human interaction", The 15th International Conference on Human-Computer Interaction: Towards Intelligent and Implicit Interaction, Lecture Notes in Computer Science, Vol. 8008, pp. 319-328, 2013-07 (Las Vegas, Nevada, USA).