Let’s dive a bit into the robotics market today.
There are two main categories of robots: industrial robots, familiar from the automotive, high-tech, and other verticals, and service robots, which we find in hospitality and healthcare, but also at home, vacuum cleaners for instance. According to research by KPMG, roughly 30 million robots are active today: 6% of them are industrial robots and 94% are service robots.
Social robots form a special class of service robots. Currently there are approximately 2.2 million active social robots, which corresponds to 8% of all service robots. They interact and communicate with people on an emotional, personal level, not just with other robots, and they react differently to each individual. They are integrated into daily life and, because of their ability to interpret and imitate human behavior, in some cases even replace human relationships. This is especially true in Asia, where social robots are already common and even participate in family life. The best-known and probably most widely used bot is certainly Pepper, available since mid-2015 and, according to Inc, already sold more than 7,000 times in Japan alone; however, the Kirobo Mini, Buddy, Kuri, and other bots will certainly create some demand in Europe and the USA as well.
But service robots that do not accompany us directly in our daily lives or live with us in our own homes can be social too. Such bots interact with us in hotels, hospitals, shops, museums, etc. (e.g. Relay, LEA, Oshbot, …). Although these service robots are arguably less social, since they replace jobs rather than companionship, they are also classified as social robots. Ultimately, the built-in sensors, actuators, and the brainware (the artificial intelligence/cognitive computing component) determine the perceived “social behavior”.
Examples of sensors include video and audio (e.g. via webcams and microphones), but also geo-coordinates, temperature, pressure, light/brightness, acceleration, ionic strength, and many other measurable variables.
Actuators, or triggered actions, can be virtually anything: dialogues with a person, other robots, or even your smart home; sending information or alarms; making the robot move or drive; etc.
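To make the sensor-to-actuator idea concrete, here is a minimal sketch of such a loop. All names (`SensorReading`, `choose_action`, the threshold values) are illustrative assumptions, not part of any real robot SDK:

```python
# Minimal sketch of a sensor-to-actuator mapping for a social robot.
# SensorReading, choose_action, and all thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    """One snapshot of the robot's measurable environment."""
    brightness: float      # light sensor, 0.0 (dark) to 1.0 (bright)
    temperature_c: float   # ambient temperature in degrees Celsius
    sound_level_db: float  # microphone loudness

def choose_action(reading: SensorReading) -> str:
    """Map raw sensor input to a triggered action (actuator command)."""
    if reading.sound_level_db > 70:
        return "start_dialogue"         # someone is probably talking loudly
    if reading.brightness < 0.1:
        return "send_alert:lights_off"  # notify the smart home
    if reading.temperature_c > 30:
        return "send_alert:too_warm"
    return "idle"

print(choose_action(SensorReading(brightness=0.05, temperature_c=22.0, sound_level_db=40.0)))
# -> send_alert:lights_off
```

A real robot would of course replace these hard-coded thresholds with learned models, which is exactly where the brainware discussed next comes in.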
Apart from the hardware and software of the sensors and actuators, social robots additionally have that brainware piece, which allows the robot to learn and “develop” itself. It represents, more or less, the personality and logic of the bot and its cognitive abilities, and it generates actions from the sensory inputs.
For instance, when the sensor system films a face, the brainware is responsible for face, emotion, and speech recognition, and more. It decides, or interprets, how a person feels, whether they are alone, what they are doing, or whether they may have just fallen, and it can even make recommendations for action. This brainware, built and continuously improved through machine learning, is an essential aspect of cognitive computing and artificial intelligence. In one of my next posts, I will tell you more about cognitive computing – so, stay tuned.
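The decision layer described above can be sketched as follows. This is a toy rule-based stand-in: the perception results (emotion, alone, fallen) are assumed inputs here, whereas a real system would obtain them from trained machine-learning models, and the function name `recommend_action` is my own, not from any framework:

```python
# Illustrative sketch of the "brainware" decision layer: it consumes the
# outputs of perception models (face/emotion recognition, fall detection)
# and turns them into a recommended action. All names are hypothetical.

def recommend_action(emotion: str, is_alone: bool, has_fallen: bool) -> str:
    """Interpret perception results and recommend an action."""
    if has_fallen:
        # Safety first: a detected fall overrides everything else.
        return "call_for_help" if is_alone else "alert_nearby_person"
    if emotion == "sad" and is_alone:
        return "start_comforting_dialogue"
    if emotion == "happy":
        return "join_conversation"
    return "observe"

print(recommend_action(emotion="sad", is_alone=True, has_fallen=False))
# -> start_comforting_dialogue
```

In production systems the hand-written rules would themselves be replaced by learned policies, but the separation into perception, interpretation, and recommended action stays the same.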