Robotics is the discipline of designing, building and programming machines that perform tasks autonomously or in collaboration with humans. It has applications in many areas, such as industry, medicine, agriculture, exploration, security and entertainment. To interact with robots, however, humans often have to use physical devices such as keyboards, mice, joysticks, remote controls or sensors, which can be inconvenient, limiting or unintuitive.
To overcome these obstacles, some researchers are developing brain-robot interface (BRI) systems, which allow users to control robots by thought alone, without the need for any physical device. These systems are based on EEG (electroencephalography) technology, which records and interprets the electrical signals produced by the brain when a person thinks about an action or object.
An example of a BRI system is NOIR (Neural Signal Operated Intelligent Robots), developed by a group of researchers at Stanford University. NOIR is a general-purpose, intelligent system that lets users command robots to perform everyday tasks through brain signals. Through this interface, users communicate their objects of interest and intended actions to the robots, using two types of EEG signals: SSVEP (steady-state visually evoked potential) and MI (motor imagery).
SSVEPs are signals the brain produces when a person looks at a visual stimulus flickering at a fixed frequency. NOIR displays candidate objects on a screen, each flickering at a different frequency, and decodes the SSVEPs to identify which object the user wants to manipulate. MI signals arise when a person imagines moving a body part; NOIR uses machine-learning classifiers to decode them and determine how the user wants to interact with the object. As a safety mechanism, NOIR also captures jaw muscle tension, which the user can employ to confirm or reject a decoding result.
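The article does not describe NOIR's actual decoder, but the core SSVEP idea can be illustrated with a minimal sketch: compare spectral power at each candidate flicker frequency and pick the strongest. The sampling rate, frequencies and simulated signal below are invented for illustration, not taken from NOIR.

```python
import numpy as np

def decode_ssvep(signal, fs, candidate_freqs):
    """Return the candidate flicker frequency with the most spectral power.

    signal: 1-D EEG trace (e.g. an occipital channel)
    fs: sampling rate in Hz
    candidate_freqs: flicker frequencies assigned to on-screen objects
    """
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # Power at the frequency bin nearest each candidate.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Simulated 2-second trace: the user fixates a target flickering at 10 Hz.
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

print(decode_ssvep(eeg, fs, [8.0, 10.0, 12.0]))  # → 10.0
```

Real SSVEP decoders typically use more robust methods such as canonical correlation analysis across multiple channels, but the selection principle is the same: each object's flicker frequency acts as a tag that shows up in the user's EEG.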
NOIR offers various robotic skills, which can be combined to perform complex tasks. The robotic skills are based on 14 parameterised primitive skills, such as picking up, putting down, pushing, cutting, pouring, mixing, switching on, switching off, opening, closing, writing, drawing, playing and singing. These skills can be applied to different objects and contexts, and can be adapted to the preferences and intentions of users.
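The article names the primitive skills but not how they compose into tasks. One plausible way to model a library of parameterized skills chained into a sequence is sketched below; the skill names, parameters and `run_task` helper are illustrative, not NOIR's actual API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Skill:
    name: str
    execute: Callable[..., str]  # stand-in for a real robot-controller call

# Two illustrative primitives; NOIR's real skills drive a physical robot.
def pick(obj):
    return f"picked up {obj}"

def pour(src, dst):
    return f"poured {src} into {dst}"

LIBRARY = {
    "pick": Skill("pick", pick),
    "pour": Skill("pour", pour),
}

def run_task(steps):
    """Execute a task given as a sequence of (skill_name, args) selections."""
    return [LIBRARY[name].execute(*args) for name, args in steps]

# A tiny fragment of a kitchen task: pick the kettle, pour water into the cup.
log = run_task([("pick", ("kettle",)), ("pour", ("kettle", "cup"))])
print(log)  # → ['picked up kettle', 'poured kettle into cup']
```

Keeping each primitive parameterized by object and target is what lets a small fixed library cover many objects and contexts, as the article describes.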
NOIR has been demonstrated on 20 challenging everyday activities spanning cooking, cleaning, personal care and entertainment. For example, users can direct a robot to bake a cake, wash the dishes, sweep the floor, turn on the TV, write a letter, draw a flower, play the piano or sing a song, simply by thinking.
The system’s effectiveness is enhanced by its synergistic integration of robot learning algorithms, which enable NOIR to adapt to individual users and predict their intentions. The robot learns the user’s object, skill and parameter selections in a few-shot manner, reducing the effort and time required for decoding.
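The article does not detail how this learning works; as a rough illustration of the idea, a retrieval-style predictor that remembers past (object, skill) selections and proposes the user's habitual skill when the same object reappears could look like this. The class and its methods are invented for illustration and are not NOIR's actual learning component.

```python
from collections import Counter, defaultdict

class IntentionMemory:
    """Remember past (object, skill) selections and propose the most
    frequent skill when the same object comes up again."""

    def __init__(self):
        self.history = defaultdict(Counter)

    def record(self, obj, skill):
        self.history[obj][skill] += 1

    def suggest(self, obj):
        # Return the user's most common skill for this object, if any.
        if obj in self.history:
            return self.history[obj].most_common(1)[0][0]
        return None  # no prior selections: fall back to full decoding

memory = IntentionMemory()
memory.record("mug", "pick")
memory.record("mug", "pick")
memory.record("mug", "pour")

print(memory.suggest("mug"))    # → pick
print(memory.suggest("knife"))  # → None
```

Even a predictor this crude captures the payoff described in the article: when the system can guess the likely skill from a few past examples, the user only needs to confirm it instead of going through the full SSVEP/MI decoding pipeline each time.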
The work of Stanford researchers represents a breakthrough in research on thought-controlled robotics, and opens up new possibilities for human-robot interaction based on direct, neural communication. This type of interaction could have benefits for many categories of people, such as the disabled, elderly, sick or isolated, who could use robots to improve their quality of life and autonomy. However, thought-controlled robotics also presents challenges and risks, such as security, privacy, liability, trust and user acceptance, which require further study and regulation.
If you want to learn more about this topic:
- Social robotics: how ChatGPT improves the human-robot relationship, which explains how an artificial intelligence system based on natural language can make communication between humans and robots more fluid and human-like.
- Psychology of the human-robot relationship and work in Factory 4.0, which analyses the psychological and social implications of automation and robotization of production processes.
- Human-robot empathy: the complex relationship between AI and emotions, which explores the role of human emotions in relationships with robots and the potential of artificial intelligence to create empathetic artificial agents.