Sensory Systems for Robotic Applications
2: University of Derby, UK
3: Technical University of Munich, Germany
Robots have come a long way thanks to advances in sensing and computer-vision technologies, and today they can be found in healthcare, medicine and industry. Researchers have been working to provide robots with senses such as sight, smell, hearing and touch, so that they can mimic humans and interact with them and their surrounding environments.
Topics covered in this edited book include various types of sensors used in robotics, sensing schemes (e-skin, tactile skin, e-nose, neuromorphic vision and touch), sensing technologies and their applications including healthcare, prosthetics, robotics and wearables.
This book will appeal to researchers, scientists, engineers, and graduate and advanced students working in robotics, sensor technologies and electronics, and their applications in robotics, haptics, prosthetics, wearable and interactive systems, cognitive engineering, neuro-engineering, computational neuroscience, medicine and healthcare technologies.
Inspec keywords: tactile sensors; three-dimensional printing; optical sensors; piezoresistance; MOSFET; microsensors
Other keywords: tactile sensors; microsensors; optical sensors; piezoresistance; robotic applications; three-dimensional printing; sensory systems; MOSFET
Subjects: Sensing and detecting devices; MEMS and NEMS device technology; Low-field transport and mobility; piezoresistance (semiconductors/insulators); Insulated gate field effect transistors; Handbooks and dictionaries; Monographs, and collections; Optical instruments and techniques; Textbooks; Microsensors and nanosensors; Micromechanical and nanomechanical devices and systems
- Book DOI: 10.1049/PBCE097E
- Chapter DOI: 10.1049/PBCE097E
- ISBN: 9781849199483
- e-ISBN: 9781849199490
- Page count: 330
- Format: PDF
Front Matter
p. (1)
1 Development of tactile sensors for intelligent robotics research
pp. 1–26 (26)
Our goal is to reveal and reconstruct human intelligence, and we believe that physical and informational interactions with other humans and the environment are essential to it. To study physical interaction in the real world, we have developed systems such as humanoid robots and sensor suits, all with tactile sensors based on unique concepts. Since the development of a device is critically influenced by the objectives and constraints of each user, it is important to explain "what it is for" and "how it will be used." Therefore, we will discuss not only the developed tactile sensors but also the research and background related to them. First, we will introduce the developed tactile sensors and systems, including a sensor module with features such as scalability and the ability to fit on three-dimensional (3D)-curved surfaces, a sensor glove with approximately 1,000 detection points per hand that adapts to wrinkles, and a highly stretchable tactile sensor based on inverse problem analysis. Next, research on robots with tactile sensors and on tactile information processing is presented, including the lifting of large objects by a humanoid equipped with full-body tactile sensors, a dynamic roll-and-rise motion using contact data from the humanoid's back, learning in-hand grasping of differently shaped objects based on tactile recognition, and recognition of an object's posture while its state continues to change through interaction. Finally, we will discuss the relationship between whole-body movement, haptic exploration, and general creative activity as an important milestone in our research.
2 Developmental soft robotics
pp. 27–53 (27)
Since the term artificial intelligence (AI) was coined in 1956, the research field of intelligence was initially dominated by the 'computational paradigm of intelligence' (traditional cognitivism). In this context, intelligence was regarded as a computational process in which symbolic operations were of central interest, without explicitly considering what the symbols actually meant. At the time, a strong connection was conjectured between the idea of 'intelligence', the power of symbolic representation (e.g., in the brain), and the possibility for a system to change from one state to another [1, 2]. In this view, an individual would create a symbolic representation of the world by means of sensory perception, and a process akin to rule-based symbol manipulation would then allow them to exhibit intelligence [3, 4]. While the computational paradigm of intelligence has had a significant impact, mainly in cyberspace, a number of aspects of intelligence cannot be fully explained within this framework.
3 Three-axis tactile sensor using optical transduction mechanism
pp. 55–74 (20)
Tactile information plays a major role in our daily life and is crucial when we handle objects such as food, tools, books, or clothing. It is well known that we cannot button a shirt when we lose tactile sensation due to disease. For robots, tactile information is likewise important for fulfilling handling tasks. Many engineers and researchers in robotics have developed tactile sensors for robots that exploit various physical phenomena, such as electric resistance variation, magnetic field variation, the piezoelectric effect, piezoresistance variation, and optical variation. Recently, new technologies have been utilized, such as microelectromechanical systems (MEMS), metal-oxide-semiconductor field-effect transistors (MOSFETs), microchips, and 3D printing. As some authors have previously reported on these [1-4], we focus on optical tactile sensors in this chapter.
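The calibration step common to many optical three-axis tactile designs can be sketched as a linear model: intensity changes at several photodetectors are mapped to force components through a calibration matrix. This is an illustrative toy model, not a sensor from the chapter; the detector layout, calibration matrix, and readings below are all invented for demonstration.

```python
def estimate_force(intensities, baseline, calibration):
    """Estimate (Fx, Fy, Fz) from photodetector readings via F = C @ dI.

    intensities, baseline: lists of raw detector values
    calibration: 3 x N matrix (list of rows) mapping intensity
                 changes to force components
    """
    delta = [i - b for i, b in zip(intensities, baseline)]
    return [sum(c * d for c, d in zip(row, delta)) for row in calibration]

# Hypothetical 4-detector sensor: opposing detector pairs sense shear,
# while the common-mode intensity change senses normal force.
C = [
    [0.5, -0.5, 0.0, 0.0],    # Fx from left/right detector imbalance
    [0.0, 0.0, 0.5, -0.5],    # Fy from front/back detector imbalance
    [0.25, 0.25, 0.25, 0.25], # Fz from the mean intensity change
]
baseline = [100.0, 100.0, 100.0, 100.0]
reading = [110.0, 90.0, 100.0, 100.0]  # a pure x-shear contact
print(estimate_force(reading, baseline, C))  # → [10.0, 0.0, 0.0]
```

In practice the matrix would come from pressing the sensor with known forces and fitting, and real sensors may need a nonlinear model; the linear version only illustrates the transduction idea.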
4 Strain sensors for soft robotic applications
pp. 75–90 (16)
This chapter summarizes some recent developments in strain sensors for soft robotic applications, including the different strain-sensing mechanisms, fabrication approaches, and key results. Resistive-type strain sensors are quite common and generally have good sensitivity. However, they often suffer from hysteresis and a nonlinear electromechanical response. The performance of resistive-type strain sensors has been improved through the use of advanced nanomaterials, structural engineering, and improved fabrication approaches. On the other hand, capacitive-type sensors offer excellent stretchability, linearity, and negligible hysteresis, but they have poor sensitivity. Further, as interest in self-powered systems continues to rise, triboelectric-type strain sensors are gradually gaining attention. However, their sensitivity is still low compared to resistive-type strain sensors.
Regarding the use of strain sensors in robotic applications, a number of technical challenges still prevent their real-life deployment. First, it is still challenging to realize a stretchable strain sensor able to measure decoupled multidirectional and multiplane deformations [3]. Solving this challenge would be a breakthrough for applications such as soft robotics, considering the stretchable and conformable nature of soft robots. Rather than conventional architectures and materials, researchers could strive to realize more advanced sensing architectures with 3D structures and metamaterials. Other key features to introduce into available strain sensors include high sensitivity, linearity, self-healing, and a more reliable system-integration approach. Most strain sensors are also susceptible to unwanted pressure as well as variations in environmental conditions such as changes in temperature and humidity [3]. The solution to such environmental challenges could come from more advanced packaging.
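The resistive sensing mechanism discussed above is usually described by the standard gauge-factor model, in which the fractional resistance change equals the gauge factor times the strain, dR/R0 = GF * epsilon. A minimal sketch, with typical but invented values (a 120-ohm metal-foil-style gauge with a gauge factor of about 2):

```python
def strain_from_resistance(r, r0, gauge_factor):
    """Invert the linear model dR/R0 = GF * strain to recover strain
    from a resistance reading (valid only in the linear regime)."""
    return (r - r0) / (r0 * gauge_factor)

# A 120-ohm gauge reading 120.24 ohm at GF = 2.0 implies 0.1% strain.
print(round(strain_from_resistance(120.24, 120.0, 2.0), 6))  # → 0.001
```

The hysteresis and nonlinearity the chapter mentions are exactly the ways real stretchable sensors deviate from this idealized linear inversion.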
5 Neuromorphic principles for large-scale robot skin
pp. 91–123 (33)
Providing sensitive skin to robots has been explored since the 1980s [1]. The reasons for artificial robot skin are diverse. Recently, the development and application of robot skin have seen another boost, because collaborative and interactive robots are considered a viable solution (i) to further increase the level of automation in complex industrial scenarios [2]; (ii) for healthcare [3]; and (iii) in household [4] applications. So, why do robots need sensitive skin? Skin is considered the key factor to (1) enable robots to recognize textures for contact/object classification and recognition and (2) enable intuitive and safe human-robot interaction and collaboration. Texture recognition allows robots to add a sense of feel to objects which so far are known only visually. For example, the visual knowledge of an object (round and yellow) can be extended with the feel of the object (soft with a smooth surface). This knowledge helps the robot increase the success rate of interaction and manipulation tasks, because the robot can exploit knowledge about the grip properties of the object through its sense of touch. Furthermore, artificial robot skin can guarantee the safety of humans in the robot's workspace. In contrast to visual safeguards, contacts are sensed directly and cannot be occluded. In this way, robot skin already contributes to collaborative robots. Potentially, it makes robots safe enough to remove safety fences and allow humans to touch and interact closely with the robot. In addition, robot skin can provide an intuitive interface for manipulating and teaching the robot. With the development of appropriate tactile behaviors, the robot can be guided and taught simply by touching and moving it as desired [5].
6 Soft three-axial tactile sensors with integrated electronics for robot skin
pp. 125–172 (48)
This chapter introduces recent work by the Sugano lab on distributed, soft, three-axial tactile sensors for robot skin, based on capacitive [1-3] and Hall-effect [4-8] sensing, with distributed electronics.
7 A review of tactile sensing in e-skin, wearable device, robotic, and medical service
pp. 173–199 (27)
In order to better perceive and manipulate the surrounding environment, various tactile sensing technologies have been developed over the last decades, taking inspiration from the human sense of touch. Tactile sensors have been greatly improved in terms of miniaturization, integration, sensitivity, resolution, etc. However, it is still a huge challenge to integrate them on a large scale into devices of different shapes that require tactile information as feedback. This survey summarizes the mainstream tactile sensing technologies into eight categories in order to discuss the general merits and demerits of each type. An overall picture of the design criteria that can help researchers evaluate the performance of a tactile sensing device is presented before an extensive review of the applications, including electronic skins (e-skins), robotics, wearable devices, and medical services. After that, trends in the above fields are presented, such as multifunctional sensing capability, adjustable sensing density over a large area, conformability to complex surfaces, self-powered arrays, etc. It should be noted that state-of-the-art achievements in e-skins will, to a greater or lesser extent, facilitate the development of other fields in which tactile sensing technologies are urgently needed. Finally, challenges and open issues are discussed from the perspective of mass production, including standardization of the fabrication process, data transmission for a high-density sensing array, fault tolerance and autocalibration, and the layout of sensing elements on an irregular 3D surface without losing the mechanical and electrical performance of the sensors.
8 Neuroengineering approaches for cognitive hearing technology
pp. 201–212 (12)
Implementing neurofeedback in a hearing aid has recently become feasible thanks to the development of specialized hardware such as in-ear EEG recordings or the cEEGrid that wraps around the outer ear. The software aspects of neurofeedback are therefore now beginning to be tackled. They relate to two broader issues. First, neurofeedback requires obtaining information about cognitive aspects of acoustic processing from the EEG sensors. Second, neurofeedback necessitates using this neural readout to influence neural processing by altering processes in the hearing aid.
Here we have presented preliminary work on both issues. In particular, regarding the first challenge, we have shown how selective attention can be decoded from fast neural responses to the temporal fine structure of speech. We also presented work on the decoding of speech comprehension from the slower cortical tracking of the amplitude fluctuations in speech. Regarding the second issue, we showed how applying tiny electrical currents in a non-invasive manner can modulate, and in fact enhance, speech comprehension.
Real-world application of these recent advances, however, requires considerable further development. Regarding the readout of cognitive processing, application in a hearing aid ideally requires real-time analysis. In practice, this means that cognitive factors should be identifiable on a short timescale of a few hundred milliseconds or less. Moreover, these readouts should work reliably under different conditions, such as in different acoustic environments and with different types of background noise. As for speech enhancement, practical applications would benefit from improving speech comprehension by a considerable margin, say 20% or more, to make a real difference to the hearing-aid user. This will likely require the optimization of several types of hearing-aid processes, such as the acoustic processing as well as, perhaps, electrical current stimulation.
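The final decision step of attention decoding as described above is often a simple correlation comparison: given an envelope reconstructed from EEG, the attended talker is taken to be the one whose speech envelope correlates best with the reconstruction. The sketch below uses toy data and assumes the linear EEG-to-envelope reconstruction has already been done; it is an illustration of the selection step, not the chapter's specific method.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def attended_talker(reconstructed, envelopes):
    """Index of the candidate speech envelope most correlated with the
    envelope reconstructed from EEG."""
    scores = [pearson(reconstructed, env) for env in envelopes]
    return scores.index(max(scores))

talker_a = [0.1, 0.8, 0.3, 0.9, 0.2, 0.7]
talker_b = [0.9, 0.2, 0.8, 0.1, 0.7, 0.3]
recon    = [0.2, 0.7, 0.4, 0.8, 0.3, 0.6]  # noisy copy of talker A
print(attended_talker(recon, [talker_a, talker_b]))  # → 0
```

The real-time constraint mentioned above translates into running this comparison on very short windows, where correlation estimates become noisy; that is one reason reliable sub-second decoding remains hard.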
9 Mobile robot olfaction: state-of-the-art and research challenges
pp. 213–248 (36)
Olfaction is a fundamental sense for most animals, which use it to find mates or food, or to avoid predators [1]. The literature reporting studies on animal olfaction is extensive. Studies on insect olfaction, and on the olfactory capabilities of some male moths in particular, are especially interesting: such moths are known to detect pheromones released by females located several hundred meters away and are able to navigate natural environments until finding their mate [2]. The remarkable olfactory capabilities of several animals are well known and frequently exploited for practical applications. Dogs, pigs and even rats often team up with humans to find landmines [3], or are used in airport patrols and other public places, sniffing for drugs, explosives or other illegal substances [4]. They also assist search-and-rescue teams searching for victims [5], and can even detect diseases, such as some types of cancer or COVID-19 [6]. The use of trained animals to augment the limited capacities of human olfaction is acceptable for many applications but has some obvious drawbacks, such as the need to train the animals, the lack of quantified detections, limited operating time, the impossibility of operating in hazardous environments and the need for a human supervisor. These drawbacks may be, at least to some extent, mitigated by mobile robots with advanced olfactory capabilities. Such robots would be able to explore the oceans, searching for substances of interest, such as valuable minerals, pollution or organic matter. They may explore the atmosphere, searching for pollution sources; monitor farms, detecting pests and diseases threatening the crops; monitor forests, detecting wildfires at an early stage and eventually tracking their progress; or operate as advanced scientific instruments, helping scientists in the study of dangerous natural phenomena, such as volcanic eruptions.
We may also imagine robot dogs with the olfactory capabilities of their animal counterparts, but with the ability to operate autonomously, without human supervisors. A feature common to all of these robots is their capacity to smell, i.e., to detect a target substance moving in a fluid, be it water or air. By analogy with biology, we call this detectable substance an odour, and the chemical sensing system, possibly complemented by a flow sensing system, the robot's olfactory system.
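The moth behaviour mentioned above is often abstracted in robot olfaction as a "surge and cast" strategy: surge upwind while odour is detected, and cast crosswind with a widening sweep after losing the plume. The sketch below is a hedged toy version of that classic idea; thresholds, step sizes, and the widening factor are arbitrary illustrative values, not parameters from the chapter.

```python
def surge_and_cast(odour_detected, state):
    """One control step of a moth-inspired plume-tracking strategy.

    state carries the current casting direction and sweep amplitude.
    Returns a (dx_upwind, dy_crosswind) motion command.
    """
    if odour_detected:
        state["cast_amplitude"] = 1.0      # reset casting on odour contact
        return (1.0, 0.0)                  # surge straight upwind
    # Plume lost: cast side to side, widening the sweep at each reversal.
    step = state["cast_direction"] * state["cast_amplitude"]
    state["cast_direction"] *= -1
    state["cast_amplitude"] *= 1.5
    return (0.0, step)

state = {"cast_direction": 1, "cast_amplitude": 1.0}
print(surge_and_cast(True, state))   # → (1.0, 0.0)
print(surge_and_cast(False, state))  # → (0.0, 1.0)
print(surge_and_cast(False, state))  # → (0.0, -1.5)
```

Real systems layer gas-sensor dynamics, wind estimation, and obstacle avoidance on top of such a reactive core, but the state machine captures the biological intuition.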
10 Vision sensors for robotic perception
pp. 249–264 (16)
This chapter introduces vision sensors for robotic applications. It first briefly covers the working principles of the most widely used vision sensors, i.e., RGB cameras, stereo cameras and depth sensors, as well as the off-the-shelf vision sensors that have been widely used in robotics research, particularly in robot perception. Thanks to their low cost and high resolution, vision sensors are among the most widely used sensors on robots and have also been employed in other sensing modalities. In recent years, there has been rapid development in embedding vision sensors in optical tactile sensors. In such sensors, cameras are placed under an elastomer layer and used to capture its deformation during interactions with objects. Vision sensors enable robots to sense and estimate the properties of objects, e.g., their shapes, appearances, textures and mechanical parameters. We will cover various aspects of vision sensors for robotic applications, including the underlying technologies, hardware, integration, computational algorithms and applications in relation to robotics.
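The core geometric relation behind the stereo cameras mentioned above is the pinhole triangulation formula Z = f * B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of a matched point. A minimal sketch with a hypothetical rig (the numbers are invented for illustration):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo triangulation: Z = f * B / d.

    Zero or negative disparity corresponds to a point at or beyond
    infinity, so it is rejected.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 6 cm baseline, 35 px disparity.
print(depth_from_disparity(700.0, 0.06, 35.0))  # → 1.2 (metres)
```

The formula also explains a basic design trade-off: a wider baseline or longer focal length yields larger disparities and hence better depth resolution, at the cost of a smaller shared field of view.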
11 Audio sensors
pp. 265–298 (34)
Audio information is crucial for interfacing with robots and systems, for three main reasons: (1) Speech is used in human-human verbal communication, so it is natural for a robot or a system to have auditory functions to interface with humans. (2) Humans are said to use multimodal information for perception. Mehrabian [1] stated that the three main factors in communication are verbal, vocal, and facial information, with contributions of 7%, 38%, and 55%, respectively, in terms of understanding attitudes and character. This shows that auditory functions are essential to a robot or a system in addition to visual and other sensory functions. (3) Visual sensing is generally more accurate than audio sensing, but sound can be detected from transparent, occluded, or out-of-view objects that are difficult to detect with vision alone. Audiovisual sensing can thus disambiguate missing information in either modality [2]. This helps a robot or a system implement situation awareness, such as anomaly- and danger-detection functions. This chapter therefore broadly describes audio sensors, i.e., microphones, from devices to auditory processing. Note that the chapter focuses mainly on microphones as audio sensors, although loudspeakers and transducers are audio devices often discussed together with microphones.
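One concrete way a pair of microphones supports the situation awareness described above is sound localization: the inter-microphone time delay is estimated by cross-correlation, and that delay, together with the microphone spacing and the speed of sound, yields a bearing. The sketch below shows only the lag-estimation step on a toy signal; it is an illustration of the general technique, not an algorithm taken from the chapter.

```python
def best_lag(a, b, max_lag):
    """Estimate how many samples sequence b lags sequence a by
    maximizing the cross-correlation over candidate lags."""
    def xcorr(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a))
                   if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# A short "click" and the same click arriving two samples later,
# as it would at a second microphone slightly farther from the source.
sig = [0.0, 0.0, 1.0, 0.5, -0.5, 0.0, 0.0, 0.0]
delayed = sig[-2:] + sig[:-2]
print(best_lag(sig, delayed, 4))  # → 2
```

With a real array the lag would be converted to an angle via arcsin(c * tau / spacing), and subsample interpolation and noise-robust weighting (e.g., GCC-PHAT-style methods) would replace this brute-force search.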
12 Audio and gas sensors
pp. 299–309 (11)
Audio propagates through a medium (such as air or water) as continuous variations in pressure caused by vibrating mechanisms at the source [1]. These can be the human trachea, which houses vibrating vocal cords that emit speech, or an electronic speaker, whose vibrating metallic plate emits previously recorded sound.
Acoustic sensing can be considered the counterpart of this process: it aims to capture these vibrations from the air and transduce them into electric signals, which can then be further analyzed to obtain information about the auditory scene.
Audio sensors in robotic applications [2] have been applied in a wide variety of case scenarios, from human-robot interaction to search and rescue efforts. Although their use is not as widespread as other sensors explored in this work, they are worth describing so as to offer a complete survey of sensory systems for robotic applications.
This section is divided into two parts. The first explores the hardware side of auditory sensing in robotic applications, covering microphones and audio interfacing. The second explores the software side, describing current implementations so as to help readers choose what is most appropriate for their scenario.
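To illustrate the capture-and-analyze pipeline just described: once pressure variations are digitized into samples, even a simple statistic such as the RMS level (here in dB relative to full scale) already supports front-end tasks like voice-activity gating. This is a generic toy computation, not a method from the chapter.

```python
import math

def rms_dbfs(samples, full_scale=1.0):
    """RMS level of a block of audio samples, in dB relative to
    full scale (dBFS)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(rms / full_scale)

# A full-scale square wave has RMS 1.0, i.e. 0 dBFS;
# halving the amplitude lowers the level by about 6 dB.
print(round(rms_dbfs([1.0, -1.0, 1.0, -1.0]), 1))   # → 0.0
print(round(rms_dbfs([0.5, -0.5, 0.5, -0.5]), 1))   # → -6.0
```

A robot audio front end would compute such levels over short sliding windows and compare them against a noise-floor estimate before passing audio on to heavier processing.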
Back Matter
p. (1)