
Sensing Technologies in Robotics


Today’s robots are becoming more human-like, not only in terms of movement but also in how they sense the real world. The rapid evolution of sensor technologies for robotic applications is supporting this trend, and the ability of robots to make decisions based on sensory feedback will have massive industrial and societal impact. 

Evolution of Sensing Technologies

The first mobile robot capable of any level of reasoning about its surroundings was built in 1970 by the Stanford Research Institute (now SRI International) in California. The robot, named ‘Shakey’, combined multiple sensor inputs, including TV cameras, laser range-finders and ‘bump’ sensors to navigate. 

In 1972, Waseda University in Japan created WABOT-1, the world’s first full-scale humanoid robot, which could grip and transport objects with its hands using tactile sensors. A vision system was deployed to measure distances, while directions to objects were gauged using external receptors – artificial eyes and ears. Just two years later, David Silver designed the Silver Arm, which was capable of fine movements that replicated human hands with feedback provided by touch and pressure sensors. There have been many more notable advances since these early robot sensing efforts. Arguably the most famous was ASIMO, which was created out of Honda’s humanoid project in 2000. ASIMO could communicate with humans, and recognise faces, environments, voices and postures.

Ongoing research into sensor capabilities has resulted in greater adoption in the industrial robot sector. And, as with all technologies that make the leap from research lab to commercial production, cost is falling in line with uptake. Thanks to the proliferation of industrial robots, volumes will rise and costs will fall further, making the latest sensor technology available to all, not just multinational robot OEMs. 

Robots’ Senses: Seeing, Hearing, Touching, Moving

A wide variety of sensors are needed to give a robot a complete picture of the environment in which it operates. So, what are the key technologies that help robots see, hear, touch and move, and how are they developing? 

Sound and Vision 

Fundamental to primary robot intelligence, vision sensing can be based on technologies ranging from the traditional camera, sonar and laser, through to the latest RFID technology. 

Light detection and ranging (LIDAR) systems are also a popular choice for robot vision. This technology bounces light off nearby surfaces to create a 3D map of the world around it. LIDAR is like radar in its basic mechanics, but because it uses light, not radio waves, it offers greater resolution. There are many key advances in vision-related hardware, not least the development of high-speed, low-noise CMOS image sensors, and new 2D and 3D vision systems. 2D vision is essentially a video camera that can perform tasks ranging from the detection of motion to locating parts on a conveyor, thereby helping the robot coordinate its position. 3D vision systems normally rely on either two cameras set at different angles, or laser scanners. With this technology, a robot can detect parts in a tote bin, for example, recreate a part in 3D, analyse it and pick the best handling method. 
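The geometry behind a LIDAR scan is simple: each beam returns a distance at a known angle, and converting those polar measurements to Cartesian coordinates yields the map of nearby surfaces. A minimal sketch for a planar (2D) scan follows; the function name and parameters are illustrative, not from any particular sensor driver.

```python
import math

def lidar_scan_to_points(ranges, angle_min, angle_increment):
    """Convert a planar LIDAR scan into 2D Cartesian points in the
    sensor frame. ranges: measured distances in metres, one per beam;
    angle_min / angle_increment: start angle and step, in radians."""
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_increment
        # Polar-to-Cartesian conversion for each beam
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A 3-beam scan sweeping from 0 to 90 degrees, all returns at 1 m:
pts = lidar_scan_to_points([1.0, 1.0, 1.0], 0.0, math.pi / 4)
```

A 3D system extends the same idea with a second (elevation) angle per beam.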

Complementing vision sensors, audio sensors based on multiple microphones can be deployed to determine the direction and intensity of a person’s voice or listen to sound-based commands. Sensitivity can be adjusted using a potentiometer. Microphone technology has been around for a long time, but in the future, sound/audio sensors may be able to determine the emotional status of a human voice. However, this will demand analogue-to-digital conversion (ADC) and digital-signal processing (DSP) electronics in tandem with a powerful microprocessor and advanced software.
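With two or more microphones, the direction of a voice can be estimated from the time difference of arrival (TDOA) between them. The sketch below uses the standard far-field model for a two-microphone pair; the spacing and timing values are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def direction_from_tdoa(delta_t, mic_spacing):
    """Estimate the bearing of a far-field sound source from the
    time-difference-of-arrival between two microphones.
    delta_t: arrival-time difference in seconds; mic_spacing: metres.
    Far-field model: delta_t = mic_spacing * sin(bearing) / c."""
    s = delta_t * SPEED_OF_SOUND / mic_spacing
    s = max(-1.0, min(1.0, s))  # clamp against timing noise
    return math.degrees(math.asin(s))

# A sound arriving 0.2 ms earlier at one mic of a 10 cm pair:
bearing = direction_from_tdoa(0.0002, 0.10)
```

Real arrays use more microphones and cross-correlation to measure the delay, but the geometry is the same.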

Touch  

A tactile sensor is a device that registers contact between an object and the sensor. Tactile sensors are found in everyday objects such as lamps that brighten or dim when their base is touched. The stimulus-to-response pathway in an electronic touch sensor mirrors the human process involving the skin, signal transmission via the nervous system, and the brain. Touch-sensor options include wire resistive – which measures the resistance between electrically resistive layers at the point of contact to determine the touch position – as well as surface capacitive, projected capacitive, surface acoustic wave and infrared. Among recent advances in this area are adaptive filters. Applied to robot logic, such filters enable the robot to predict the sensor signals that its own motions will produce, screening out these self-generated signals. As a result, contact detection is improved and false interpretation reduced. 
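The adaptive-filter idea can be reduced to a residual test: subtract the tactile signal the robot predicts from its own motion, and treat only what remains as genuine contact. A minimal sketch, with illustrative signal values and threshold:

```python
def detect_contact(measured, predicted, threshold=0.5):
    """Flag true contacts in a tactile signal by removing the
    component the robot's own motion is predicted to produce.
    measured/predicted: per-sample signal values; threshold is an
    illustrative residual level, not a calibrated figure."""
    residuals = [m - p for m, p in zip(measured, predicted)]
    return [abs(r) > threshold for r in residuals]

# Self-motion explains most of the signal except one true contact:
measured  = [0.1, 0.2, 1.5, 0.1]
predicted = [0.1, 0.2, 0.2, 0.1]
contacts = detect_contact(measured, predicted)
```

In practice the prediction comes from a learned or physics-based model of the arm's motion, updated online as the filter adapts.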

Force/Torque

As vision gives eyes to a robot, force-torque (FT) sensors give it ‘feel’, telling users the force a robot applies with its end effector. This can aid assembly operations – if a component does not fit well, for example, feedback from the sensor allows the robot to adjust its movement and reorient the part into the correct position. An FT sensor detects forces and torques in up to three geometric (XYZ) axes each. Typically, the sensors are fitted at the robot flange or wrist so that effort can be measured effectively. Selection criteria for FT sensors include the number of measured axes, physical dimensions, force range and communication rate. FT sensors are important in collaborative and safety-related functions, as force-limiting capability is essential to robotic systems that work safely alongside humans. 
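The force-limiting role described above amounts to comparing the resultant force at the flange against a safety limit. A minimal sketch; the 150 N default is an illustrative figure, not a value taken from any safety standard.

```python
import math

def exceeds_force_limit(fx, fy, fz, limit_newtons=150.0):
    """Check whether the resultant force measured by a flange-mounted
    FT sensor exceeds a collaborative-safety limit. fx/fy/fz are the
    per-axis force readings in newtons; the limit is illustrative."""
    magnitude = math.sqrt(fx**2 + fy**2 + fz**2)
    return magnitude > limit_newtons

# A 30/40/120 N reading has a resultant of exactly 130 N -> within limit
stop_robot = exceeds_force_limit(30.0, 40.0, 120.0)
```

A real collaborative controller would also check the three torque axes and react within a guaranteed response time.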

Proximity/Collision Detection

Proximity sensors that detect the presence of nearby objects (or targets) without any physical contact are placed on moving robot parts such as end effectors, with the sensor emerging from sleep mode at a pre-specified distance. Working on the principle that ‘no contact is better than some contact’, one of the growth applications for proximity sensors is in collaborative robots, where they help to ensure a safe environment for human workers. 

Different targets demand different sensors – a capacitive or photoelectric sensor, for example, may be appropriate for a plastic target, while inductive proximity sensors always require a metal target. Sometimes user-adjustable, the maximum distance that a proximity sensor can detect targets is defined as its nominal range. Proximity sensors typically offer high reliability and long functional life thanks to the absence of moving parts and the lack of physical contact between sensor and target. The wide variety of types includes those based on capacitive, eddy current, inductive, magnetic, optical, photo-resistive, radar, sonar, ultrasonic and fibre-optic technologies. 

Infrared sensors, for instance, transmit a beam of light that is reflected off a target and captured by a receiver, while ultrasonic sensors generate high-frequency sound waves, with the presence of an echo indicating interruption by an object. Using ultrasound rather than infrared overcomes infrared’s challenges of short range and the need for calibration. Ultrasound is reliable in any lighting conditions and fast enough to handle collision avoidance for a robot. It can also tolerate being shaken, as long as the motion is not exceptionally fast. 
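Ranging with an ultrasonic sensor reduces to one piece of arithmetic: the echo's round-trip time, multiplied by the speed of sound, covers the distance to the target and back, so the one-way range is half of that.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(echo_time_s):
    """One-way range from an ultrasonic ping: the pulse travels to
    the target and back, so halve the round-trip distance."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 10 ms round trip corresponds to a target 1.715 m away
d = distance_from_echo(0.01)
```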

Position

Used for sensing and controlling arm position, the three most common types of position sensors are encoders, potentiometers and resolvers. Popular types of encoders – which convert angular or linear displacement into digital signals – include linear, rotary, incremental and absolute. Incremental encoders use a glass disc marked with alternating opaque and transparent stripes. 

A phototransmitter sits on one side of the disc and a photoreceiver on the other. As the disc rotates, the light beam is alternately transmitted and interrupted, delivering output as a pulse train whose frequency is proportional to the disc’s rotational speed. Used mostly to determine the absolute position of a part, absolute encoders are similar to incremental encoders, but their stripes are arranged to produce a binary number that corresponds to the shaft angle. 
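Turning that pulse train into a speed reading is straightforward: pulses counted over a sampling interval, divided by the encoder's pulses per revolution, gives revolutions, which scale to rpm. A minimal sketch with an assumed 1024-line encoder:

```python
def rpm_from_pulses(pulse_count, pulses_per_rev, interval_s):
    """Shaft speed from an incremental encoder: pulses counted over a
    sampling interval, divided by pulses per revolution, gives
    revolutions; scale by 60/interval for revolutions per minute."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions * 60.0 / interval_s

# A 1024-line encoder producing 5120 pulses in 0.5 s is turning at 600 rpm
speed = rpm_from_pulses(5120, 1024, 0.5)
```

Direction of rotation comes from a second, phase-shifted pulse channel (quadrature), which this sketch omits.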

Potentiometers, which can also be used to determine position, are essentially ‘voltage divider’ systems that produce an output voltage proportional to the position of a rotating wiper in contact with a resistive element. The wiper divides the voltage across the resistive element into two parts and, by measuring the output voltage, its position can be pinpointed. Like potentiometers, resolvers are analogue devices. In this case, rotary electrical transformers are used to measure degrees of rotation. A resolver requires an AC excitation signal, and its output signal is proportional to the angle of the rotating element with respect to the fixed element.
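The voltage-divider relationship makes the potentiometer readout a one-line calculation: the measured output as a fraction of the supply voltage, scaled to the pot's electrical travel. The 270-degree travel below is a common value assumed for illustration.

```python
def wiper_angle(v_out, v_supply, travel_degrees=270.0):
    """Rotary-potentiometer readout via the voltage-divider model:
    output voltage is proportional to wiper position, so the angle is
    the measured fraction of the supply scaled to the electrical
    travel (270 degrees is a typical, assumed value)."""
    fraction = v_out / v_supply
    return fraction * travel_degrees

# A 2.5 V reading on a 5 V supply puts the wiper at mid-travel: 135 degrees
angle = wiper_angle(2.5, 5.0)
```

A resolver readout works differently, decoding the phase of sine/cosine windings, but likewise yields an absolute angle.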

Further Advances in Robot Sensing Technology

Robot sensing technology is advancing rapidly, offering up a myriad of advanced and sometimes radical industry solutions for safety, and supporting more effective forms of collaboration between people and machines. 

Existing sensors, including cameras and depth sensors, are often affected by lighting and offer only a rough idea of a person’s position in 3D space. Emerging safety systems allow people to work in closer proximity to powerful robots, which will shut down completely if a person moves too close. By tracking a person’s motions more precisely (for instance using enhanced radar techniques), next-generation systems will make it possible for powerful robots to work in concert with a human co-worker. Such technology might also improve efficiency, because workers could grasp something that a robot has finished working on, without fear of being injured. 

Another recent breakthrough is a flexible sensor ‘skin’ that can be stretched over any part of a robot’s body to accurately convey the information about shear forces and vibration that is critical to grasping and manipulating objects. The skin mimics the way a human finger experiences tension and compression as it slides along a surface or distinguishes between textures. This tactile information is measured with similar precision and sensitivity as human skin, and could vastly improve robots’ ability to perform all tasks, from industrial to medical procedures.

Robotics Sensor Market

The increased use of robots in industries such as automotive, food and beverage, renewable energies, logistics, medical care, and telecommunications is a major factor that is expected to augment growth in the industrial robot sensors market over the coming years. 

GM Insights expects that the robot sensor market will be subject to an 11.5% CAGR between now and 2028 – with more than 12 thousand units being shipped over that time. iDTechEx forecasts that vision systems alone will command a worldwide market value of $5.7 billion by 2027, while force sensing technologies will reach $6.9 billion by that stage. A Technavio study predicts that the materials-handling segment will dominate the industrial robot sensors market over the next decade – particularly in sectors like automotive, food and beverage, packaging and pharmaceutical. Furthermore, increasing momentum behind Industry 4.0/5.0 will also prove a significant factor, driving market growth in the coming years. 
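A CAGR figure compounds annually, so its long-run effect is easy to check: at the quoted 11.5% a market roughly doubles in about six and a half years. A quick arithmetic sketch (the starting value of 1.0 is a normalised placeholder, not a figure from the cited reports):

```python
def project_market(value_now, cagr, years):
    """Compound a market size forward at a constant annual growth
    rate: value * (1 + CAGR)^years."""
    return value_now * (1.0 + cagr) ** years

# Seven years of 11.5% CAGR more than doubles the starting value
future = project_market(1.0, 0.115, 7)
```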

To find out more, read our guide to Robotics and Automation where you can find this and much more interesting content.
