Integrating LiDAR with Depth and RGB Cameras for Enhanced Robot Perception and Navigation

Disciplines

Robotics

Abstract (300 words maximum)

LiDAR, an acronym for Light Detection and Ranging, is a leading technology for extending perception beyond the inherent limits of conventional camera systems. It operates by emitting laser pulses and measuring the time each pulse takes to return, yielding precise distance measurements across its field of view. LiDAR is notably resilient to varying environmental conditions, performing well even in low-light or otherwise suboptimal lighting while delivering distance measurements at high spatial resolution.

In this study, a 2D LiDAR sensor is employed. The sensor reports the distances to all objects within its measurement radius at the same elevation as the laser emitter, and these measurements are transformed into a 2D map. This map gives the robot enhanced environmental awareness, enabling it to perceive its surroundings, avoid obstacles, and plan its path of traversal.
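The scan-to-map step described above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: it converts one polar LiDAR scan (beam ranges plus a start angle and angular increment, as in a typical 2D scanner message) into Cartesian points, then rasterizes them into a simple occupancy grid centered on the sensor. The resolution and grid size are illustrative placeholders.

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert a 2D LiDAR scan (polar ranges) into Cartesian (x, y) points.

    ranges: one distance per beam, measured in the sensor's scan plane.
    Beams with no return (inf/NaN or beyond range_max) are discarded.
    """
    ranges = np.asarray(ranges, dtype=float)
    angles = angle_min + angle_increment * np.arange(len(ranges))
    valid = np.isfinite(ranges) & (ranges < range_max)
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    return np.column_stack((x, y))

def points_to_grid(points, resolution=0.05, size=200):
    """Rasterize points into a square occupancy grid centered on the sensor.

    resolution: cell edge length in meters; size: grid width in cells.
    Occupied cells are marked 1; all others remain 0 (free/unknown).
    """
    grid = np.zeros((size, size), dtype=np.uint8)
    idx = np.floor(points / resolution).astype(int) + size // 2
    inside = np.all((idx >= 0) & (idx < size), axis=1)
    grid[idx[inside, 1], idx[inside, 0]] = 1  # row = y cell, col = x cell
    return grid
```

A full mapping stack would additionally accumulate scans over time using the robot's pose estimate; this sketch handles only a single scan.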

To address the inherent limitations of the 2D LiDAR sensor, a complementary approach is adopted. Data from a depth camera adds a three-dimensional perspective, allowing the quadruped robot to gather spatial information across a wider field of view. RGB cameras further enable object recognition, giving the quadruped robot, GoAir 1, a comprehensive understanding of its environment. Together, these sensing modalities provide the robot with the information needed to make sound decisions, whether it is operated under manual guidance or navigates autonomously using Simultaneous Localization and Mapping (SLAM) techniques.
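The depth camera's contribution of three-dimensional information can be illustrated with the standard pinhole back-projection: given a pixel located by the RGB object detector and its measured depth, the point is lifted into the camera frame. This is a generic sketch, not the project's code; the intrinsics (fx, fy, cx, cy) are hypothetical values that would come from the camera's calibration.

```python
import numpy as np

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with measured depth into a 3D point.

    (fx, fy): focal lengths in pixels; (cx, cy): principal point.
    Returns the point in the camera frame, with z pointing forward.
    """
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Illustrative use: a detector reports an object centered at pixel (400, 260)
# and the aligned depth image reads 1.5 m there.
point = pixel_to_3d(400, 260, 1.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

In a fused pipeline, points obtained this way would be transformed into the robot's body or map frame and combined with the LiDAR-derived 2D map for obstacle avoidance and planning.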

Academic department under which the project should be listed

SPCEET - Robotics and Mechatronics Engineering

Primary Investigator (PI) Name

Muhammad Hassan Tanveer
