What is SLAM technology?
Updated: 2022-11-11 17:33:31
Simultaneous Localization and Mapping (SLAM) is a technology field that addresses how a robot moving through an unknown environment can localize itself while building a map of that environment.
Simply put, SLAM lets a robot use its sensors to work out where it is, where it is going, how to get there, and what lies ahead. The system then uses that environmental information to determine its orientation and plan a path.
A little hard to understand? No problem, let's take an example.
Let's say you are on a business trip to an unfamiliar city. To get familiar with the environment quickly and check into your hotel, you would do the following.
1. Feature extraction
Observe the surroundings with your eyes and remember their features.
2. Map construction
Construct a 2D or 3D map of the environment in your brain based on the information obtained from your eyes.
3. Bundle Adjustment or EKF
As you walk, you constantly acquire new features and landmarks and adjust your mental map model.
4. Trajectory
Determine your position based on the feature landmarks you have acquired during the previous walk.
5. Loop-closure Detection
When you have walked a long way, you match the landmarks in your mind against what you see, to check whether you have returned to a place you have already visited.
These five steps are performed at the same time, which is why the technique is called Simultaneous Localization and Mapping.
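The five steps above can be sketched as a toy 1-D walk. Everything in this snippet is an assumption chosen for illustration (the step biases, the landmark names, the naive loop-closure correction); it is not a real SLAM algorithm, just the shape of the idea.

```python
STEP = 1.0        # commanded step length, metres (assumed)
BIAS_OUT = 0.05   # assumed odometry error per outbound step
BIAS_BACK = 0.02  # assumed odometry error per return step

landmarks = {"door": 1.0, "plant": 2.0, "lift": 3.0}  # true positions

true_pose, est_pose = 0.0, 0.0
landmark_map = {}  # step 2: the map being built

# Outbound walk: extract features (1), map them (2), refine the map as
# new landmarks arrive (3), and dead-reckon the trajectory (4).
for name, true_pos in landmarks.items():
    true_pose += STEP
    est_pose += STEP + BIAS_OUT             # drifting trajectory estimate
    measured_range = true_pos - true_pose   # step 1: sensor observation
    landmark_map[name] = est_pose + measured_range

# Walk back to the start; odometry drift keeps accumulating.
for _ in landmarks:
    true_pose -= STEP
    est_pose -= STEP + BIAS_BACK

# Step 5: loop closure. Re-recognise the door and use the mismatch
# between where it seems to be and where the map says it is to
# correct the accumulated drift.
measured_range = landmarks["door"] - true_pose
predicted = est_pose + measured_range
drift = predicted - landmark_map["door"]
est_pose -= drift

print(f"residual pose error: {est_pose - true_pose:+.3f} m")
```

Without the loop-closure step the estimated pose would keep the full accumulated drift; the correction pulls it back toward the true position.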
As an indispensable and important technology for autonomous mobile robots, SLAM technology is receiving more and more attention.
SLAM technology is widely used in robotics, UAVs, autonomous driving, AR, VR, and other fields, relying on sensors to provide autonomous localization, map construction, path planning, and autonomous navigation.
Laser SLAM or visual SLAM?
The sensors currently used in SLAM fall mainly into two categories: LiDAR-based laser SLAM and camera-based visual SLAM (VSLAM).
Like the human eye, the camera is a primary source of external information: it captures massive, texture-rich, redundant data from the environment, which is the key advantage of visual SLAM.
Cameras are often used as a robot's "eyes" because they are small, low-power, and inexpensive, and this is the foundation of visual SLAM.
The robot maps its surroundings from the camera's image stream, transmits the result to its "brain," and the system then makes a judgment from that map to complete the robot's positioning.
However, processing this information is difficult and computationally complex, and the approach is easily affected by lighting conditions, so in some situations visual SLAM alone is not enough.
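A tiny sketch of why lighting matters: matching image patches by raw pixel intensity breaks when a light turns on, while a normalised descriptor (zero mean, unit scale, the idea behind many real feature descriptors) still matches. The patch values below are made-up illustrations, not real image data.

```python
def normalise(patch):
    """Zero-mean, unit-norm version of a patch (brightness-invariant)."""
    mean = sum(patch) / len(patch)
    centred = [p - mean for p in patch]
    norm = sum(c * c for c in centred) ** 0.5 or 1.0
    return [c / norm for c in centred]

def ssd(a, b):
    """Sum of squared differences: lower means a better match."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

patch = [10, 40, 80, 40, 10]            # patch seen at time t (assumed)
brighter = [p + 50 for p in patch]      # same patch after a lamp turns on
other = [30, 60, 60, 30, 10]            # a different patch nearby

# Raw intensities: the brightness change makes the true match look
# worse than the wrong patch, so tracking would fail.
print(ssd(patch, brighter) > ssd(patch, other))   # True

# Normalised descriptors: the true match wins again.
print(ssd(normalise(patch), normalise(brighter))
      < ssd(normalise(patch), normalise(other)))  # True
```

Real visual SLAM systems use more robust descriptors (and many other safeguards), but the failure mode is the same: appearance changes that carry no geometric information can still confuse the matcher.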
That's why laser SLAM is here to help.
Laser SLAM uses 2D or 3D LiDAR (single- or multi-line LiDAR). 2D LiDAR is generally used on indoor robots (such as floor sweepers), while 3D LiDAR is generally used in autonomous vehicles, outdoor robots, AMRs/AGVs, etc. The emergence and popularity of LiDAR have enabled faster, more accurate measurements with richer information.
LiDAR distance measurement is more accurate, its error model is simple, it operates stably in all but a few special environments, and its point clouds are easier to process, so it adapts well to dynamically changing environments. Laser SLAM theory is also relatively mature, and the corresponding products are more plentiful.
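To give a feel for why point clouds are easy to work with, here is a minimal alignment sketch: when point correspondences between two 2-D scans are known, the best translation is simply the difference of their centroids, which is the core of one step of the classic ICP (iterative closest point) algorithm. The scan coordinates and motion below are assumed toy values.

```python
def centroid(points):
    """Mean position of a list of (x, y) points."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# A wall segment seen by the LiDAR at time t (assumed coordinates).
scan_a = [(0.0, 0.0), (1.0, 0.0), (1.0, 2.0), (3.0, 1.0)]

# The same wall seen after the robot moved by (0.5, -0.25): every
# point shifts by the opposite amount in the sensor frame.
scan_b = [(x - 0.5, y + 0.25) for x, y in scan_a]

ax, ay = centroid(scan_a)
bx, by = centroid(scan_b)
tx, ty = ax - bx, ay - by   # estimated robot translation

print(f"estimated motion: ({tx:+.2f}, {ty:+.2f})")
```

Real scan matching must also estimate rotation and discover the correspondences itself (which is what ICP iterates over), but the geometric core is this simple.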
Comparing the two shows that laser SLAM and visual SLAM each have their own strengths and limitations, and fusing them lets each compensate for the other's weaknesses.
For example, vision works stably in dynamic environments with rich texture and can provide very accurate matching cues for laser SLAM, while the precise direction and distance information from LiDAR makes correctly matched point clouds even more powerful.
Conversely, in environments with severe light deficits or missing texture, laser SLAM keeps localization running, allowing vision to record scenes that carry little information.
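One standard way to combine two independent position estimates, such as one from vision and one from LiDAR, is inverse-variance weighting: the more certain sensor gets the larger weight, and the fused estimate is tighter than either input. The variances below are assumed for illustration, not real sensor specifications.

```python
def fuse(x1, var1, x2, var2):
    """Fuse two independent estimates by inverse-variance weighting."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)   # fused variance is smaller than both inputs
    return x, var

# Vision fix: 10.3 m with assumed variance 0.25 (texture is decent).
# Laser fix: 10.1 m with assumed variance 0.0625 (range is precise).
x, var = fuse(10.3, 0.25, 10.1, 0.0625)

print(f"fused position: {x:.2f} m, variance {var:.4f}")
```

The result sits closer to the laser estimate because the laser is more certain here; in a dark, texture-poor corridor the variances (and therefore the weights) would swing the other way.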
Future Applications
SLAM technology has already been deployed successfully in many fields, including indoor mobile robots, AR/VR, drones, autonomous vehicles, and more.
In the future, continued improvement in sensor accuracy and gradually falling costs will bring revolutionary changes to many more industries.
As SLAM technology heats up, more and more talent will enter the field of mobile robotics, bringing fresh energy, new technical directions, and new research areas.