Our perception framework fuses data from heterogeneous sensors through advanced fusion algorithms to create a highly accurate 2.5D dynamic occupancy grid. The map-based localization feature combines visual cues, Inertial Navigation Unit (INU) pose data, and wheel odometry to deliver precise pose estimates with an accuracy of better than 5 cm, even in tight and narrow pathways.
Environment Perception
- Occupancy grid map (OGM) generation using planar and 3D LiDARs and cameras
- Coordinate transformation
- Configurable grid resolution and radial distance
- Configurable exclusion zone
- 3D to 2D projection with spatial partitioning
- Configurable z-axis cut-offs
- Probabilistic and binary OGM generation from 3D data (see the sketch after this list)
- Configurable potential field for obstacles
- Extraction of occluded, free, and occupied zones
- Drivable region extraction
- Object classification and tracking
- Obstacle list generation
- Lead subject tracking
- Threat prediction
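To make the list items on 3D-to-2D projection, z-axis cut-offs, and probabilistic/binary OGM generation concrete, here is a minimal sketch. The class name, parameters, and update values are illustrative assumptions, not the framework's actual API, and free-space ray casting is omitted for brevity.

```python
# Minimal sketch: 3D-to-2D projection with z-axis cut-offs feeding a
# probabilistic (log-odds) occupancy grid, from which a binary grid is
# thresholded. All names and numeric values are illustrative assumptions.
import numpy as np

class LogOddsGrid:
    def __init__(self, size_m=20.0, resolution=0.1,
                 z_min=0.05, z_max=1.8, l_hit=0.85, l_clamp=5.0):
        self.res = resolution                       # configurable cell size [m]
        self.n = int(size_m / resolution)           # cells per grid side
        self.z_min, self.z_max = z_min, z_max       # configurable z-axis cut-offs
        self.l_hit, self.l_clamp = l_hit, l_clamp
        self.log_odds = np.zeros((self.n, self.n))  # log-odds 0 == p(occupied) 0.5

    def update(self, points_xyz):
        """Project 3D points into 2D cells and raise their occupancy log-odds."""
        z = points_xyz[:, 2]
        keep = (z >= self.z_min) & (z <= self.z_max)   # drop floor/ceiling returns
        cells = np.floor(points_xyz[keep, :2] / self.res).astype(int) + self.n // 2
        inb = np.all((cells >= 0) & (cells < self.n), axis=1)
        np.add.at(self.log_odds, (cells[inb, 0], cells[inb, 1]), self.l_hit)
        np.clip(self.log_odds, -self.l_clamp, self.l_clamp, out=self.log_odds)

    def probability(self):
        """Probabilistic OGM: per-cell occupancy probability."""
        return 1.0 / (1.0 + np.exp(-self.log_odds))

    def binary(self, threshold=0.65):
        """Binary OGM: threshold the probabilistic grid."""
        return self.probability() >= threshold

# Usage with a synthetic point cloud in the robot frame (x, y, z in meters):
cloud = np.array([[1.0, 0.5, 0.4], [1.0, 0.5, 0.5], [2.0, -1.0, 2.5]])
grid = LogOddsGrid()
grid.update(cloud)        # the z = 2.5 m point is removed by the cut-off
occupied = grid.binary()
```

A complete implementation would also ray-cast each beam to lower the log-odds of traversed cells; that step is what separates the free and occluded zones listed above from occupied ones.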
Localization
- Particle filter for pose tracking (see the sketch after this list)
- Accuracy of less than 5 cm, even in challenging corridor environments
- Filtering of random disturbances in measurements
- Fusion of wheel odometry, visual odometry, and indoor GPS
- Dynamically adjustable sensor weights based on the scenario
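A minimal sketch of the particle-filter idea from the list above: particles are propagated with wheel-odometry deltas and re-weighted against an indoor-GPS fix, with a per-sensor trust factor that can be adjusted at runtime. The function names, noise levels, and measurement model are assumptions for illustration, not the product's actual implementation.

```python
# Minimal particle-filter sketch for tracking a 2D pose (x, y, theta).
# Motion/measurement models, noise levels, and the per-sensor weight are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, d_pos, d_theta, noise=(0.02, 0.02, 0.01)):
    """Propagate particles by a wheel-odometry delta plus Gaussian noise."""
    n = len(particles)
    particles[:, 0] += d_pos[0] + rng.normal(0.0, noise[0], n)
    particles[:, 1] += d_pos[1] + rng.normal(0.0, noise[1], n)
    particles[:, 2] += d_theta + rng.normal(0.0, noise[2], n)
    return particles

def update(particles, weights, gps_xy, gps_sigma=0.05, sensor_weight=1.0):
    """Re-weight particles against an indoor-GPS position fix.

    sensor_weight is the dynamically adjustable trust in this sensor;
    values near 0 effectively ignore the measurement in hard scenarios.
    """
    d2 = np.sum((particles[:, :2] - gps_xy) ** 2, axis=1)
    likelihood = np.exp(-0.5 * d2 / gps_sigma ** 2)
    weights = weights * likelihood ** sensor_weight + 1e-300
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling to concentrate particles on likely poses."""
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx].copy(), np.full(n, 1.0 / n)

# One filter cycle: predict -> update -> estimate -> resample.
particles = rng.normal([0.0, 0.0, 0.0], [0.1, 0.1, 0.05], size=(500, 3))
weights = np.full(500, 1.0 / 500)
particles = predict(particles, d_pos=(0.10, 0.0), d_theta=0.0)
weights = update(particles, weights, gps_xy=np.array([0.1, 0.0]))
pose = np.average(particles, axis=0, weights=weights)  # naive mean; ignores angle wrap
particles, weights = resample(particles, weights)
```

Further update steps of the same shape can be chained for visual odometry and other cues, each with its own scenario-dependent sensor weight.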