This repository features CARLA projects that combine ROS1 and ROS2 with Apollo to advance autonomous driving technology. Each project uses this combination to address unique challenges in autonomous systems, showcasing advanced skills in vehicle control, sensor integration, and navigation.
Naming Prefix | Description
---|---
HKAV_xx__ | Simulations in CARLA & Apollo
A self-driving vehicle, also known as an autonomous vehicle, is a car or truck that can drive itself using technology like sensors, cameras, and artificial intelligence. These vehicles can sense their surroundings and make safe driving decisions, changing the way we think about travel and transportation.
The CARLA simulator is a leading self-driving vehicle simulation platform based on Unreal Engine. It creates realistic environments to test autonomous driving technologies, offering a variety of urban maps and vehicle configurations. It also simulates tough driving conditions such as bad weather, varied traffic scenarios, and emergency situations, which helps in improving vehicle perception and behavior.
The Apollo Foundation is a non-profit organization dedicated to advancing open-source autonomous driving technology. It provides a comprehensive suite of tools and libraries that support a range of functions, from environmental sensing to vehicle planning and control. Apollo's platform includes robust capabilities for simulation, perception, decision making, and cloud data services, making it a comprehensive resource for developers and researchers in the field of autonomous driving.
By combining the Apollo Foundation's software with the CARLA simulator, developers can significantly enhance autonomous vehicle projects. This integration allows for extensive testing and improvement of vehicle systems in a virtual setup, speeding up development and ensuring systems are reliable before being used in the real world.
- High-precision Localization: Utilizes GPS, IMU, and LiDAR inputs to provide accurate positioning within the virtual environment.
- Perception: Employs machine learning algorithms for object detection, classification, and tracking, enabling vehicles to understand and react to their surroundings.
- Route Planning: Offers sophisticated algorithms for route planning and optimization, adapting dynamically to changes in the environment.
- Control: Implements advanced control systems that ensure the vehicle operates safely and efficiently under various simulated conditions.
- Simulation Management: Integrates with CARLA's environment to provide a scalable testing framework that can simulate thousands of driving scenarios to validate the robustness and safety of autonomous systems.
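To make the control capability above concrete, here is a minimal sketch of a discrete PID loop regulating vehicle speed. The gains and the toy first-order plant response are illustrative assumptions, not Apollo's actual controller.

```python
# Minimal sketch: discrete PID speed control of a toy vehicle model.
# Gains (kp, ki, kd) and the plant response are illustrative assumptions.

class PIDController:
    """Simple discrete PID controller."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


if __name__ == "__main__":
    pid = PIDController(kp=0.5, ki=0.1, kd=0.05, dt=0.1)
    speed, target = 0.0, 10.0          # m/s
    for _ in range(200):
        throttle = pid.step(target, speed)
        speed += throttle * 0.1        # toy vehicle response, not real dynamics
    print(f"speed after 20 s: {speed:.2f} m/s")
```

In a real Apollo/CARLA setup the control output would be published as a throttle/brake command to the simulated vehicle rather than applied to a toy model.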
To work with these projects, it's essential to have a basic understanding of sensor technology, robotics, computer vision, and the fundamentals of self-driving cars. Below are some recommendations I have put together.
Core of Autonomous Vehicles | |
---|---|---
Computer Vision | Path Planning Algorithms | Motion Control Algorithms |
Sensors Fusion | Localization and Mapping | Vehicle Dynamics and Kinematics |
Artificial Intelligence | Machine Learning | Deep Learning Perception |
Communications (V2V & V2X) | Software-Hardware Integration | Energy Efficiency and Management |
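One of the core topics above, path planning, can be illustrated with a classic grid-based A* search. The occupancy grid, unit step costs, Manhattan heuristic, and 4-connected moves are simplifying assumptions for the sketch, not the planner Apollo ships.

```python
# Hedged sketch of grid-based A* path planning on a 0/1 occupancy grid
# (1 = obstacle). Grid, costs, and heuristic are illustrative only.
import heapq
import itertools


def a_star(grid, start, goal):
    """Return a shortest 4-connected path from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def manhattan(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()                       # tie-breaker for the heap
    frontier = [(manhattan(start), next(tie), 0, start, None)]
    came_from, best_g = {}, {start: 0}
    while frontier:
        _, _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:                      # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                           # rebuild path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(
                        frontier, (ng + manhattan(nxt), next(tie), ng, nxt, cur)
                    )
    return None


if __name__ == "__main__":
    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0],
            [0, 1, 1, 0]]
    print(a_star(grid, (0, 0), (3, 3)))
```

Production planners work on lane-level road graphs with kinematic constraints rather than flat grids, but the frontier/heuristic structure is the same.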
Sensors Used in Autonomous Vehicles | | |
---|---|---|---
LIDAR Sensor | Radar Sensor | IMU Sensor | Collision Detector
Depth Camera | GNSS Sensor | GPS | Lane Invasion Detector
Obstacle Detector | Thermal Cameras | Capacitive Sensors | Tactile Sensors
RGB Camera | V2V Communications | V2I Communications | Event Data Recorders
RSS Sensor | Semantic LIDAR Sensor | Segmentation Camera | DVS Camera
Optical Flow Camera | Inertial Sensor | Speed Sensors | Ultrasonic Sensor
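Several of the sensors listed above are typically fused rather than read in isolation. A minimal sketch of sensor fusion, assuming a 1-D Kalman filter that blends dead-reckoned motion (velocity × dt, as an IMU/odometry stand-in) with noisy GPS-like position fixes; all noise values are illustrative.

```python
# Minimal sketch of sensor fusion with a 1-D Kalman filter.
# Process/measurement noise values are illustrative assumptions.

class Kalman1D:
    def __init__(self, process_var, meas_var):
        self.x = 0.0              # fused position estimate
        self.p = 1.0              # estimate variance
        self.q = process_var      # motion-model (prediction) noise
        self.r = meas_var         # GPS-like measurement noise

    def predict(self, velocity, dt):
        self.x += velocity * dt   # dead-reckoning step
        self.p += self.q          # uncertainty grows while predicting

    def update(self, z):
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # pull estimate toward the fix
        self.p *= (1.0 - k)              # uncertainty shrinks after update
        return self.x


if __name__ == "__main__":
    kf = Kalman1D(process_var=0.01, meas_var=0.25)
    truth, velocity, dt = 0.0, 2.0, 0.1
    for i in range(100):
        truth += velocity * dt
        gps = truth + (0.5 if i % 2 == 0 else -0.5)   # deterministic "noise"
        kf.predict(velocity, dt)
        estimate = kf.update(gps)
    print(f"truth={truth:.2f}  estimate={estimate:.2f}")
```

Real localization stacks fuse full 3-D pose from LiDAR, IMU, and GNSS with multi-dimensional filters, but the predict/update cycle shown here is the core idea.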