F110 Autonomous Valet Parking with Ouster LiDAR
Demo with two F110 cars:
- Clone this repo.
- Please refer to the installation instructions in the PointPillars repo to install its dependencies. Our repo contains the needed code from PointPillars, so there is no need to clone theirs.
- Install SparseConvNet and ros_numpy.
- Install ZeroMQ.
- Be sure to add `second.pytorch/` to your `PYTHONPATH`.
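For the `PYTHONPATH` step, a line like the following in your `~/.bashrc` should work (the clone location is an assumption; adjust to wherever you cloned this repo):

```shell
# Hypothetical path: assumes the repo was cloned to ~/f110_avp.
export PYTHONPATH=$PYTHONPATH:$HOME/f110_avp/second.pytorch
```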
System Structure:
The lidar is mounted on a tripod looking down on the cars:
- Connect the lidar to the LAN and check the host IP address. Run the Ouster ROS node and open rviz to check the point clouds.
```bash
cd avp_ws
source devel/setup.bash
roslaunch ouster_ros ouster.launch sensor_hostname:=os1-992006000706.local udp_dest:=[Host IP Address] lidar_mode:=2048x10 viz:=false
```
- Run `lidar_zmq_node` and `visualization_node` with `avp_nodes.launch`. `lidar_zmq_node` preprocesses the point cloud and publishes it on ZMQ. `visualization_node` handles all visualization items shown in rviz. Please use the attached .rviz file.
```bash
cd avp_ws
source devel/setup.bash
roslaunch f110_avp avp_nodes.launch
```
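The publishing pattern used by `lidar_zmq_node` can be sketched as below: flatten the preprocessed cloud into raw bytes and send it on a ZMQ PUB socket. The port, topic name, and `(N, 4)` array layout here are assumptions for illustration, not the node's actual settings:

```python
import numpy as np

try:
    import zmq  # pyzmq
except ImportError:
    zmq = None


def pack_cloud(points: np.ndarray) -> bytes:
    """Serialize an (N, 4) float32 cloud [x, y, z, intensity] to raw bytes."""
    return np.ascontiguousarray(points, dtype=np.float32).tobytes()


def unpack_cloud(buf: bytes) -> np.ndarray:
    """Inverse of pack_cloud: raw bytes back to an (N, 4) float32 array."""
    return np.frombuffer(buf, dtype=np.float32).reshape(-1, 4)


def publish_cloud(socket, points: np.ndarray, topic: bytes = b"cloud") -> None:
    """Send the cloud as a two-part ZMQ message: topic frame, then payload."""
    socket.send_multipart([topic, pack_cloud(points)])


if __name__ == "__main__" and zmq is not None:
    ctx = zmq.Context()
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://*:5556")  # port number is an assumption
    publish_cloud(pub, np.random.rand(1024, 4).astype(np.float32))
    pub.close()
    ctx.term()
```

A subscriber on the host side would `connect` to the same endpoint, `recv_multipart`, and call `unpack_cloud` on the payload frame.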
- Calibrate the point cloud to the setup. Refer to the report on how to get good detection results.
```bash
conda activate pointpillars
python3 ./avp_utils/f110_viewer.py
```
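Calibration amounts to aligning the tripod-mounted lidar's frame with the ground plane of the setup, i.e. applying a rigid transform to the cloud. A minimal sketch (the yaw angle and translation below are placeholder values, not the real calibration):

```python
import numpy as np


def make_transform(yaw: float, tx: float, ty: float, tz: float) -> np.ndarray:
    """Homogeneous 4x4 transform: rotation about z by yaw, then translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[:3, 3] = [tx, ty, tz]
    return T


def apply_transform(T: np.ndarray, points: np.ndarray) -> np.ndarray:
    """Apply T to the xyz columns of an (N, 4) cloud; intensity is untouched."""
    out = points.copy()
    out[:, :3] = points[:, :3] @ T[:3, :3].T + T[:3, 3]
    return out


# Placeholder calibration: rotate 90 degrees about z, shift 1 m along x.
T = make_transform(np.pi / 2, 1.0, 0.0, 0.0)
```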
- Run `detection.py` to start the PointPillars detection on the point cloud.
```bash
conda activate pointpillars
python3 -W ignore detection.py
```
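PointPillars only detects reliably inside its configured range, which is why the calibration step matters. A sketch of the kind of range crop such a pipeline typically applies before inference (the bounds below are placeholders, not the values this repo uses):

```python
import numpy as np


def crop_to_range(points: np.ndarray,
                  x_range=(0.0, 5.0),
                  y_range=(-2.5, 2.5),
                  z_range=(-0.5, 1.0)) -> np.ndarray:
    """Keep only points inside the detector's configured x/y/z range."""
    m = ((points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1])
         & (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1])
         & (points[:, 2] >= z_range[0]) & (points[:, 2] < z_range[1]))
    return points[m]
```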
- Run `localization.py` to start the localization process; you should now see bounding boxes in rviz. Please refer to the report for details on the localization process.
```bash
python3 localization.py
```
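Localization turns each detected bounding box into a car pose (x, y, yaw). Raw per-frame detections jitter, so one simple way to stabilize them is an exponential filter like the sketch below (the smoothing factor is an assumption, and this is not necessarily what `localization.py` does):

```python
import math


class PoseFilter:
    """Exponentially smooth (x, y, yaw) pose estimates from box detections."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha  # weight of the newest detection
        self.pose = None    # (x, y, yaw)

    def update(self, x: float, y: float, yaw: float):
        if self.pose is None:
            self.pose = (x, y, yaw)
        else:
            px, py, pyaw = self.pose
            a = self.alpha
            # Blend yaw on the unit circle to avoid the +/-pi wraparound.
            yaw_s = math.atan2(
                (1 - a) * math.sin(pyaw) + a * math.sin(yaw),
                (1 - a) * math.cos(pyaw) + a * math.cos(yaw),
            )
            self.pose = ((1 - a) * px + a * x, (1 - a) * py + a * y, yaw_s)
        return self.pose
```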
- Copy `car_nodes/odom_zmq_node.py` and `car_nodes/drive_node.py` to the F110 car and compile them.
- Run `teleop.launch`.
- Run `odom_zmq_node` to send odometry data to the host computer.
- After setting up waypoints, run `drive_node` to start navigating the car.
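`drive_node` steers the car through the waypoints. One common way to do this on an F110 car (a sketch of the general technique, not necessarily what `drive_node` implements) is pure pursuit, which steers toward a lookahead point on the path using a bicycle model:

```python
import math


def pure_pursuit_steer(pose, waypoint, wheelbase: float = 0.33) -> float:
    """Steering angle that drives a bicycle-model car toward `waypoint`.

    pose: (x, y, yaw) of the car in the map frame.
    wheelbase: ~0.33 m is an assumed F110 chassis value.
    """
    x, y, yaw = pose
    # Express the waypoint in the car's body frame.
    dx, dy = waypoint[0] - x, waypoint[1] - y
    lx = math.cos(-yaw) * dx - math.sin(-yaw) * dy
    ly = math.sin(-yaw) * dx + math.cos(-yaw) * dy
    d2 = lx * lx + ly * ly
    if d2 < 1e-9:
        return 0.0  # already at the waypoint
    # Pure pursuit curvature: 2 * lateral offset / lookahead distance^2.
    return math.atan2(2.0 * wheelbase * ly, d2)
```

Each control cycle, the node would pick the next waypoint beyond a fixed lookahead distance, compute the steering angle, and publish it with a target speed as a drive command.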