
Old ROS2 Documentation

Low, Jun En edited this page Apr 12, 2024 · 1 revision

Getting Started

Current Steps

We first set up TrajBridge in SITL to familiarize ourselves with what it does within a simulated environment. The steps are almost identical to the actual flight setup.

To run the SITL we start by setting up the PX4-Autopilot and Gazebo code:

# Install the uXRCE-DDS (PX4-ROS 2/DDS Bridge)
sudo snap install micro-xrce-dds-agent --edge

# Go to your workspace folder
cd ~/<workspace>/

# Clone our version locked copy of the PX4-Autopilot workspace:
git clone https://github.com/StanfordMSL/PX4-Autopilot.git

# Install the PX4-Autopilot submodules
cd PX4-Autopilot
git submodule update --init --recursive

# Check out the TrajBridge branch
git checkout feature/trajbridge

# Turn off lockstep on the Iris model
gedit <PX4-Autopilot folder>/Tools/simulation/gazebo-classic/sitl_gazebo-classic/models/iris/iris.sdf.jinja
# Search for <enable_lockstep> and set it to <enable_lockstep>0</enable_lockstep> (default is <enable_lockstep>1</enable_lockstep>)
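If you prefer a non-interactive edit, the same change can be made with sed; this sketch assumes you run it from inside the PX4-Autopilot folder:

```shell
# Flip enable_lockstep from 1 to 0 in the Iris model (run from the PX4-Autopilot folder)
sed -i 's|<enable_lockstep>1</enable_lockstep>|<enable_lockstep>0</enable_lockstep>|' \
    Tools/simulation/gazebo-classic/sitl_gazebo-classic/models/iris/iris.sdf.jinja
```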

Then we set up TrajBridge:

# Go to your workspace folder
cd ~/<workspace>/

# Clone TrajBridge:
git clone https://github.com/StanfordMSL/TrajBridge.git

# Build the package
cd ~/<workspace>/TrajBridge/TrajBridge
colcon build

# Create and activate a conda environment for your package. We call ours sc-env after the SimpleController
conda create -n sc-env python=3.10
conda activate sc-env

# Update gcc or else your conda environment won't be able to locate the ros2
conda install -c conda-forge gcc=12.1.0

# Install the package that your controller resides in using pip. Using SimpleController as an example:
cd ~/<workspace>/TrajBridge/SimpleController
pip install -e .

And you're all set! To run an example:

# In terminal (1) launch the uXRCE agent, pointing locally
micro-xrce-dds-agent udp4 -p 8888

# In terminal (2) launch the quadcopter in Gazebo:
cd ~/<workspace>/PX4-Autopilot
PX4_UXRCE_DDS_NS=drone5 make px4_sitl gazebo-classic

# In terminal (3) launch the trajbridge node:
cd ~/<workspace>/TrajBridge/TrajBridge
source install/setup.bash
ros2 launch px4_comm trajbridge.launch.py

# Make sure terminal (4) is in the sc-env. When the trajbridge state machine shows ready do:
conda activate sc-env
cd ~/<workspace>/TrajBridge/TrajBridge
source install/setup.bash
cd ~/<workspace>/TrajBridge/SplineController/scripts
python spline2position_node.py --traj search

After executing the terminal (3) command, the drone will arm, take off, and fly to a starting position. Once there, it is ready to receive setpoint commands, which we send it through the terminal (4) command.

Some notes on options:

  1. You can change the trajectory you want to fly by switching between the file names (with the '.json') in the SplineController/trajectories folder. You can also tweak or create new trajectories by editing the .json files. The flat outputs are, as their name implies, the flat outputs of the quadcopter [x,y,z,yaw] and their derivatives. Do not over-constrain your keyframes (intermediate frames should have the higher derivatives set to null).
  2. You can tweak the trajbridge.launch.py to do the following:
    1. auto_start: Setting this to True automates the drone arming and taking off. You can set this to False to do so manually (useful in hardware)
    2. auto_land: Setting this to True will cause the drone to land once it's done with the trajectory. Setting it to False will lock it at its current position while setting the yaw to the ready waypoint value (setting the position to ready waypoint might be dangerous). You can cause it to land by closing the trajbridge node or via RC.
    3. wp_ready: The ready waypoint [x,y,z,yaw] that the drone will start its trajectory from.
  3. When making the px4 firmware you might need to install a few packages that might not be on your system by default:
    1. pip3 install kconfiglib
    2. pip3 install --user jinja2 jsonschema
  4. We faced some issues setting things up on Ubuntu 22.04. First following the instructions at https://docs.px4.io/main/en/dev_setup/dev_env_linux_ubuntu.html and then going through the rest seems to help.
  5. You might also have to install gazebo-classic (default for Ubuntu 22.04 is Gazebo Ignition):
    1. sudo apt remove gz-garden
    2. sudo apt install aptitude
    3. sudo aptitude install gazebo libgazebo11 libgazebo-dev
  6. If you can't run make in the PX4-Autopilot folder due to permission issues (and trying with sudo complains that it can't find kconfiglib), change the permissions on the folder: sudo chown -R <username> PX4-Autopilot/
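To make note 1 concrete, a trajectory file might look like the sketch below. This is purely illustrative: the field names here are hypothetical, so copy an existing file in SplineController/trajectories as your actual template. Each flat output [x, y, z, yaw] carries its value and derivatives, and intermediate keyframes leave the higher derivatives as null so the spline solver is not over-constrained:

```json
{
  "start": { "t": 0.0, "x": [0.0, 0.0], "y": [0.0, 0.0], "z": [1.0, 0.0], "yaw": [0.0, 0.0] },
  "mid":   { "t": 3.0, "x": [1.0, null], "y": [1.0, null], "z": [1.5, null], "yaw": [null, null] },
  "end":   { "t": 6.0, "x": [0.0, 0.0], "y": [0.0, 0.0], "z": [1.0, 0.0], "yaw": [0.0, 0.0] }
}
```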



TrajBridge with Orin Nano and SFTI‐Program Controllers

We use the following hardware:

  1. Orin Nano 8GB Dev Kit with SSD
  2. Ubuntu 20.04 Laptop (22.04 should be ok too)
  3. Shorting Pin
  4. Bluetooth Keyboard + Mouse
  5. DVI to Mini-Display Cable
  6. USB-A to USB-C Cable

We set up the Orin via the SDK Manager route with JetPack 6.0. The SD card route fails because dev kits do not ship with the necessary updates for the QSPI bootloader, so we have to use the SDK Manager to update said bootloader over USB.

  1. Download the SDK Manager from https://developer.nvidia.com/sdk-manager onto the Ubuntu 20.04 laptop (the version I downloaded was: sdkmanager_2.0.0-11405_amd64.deb). Install the SDK Manager.
  2. Short the FC REC and GND pin on the edge of the Orin (labelled horizontal pins right below the fan unit).
  3. With the USB cable, plug the USB-C side into the Orin and the USB-A side into the Ubuntu 20.04 laptop.
  4. With the Display cable, plug the DVI side into the Orin and the Mini-Display into a monitor. Plug in external keyboard + mouse to the Orin.
  5. Power up the Orin and launch the SDK Manager right after. The SDK Manager should detect the Orin and ask you to select between a regular Orin Nano and the 8GB dev kit. Select the latter.
  6. Go through the installation steps (I skipped the DeepStream SDK). Under OEM configuration, choose Runtime (and not Pre-Config) and under Storage Device, choose NVMe.

Some useful links (though these might get changed post-documentation date by NVIDIA):

  1. https://docs.nvidia.com/sdk-manager/install-with-sdkm-jetson/index.html
  2. https://developer.nvidia.com/embedded/learn/jetson-orin-nano-devkit-user-guide/software_setup.html

Once completed, it's time to configure the Orin with our stuff. For the sake of completeness, here's what I installed.

Optional Stuff

Some quality of life installations that you can skip. I use VSCode for coding, Firefox for browsing and Terminator for easier terminals. If you want these then download the .deb for the arm64 version of vscode at https://code.visualstudio.com/download and then:

# Install VSCode
cd Downloads
sudo apt install ./<file>.deb 

# Install Firefox and Terminator
sudo apt-get install terminator firefox

# Install screen
sudo apt install screen 

# Clean up a little
sudo apt autoremove

VSCode throws a "Download is performed unsandboxed..." error, but the installation still works.

Not So Optional Stuff

Most of this is necessary. We start with ROS 2, specifically ROS 2 Humble:

# First ensure that the Ubuntu Universe repository is enabled.
sudo apt install software-properties-common
sudo add-apt-repository universe

# Now add the ROS 2 GPG key with apt.
sudo apt update && sudo apt install curl -y
sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg

# Then add the repository to your sources list.
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(. /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null

# Install ROS2 packages
sudo apt update
sudo apt upgrade
sudo apt install ros-humble-desktop
sudo apt install ros-dev-tools

This follows from the official ROS 2 Humble installation tutorial.

The other important package to install is conda:

# Download and run the Anaconda installer (note the arm/aarch64 flavour)
wget https://repo.anaconda.com/archive/Anaconda3-2024.02-1-Linux-aarch64.sh
bash Anaconda3-2024.02-1-Linux-aarch64.sh

Next we set up the micro-xrce-dds-agent for communicating between the Pixracer and the Orin over the serial port:

# In a terminal on the Orin do:
sudo snap install micro-xrce-dds-agent --edge

# Some permissions need to be configured. Install some stuff that will allow us to access the configurations
sudo apt install selinux-utils
sudo apt install selinux-policy-default

# Disable said configuration
sudo gedit /etc/selinux/config
# Set SELINUX=disabled (default is SELINUX=permissive)

# Test out the serial communication
sudo micro-xrce-dds-agent serial --dev /dev/ttyTHS0 --baudrate 921600
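The SELinux edit can also be done non-interactively; this sed sketch has the same effect as the gedit step above:

```shell
# Set SELINUX=disabled in /etc/selinux/config (replaces whatever mode is currently set)
sudo sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
```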

And now it's time for the StanfordMSL workspace. This is where the relevant repositories (PX4-Autopilot, TrajBridge and SFTI-Program) will reside.

# Create and navigate to our workspace folder (feel free to change the name)
mkdir StanfordMSL
cd StanfordMSL

First up, we install PX4-Autopilot (for SITL testing):

# Clone our version locked copy of the PX4-Autopilot workspace:
git clone https://github.com/StanfordMSL/PX4-Autopilot.git

# Install the PX4-Autopilot submodules
cd PX4-Autopilot
git submodule update --init --recursive

# Check out the TrajBridge branch
make clean
make distclean
git checkout feature/trajbridge
make submodulesclean

Then we install TrajBridge (for ROS2-PX4 link):

# Clone TrajBridge
git clone https://github.com/StanfordMSL/TrajBridge.git

# Build (don't forget to source your ROS setup first, e.g. source /opt/ros/humble/setup.bash)
cd TrajBridge/TrajBridge
colcon build

Finally, we install the SFTI package (for some neural network based controllers):

# Clone the repository and check out the appropriate branch
git clone https://github.com/StanfordMSL/SFTI-Program.git
cd SFTI-Program
(checkout appropriate branch)

# Initialize the submodules
git submodule update --init --recursive

# Build acados from source within its submodule
cd acados
mkdir -p build
cd build
cmake -DACADOS_WITH_QPOASES=ON ..
# add more optional arguments e.g. -DACADOS_WITH_OSQP=OFF/ON -DACADOS_INSTALL_DIR=<path_to_acados_installation_folder> above
make install -j4

# The default downloaded version of tera renderer currently doesn't support the Orin Nano, so we have compiled both an x86
# flavor and an arm flavor and saved them in a forked repository in MSL. The user need only rename the file that is used
# (t_renderer) to the correct architecture. Navigate to SFTI-Program/acados/bin and rename accordingly:
# If x86: t_renderer_x86 -> t_renderer
# If ARM: t_renderer_arm -> t_renderer
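The rename above can be scripted by keying off uname; a sketch, assuming the forked repository ships both t_renderer_x86 and t_renderer_arm as described:

```shell
# Pick the t_renderer build that matches this machine's architecture
cd SFTI-Program/acados/bin
case "$(uname -m)" in
  x86_64)        cp t_renderer_x86 t_renderer ;;
  aarch64|arm64) cp t_renderer_arm t_renderer ;;
  *)             echo "unsupported arch: $(uname -m)" >&2 ;;
esac
chmod +x t_renderer
```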

# Create your conda environment
# (use python=3.8 on Ubuntu 20.04, python=3.6 on Ubuntu 18.04)
conda create -n sfti-env python=3.10
conda activate sfti-env

# Update gcc or else your conda environment won't be able to locate the ros2
conda install -c conda-forge gcc=12.1.0

# Install the SFTI requirements
cd StanfordMSL/SFTI-Program/sfti-program
pip install -e .

# Install the acados requirements
cd StanfordMSL/SFTI-Program/acados/interfaces/acados_template
pip install -e .

We are now ready to fly! Here's an example run using vehicle rates MPC in SITL:

# In one terminal launch the micro-xrce
micro-xrce-dds-agent udp4 -p 8888

# In another terminal, launch the simulation
cd PX4-Autopilot
make px4_sitl gazebo-classic

# In the third terminal, launch the trajbridge state machine
source /opt/ros/humble/setup.bash  # might be prudent to put this in .bashrc
cd StanfordMSL/TrajBridge/TrajBridge
source install/setup.bash
ros2 launch px4_comm trajbridge.launch.py

# In the fourth terminal, the SFTI node
conda activate sfti-env
source /opt/ros/humble/setup.bash  # if you haven't already done so
cd StanfordMSL/TrajBridge/TrajBridge
source install/setup.bash  # if you haven't already done so
cd SFTI-Program/scripts/
python sfti2command_node.py --cohort 101 --course orbit --pilot vr_M_Iceman --frame iris

And an actual run in hardware:

# On the computer that you want to relay the mocap from:
ros2 launch px4_comm mocap.launch.py

# On the computer that serves as your gcs:
ssh [email protected]
screen
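Since the Orin is reached over ssh, the whole flight stack lives inside this screen session. For reference, the keystrokes used in the steps below are:

```shell
# Ctrl-a c : create a new window (one per "terminal" in the steps below)
# Ctrl-a n / Ctrl-a p : cycle to the next / previous window
# Ctrl-a d : detach; the session (and the flight stack) keeps running
# screen -r : reattach after detaching or a dropped connection
```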

# In one terminal launch the micro-xrce
micro-xrce-dds-agent serial --dev /dev/ttyTHS0 -b 921600

# In another terminal (Ctrl-a c) launch the trajbridge node
source /opt/ros/humble/setup.bash  # might be prudent to put this in .bashrc
cd StanfordMSL/TrajBridge/TrajBridge
source install/setup.bash
ros2 launch px4_comm trajbridge.launch.py

# In another terminal (Ctrl-a c) launch the SFTI node
conda activate sfti-env
source /opt/ros/humble/setup.bash  # if you haven't already done so
cd StanfordMSL/TrajBridge/TrajBridge
source install/setup.bash  # if you haven't already done so
cd SFTI-Program/scripts/
python sfti2command_node.py --cohort 101 --course orbit --pilot vr_M_Iceman --frame iris

Additional Notes:

  1. To upload the custom PX4 firmware, navigate to the PX4 folder and, depending on the board do:
    1. PixRacer: make px4_fmu-v4_default upload
    2. PixRacer Pro: make mro_pixracerpro_default upload
  2. The pinouts from the PX4 are as follows
    1. Pin6: GND (black)
    2. Pin8: UART1_TXD (green)
    3. Pin10: UART1_RXD (yellow)
  3. Drone Parameters for Indoor Mocap:
    1. UXRCE_DDS_CFG = TELEM 2
    2. EKF2_BARO_CTRL = Disabled
    3. EKF2_EV_CTRL = 11
    4. EKF2_EV_DELAY = 0.1 ms
    5. EKF2_EV_NOISE_MD = EV noise parameters
    6. EKF2_GPS_CTRL = 0
    7. EKF2_HGT_REF = Vision
    8. EKF2_RNG_CTRL = Disable range fusion
    9. MIS_TAKEOFF_ALT = 1.0m
  4. On the mocap desktop I installed the vrpn package to relay the mocap data to ROS2: sudo apt install ros-humble-vrpn-mocap
  5. Make sure the drone's airframe is 'Generic 250'.
  6. For acados to work we need to update ~/.bashrc with:
    1. export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:"/home/<user>/StanfordMSL/SFTI-Program/acados/lib"
    2. export ACADOS_SOURCE_DIR="/home/<user>/StanfordMSL/SFTI-Program/acados"
  7. For vehicle attitude control you can do:
    1. python sfti2command_node.py --cohort 101 --course orbit --pilot va_M_Iceman --frame iris
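The exports in note 6 can be appended in one step; this sketch assumes the StanfordMSL workspace sits directly in your home directory:

```shell
# Append the acados environment variables (note 6) to ~/.bashrc
cat >> ~/.bashrc <<'EOF'
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:"$HOME/StanfordMSL/SFTI-Program/acados/lib"
export ACADOS_SOURCE_DIR="$HOME/StanfordMSL/SFTI-Program/acados"
EOF
source ~/.bashrc
```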