Onboarding & Take Home Interview

Miles Cobb edited this page Nov 5, 2019 · 11 revisions

Welcome to the VTOL project! This article combines the processes of interviewing candidates for the VTOL team and bringing them up to speed with the technology that we work on.

What is a take-home interview?

Traditional interviews for joining a club usually try to gauge a student's willingness to learn and interest in the project by asking them verbal questions. This take-home interview accomplishes the same goal but allows you to work at your own pace and actually put your skills to use. You can use Google, ask us questions if you get stuck, ask other people taking the take-home interview for help, take an entire month to finish it, etc. As long as you get it done, you know enough to be a valuable member of our team.

Additionally, we usually receive a large influx of prospective members at the beginning of each academic year. With lots of new members, it is not logistically possible to personally teach each person prerequisite programming knowledge without severely hindering productivity. Therefore, this article is necessary to streamline the process of working with prospective members. Once you're part of the team, communication will be more personal rather than streamlined.

Am I experienced enough to be part of VTOL?

We value your potential knowledge, not what you currently know. If you don't have enough knowledge to complete this document right now, but are willing to research the topics that you don't know about to eventually finish the document, you are the ideal candidate for our team.

While working through this article, you will:

  1. Understand how to use git and GitHub
  2. Understand the overall project concept of VTOL and NGCP
  3. Set up the VTOL repository
  4. Create a simulated autonomous vehicle
  5. Understand the architecture of our code and how our software works
  6. Complete your first issue

All six of these points are necessary requirements to become a member of our team.

0. Install Slack on mobile

Find Slack in the App Store/Play Store and install it on your phone. Turn notifications on. Remind me (the team lead) to add you to the Slack channel if I haven't already. If you're working through this document outside of a meeting, you can message us any questions you have either through direct message or a group channel.

1. Using git and GitHub

Using a version control system like git is absolutely necessary in any collaborative project. Unless you are already proficient in git, please look at pages 1 - 10 and complete all exercises from this quick git bootcamp.

Proficiency in git means that you have experience using version control in a collaborative project. If you've only used git add, git commit, and git push, do the git bootcamp.

2. Understand the overall project concept of VTOL and NGCP

Read Project Overview and then the Glossary. I recommend taking notes while you read these.

Feel free to ask questions to make sure that you understand the project. When you're done, we may briefly verbally quiz you to make sure you know what we're working on.

3. Setting up VTOL

  1. Fork the repository and clone it (go back and complete the git tutorial if you don't know what this means).
  2. Install Python 3.7.4 and make sure you can use pip for Python 3.7.4. If you already have Python 2 installed, type pip in the terminal to check which version it points to; you may be running pip for Python 2. In that case, run pip3 instead of pip.
  3. Run pip install -r requirements.txt from the root of the cloned repository.
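If you want to double-check which interpreter you're actually invoking before installing dependencies, a quick stdlib-only sanity check like this (the helper function name is just for illustration) will tell you whether you're on the Python 3.7+ that this guide expects:

```python
import sys

def check_python(version_info=sys.version_info):
    """Return True if the running interpreter is Python 3.7 or newer."""
    # If this returns False, you are probably invoking Python 2 --
    # use python3 / pip3 instead of python / pip.
    return version_info >= (3, 7)

print(check_python())
```

Run it with the same `python`/`python3` command you plan to use for the project; if it prints False, switch to `python3` and `pip3`.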

4. Make your own simple autonomous vehicle script

The most difficult concepts to understand are related to autonomous navigation: what are DroneKit, ArduPilot, PixHawk, and MAVLink? How do they relate to each other? How do you use them?

This (slightly modified) excerpt from the DroneKit documentation sums it up:

DroneKit-Python allows developers to create apps that run on an onboard companion computer and communicate with the ArduPilot flight controller (in VTOL's case a PixHawk) using a low-latency link. Onboard apps can significantly enhance the autopilot, adding greater intelligence to vehicle behaviour, and performing tasks that are computationally intensive or time-sensitive (for example, computer vision, path planning, or 3D modelling).

The API communicates with vehicles over MAVLink. It provides programmatic access to a connected vehicle’s telemetry, state and parameter information, and enables both mission management and direct control over vehicle movement and operations.

The PixHawk flight controller is a piece of hardware on the VTOL that has access to its control system and navigation information (GPS, accelerometer, ...). The firmware running on the PixHawk is ArduPilot. Our software runs on a Raspberry Pi that is connected to the PixHawk, and they communicate with each other over MAVLink. DroneKit-Python provides an API for sending these MAVLink commands. These MAVLink commands allow us to directly control the autopilot firmware (ArduPilot) running on the vehicle.
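To make that layering concrete, here is a stdlib-only mock that mirrors the shape of a typical DroneKit script (connect, arm, take off, wait for altitude). The class and function names imitate the real dronekit API (connect(), vehicle.armed, simple_takeoff()), but this is purely an illustrative sketch: in real code the same calls would be translated into MAVLink messages for ArduPilot.

```python
# Illustrative mock of the DroneKit flow -- NOT the real dronekit library.
# In real code you would `from dronekit import connect, VehicleMode`, and
# each call below would become a MAVLink message sent to ArduPilot.

class MockVehicle:
    """Stands in for dronekit.Vehicle; altitude rises each time we poll."""
    def __init__(self):
        self.armed = False
        self.mode = "STABILIZE"
        self.altitude = 0.0
        self._target = 0.0

    def simple_takeoff(self, target_alt):
        self._target = target_alt

    def poll(self):
        # A real autopilot climbs over time; we fake that here.
        if self.armed and self.altitude < self._target:
            self.altitude = min(self.altitude + 2.5, self._target)

def connect(connection_string):
    """Mimics dronekit.connect('127.0.0.1:14550', wait_ready=True)."""
    print("connecting to " + connection_string)
    return MockVehicle()

def arm_and_takeoff(vehicle, target_alt):
    vehicle.mode = "GUIDED"   # real code: vehicle.mode = VehicleMode("GUIDED")
    vehicle.armed = True
    vehicle.simple_takeoff(target_alt)
    # Block until we are near the requested altitude, as the DroneKit
    # examples do with vehicle.location.global_relative_frame.alt.
    while vehicle.altitude < target_alt * 0.95:
        vehicle.poll()
    return vehicle.altitude

if __name__ == "__main__":
    v = connect("127.0.0.1:14550")
    print(arm_and_takeoff(v, 10))
```

The busy-wait loop at the end is the same pattern you will see in the real DroneKit takeoff examples; on actual hardware the altitude comes from the PixHawk's sensors over MAVLink rather than from a counter.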

Now that you understand the idea, you can experiment with a simulated vehicle provided by DroneKit. Look at this example now. Read through every line of code except the functions get_location_metres and get_distance_metres. Run the code and examine the output. What is happening? How do ArduPilot and MAVLink fit into this?

5. Understand the architecture

First, verify that you understand every term in the glossary. Open up VTOL in a text editor (we recommend Visual Studio Code). Read through main.py in the src directory; if you have questions about it, please ask.

6. Completing your first issue

This is the final step, where everything you've learned will be applied. On the onboarding branch of our repository is a previous version of VTOL. You will be implementing a feature that we've already implemented in the past: making the vehicle react to the "stop" and "pause" messages from the GCS.

On the onboarding branch, the callback function will set the stop and pause variables appropriately based on messages received from GCS. However, the vehicle does not react to them. You will need to change the code such that if a stop message is received, the vehicle will land in the spot that it initially took off at. If the vehicle receives a pause message, the vehicle will hover in the same location/altitude until it receives a resume message.
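The required control flow boils down to a small state machine. Here is a hedged, stdlib-only sketch (the class, method, and message names are hypothetical, not the actual callback in the onboarding branch) of how the stop/pause/resume flags could drive the vehicle's behavior:

```python
# Hypothetical sketch of the stop/pause/resume logic. In the real onboarding
# branch, the existing GCS callback already sets flags like these; your job
# is to make the mission loop act on them.

class MissionState:
    def __init__(self):
        self.stop = False
        self.pause = False

    def on_gcs_message(self, msg_type):
        """Called when a message arrives from the GCS."""
        if msg_type == "stop":
            self.stop = True        # land where we took off
        elif msg_type == "pause":
            self.pause = True       # hover in place
        elif msg_type == "resume":
            self.pause = False      # continue the mission

    def next_action(self):
        """Decide what the vehicle does on this iteration of the main loop."""
        if self.stop:
            return "return_to_launch"   # e.g. command a landing at the takeoff point
        if self.pause:
            return "hover"              # hold current location and altitude
        return "continue_mission"
```

In the real code you would check these flags inside the mission loop and issue the corresponding DroneKit commands instead of returning strings.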

You can do this for either quick scan or detailed search.

Make sure that you are following proper git practices; develop on a new branch on your fork. When you're done, make a pull request to the onboarding branch and I will review it.

Solution: since this feature has already been implemented on the master branch, if you're stuck you can look back and see how we did it. I won't be linking to that commit because this is a good way for you to get familiarized with navigating pull requests, issues, and commits on GitHub.

Once you're done with all of these steps, you'll be prepared to make valuable contributions to our team.

7. Optional - Set up the real simulator

This is past the minimum to be part of the team but you'll probably have to get this done eventually anyway. Follow the steps in this guide to run Quick Scan with the master version of ArduCopter.
