Jetson/FRC Setup

=Introduction= One of the most important, but frequently overlooked, considerations when designing FIRST Robotics Competition robots is the use of vision. When teams think of vision, they often focus on the methods used to process images from onboard cameras, generally as an afterthought to the rest of the robot design and programming. However, by paying closer attention to the entire picture, robot performance can be greatly increased. There are two main components to FRC vision: getting the image from the robot camera to the DriverStation, and processing that image to extract useful information. There are a number of implementations for both steps.

Streaming - Common streaming solutions and their pros/cons:
 * Axis (Network Camera)
 * Pros: Low latency, no software bottleneck
 * Cons: Difficult for newer teams (lack of current documentation), poor image quality, lack of processing ability
 * RoboRIO Webcam Server
 * Pros: Easy to set up, well documented, supports most USB cameras
 * Cons: Introduces latency into the robot control loop, high bandwidth use, forces a tradeoff between image quality and latency

Processing - Common processing solutions and their pros/cons:
 * DriverStation
 * Pros: Easy to program using a number of different software tools, works with any of the streaming solutions
 * Cons: Depends on DriverStation hardware quality, requires low-latency stream and connection to robot for any useful real-time processing
 * RoboRIO
 * Pros: Easy to set up for SIMPLE vision processing
 * Cons: Massive processing bottleneck - if the RoboRIO is also streaming, the robot control loop will slow down significantly

Co-Processors - Some teams elect to add another processor to their robots, used for some combination of image streaming and processing. An abridged list of co-processors and their pros/cons:
 * Raspberry Pi
 * Pros: Cheap, popular - many online resources
 * Cons: Relatively weak processor; can be too slow for real-time vision work
 * Kangaroo
 * Pros: Can use Windows tools more familiar to teams, battery backup
 * Cons: Very slow and weak, runs Windows
 * Jetson
 * Pros: Powerful, many CUDA cores optimized for mathematical operations, large number of carrier boards makes it easy to find a solution that fits a team's needs
 * Cons: Upfront cost

=About this Guide= The NVIDIA Jetson, specifically the TX variants, is very well suited to serve as a co-processor for FRC robots. The Jetson easily handles image compression, streaming, and processing on a single platform, freeing up resources on both the DriverStation and the RoboRIO. Additionally, with the libraries already included in the Jetson JetPack Ubuntu image, the Jetson accomplishes these tasks without requiring teams to undergo complicated software build processes.

The purpose of this guide is to provide the knowledge and tools teams need in order to make use of this platform. This comes in multiple parts: a step-by-step walkthrough for setting up the Jetson for use in FRC, a set of software examples and templates, and additional information about using Linux and the various other technologies featured in this guide.

=Setting Up the Jetson for FRC=

Step 1: Setting up tools

 * In order to flash the Jetson, you must use the NVIDIA JetPack utility. JetPack must be installed on an Ubuntu machine.  In order to facilitate this installation for teams who don't have a dedicated Ubuntu computer, the first step in this guide is to install and configure a virtual machine.


 * The software we have chosen to use is VirtualBox, because it is the same product regardless of host machine. To download VirtualBox, go to https://www.virtualbox.org/wiki/Downloads and select the appropriate version for the platform it will be installed on.


 * After downloading VirtualBox, download an Ubuntu .iso. Note: JetPack requires Ubuntu 15.04 or 16.04.  It will NOT work on Ubuntu 17.04.  Ubuntu can be found at https://www.ubuntu.com/download/desktop.


 * Once it has been downloaded, install VirtualBox. Installation procedures differ between host platforms, but the installers are straightforward.  Follow the instructions for a standard installation.


 * After VirtualBox has been installed, it is time to create the virtual machine. In order to make sure there is enough space for JetPack and any other utilities, we recommend setting the virtual hard disk size to 40GB.  It is also recommended that at least 4GB of RAM be dedicated to the VM.


 * After initially creating the VM, additional modifications must be made to its settings. Go into the VM Settings and enable USB passthrough.  This is most easily done by adding a blank filter in the USB section, which causes all USB devices connected while the VM is running to be passed through to the VM.  Note: In order to avoid issues with mice, keyboards, etc. that are needed by the host machine, do NOT detach and reattach peripherals while inside the VM.  Additionally, the VM's network settings must be changed: the adapter type should be switched from NAT to Bridged, which allows the VM to communicate on the local network.






 * Install Ubuntu. When prompted for an .iso, select the Ubuntu .iso downloaded previously.  Follow the standard steps for an Ubuntu installation, downloading all updates but skipping the optional add-ons.

Step 2: Using JetPack
Now that there is an environment in which to set up the Jetson, we must download and install JetPack (on the Ubuntu VM).
 * To download JetPack, go to https://developer.nvidia.com/embedded/jetpack, and follow the instructions to create an account and get the file. It is suggested that you download the file into a custom directory, created by running   in a terminal.
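The directory name itself is not specified in this guide; as a sketch, with "jetpack" used as an arbitrary example name:

```shell
# Create a dedicated directory for the JetPack installer
# (the "jetpack" name is an arbitrary example, not mandated by the guide)
mkdir -p "$HOME/jetpack"
cd "$HOME/jetpack"
```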




 * In order to install the JetPack utility on Ubuntu, you must modify the permissions on the downloaded file to make it executable. To do this, run:   After adding the executable permission to the JetPack file, run the file by typing in:
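Assuming the installer was saved with a name like JetPack-L4T-3.1-linux-x64.run (the exact filename depends on the JetPack version downloaded, so treat it as a placeholder), the two commands would look like:

```shell
# Mark the downloaded installer as executable
chmod +x JetPack-L4T-3.1-linux-x64.run
# Launch the JetPack installer
./JetPack-L4T-3.1-linux-x64.run
```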




 * Select the correct package for the device you are configuring. Accept all license agreements, and continue on with the installer.  Ensure that the installer is set to do a full installation and to NOT automatically resolve dependencies.  Allow Jetpack to download and install all of the packages.




 * The next step will involve the Nvidia Jetson board itself. A black terminal window will open inside the VM, prompting you to restart the Jetson in force recovery mode.  Follow the on-screen instructions.  Run   in a separate terminal window in order to determine whether or not the Jetson is visible to the VM; an "NVidia" device should appear in the   results.  If it is not, verify first that host machine can see the Jetson; if it cannot, carefully follow the instructions to put it in force recovery mode once again.  If it can, check the VM USB settings and try again.
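On most Linux systems the USB device list is inspected with lsusb, which is presumably the elided command here:

```shell
# List USB devices visible to the VM; look for an "NVidia Corp." entry
lsusb
# Optionally filter the output for NVIDIA devices
lsusb | grep -i nvidia
```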




 * Press enter. The utility will not move forward in the process by itself, so you must move it forward once the Jetson can be seen by the VM.


 * Once the utility is done with the initial movement of files to the Jetson, a prompt will appear asking about the way the Jetson is connected to the network. Connect the Ethernet port on the Jetson to the same router as the host machine, and select the corresponding option.




 * JetPack may then ask for the IP address of the Jetson. Obtain this via the   command in a terminal on the VM.  Input this IP address, along with the username "ubuntu" and password "ubuntu", into the utility window.
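The command is elided above; the address is conventionally reported by ifconfig (or, on newer systems, ip addr) — this is an assumption about which command the guide intends:

```shell
# Show network interfaces and their assigned IP addresses
ifconfig
# Equivalent on newer systems
ip addr show
```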


 * Allow the utility to finish; the Jetson will restart. Then move on to actual setup on the Jetson itself.

Step 3: Configuring Jetpack Networking

 * Connect a keyboard, mouse, and screen to the Jetson. This will allow us to work directly on the Jetson while we configure network settings.


 * This tutorial assumes that you will be setting the Jetson up on a network configured for static IP addresses. First, install resolvconf in order to be able to resolve IP addresses via DNS after reconfiguring your network settings.  Run the following commands to do this:
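resolvconf is available from the standard Ubuntu repositories, so the elided commands are presumably the usual apt-get pair:

```shell
# Refresh package lists, then install resolvconf
sudo apt-get update
sudo apt-get install -y resolvconf
```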


 * Once resolvconf is installed, edit the NetworkManager configuration file located at   so that the line   is present, and no other entry exists for rc-manager.
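The file and line are elided above. NetworkManager's main configuration file normally lives at /etc/NetworkManager/NetworkManager.conf, and its rc-manager option belongs in the [main] section; assuming that is what the guide intends, the relevant fragment would be:

```ini
[main]
rc-manager=resolvconf
```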

 * Create three new interface files:  ,  , and   at  .  They should look as follows:

auto enx00044b669328
iface enx00044b669328 inet static
address (regular IP address)
netmask (network netmask)
gateway (normal gateway)
network (normal network)
broadcast (normal broadcast)
dns-nameservers 8.8.8.8 8.8.8.4
dns-search google.com

auto enx00044b669328
iface enx00044b669328 inet static
address (regular IP address)
netmask (network netmask)
gateway (normal gateway)
network (normal network)
broadcast (normal broadcast)
dns-nameservers 8.8.8.8 8.8.8.4
dns-search google.com

auto enx00044b669328
iface enx00044b669328 inet static
address 10.xx.yy.12
netmask 255.0.0.0
gateway 10.xx.yy.1
network 10.0.0.0
broadcast 10.xx.yy.255
dns-nameservers 8.8.8.8 8.8.8.4


 * where xx.yy refers to the team number (e.g. 34.19 for team 3419)


 * Reboot the Jetson.

Step 4: Setting up Samba

 * In order to facilitate easier development, create a directory directly under the filesystem root. For this guide, we will assume that the directory name will be / . Run   in order to gain full permissions for that directory.
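The directory name is elided above; using "teamfolder" as a placeholder (substitute your own name), the commands might look like:

```shell
# "teamfolder" is a placeholder; substitute your own directory name
sudo mkdir /teamfolder
# Give the ubuntu user full ownership of the new directory
sudo chown -R ubuntu:ubuntu /teamfolder
```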


 * Samba will be installed in order to allow remote systems to mount that directory as an external drive. This makes it easier to develop for the Jetson.  To install Samba, run the following command:
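The command is elided; Samba is in the standard Ubuntu repositories, so it is presumably:

```shell
# Install the Samba file server from the Ubuntu repositories
sudo apt-get install -y samba
```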


 * To configure Samba, run   to set the password for the user. A prompt will then appear asking for the password.  This should be the same password that you will later set for the user on the Jetson itself.
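Samba maintains its own password database, separate from the system one; the standard command to add a Samba password for the ubuntu user is:

```shell
# Add (or update) the Samba password entry for the "ubuntu" user
sudo smbpasswd -a ubuntu
```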


 * Make a copy of the Samba configuration file as a backup with

 * Open the smb.conf file and add the following lines to the end:

[ ]
path = / 
valid users = ubuntu
read only = no

 * Then restart Samba by running
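Putting the backup, edit, and restart steps together — assuming smb.conf lives at its usual location /etc/samba/smb.conf, and with "teamfolder" as a placeholder for the share and directory names:

```shell
# Back up the stock Samba configuration
sudo cp /etc/samba/smb.conf /etc/samba/smb.conf.bak
# Append the share definition ("teamfolder" is a placeholder name)
sudo tee -a /etc/samba/smb.conf > /dev/null <<'EOF'
[teamfolder]
path = /teamfolder
valid users = ubuntu
read only = no
EOF
# Restart the Samba daemon so the new share takes effect
sudo service smbd restart
```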


 * You can now mount the Jetson’s / directory as an external drive on any development machine on the network.

Step 5: Building NetworkTables (without Java)

 * NetworkTables is an important tool for passing data around the FRC Control System. In order to use NetworkTables on the Jetson, we will build it from source.


 * Get the source code from GitHub. To do this, navigate to the team folder created in the step above. Then run  .  This will create a new directory located at / /ntcore.  Inside this directory are a number of files used to build ntcore.
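ntcore is hosted under the WPILib organization on GitHub, so the elided command is presumably a git clone along these lines:

```shell
# Clone the ntcore source into the current directory
git clone https://github.com/wpilibsuite/ntcore.git
```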


 * Open the file "CMakeLists.txt" and add   to the TOP of the file in order to build ntcore without Java bindings.  Note that this method of building ntcore does NOT use Gradle.


 * Run  and  . This will build the ntcore libraries as well as a set of tests.
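The two commands are elided above; for a CMake project of this sort, the standard in-source build (run from inside the ntcore directory) would be:

```shell
# Generate makefiles in the current directory, then build
cmake .
make
```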


 * In order to use NetworkTables in projects, the directory must be added to the shared library loader path. To do this, open   and add the following line to the bottom:
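The file and line are elided above. One common way to extend the shared library loader path on Ubuntu is a drop-in file under /etc/ld.so.conf.d/ — this is an assumption about the guide's approach, and /teamfolder/ntcore is a placeholder path:

```shell
# Register the ntcore build directory with the dynamic linker
# ("/teamfolder/ntcore" is a placeholder path)
echo "/teamfolder/ntcore" | sudo tee /etc/ld.so.conf.d/ntcore.conf
# Rebuild the linker cache
sudo ldconfig
```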

 * Reboot the Jetson. Now, the ntcore libraries will work properly as long as project build options are correct.  The following is a sample makefile that demonstrates the correct way to link the libraries when compiling with g++:

LIB_PATH=/ /ntcore/
NT_INCLUDE_PATH=/ /ntcore/include/
WPI_INCLUDE_PATH=/ /ntcore/wpiutil/include/
FILENAME=client.cpp
OUTNAME=nt_client

all: clean
	g++ -std=c++11 -g -Wall -L$(LIB_PATH) -I$(NT_INCLUDE_PATH) -I$(WPI_INCLUDE_PATH) $(FILENAME) -lntcore -lwpiutil -lpthread -o $(OUTNAME)

clean:
	-rm ${OUTNAME}

run: all
	./${OUTNAME}

Step 6: Installing GStreamer

 * The Jetson can be used to provide a high-quality, low-latency video stream by compressing camera output using the H.264 video compression codec. That image can then be sent over the network using GStreamer.
 * Run
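The command is elided; if the GStreamer tools are not already present on the JetPack image, the usual Ubuntu packages would be installed with (package names are an assumption):

```shell
# Install the GStreamer command-line tools and common plugin sets
sudo apt-get install -y gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good
```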


 * GStreamer's functions can be accessed from the command line using gst-launch-1.0.


 * Understanding GStreamer
 * GStreamer works by creating an image pipeline, elements of which are separated by "!".
 * The first element in the pipeline is the image source.
 * The next element sets properties or "caps" of the image. Options such as the framerate, dimensions, and image format can be set here.
 * The last element in the pipeline is the image sink, or where the image will go.


 * Test GStreamer by displaying a test image using  .
 * This command displays a test image on any monitor attached to the Jetson.
 * The image source is a gstreamer test image
 * The caps state that the image should be sent raw and uncompressed, and that it should be 1280 pixels by 720 pixels
 * The image sink is autovideosink, which attempts to display the image on any available hardware -- note that if this command fails, rerun using   (for X Windows)
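The pipeline itself is elided above; based on the description (test source, raw 1280x720 caps, autovideosink), it is presumably along these lines:

```shell
# Display a 1280x720 raw test pattern on the attached monitor
gst-launch-1.0 videotestsrc ! "video/x-raw,width=1280,height=720" ! autovideosink

# If autovideosink fails under X, ximagesink can be tried instead
gst-launch-1.0 videotestsrc ! "video/x-raw,width=1280,height=720" ! videoconvert ! ximagesink
```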


 * Test camera capture with GStreamer. Make sure that the camera is connected to the Jetson with  .  The camera should be among the listed devices.  Determine which video input the camera is on by running  .  The result of that command is the path to the camera.  Generally this should be   or  .
 * The image source is the USB camera
 * The caps state that the image should be sent raw and uncompressed, and that the image should be 640 pixels by 480 pixels
 * The image sink is again autovideosink
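Assuming the camera registered as /dev/video0 (check with ls /dev/video*), a pipeline matching the description above would be:

```shell
# Capture 640x480 raw video from the USB camera and display it
# (the /dev/video0 path is an assumption; use the path found earlier)
gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=640,height=480" ! autovideosink
```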


 * Test GStreamer over a network. In order to see the camera's image on your laptop, you will have to install GStreamer there and run a GStreamer pipeline on both the Jetson and your laptop. The pipeline on the Jetson will send an image over UDP to a port on your laptop, and your laptop will use that sent image as its source and display it.
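A minimal sketch of such a sender/receiver pair, assuming the Jetson's hardware H.264 encoder element (omxh264enc on TX-era JetPack), a camera at /dev/video0, and a laptop at 10.xx.yy.5 listening on port 5805 — all of these values are illustrative, not from the original guide:

```shell
# On the Jetson: capture, H.264-encode in hardware, and send RTP over UDP
gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=640,height=480" ! \
  videoconvert ! omxh264enc ! rtph264pay ! udpsink host=10.xx.yy.5 port=5805

# On the laptop: receive, depacketize, decode, and display
gst-launch-1.0 udpsrc port=5805 caps="application/x-rtp,media=video,encoding-name=H264" ! \
  rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```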


 * More info on using gstreamer on a Jetson can be found here

Step 7: Integrating OpenCV and GStreamer

 * Some builds of OpenCV include built-in support for GStreamer; however, the current build of OpenCV4Tegra does not. To work around this, a modified version of the relevant .cpp file from the OpenCV source, which adds this support, is provided in the attached sample projects.

Flashing Camera Settings

 * The   included in the sample projects configures camera settings.
 * Camera settings are set using the Video4Linux driver.
 * Notes on camera settings: In general, a lower exposure time will yield better images for a few reasons, including fuller colors, less motion blur, and brighter reflective tape. You may need to move the "exposure_absolute" variable up or down to achieve a lower exposure, depending on your camera.
 * The sample C++ project for integrating OpenCV and GStreamer will call the  script directly.
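Video4Linux controls of this sort are typically set with the v4l2-ctl tool (from the v4l-utils package); an illustrative invocation, with a device path and values that will vary by camera:

```shell
# List the controls this camera's driver exposes
v4l2-ctl -d /dev/video0 --list-ctrls
# Switch to manual exposure, then lower the exposure time
# (control names and value ranges vary between cameras)
v4l2-ctl -d /dev/video0 --set-ctrl exposure_auto=1
v4l2-ctl -d /dev/video0 --set-ctrl exposure_absolute=10
```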

Integration

 *   and   are both needed for GStreamer and OpenCV integration.
 * The two files allow the use of the OpenCV VideoCapture and VideoWriter classes with GStreamer pipelines.
 * The diagram below illustrates the code structure of a project with both OpenCV and GStreamer.



=Software Examples=

Multiple GStreamer Pipelines
=Additional Resources=