Jetson/FRC Setup

Introduction

One of the most important, but frequently overlooked, considerations when designing FIRST Robotics Competition robots is the use of vision. When teams think of vision, they often focus on the methods used to process images from onboard cameras, generally as an afterthought to the rest of the robot design and programming. Teams that consider the entire vision pipeline from the start, however, can greatly improve robot performance.

There are two main components to FRC Vision: getting the image from the robot camera to the DriverStation and processing that image to get useful information. There are a number of implementations for both steps.

Streaming - Common streaming solutions and their pros/cons:

  • Axis (Network Camera)
    • Pros: Low latency, no software bottleneck
    • Cons: Difficult for newer teams (lack of current documentation), limited image quality, no onboard processing ability
  • RoboRIO Webcam Server
    • Pros: Easy to set up, well documented, supports most USB cameras
    • Cons: Adds latency to the robot control loop, high bandwidth use, and a trade-off between image quality and latency

Processing - Common processing solutions and their pros/cons:

  • DriverStation
    • Pros: Easy to program using a number of different software tools, works with any of the streaming solutions
    • Cons: Depends on DriverStation hardware quality, requires low-latency stream and connection to robot for any useful real-time processing
  • RoboRIO
    • Pros: Easy to set up for SIMPLE vision processing
    • Cons: Massive processing bottleneck - if the RoboRIO is also streaming, the robot control loop slows down significantly

Co-Processors - Some teams elect to add another processor to their robots, used for some combination of image streaming and processing. An abridged list of co-processors and their pros/cons:

  • Raspberry Pi
    • Pros: Cheap, popular - many online resources
    • Cons: Very weak, very slow
  • Kangaroo
    • Pros: Can use Windows tools more familiar to teams, battery backup
    • Cons: Very slow and weak, runs Windows
  • Jetson
    • Pros: Powerful, many CUDA cores optimized for mathematical operations, large number of carrier boards makes it easy to find a solution that fits a team's needs
    • Cons: Upfront cost

About this Guide

The NVIDIA Jetson, specifically the TX variants, is very well suited to serve as a co-processor for FRC robots. The Jetson easily handles image compression, streaming, and processing on a single platform, freeing up resources on both the DriverStation and the RoboRIO. Additionally, with the libraries already included in the Jetson's Jetpack Ubuntu image, the Jetson accomplishes these tasks without requiring teams to undergo complicated software build processes.

The purpose of this guide is to provide the knowledge and tools teams need in order to make use of this platform. This comes in multiple parts: a step-by-step walkthrough for setting up the Jetson for use in FRC, a set of software examples and templates, and additional information about using Linux and the various other technologies featured in this guide.

Setting Up the Jetson for FRC

Step 1: Setting up tools

  • In order to flash the Jetson, you must use the NVIDIA Jetpack utility. Jetpack must be installed on an Ubuntu machine. To facilitate this for teams who don’t have a dedicated Ubuntu computer, the first step in this guide is to install and configure a virtual machine.
  • The software we have chosen is VirtualBox, because it behaves the same regardless of the host operating system. To download VirtualBox, go to https://www.virtualbox.org/wiki/Downloads and select the appropriate version for the platform it will be installed on.
  • After downloading Virtualbox, download an Ubuntu .iso. Note: Jetpack requires Ubuntu 15.04 or 16.04. It will NOT work on Ubuntu 17.04. Ubuntu can be found at https://www.ubuntu.com/download/desktop.
  • Once everything has been downloaded, install VirtualBox. There are different installation procedures for different host platforms, but the installers are easy to follow. Follow the instructions for a standard installation.
  • After VirtualBox has been installed, it is time to create the virtual machine. In order to make sure there is enough space for Jetpack and any other utilities, we recommend setting the virtual hard disk size to 40 GB. It is also recommended that at least 4 GB of RAM be dedicated to the VM.
  • After the initial VM creation, additional settings must be changed. Go into the settings for the VM and enable USB passthrough under "Ports". This is most easily done by adding a blank filter in the USB section, which causes all USB devices connected while the VM is running to be passed through to the VM. Note: In order to avoid issues with mice, keyboards, etc. that are needed by the host machine, do NOT detach and reattach peripherals while inside the VM. The network settings for the VM must also be changed: switch the adapter type from NAT to Bridged so that the VM can communicate on the local network.
Add a filter for USB passthrough to the VM


Set Network Adapter to Bridged
  • Install Ubuntu. When prompted for an .iso, select the Ubuntu .iso downloaded previously. Follow the standard steps for an Ubuntu installation, downloading all updates but skipping the optional add-ons.
  • Note: if you experience difficulty installing Jetpack with VirtualBox, try different virtual machine software, e.g. VMware, making sure that the machine settings allow for USB passthrough and bridged networking
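For teams that prefer the command line, the same virtual machine can also be created with VirtualBox's VBoxManage tool. This is only a sketch of the GUI steps above: the VM name, disk file name, and host adapter are examples, and the ISO path must point at the Ubuntu image downloaded earlier.
# Create and register the VM with 4 GB of RAM, a bridged network adapter, and USB enabled
VBoxManage createvm --name "FRC-Jetson" --ostype Ubuntu_64 --register
VBoxManage modifyvm "FRC-Jetson" --memory 4096 --nic1 bridged --bridgeadapter1 <host adapter name> --usb on
# Create and attach a 40 GB virtual disk
VBoxManage createmedium disk --filename FRC-Jetson.vdi --size 40960
VBoxManage storagectl "FRC-Jetson" --name SATA --add sata
VBoxManage storageattach "FRC-Jetson" --storagectl SATA --port 0 --device 0 --type hdd --medium FRC-Jetson.vdi
# Attach the Ubuntu installer ISO
VBoxManage storagectl "FRC-Jetson" --name IDE --add ide
VBoxManage storageattach "FRC-Jetson" --storagectl IDE --port 0 --device 0 --type dvddrive --medium /path/to/ubuntu.iso
# Add a blank USB filter so all USB devices are passed through to the VM
VBoxManage usbfilter add 0 --target "FRC-Jetson" --name "all devices"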

Step 2: Using Jetpack

Now that there is an environment in which to set up the Jetson, download and install Jetpack on the Ubuntu VM.

  • To download Jetpack, go to https://developer.nvidia.com/embedded/jetpack, and follow the instructions to create an account and get the file. It is suggested that you download the file into a custom directory, created by running sudo mkdir /<directoryname> in a terminal.
Download Jetpack
  • In order to install the Jetpack utility on Ubuntu, you must modify the permissions on the downloaded file to make it executable. To do this, run: sudo chmod +x <filename>. After adding the executable permission to the Jetpack file, run the installer by typing: ./<filename>
Jetpack installer on start
  • Select the correct package for the device you are configuring. Accept all license agreements, and continue on with the installer. Ensure that the installer is set to do a full installation and to NOT automatically resolve dependencies. Allow Jetpack to download and install all of the packages.
Device selection
  • A prompt will appear asking about the way the Jetson is connected to the network. Connect the Ethernet port on the Jetson to the same router as the host machine, and select the corresponding option.
"Network connection"
  • The next step will involve the NVIDIA Jetson board itself. A black terminal window will open inside the VM, prompting you to restart the Jetson in force recovery mode. Follow the on-screen instructions. Run lsusb in a separate terminal window to determine whether the Jetson is visible to the VM; an "NVidia" device should appear in the lsusb results. If it does not, first verify that the host machine can see the Jetson; if it cannot, carefully follow the instructions to put it in force recovery mode once again. If it can, check the VM USB settings and try again.
"Post Install"
  • Press Enter. The utility will not move forward on its own, so you must advance it once the Jetson can be seen by the VM.
  • Jetpack may then ask for the IP address of the Jetson. Obtain this via the sudo arp -a command in a terminal on the VM. Input this IP address, along with the username “ubuntu” and password “ubuntu” into the utility window.
  • Allow the utility to finish; the Jetson will restart. Then move on to setup on the Jetson itself.
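To confirm that the flash and package installation completed, you can log into the Jetson from the VM over the network. The IP address below is an example; use the address found with arp -a. On a freshly flashed board the default account is ubuntu/ubuntu, and /etc/nv_tegra_release reports the installed L4T release.
ssh ubuntu@192.168.1.42
head -n 1 /etc/nv_tegra_release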

Step 3: Configuring Jetpack Networking

  • Connect a keyboard, mouse, and screen to the Jetson. This will allow us to work directly on the Jetson while we configure its network settings.
  • This tutorial assumes that you will be setting the Jetson up on a network configured for static IP addresses. First, install resolvconf in order to be able to resolve hostnames via DNS after reconfiguring your network settings. To do this, run sudo apt-get update followed by sudo apt-get install resolvconf.
  • Once resolvconf is installed, edit the NetworkManager configuration file located at /etc/NetworkManager/NetworkManager.conf so that the line rc-manager=resolvconf is present under the [main] section, and no other entry exists for rc-manager.
  • Determine the name of your network interface (on recent Ubuntu releases this is often a MAC-based name such as enx00044b669328) by running ifconfig.
  • Create three new interface files: interfaces, interfaces.net, and interfaces.robot at /etc/network. They should look as follows:
    • Interfaces:
auto <interface name>
iface <interface name> inet static
address (regular IP address) 
netmask (network netmask)
gateway (normal gateway)
network (normal network)
broadcast (normal broadcast)
dns-nameservers 8.8.8.8 8.8.4.4
dns-search google.com
  • Interfaces.net:
auto <interface name>
iface <interface name> inet static
address (regular IP address) e.g. 192.168.1.34
netmask (network netmask) e.g. 255.255.255.0
gateway (normal gateway) e.g. 192.168.1.1
network (normal network) e.g. 192.168.1.0
broadcast (normal broadcast) e.g. 192.168.1.255
dns-nameservers 8.8.8.8 8.8.4.4
dns-search google.com
  • Interfaces.robot:
auto <interface name>
iface <interface name> inet static
address 10.xx.yy.12
netmask 255.0.0.0
gateway 10.xx.yy.1
network 10.0.0.0
broadcast 10.xx.yy.255
dns-nameservers 8.8.8.8 8.8.4.4
Where xx.yy refers to the team number split into two parts (e.g. team 3419 uses 34.19)
  • Reboot the Jetson.
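To switch between the home/shop configuration and the robot configuration, one simple approach (assuming /etc/network/interfaces is the file read at boot, as it is by default) is to copy the appropriate variant over it and reboot:
# On the shop or home network
sudo cp /etc/network/interfaces.net /etc/network/interfaces
sudo reboot
# On the robot network
sudo cp /etc/network/interfaces.robot /etc/network/interfaces
sudo reboot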

Step 4: Setting up Samba

  • In order to facilitate easier development, create a directory directly under the filesystem root. For this guide, we will assume that the directory name will be /<teamnumber>. Create it with sudo mkdir /<teamnumber>, then run sudo chown ubuntu /<teamnumber> in order to gain full permissions for that directory.
  • Samba will be installed in order to allow remote systems to mount that directory as an external drive. This makes it easier to develop for the Jetson. To install samba, run the following command: sudo apt-get install samba.
  • To configure Samba, run sudo smbpasswd -a nvidia to set the password for the user. A prompt will then appear asking for the password. This should be the same password that you will later set for the user on the Jetson itself.
  • Make a copy of the Samba configuration file as a backup with sudo cp /etc/samba/smb.conf /etc/samba/smb.confBACKUP
  • Open the smb.conf file and add the following lines to the end:
[<teamnumber>]
path = /<teamnumber>
valid users = nvidia
read only = no
Then restart Samba by running sudo service smbd restart
  • You can now mount the Jetson’s /<teamnumber> directory as an external drive on any development machine on the network.
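For example, on a Linux development machine with cifs-utils installed, the share can be mounted as follows; the Jetson IP address and mount point are placeholders. On Windows, map a network drive to \\<jetson ip>\<teamnumber> and log in as the nvidia user.
sudo mkdir -p /mnt/jetson
sudo mount -t cifs //<jetson ip>/<teamnumber> /mnt/jetson -o username=nvidia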

Step 5: Building NetworkTables (without Java)

  • NetworkTables is an important tool for passing data around the FRC Control System. In order to use NetworkTables on the Jetson, we will build it from source.
  • Install cmake by running sudo apt-get install cmake
  • Get the source code from GitHub. To do this, navigate to the team folder (/<teamnumber>) created in the step above. Then run sudo git clone https://github.com/wpilibsuite/ntcore.git. This will create a new directory located at /<teamnumber>/ntcore. Inside this directory are a number of files used to build ntcore.
  • Navigate into the ntcore folder and run git reset --hard e6656326a821dc3069fca57f736883594be61d7b. This will set the repo to the 3.1.7 release, which is needed to build NetworkTables without Java.
  • Open the file “CMakeLists.txt” and add: set (WITHOUT_JAVA true) to the TOP of the file in order to build ntcore without Java bindings. Note that this method of building ntcore does NOT use gradle.
  • Run sudo cmake . and sudo make. This will build the ntcore libraries as well as a set of tests.
  • In order to use NetworkTables in projects, the directory must be added to the shared library loader path. To do this, open /home/ubuntu/.bashrc and add the following line to the bottom: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:"/<teamnumber>/ntcore/"
  • Reboot the Jetson. Now, the ntcore libraries will work properly as long as project build options are correct. The following is a sample makefile that demonstrates the correct way to link the libraries when compiling with g++; a minimal client.cpp sketch follows it.
LIB_PATH=/<teamnumber>/ntcore/
NT_INCLUDE_PATH=/<teamnumber>/ntcore/include/
WPI_INCLUDE_PATH=/<teamnumber>/ntcore/wpiutil/include/
FILENAME=client.cpp
OUTNAME=nt_client
 
all: clean
	g++ -std=c++11 -g -Wall -L$(LIB_PATH) -I$(NT_INCLUDE_PATH) -I$(WPI_INCLUDE_PATH) $(FILENAME) -lntcore -lwpiutil -lpthread -o $(OUTNAME)
 
clean:
	-rm ${OUTNAME}
 
run: all
	./${OUTNAME}
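The makefile above expects a client.cpp next to it. The following is a minimal sketch of such a client, assuming the NetworkTable C++ API that ships with the ntcore 3.1.x sources checked out above; the table name, key, and team number are examples only, not part of the official guide.
// client.cpp - minimal NetworkTables client sketch
#include <chrono>
#include <thread>
#include "networktables/NetworkTable.h"

int main() {
    // Run as a NetworkTables client; the RoboRIO on the robot acts as the server.
    NetworkTable::SetClientMode();
    NetworkTable::SetTeam(3419);   // example team number
    NetworkTable::Initialize();

    auto table = NetworkTable::GetTable("vision");
    double count = 0;
    while (true) {
        // Publish an example value that robot code can read from the "vision" table.
        table->PutNumber("heartbeat", count++);
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
    return 0;
}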

Step 6: Installing GStreamer

  • The Jetson can be used to provide a high-quality, low-latency video stream by compressing camera output using the H.264 video codec. That stream can then be sent over the network using gstreamer
  • Run sudo apt-get install gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav to install gstreamer and its necessary plugins
  • Run sudo apt-get install v4l-utils to install Video4Linux utilities that are needed to change camera properties
  • Gstreamer's functions can be accessed from the command line using gst-launch-1.0
  • Understanding gstreamer
    • GStreamer works by creating an image pipeline, elements of which are separated by “!”.
    • The first element in the pipeline is the image source
    • The next element sets properties or “caps” of the image. Options such as the framerate, dimensions, and image format can be set here.
    • The last element in the pipeline is the image sink, or where the image will go
  • Test gstreamer by displaying a test image using gst-launch-1.0 -v videotestsrc ! video/x-raw, width=1280, height=720 ! xvimagesink.
    • This command displays a test image on any monitor attached to the Jetson.
    • The image source is a gstreamer test image
    • The caps state that the image should be sent raw and uncompressed, and that it should be 1280 pixels by 720 pixels
    • The image sink is xvimagesink, which attempts to display the image on any available hardware -- note that if this command fails, rerun it using ximagesink (the plain X Windows sink)
  • Test camera capture with gstreamer. Verify that the camera is connected to the Jetson by running lsusb; the camera should be among the listed devices. Determine which video input the camera is on by running ls /dev/video*. The result of that command is the path to the camera, generally /dev/video0 or /dev/video1. Run gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! xvimagesink.
    • The image source is the USB camera
    • The caps state that the image should be sent raw and uncompressed, and that the image should be 640 pixels by 480 pixels
    • The image sink is again xvimagesink
  • Test gstreamer over a network. In order to see the camera’s image on your laptop, you will have to install gstreamer there and run a gstreamer pipeline on both the Jetson and your laptop. The pipeline on the Jetson will send the image over UDP to a port on your laptop, and your laptop will use that incoming stream as its source and display it. Example sender and receiver pipelines are shown after this list.
  • More info on using gstreamer on a Jetson can be found here
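As an example of the sender/receiver pair described above, the following pipelines stream H.264 video from the Jetson to a laptop. The laptop IP address (10.34.19.5) and port (5805) are placeholders, and the encoder/decoder element names (omxh264enc on the Jetson, avdec_h264 on the laptop) may differ depending on the installed gstreamer plugins.
On the Jetson:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.34.19.5 port=5805
On the laptop:
gst-launch-1.0 -v udpsrc port=5805 caps="application/x-rtp, media=video, encoding-name=H264, payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink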

Step 7: Integrating OpenCV and gstreamer

  • Some builds of OpenCV include built-in support for gstreamer; however, the current build of OpenCV4Tegra does not. To work around this, the attached sample projects include a modified version of the relevant .cpp file from the OpenCV source that adds this support.

Flashing Camera Settings

  • The camera_settings.sh script included in the sample projects configures the camera settings
  • Camera settings are set using the Video4Linux driver.
  • Notes on camera settings: In general, a lower exposure time will yield better images for a few reasons, including fuller colors, less motion blur, and brighter reflective tape. Depending on your camera, you may need to move the "exposure_absolute" value up or down to achieve the desired exposure (example commands are shown after this list).
  • The sample C++ project for integrating OpenCV and GStreamer will call the camera_settings.sh script directly.
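For reference, camera controls like the ones set by camera_settings.sh can be inspected and changed with v4l2-ctl. The control names and values below are examples for a typical UVC camera and may differ on yours; list the controls first to see what your camera exposes.
# List the controls (and their valid ranges) supported by the camera
v4l2-ctl -d /dev/video0 --list-ctrls
# Example: switch to manual exposure and lower the exposure time
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1
v4l2-ctl -d /dev/video0 --set-ctrl=exposure_absolute=20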

Integration

  • cap_gstreamer.cpp and cap_gstreamer.hpp are both needed for gstreamer and OpenCV integration.
  • The two files allow the use of the OpenCV VideoCapture and VideoWriter classes with gstreamer pipelines.
  • The diagram below illustrates the code structure of a project with both OpenCV and gstreamer; a minimal code sketch follows the diagram.
Diagram demonstrating code structure
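The following is a minimal sketch of that structure, assuming an OpenCV build with working gstreamer support (for example, one built with the modified cap_gstreamer files). The pipeline strings, resolution, HSV thresholds, destination IP, and port are illustrative only and are not taken from the sample projects.
// Capture from a gstreamer pipeline, process with OpenCV, and stream the result back out
#include <opencv2/opencv.hpp>

int main() {
    // Source pipeline: USB camera -> raw frames handed to OpenCV through appsink
    cv::VideoCapture cap("v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! videoconvert ! appsink");
    if (!cap.isOpened()) return 1;

    // Sink pipeline: frames pushed from OpenCV through appsrc, H.264 encoded, sent over UDP
    cv::VideoWriter out("appsrc ! videoconvert ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=10.34.19.5 port=5805",
                        0, 30.0, cv::Size(640, 480), true);
    if (!out.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        // Example processing step: threshold bright green regions such as lit retroreflective tape
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(50, 100, 100), cv::Scalar(90, 255, 255), mask);
        // ...find contours, compute targeting data, publish it over NetworkTables...
        out.write(frame);   // stream the (optionally annotated) frame to the DriverStation
    }
    return 0;
}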


Software Examples

  • Note: Before using the examples run sudo apt-get install libtool-bin

Low Latency Streaming

  • This example transmits high-quality, low-latency video over the network via gstreamer.
  • The files for this example are available here.
  • "ClientSide" contains batch scripts for use on the receiving computer, in this example a Windows machine with gstreamer installed
  • "JetsonSide" contains a bash script to run on the Jetson

On-board Computer Vision

  • This example processes the image on the Jetson, and transmits the raw video via gstreamer
  • The files for this example are available here.

Combining the Two

  • The sample available here combines both image processing and video streaming, allowing users to send modified video streams to the DriverStation via gstreamer.
  • All the information for configuring camera and network settings is available in the README of this folder.

Additional Resources

  • The JetsonDemo GitHub repository contains other examples of code written for the Jetson that make use of gstreamer and OpenCV, and it can also serve as a starting point for teams looking to make use of the platform.