Jetson Zoo
This page contains instructions for installing various open source add-on packages and frameworks on NVIDIA Jetson, in addition to a collection of DNN models for inferencing.
Below are links to precompiled binaries built for aarch64 (arm64) architecture, including support for CUDA where applicable. These are intended to be installed on top of JetPack.
Refer to the listed forum topics for the latest updates and for help. Feel free to add to the list below if you have working software that is known to support Jetson.
Machine Learning
Jetson is able to natively run the full versions of popular machine learning frameworks, including TensorFlow, PyTorch, Caffe2, Keras, and MXNet.
There are also helpful deep learning examples and tutorials created specifically for Jetson, such as Hello AI World and JetBot.
TensorFlow
- Website: https://tensorflow.org
- Source: https://github.com/tensorflow/tensorflow
- Version: 1.13.1
- Packages: pip wheel (Python 3.6)
- Supports: JetPack 4.2 (Jetson Nano / TX2 / Xavier)
- Install Guide: docs.nvidia.com/deeplearning/frameworks/install-tf-xavier/index.html#prereqs
- Forum Topic: devtalk.nvidia.com/default/topic/1048776/jetson-nano/official-tensorflow-for-jetson-nano-/
- Build from Source: https://devtalk.nvidia.com/default/topic/1055131/jetson-agx-xavier/building-tensorflow-1-13-on-jetson-xavier/
# install prerequisites
$ sudo apt-get install libhdf5-serial-dev hdf5-tools libhdf5-dev python3-pip
$ pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu==1.13.1+nv19.5 --user
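The `jp/v42` path in the `--extra-index-url` above encodes the JetPack release (v42 = JetPack 4.2); wheels for other JetPack releases appear to follow the same pattern. A small sketch of how that index URL is derived — the `JETPACK_VERSION` variable is illustrative, set it to your installed release:

```shell
# the NVIDIA pip index path encodes the JetPack release (v42 = JetPack 4.2);
# JETPACK_VERSION here is an assumption -- set it to your installed release
JETPACK_VERSION="4.2"
NV_INDEX="https://developer.download.nvidia.com/compute/redist/jp/v$(echo "$JETPACK_VERSION" | tr -d .)"
echo "$NV_INDEX"
```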
PyTorch (Caffe2)
- Website: https://pytorch.org/
- Source: https://github.com/pytorch/pytorch
- Version: PyTorch v1.0.0 - v1.1.0
- Packages:
|        | Python 2.7 | Python 3.6 |
|--------|------------|------------|
| v1.0.0 | pip wheel  | pip wheel  |
| v1.1.0 | pip wheel  | pip wheel  |
- Supports: JetPack 4.2 (Jetson Nano / TX2 / Xavier)
- Forum Topic: devtalk.nvidia.com/default/topic/1049071/jetson-nano/pytorch-for-jetson-nano/
- Build from Source: https://devtalk.nvidia.com/default/topic/1049071/#5324123
Note: the PyTorch and Caffe2 projects have merged, so installing PyTorch also installs Caffe2.
# Python 2.7 (download pip wheel from above)
$ pip install torch-1.1.0-cp27-cp27mu-linux_aarch64.whl
# Python 3.6 (download pip wheel from above)
$ pip3 install numpy torch-1.1.0-cp36-cp36m-linux_aarch64.whl
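The `cpXX` tags in the wheel filename must match the Python interpreter the wheel is installed into (`cp27mu` for Python 2.7, `cp36m` for Python 3.6); pip will refuse a mismatched wheel. A small helper sketch mapping the interpreter version to the wheel names listed above:

```shell
# map a Python major.minor version to the matching PyTorch v1.1.0 wheel;
# the cp27mu / cp36m ABI tags come from the wheel filenames above
wheel_for_python() {
    case "$1" in
        2.7) echo "torch-1.1.0-cp27-cp27mu-linux_aarch64.whl" ;;
        3.6) echo "torch-1.1.0-cp36-cp36m-linux_aarch64.whl" ;;
        *)   echo "no prebuilt torch wheel for Python $1" >&2; return 1 ;;
    esac
}
wheel_for_python 3.6
```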
MXNet
- Website: https://mxnet.apache.org/
- Source: https://github.com/apache/incubator-mxnet
- Version: 1.4
- Packages:
- Supports: JetPack 4.2 (Jetson Nano / TX2 / Xavier)
- Forum Topic: https://devtalk.nvidia.com/default/topic/1049293/#5326170
- Build from Source: https://devtalk.nvidia.com/default/topic/1049293/#5326119
# Python 2.7
$ sudo apt-get install -y git build-essential libatlas-base-dev libopencv-dev graphviz python-pip
$ sudo pip install mxnet-1.4.0-cp27-cp27mu-linux_aarch64.whl
# Python 3.6
$ sudo apt-get install -y git build-essential libatlas-base-dev libopencv-dev graphviz python3-pip
$ sudo pip3 install mxnet-1.4.0-cp36-cp36m-linux_aarch64.whl
Keras
- Website: https://keras.io/
- Source: https://github.com/keras-team/keras
- Version: 2.2.4
- Forum Topic: https://devtalk.nvidia.com/default/topic/1049362/#5325752
Keras uses TensorFlow as its backend, so install TensorFlow first (see above).
# beforehand, install TensorFlow (https://eLinux.org/Jetson_Zoo#TensorFlow)
$ sudo apt-get install -y build-essential libatlas-base-dev
$ sudo pip install keras
Hello AI World
- Website: https://developer.nvidia.com/embedded/twodaystoademo
- Source: https://github.com/dusty-nv/jetson-inference
- Supports: Jetson Nano, TX1, TX2, Xavier
- Build from Source:
# download the repo
$ git clone https://github.com/dusty-nv/jetson-inference
$ cd jetson-inference
$ git submodule update --init
# configure build tree
$ mkdir build
$ cd build
$ cmake ../
# build and install
$ make
$ sudo make install
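The build can take a while single-threaded; `make` accepts a `-j` flag to run one compile job per CPU core, a general speed-up not specific to this repo:

```shell
# run make with one job per CPU core (nproc reports the core count);
# equivalent to e.g. `make -j4` on a quad-core Jetson Nano
JOBS=$(nproc)
echo "building with $JOBS parallel jobs"
# make -j"$JOBS"   # run this inside the build directory
```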
Robotics
ROS
- Website: http://ros.org/
- Source: https://github.com/ros
- Version: ROS Melodic
- Supports: JetPack 4.2 (Jetson Nano / TX2 / Xavier)
- Installation: http://wiki.ros.org/melodic/Installation/Ubuntu
# enable all Ubuntu packages:
$ sudo apt-add-repository universe
$ sudo apt-add-repository multiverse
$ sudo apt-add-repository restricted
# add ROS repository to apt sources
$ sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
$ sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
# install ROS Base
$ sudo apt-get update
$ sudo apt-get install ros-melodic-ros-base
# add ROS paths to environment
$ echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
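The `$(lsb_release -sc)` in the apt source line above expands to the Ubuntu codename; JetPack 4.2 is based on Ubuntu 18.04 (bionic), which is the release ROS Melodic targets. A sketch of the line that command generates, with the codename hard-coded for illustration:

```shell
# JetPack 4.2 ships Ubuntu 18.04, so `lsb_release -sc` prints "bionic";
# the resulting apt source entry should look like this
CODENAME="bionic"
ROS_APT_LINE="deb http://packages.ros.org/ros/ubuntu $CODENAME main"
echo "$ROS_APT_LINE"
```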
IoT / Edge
AWS Greengrass
- Website: https://aws.amazon.com/greengrass/
- Source: https://github.com/aws/aws-greengrass-core-sdk-c
- Version: v1.9.1
- Supports: JetPack 4.2 (Jetson Nano / TX2 / Xavier)
- Forum Thread: https://devtalk.nvidia.com/default/topic/1052324/#5341970
1. Create the Greengrass user and group:
$ sudo adduser --system ggc_user
$ sudo addgroup --system ggc_group
2. Set up your AWS account and Greengrass group by following this page: https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-config.html
After downloading the unique security keys created in this step to your Jetson, proceed to step 3 below.
3. Download the AWS IoT Greengrass Core Software (v1.9.1) for ARMv8 (aarch64):
$ wget https://d1onfpft10uf5o.cloudfront.net/greengrass-core/downloads/1.9.1/greengrass-linux-aarch64-1.9.1.tar.gz
4. Following step #4 from this page, extract Greengrass core and your unique security keys on your Jetson:
$ sudo tar -xzvf greengrass-linux-aarch64-1.9.1.tar.gz -C /
$ sudo tar -xzvf <hash>-setup.tar.gz -C /greengrass # these are the security keys downloaded above
5. Download AWS ATS endpoint root certificate (CA):
$ cd /greengrass/certs/
$ sudo wget -O root.ca.pem https://www.amazontrust.com/repository/AmazonRootCA1.pem
6. Start Greengrass core on your Jetson:
$ cd /greengrass/ggc/core/
$ sudo ./greengrassd start
You should see the message "Greengrass successfully started with PID: xxx" in your terminal.
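The greengrassd script also accepts stop and restart subcommands. A small wrapper sketch that validates the subcommand before building the invocation (the path assumes the default /greengrass install location from step 4):

```shell
# build the greengrassd invocation for a given subcommand; the daemon
# accepts start, stop, and restart (path assumes the default /greengrass
# install from the tarball above)
gg_cmd() {
    case "$1" in
        start|stop|restart) echo "/greengrass/ggc/core/greengrassd $1" ;;
        *) echo "usage: gg_cmd start|stop|restart" >&2; return 1 ;;
    esac
}
gg_cmd restart
```

Run the printed command with sudo, as in step 6.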
Containers
Docker
- Website: https://docker.com/
- Source: https://github.com/docker
- Version: 18.06
- Supports: ≥ JetPack 3.2 (Jetson Nano / TX1 / TX2 / Xavier)
- Installed by default in JetPack-L4T
To enable GPU passthrough, grant the container access to these device nodes with the --device
flag when launching Docker:
/dev/nvhost-ctrl
/dev/nvhost-ctrl-gpu
/dev/nvhost-prof-gpu
/dev/nvmap
/dev/nvhost-gpu
/dev/nvhost-as-gpu
The /usr/lib/aarch64-linux-gnu/tegra
directory also needs to be mounted in the container.
Below is an example command line for launching Docker with access to the GPU:
docker run --device=/dev/nvhost-ctrl --device=/dev/nvhost-ctrl-gpu --device=/dev/nvhost-prof-gpu --device=/dev/nvmap --device=/dev/nvhost-gpu --device=/dev/nvhost-as-gpu -v /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra <container-name>
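Since the flag list is long, it can help to assemble it once in a variable and reuse it. A sketch that reconstructs the command above (the container name is a placeholder):

```shell
# assemble the --device flags for the GPU-related device nodes listed above
JETSON_DEVICES=""
for dev in nvhost-ctrl nvhost-ctrl-gpu nvhost-prof-gpu nvmap nvhost-gpu nvhost-as-gpu; do
    JETSON_DEVICES="$JETSON_DEVICES --device=/dev/$dev"
done
# bind-mount the Tegra driver libraries into the container
TEGRA_LIBS="-v /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra"
echo "docker run$JETSON_DEVICES $TEGRA_LIBS <container-name>"
```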
To enable IPVLAN for Docker Swarm mode: https://blog.hypriot.com/post/nvidia-jetson-nano-build-kernel-docker-optimized/