Jetson Zoo

This page contains instructions for installing various open source add-on packages and frameworks on NVIDIA Jetson, in addition to a collection of DNN models for inferencing.

Below are links to container images and precompiled binaries built for the aarch64 (arm64) architecture. These are intended to be installed on top of JetPack.

Note that JetPack comes with various pre-installed components such as the L4T kernel, CUDA Toolkit, cuDNN, TensorRT, VisionWorks, OpenCV, GStreamer, Docker, and more.

= Machine Learning =

Jetson is able to natively run the full versions of popular machine learning frameworks, including TensorFlow, PyTorch, Caffe2, Keras, and MXNet.

There are also helpful deep learning examples and tutorials available, created specifically for Jetson - like Hello AI World and JetBot.

Docker Containers
There are ready-to-use ML and data science containers for Jetson hosted on NVIDIA GPU Cloud (NGC), including the following:


 * l4t-tensorflow - TensorFlow 1.15.2 for JetPack 4.4
 * l4t-pytorch - PyTorch 1.2-1.5 for JetPack 4.4
 * l4t-ml - TensorFlow, PyTorch, scikit-learn, scipy, pandas, JupyterLab, etc.

These containers are highly recommended to reduce the installation time of the frameworks below, and for beginners getting started.

If you wish to modify them, the Dockerfiles and build scripts for these containers can be found on GitHub.
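As a quick sketch of typical usage (the r32.4.3-py3 tag below is an example; match the tag to the L4T version of your JetPack install):

```shell
# Pull the l4t-ml container image from NGC
sudo docker pull nvcr.io/nvidia/l4t-ml:r32.4.3-py3

# Run it with GPU access (uses the NVIDIA container runtime shipped with JetPack)
sudo docker run -it --runtime nvidia --network host nvcr.io/nvidia/l4t-ml:r32.4.3-py3
```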



TensorFlow

 * Website: https://tensorflow.org
 * Source: https://github.com/tensorflow/tensorflow
 * Container: l4t-tensorflow
 * Version: 1.15.2, 2.1
 * Packages:
 * JetPack 4.4
 * 1.15.2 pip wheel (Python 3.6)
 * 2.1 pip wheel (Python 3.6)
 * JetPack 4.3
 * 1.15.2 pip wheel (Python 3.6)
 * 2.1 pip wheel (Python 3.6)
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Install Guide: Installing TensorFlow on Jetson
 * Forum Topic: devtalk.nvidia.com/default/topic/1048776/jetson-nano/official-tensorflow-for-jetson-nano-/
 * Build from Source: https://devtalk.nvidia.com/default/topic/1055131/jetson-agx-xavier/building-tensorflow-1-13-on-jetson-xavier/
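As a sketch of the install flow from the guide above (the jp/v44 index path corresponds to JetPack 4.4; adjust it to your JetPack version):

```shell
# Install system dependencies needed by the TensorFlow wheel
sudo apt-get update
sudo apt-get install -y libhdf5-serial-dev hdf5-tools libhdf5-dev zlib1g-dev zip libjpeg8-dev python3-pip

# Install TensorFlow from NVIDIA's Jetson wheel index
# (installs the latest wheel for that JetPack release; pin a version spec
#  such as 'tensorflow<2' if you want to stay on the 1.15.x series)
sudo pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v44 tensorflow
```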

PyTorch (Caffe2)
 * Website: https://pytorch.org/
 * Source: https://github.com/pytorch/pytorch
 * Container: l4t-pytorch
 * Version: PyTorch v1.0.0 - v1.5.0
 * Packages:

JetPack 4.2 / 4.3 note — the PyTorch and Caffe2 projects have merged, so installing PyTorch will also install Caffe2.
 * As per the PyTorch 1.4 Release Notes, Python 2 support is deprecated and PyTorch 1.4 is the last version to support Python 2.
 * As per the PyTorch 1.5 Release Notes, Python 2 is no longer supported in PyTorch v1.5.
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Forum Topic: devtalk.nvidia.com/default/topic/1049071/jetson-nano/pytorch-for-jetson-nano/
 * Build from Source: https://devtalk.nvidia.com/default/topic/1049071/#5324123
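Installation follows the wheel links posted in the forum topic above. A sketch of the flow (the wheel URL and filename below are placeholders; copy the actual link for your JetPack and PyTorch version from the forum post):

```shell
# Install pip and the OpenBLAS runtime that the Jetson PyTorch wheels link against
sudo apt-get install -y python3-pip libopenblas-base

# Download the PyTorch wheel (placeholder URL; use the link from the forum topic)
wget <pytorch-wheel-url> -O torch-1.5.0-cp36-cp36m-linux_aarch64.whl

# Install the wheel
pip3 install torch-1.5.0-cp36-cp36m-linux_aarch64.whl
```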

MXNet

 * Website: https://mxnet.apache.org/
 * Source: https://github.com/apache/incubator-mxnet
 * Version: 1.4, 1.6
 * Packages:
 * Supports: JetPack 4.2.x, JetPack 4.3 (Jetson Nano / TX1 / TX2 / AGX Xavier)
 * Forum Topic: v1.4 (JetPack 4.2.x) | v1.6 (JetPack 4.3)
 * Build from Source: https://devtalk.nvidia.com/default/topic/1049293/#5326119

Keras

 * Website: https://keras.io/
 * Source: https://github.com/keras-team/keras
 * Version: 2.2.4
 * Forum Topic: https://devtalk.nvidia.com/default/topic/1049362/#5325752

First, install TensorFlow from the section above, since Keras uses it as its backend.
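With TensorFlow in place, Keras itself is a plain pip install (pinning the version listed above):

```shell
# Keras 2.2.4 uses the TensorFlow build installed above as its backend
sudo pip3 install keras==2.2.4
```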

Hello AI World



 * Website: https://developer.nvidia.com/embedded/twodaystoademo
 * Source: https://github.com/dusty-nv/jetson-inference
 * Supports: Jetson Nano, TX1, TX2, Xavier NX, AGX Xavier
 * Build from Source:
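The build steps, as described in the project README on GitHub:

```shell
# Install build tools and Python headers
sudo apt-get install -y git cmake libpython3-dev python3-numpy

# Clone the repo along with its submodules
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference

# Configure, build, and install
mkdir build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig
```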

Model Zoo
Below are various DNN models for inferencing on Jetson with support for TensorRT. Included are links to code samples with the model and the original source.

Note that many other models are able to run natively on Jetson by using the Machine Learning frameworks like those listed above.

For performance benchmarks, see these resources:


 * Jetson Nano Deep Learning Inference Benchmarks
 * Jetson TX1/TX2 - NVIDIA AI Inference Technical Overview
 * Jetson AGX Xavier Deep Learning Inference Benchmarks

= Computer Vision =

OpenCV



 * Website: https://opencv.org/
 * Source: https://github.com/opencv/opencv
 * Version: 3.3.1 (JetPack <= 4.2.x), 4.1 (JetPack 4.3, JetPack 4.4)
 * Supports: Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier


 * OpenCV is included with JetPack, compiled with support for GStreamer. To build a newer version or to enable CUDA support, see these guides:
 * nano_build_opencv (GitHub)
 * Installing OpenCV 3.4.6
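To check which OpenCV version is present, and whether your build has CUDA enabled, a quick check from the Python bindings (assuming they are installed):

```shell
# Print the installed OpenCV version
python3 -c "import cv2; print(cv2.__version__)"

# Inspect the build configuration for CUDA support
python3 -c "import cv2; print(cv2.getBuildInformation())" | grep -i cuda
```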

= Robotics =

ROS



 * Website: http://ros.org/
 * Source: https://github.com/ros
 * Version: ROS Melodic
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Installation: http://wiki.ros.org/melodic/Installation/Ubuntu
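A condensed sketch of the standard Ubuntu 18.04 install from the ROS wiki page linked above:

```shell
# Add the ROS package repository and its signing key
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

# Install ROS Melodic (ros-base keeps the footprint small; use desktop-full for GUI tools)
sudo apt-get update
sudo apt-get install -y ros-melodic-ros-base
```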

NVIDIA Isaac SDK



 * Website: https://developer.nvidia.com/isaac-sdk
 * Version: 2019.2, 2019.3, 2020.1
 * Supports: JetPack 4.2.x, JetPack 4.3 (Jetson Nano / TX2 / Xavier)
 * Downloads: https://developer.nvidia.com/isaac/downloads
 * Documentation: https://docs.nvidia.com/isaac

Isaac SIM

 * Website: https://developer.nvidia.com/isaac-sdk
 * Documentation: http://docs.nvidia.com/isaac/isaac_sim/index.html

= IoT / Edge =

AWS Greengrass



 * Website: https://aws.amazon.com/greengrass/
 * Source: https://github.com/aws/aws-greengrass-core-sdk-c
 * Version: v1.9.1
 * Supports: JetPack 4.2.x, JetPack 4.3 (Jetson Nano / TX1 / TX2 / Xavier)
 * Forum Thread: https://devtalk.nvidia.com/default/topic/1052324/#5341970

1. Create Greengrass user group:
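Per the AWS Greengrass documentation, the core runs as a dedicated system user and group:

```shell
# Create the system user and group that the Greengrass core daemon runs as
sudo adduser --system ggc_user
sudo addgroup --system ggc_group
```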

2. Set up your AWS account and Greengrass group by following this page: https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-config.html

After downloading the unique security keys created in this step to your Jetson, proceed to step 3 below.

3. Download the AWS IoT Greengrass Core Software (v1.9.1) for ARMv8 (aarch64):
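For example (the URL below is a placeholder; copy the actual v1.9.1 ARMv8 link from the AWS Greengrass Core Software download page):

```shell
# Download the aarch64 Greengrass Core tarball (placeholder URL)
wget <greengrass-download-url>/greengrass-linux-aarch64-1.9.1.tar.gz
```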

4. Following step #4 from this page, extract Greengrass core and your unique security keys on your Jetson:
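A sketch of the extraction commands (the hash-prefixed keys archive name is illustrative; yours will match the file downloaded in step 2):

```shell
# Extract the Greengrass core to / (this creates /greengrass)
sudo tar -xzvf greengrass-linux-aarch64-1.9.1.tar.gz -C /

# Extract your unique security keys into /greengrass
sudo tar -xzvf <hash>-setup.tar.gz -C /greengrass
```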

5. Download AWS ATS endpoint root certificate (CA):
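The root CA goes into the Greengrass certs directory, for example:

```shell
# Fetch the Amazon Trust Services (ATS) root CA certificate
cd /greengrass/certs/
sudo wget -O root.ca.pem https://www.amazontrust.com/repository/AmazonRootCA1.pem
```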

6. Start Greengrass core on your Jetson. You should see a confirmation message in your terminal once the daemon has started.
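The daemon is started from the Greengrass install directory:

```shell
# Launch the Greengrass core daemon
cd /greengrass/ggc/core/
sudo ./greengrassd start
```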

NVIDIA DeepStream



 * Website: https://developer.nvidia.com/deepstream-sdk
 * Version: 5.0 (Developer Preview)
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * FAQ: https://developer.nvidia.com/deepstream-faq
 * GitHub Samples:
 * deepstream_reference_apps
 * redaction_with_deepstream
 * 360° Smart Parking Application

= Containers =

Docker



 * Website: https://docker.com/
 * Source: https://github.com/docker
 * Version: 18.06
 * Supports: JetPack >= 3.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Installed by default in JetPack-L4T

To enable GPU passthrough, enable access to the GPU device nodes by passing them with the --device flag when launching Docker containers.

The directory containing the NVIDIA driver libraries also needs to be mounted into the container.

Below is an example command line for launching Docker with access to the GPU:
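A minimal sketch (the image tag is an example; on JetPack 4.2.1 and newer, the NVIDIA container runtime takes care of mounting the device nodes and driver libraries for you):

```shell
# Launch a container with GPU access via the NVIDIA container runtime
sudo docker run -it --runtime nvidia nvcr.io/nvidia/l4t-base:r32.4.3
```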

To enable IPVLAN for Docker Swarm mode: https://blog.hypriot.com/post/nvidia-jetson-nano-build-kernel-docker-optimized/

Kubernetes



 * Website: https://kubernetes.io/
 * Source: https://github.com/kubernetes/
 * Supports: JetPack >= 3.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Distributions:
 * MicroK8s (v1.14)
 * k3s (v0.5.0)

To configure the L4T kernel for Kubernetes: https://medium.com/@jerry_liang/deploy-gpu-enabled-kubernetes-pod-on-nvidia-jetson-nano-ce738e3bcda9

See also: https://medium.com/jit-team/building-a-gpu-enabled-kubernets-cluster-for-machine-learning-with-nvidia-jetson-nano-7b67de74172a