Jetson Zoo

This page contains instructions for installing various open source add-on packages and frameworks on NVIDIA Jetson, in addition to a collection of DNN models for inferencing.

Below are links to container images and precompiled binaries built for the aarch64 (arm64) architecture. These are intended to be installed on top of JetPack.

Note that JetPack comes with various pre-installed components such as the L4T kernel, CUDA Toolkit, cuDNN, TensorRT, VisionWorks, OpenCV, GStreamer, Docker, and more.

= Machine Learning =

Jetson is able to natively run the full versions of popular machine learning frameworks, including TensorFlow, PyTorch, Caffe2, Keras, and MXNet.

There are also helpful deep learning examples and tutorials created specifically for Jetson, such as Hello AI World and JetBot.

Docker Containers
There are ready-to-use ML and data science containers for Jetson hosted on NVIDIA GPU Cloud (NGC), including the following:


 * l4t-tensorflow - TensorFlow for JetPack 4.4 (and newer)
 * l4t-pytorch - PyTorch for JetPack 4.4 (and newer)
 * l4t-ml - TensorFlow, PyTorch, scikit-learn, scipy, pandas, JupyterLab, etc.

If you wish to modify them, the Dockerfiles and build scripts for these containers can be found on GitHub.

There are also the following ready-to-use container images for Jetson hosted on DockerHub and third-party registries:
 * ROS/ROS2 containers: https://github.com/dusty-nv/jetson-containers
 * ONNX Runtime for Jetson: mcr.microsoft.com/azureml/onnxruntime:v.1.4.0-jetpack4.4-l4t-base-r32.4.3

These containers are highly recommended for reducing the installation time of the frameworks below, and for beginners getting started.
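As an illustration, pulling and launching the l4t-ml container from NGC can be sketched as follows (the image tag is an assumption; choose the tag that matches your installed L4T/JetPack release):

```shell
# Pull the l4t-ml container from NGC (tag r32.7.1-py3 is an example;
# pick the tag matching your L4T release)
sudo docker pull nvcr.io/nvidia/l4t-ml:r32.7.1-py3

# Run it with GPU passthrough and host networking
# (the bundled JupyterLab server listens on port 8888)
sudo docker run -it --rm --runtime nvidia --network host nvcr.io/nvidia/l4t-ml:r32.7.1-py3
```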



TensorFlow

 * Website: https://tensorflow.org
 * Source: https://github.com/tensorflow/tensorflow
 * Container: l4t-tensorflow
 * Version: 1.15, 2.x (Python 3.6/3.8)
 * Packages:
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier / AGX Orin)
 * Install Guide: Installing TensorFlow on Jetson
 * Forum Topic: devtalk.nvidia.com/default/topic/1048776/jetson-nano/official-tensorflow-for-jetson-nano-/
 * Build from Source: https://devtalk.nvidia.com/default/topic/1055131/jetson-agx-xavier/building-tensorflow-1-13-on-jetson-xavier/
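Following the install guide above, installation from NVIDIA's pip index can be sketched as below (the `v44` path component maps to JetPack 4.4 and is an assumption; adjust it for your JetPack release):

```shell
# HDF5 headers/tools are needed for the h5py dependency
sudo apt-get install -y libhdf5-serial-dev hdf5-tools

# NVIDIA hosts Jetson TensorFlow wheels on a per-JetPack pip index
JP_VERSION=v44
INDEX_URL="https://developer.download.nvidia.com/compute/redist/jp/${JP_VERSION}"

# Install the TF1 line; drop the '<2' pin for TensorFlow 2.x
sudo pip3 install --extra-index-url "${INDEX_URL}" 'tensorflow<2'
```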

PyTorch (Caffe2)
JetPack 5.0
 * Website: https://pytorch.org/
 * Source: https://github.com/pytorch/pytorch
 * Container: l4t-pytorch
 * Version: PyTorch v1.0.0 - v1.12.0
 * Packages:

JetPack 4.4 / 4.4.1 / 4.5 / 4.5.1 / 4.6

JetPack 4.4 Developer Preview

JetPack 4.2 / 4.3 note — the PyTorch and Caffe2 projects have merged, so installing PyTorch will also install Caffe2. As per the PyTorch 1.4 Release Notes, Python 2 support is deprecated and PyTorch 1.4 is the last version to support Python 2.
 * As per the PyTorch Release Notes, Python 2 is no longer supported as of PyTorch v1.5 and newer.
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier / AGX Orin)
 * Forum Topic: devtalk.nvidia.com/default/topic/1049071/jetson-nano/pytorch-for-jetson-nano/
 * Build from Source: https://devtalk.nvidia.com/default/topic/1049071/#5324123
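After installing one of the wheels from the forum topic above, a quick sanity check (a sketch, not an official verification step) confirms that PyTorch imports and can see the integrated GPU:

```shell
# Print the installed PyTorch version and whether CUDA is usable
python3 -c "import torch; print(torch.__version__); print(torch.cuda.is_available())"
```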

ONNX Runtime


JetPack 4.4 / 4.4.1 / 4.5 / 4.5.1 / 4.6 / 4.6.1
 * Website: https://microsoft.github.io/onnxruntime/
 * Source: https://github.com/microsoft/onnxruntime
 * Container: mcr.microsoft.com/azureml/onnxruntime:v.1.4.0-jetpack4.4-l4t-base-r32.4.3
 * Version: 1.4.0, 1.5.2, 1.6.0, 1.7.0, 1.8.0, 1.9.0, 1.10.0, 1.11.0, 1.12.1
 * Packages:

JetPack 5.0


 * Supports: JetPack >= 4.4 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier / AGX Orin)
 * Forum Support: https://github.com/microsoft/onnxruntime
 * Build from Source: Refer to these instructions
 * ONNX Runtime 1.10.0 Install instructions
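Once a wheel is installed, the build can be sanity-checked from Python (a sketch; on a GPU-enabled build `get_device()` reports GPU):

```shell
# Print the installed ONNX Runtime version and the device it targets
python3 -c "import onnxruntime; print(onnxruntime.__version__); print(onnxruntime.get_device())"
```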

MXNet

 * Website: https://mxnet.apache.org/
 * Source: https://github.com/apache/incubator-mxnet
 * Version: 1.4, 1.6, 1.7
 * Packages:
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Forum Topics: v1.4 | v1.6 | v1.7
 * Build from Source: v1.6 | v1.7

MXNet 1.7 Install Instructions:

MXNet 1.4 / 1.6 Install Instructions:
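After following either set of install instructions, a short check (a sketch) confirms that MXNet imports and detects the GPU:

```shell
# Print the MXNet version and the number of GPUs it can see
python3 -c "import mxnet; print(mxnet.__version__); print(mxnet.context.num_gpus())"
```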

Keras

 * Website: https://keras.io/
 * Source: https://github.com/keras-team/keras
 * Version: 2.2.4
 * Forum Topic: https://devtalk.nvidia.com/default/topic/1049362/#5325752

First, install TensorFlow from above.
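With TensorFlow in place, Keras itself can be installed from pip (a minimal sketch; the apt prerequisites are an assumption for building native dependencies such as scipy):

```shell
# Build prerequisites for pip dependencies (assumption: needed for source builds)
sudo apt-get install -y build-essential libatlas-base-dev gfortran

# Install Keras itself
sudo pip3 install keras
```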

Hello AI World



 * Website: https://developer.nvidia.com/embedded/twodaystoademo
 * Source: https://github.com/dusty-nv/jetson-inference
 * Supports: Jetson Nano, TX1, TX2, Xavier NX, AGX Xavier, AGX Orin
 * Run the Container


 * Build from Source
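Building from source follows the sequence from the jetson-inference repository README, which can be sketched as:

```shell
# Install build tools and Python bindings prerequisites
sudo apt-get update
sudo apt-get install -y git cmake libpython3-dev python3-numpy

# Clone the project along with its submodules
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference

# Configure, compile, and install
mkdir build
cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig
```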

Model Zoo
Below are various DNN models for inferencing on Jetson with support for TensorRT. Included are links to code samples with the model and the original source.

Note that many other models are able to run natively on Jetson by using the Machine Learning frameworks like those listed above.

For performance benchmarks, see these resources:


 * Jetson Nano Deep Learning Inference Benchmarks
 * Jetson TX1/TX2 - NVIDIA AI Inference Technical Overview
 * Jetson AGX Xavier Deep Learning Inference Benchmarks

Pose Estimation
= Computer Vision =

OpenCV



 * Website: https://opencv.org/
 * Source: https://github.com/opencv/opencv
 * Version: 3.3.1 (JetPack <= 4.2.x), 4.1.1 (JetPack 4.3, JetPack 4.4, JetPack 4.5)
 * Supports: Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier / AGX Orin


 * OpenCV is included with JetPack, compiled with support for GStreamer. To build a newer version or to enable CUDA support, see these guides:
 * nano_build_opencv (GitHub)
 * Installing OpenCV 3.4.6
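To check which OpenCV version is present and whether GStreamer (and, after a custom build, CUDA) support was compiled in, a quick sketch:

```shell
# Print the OpenCV version bundled with JetPack
python3 -c "import cv2; print(cv2.__version__)"

# Inspect the build configuration for GStreamer/CUDA support
python3 -c "import cv2; print(cv2.getBuildInformation())" | grep -E "GStreamer|CUDA"
```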

= Robotics =

ROS



 * Website: http://ros.org/
 * Source: https://github.com/ros
 * Version: ROS Melodic, ROS Noetic, ROS2 Eloquent, ROS2 Foxy, ROS2 Galactic, ROS2 Humble
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier / AGX Orin)
 * Installation: http://wiki.ros.org/melodic/Installation/Ubuntu
 * Containers: https://github.com/dusty-nv/jetson-containers

You can also find various resources, libraries, and extra NVIDIA packages for ROS2 at https://nvidia-ai-iot.github.io/ros2_jetson/

NVIDIA Isaac SDK



 * Website: https://developer.nvidia.com/isaac-sdk
 * Version: 2019.2, 2019.3, 2020.1, 2020.2
 * Supports: JetPack 4.2.x, JetPack 4.3, JetPack 4.4 (Jetson Nano / TX2 / Xavier)
 * Downloads: https://developer.nvidia.com/isaac/downloads
 * Documentation: https://docs.nvidia.com/isaac

Isaac SIM

 * Website: https://developer.nvidia.com/isaac-sdk
 * Documentation: http://docs.nvidia.com/isaac/isaac_sim/index.html

= IoT / Edge =

AWS Greengrass



 * Website: https://aws.amazon.com/greengrass/
 * Source: https://github.com/aws/aws-greengrass-core-sdk-c
 * Version: v1.9.1
 * Supports: JetPack 4.2.x, JetPack 4.3, JetPack 4.4 (Jetson Nano / TX1 / TX2 / Xavier)
 * Forum Thread: https://devtalk.nvidia.com/default/topic/1052324/#5341970

1. Create Greengrass user group:

2. Set up your AWS account and Greengrass group by following this page: https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-config.html

After downloading the unique security resource keys created in this step to your Jetson, proceed to step 3 below.

3. Download the AWS IoT Greengrass Core Software (v1.9.1) for ARMv8 (aarch64):

4. Following step #4 from this page, extract Greengrass core and your unique security keys on your Jetson:

5. Download AWS ATS endpoint root certificate (CA):

6. Start Greengrass core on your Jetson. You should see a confirmation message in your terminal once it is running.
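The steps above can be sketched as the following shell session (a sketch, assuming Greengrass v1.9.1; `<hash>-setup.tar.gz` stands in for the unique security-resource archive downloaded from your AWS console):

```shell
# Step 1: create the Greengrass system user and group
sudo adduser --system ggc_user
sudo addgroup --system ggc_group

# Steps 3-4: extract the Greengrass core tarball to / and your unique
# security keys (<hash>-setup.tar.gz from the AWS console) into /greengrass
sudo tar -xzvf greengrass-linux-aarch64-1.9.1.tar.gz -C /
sudo tar -xzvf <hash>-setup.tar.gz -C /greengrass

# Step 5: download the AWS ATS endpoint root CA certificate
cd /greengrass/certs
sudo wget -O root.ca.pem https://www.amazontrust.com/repository/AmazonRootCA1.pem

# Step 6: start the Greengrass core daemon
cd /greengrass/ggc/core
sudo ./greengrassd start
```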

NVIDIA DeepStream



 * Website: https://developer.nvidia.com/deepstream-sdk
 * Version: 5.0
 * Supports: JetPack >= 4.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * FAQ: https://developer.nvidia.com/deepstream-faq
 * GitHub Samples:
 * deepstream_reference_apps
 * redaction_with_deepstream
 * 360° Smart Parking Application
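After installing the SDK, one of the bundled reference configurations can be run with deepstream-app (the sample path and config name below are assumptions based on a default DeepStream 5.0 install; substitute any config from the samples directory):

```shell
# Run a bundled multi-stream reference pipeline
cd /opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app
deepstream-app -c source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt
```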

= Containers =

Docker



 * Website: https://docker.com/
 * Source: https://github.com/docker
 * Version: 18.06
 * Supports: JetPack >= 3.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Installed by default in JetPack-L4T

Launch your container with --runtime nvidia to enable GPU passthrough. Launch your container with --volume /tmp/argus_socket:/tmp/argus_socket to enable access to MIPI CSI cameras.

See https://github.com/NVIDIA/nvidia-docker/wiki/NVIDIA-Container-Runtime-on-Jetson for more documentation on using Docker on Jetson.

To enable IPVLAN for Docker Swarm mode: https://blog.hypriot.com/post/nvidia-jetson-nano-build-kernel-docker-optimized/
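For example, a container launch with GPU passthrough and CSI camera access might look like the following (the image tag and video device node are assumptions; adjust them for your L4T release and camera):

```shell
# Run l4t-base with the NVIDIA runtime for GPU access,
# the argus socket for MIPI CSI cameras, and a V4L2 device node
sudo docker run -it --rm --runtime nvidia \
    --volume /tmp/argus_socket:/tmp/argus_socket \
    --device /dev/video0 \
    nvcr.io/nvidia/l4t-base:r32.5.0
```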

Kubernetes



 * Website: https://kubernetes.io/
 * Source: https://github.com/kubernetes/
 * Supports: JetPack >= 3.2 (Jetson Nano / TX1 / TX2 / Xavier NX / AGX Xavier)
 * Distributions:
 * MicroK8s (v1.14)
 * k3s (v0.5.0)

To configure the L4T kernel for Kubernetes: https://medium.com/@jerry_liang/deploy-gpu-enabled-kubernetes-pod-on-nvidia-jetson-nano-ce738e3bcda9

See also: https://medium.com/jit-team/building-a-gpu-enabled-kubernets-cluster-for-machine-learning-with-nvidia-jetson-nano-7b67de74172a
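As a lightweight starting point, k3s can be installed with its upstream convenience script (a sketch; run it on each node, then verify registration):

```shell
# Install k3s via the upstream installer script
curl -sfL https://get.k3s.io | sh -

# Confirm the node has registered with the cluster
sudo k3s kubectl get nodes
```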