Jetson Zoo

This page contains instructions for installing various open source add-on packages and frameworks on NVIDIA Jetson, in addition to a collection of DNN models for inferencing.

Below are links to precompiled binaries built for aarch64 (arm64) architecture, including support for CUDA where applicable. These are intended to be installed on top of JetPack.

Note that JetPack comes with various pre-installed components such as the L4T kernel, CUDA Toolkit, cuDNN, TensorRT, VisionWorks, OpenCV, GStreamer, Docker, and more.

For the latest updates and support, refer to the listed forum topics. Feel free to contribute to the list below if you know of software packages that are working & tested on Jetson.

Machine Learning

Jetson is able to natively run the full versions of popular machine learning frameworks, including TensorFlow, PyTorch, Caffe2, Keras, and MXNet.

There are also helpful deep learning examples and tutorials created specifically for Jetson, such as Hello AI World and JetBot.

TensorFlow

# install prerequisites
$ sudo apt-get install libhdf5-serial-dev hdf5-tools libhdf5-dev python3-pip
$ pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu==1.13.1+nv19.5 --user
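
To confirm that the wheel installed correctly and that TensorFlow can see the GPU, a quick check along these lines should work (a minimal sketch; the reported version will match the wheel installed above):

# optional: verify the install and GPU visibility
$ python3 -c "import tensorflow as tf; print(tf.__version__); print(tf.test.is_gpu_available())"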

PyTorch (Caffe2)

Version  Python 2.7  Python 3.6
v1.0.0   pip wheel   pip wheel
v1.1.0   pip wheel   pip wheel

Note: the PyTorch and Caffe2 projects have merged, so installing PyTorch will also install Caffe2.

# Python 2.7 (download pip wheel from above)
$ pip install torch-1.1.0-cp27-cp27mu-linux_aarch64.whl

# Python 3.6 (download pip wheel from above)
$ pip3 install numpy torch-1.1.0-cp36-cp36m-linux_aarch64.whl
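
To verify that the wheel was built with CUDA support enabled, a quick check like the following can be run (a minimal sketch):

# optional: verify that PyTorch detects the GPU
$ python3 -c "import torch; print(torch.__version__); print(torch.cuda.is_available())"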

MXNet

# Python 2.7
sudo apt-get install -y git build-essential libatlas-base-dev libopencv-dev graphviz python-pip
sudo pip install mxnet-1.4.0-cp27-cp27mu-linux_aarch64.whl

# Python 3.6
sudo apt-get install -y git build-essential libatlas-base-dev libopencv-dev graphviz python3-pip
sudo pip3 install mxnet-1.4.0-cp36-cp36m-linux_aarch64.whl
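
As a quick sanity check (a minimal sketch, assuming the wheel above installed cleanly), MXNet can be asked to create a small array on the GPU:

# optional: verify that MXNet can allocate GPU memory
$ python3 -c "import mxnet as mx; print(mx.__version__); print(mx.nd.zeros((2,2), ctx=mx.gpu(0)))"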

Keras


First, install TensorFlow as shown in the section above.

# beforehand, install TensorFlow (https://eLinux.org/Jetson_Zoo#TensorFlow)
$ sudo apt-get install -y build-essential libatlas-base-dev
$ sudo pip3 install keras
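
A quick import check confirms that Keras loads and picks up the TensorFlow backend installed above (a minimal sketch):

# optional: verify the install (should also print "Using TensorFlow backend.")
$ python3 -c "import keras; print(keras.__version__)"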

Hello AI World

# download the repo
$ git clone https://github.com/dusty-nv/jetson-inference
$ cd jetson-inference
$ git submodule update --init

# configure build tree
$ mkdir build
$ cd build
$ cmake ../

# build and install
$ make 
$ sudo make install
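
After the build, the sample binaries and test images are placed under the build tree. A quick way to exercise the install is to run one of the console demos (a sketch, assuming the default aarch64 build directory and the test images that ship with the repo):

# run a sample: classify a test image (GoogleNet is the default network)
$ cd aarch64/bin        # from the jetson-inference/build directory
$ ./imagenet-console orange_0.jpg output_0.jpg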

Model Zoo

Below are various DNN models that are known to run on Jetson with support for TensorRT. Included are links to samples and the upstream source of the model.

For performance benchmarks, see these resources:

Classification

Network     Dataset   Resolution  Classes  Framework  TensorRT  Samples         Upstream
AlexNet     ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  BVLC
GoogleNet   ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  BVLC
ResNet-18   ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  GitHub
ResNet-50   ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  GitHub
ResNet-101  ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  GitHub
ResNet-152  ILSVRC12  224x224     1000     Caffe      Yes       Hello AI World  GitHub
VGG-16      ILSVRC12  224x224     1000     Caffe      Yes       -               GitHub
VGG-19      ILSVRC12  224x224     1000     Caffe      Yes       -               GitHub
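
The Caffe models above can also be imported and timed directly with TensorRT's trtexec tool. Below is a rough sketch, assuming GoogleNet, its deploy.prototxt from the linked BVLC repository, and the trtexec binary that JetPack installs under /usr/src/tensorrt/bin:

# download the GoogleNet weights from the BVLC model zoo
$ wget http://dl.caffe.berkeleyvision.org/bvlc_googlenet.caffemodel

# build and benchmark a TensorRT engine from the Caffe prototxt and weights (FP16 mode)
$ /usr/src/tensorrt/bin/trtexec --deploy=deploy.prototxt --model=bvlc_googlenet.caffemodel --output=prob --fp16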

Computer Vision

OpenCV


Robotics

ROS

# enable all Ubuntu packages:
$ sudo apt-add-repository universe
$ sudo apt-add-repository multiverse
$ sudo apt-add-repository restricted

# add ROS repository to apt sources
$ sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
$ sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

# install ROS Base
$ sudo apt-get update
$ sudo apt-get install ros-melodic-ros-base

# add ROS paths to environment
$ echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
$ source ~/.bashrc
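
To check that the environment is set up correctly, starting the ROS master is a simple smoke test (a minimal sketch):

# verify the install by starting the ROS master (Ctrl+C to stop)
$ source /opt/ros/melodic/setup.bash
$ roscore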

NVIDIA Isaac SDK


Isaac SIM

IoT / Edge

AWS Greengrass


1. Create the Greengrass user and group:

$ sudo adduser --system ggc_user
$ sudo addgroup --system ggc_group

2. Set up your AWS account and Greengrass group by following this page: https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-config.html
    After downloading the unique security keys created in this step to your Jetson, proceed to step 3 below.

3. Download the AWS IoT Greengrass Core Software (v1.9.1) for ARMv8 (aarch64):

$ wget https://d1onfpft10uf5o.cloudfront.net/greengrass-core/downloads/1.9.1/greengrass-linux-aarch64-1.9.1.tar.gz

4. Following step #4 from the configuration page above, extract the Greengrass core software and your unique security keys on your Jetson:

$ sudo tar -xzvf greengrass-linux-aarch64-1.9.1.tar.gz -C /
$ sudo tar -xzvf <hash>-setup.tar.gz -C /greengrass   # these are the security keys downloaded above

5. Download AWS ATS endpoint root certificate (CA):

$ cd /greengrass/certs/
$ sudo wget -O root.ca.pem https://www.amazontrust.com/repository/AmazonRootCA1.pem

6. Start Greengrass core on your Jetson:

$ cd /greengrass/ggc/core/
$ sudo ./greengrassd start

You should see a message in your terminal: "Greengrass successfully started with PID: xxx"
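
To double-check that the daemon stays running, you can look for the process and watch the runtime log (a sketch; the log path below is Greengrass core's default location):

# confirm the Greengrass daemon is running
$ ps aux | grep -i greengrassd

# follow the runtime log for any errors (default log location)
$ sudo tail -f /greengrass/ggc/var/log/system/runtime.log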

NVIDIA DeepStream


Containers

Docker


To enable GPU passthrough, grant the container access to these device nodes with the --device flag when launching Docker:

/dev/nvhost-ctrl
/dev/nvhost-ctrl-gpu
/dev/nvhost-prof-gpu
/dev/nvmap
/dev/nvhost-gpu
/dev/nvhost-as-gpu

The /usr/lib/aarch64-linux-gnu/tegra directory also needs to be mounted into the container.

Below is an example command line for launching Docker with access to the GPU:

docker run --device=/dev/nvhost-ctrl --device=/dev/nvhost-ctrl-gpu --device=/dev/nvhost-prof-gpu --device=/dev/nvmap --device=/dev/nvhost-gpu --device=/dev/nvhost-as-gpu -v /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra <container-name>
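
Since that command line gets long, one option is to wrap it in a small helper script (a sketch; the script name is hypothetical, and the device list and image name should be adjusted to your setup):

#!/bin/bash
# docker-run-gpu.sh: hypothetical helper that passes the Tegra GPU device nodes and driver libraries into a container
DEVICES=(
  --device=/dev/nvhost-ctrl
  --device=/dev/nvhost-ctrl-gpu
  --device=/dev/nvhost-prof-gpu
  --device=/dev/nvmap
  --device=/dev/nvhost-gpu
  --device=/dev/nvhost-as-gpu
)
VOLUMES=(-v /usr/lib/aarch64-linux-gnu/tegra:/usr/lib/aarch64-linux-gnu/tegra)

# usage: ./docker-run-gpu.sh <container-name> [command...]
docker run --rm -it "${DEVICES[@]}" "${VOLUMES[@]}" "$@"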

To enable IPVLAN for Docker Swarm mode: https://blog.hypriot.com/post/nvidia-jetson-nano-build-kernel-docker-optimized/

Kubernetes


To configure L4T kernel for K8S: https://medium.com/@jerry_liang/deploy-gpu-enabled-kubernetes-pod-on-nvidia-jetson-nano-ce738e3bcda9