Jetson/H264 Codec


About the codec

Although commonly referred to as the H.264 codec, the Tegra K1's high-definition video hardware supports encoding and decoding several formats in addition to H.264, including VC-1, VP8, MPEG-4 basic, MPEG-2, and JPEG. The driver for the codec hardware is an OpenMAX IL implementation, a standard media-streaming interface used by mobile SoCs, and is accessible through gstreamer using the nv_omx_h264enc and nv_omx_h264dec elements. (However, direct access is not allowed.) Visit here if you are new to using gstreamer.

From the command line

An easy way to get started with the codec is to launch some gstreamer pipelines from the terminal.

  • Receive a RAW RTP video stream over the network and encode it as H.264 to disk:
 gst-launch -e udpsrc port=5004 caps="application/x-rtp, media=(string)video, 
 clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, 
 depth=(string)8, width=(string)<WIDTH>, height=(string)<HEIGHT>, payload=(int)96" 
 ! gstrtpjitterbuffer ! rtpvrawdepay ! queue ! video/x-raw-yuv ! nvvidconv 
 ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc quality-level=2 
 ! video/x-h264 ! matroskamux ! queue ! filesink location=<FILENAME>.mkv

(the <WIDTH>, <HEIGHT>, and <FILENAME> parameters are to be provided by the user; note that udpsrc has no host property, it simply listens on the given port)

  • Capture a live video stream from a USB camera, encode it, and store it to the local filesystem:
 gst-launch -e v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=<FRAME_WIDTH>,height=<FRAME_HEIGHT>,framerate=<FRAME_RATE>/1' 
 ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=<FILE_PATH>

e.g. gst-launch -e v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=1280,height=720,framerate=30/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=/home/room/family.mp4

  • Decode the H.264 file from disk and stream it out over the network:
 gst-launch filesrc location=<FILENAME>.mkv ! matroskademux ! queue ! h264parse ! nv_omx_h264dec 
 ! ffmpegcolorspace ! 'video/x-raw-yuv, format=(fourcc)UYVY' ! rtpvrawpay mtu=1472 
 ! queue ! udpsink host=<IP> port=5004 loop=false
  • Examine the properties & settings of elements using gst-inspect:
 gst-inspect nv_omx_h264enc

Note: The examples above are for gstreamer-0.10. For the newer gstreamer-1.0, read NVIDIA's Multimedia User Guide (such as v2.2, which can be downloaded here).


  • Running and Debugging GStreamer Applications
  • Easier debugging - create dot files for analyzing and visualizing a pipeline
    • This can be very helpful, especially when testing a custom gstreamer pipeline or plugin under development. Visit here for more info.
  • The "dot" command is available in the "graphviz" package. Graphviz intsallation on Ubuntu
  • Below is a handy script to visualize dot files on Ubuntu. It temporarily creates a PNG file using 'dot' and displays it using eog, the Eye of GNOME image viewer:
#!/bin/bash
#description    :This script will display a dot file using eog - eye of gnome
#usage          :<dot_file>
if [ $# -lt 1 ]; then
	echo "I don't know which dot file to open and show. Must supply 1 argument for that."; exit 1; fi
if [ ! -f "$1" ]; then
	echo "Invalid input file."; exit 1; fi
dot -Tpng "$1" > /tmp/`basename "$1" .dot`.png
eog /tmp/`basename "$1" .dot`.png &
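GStreamer generates these dot files only when the GST_DEBUG_DUMP_DOT_DIR environment variable points at an existing directory before the pipeline is launched (the directory path below is just an example):

```shell
# GStreamer writes a .dot snapshot of the pipeline on every state change,
# but only if this variable is set before the pipeline starts:
export GST_DEBUG_DUMP_DOT_DIR=/tmp/gst-dot
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"
# Now run any pipeline, e.g.:
#   gst-launch videotestsrc num-buffers=30 ! fakesink
# and files such as *.PAUSED_PLAYING.dot appear in the directory:
ls "$GST_DEBUG_DUMP_DOT_DIR"
```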
  • Another script watches for changes in a local filesystem directory and displays matching dot files. This can be useful when doing iterative debug/test cycles. This sample uses the 'inotify-tools' package to get notified of changes; you can explore other ways too.
#!/bin/bash
#description    :This script will watch for any changes in <arg1> directory and display matching dot files using eog
#usage          :<watch_directory> <filename_filter>
#example        :/tmp/test/ *PAUSED_PLAYING*
if [ $# -lt 1 ]; then
	echo "I don't know what directory to use. Must supply 1 argument for that."; exit 1; fi
if [ ! -d "$1" ]; then
	echo "Invalid directory path/name."; exit 1; fi
while inotifywait -e modify -e create -e delete "$1"; do
	if [ $# -ge 2 ]; then
		for f in "$1"/$2; do
			dot -Tpng "$f" > /tmp/`basename "$f" .dot`.png
			eog /tmp/`basename "$f" .dot`.png &
		done; fi
done

From within your application

Using the gstreamer-defined appsrc and appsink elements, it is possible to efficiently pass application data into a local gstreamer pipeline running in the application's userspace. For example, using appsrc a CUDA video-processing application could send its image buffers into gstreamer to be encoded, and then retrieve the H.264-encoded data from gstreamer using appsink. Code sample using nv_omx_h264enc/nv_omx_h264dec coming soon!
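Until that sample is available, here is a sketch of the pipeline descriptions such an application might build. The caps, property values, and element names below are assumptions mirroring the command-line examples above, not a tested configuration:

```
# Encode: the application pushes raw I420 frames in via appsrc and
# pulls H.264-encoded buffers out via appsink
appsrc name=videosrc ! video/x-raw-yuv, format=(fourcc)I420, width=1280, height=720, framerate=30/1 ! nvvidconv ! nv_omx_h264enc quality-level=2 ! appsink name=h264sink

# Decode: the application pushes an H.264 elementary stream in and
# pulls raw decoded frames out
appsrc name=h264src ! h264parse ! nv_omx_h264dec ! appsink name=videosink
```

The application would pass such a string to gst_parse_launch(), look up the named elements with gst_bin_get_by_name(), and then move data with gst_app_src_push_buffer() and gst_app_sink_pull_buffer() (the gstreamer-0.10 app-library calls).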