Jetson/H264 Codec

About the codec
Although commonly referred to as the H.264 codec, the Tegra K1's high-definition video hardware supports encoding and decoding several formats in addition to H.264, including VC-1, VP8, MPEG-4 basic, MPEG-2, and JPEG. The driver for the codec hardware is an OpenMAX IL implementation, a standard media streaming interface used by mobile SoCs, and is accessible through gstreamer using the nv_omx_h264enc and nv_omx_h264dec elements. (However, direct access to the OpenMAX IL driver is not allowed.) Visit here if you are new to using gstreamer.
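Before building pipelines, it can help to confirm that the NVIDIA elements are actually registered on your board. A minimal probe using gst-inspect (nvvidconv is included here because the pipelines below use it; the loop itself is just an illustration):

```shell
# Probe for the NVIDIA OpenMAX elements used in the pipelines below.
# Prints "found" or "missing" for each; requires gst-inspect (GStreamer 0.10)
# to be on the PATH, as on a stock Linux4Tegra install.
for e in nv_omx_h264enc nv_omx_h264dec nvvidconv; do
    if gst-inspect "$e" >/dev/null 2>&1; then
        status=found
    else
        status=missing
    fi
    echo "$e: $status"
done
```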

From the command line
An easy way to get started with the codec is to launch some gstreamer pipelines from the terminal.

 * Receive a RAW RTP video stream over the network and encode it as H.264 to disk (the host, width, height, and output filename are to be supplied by the user):

gst-launch -e udpsrc host= port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2, depth=(string)8, width=(string), height=(string), payload=(int)96" ! gstrtpjitterbuffer ! rtpvrawdepay ! queue ! video/x-raw-yuv ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc quality-level=2 ! video/x-h264 ! matroskamux ! queue ! filesink location=.mkv

 * Capture a live video stream from a USB camera, encode it, and store it to the local filesystem:

gst-launch -e v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=,height=,framerate=/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=

For example:

gst-launch -e v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=1280,height=720,framerate=30/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=/home/room/family.mp4

 * Decode the H.264 file from disk and stream it out over the network:

gst-launch filesrc location=.mkv ! matroskademux ! queue ! h264parse ! nv_omx_h264dec ! ffmpegcolorspace ! 'video/x-raw-yuv, format=(fourcc)UYVY' ! rtpvrawpay mtu=1472 ! queue ! udpsink host= port=5004 loop=false

 * Examine the properties and settings of elements using gst-inspect:

gst-inspect nv_omx_h264enc

Debugging

 * Running and Debugging GStreamer Applications
 * Easier debugging - create dot files for analyzing and visualizing a pipeline
 * This can be very helpful, especially when testing a custom gstreamer pipeline or plugin under development. Visit here for more info.
 * The "dot" command is available in the "graphviz" package; see Graphviz installation on Ubuntu.
 * Below is a handy script to visualize dot files on Ubuntu. It temporarily creates a PNG using 'dot' and shows it with eog, the Eye of GNOME image viewer.

#!/bin/bash
# title         :dotFileViewer.sh
# description   :This script will display a dot file using eog - eye of gnome
# usage         :dotFileViewer.sh
# example       :dotFileViewer.sh /tmp/test/0.00.00.436609917-gst-launch.READY_PAUSED.dot

if [ $# -lt 1 ]; then
    echo "I don't know which dot file to open and show. Must supply 1 argument for that."
    exit -1
fi
if [ ! -f $1 ]; then
    echo "Invalid input file."
    exit -1
fi
dot -Tpng $1 > /tmp/`basename $1 .dot`.png
eog /tmp/`basename $1 .dot`.png &


 * Another script to watch for changes in a local filesystem directory and display a dot file. This can be useful when doing iterative debug/test. This sample uses the 'inotify-tools' package to get notified; you can explore other ways too.

#!/bin/bash
# title         :dotFileWatcherViewer.sh
# description   :This script will watch for any changes in directory and display a dot file using dotFileViewer.sh
# usage         :dotFileWatcherViewer.sh
# example       :dotFileWatcherViewer.sh /tmp/test/ *PAUSED_PLAYING*

if [ $# -lt 1 ]; then
    echo "I don't know what directory to use. Must supply 1 argument for that."
    exit -1
fi
if [ ! -d $1 ]; then
    echo "Invalid directory path/name."
    exit -1
fi
while inotifywait -e modify -e create -e delete $1; do
    if [ $# -ge 2 ]; then
        dotFileViewer.sh $1/$2
    else
        dotFileViewer.sh $1/*PAUSED_PLAYING*
    fi
done
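For the scripts above to have anything to display, gstreamer must first be told to dump the .dot files: it only writes them when the GST_DEBUG_DUMP_DOT_DIR environment variable points at an existing directory. A minimal setup (the directory path below is an arbitrary choice):

```shell
# GStreamer dumps a pipeline .dot file at each state transition, but only
# when GST_DEBUG_DUMP_DOT_DIR is set. /tmp/gst-dot is an arbitrary choice.
export GST_DEBUG_DUMP_DOT_DIR=/tmp/gst-dot
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"
# Now launch any pipeline and the .dot files will appear there, e.g.:
#   gst-launch -e videotestsrc num-buffers=100 ! fakesink
echo "dot files will be written to $GST_DEBUG_DUMP_DOT_DIR"
```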

From within your application
Using the gstreamer-defined appsrc and appsink elements, it is possible to efficiently send application data into a local gstreamer pipeline running in the application's userspace. For example, using appsrc a CUDA video processing application could send its image buffers into gstreamer to be encoded, and then retrieve the H.264-encoded data from gstreamer using appsink. Code sample using nv_omx_h264enc/nv_omx_h264dec coming soon!
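Until that sample is available, a rough command-line analogue of appsrc is to have the application write raw frames to its stdout and feed them to the encoder through the stock fdsrc element. Everything in this sketch (the caps, resolution, frame rate, and output path) is an illustrative assumption, not a tested recipe:

```shell
# Sketch: an application pipes raw I420 frames into the hardware encoder via
# fdsrc reading from stdin (fd=0). The caps, width/height/framerate, and the
# output path are illustrative assumptions -- adjust them to your source.
PIPELINE="fdsrc fd=0 ! 'video/x-raw-yuv, format=(fourcc)I420, width=1280, height=720, framerate=30/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=/tmp/out.mp4"
# On the Jetson you would run something like:
#   your_raw_frame_app | gst-launch -e <the pipeline above>
echo "$PIPELINE"
```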