Jetson/H264 Codec

About the codec
Although commonly referred to as the H.264 codec, the Tegra K1's high-definition video hardware supports encoding and decoding several formats in addition to H.264, including VC-1, VP8, MPEG-4 basic, MPEG-2, and JPEG. The driver for the codec hardware is an OpenMAX IL implementation, a standard media-streaming interface used by mobile SoCs, and is accessible through gstreamer via the nv_omx_h264enc and nv_omx_h264dec elements. See the gstreamer documentation if you are new to using it.

From the command line
An easy way to get started with the codec is to launch some gstreamer pipelines from the terminal.

 * Receive a RAW RTP video stream over the network and encode it as H.264 to disk (the host, width, height, and output-file parameters are to be provided by the user):

gst-launch -e udpsrc host= port=5004 caps="application/x-rtp, media=(string)video,  clock-rate=(int)90000,encoding-name=(string)RAW, sampling=(string)YCbCr-4:2:2,   depth=(string)8, width=(string), height=(string), payload=(int)96" ! gstrtpjitterbuffer ! rtpvrawdepay ! queue ! video/x-raw-yuv ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc quality-level=2 ! video/x-h264 ! matroskamux ! queue ! filesink location=.mkv

 * Capture a live video stream from a USB camera, encode it, and store it to the local filesystem (the width, height, framerate, and output-file parameters are to be provided by the user):

gst-launch -e v4l2src ! 'video/x-raw-yuv,width=,height=,framerate=/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=

For example:

gst-launch -e v4l2src ! 'video/x-raw-yuv,width=1280,height=720,framerate=30/1' ! nv_omx_h264enc quality-level=2 ! mp4mux ! filesink location=/home/room/family.mp4

 * Decode an H.264 file from disk and stream it out over the network:

gst-launch filesrc location=.mkv ! matroskademux ! queue ! h264parse ! nv_omx_h264dec ! ffmpegcolorspace ! 'video/x-raw-yuv, format=(fourcc)UYVY' ! rtpvrawpay mtu=1472 ! queue ! udpsink host= port=5004 loop=false

 * Examine the properties and settings of elements using gst-inspect:

gst-inspect nv_omx_h264enc

From within your application
Using the gstreamer-defined appsrc and appsink elements, it's possible to efficiently send application data into a local gstreamer pipeline running in the application's userspace. For example, using appsrc, a CUDA video-processing application could send its image buffers into gstreamer to be encoded, and then retrieve the H.264-encoded data from gstreamer using appsink. Code sample using nv_omx_h264enc/nv_omx_h264dec coming soon!
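Until an official sample is published, the flow described above can be sketched roughly as follows. This is an untested, minimal sketch assuming GStreamer 0.10 (the API generation the nv_omx_* elements target); the 640x480 resolution, I420 caps, and single gray test frame are illustrative placeholders, not values from this page, and on real hardware you would push your CUDA output buffers instead.

```c
/* Sketch: push raw I420 frames into nv_omx_h264enc via appsrc and pull
 * encoded H.264 buffers back out via appsink (GStreamer 0.10 API).
 * Build (assumed): gcc demo.c $(pkg-config --cflags --libs gstreamer-app-0.10)
 */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#include <string.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    /* appsrc feeds raw I420 frames in; appsink hands the H.264 bitstream back. */
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "appsrc name=src caps=\"video/x-raw-yuv, format=(fourcc)I420, "
        "width=(int)640, height=(int)480, framerate=(fraction)30/1\" "
        "! nv_omx_h264enc quality-level=2 ! appsink name=sink", &err);
    if (pipeline == NULL) {
        g_printerr("pipeline error: %s\n", err->message);
        return 1;
    }

    GstElement *src  = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Push one gray test frame; a real application would copy its own
     * (e.g. CUDA-produced) image data into the buffer instead. */
    guint size = 640 * 480 * 3 / 2;               /* I420 frame size */
    GstBuffer *frame = gst_buffer_new_and_alloc(size);
    memset(GST_BUFFER_DATA(frame), 0x80, size);
    gst_app_src_push_buffer(GST_APP_SRC(src), frame); /* takes ownership */
    gst_app_src_end_of_stream(GST_APP_SRC(src));

    /* Pull encoded buffers until EOS (pull returns NULL at end-of-stream). */
    GstBuffer *out;
    while ((out = gst_app_sink_pull_buffer(GST_APP_SINK(sink))) != NULL) {
        g_print("encoded buffer: %u bytes\n", GST_BUFFER_SIZE(out));
        gst_buffer_unref(out);
    }

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(src);
    gst_object_unref(sink);
    gst_object_unref(pipeline);
    return 0;
}
```

Note that because nv_omx_h264enc only exists on Jetson/L4T, this sketch can only run on the board itself; on a desktop you could substitute a software encoder such as x264enc to exercise the same appsrc/appsink pattern.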