Testing Video Input of Hawkboard

Problem Description

OMAP L-138 kernels support the composite (camera) interface of the Hawkboard out of the box; however, there are three obstacles to using it:

  • The L-138 VPIF video input delivers camera data in a 2-plane YUV format, namely NV16.
  • The L-138 LCDC output, the VGA output of the Hawkboard, accepts display data in RGB565 format.
  • The L-138 has no hardware support for converting between NV16 and RGB565.

Due to these problems, if we want to see the camera input on the VGA output of the board, we need to perform the color conversion in software.
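The actual conversion code ships in the archive linked further down; purely to illustrate what this software conversion involves, here is a minimal NV16 to RGB565 routine. The function name and the fixed-point BT.601 coefficients are illustrative choices and are not taken from the Hawkboard sources.

#include <stddef.h>
#include <stdint.h>

/* NV16 layout: a full-resolution Y plane (width*height bytes) followed by an
   interleaved CbCr plane of the same size; one Cb/Cr pair is shared by two
   horizontally adjacent pixels (4:2:2). */

static inline uint16_t pack_rgb565(int r, int g, int b)
{
    if (r < 0) r = 0; else if (r > 255) r = 255;
    if (g < 0) g = 0; else if (g > 255) g = 255;
    if (b < 0) b = 0; else if (b > 255) b = 255;
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

void nv16_to_rgb565(const uint8_t *src, uint16_t *dst, int width, int height)
{
    const uint8_t *y_plane  = src;
    const uint8_t *uv_plane = src + (size_t)width * height;
    int row, col, k;

    for (row = 0; row < height; row++) {
        const uint8_t *y  = y_plane  + (size_t)row * width;
        const uint8_t *uv = uv_plane + (size_t)row * width;
        uint16_t *out     = dst + (size_t)row * width;

        for (col = 0; col < width; col += 2) {
            int u = uv[col]     - 128;    /* Cb, shared by the pixel pair */
            int v = uv[col + 1] - 128;    /* Cr, shared by the pixel pair */

            for (k = 0; k < 2; k++) {     /* two luma samples per pair */
                int c = y[col + k] - 16;
                int r = (298 * c + 409 * v + 128) >> 8;
                int g = (298 * c - 100 * u - 208 * v + 128) >> 8;
                int b = (298 * c + 516 * u + 128) >> 8;
                out[col + k] = pack_rgb565(r, g, b);
            }
        }
    }
}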

Using Off-the-shelf Software Components

GStreamer and MPlayer were tried on the board, and both failed to capture the camera input.

GStreamer

At the time of this writing, the GStreamer v4l2 plugin, v4l2src, doesn't support the NV16 format. Here is the output of 'gst-inspect v4l2src | grep NV':

format: NV12
format: NV21

NV16 uses 4:2:2 sampling whereas NV12 uses 4:2:0.

MPlayer

MPlayer should support NV16, but it currently does not work on the Hawkboard: MPlayer was unable to find a suitable color format for the camera.

Custom Solution

Since the camera could not be tested with off-the-shelf components, one of the following custom solutions can be used.

ARM only Solution

A simple V4L2 application was written, cross-compiled and tried on the board (a sketch of such a capture loop follows the list below). The results were:

  • Camera interface was functional.
  • Frame-rate was very low, maybe 3-4 fps.
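
Roughly, an ARM-only test of this kind has the following shape. The sketch assumes a 720x576 PAL camera delivered as NV16 on /dev/video0 and a 16 bpp /dev/fb0 whose line length equals the frame width, and it reuses the nv16_to_rgb565() routine sketched earlier; the real camera-test source in the archive may differ, and error handling is omitted for brevity.

#include <fcntl.h>
#include <stdint.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

void nv16_to_rgb565(const uint8_t *src, uint16_t *dst, int width, int height);

int main(void)
{
    int vfd = open("/dev/video0", O_RDWR);
    int ffd = open("/dev/fb0", O_RDWR);
    int w = 720, h = 576;

    /* Ask the VPIF capture driver for NV16 frames. */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = w;
    fmt.fmt.pix.height = h;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_NV16;
    ioctl(vfd, VIDIOC_S_FMT, &fmt);

    /* A single mmap'ed capture buffer is enough for a simple test. */
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count = 1;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    ioctl(vfd, VIDIOC_REQBUFS, &req);

    struct v4l2_buffer buf;
    memset(&buf, 0, sizeof(buf));
    buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index = 0;
    ioctl(vfd, VIDIOC_QUERYBUF, &buf);

    uint8_t  *frame = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                           MAP_SHARED, vfd, buf.m.offset);
    uint16_t *fb    = mmap(NULL, w * h * 2, PROT_READ | PROT_WRITE,
                           MAP_SHARED, ffd, 0);

    ioctl(vfd, VIDIOC_QBUF, &buf);
    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(vfd, VIDIOC_STREAMON, &type);

    for (;;) {
        ioctl(vfd, VIDIOC_DQBUF, &buf);    /* wait for a captured frame  */
        nv16_to_rgb565(frame, fb, w, h);   /* software color conversion  */
        ioctl(vfd, VIDIOC_QBUF, &buf);     /* return the buffer to VPIF  */
    }
}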

ARM + DSP Solution

To increase the frame rate, the following existing Codec Engine examples were modified (a schematic of the DSP-side change follows the list):

  • video_copy application : This is the ARM-side application that captures the camera data and shows it on the VGA display.
  • viddec_copy codec : This is the DSP codec doing the actual NV16 <-> RGB565 conversion.
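
Conceptually, the stock viddec_copy example just copies its input buffer to its output buffer; the modification replaces that copy with the NV16 to RGB565 conversion, so the DSP does the per-pixel work instead of the ARM. The following is only a schematic of that idea, reusing the nv16_to_rgb565() routine sketched earlier with a hard-coded 720x576 frame size; the entry point name and argument list are simplified and do not match the real IVIDDEC interface used by the archive's codec.

#include <xdc/std.h>
#include <ti/xdais/dm/xdm.h>

#define FRAME_WIDTH  720
#define FRAME_HEIGHT 576

extern void nv16_to_rgb565(const unsigned char *src, unsigned short *dst,
                           int width, int height);

/* Schematic only: invoked on the DSP for every frame the ARM-side video_copy
   application submits.  inBufs->bufs[0] holds the NV16 frame from VPIF,
   outBufs->bufs[0] receives the RGB565 frame shown on the LCDC/VGA output. */
XDAS_Int32 viddec_copy_convert(XDM_BufDesc *inBufs, XDM_BufDesc *outBufs)
{
    nv16_to_rgb565((const unsigned char *)inBufs->bufs[0],
                   (unsigned short *)outBufs->bufs[0],
                   FRAME_WIDTH, FRAME_HEIGHT);
    return 0;
}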

This ARM + DSP solution gives a nice 25 fps camera test on the Hawkboard. You can find the binaries and source code at http://code.google.com/p/hawkboard/downloads/detail?name=vpif_example.tar.gz&can=2&q=

Using Provided Binaries

Assumptions:

When you download the provided vpif_example.tar.gz from the Hawkboard download page at code.google.com and extract it, you will see the following files:

  • binaries
    • arm+dsp
      • all.x674
      • app_remote.xv5T
    • arm+only
      • camera-test
  • sources
    • arm+dsp
      • camera.patch
    • arm+only
      • .
      • .
      • .
  • uImage
Note : There might be a TVP5146 issue with the Hawkboard, where the TVP5146 does not detect the camera input and throws input setting errors.
       To overcome this, you can patch the kernel with the following patch, which can also be downloaded from http://groups.google.com/group/hawkboard/browse_thread/thread/723b11a40c287a4/e2948904ae45f154?lnk=gst

--- kernel_source_original/drivers/media/video/tvp514x.c        2010-03-18 14:33:44.000000000 +0200
+++ kernel_source/drivers/media/video/tvp514x.c 2010-05-07 00:46:05.284494625 +0300
@@ -741,8 +741,9 @@
                        break;
        }
-       if ((current_std == STD_INVALID) || (try_count < 0))
-               return -EINVAL;
+       if ((current_std == STD_INVALID) || (try_count < 0)) {
+               current_std = STD_PAL_BDGHIN;
+       }
        decoder->current_std = current_std;
        decoder->input = input;

Testing ARM only Code

Copy the camera-test ARM-only binary somewhere into your filesystem on the board and run it like any other application:

./camera-test

No parameters are needed; the program will open the /dev/video0 and /dev/fb0 nodes, perform the color conversion, and you should see the results. You can find the related source code in the sources/arm-only directory of the archive.

Testing ARM-DSP Code

First of all, make sure that your CMEM and DSPLINK modules are loaded. If you are using an Angstrom file-system, make sure that your kernel is booted with the mem=34M option so that the CMEM module is loaded automatically. You can also decrease the necessary kernel buffer size with the vpif_capture.ch0_bufsize=1658880 parameter. Here is how to set the boot arguments under U-Boot:

setenv bootargs 'console=ttyS2,115200n8 ubi.mtd=filesystem root=ubi0:hawkboard-rootfs rootfstype=ubifs mem=34M vpif_capture.ch0_bufsize=1658880'

If you need to load the DSP modules by hand, here are the necessary commands:

$insmod cmemk.ko phys_start=0xC2200000 phys_end=0xC3200000 pools=1x5250000,3x1048576,3x829440,1x256000,4x131072
$insmod dsplinkk.ko

Then copy both all.x674 and app_remote.xv5T binaries to the same folder and run the application with the following command:

./app_remote.xv5T all.x674

Now you should see 25 fps video on your screen. Unfortunately, if you want to compile this example on your own, you will need the whole Codec Engine tree. You will find a patch containing the necessary changes in the sources/arm+dsp folder, along with a Makefile. If you are not using the DVSDK and you have a working OpenEmbedded tree around, you can use the provided Makefile to compile this example.