ECE497 Projecting with Sense

From eLinux.org

Latest revision as of 14:03, 11 November 2011

Team members: [[user:lesterwm|Mike Lester]]

Executive Summary

TI's BeagleBoard and Pico Projector are a powerful combination. Using the Pico Projector to display output from the BeagleBoard yields a large picture from a very small package. However, the Pico's small size and weight mean that it does not often sit level, resulting in a skewed projection image. This project aims to use a 3-axis accelerometer mounted on the BeagleBoard to auto-correct this distortion, resulting in a correct projection even when the Pico Projector is not lying flat.

Currently, the BeagleBoard can read data from the accelerometer and compute the orientation of itself and the Pico. However, due to the difficulty of getting the 3D SDK running on the host computer, the front-end that would use this data to auto-correct the projection is not yet working.

Once that hurdle is cleared, it should be relatively easy to use OpenGL ES to de-skew the projection in real time. Another problem is handling a distortion known as keystoning, which results from rotations that are not perpendicular to the projection surface. In order to account for and correct this, we would need to know the distance between the Pico and the projection surface. This can be circumvented by having the user manually hold the Pico in the correct position, "marking" that position (perhaps by pressing the User button), and then placing the Pico on its resting surface.

Installation Instructions

In order to install the project:

  • First, acquire a 3-axis accelerometer. We used an ADXL345 on a breakout board from SparkFun.

NOTE: the code for this project is hardware-specific. In order to use it, you'll need an ADXL345.

  • Connect the ADXL345 to the BeagleBoard using the i2c interface. We did this by taking advantage of the prototyping area on the BeagleBoard Trainer which exposes the i2c interface.
  • Compile on the BeagleBoard with:
gcc ADXL345.c demo.c -o demo

User Instructions

To run the project, simply run the executable we built at the end of the installation phase:

./demo

One of the highlights of this project is that it shows how easy it is to add extra devices to the already powerful BeagleBoard. It integrates the Pico Projector, a 3-axis accelerometer, the Beagle Trainer, and the BeagleBoard itself. The Beagle and Pico's size and portability make them great candidates for extra sensory peripherals like an accelerometer, and the Beagle Trainer makes prototyping with them easy. This project is just a single example of what can be done when the Beagle can sense its environment.

Theory of Operation

The Beagle Trainer is connected to the Beagle by its expansion header. The Trainer exposes the pins of the Beagle's i2c interface, which we connect to the Accelerometer. The accelerometer is constantly measuring and outputting its data. We poll the chip as often as necessary using the i2c bus. By performing a multi-byte read, we can pull the orientation data for all 3 axes at once, resulting in what is essentially a normal vector for the BeagleBoard and Pico Projector.

Currently, that normal vector is simply printed when the program is run. In the future, the program will be more complex. OpenGL will be used to display a rectangle (which may have whatever we like on it, e.g. the desktop). When the normal vector changes, OpenGL will rotate the rectangle so that the projected image remains steady.

Work Breakdown

Since this is a solo project, I'll be doing the entirety of the work myself.


If we combined the accelerometer with a depth sensor, such as the depth cameras on the Kinect or a laser and optical recognition system, this project could be improved to handle all kinds of projection deformation. With that extra depth information, one could essentially implement perfect, real-time digital keystone correction.

The most interesting thing about this project, in my opinion, is the implications it has for the Pico Projector. Since the target use for the Pico seems to be hand-held projection, it would be very helpful to have a method of reducing jittering and skew. This project aims to show how an accelerometer can account for skew, but it could be extended to handle jittering as well. If the Pico leaves a small border of dead space around the projected image, acceleration data could be used to translate the image in order to negate the jitters of the user's hand.

In fact, the accelerometer chip is so small that I believe it could be integrated directly into the Pico. I also believe that this project could be implemented on the Pico hardware without too much trouble. That would yield a Pico Projector that could auto-correct its projected image in hardware, a much cleaner implementation that would also maximize the possible resolution (since our implementation is entirely software-based, we are forced to reduce the resolution of the image when rotating it).