ECE497 Project Gesture Based Control
Embedded Linux Class by Mark A. Yoder
Team members: Jake Bellis
Grading Template
Draft Feedback
Good to see you are getting this started. Needs much more work.
I'm using the following template to grade. Each slot is 10 points.
0 = Missing, 5=OK, 10=Wow!
00 Executive Summary
00 Installation Instructions
00 User Instructions
00 Highlights
00 Theory of Operation
00 Work Breakdown
00 Future Work
00 Conclusions
00 Demo
00 Late
Comments: I'm looking forward to seeing this.
Score: 10/100
Executive Summary
This project is meant to use gesture control to perform various functions. The project can accurately detect gestures using the Sparkfun Breakout for the Avago APDS-9960. As of now, the gestures the project can sense are left, right, up, and down. These gestures are used to control an etch-a-sketch game on an 8x8 LED Matrix.
Packaging
Installation Instructions
- Clone the project from https://github.com/JakeBellis/ECE497
- The code for this project is located in the project folder of the repository
- Install the required Node Package Manager (npm) packages
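A minimal sketch of the install steps on the BeagleBone, assuming the repository layout described above (the exact folder name and the dependencies listed in package.json are assumptions):

  git clone https://github.com/JakeBellis/ECE497
  cd ECE497/project
  npm install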
User Instructions
Instructions:
- Run the main script with Node.js
- Swipe your hand over the sensor in any direction to draw in that direction.
- The pixel adjacent to the current pixel will light up in the direction of the swipe.
- The sensor is not perfectly accurate in this configuration, so the detected direction may occasionally be wrong.
- Restart the program to clear the matrix.
Highlights
The project can control an Etch-a-Sketch game on the LED Matrix. Swiping in any direction causes the LED adjacent to the current pixel, in that direction, to light up. The purpose is to be able to draw a picture on the LED matrix without physically touching any hardware.
Theory of Operation
The software uses an interrupt to detect whether a new gesture was received by the sensor. This is set up using the command registers of the APDS-9960. Once the sensor detects that an object is over it, it stores proximity samples in its gesture FIFO and asserts the interrupt line; the software then reads the FIFO over I2C and compares the up, down, left, and right photodiode values to determine the direction of the swipe.
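A minimal sketch of the FIFO read and decode step, using the i2c-bus npm package and register addresses from the APDS-9960 datasheet. The package choice, the bus number, and the simplified direction heuristic are assumptions and not necessarily what the project uses:

  const i2c = require('i2c-bus');

  const APDS9960_ADDR = 0x39;  // 7-bit I2C address of the APDS-9960
  const GSTATUS = 0xAF;        // gesture status register (GVALID is bit 0)
  const GFLVL   = 0xAE;        // number of datasets waiting in the gesture FIFO
  const GFIFO_U = 0xFC;        // FIFO data: Up, Down, Left, Right at 0xFC-0xFF

  const bus = i2c.openSync(2); // bus number is an assumption (often 1 or 2 on the BeagleBone Black)

  // Drain the gesture FIFO and return a rough swipe direction, or null.
  function readGesture() {
    if ((bus.readByteSync(APDS9960_ADDR, GSTATUS) & 0x01) === 0) {
      return null;             // no valid gesture data yet
    }
    const level = bus.readByteSync(APDS9960_ADDR, GFLVL);
    const buf = Buffer.alloc(4);
    let first = null;
    let last = null;
    for (let i = 0; i < level; i++) {
      bus.readI2cBlockSync(APDS9960_ADDR, GFIFO_U, 4, buf); // one dataset: U, D, L, R
      if (i === 0) first = Buffer.from(buf);
      last = Buffer.from(buf);
    }
    if (!first || !last) return null;
    // Whichever photodiode pair changed most between the first and last
    // dataset gives the axis; the sign-to-direction mapping depends on how
    // the sensor is oriented, so it is an assumption here.
    const ud = (last[0] - last[1]) - (first[0] - first[1]);
    const lr = (last[2] - last[3]) - (first[2] - first[3]);
    if (Math.abs(ud) > Math.abs(lr)) return ud > 0 ? 'up' : 'down';
    return lr > 0 ? 'left' : 'right';
  }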
Work Breakdown
Interfacing with the APDS-9960 - Jake Bellis
This is the only part that is completely done as of now, as I had a really bad quarter. What I was able to accomplish is detecting whether a hand was swiped left, right, up, or down over the sensor. This took a lot of work, as the two main i2c packages that I tried initially (Bonescript and i2c) did not work.
Controlling an Etch-a-Sketch Game - Jake Bellis
The gesture recognition software can control an Etch-a-Sketch game on an 8x8 LED Matrix adapted from previous exercises. A new pixel is drawn whenever a gesture is detected: the pixel adjacent to the current pixel, in the direction of the swipe, is lit.
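A minimal sketch of the drawing logic, assuming a hypothetical drawPixel(x, y) helper that writes to the LED matrix driver (the helper name, the starting position, and the clamping behavior at the edges are assumptions):

  // Current cursor position on the 8x8 matrix.
  let cursor = { x: 3, y: 3 };

  // Move one pixel in the direction of the detected swipe, clamped to the
  // edges of the matrix, and light the new pixel.
  function handleGesture(direction) {
    const moves = { up: [0, -1], down: [0, 1], left: [-1, 0], right: [1, 0] };
    const [dx, dy] = moves[direction] || [0, 0];
    cursor.x = Math.min(7, Math.max(0, cursor.x + dx));
    cursor.y = Math.min(7, Math.max(0, cursor.y + dy));
    drawPixel(cursor.x, cursor.y); // hypothetical helper that updates the matrix
  }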
Future Work
With the time constraints, I was only able to detect up, down, left, and right gestures. I would like to be able to detect more complex gestures in the future. Additionally, I would like to do more complex tasks with gesture control that were not implemented in this project. I would also like to make a game using gestures which would require more accurately sensing hand motions.
Conclusions
I am happy that I got gesture-based control to work fairly well on the BeagleBone Black using the Avago APDS-9960. This was a complex sensor to use because it has many command registers. The current register settings work, but they are probably not the optimal configuration.