ECE497 Project Gesture Based Control

[[Category:ECE497 |Px]]
[[Category:ECE497Fall2016 |Px]]

{{YoderHead}}
  
 
Team members: [[user:Jbellis|Jake Bellis]]
 
 
== Grading Template ==
 
=== Draft Feedback ===
Good to see you are getting this started.  Needs much more work.
 
I'm using the following template to grade.  Each slot is 10 points.
 
 
0 = Missing, 5=OK, 10=Wow!
 
 00 Executive Summary
 00 Installation Instructions
 00 User Instructions
 00 Highlights
 00 Theory of Operation
 00 Work Breakdown
 00 Future Work
 00 Conclusions
 00 Demo
 00 Late
 Comments: I'm looking forward to seeing this.

Score:  10/100
 
== Executive Summary ==
 
  
This project is meant to use gesture control to perform various functions.  The project can accurately detect gestures using the SparkFun breakout for the Avago APDS-9960.  As of now, the gestures the project can sense are left, right, up, and down.  These gestures are used to control an Etch-a-Sketch game on an 8x8 LED Matrix.
 
== Packaging ==
*[https://www.sparkfun.com/products/12787 SparkFun RGB and Gesture Sensor - APDS-9960]
*[https://learn.adafruit.com/adafruit-led-backpack/bi-color-8x8-matrix Bi-Color 8x8 Matrix]
  
Both of these devices are connected to the second I2C bus on the BeagleBone: SCL is on pin P9_20 and SDA is on pin P9_19.  A quick wiring check is sketched below.
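The following sketch is illustrative, not code from the project repository.  It assumes the second bus appears as /dev/i2c-2 and that both devices use their factory-default addresses (0x39 for the APDS-9960 and 0x70 for the Adafruit matrix backpack); if the wiring is right, both addresses should acknowledge.

<pre>
// Hypothetical wiring check with the i2c-bus package.
// Assumes default addresses: APDS-9960 at 0x39, HT16K33 backpack at 0x70.
const i2c = require('i2c-bus');

const bus = i2c.openSync(2);              // /dev/i2c-2 (pins P9_19/P9_20)
for (const addr of [0x39, 0x70]) {
  try {
    bus.receiveByteSync(addr);            // any ACK means the device is there
    console.log('0x' + addr.toString(16) + ' responded');
  } catch (e) {
    console.log('0x' + addr.toString(16) + ' not found');
  }
}
bus.closeSync();
</pre>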
  
== Installation Instructions ==
  
  
*Install the project from [https://github.com/JakeBellis/ECE497 https://github.com/JakeBellis/ECE497]
**The project code is located in the project folder of the repository.
  
*Install the required Node Package Manager (npm) package:
**'''npm install i2c-bus'''
  
*Set up the hardware on the BeagleBone as described in the Packaging section.
  
No other changes to the BeagleBone Black are needed for this project to run.  A quick way to verify the installation is sketched below.
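As a sanity check (illustrative code, not from the repository), the APDS-9960 datasheet gives 0x92 as the device-ID register with an expected value of 0xAB, so a one-off read can confirm both the npm install and the wiring:

<pre>
// Hypothetical install check: read the APDS-9960 device-ID register.
// Per the datasheet, register 0x92 should return 0xAB.
const i2c = require('i2c-bus');

const bus = i2c.openSync(2);
const id = bus.readByteSync(0x39, 0x92);
console.log(id === 0xab ? 'APDS-9960 found' : 'unexpected ID 0x' + id.toString(16));
bus.closeSync();
</pre>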
 
== User Instructions ==
 
  
Instructions:
*Run the project's main script with node.js.
*Swipe your hand over the sensor in any direction to draw in that direction.
**The pixel adjacent to the current pixel will light up, depending on the direction of the swipe.
**The sensor is not 100% accurate in this configuration, so the detected direction will occasionally be wrong.
*Restart the program to clear the matrix; a sketch of what clearing amounts to follows this list.
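For illustration (a sketch under the assumption that the Adafruit backpack sits at its default address 0x70 on bus 2, not code from the repository), clearing the display amounts to zeroing the HT16K33's 16 bytes of display RAM:

<pre>
// Hypothetical clear: zero all 16 bytes of HT16K33 display RAM.
const i2c = require('i2c-bus');

const bus = i2c.openSync(2);
for (let addr = 0; addr < 16; addr++) {
  bus.writeByteSync(0x70, addr, 0x00);  // one byte per colour per row
}
bus.closeSync();
</pre>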
 
  
 
== Highlights ==
 
  
The project can control an Etch-a-Sketch game on the LED Matrix.  Swiping in any direction lights the pixel adjacent to the current one in that direction.  The purpose is to be able to draw a picture on the LED matrix without physically touching any hardware.
  
 
 
== Theory of Operation ==
 
  
The software uses an interrupt to detect whether a new gesture was received by the sensor.  This is set up using the command registers of the APDS-9960.  Once the sensor detects an object above it for four integration cycles, it generates an interrupt that calls the gesture-sensing function.  That function checks whether a new gesture was received; if so, the corresponding pixel on the LED Matrix lights up.  The algorithm used to detect gestures is based on an algorithm SparkFun developed for the Arduino.
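The sketch below illustrates the approach rather than reproducing the project code: the register addresses and bit values come from the APDS-9960 datasheet, the direction test is a simplification of SparkFun's Arduino algorithm (which compares the first and last FIFO datasets rather than summing them all), and the sign of each direction depends on how the sensor is oriented.

<pre>
// Illustrative interrupt-style gesture detection with the i2c-bus package.
// Register addresses and values are from the APDS-9960 datasheet.
const i2c = require('i2c-bus');

const ADDR    = 0x39; // APDS-9960 I2C address
const ENABLE  = 0x80; // engine enable bits (PON, PEN, GEN)
const GPENTH  = 0xA0; // proximity level that starts the gesture engine
const GCONF4  = 0xAB; // gesture mode / gesture interrupt control
const GFLVL   = 0xAE; // number of datasets waiting in the gesture FIFO
const GSTATUS = 0xAF; // bit 0 (GVALID) set while gesture data is available
const GFIFO_U = 0xFC; // FIFO data: U, D, L, R bytes at 0xFC..0xFF

const bus = i2c.openSync(2);
bus.writeByteSync(ADDR, GPENTH, 40);   // enter gesture mode when a hand is near
bus.writeByteSync(ADDR, GCONF4, 0x02); // enable the gesture interrupt (GIEN)
bus.writeByteSync(ADDR, ENABLE, 0x45); // power on + proximity + gesture engines

// Called when the interrupt fires (or from a polling loop): drain the FIFO
// and report which axis saw the larger change.
function readGesture() {
  if ((bus.readByteSync(ADDR, GSTATUS) & 0x01) === 0) return null; // no data
  const datasets = bus.readByteSync(ADDR, GFLVL);
  const buf = Buffer.alloc(4);
  let ud = 0, lr = 0;
  for (let i = 0; i < datasets; i++) {
    bus.readI2cBlockSync(ADDR, GFIFO_U, 4, buf); // one U,D,L,R dataset
    ud += buf[0] - buf[1];                       // up minus down photodiode
    lr += buf[2] - buf[3];                       // left minus right photodiode
  }
  // Swap the signs if directions come out mirrored for your orientation.
  if (Math.abs(ud) > Math.abs(lr)) return ud > 0 ? 'up' : 'down';
  return lr > 0 ? 'left' : 'right';
}
</pre>

A return of null means no complete gesture was available; in this project the returned direction would then drive the Etch-a-Sketch drawing code described under Work Breakdown.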
  
 
== Work Breakdown ==
 
 
Interfacing with the APDS-9960 - Jake Bellis
 
  
So far I am able to detect whether a hand was swiped left, right, up, or down over the sensor.  This took a lot of work, as the two main i2c packages that I tried initially (Bonescript and i2c) did not set the command registers correctly.  The problem was solved by switching to the i2c-bus package.
 
Controlling an Etch-a-Sketch Game - Jake Bellis
 
The gesture-recognition software can control an Etch-a-Sketch game on an 8x8 LED Matrix adapted from previous exercises.  A new pixel is drawn whenever a new gesture is sensed by the gesture sensor.  A sketch of the matrix-drawing side is shown below.
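For illustration (again a sketch rather than the repository code, assuming the Adafruit backpack at its default address 0x70; the move function name is hypothetical), the matrix side amounts to initializing the HT16K33 driver once and then setting one bit per lit pixel as gestures arrive:

<pre>
// Illustrative Etch-a-Sketch drawing on the bi-color 8x8 backpack (HT16K33).
const i2c = require('i2c-bus');

const bus = i2c.openSync(2);
const MATRIX = 0x70;             // backpack's default I2C address

bus.sendByteSync(MATRIX, 0x21);  // start the HT16K33 oscillator
bus.sendByteSync(MATRIX, 0x81);  // display on, blinking off
bus.sendByteSync(MATRIX, 0xE7);  // medium brightness

// Each row has a green byte at even RAM addresses and a red byte at odd ones.
const green = Buffer.alloc(8);   // shadow copy of the green plane

let cx = 3, cy = 3;              // Etch-a-Sketch cursor

function move(gesture) {         // gesture: 'left' | 'right' | 'up' | 'down'
  if (gesture === 'left')  cx = Math.max(0, cx - 1);
  if (gesture === 'right') cx = Math.min(7, cx + 1);
  if (gesture === 'up')    cy = Math.max(0, cy - 1);
  if (gesture === 'down')  cy = Math.min(7, cy + 1);
  green[cy] |= 1 << cx;                          // light the new pixel
  bus.writeByteSync(MATRIX, 2 * cy, green[cy]);  // update the row's green byte
}
</pre>

Keeping a shadow copy of the display in memory avoids having to read the HT16K33's RAM back over I2C before each update.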
  
 
== Future Work ==
 
  
With the time constraints, I was only able to detect up, down, left, and right gestures.  I would like to detect more complex gestures in the future, and to use gesture control for more complex tasks than were implemented in this project.  I would also like to make a game controlled by gestures, which would require sensing hand motions more accurately; Snake is one game that I think would work well with the current hardware.  Beyond extending the current configuration, gesture control could also be used to interact with elements of a server, such as uploading and downloading data or choosing which elements of a webpage to interact with.
  
 
== Conclusions ==
 
  
I am happy that I got gesture-based control to work fairly well on the BeagleBone Black using the Avago APDS-9960.  This was a complex sensor to use because it has many command registers; the current settings work, but they are probably not the best configuration.  With additional time, I would have tuned the algorithm and the command-register values to sense gestures more accurately.  I would have liked to accomplish more on this project, but I am satisfied with what I did accomplish, and I think it can lead to more work with the gesture sensor in the future.
  
 
{{YoderFoot}}
 