ECE597 Fall 2014 LED Helmet

Team members: Asa Bromenschenkel, Caio Silva

Grading Template
I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5 = OK, 10 = Wow!

 00 Executive Summary
 00 Installation Instructions
 00 User Instructions
 00 Highlights
 00 Theory of Operation
 00 Work Breakdown
 00 Future Work
 00 Conclusions
 00 Demo
 00 Late

Comments: I'm looking forward to seeing this.

Score: 10/100


Executive Summary
This project will use LEDs to create a wearable helmet similar to those worn by the group Daft Punk. We will be using Adafruit's NeoPixel LEDs to accomplish this.

We are currently in the design phase of the project. We will update the page as the project develops.

Packaging
To be described later.

Installation Instructions

 * 1) Plug the BeagleBone into the host computer via USB.
 * 2) Wire the first LED on the string. Use the P_22 I/O on the BeagleBone for the data line.
 * 3) Run the following line to install the TwitterAPI package:
 * pip install TwitterAPI
 * 4) Also wire a SparkFun ADXL335 3-axis accelerometer to the BeagleBone, following the diagram.
 * 5) Follow the User Instructions to run the code.
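Once the TwitterAPI package is installed, it can be used to turn incoming tweets into helmet colors. The sketch below is an illustration, not the project's actual code: the credential strings, the tracked keyword `ledhelmet`, and the color-word table are all placeholder assumptions.

```python
# Sketch: map tweet text to an RGB color for the helmet.
# Credentials, the tracked keyword, and the color table are placeholders.

def color_from_tweet(text):
    """Return the RGB triple for the first recognized color word in a tweet."""
    colors = {
        "red": (255, 0, 0),
        "green": (0, 255, 0),
        "blue": (0, 0, 255),
        "white": (255, 255, 255),
    }
    for word in text.lower().split():
        if word in colors:
            return colors[word]
    return (0, 0, 0)  # no color word found: LEDs off

if __name__ == "__main__":
    # Requires `pip install TwitterAPI` and real Twitter credentials.
    from TwitterAPI import TwitterAPI
    api = TwitterAPI(consumer_key="...", consumer_secret="...",
                     access_token_key="...", access_token_secret="...")
    # Stream tweets containing a keyword and print the color each one selects.
    for tweet in api.request("statuses/filter", {"track": "ledhelmet"}):
        print(color_from_tweet(tweet.get("text", "")))
```

Keeping the text-to-color mapping in a plain function makes it easy to test without a live Twitter connection.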

User Instructions
Using this project requires a USB connection to a host computer and three terminals. From the first, navigate to the LEDscape directory (cloned from https://github.com/Yona-Appletree/LEDscape.git) and run the following commands to set up the server:


 * echo BB-BONE-PRU-01 >/sys/devices/bone_capemgr.9/slots
 * make
 * sudo systemctl enable /path/to/LEDscape/ledscape.service
 * sudo ./run-ledscape

Then in the other terminal, navigate to the openpixelcontrol directory (cloned from https://github.com/zestyping/openpixelcontrol.git) and run the following line:


 * python_clients/<pattern>.py

where <pattern> is the name of the file the pattern is stored in.
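The pattern scripts talk to the LEDscape server using the Open Pixel Control protocol: a 4-byte header (channel, command 0 for "set pixel colors", and a 16-bit big-endian data length) followed by one RGB byte triple per pixel. The sketch below builds such a message from scratch; the port 7890 is OPC's conventional default, and the pixel values are arbitrary examples.

```python
import socket

def opc_set_pixels_message(pixels, channel=0):
    """Build an Open Pixel Control 'set pixel colors' message.

    Header: channel byte, command byte (0 = set pixels), 16-bit
    big-endian data length; then one (r, g, b) triple per pixel.
    """
    data = bytearray()
    for r, g, b in pixels:
        data += bytes((r, g, b))
    header = bytes((channel, 0, len(data) >> 8, len(data) & 0xFF))
    return header + bytes(data)

if __name__ == "__main__":
    # Send 8 red pixels to an OPC server on the default port.
    msg = opc_set_pixels_message([(255, 0, 0)] * 8)
    with socket.create_connection(("localhost", 7890)) as s:
        s.sendall(msg)
```

In practice the `opc.py` client shipped in openpixelcontrol's python_clients directory wraps this same message format.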

In a separate terminal, navigate to the accelerometer directory (cloned from https://github.com/caiocvsilva/ECE597-2.git) and run the following line:

 * node ac.js
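The accelerometer script maps head tilt to a display pattern. A minimal sketch of that mapping logic is shown below (in Python for consistency with the other examples; the project's actual script is ac.js). The axis scaling to g units, the dead-zone threshold, and the pattern names are assumptions for illustration.

```python
def pattern_from_tilt(x, y, z, threshold=0.35):
    """Pick a display pattern from ADXL335 axis readings.

    x, y, z are gravity components in g (roughly -1.0 .. 1.0 after the
    raw analog values are scaled). `threshold` is an assumed dead zone
    so small head movements don't flicker between patterns.
    """
    if x > threshold:
        return "tilt_right"
    if x < -threshold:
        return "tilt_left"
    if y > threshold:
        return "tilt_forward"
    if y < -threshold:
        return "tilt_back"
    return "level"
```

The dead zone keeps the helmet's display stable while the wearer is roughly upright, switching patterns only on a deliberate tilt.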

Highlights
Beyond the individual concepts involved (the LEDscape library, the NeoPixel LEDs, the Twitter API, and accelerometer data on the BeagleBone), one of the most important characteristics of this project is the interface between all of these components. It allows us to create an LED helmet that interacts with the user, through tweets or through movements of the wearer's head, changing the color or the pattern shown on the helmet.

Theory of Operation
To be described later.

Work Breakdown

 * This project was separated into small blocks that could be executed easily and later combined to finalize the project.
 * The first block was understanding how LEDscape and openpixelcontrol work with the LED grid, displaying different patterns and colors on the grid.
 * After that, two blocks were studied separately: the interface between the Twitter API and the LEDscape library, and the use of an accelerometer to change the patterns shown on the grid.
 * Once these two blocks were working, it was time to interface them. For the final part of the project, a helmet was chosen and the programs and libraries were adapted to work with two grids.
 * The helmet was prepared and connected to the BeagleBone ecosystem, and the LED Helmet was finalized.

Future Work

 * One idea for future work is to add a sensor to the ecosystem that detects someone getting close to the helmet, and to draw a pattern that responds to this data.
 * As an example, two big eyes could be drawn that follow the person from side to side, even crossing if he/she is too close.
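The eye-following idea above could be sketched as a small mapping from the sensed position to pupil columns on the LED grid. Everything here is hypothetical: the grid width, the distance scale, and the convergence threshold are invented for illustration.

```python
def eye_pupils(person_x, distance, width=16, near=0.5):
    """Return (left_pupil, right_pupil) column indices on an LED grid.

    person_x in [0, 1] is the person's horizontal position as reported
    by the (hypothetical) sensor; distance is their range in meters.
    The pupils track the person, and converge ('cross') when the
    person is closer than `near`.
    """
    center = int(person_x * (width - 1))
    offset = 3 if distance >= near else 1  # eyes converge when close
    left = max(0, center - offset)
    right = min(width - 1, center + offset)
    return left, right
```

Clamping to the grid edges keeps the pupils on-screen even when the person stands at the far left or right of the sensor's field of view.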

Conclusions
These are some photos of the result of the project:



And a video demonstrating the result:

https://www.youtube.com/watch?v=xqu2p-e1CKA