ECE597 Fall2014 LED Helmet

Team members: Asa Bromenschenkel, Caio Silva

Grading Template
I'm using the following template to grade. Each slot is 10 points. 0 = Missing, 5=OK, 10=Wow!

 00 Executive Summary
 00 Installation Instructions
 00 User Instructions
 00 Highlights
 00 Theory of Operation
 00 Work Breakdown
 00 Future Work
 00 Conclusions
 00 Demo
 00 Late

Comments: I'm looking forward to seeing this.

Score: 10/100


Executive Summary
This project mounts LED arrays on a wearable headpiece mimicking those used by the group Daft Punk. Along with displaying various patterns, the arrays can also respond to an accompanying accelerometer and to interactions with Twitter.

The project uses three different libraries to implement its features. The LEDscape library starts a server that receives pixel data over the network, translates it, and sends it to the LED arrays. openpixelcontrol is used to create the patterns that are sent to LEDscape. Finally, the TwitterAPI library interacts with the user's tweets so they can be displayed on the arrays.
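The data path described above can be sketched at the protocol level. Open Pixel Control (the protocol LEDscape's server speaks) frames each pixel buffer as a 4-byte header (channel, command 0 for "set pixel colors", 16-bit big-endian payload length) followed by raw RGB bytes. A minimal sketch, assuming LEDscape is listening on its default OPC port 7890; the helper names are our own, not part of either library:

```python
import socket
import struct

def build_opc_message(channel, pixels):
    """Frame a list of (r, g, b) tuples as an Open Pixel Control
    'set pixel colors' message (command 0)."""
    data = bytearray()
    for r, g, b in pixels:
        data += bytes((r & 0xFF, g & 0xFF, b & 0xFF))
    # Header: channel byte, command byte, payload length as big-endian uint16.
    header = struct.pack(">BBH", channel, 0, len(data))
    return header + bytes(data)

def send_frame(pixels, host="localhost", port=7890, channel=0):
    """Push one frame to a running LEDscape/OPC server (assumed at :7890)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(build_opc_message(channel, pixels))

if __name__ == "__main__":
    # Two 10x10 arrays -> 200 pixels; fill them all with dim blue.
    frame = [(0, 0, 64)] * 200
    try:
        send_frame(frame)
    except OSError:
        pass  # no server running; the framing above is still valid
```

In practice the bundled openpixelcontrol clients build these messages for you; the sketch only shows what crosses the wire.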

We currently have a working prototype, which can be seen in the Conclusions section.

Packaging
Two 10x10 LED arrays have been mounted onto the front of a baseball hat just above the brim. These arrays were constructed on the Rose-Hulman campus and include small holes in the corners which were used to sew the arrays to the hat.

A communications cable was created to allow the helmet to range away from the BeagleBone Black. One end has male header pins to connect to the BeagleBone; the other end has female header pins so the arrays and accelerometer can be connected and disconnected without being permanently attached.

Because we do not have a functioning WiFi adapter to give the BeagleBone internet access when not attached to a host computer, we have not packaged the project in a portable manner. It remains attached to the prototyping breadboard used for class.

Installation Instructions

 * 1) Plug the BeagleBone into the host computer via USB.
 * 2) Wire the first NeoPixel LED array's data wire to P_22 on the BeagleBone, and the second LED array's data wire to P_21.
 * 3) Run the following line to install the TwitterAPI library:
 * pip install TwitterAPI
 * 4) Wire a SparkFun ADXL335 3-axis accelerometer to the BeagleBone following the diagram.
 * 5) Follow the User Instructions to run the code.
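Once the TwitterAPI library from step 3 is installed, the connection can be exercised along these lines. This is a hedged sketch, not the project's actual script: the four credential strings are placeholders you must replace with your own keys from Twitter's developer portal, and `newest_tweet_text` is a hypothetical helper we name here for illustration.

```python
def newest_tweet_text(statuses):
    """Return the text of the most recent status in a timeline response.
    (Hypothetical helper -- pulls the field a pattern script would render.)"""
    if not statuses:
        return ""
    return statuses[0].get("text", "")

if __name__ == "__main__":
    try:
        from TwitterAPI import TwitterAPI
    except ImportError:
        TwitterAPI = None  # library not installed; see step 3 above
    if TwitterAPI is not None:
        # Placeholder credentials -- substitute your own application's keys.
        api = TwitterAPI("CONSUMER_KEY", "CONSUMER_SECRET",
                         "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
        r = api.request("statuses/user_timeline", {"count": 1})
        print(newest_tweet_text(list(r)))
```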

User Instructions
Using this project requires a USB connection to a host computer and two terminals. In the first, navigate to the LEDscape directory (cloned from https://github.com/Yona-Appletree/LEDscape.git) and run the following commands to set up the server:


 * echo BB-BONE-PRU-01 >/sys/devices/bone_capemgr.9/slots
 * make
 * sudo systemctl enable /path/to/LEDscape/ledscape.service
 * sudo ./run-ledscape

Running the Twitter Connection:
To run the interaction between the Beagle and Twitter, in the other terminal navigate to the openpixelcontrol directory (cloned from https://github.com/zestyping/openpixelcontrol.git) and run the following line:


 * python_clients/<pattern>.py

where <pattern> is the name of the file the pattern is stored in.
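A pattern file of the kind referenced above might look like the following sketch. It builds 200-pixel frames (two 10x10 grids) and, when run on the BeagleBone, streams them through openpixelcontrol's bundled `opc` client (`python_clients/opc.py`). The rainbow math is purely illustrative, not one of the project's actual patterns.

```python
import colorsys
import time

NUM_PIXELS = 200  # two 10x10 grids

def make_frame(t):
    """One frame of a moving rainbow: hue shifts along the strip over time."""
    frame = []
    for i in range(NUM_PIXELS):
        hue = (i / NUM_PIXELS + t * 0.1) % 1.0
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame

if __name__ == "__main__":
    try:
        import opc  # shipped with openpixelcontrol in python_clients/
    except ImportError:
        opc = None
    if opc is not None:
        client = opc.Client("localhost:7890")  # LEDscape's default OPC port
        while True:
            client.put_pixels(make_frame(time.time()))
            time.sleep(1 / 30)  # roughly 30 frames per second
```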

Running the accelerometer:
If you wish to use the accelerometer, navigate to the accelerometer directory (cloned from https://github.com/caiocvsilva/ECE597-2.git) and run the following line:


 * node ac.js
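The accelerometer side amounts to reading the ADXL335's three analog axes through the BeagleBone's ADC and mapping tilt onto a color or pattern change. The following is a minimal sketch of that mapping, with several loudly labeled assumptions: the sysfs IIO paths are the conventional BeagleBone ones and may differ on your image, and the zero-g offset and sensitivity figures are nominal datasheet values that must be calibrated for your wiring.

```python
def raw_to_g(raw, vref=1.8, adc_max=4095, zero_g_v=0.85, sens_v_per_g=0.3):
    """Convert a raw 12-bit ADC reading to acceleration in g.
    zero_g_v and sens_v_per_g are nominal ADXL335 figures and depend on
    how the sensor is divided down to the 1.8 V ADC -- calibrate them."""
    volts = raw / adc_max * vref
    return (volts - zero_g_v) / sens_v_per_g

def tilt_to_color(x_g, y_g):
    """Map tilt on two axes to an (r, g, b) color: level sits at mid-gray/blue,
    tilting on one axis shifts red, the other shifts green. (Illustrative.)"""
    clamp = lambda v: max(0, min(255, int(v)))
    r = clamp((x_g + 1.0) * 127.5)
    g = clamp((y_g + 1.0) * 127.5)
    return (r, g, 128)

if __name__ == "__main__":
    # Assumed sysfs paths for the BeagleBone ADC inputs AIN0-AIN2.
    import glob
    for path in sorted(glob.glob(
            "/sys/bus/iio/devices/iio:device0/in_voltage[0-2]_raw")):
        with open(path) as f:
            print(path, raw_to_g(int(f.read())))
```

The project's own `ac.js` does the equivalent reading in Node; this sketch only shows the conversion and mapping steps in one place.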

Highlights
Beyond the individual concepts behind the LEDscape library, openpixelcontrol, the Twitter API, and accelerometer data on the BeagleBone, one of the most important characteristics of this project is the interface between all of these components. Together they create an LED helmet that interacts with the user, through tweets or through movements of the user's own head, changing the color of or the content shown on the helmet.

Theory of Operation
To be described later.

Work Breakdown

 * This project was separated into small blocks that could be executed easily and later combined to finalize the project.
 * The first block was understanding how LEDscape and openpixelcontrol work with the LED grid, displaying different patterns and colors across it.
 * After that, two further blocks were studied: interfacing the Twitter API with the LEDscape library, and using an accelerometer to change the patterns shown on the grid.
 * Once those two blocks were working, it was time to interface them. For the final part of the project, a helmet was chosen and the programs and libraries were adapted to work with two grids.
 * The helmet was prepared and connected to the BeagleBone ecosystem, finalizing the LED Helmet.

Future Work

 * One idea for future work is to add a sensor to the ecosystem that lets the helmet track someone who comes close, drawing a pattern that responds to this data.
 * For example, two big eyes could be drawn that follow the person from side to side, even crossing if he/she gets too close.

Conclusions
These are some photos of the result of the project:



And a video demonstrating the project:

https://www.youtube.com/watch?v=xqu2p-e1CKA