ECE497 Project Makeshift Drums
Our project communicates with an accelerometer to make music. The user can attach the sensor to everyday objects like books or a table top, and the BeagleBone Black uses the sensor's output to play sounds. In this way, a user can play the drums, bongos, maracas, et cetera without a drum kit or other bulky equipment.
The implementation of this project will involve the combination of four major parts.
- An ADXL345 accelerometer,
- A C++ node.js module to perform polling and data processing,
- A simple node.js server, which uses socket.io to send motion events, and
- A client side app which uses HTML5 WebAudio APIs to play back sound in response to motion events.
Building this project is easiest to do on the BeagleBone itself; however, it should be possible to cross-compile if you wish.
- Download the project source code from https://github.com/axiixc/beagle-band
- Building node.js add-ons uses a special build manager called `node-gyp`; to install it you will have to first install npm.
- You can then install node-gyp with `npm install -g node-gyp` (note: if you are not already root this will require sudo).
- Build the node module: change directory to the module source and run `node-gyp clean configure build` to fully rebuild the module.
- If you wish, you can test the module using `node run.js`, a simple test script that prints out all motion events it receives.
- Change directory to the server directory.
- Install the dependencies for the server with `npm install`.
- Start the server by running `node server.js`. This will start the server listening on port 3001.
The only hardware we used, other than the BeagleBone, was the aforementioned ADXL345 accelerometer.
- VCC should be tied to 3.3V and GND to GND.
- We used the I2C interface which required the /CS pin to be tied high. We tied the SDO/ALT ADDRESS pin low to give the device an address of 0x53.
- We used I2C bus 1 by connecting SCL and SDA to pins 19 and 20, respectively.
- The INT1 pin, which lets the BeagleBone know when samples are ready for processing, was connected to GPIO60 (P9_12).
Our project allows the user to navigate to a port on the BeagleBone, which serves an interface for selecting which instrument he or she wishes to play.
At this point, the user can shake, strike, or otherwise agitate the ADXL345 accelerometer connected to the BeagleBone. The BeagleBone processes these movements and plays sounds corresponding to the selected instrument through the browser. In this way the user can treat the sensor like a maraca, bongo, or other percussion instrument to emulate a real percussionist.
A video of our working prototype is available at https://vimeo.com/79573880.
Theory of Operation
When the program first runs it initializes the accelerometer to sample at 100Hz with a range of +/-2g. Whenever a sample is collected, the accelerometer sends an interrupt signal to one of the BBB's GPIO pins. Our code then uses the I2C protocol to retrieve that sample and any others collected in the meantime. The accelerometer has a FIFO queue which will hold up to 32 samples in the event that the BBB does not service the interrupt before more samples are collected.
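The retrieval step ends with decoding the raw bytes into acceleration values. Our project does this in the C++ addon, but the idea can be sketched in plain JavaScript. The ADXL345 datasheet specifies that each axis is a 16-bit little-endian two's-complement value read starting at register DATAX0 (0x32); the ~3.9 mg/LSB scale factor below assumes full-resolution mode and is illustrative, not a value taken from our code:

```javascript
// Decode one 6-byte ADXL345 sample (x, y, z) into units of g.
// Assumption: full-resolution mode, where 1 LSB is about 3.9 mg.
const SCALE_G = 0.0039;

function decodeSample(buf) {
  // buf is a 6-byte Buffer read starting at register DATAX0 (0x32);
  // each axis is little-endian two's complement, already sign-extended.
  return {
    x: buf.readInt16LE(0) * SCALE_G,
    y: buf.readInt16LE(2) * SCALE_G,
    z: buf.readInt16LE(4) * SCALE_G,
  };
}

// A device lying flat reads roughly +1g on the z axis (256 LSB here).
const flat = Buffer.from([0x00, 0x00, 0x00, 0x00, 0x00, 0x01]);
console.log(decodeSample(flat).z.toFixed(2)); // prints 1.00
```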
The BBB then conditions the signal so it can ignore any offsets (such as those due to the force of gravity) and small changes, such as the noise introduced when the device is rotated. By changing a few parameters of our conditioning algorithm we can change how strong a strike or shake must be before the device detects it.
To condition the signal, we first considered the fact that the accelerometer is affected by the force of gravity. This provides an offset that depends on the orientation in which the user is holding the accelerometer and affects our calculation. To factor this out, we found the derivative of the acceleration by subtracting the previous sample from each incoming sample. By accumulating the subsequent values we eliminate this offset.
However, if we reorient the device the offset comes back; we merely calibrated the offset for the accelerometer's starting position. To continuously factor out this offset as the orientation changes, we made the accumulator add together only the latest handful of samples. This essentially creates a highpass filter. By changing the number of points accumulated at once we can affect how quickly a change must occur for it not to be filtered out. This helps us detect shakes or strikes much more easily.
Next we simply accumulate our filtered acceleration values to get our velocity. When the velocity is equal to zero (and the acceleration is non-zero) we have found a shake or a strike. This is when our program reports the strike to the browser. We also take the acceleration value at that point and use it to set the volume of the sound we make.
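A toy version of that detection loop, in JavaScript rather than the addon's C++, looks like this. It uses an exact zero test for brevity; real samples are noisy, so production code would more likely check for a sign change against a small threshold:

```javascript
// Integrate filtered acceleration into velocity; report a strike each time
// the velocity returns to zero while the acceleration is still non-zero.
// The acceleration magnitude at that instant becomes the strike intensity.
function detectStrikes(accel) {
  const strikes = [];
  let v = 0;
  for (let i = 0; i < accel.length; i++) {
    const prev = v;
    v += accel[i];
    if (prev !== 0 && v === 0 && accel[i] !== 0) {
      strikes.push({ index: i, intensity: Math.abs(accel[i]) });
    }
  }
  return strikes;
}

// One push-then-stop motion produces exactly one strike.
console.log(detectStrikes([0, 2, 0, -2, 0]));
// → [ { index: 3, intensity: 2 } ]
```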
When a strike is detected, our node.js server broadcasts a `play_sound` event to all listening clients, along with an intensity value which the clients use to adjust the volume of the played sound.
On the client side we used WebAudio to play back static mp3 files which are also hosted by our node.js server. When the client opens a new websocket connection, a `set_sounds` event is emitted by the server to the client, enumerating all available audio files. The client uses this list to initialize a `SoundList` structure on its end, which holds both the list of available sounds and the logic to download and retain cached buffers of those sounds, allowing for seamless playback. Then, when a sound is selected by the client (either by explicit user action, or implicitly when a new SoundList is initialized), the relevant resources are downloaded from the server, and playback begins as soon as playback events are received from the server.
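The intensity-to-volume step on the client can be sketched with standard WebAudio nodes. The function names and the clamping bounds below are illustrative, not taken from our `SoundList` code; the real client also handles caching and event wiring:

```javascript
// Map the server's intensity value onto a 0..1 gain. The divisor (an assumed
// "maximum interesting intensity") is a tuning parameter, not a project value.
function intensityToGain(intensity, maxIntensity = 4) {
  return Math.min(1, Math.max(0, intensity / maxIntensity));
}

// One-shot WebAudio playback: route a decoded buffer through a gain node so
// harder strikes play louder. Runs in a browser; ctx is an AudioContext.
function playSound(ctx, audioBuffer, intensity) {
  const source = ctx.createBufferSource();
  source.buffer = audioBuffer;
  const gain = ctx.createGain();
  gain.gain.value = intensityToGain(intensity);
  source.connect(gain).connect(ctx.destination);
  source.start();
}
```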
Accelerometer interface -- Will Elswick - 7 hours
Signal conditioning -- Will Elswick - 9 hours
Node.js addon -- James Savage - 12 hours
Web browser interface -- James Savage - 7 hours
Inspiration/sarcasm -- James Savage - his entire life
This project could easily be expanded to take inputs from multiple accelerometers. We didn't have any other ADXL345 accelerometers, so we couldn't duplicate our inputs. With more sensors, we could easily modify our interface to recognize another accelerometer and assign it its own unique sounds. This way we could have a whole percussion section or even a whole drum kit.
Additionally, we could add some composing tools to the interface so that the user could record and play back or loop his or her performance. By switching instruments, someone could create an entire rhythmic track using only the one accelerometer we have.