During Summer 2011, my friend Rich Whalley and I were experimenting with some sets of Philips Color Kinetics independently addressable LED lighting arrays in the name of science. Then Hurricane Irene struck and we were stranded in the lab with nothing to do but dance -- what better way to pass a rainy afternoon than building a better dance party?
Using a Processing library to connect to the Color Kinetics host controller, we wrote our own beat-detection algorithm in Processing, using FFTs and a simple moving-average window to turn sound from the computer's line-in jack into light control signals. Once each chunk of the waveform was transformed into the frequency domain with the FFT, we split the spectrum into frequency bins. A moving-average window tracks the energy of each bin over the last few frames; when the current energy deviates from that average by more than a specified threshold, a beat is detected. Meanwhile, the lights shift through a hue cycle to vary the visual experience. The result, 'Beat It!', is an audiovisual experience in which hundreds of independently addressable light units dance along to the beats of any music or audio input.
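The core of the detector can be sketched in a few lines. This is a hedged illustration, not our actual Processing code: the function name, window size, and threshold are hypothetical, and it operates on a precomputed stream of per-frame energies (in the real sketch, each FFT frequency bin gets its own detector like this).

```python
from collections import deque

def detect_beats(energies, window=8, threshold=1.5):
    """Flag a beat when the current frame's energy exceeds the
    moving average of the previous `window` frames by `threshold`x.
    (Illustrative parameters, not the values we used.)"""
    history = deque(maxlen=window)  # last few frames of energy
    beats = []
    for i, e in enumerate(energies):
        if len(history) == window and e > threshold * (sum(history) / window):
            beats.append(i)  # sudden jump above the running average
        history.append(e)
    return beats

# Synthetic per-frame energies: quiet background with three loud spikes
energies = [1.0] * 40
for i in (10, 20, 30):
    energies[i] = 5.0
print(detect_beats(energies))  # -> [10, 20, 30]
```

Comparing against a short running average rather than a fixed level is what makes this work across quiet and loud tracks: the threshold adapts to the recent loudness of each band.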
Here's our first functional prototype of the algorithm in action:
We later extended this to control multiple arrays of lights (about 500 light sources in total) and threw a great party [footage not found].