The concert video OIOI uses a technique called lagged embedding to map samples to coordinates on a plane. The most interesting thing in this work, I think, is finding a graphical representation at the musical phrase level coupled with the soundwave/spectral characteristics of the ongoing sound.

A DVD with OIOI is available through NMI.

Pictures © Anders Vinjar 1997. Not to be used in production or commerce.

Make your own movie!

The principle is straightforward, and can be homebrewed on most fast computers. To draw a point on the plane, map the amplitude value of the sound at a certain position to the x-coordinate, and map the amplitude value of a delayed version of the sound to the y-coordinate. Repeat this in a sample loop through the whole sound, and you've got yourself a movie!
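A minimal sketch of that sample loop (my own illustration, not the OIOI code), assuming the sound is a mono NumPy float array; a synthetic sine stands in for a real recording:

```python
import numpy as np

def lagged_points(samples, delay):
    """Map each sample to a point on the plane:
    x = the signal, y = the same signal delayed by `delay` samples."""
    n = len(samples) - delay
    return np.stack([samples[:n], samples[delay:delay + n]], axis=1)

# Stand-in sound: a 441 Hz sine at 44.1 kHz, so the period is exactly
# 100 samples and a 25-sample delay is a quarter period.
sr = 44100
samples = np.sin(2 * np.pi * 441 * np.arange(sr) / sr)

# Quarter-period delay gives (sin, cos) pairs: the sine draws a circle.
points = lagged_points(samples, delay=25)
```

Drawing successive windows of `points` as frames, one after another, gives the movie.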

An easy way to extend the technique used in OIOI would be to map further delayed versions of the sound to coordinates in 3D space, or beyond.
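That extension is a one-line change to the sketch above (again my own illustration, not the OIOI code): a second delayed copy supplies the z-coordinate, so each sample becomes a point in 3D space.

```python
import numpy as np

def lagged_points_3d(samples, delay):
    """x = the signal, y = the signal delayed once, z = delayed twice."""
    n = len(samples) - 2 * delay
    return np.stack([samples[:n],
                     samples[delay:delay + n],
                     samples[2 * delay:2 * delay + n]], axis=1)

# Stand-in sound: a 441 Hz sine at 44.1 kHz (period = 100 samples).
sr = 44100
samples = np.sin(2 * np.pi * 441 * np.arange(sr) / sr)
points = lagged_points_3d(samples, delay=25)  # quarter-period delay
```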

While fiddling about with the delay values, the figures formed by the technique change from circles through ellipses to squares (that's waveshaping, clipping at a certain amplitude!), stars, pointillistic scenes, whatever. What's usually considered a phrase in electroacoustic music is represented quite well in this rendering process, since both the soundwave and the spectral evolution of a sound are rather efficiently represented graphically. OIOI was made using the Common Lisp Music (CLM) and Common Music (CM) software.
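The circle-to-ellipse-to-line behaviour is easy to check numerically. The little experiment below (my own illustration) measures the "flatness" of the point cloud a sine draws at different delays: a near-zero delay collapses the figure onto the diagonal, a quarter period opens it into a circle, and delays in between give ellipses.

```python
import numpy as np

sr = 44100
samples = np.sin(2 * np.pi * 441 * np.arange(sr) / sr)  # period = 100 samples

def flatness(points):
    """Minor/major axis ratio of the point cloud: 1 = circle, near 0 = line."""
    evals = np.linalg.eigvalsh(np.cov(points.T))  # ascending eigenvalues
    return np.sqrt(evals[0] / evals[1])

for delay in (1, 10, 25):  # fractions of the 100-sample period
    n = len(samples) - delay
    pts = np.stack([samples[:n], samples[delay:delay + n]], axis=1)
    print(delay, round(flatness(pts), 2))  # grows from near 0 toward 1
```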

The pictures (zillions of them!) making up the movie are direct engravings of 10 channels in the music, made in the fashion described above.