The concert video OIOI uses a technique called lagged embedding to map samples to coordinates on a plane. The most interesting aspect of this work, I think, is finding a graphical representation at the musical phrase level coupled with the waveform/spectral characteristics of the ongoing sound.
A DVD with OIOI is available through NMI.
An easy way to extend the technique used in OIOI would be to map further delayed versions of the sound to coordinates in 3D space.
While fiddling with the delay values, the figures formed by the technique change from circles through ellipses to squares (that's waveshaping, clipping at a certain amplitude!), stars, pointillistic scenes, and so on. What's usually considered a phrase in electroacoustic music is represented quite well in this rendering process, since both the waveform and the spectral evolution of a sound are efficiently represented graphically. OIOI is made using the Common Lisp Music (CLM) and Common Music (CM) software.
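The core of the lagged-embedding idea can be sketched as follows. This is a minimal illustration, not the OIOI implementation (which is written in CLM/CM): each sample x[n] is paired with a delayed copy x[n - d] to give a point on the plane. The 441 Hz sine and the specific delay values are my own hypothetical choices, picked so that a quarter-period delay yields the circle mentioned above, while shorter delays squash it into an ellipse.

```python
import numpy as np

def lagged_embedding(signal, delay):
    """Map each sample to the 2D point (x[n], x[n - delay])."""
    return np.column_stack((signal[delay:], signal[:-delay]))

# Hypothetical test signal: a pure sine at 441 Hz, 44.1 kHz sample rate,
# so one cycle is exactly 100 samples.
sr = 44100
t = np.arange(sr) / sr
sine = np.sin(2 * np.pi * 441 * t)

period = sr // 441        # 100 samples per cycle
quarter = period // 4     # delay of a quarter period

# A quarter-period delay pairs sin with cos: the figure is a circle.
circle = lagged_embedding(sine, quarter)
radii = np.hypot(circle[:, 0], circle[:, 1])   # all very close to 1.0

# A much shorter delay flattens the circle into an ellipse
# hugging the diagonal, as described in the text.
ellipse = lagged_embedding(sine, quarter // 5)
```

Extending this to 3D, as suggested above, just means stacking a third, further-delayed column, e.g. `(x[n], x[n - d], x[n - 2d])`.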
The pictures (zillions of them!) that make up the movie are direct engravings of 10 channels in the music, rendered in the fashion described above.