Processing Sketchbook

I have committed a number of my (mostly older) Processing programs and experiments to GitHub. Here are some of the highlights, which I hope you find useful.

I also have a number of Processing.js sketches, mostly made for educational purposes, available at OpenProcessing.


An homage to David Hockney's composites. Each frame is rendered as a grid, in which each cell is displayed with a random delay.

In the audio version, the delay is mapped to the audio input level.
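The audio version's delay mapping can be sketched in plain Java (Processing's host language). The function names, the [0, 1] level range, and the maximum delay are assumptions for illustration; the `map()` helper mirrors Processing's built-in of the same name.

```java
// A minimal sketch of the audio-to-delay mapping: each grid cell shows
// the incoming frame after a delay, and in the audio version that delay
// is a linear function of the input level.
public class HockneyDelay {
    // Linear re-mapping, mirroring Processing's map() function.
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    // Audio level in [0, 1] mapped to a delay of 0..maxDelay frames.
    static int delayForLevel(float level, int maxDelay) {
        return Math.round(map(level, 0f, 1f, 0f, maxDelay));
    }

    public static void main(String[] args) {
        System.out.println(delayForLevel(0.0f, 30)); // silence -> no delay
        System.out.println(delayForLevel(0.5f, 30)); // mid level -> 15 frames
        System.out.println(delayForLevel(1.0f, 30)); // loud -> full delay
    }
}
```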


Each frame is rendered as a set of rows or columns, each of which is copied from a specific frame in a buffer. In the keyboard version you can set the buffer depth, and the rendering is linear, from newer to older frames.

In the audio version, the rows and columns are retrieved from frames mapped to the audio input level. A high amplitude can also trigger a mode change (horizontal/vertical).
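The buffer indexing described above can be sketched as follows. The names, the assumption that index 0 is the newest buffered frame, and the [0, 1] audio-level range are mine, not from the original sketch:

```java
// A minimal sketch of the frame-buffer lookup: the sketch keeps the
// last `depth` frames and copies each output row (or column) from one
// of them.
public class SlitBuffer {
    // Keyboard version: row r of nRows reads linearly across the
    // buffer, from the newest frame (0) to the oldest (depth - 1).
    static int linearFrameForRow(int r, int nRows, int depth) {
        return r * (depth - 1) / Math.max(1, nRows - 1);
    }

    // Audio version: the source frame's age is proportional to the
    // input level (0 = newest, 1 = oldest).
    static int audioFrameForLevel(float level, int depth) {
        return Math.round(level * (depth - 1));
    }

    public static void main(String[] args) {
        System.out.println(linearFrameForRow(0, 48, 16));  // top row -> newest
        System.out.println(linearFrameForRow(47, 48, 16)); // bottom row -> oldest
        System.out.println(audioFrameForLevel(1.0f, 16));  // loudest -> oldest
    }
}
```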


Places a video/webcam feed in a 3D space, mapping each pixel's depth to its brightness:

This sketch uses PeasyCam for mouse-driven 3D control and JMyron for video capture, and was made with Processing 1.5 (it might also work with 2.0+).
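The core brightness-to-depth mapping can be sketched in plain Java. Taking brightness as the mean of the RGB channels and using a 0..maxZ depth range are assumptions for illustration; the actual sketch may use Processing's `brightness()` instead:

```java
// A minimal sketch of the depth mapping: each video pixel becomes a
// point in 3D whose z displacement is proportional to its brightness.
public class DepthFromBrightness {
    // Brightness of a packed ARGB pixel as the mean of its channels, 0..255.
    // (An assumption; Processing's brightness() differs slightly.)
    static float brightness(int argb) {
        int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
        return (r + g + b) / 3f;
    }

    // Map brightness 0..255 onto a z displacement of 0..maxZ.
    static float depth(int argb, float maxZ) {
        return brightness(argb) / 255f * maxZ;
    }

    public static void main(String[] args) {
        System.out.println(depth(0xFFFFFFFF, 100f)); // white pixel -> nearest
        System.out.println(depth(0xFF000000, 100f)); // black pixel -> flat
    }
}
```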

Executables for Windows and Mac are available here.


This sketch lets you control a set of cameras as well as a video playlist, all from the keyboard. MultiProjection was made to fit the technical requirements of the Identity Project performance art piece.

This sketch only works in Processing 1.5 and was only tested on Windows. It requires the Fullscreen library and GS Video (version 20110709) for video playback. Video capture uses the standard Processing 1.5 video library (for the DV device support discontinued in 2.0), which requires QuickTime and WinVDIG 1.01.

Yes, this sketch is quite picky about its dependencies...


Both of these sketches were made for a small workshop on object-oriented Processing that I taught. Lineforms draws a number of randomly animated lines that tend to congregate around the mouse pointer when it is over the sketch window. You can try a Processing.js version right now.

Cameraforms requires a camera and Processing 2.0. In it, the random lines follow the brightest spot in the image. Try it with a flashlight in a dark room!
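The brightest-spot search that Cameraforms relies on can be sketched like this. The names and the sum-of-channels brightness measure are assumptions; in Processing you would run this over a `PImage`'s `pixels[]` array:

```java
// A minimal sketch of brightest-spot tracking: scan the frame's pixel
// array and return the coordinates of the brightest pixel, which the
// animated lines then steer toward.
public class BrightestSpot {
    // Returns {x, y} of the brightest pixel in a w x h ARGB frame.
    static int[] find(int[] pixels, int w, int h) {
        int bestX = 0, bestY = 0, best = -1;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int p = pixels[y * w + x];
                // Sum of RGB channels as a cheap brightness proxy.
                int sum = ((p >> 16) & 0xFF) + ((p >> 8) & 0xFF) + (p & 0xFF);
                if (sum > best) { best = sum; bestX = x; bestY = y; }
            }
        }
        return new int[] { bestX, bestY };
    }

    public static void main(String[] args) {
        int[] frame = new int[4]; // a 2x2 all-black frame...
        frame[3] = 0xFFFFFFFF;    // ...with a white pixel at (1, 1)
        int[] spot = find(frame, 2, 2);
        System.out.println(spot[0] + "," + spot[1]); // prints "1,1"
    }
}
```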


This sketch uses PeasyCam to control a resonating cube. The sound, generated with Minim, changes as you inspect the cube.

The Camera version uses JMyron to detect movement in front of a camera and inspects the cube accordingly.