Atomised – a realtime generative portrait

Quick bit of background… Jimmy C contacted us to see if we were interested in collaborating on a project. After an initial chat we came up with a couple of ideas, one of which was to try to recreate a version of Jimmy’s painting style in realtime, so that individuals could see themselves transformed.

I spent a few weeks coding during evenings and weekends, using openFrameworks, OpenCV, a Kinect v2, a webcam and a fast Windows PC, plus lots of photos of the actual spray-paint spots with their drips, and regular feedback from Jimmy.

An overview of what happens each frame (1/30 of a second): the Kinect captures a depth map and, with some thresholding, isolates the subject from the background. An additional webcam (because the Kinect colour camera is too limited) captures the visible image. This is then processed in a number of ways. First, a detail map is extracted using Canny edge detection, which in turn controls the size of the paint spots. Then a silhouette coverage map is created to ensure the spots fill out the whole portrait area. Next, each spot checks whether it is too close to its neighbours and tries to move towards some free space. The colour of the image is sampled and mapped onto Jimmy’s colour palette with variable noise. Finally, the radiating circles travel outwards and grow as they leave the silhouette area.
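The spot-spacing step can be sketched as a simple pairwise relaxation pass. This is a minimal illustration of the idea only, not the installation's actual code: the `Spot` struct and the `minGap` and `step` parameters are assumptions.

```cpp
#include <cmath>
#include <vector>

// One paint spot: position plus radius (in the real pipeline the radius
// is driven by the edge-detection detail map; here it is just data).
struct Spot { float x, y, r; };

// One relaxation pass: any spot closer than minGap to a neighbour takes
// a small step directly away from it. Run every frame, this gradually
// nudges crowded spots towards free space.
void relax(std::vector<Spot>& spots, float minGap, float step) {
    for (std::size_t i = 0; i < spots.size(); ++i) {
        for (std::size_t j = i + 1; j < spots.size(); ++j) {
            float dx = spots[j].x - spots[i].x;
            float dy = spots[j].y - spots[i].y;
            float d = std::sqrt(dx * dx + dy * dy);
            if (d > 0.0f && d < minGap) {
                // Push both spots apart along the line joining them,
                // proportionally to how much they overlap the gap.
                float push = step * (minGap - d) / d;
                spots[i].x -= dx * push; spots[i].y -= dy * push;
                spots[j].x += dx * push; spots[j].y += dy * push;
            }
        }
    }
}
```

A small `step` keeps the motion smooth from frame to frame rather than snapping spots straight to their final positions.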
All the parameters are controlled with sliders so the overall effect can be fine-tuned in realtime and the settings saved. For the installation, the image was projected onto a square canvas to make it feel more like a painting.

Finally we had something we both liked, and it was opened to the public at the Lollipop Gallery. It was great seeing how people interacted with the experience, and their reactions to it.

Many thanks to Peter Collis for your time and skill creating the video. Here’s a link to the press release.

Looking forward to the next collaboration.