Trying to create a single-line portrait for the Processing day event in Bournemouth. I wanted a project that the public could engage with and that could be used with a plotter some of the others are building. Stumbled across some single-line drawings on Pinterest and thought it would be good to try to draw one in realtime in response to a webcam image.
Initially, I wrote a flocking algorithm where the boids were attracted to the darker areas of the portrait, leaving a line trail as they went. This kind of worked, but it struggled to capture the details, and the final line drawing wasn't very recognisable.
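For context, the abandoned approach boils down to something like this: each boid samples the image around itself and steers toward the darkest nearby pixel. This is a hypothetical Java sketch, not the original code; the 8-direction sampling, the steering gain, and the class names are all my assumptions.

```java
// Hypothetical sketch of the first, abandoned approach: a boid steers toward
// the darkest pixel it can see nearby, leaving a trail as it moves.
class Boid {
    double x, y, vx, vy;

    // gray: image brightness in [0,1], indexed [row][col]; step: look-ahead distance.
    // Samples 8 directions around the boid and steers toward the darkest sample.
    void seekDark(double[][] gray, double step) {
        double bestX = x, bestY = y, best = Double.MAX_VALUE;
        for (int i = 0; i < 8; i++) {
            double a = i * Math.PI / 4;
            int sx = (int) (x + step * Math.cos(a));
            int sy = (int) (y + step * Math.sin(a));
            if (sx < 0 || sy < 0 || sy >= gray.length || sx >= gray[0].length) continue;
            if (gray[sy][sx] < best) { best = gray[sy][sx]; bestX = sx; bestY = sy; }
        }
        // Steering gain of 0.1 is arbitrary; in the real sketch this would be a slider.
        vx += 0.1 * (bestX - x);
        vy += 0.1 * (bestY - y);
        x += vx; y += vy;
    }
}
```

In a Processing `draw()` loop each boid would call this once per frame and draw a line segment from its previous position, building up the trail.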
Did some more research and came across LineLabs' work, which uses a stippling algorithm as the backbone for the line.
So, using a particle simulation: each particle has a repulsive force field and pushes against its neighbours if they are close enough. The field strength is based on the brightness of the pixel under the particle's position, sampled from either a loaded picture or a live webcam feed. A single particle in the middle, with a weak attractive force and a large radius, pulls all the particles inward to keep them from just floating off the screen.
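The per-particle forces might look something like this. This is a minimal Java sketch of the idea, not the actual toxiclibs code: the class and method names are my assumptions, and the constant-brightness stand-in would really be a pixel lookup on the image or webcam frame.

```java
// Sketch of the two forces: brightness-scaled repulsion between neighbours,
// plus a weak, large-radius pull toward a central attractor.
class Particle {
    double x, y;
    Particle(double x, double y) { this.x = x; this.y = y; }
}

class Stippler {
    // Stand-in for sampling the image; in Processing this would be something
    // like brightness(img.get((int) x, (int) y)) / 255.0.
    static double brightness(double x, double y) {
        return 0.5;
    }

    // Push p away from q if within radius. Strength scales with pixel brightness,
    // so particles over bright areas spread out and dark areas stay densely
    // packed -- the stippling effect.
    static double[] repulsion(Particle p, Particle q, double radius, double maxStrength) {
        double dx = p.x - q.x, dy = p.y - q.y;
        double d = Math.sqrt(dx * dx + dy * dy);
        if (d == 0 || d > radius) return new double[] {0, 0};
        double strength = maxStrength * brightness(p.x, p.y) * (1 - d / radius);
        return new double[] {strength * dx / d, strength * dy / d};
    }

    // Weak attraction toward the central particle at (cx, cy), active over a
    // large radius, to stop particles drifting off screen.
    static double[] centerPull(Particle p, double cx, double cy, double radius, double strength) {
        double dx = cx - p.x, dy = cy - p.y;
        double d = Math.sqrt(dx * dx + dy * dy);
        if (d == 0 || d > radius) return new double[] {0, 0};
        return new double[] {strength * dx / d, strength * dy / d};
    }
}
```

In toxiclibs these two behaviours would map onto attraction behaviours with negative and positive strengths attached to a `VerletPhysics2D` world, rather than hand-rolled force sums.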
Sliders let me control all the parameters to play around and find optimum values.
More particles can be added in batches of 100, randomly scattered around the mouse position. Once 3–4 thousand are in the simulation, Processing starts to struggle and the frame rate drops to ~5fps. (I'm using Toxiclibs; perhaps writing my own quad-tree for particle storage would speed up nearest-neighbour searching.)
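The batch-spawn step is simple enough to sketch. Hypothetical names throughout; in the real sketch `mx`/`my` would be Processing's `mouseX`/`mouseY`, and the spread would be one of the slider parameters.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

class Spawner {
    // Spawn n particles as (x, y) pairs, uniformly jittered within a square
    // of side `spread` centred on the mouse position.
    static List<double[]> spawnBatch(double mx, double my, double spread, int n, Random rng) {
        List<double[]> batch = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            batch.add(new double[] {
                mx + (rng.nextDouble() - 0.5) * spread,
                my + (rng.nextDouble() - 0.5) * spread
            });
        }
        return batch;
    }
}
```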
When happy with the stippling, I create a line through all the particle positions by repeatedly jumping to the nearest unvisited neighbour. Processing's curve functions have a tightness parameter (set with curveTightness()) that controls how closely the curve fits the points, and with a very sloppy fit it creates a nice squiggly line that overshoots each point, like fast drawing.
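The nearest-neighbour threading can be sketched as a greedy tour. It's O(n²), which is tolerable for a few thousand points (and is itself a candidate for the quad-tree speedup mentioned above); the names are my assumptions.

```java
class LineOrder {
    // Greedy tour: starting from `start`, repeatedly hop to the nearest
    // unvisited point. Returns the visiting order as indices into pts.
    static int[] greedyPath(double[][] pts, int start) {
        int n = pts.length;
        boolean[] used = new boolean[n];
        int[] order = new int[n];
        order[0] = start;
        used[start] = true;
        for (int i = 1; i < n; i++) {
            int prev = order[i - 1], best = -1;
            double bestD = Double.MAX_VALUE;
            for (int j = 0; j < n; j++) {
                if (used[j]) continue;
                double dx = pts[j][0] - pts[prev][0];
                double dy = pts[j][1] - pts[prev][1];
                double d = dx * dx + dy * dy; // squared distance is enough for comparison
                if (d < bestD) { bestD = d; best = j; }
            }
            order[i] = best;
            used[best] = true;
        }
        return order;
    }
}
```

In Processing, the ordered points would then be drawn with curveVertex() inside a shape, with curveTightness() pushed below zero to loosen the fit and get the overshooting squiggle.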
It’s nice because it reacts in a really fluid way to a changing/moving image with the particles rippling about.