The Heliconius Butterfly Wing Pattern Evolver game is finished and ready for its debut as part of the Butterfly Evolution Exhibit at the Royal Society Summer Exhibition 2014. Read more about the scientific context on the researcher's website, and click the image above to play the game.
The source code is here. It's the first time I've used WebGL for a game, and it's built on the browser version of fluxus. It worked out pretty well, even to the extent that the researchers could edit the code themselves to add new explanation screens for the genetics. Like any production code it has niggles; here's the function to render a butterfly:
```scheme
(define (render-butterfly s)
  (with-state
    ;; set tex based on index
    (texture (list-ref test-tex (butterfly-texture s)))
    (translate (butterfly-pos s)) ;; move to location
    ;; point towards direction
    (maim (vnormalise (butterfly-dir s)) (vector 0 0 1))
    (rotate (vector 0 90 90))     ;; angle correctly
    (scale (vector 0.5 0.5 0.5))  ;; make smaller
    (draw-obj 4)                  ;; draw the body
    (with-state ;; draw the wings in a new state
      (rotate (vector 180 0 0))
      (translate (vector 0 0 -0.5)) ;; position and angle right
      ;; calculate the wing angle based on speed
      (let ((a (- 90 (* (butterfly-flap-amount s)
                        (+ 1 (sin (* (butterfly-speed s)
                                     (+ (butterfly-fuzz s) (time)))))))))
        (with-state
          (rotate (vector 0 0 a))
          (draw-obj 3)) ;; draw left wing
        (with-state
          (scale (vector 1 -1 1)) ;; flip
          (rotate (vector 0 0 a))
          (draw-obj 3)))))) ;; draw right wing
```
There is only immediate mode rendering at the moment, so the transforms are not optimised, and little things need fixing – such as draw-obj taking the id of a preloaded chunk of geometry rather than specifying it by name. However it works well, and the most successful part was welding together the Nightjar Game Engine (HTML5 canvas) with fluxus (WebGL) and using them together. This works by having two canvas elements drawn over each other – all the 2D (text, effects and graphs) is drawn using canvas, and the butterflies are drawn in 3D with WebGL. The render loops run simultaneously, with some extra commands to get the canvas pixel coordinates of objects drawn in 3D space.
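Fluxus provides the actual commands for this; as a rough sketch of the underlying maths (written in Python rather than Scheme, with a hypothetical project_to_canvas helper), a 3D point goes through the combined model-view-projection matrix to normalised device coordinates, which are then scaled to canvas pixels:

```python
def project_to_canvas(point, mvp, width, height):
    """Map a 3D world-space point to 2D canvas pixel coordinates.

    mvp is a 4x4 model-view-projection matrix (row-major list of lists);
    returns None if the projected point has a zero w component."""
    # multiply the homogeneous point (x, y, z, 1) by the matrix
    px, py, pz, pw = [
        sum(mvp[row][col] * v for col, v in enumerate(list(point) + [1.0]))
        for row in range(4)
    ]
    if pw == 0:
        return None
    # perspective divide to normalised device coordinates (-1..1)
    ndc_x, ndc_y = px / pw, py / pw
    # NDC to pixels; canvas y runs downwards, so flip it
    return ((ndc_x * 0.5 + 0.5) * width,
            (1.0 - (ndc_y * 0.5 + 0.5)) * height)
```

With an identity matrix the world origin lands in the centre of the canvas, which is a handy sanity check when lining the two render loops up.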
Deershed is a music festival designed to accommodate families, with lots of activities for children. Part of this year's festival was a Machines Tent, including Lego robot building, Meccano constructions, 3D printing and computer games.
Slub's daily routine in the Machines Tent started with setting up the Al Jazari gamepad livecoding installation, followed by a couple of hours with Martyn Eggleton teaching Scratch programming on an amazing quad Raspberry Pi machine (screens/processors and keyboards all built into a welded cube).
At some point we would switch to Minecraft, trying some experiments livecoding the LAN game world using Martyn's system to access the Minecraft API with Waterbear, a visual programming language using a similar blocks approach to Scratch and Scheme Bricks.
During the afternoons Alex and I could try some music livecoding experiments. This was a great environment for playful, audience-participatory performances; with families continually passing through the tent, I could use a dancemat to trigger synths in fluxus while Alex livecoded music designed to encourage people to jump up and down.
One of the most interesting things for me was seeing how lots of children (who mostly didn't know each other) collaborate and self-organise in a LAN game. There was quite a pattern to it across all the groups:
- Mess around with Minecraft as usual (make some blocks, start building a house).
- Find something built by someone else, destroy a few bricks.
- Snap out of the game to notice that the other kids are complaining.
- Realise that there are other people in the world – and they are sat around them!
- Attempt to fix the damage.
At this point other people would join in to help fix things, after which there would be some kind of understanding reached between them to respect each other’s creations. This has all really inspired me to work on Al Jazari 2 which combines a lot of these ideas.
Preparations for a busy summer, new Al Jazari installation gamepads on the production line:
This weekend Alex and I are off to the Deershed Festival in Yorkshire to bring slub technology to the younger generation. We'll be livecoding algorave, teaching Scratch programming on Raspberry Pis and running an Al Jazari installation in between. Then it's onwards to London for a Sonic Bike Lab with Kaffe Matthews, where we're going to investigate the future of sonic bike technology and theory – including, possibly, bike-sensor-driven synthesis and on-the-road post-apocalyptic mesh networking.
At the end of August I’m participating in my local media arts festival – Fascinate in Falmouth, where I’ll be dispensing a dose of algorave and probably even more musical robot techno.
Plutonian Botzlang is a new language I’m working on for a commission we’ve had from Arnolfini and Kunsthal Aarhus. The idea is to make the Naked on Pluto game bots programmable in a way that allows them to be scripted from inside the game interface, able to inspect all the objects around them and carry out actions on the world like a normal player. We can then strip the game down and make it into an online multiplayer musical livecoding installation.
Bots can be fed code line by line by talking to them, and can be started, stopped and pinged to check their status. I toyed with the idea of a one-line programming language with lots of semi-cryptic punctuation, but opted instead for something a bit simpler and longer, requiring line numbers.
Here is an example program that looks in the current node (or room) for Bells, picks them up if found, then says their descriptions. Each time it loops it might drop the Bell and walk to a new location. This results in bots that walk around the game world playing bells.
```
10 for e in node.entities
20 if e.name is Bell
30 pickup e.name
40 end
50 end
60 for e in this.contents
70 say e.desc
80 end
90 if random lessthan 5
100 drop Bell
110 walk
120 end
130 goto 10
```
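Botzlang's real interpreter isn't shown here, but the essential mechanics of a line-numbered language with goto are simple: keep the statements sorted by line number, track a program counter, and let goto reset it. A minimal sketch in Python (the run function and its tiny say/goto statement set are illustrative, not Botzlang's actual commands):

```python
def run(program, max_steps=100):
    """Interpret a tiny line-numbered language with 'say X' and 'goto N'.

    program maps line numbers to statements; max_steps bounds execution,
    since goto makes infinite loops easy to write (and in-game bots
    need to keep running without hanging the server)."""
    lines = sorted(program.items())           # execute in line-number order
    numbers = [n for n, _ in lines]
    output = []
    pc = 0                                    # index into the sorted lines
    for _ in range(max_steps):
        if pc >= len(lines):
            break
        _, statement = lines[pc]
        op, *args = statement.split()
        if op == "say":
            output.append(args[0])
            pc += 1
        elif op == "goto":
            pc = numbers.index(int(args[0]))  # jump to the target line
        else:
            pc += 1                           # ignore unknown statements
    return output
```

For example, run({10: "say ding", 20: "goto 10"}, max_steps=5) repeats "ding" three times before the step limit halts the loop. Stepping a bounded number of statements per game tick is also what lets many bots share one world politely.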
Here is a screenshot of the modified version of the game with a bot being programmed:
I've been doing more remote install work on Kaffe's latest piece, which she's been building while resident at Hai Art in Hailuoto, an island in the north of Finland. The zone building, site-specific sample composing and microscopic BeagleBoard log debugging is over, and two new GPS Opera bikes are born! Go to Hai Art or Kaffe's site for more details.
Prepare your bicycle clips! Kaffe Matthews and I are starting work on a new Bicycle Opera piece for the city of Porto; I'm working on a new mapping tool and adding some new zone types to the audio system.
While working on a BeagleBoard from one of the bikes used in the Ghent installation of 'The swamp that was…', I found (in true Apple/Google style) 4Mb of GPS logs, taken every 10 seconds during the two-month festival by logging I had forgotten to turn off. Being part of a public installation, the data is reasonably anonymised 🙂 – this is the first fifth of it, and about all it was possible to plot in high resolution on an online map:
It’s interesting to see the variability of the precision, as well as being able to identify locations and structures that break up the signal (such as the part underneath a large road bridge).
Jaye Louis Douce, Ruth Ross-Macdonald and I took to the ramps of Mount Hawke skate park in deepest, darkest Cornwall to test the prototype tracker/projection mapper (now known as 'The Cyber-Dog system') in its intended environment for the first time. Mount Hawke consists of 20,000 square feet of ramps of all shapes and sizes – an inspiring place for thinking about projections and tracing the flowing movements of skaters and BMX riders.
Finding a good place to mount the projector was the first problem; it was difficult to get it far enough away to cover more than a partial area of our chosen test ramp, even with some creative duct tape application. Meanwhile the Kinect camera was happily tracking the entire ramp, so we'll be able to fix this by replacing my old battered projector with a better model in a more suitable location.
The next challenge is calibrating the projection mapping to align it with what the camera is looking at. As they are in different places this is quite fiddly and time-consuming to get right; some improvements to the fluxus script will make it faster. Here is Jaye testing it once we had it lined up:
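The fluxus script does the real calibration; as a sketch of one simple approach (pure Python, hypothetical helper names), three matching camera/projector point pairs are enough to solve for a 2D affine transform by Cramer's rule, which can then warp any tracked point into projector space:

```python
def affine_from_points(src, dst):
    """Fit the 2D affine transform mapping three source points onto
    three destination points, solving a*x + b*y + c = v by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = src
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)

    def solve(v1, v2, v3):
        a = (v1 * (y2 - y3) - y1 * (v2 - v3) + (v2 * y3 - v3 * y2)) / det
        b = (x1 * (v2 - v3) - v1 * (x2 - x3) + (x2 * v3 - x3 * v2)) / det
        c = (x1 * (y2 * v3 - y3 * v2) - y1 * (x2 * v3 - x3 * v2)
             + v1 * (x2 * y3 - x3 * y2)) / det
        return a, b, c

    # one row of coefficients for x', one for y'
    return (solve(dst[0][0], dst[1][0], dst[2][0]),
            solve(dst[0][1], dst[1][1], dst[2][1]))

def apply_affine(transform, x, y):
    """Warp a camera-space point into projector space."""
    (ax, bx, cx), (ay, by, cy) = transform
    return (ax * x + bx * y + cx, ay * x + by * y + cy)
```

The idea would be to calibrate once by marking three known points in both spaces, then apply the transform to every tracked position. An affine fit handles rotation, scale and shear; a full homography would also correct for perspective, which matters more the further off-axis the projector sits.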
Next it was time to recruit some BMX test pilots to give it a go:
At higher speeds it needs a bit of linear interpolation to 'connect the dots', as the visualisation is running at 60fps while the tracking is more like 20fps:
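The interpolation itself is just a standard lerp between the last two tracker samples, stepped once per render frame. A sketch in Python (the function name and frame counts are illustrative, not taken from the fluxus script):

```python
def interpolated_position(prev, curr, frames_since_sample, frames_per_sample=3):
    """Linearly interpolate between the last two tracker samples.

    With tracking at ~20fps and rendering at 60fps there are about three
    render frames per tracker sample, so frames_since_sample counts 0..2."""
    t = min(frames_since_sample / frames_per_sample, 1.0)  # clamp if tracking stalls
    return tuple(p + (c - p) * t for p, c in zip(prev, curr))
```

One render frame after a new sample arrives, a point tracked from (0, 0) to (3, 6) is drawn a third of the way along its path, so fast-moving riders leave a smooth trail rather than a string of jumps.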
This test proved the fundamental idea and opens up lots of possibilities: different types of visualisation, recording and replaying paths over time, as well as identifying individual skaters or BMX riders with computer vision. One great advantage of this setup is that once it's running it works all the time, with no need for continuous calibration (as with RGB cameras) or any additional tracking devices.
Photo thanks to zzkt
As the coder for “The swamp that was…” bike opera, my view of things was from “inside” the bikes – listening to the GPS data and playing samples. So it was super (and somewhat surreal) to finally become a rider and take one of the bikes (called Nancy) for a spin through the streets of Ghent to experience it like everyone else at the Electrified festival.
I followed the different routes, tried some out backwards, and got lost in the "garden" – the zone of mysterious ghost butterflies and wandering sounds. Towards the end of the final route I had to seek shelter in Julius de Vigneplein during a gigantic thunderstorm, to the sound of looping saxophones, before retreating back to the Vooruit.
It didn't crash (always my main preoccupation when testing something I've helped write the software for), and there seemed to be continuous audio along the routes. Once I had ascertained that the software was working properly, I could actually start to pay attention to the sounds, which were a very fluid mix, interspersed with sudden bursts of Flemish – recordings of local people.
The sounds are a widely varied mix ranging from digital glitch to ethereal sounds and processed ducks that accompany you as you cycle along the canals. The “garden” is not a route as such but occupies a maze of small streets in the Ledeberg area and populates the streets with many insects, birds and other surprises.
The custom bike/speaker arrangement designed and built by Timelab was satisfyingly loud – pulling up next to other innocent cyclists at junctions with blaring jazz is quite an intriguing social experience. It makes you want to say "I can't turn it off" or "I am an art installation!" The BeagleBoards also seem fairly durable: the bikes have been running for a month now, and the cobbled streets and some areas of bumpy roadworks give them a lot of shocks to cope with.
The "click click" of car indicator relays tells you when you've reached a junction where you have to turn, and while our method of calculating direction (by comparing positions every 10 seconds) doesn't really work well enough, the indicators still have a useful role, saying "pay attention, you need to turn here!". This installation, and the rest of the festival, will be running for another month, until the 4th of November.
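The direction comparison boils down to the standard initial-bearing formula between two GPS fixes; a sketch in Python (the function name is mine, the formula is the usual great-circle one):

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees (0 = north, 90 = east) from the
    previous GPS fix to the current one, using the great-circle formula."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360
```

The formula itself is fine; the trouble is the input. At cycling speed a 10-second gap covers tens of metres, and with several metres of GPS noise on top (worse in narrow streets), the computed heading lags behind real turns and can swing about, which is why it doesn't really work well enough on its own.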
I spent last week working on various activities associated with the Electrified festival in Ghent, including a mix of plant care, games dev, low-level Android audio hacking and BeagleBoard bike fixing. Here are some photos of the Borrowed Scenery installation/physical narrative – home of the mysterious patabotanists and temporary research laboratory for FoAM – excellent for getting into the spirit of the work while developing it. More details in further posts.