New camouflage pattern engine

One of the new projects we have at foam kernow is an ambitious new extension of the egglab player-driven camouflage evolution game, with Laura Kelley and Anna Hughes at Cambridge Uni.

As part of this we are expanding the patterns possible with the HTML5 canvas based pattern synthesiser to include geometric designs. Anna and Laura are interested in how camouflage has evolved to disrupt the perception of movement, so we need a citizen science game system similar to the egg one, but with different shapes that move at different speeds.

Here are some test mutations of un-evolved random starting genomes:

[images: test mutations of random starting genomes]

This is an example pattern program:

[image: 5gen – an example pattern program]

Evolving butterflies game released!

[image: toxic]

The Heliconius Butterfly Wing Pattern Evolver game is finished and ready for its debut as part of the Butterfly Evolution Exhibit at the Royal Society Summer Exhibition 2014. Read more about the scientific context on the researcher’s website, and click the image above to play the game.

The source code is here – it’s the first time I’ve used WebGL for a game, and it’s using the browser version of fluxus. It worked out pretty well, even to the extent that the researchers could edit the code themselves to add new explanation screens for the genetics. Like any production code it has niggles; here’s the function that renders a butterfly:

(define (render-butterfly s)
  (with-state
   ;; set tex based on index
   (texture (list-ref test-tex (butterfly-texture s)))  
   ;; move to location
   (translate (butterfly-pos s))                        
   ;; point towards direction
   (maim (vnormalise (butterfly-dir s)) (vector 0 0 1)) 
   (rotate (vector 0 90 90))      ;; angle correctly
   (scale (vector 0.5 0.5 0.5))   ;; make smaller
   (draw-obj 4)                   ;; draw the body
   (with-state          ;; draw the wings in a new state
    (rotate (vector 180 0 0))                         
    (translate (vector 0 0 -0.5))  ;; position and angle right
    ;; calculate the wing angle based on speed
    (let ((a (- 90 (* (butterfly-flap-amount s)         
                      (+ 1 (sin (* (butterfly-speed s)  
                                   (+ (butterfly-fuzz s) 
                                      (time)))))))))
      (with-state
       (rotate (vector 0 0 a))
       (draw-obj 3))              ;; draw left wing
      (with-state
       (scale (vector 1 -1 1))    ;; flip
       (rotate (vector 0 0 a))
       (draw-obj 3))))))          ;; draw right wing

There is only immediate mode rendering at the moment, so the transforms are not optimised, and little things need fixing – for example, draw-obj takes the id of a preloaded chunk of geometry rather than letting you specify it by name. However it works well, and the most successful part was welding together the Nightjar Game Engine (HTML5 canvas) with fluxus (WebGL) and using them together. This works by having two canvas elements drawn over each other – all the 2D (text, effects and graphs) is drawn using canvas, and the butterflies are drawn in 3D with WebGL. The render loops run simultaneously, with some extra commands to get the canvas pixel coordinates of objects drawn in 3D space.
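A minimal sketch of the overlay idea in Javascript (not the actual engine code – the element ids and drawing calls are placeholders for what fluxus and the Nightjar engine do):

var glCanvas = document.getElementById("webgl-layer");   // hypothetical ids – the two
var uiCanvas = document.getElementById("canvas-layer");  // canvases are stacked with CSS
var gl = glCanvas.getContext("webgl");
var ctx = uiCanvas.getContext("2d");

function frame() {
    // 3D pass: in the game this is where fluxus renders the butterflies
    gl.clearColor(0, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    // 2D pass: clear to transparent so the WebGL layer shows through,
    // then draw the text, effects and graphs on top
    ctx.clearRect(0, 0, uiCanvas.width, uiCanvas.height);
    ctx.fillText("score: 42", 10, 20);

    requestAnimationFrame(frame);   // one loop drives both layers
}
requestAnimationFrame(frame);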

News from egglab

9,000 players, 20,000 games played and 400,000 tested egg patterns later we have over 30 generations complete on most of our artificial egg populations. The overall average egg difficulty has risen from about 0.4 seconds at the start to 2.5 seconds.

Thank you to everyone who contributed their time to playing the game! We spawned 4 brand new populations last week, and we’ll continue running the game for a while yet.

In the meantime, I’ve started working on ways to visualise the 500Mb of pattern generating code that we’ve evolved so far – here are all the eggs for one of the 20 populations, where each row is a generation of 127 eggs, starting at the top and ordered by fitness score from left to right:

[image: viz-2-cf-0-small]

This tree is perhaps more useful. The ancestor egg at the top is the first generation and you can see how mutations happen and successful variants get selected.

[image: viz-2-cf-1-6-tree-small]

Egglab – meet Ms Easter Robot Nightjar and her genetically programmed eggs!

[image: roboteasternightjar]

We’ve released our latest citizen science camouflage game, Egglab! I’ve been reporting on this here for a while, so it’s great to have it released in time for Easter – we’ve had coverage in the Economist, which is helping us recruit egg hunters, and 165,000 eggs have been tested over the last 3 days. At the time of writing we’ve turned over 13 generations, starting with random pattern programs and evolving them with small mutations, testing each egg 5 times with different players and picking the best 50% each time.
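The generation turnover itself boils down to something like this sketch (the data structures and mutate function are hypothetical, not the game’s actual code):

// Each egg is timed by 5 different players; the eggs that stayed hidden
// longest survive, and are mutated to refill the population.
function nextGeneration(population, mutate) {
    population.forEach(function (egg) {
        // fitness = average of the 5 recorded times-to-find
        egg.fitness = egg.timings.reduce(function (a, b) { return a + b; }, 0) /
                      egg.timings.length;
    });
    population.sort(function (a, b) { return b.fitness - a.fitness; }); // hardest first
    var survivors = population.slice(0, population.length / 2);         // keep the best 50%
    var children = survivors.map(function (egg) {
        return { program: mutate(egg.program), timings: [] };           // refill with mutants
    });
    return survivors.concat(children);
}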

Here is an image of some of the first generation of eggs:

[image: egglab0]

And this shows how they’ve developed 13 generations later with the help of many thousands of players:

[image: egglab]

We can also click on an individual egg and see how it’s evolved over time:

[image: egglab2]

And we can see how the average time taken to find the eggs is changing:

[image: egglab3]

Technically this project involves distributed pattern generation on people’s browsers using HTML5 Canvas, which makes it scalable, load balancing of the server-side work over three machines, and a Facebook-enabled subgame – all of which I’ll explain in another blog post.

Egglab – pattern generation obsession

I’m putting the final pieces together for the release of the all new Project Nightjar game (due in the run up to Easter, of course!), and the automatic pattern generation has been a focus right up to this stage. The challenge I like most about citizen science is that, along with all the ‘normal’ game design creative restrictions (is it fun? will it work in the browser?), you also have to satisfy the fairly whopping constraints of the science itself: determining which decisions impact on the observations you are making, and being sure that they will be robust to peer review in the context of publication. I never had to worry about that with PlayStation games 🙂

[image: variation]

[image: pattern2gen]

With this game, similar to the last two, we want to analyse people’s ability to recognise types of pattern in a background image. Crucially, this is a completely different perception process from recognition of a learned pattern (a ‘search image’), so we don’t want to generate the exact same egg each time from the same description – we don’t want people to ‘learn’ them. This also makes sense in the natural context, of course, in that an individual bird’s eggs will not be identical, due to the many additional non-deterministic processes involved in creating the pattern.

The base images we are using are wrapped Perlin noise at different scales, and with different thresholds applied. These are then rotated and combined with each other and with plain colours using the browser’s built-in composite operations. Ideally we would generate the noise each time we need it with a different random seed to make them all unique, but this is way too slow for HTML5 Canvas (pixel processing in Javascript is still painful at this scale). To get around this we pre-render a set of variations of each noise image: the genetic program picks one of four scales and one of two thresholds (or no threshold), and we randomly pick a new variation of this each time we render the egg. The image at the top shows the variation that happens across 6 example programs. Below are some of the noise images we’re using:

[image: noise-patterns]
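In code the dodge amounts to something like this sketch (the filename scheme and number of pre-rendered variations are made up for illustration):

// The genome only records the scale and threshold choice; which of the
// pre-rendered variations gets used is re-rolled on every render, so the
// same genome never produces a pixel-identical egg twice.
var NUM_VARIATIONS = 8;  // hypothetical: how many versions of each noise image exist

function noiseImageFor(gene) {
    // gene.scale: 0..3, gene.threshold: 0, 1 or 2 (2 meaning "no threshold")
    var variation = Math.floor(Math.random() * NUM_VARIATIONS);
    var img = new Image();
    img.src = "noise-" + gene.scale + "-" + gene.threshold + "-" + variation + ".png";
    return img;
}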

Egg camouflage evolution tests in different nest sites

I’ve spent some time testing Project Nightjar EggLab: clicking on algorithmically generated eggs on backgrounds taken from nightjar nest sites and recording the time it takes for each egg. It’s designed for lots of people to play in parallel, but I wanted to test it before coming up with more gameplay mechanic ideas.

The timing is used to rank the eggs: I keep the top 1024 individuals that took longest to find, and generate new ones from them. The idea is that successful traits will spread through the population and the average score will increase – from this small test that seems to be the case, with a slow but consistent rise over the latest 500 eggs:

[image: fitness]
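The ranking is just a fixed-size pool ordered by time-to-find – roughly this (a sketch with hypothetical data structures):

// Keep only the 1024 hardest-to-find eggs; each newly timed egg either
// displaces the easiest member of the pool or is thrown away.
var POOL_SIZE = 1024;
var pool = [];   // entries look like { program: ..., time: secondsToFind }

function recordResult(program, secondsToFind) {
    pool.push({ program: program, time: secondsToFind });
    pool.sort(function (a, b) { return b.time - a.time; });   // longest to find first
    if (pool.length > POOL_SIZE) pool.length = POOL_SIZE;     // drop the easiest
}

function breedNewEgg(mutate) {
    // new eggs are bred from random members of the current pool
    var parent = pool[Math.floor(Math.random() * pool.length)];
    return mutate(parent.program);
}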

Most of the eggs are still really easy to see, but some of them take a few seconds and every now and again there is a good one that can take longer. These are some nest sites from the fiery-necked nightjar, which seems to consistently favour leafy ground – the last one took me a while to spot:

[images: fiery-necked nightjar nest sites]

These are the top 50 eggs for the fiery-necked population. It’s quite noisy, with false positives due to the fact that if you get distracted while playing, the egg will score highly (this is one of the things to fix):

[image: top50 – fiery-necked population]

For comparison, here is the top 50 for the Mozambique nightjar:

[image: top50 – Mozambique population]

These birds nest on a bigger variety of sites, including bare earth – here’s a good one of them:

[image: Mozambique nightjar nest site]

The other Project Nightjar citizen science games, where you can search for real nightjars and nesting sites, can be found here.

Visualising egg pattern genomes

A couple of screenshots from the upcoming Project Nightjar citizen science game – the genetic programming pattern generator is now working in a simple test framework, and even with myself as the only player at the moment, it’s gradually producing eggs that are harder and harder to find against one of the background images from the field site in South Africa. Today I’ve been working on a viewer for the pattern genome itself, which displays all the base images and the operations and intermediate steps as it builds up the final image. Unlike any natural form of genetics, genetic programming is all about growing trees of functions connected together, and here we are interested in combining simple images using HTML5 canvas’s composite operations to make complex patterns.

[image: genome1]

[image: genome2]

This will be useful for debugging mutations, where the sub-trees are jumbled up, but as we’re building a citizen science game, we’re also going to be exposing as much as possible about how it’s working – from the game mechanics like this to the underlying camouflage theories we’ll be testing. If you recognise the graph drawing algorithm, I’ve been plundering a long forgotten project: fastbreeder.
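For a rough idea of what a genome like this looks like in code, here’s a minimal sketch (not the game’s actual representation – the node shape and base image names are illustrative, but the compositing approach is as described):

// A pattern genome is a tree: leaves are base images, internal nodes
// combine their children with a canvas composite operation.
var exampleGenome = {
    op: "lighter",
    left:  { image: "noise-big" },
    right: {
        op: "xor",
        left:  { image: "noise-small" },
        right: { image: "plain-brown" }
    }
};

// Recursively render a genome node to its own canvas. baseImages is
// assumed to be a dictionary of already-loaded images, keyed by name.
function renderNode(node, baseImages, size) {
    var canvas = document.createElement("canvas");
    canvas.width = canvas.height = size;
    var ctx = canvas.getContext("2d");
    if (node.image) {
        ctx.drawImage(baseImages[node.image], 0, 0, size, size);     // leaf: a base image
    } else {
        ctx.drawImage(renderNode(node.left, baseImages, size), 0, 0);
        ctx.globalCompositeOperation = node.op;                      // combine the children
        ctx.drawImage(renderNode(node.right, baseImages, size), 0, 0);
    }
    return canvas;
}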

More procedurally rendered eggs in HTML5 canvas

The first Project Nightjar game was a big success, with 6 thousand players in the first few days – so we’ll have lots of visual perception data to get through! Today I’ve been doing a bit more work on the egg generator for the next citizen science camouflage game:

[image: egglab3]

I’ve made 24 new, more naturalistic base images than the abstract ones I was testing with before, and implemented the start of the genetic programming system – each block of 4×4 eggs here shows the children of a single randomly created parent, each child created with a 1% mutation rate. The program trees themselves are 6 levels deep, so a maximum of 64 binary composite operations.
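Mutation at that rate is a simple recursive walk – something like this sketch (genome shape as in the earlier sketch; the random subtree generator is hypothetical):

// With 1% probability per node, throw the subtree away and grow a fresh
// random one in its place; otherwise copy the node and recurse.
function mutateTree(node, randomSubtree, rate) {
    if (Math.random() < rate) return randomSubtree();
    if (node.image) return { image: node.image };        // leaf copied unchanged
    return {
        op: node.op,
        left:  mutateTree(node.left, randomSubtree, rate),
        right: mutateTree(node.right, randomSubtree, rate)
    };
}

// usage: var child = mutateTree(parent, makeRandomSubtree, 0.01);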

All the genetic programming effort will happen in HTML5, thus neatly scaling the processing with the number of players, which is going to be important if this game proves as popular as the last – all the server has to do then is keep a record of the genotypes (the program trees) and their corresponding fitness.
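The client/server split then looks roughly like this (a sketch only – the endpoints, payloads and helper functions are hypothetical, and it uses the modern fetch API purely for brevity):

// The browser asks for a genotype, renders and displays the egg locally,
// and sends back nothing but the timing result.
function playOneEgg(renderEgg, timePlayer) {
    return fetch("/next-genotype")                           // hypothetical endpoint
        .then(function (response) { return response.json(); })
        .then(function (genotype) {
            var eggCanvas = renderEgg(genotype);             // all the GP work happens client side
            return timePlayer(eggCanvas).then(function (seconds) {
                // the server only ever stores genotypes and their fitness
                return fetch("/record-fitness", {            // hypothetical endpoint
                    method: "POST",
                    headers: { "Content-Type": "application/json" },
                    body: JSON.stringify({ id: genotype.id, time: seconds })
                });
            });
        });
}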

One catch with this approach is that the implementation of globalCompositeOperation in HTML5 – the core of the image synthesis technique I’m using – is far from perfect across all browsers. Having the same genotype look different to different people would be a disaster, so I’m having to restrict the operations to the ones consistently supported: “source-over”, “source-atop”, “destination-over”, “destination-out”, “lighter” and “xor”.
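In practice that restriction is just a whitelist check before setting the operation – a sketch:

// only operations that behave the same across the browsers we care about
var SAFE_OPS = ["source-over", "source-atop", "destination-over",
                "destination-out", "lighter", "xor"];

function setCompositeOp(ctx, op) {
    // fall back to plain over-drawing if a genome somehow asks for
    // an operation outside the safe set
    ctx.globalCompositeOperation = SAFE_OPS.indexOf(op) !== -1 ? op : "source-over";
}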

Where is that nightjar?

The first Project Nightjar game is online!

[image: ss3]

It’s a perception test to see how good you are at spotting the camouflaged birds – a great use of the photos the researchers are collecting in the field. We can also use the data as an experiment, comparing our timing when searching for birds under different predator perception: monkeys, who see the same colours we do, or mongooses, who, being dichromats, can’t differentiate between red and green.

We also had a great chance to test the game very thoroughly at the Science in the Square event in Falmouth last week, set up by Exeter University to promote science to the public. We had a touchscreen computer set up that people could use, and had a large range of people hunting for nightjars (4155 attempted spotting “clicks” in total!).

The source code is online here, and makes use of the Scheme->Javascript compiler built for planet fluxus, which has come in really handy for rapid prototyping of this game. Keep up to date with these games (there are more on the ‘drawing board’) on the official Project Nightjar site.

[image: ss1]

Aniziz on iPad

Thanks to HTML5 canvas, this is my first game that works on the iPad. All I had to do was hook up the touch events to call the mouse handlers and it was pretty functional, although the game would be better off handling touch events differently, with a more drag-and-drop approach. The game runs at around 20fps, compared to 60fps on my laptop, but it’s fairly playable in Safari.
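The touch hookup really is only a few lines – roughly this (a sketch; the canvas id and mouse handler names are hypothetical stand-ins for the game’s own):

var canvas = document.getElementById("game-canvas");   // hypothetical id

// forward single-finger touches to the existing mouse handlers
canvas.addEventListener("touchstart", function (e) {
    e.preventDefault();                                // stop the page scrolling/zooming
    var t = e.touches[0];
    onMouseDown(t.clientX, t.clientY);                 // hypothetical existing handler
});

canvas.addEventListener("touchmove", function (e) {
    e.preventDefault();
    var t = e.touches[0];
    onMouseMove(t.clientX, t.clientY);                 // hypothetical existing handler
});

canvas.addEventListener("touchend", function (e) {
    e.preventDefault();
    onMouseUp();                                       // touchend carries no coordinates
});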