Artificially evolved camouflage

As the egglab camouflage experiment continues, here are some recent examples after 40 or so generations. If you want to take part in a newer experiment, we are currently seeing if a similar approach can evolve motion dazzle camouflage in Dazzle Bug.

Each population of eggs is being evolved against a lot of background images, so it’s interesting to see the different strategies in use – it seems like colour is one of the first things to match, often with some dazzle to break up the outline. Later, as you can see in some of these examples, there is some quite accurate background matching happening.

It’s important to say that all of this is done entirely by the perception of tens of thousands of people playing the game – there is no analysis of the images at any point.

022

020

019

018

016

012

010

009

005

004

Robot nightjar eggshibition at the Poly, Falmouth

As part of this year’s Fascinate festival we took over the bar at Falmouth’s Poly with visualisations of the camouflage pattern evolution process from the egglab game.

IMG_20140828_103921

tree-125-fs-part

This was a chance to do some detective work on the massive amount of genetic programming data we’ve amassed over the last few months, figure out ways to visualise it and create large prints of the egg pattern generation process. I selected family trees of eggs where mutations caused new features that made them difficult for people to spot, and thus resulted in large numbers of descendants. Then I printed examples of the eggs at different stages to see how they progressed through the generations.

IMG_20140828_104022

We also ran the egglab game in the gallery on a touch screen, which happened to coincide with some great coverage in the Guardian and Popular Science – the game kept running (most of the time) despite this.

IMG_20140829_151430

IMG_20140829_152451

IMG_20140830_112644

The Poly (or Royal Cornwall Polytechnic Society) was really the perfect place for this exhibition, with its 175 year history of promoting scientists, engineers and artists and encouraging innovation by getting them together in different ways. Today this seems very modern (and would be given one of our grand titles like ‘cross-disciplinary’), but it’s quite something to see that in a lot of ways the separation between these areas is currently bigger than it has ever been, and all the more urgent because of this. The Poly has some good claims to fame, being the first place Alfred Nobel demonstrated nitroglycerine, in 1865! Here are some pages from the 1914 report, which give a feel for what was going on a century ago amongst other radical world changes:

IMG_20140830_105017

IMG_20140830_104857

Egglab – pattern generation obsession

I’m putting the final pieces together for the release of the all new Project Nightjar game (due in the run up to Easter, of course!) and the automatic pattern generation has been a focus right up to this stage. The challenge I like most about citizen science is that along with all the ‘normal’ game design creative restrictions (is it fun? will it work in the browser?) you also have to satisfy the fairly whopping constraints of the science itself: determining which decisions impact the observations you are making, and being sure that they will be robust to peer review in the context of publication. I never had to worry about that with PlayStation games 🙂

variation

pattern2gen

With this game, similar to the last two, we want to analyse people’s ability to recognise types of pattern in a background image. Crucially, this is a completely different perception process from recognition of a learned pattern (a ‘search image’), so we don’t want to generate the exact same egg each time from the same description – we don’t want people to ‘learn’ them. This also makes sense in the natural context, of course, in that an individual bird’s eggs will not be identical, because many additional non-deterministic processes are involved in creating the pattern.

The base images we are using are wrapped Perlin noise at different scales, with different thresholds applied. These are then rotated and combined with each other and with plain colours using the browser’s built-in composite operations. Ideally we would generate the noise each time we need it with a different random seed to make them all unique, but this is way too slow for HTML5 canvas (pixel processing in Javascript is still painful at this scale). To get around this we pre-render a set of variations of the noise images; the genetic program picks one of four scales and one of two thresholds (or no threshold), and we randomly pick a new variation of this each time we render the egg. The image at the top shows the variation that happens across 6 example programs. Below are some of the noise images we’re using:

noise-patterns
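
To make the per-render variation concrete, here’s a rough sketch in the same Scheme dialect I use for the rest of the egg code – not the actual egglab implementation; pick-noise-image, num-variations and the image naming scheme are made up for the example, while find-image and image-lib are the same lookup used in the egg drawing code further down the page:

;; a simplified sketch: the genome fixes the noise scale and threshold,
;; and every time a terminal is rendered we pick one of the pre-rendered
;; variations of that noise at random, so no two renders of the same
;; genotype are pixel-identical. num-variations is a placeholder number,
;; and (random n) is assumed to return an integer in [0, n).

(define num-variations 8)  ;; pre-rendered variations per scale/threshold

;; called each time an egg terminal referencing this noise is drawn
(define (pick-noise-image scale threshold)
  (find-image
   (string-append "noise-" (number->string scale) "-" threshold "-"
                  (number->string (random num-variations)))
   image-lib))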

Egg camouflage evolution tests in different nest sites

I’ve spent some time testing Project Nightjar EggLab: clicking on algorithmically generated eggs on backgrounds taken from nightjar nest sites, and recording the time it takes to find each egg. It’s designed for lots of people to play in parallel, but I wanted to test it before coming up with more gameplay mechanic ideas.

The timing is used to rank the eggs: I keep the top 1024 individuals that took longest to find, and generate new ones from them. The idea is that successful traits will spread through the population and the average score will rise – from this small test it seems to be the case, with a slow but consistent rise over the latest 500 eggs:

fitness
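
For the curious, the selection step boils down to something like this sketch (same Scheme style as the rest of the project – egg-time, mutate-egg and the sort/take list helpers are stand-ins rather than the real server code):

;; a simplified sketch of selection: rank eggs by how long they survived
;; before being clicked, keep the top 1024 and breed the next generation
;; from them. sort and take are assumed list helpers.

(define population-size 1024)

(define (select-fittest eggs)
  ;; rank by the time each egg survived before being spotted
  (let ((ranked (sort eggs (lambda (a b) (> (egg-time a) (egg-time b))))))
    (take ranked population-size)))

(define (next-generation eggs)
  ;; each survivor contributes a mutated child
  (map mutate-egg (select-fittest eggs)))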

Most of the eggs are still really easy to see, but some of them take a few seconds, and every now and again there is a good one that can take longer. These are some nest sites of the fiery-necked nightjar, which seems to consistently favour leafy ground – the last one took me a while to spot:

3

2

4

These are the top 50 eggs for the fiery-necked population; it’s quite noisy with false positives, because if you get distracted when playing, the egg will score highly (this is one of the things to fix):

top50

For comparison, here is the top 50 for the Mozambique nightjar:

top50

These birds nest on a wider variety of sites, including bare earth – here’s a good one of them:

4

The other Project Nightjar citizen science games, where you can search for real nightjars and nesting sites, can be found here.

Project Nightjar: Camouflage data visualisation and possible internet robot predators

We’ve had tens of thousands of people spotting nightjars and donating a bit of their time to sensory ecology research. The result of this (it’s still ongoing, of course, along with the new nest spotting game) is a 20MB database with hundreds of thousands of clicks recorded. One of the things we were interested in was seeing what people were mistaking for the birds – so I had a go at visualising all the clicks over the images (these are all parts of the cropped image, as it really doesn’t compress well):

clicks-ReflectanceCF017Vrgb0.54-crop

clicks-ReflectanceCF022Vrgb0.52-crop

Then, looking through the results – I saw a strange artefact:

clicks-ReflectanceCP018VRgb0.55-crop

Uncompressed high res version

My first thought was that someone had tried cheating with a script, but I can hardly imagine that anyone would go to the bother, and it’s only in one image. Perhaps it was some form of bot or scraping software agent – although I thought browser click automation was done by directly interpreting the web page? Perhaps it’s a fallback for HTML5 canvas elements?

It turns out it’s a single player (playing as a monkey, age 16 to 35, who had played before) – so it’s easy enough to filter out, but in doing that I noticed the click order was not as regular as it looked, and it goes a bit wobbly in the middle:

Someone with very precise mouse skills then? 🙂

Project Nightjar: Where is that nest?

We’ve released a new game for Project Nightjar called Where is that nest? This is an adaptation of Where is that nightjar?, but the variety of bird species is greater and some of the nests are much harder to find than the birds were, so we added two levels – and a hall of fame high score table!

nest1

nest2

nest3

Visualising egg pattern genomes

A couple of screenshots from the upcoming Project Nightjar citizen science game – the genetic programming pattern generator is now working in a simple test framework, and even with myself as the only player at the moment, it’s gradually producing eggs that are harder and harder to find against one of the background images from the field site in South Africa. Today I’ve been working on a viewer for the pattern genome itself, which displays all the base images and the operations and intermediate steps as it builds up the final image. Unlike any natural form of genetics, genetic programming is all about growing trees of functions connected together, and here we are interested in combining simple images using HTML5 canvas’s composite operations to make complex patterns.

genome1

genome2

This will be useful for debugging mutations, where the sub-trees are jumbled up, but as we’re building a citizen science game, we’re also going to be exposing as much as possible about how it’s working – from the game mechanics like this to the underlying camouflage theories we’ll be testing. If you recognise the graph drawing algorithm, I’ve been plundering a long forgotten project: fastbreeder.
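
The viewer itself is mostly graph layout, but the tree walk at its heart is tiny – something like this sketch (not the actual viewer code; it borrows the program-type and operand accessors from the egg rendering code in the earlier posts below):

;; a minimal sketch: flatten a genome tree into a list of its nodes
;; (ops and terminals) so they can be laid out and drawn as a graph

(define (genome-nodes program)
  (if (eq? (program-type program) "terminal")
      (list program)
      (cons program
            (append (genome-nodes (operator-operand-a program))
                    (genome-nodes (operator-operand-b program))))))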

More procedurally rendered eggs in HTML5 canvas

The first Project Nightjar game was a big success, with 6 thousand players in the first few days – so we’ll have lots of visual perception data to get through! Today I’ve been doing a bit more work on the egg generator for the next citizen science camouflage game:

egglab3

I’ve made 24 new, more naturalistic base images than the abstract ones I was testing with before, and implemented the start of the genetic programming system – each block of 4×4 eggs here shows the children of a single randomly created parent, with each child created at a 1% mutation rate. The program trees themselves are 6 levels deep, so a maximum of 64 binary composite operations.

All the genetic programming effort will happen in HTML5, thus neatly scaling the processing with the number of players, which is going to be important if this game proves as popular as the last – all the server has to do then is keep a record of the genotypes (the program trees) and their corresponding fitness.

One catch with this approach is that the implementation of globalCompositeOperation in HTML5 canvas, the core of the image synthesis technique I’m using, is far from perfect across all browsers. Having the same genotype look different to different people would be a disaster, so I’m having to restrict the operations to the ones consistently supported – "source-over", "source-atop", "destination-over", "destination-out", "lighter" and "xor".
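
For a feel of how the program trees get built from that restricted list, here’s a sketch of random tree generation – random-terminal and choose are assumed helpers and the details differ from the real generator:

;; a sketch, not the production generator: build a random program tree
;; down to a fixed depth, using only the consistently supported composite
;; operations. the node format matches the trees shown in the posts
;; below: ("op" name child-a child-b), with image terminals at the leaves.

(define safe-composite-ops
  (list "source-over" "source-atop" "destination-over"
        "destination-out" "lighter" "xor"))

(define max-depth 6)

;; pick a random element from a list
(define (choose l) (list-ref l (random (length l))))

(define (random-program depth)
  (if (zero? depth)
      (random-terminal)                      ;; assumed: builds a leaf image node
      (list "op" (choose safe-composite-ops)
            (random-program (- depth 1))
            (random-program (- depth 1)))))

;; e.g. (random-program max-depth) builds a tree 6 op levels deep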

Where is that nightjar?

The first Project Nightjar game is online!

ss3

It’s a perception test to see how good you are at spotting the camouflaged birds – a great use of the photos the researchers are collecting in the field. We can also use the data as an experiment by comparing our timing when searching for birds with different predator perception: monkeys, who see the same colours we do, or mongooses, who, being dichromats, can’t differentiate between red and green.

We also had a great chance to test the game very thoroughly at the Science in the Square event in Falmouth last week, set up by Exeter University to promote science to the public. We had a touchscreen computer that people could use, and a wide range of people hunting for nightjars (4155 attempted spotting "clicks" in total!).

The source code is online here, and makes use of the Scheme->Javascript compiler built for planet fluxus, which has come in really handy for rapidly prototyping this game. Keep up to date with these games (there are more on the ‘drawing board’) on the official Project Nightjar site.

ss1

Genetic programming egg patterns in HTML5 canvas

Part of the ‘Project Nightjar’ camouflage work I’m doing for the Sensory Ecology group at Exeter University is designing citizen science games we can use to do some research. One plan is to create lots of patterns in the browser that we can run perceptual models on for different predator animals, and use an online game to compare them with human perception.

The problem is that using per-pixel processing in order to generate the variety of patterns we need is hard to do quickly in Javascript, so I’m looking at using genetic programming to build image processing trees using the HTML5 canvas image composition modes. You can see it in action here, and the source is here. Here are some example eggs built from very simple test images blended together, these are just random programs, but a couple of them are pretty good:

random-eggs

A nice aspect of this is that it’s easy to artistically control the patterns by changing the starting images, for example much more naturalistic patterns would result from noisier, less geometric base images and more earthy colours. This way we can have different ‘themes’ for levels in a game, for example.

I’m using the scheme compiler I wrote for planet fluxus to do this, and building trees that look like this:


'("op"
  "lighter"
  ("op"
   "source-in"
   ("op"
    "source-out"
    ("terminal" (124 57 0 1) "stripe-6")
    ("terminal" (42 23 0 1) "dots-6"))
   ("op"
    "copy"
    ("terminal" (36 47 0 1) "stripe-7")
    ("terminal" (8 90 1.5705 1) "red")))
  ("terminal" (108 69 0 1) "green"))

Ops are the blend mode operations, and terminals are images, which include translation and scale (not currently used). The egg trees get drawn with the function below, which shows the curious hybrid mix of HTML5 canvas and Scheme I’m using these days (and which some people may find offensive 🙂). Next up is to actually do the genetic programming part, so mutating and doing crossover on the trees.


(define (draw-egg ctx x y program)
  (if (eq? (program-type program) "terminal")
      (begin
        (set! ctx.fillStyle
              (ctx.createPattern
               (find-image (terminal-image program) 
                           image-lib) "repeat"))

        ;; centre the rotation
        (ctx.translate 64 64)
        (ctx.rotate 
            (transform-rotate (terminal-transform program)))
        (ctx.translate -64 -64)

        ;; make the pattern translate by moving, 
        ;; drawing then moving back
        (ctx.translate 
            (transform-x (terminal-transform program))
             (transform-y (terminal-transform program)))

        (ctx.fillRect 
            (- 0 (transform-x (terminal-transform program)))
            (- 0 (transform-y (terminal-transform program)))
            (* 127 2) (* 127 2))

        (ctx.translate 
            (- 0 (transform-x (terminal-transform program)))
            (- 0 (transform-y (terminal-transform program)))))
      (begin
        ;; slightly overzealous context saving/restoring
        (ctx.save)
        ;; set the composite operation
        (set! ctx.globalCompositeOperation (operator-type program))
        (ctx.save)
        (draw-egg ctx x y (operator-operand-a program))
        (ctx.restore)
        (ctx.save)
        (draw-egg ctx x y (operator-operand-b program))
        (ctx.restore)
        (ctx.restore))))
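
Since mutation and crossover are the next step, here’s a speculative sketch of how they might look on these trees, using the same accessors as draw-egg – this is a guess at the approach rather than the finished egglab code; mutation-rate is a name I’ve picked, random-program is assumed to build a fresh random sub-tree, and (random) is assumed to return a float in [0,1):

(define mutation-rate 0.01)

;; mutation: walk the tree and occasionally replace a whole sub-tree
;; with a freshly generated random one (random-program is a hypothetical
;; helper that builds a new random sub-tree of the given depth)
(define (mutate program depth)
  (if (< (random) mutation-rate)
      (random-program depth)
      (if (eq? (program-type program) "terminal")
          program
          (list "op" (operator-type program)
                (mutate (operator-operand-a program) (- depth 1))
                (mutate (operator-operand-b program) (- depth 1))))))

;; walk a random distance down a tree and return the sub-tree found there
(define (random-subtree program)
  (if (or (eq? (program-type program) "terminal") (< (random) 0.3))
      program
      (random-subtree (if (< (random) 0.5)
                          (operator-operand-a program)
                          (operator-operand-b program)))))

;; crossover: descend a random path into mum, then splice in a random
;; sub-tree taken from dad at that point
(define (crossover mum dad)
  (if (or (eq? (program-type mum) "terminal") (< (random) 0.3))
      (random-subtree dad)
      (if (< (random) 0.5)
          (list "op" (operator-type mum)
                (crossover (operator-operand-a mum) dad)
                (operator-operand-b mum))
          (list "op" (operator-type mum)
                (operator-operand-a mum)
                (crossover (operator-operand-b mum) dad)))))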