Hungry birds citizen science at the Paris Natural History Museum

Some photos of Mónica Arias running her “Hungry Birds” butterfly catching experiment at the Muséum national d’Histoire naturelle in Paris.

comp

The museum’s internet connectivity was unreliable, so we ran the game server on a Raspberry Pi with an ad-hoc wifi network and handled the data collection ourselves. The project is concerned with analysing pattern recognition and behaviour in predators. We’re using ten different wing patterns (or morphs), assigning one at random to be the toxic one, and looking at how long it takes people to learn which are edible.
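The core of the experiment can be sketched like this (a Python illustration with made-up names and scoring, not the actual game code):

```python
import random

MORPHS = [f"morph-{i}" for i in range(10)]  # ten wing patterns

def new_session(rng=random):
    """Start a session: one morph is secretly assigned to be toxic."""
    return {"toxic": rng.choice(MORPHS), "trials": []}

def record_attack(session, morph):
    """Record a player attacking a butterfly; returns whether it was edible."""
    edible = morph != session["toxic"]
    session["trials"].append({"morph": morph, "edible": edible})
    return edible

def trials_to_learn(session, window=5):
    """Number of trials before the player avoids the toxic morph for
    `window` consecutive attacks - one simple measure of learning."""
    run = 0
    for i, trial in enumerate(session["trials"]):
        run = run + 1 if trial["edible"] else 0
        if run >= window:
            return i + 1
    return None
```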

hungrybirds2

AI as puppetry, and rediscovering a long forgotten game.

AI in games is a hot topic at the moment, but most examples are attempts to create human-like behaviour in characters – a kind of advanced puppetry. These characters are also generally designed beforehand rather than reacting to and learning from player behaviour, let alone being allowed to adapt in an open-ended manner.

geo-6
Rocketing around the gravitational wells.

Geo was a free software game I wrote around 10 years ago which I’ve recently rediscovered. I made it a couple of years after working for William Latham’s Computer Artworks, and it was obviously influenced by that experience. At the time it was a little demanding for graphics hardware, but in the intervening years processing power has caught up with it.

This is a game set in a large section of space inhabited by lifeforms comprised of triangles, squares and pentagons. Each lifeform exerts a gravitational pull and has the ability to reproduce. Its structure is defined by a simple genetic code which is copied to its descendants with small errors, giving rise to evolution. Your role is to collect keys which orbit the gravitational wells in order to progress to the next level, which is repopulated by copies of the most successful individuals from the previous level.
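The copy-with-errors reproduction can be sketched like this (a Python illustration; the game’s actual genome format is different):

```python
import random

SHAPES = ["triangle", "square", "pentagon"]

def random_genome(length=8, rng=random):
    """A fresh genome: a list of shape genes."""
    return [rng.choice(SHAPES) for _ in range(length)]

def copy_with_errors(genome, error_rate=0.05, rng=random):
    """Copy a genome to a descendant, replacing each gene with a random
    one at a small probability - the only source of variation needed
    for evolution to get going."""
    return [rng.choice(SHAPES) if rng.random() < error_rate else gene
            for gene in genome]
```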

A simple first generation lifeform.

Each game starts with a random population, so the first couple of levels are generally quite simple, mostly populated by dormant or self-destructive species – but after 4 or 5 generations the lifeforms start to reproduce, and by level 10 a phenotype (or species) will generally have emerged to become a highly invasive conqueror of space. It becomes a race against the clock to find all the keys before the gravitational effects become too strong for your ship’s engines to escape, or for your weapons to ‘prune’ the structures’ growth.

I’ve used similar evolutionary strategies in much more recent games, but they’ve required much more effort to get the evolution working (49,000 players have now contributed to egglab’s camouflage evolution for example).

A well defended 'globular' colony – a common phenotype to evolve.

What I like about this form of more humble AI (or artificial life) is that instead of a program trying to imitate another lifeform, it simply represents itself – challenging you to exist in its consistent but utterly alien world. I’ve always wondered why the dominant post-human theme of sentient AI is a supercomputer deliberately designed, usually by a millionaire or a huge company. It seems to me far more likely that some form of life will arise – perhaps it even already exists – in the wild variety of online spambots and malware mainly talking to themselves, and that it will go unnoticed by us, probably forever. We had a little indication of this when the Facebook bots in the Naked on Pluto game started having autonomous conversations with other online spambots on their blog.

A densely packed 'crystalline' colony structure.

Less speculatively, what I’ve enjoyed most about playing this simple game is exploring and attempting to shape the possibilities of the artificial life, while observing and categorising the common solutions that emerge during separate games – cases of parallel evolution. I’ve tweaked the between-levels fitness function a little, but most of the evolution tends to occur ‘darwinistically’ while you are playing: the lifeforms that reproduce most effectively simply survive.
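The between-levels selection amounts to something like this (a hypothetical Python sketch, not the game’s actual fitness code):

```python
def next_level_population(population, copies=4, survivors=3):
    """Seed the next level with copies of the individuals that reproduced
    most effectively in the previous one. `population` maps a genome
    (as a tuple) to its offspring count."""
    ranked = sorted(population, key=population.get, reverse=True)
    return [list(genome) for genome in ranked[:survivors]
            for _ in range(copies)]
```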


An efficient and highly structured 'solar array' phenotype which I’ve seen emerge twice with different genotypes.

You can get the updated source here; it only requires GLUT and ALUT (a cross-platform audio API). At one time it compiled on Windows, and it should build on OSX quite easily – I may distribute binaries at some point if I get time.

geo-5
A ‘block grid’ phenotype which is also common.

New camouflage pattern engine

One of the new projects we have at Foam Kernow is an ambitious new extension of the egglab player-driven camouflage evolution game, with Laura Kelley and Anna Hughes at Cambridge University.

As part of this we are expanding the patterns possible with the HTML5 canvas based pattern synthesiser to include geometric designs. Anna and Laura are interested in how camouflage has evolved to disrupt the perception of movement, so we need a similar citizen science game system to the eggs, but with different shapes that move at different speeds.

Here are some test mutations of un-evolved random starting genomes:

3

4

2

This is an example pattern program:

5gen
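The idea of mutating pattern programs can be sketched like this (a Python illustration with invented operators; the real synthesiser’s genome format is different):

```python
import random

# hypothetical pattern operators - placeholders for illustration only
OPS = ["stripes", "spots", "blend", "rotate"]

def random_program(depth=2, rng=random):
    """Generate a random pattern program as a small expression tree:
    leaves are numeric parameters, internal nodes are operators."""
    if depth == 0:
        return round(rng.random(), 2)
    return [rng.choice(OPS)] + [random_program(depth - 1, rng)
                                for _ in range(2)]

def mutate(program, rate=0.2, rng=random):
    """Walk the tree, jittering numeric parameters with probability
    `rate` while keeping the program's structure intact."""
    if isinstance(program, float):
        if rng.random() < rate:
            return round(min(1.0, max(0.0,
                       program + rng.uniform(-0.1, 0.1))), 2)
        return program
    return [program[0]] + [mutate(arg, rate, rng) for arg in program[1:]]
```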

Cricket Tales – scaling up

Work on Cricket Tales over the last few weeks has been concerned with scaling everything for the sheer amount of data involved. The numbers are big – we’re starting with the footage from 2013 as a test (a ‘smaller’ year), where 145 cameras recorded a total of 438 days’ worth of video of cricket burrows. Our video processing robot is currently chopping this up into 211,889 sped-up one minute clips, and encoding them into webm, ogg and h.264 mp4 for maximum browser compatibility. It looks like this will take a few months in total, and over the last week or so we have processed 8,000 videos.
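The encoding stage boils down to generating one ffmpeg invocation per output format for each clip – something like this Python sketch (the speed-up factor and the exact flags here are my own illustration, not the actual pipeline):

```python
# one video codec per container, for maximum browser compatibility
FORMATS = {
    "webm": ["-c:v", "libvpx"],
    "ogv":  ["-c:v", "libtheora"],
    "mp4":  ["-c:v", "libx264"],
}

def encode_commands(source, clip_id, speedup=3):
    """Build one ffmpeg command per output format for a sped-up,
    silent, one-minute clip."""
    cmds = []
    for ext, codec in FORMATS.items():
        cmds.append(["ffmpeg", "-i", source,
                     "-filter:v", f"setpts=PTS/{speedup}",  # speed up
                     "-an",                                 # drop audio
                     "-t", "60",                            # one minute
                     *codec,
                     f"{clip_id}.{ext}"])
    return cmds
```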

vid1

Now we have a framework in place that will support this quantity of data; for example, we can continuously process video while people are tagging, and swap clips in and out so we don’t need to fit them all on disk. The database contains an entry for every video with a status (currently ‘not encoded’, ‘ready’ and ‘complete’). Movies could be marked complete after some number of players have watched them, or once some correlation metric for their tags has been met – then the files can be deleted from disk. In terms of feasibility, we had 68K people playing the camouflage citizen science games over the last year – so if we managed to get that many people, we’d need each of them to view about 3 minutes of footage to get the entire dataset watched once.
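The status lifecycle is simple enough to sketch with sqlite (the table layout and the views-needed threshold here are assumptions, not the real schema):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE videos (id TEXT PRIMARY KEY, "
           "status TEXT DEFAULT 'not encoded', views INTEGER DEFAULT 0)")

def mark_ready(vid):
    """The encoding robot has finished this clip; players can watch it."""
    db.execute("UPDATE videos SET status='ready' WHERE id=?", (vid,))

def record_view(vid, views_needed=5):
    """Count a viewing; once enough players have watched the clip it is
    marked complete and its files can be deleted from disk."""
    db.execute("UPDATE videos SET views=views+1 WHERE id=?", (vid,))
    (views,) = db.execute("SELECT views FROM videos WHERE id=?",
                          (vid,)).fetchone()
    if views >= views_needed:
        db.execute("UPDATE videos SET status='complete' WHERE id=?", (vid,))
```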

map

This is a fictional map of the crickets’ burrows, displaying the name of the player who has contributed most to each so far – one idea for the ‘play’ element, which is the next big thing to consider. There is a balance between making it quick and fun, and making people feel like they are contributing to something bigger and can see a tangible result for their efforts. Do we aim for something that takes five minutes of someone’s time and get enough people to make it work, or do we aim for a smaller number of more dedicated players who keep coming back? We have loads of options at this point, and to a large extent this depends on the precise nature of what the researchers need – so we need to do some more thinking together at this stage.

Hungry birds 2 – the citizen science edition

One of the three citizen science game projects we currently have running at Foam Kernow is a commission for Mónica Arias at the Muséum national d’Histoire naturelle in Paris, who works with this research group. She needed to take the Evolving Butterflies game we made last year for the Royal Society Summer Exhibition, which supports her research into pattern evolution and recognition in predators, and turn it into a citizen science game. This is a standalone game for the moment, but the source is here.

title

aie

We had several issues to address with this version. Firstly, a lot more butterfly wing patterns were needed – still the Heliconius butterfly species, but different types. Mónica also needed to record all player actions when determining toxic or edible patterns, so we added a database, which required a server (the original is an educational game that runs entirely in the browser, using WebGL). She also needed to run the game in an exhibition at the museum, where internet access is problematic, so we worked on a system that could run on a Raspberry Pi and provide a self contained wifi network (similar to Mongoose 2000). This allows her to use whatever makes sense for the museum or a specific event – multiple tablets, or a PC with a touchscreen – all connected via wifi to display the game in a normal browser, with all the data recorded on the Pi.
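The recording side is essentially an append-only action log on the Pi – something like this sketch (the schema and function names are my own illustration, not the actual server code):

```python
import sqlite3
import time

log = sqlite3.connect(":memory:")
log.execute("CREATE TABLE actions (t REAL, player TEXT, "
            "morph TEXT, toxic INTEGER)")

def record_action(player, morph, toxic):
    """Store one player attack as it arrives from the browser."""
    log.execute("INSERT INTO actions VALUES (?,?,?,?)",
                (time.time(), player, morph, int(toxic)))

def export_csv():
    """Dump the recorded data for analysis via the admin page."""
    rows = log.execute("SELECT t, player, morph, toxic FROM actions")
    return "\n".join(",".join(map(str, row)) for row in rows)
```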

The other aspect of this was to provide her with an ‘admin’ page where she can control and tweak the gameplay, as well as collect the data for analysis. This is important because once it’s on a Raspberry Pi I can’t change anything or support it as I could on a normal webserver – but changes can still be made on the spot in reaction to how people play. This also makes the game more useful, as researchers can add their own butterfly patterns, change how the selection works, and use it for more experiments in the future.

admin

Some algorithms for Wild Cricket Tales

On the Wild Cricket Tales citizen science game, one of the tricky problems is grading player-created data in terms of quality. The idea is to help the research by getting people to tag videos and so measure the behaviour of the insect beasts – but we need to accept that there will be a lot of ‘noise’ in the data. How can we detect this and filter it away? It would also be great if we could detect and acknowledge players who are successful at hunting out and spotting interesting things, or people who are searching through lots of videos. As we found making the camouflage citizen science games, you don’t need much to grab people’s attention if the subject matter is interesting (which is very much the case with this project), but a high score table seems to help. We can also have one per cricket or burrow so that players can more easily see their progress – the single egglab high score table got very difficult to feature on after a few thousand players or so.

We have two separate but related problems – acknowledging players and filtering the data – so it probably makes sense if they can be linked. A commonly used method, which we used with egglab too (and which appears, for example, in Google’s reCAPTCHA, which crowdsources text digitisation as a side effect), is to compare multiple people’s results on the same video. But then we still need to bootstrap the scoring from something, and make sure we acknowledge people who are watching videos no one has seen yet, as this is also important.

Below is a simple naive scoring system that calculates a score purely from the quantity of events found on a video – we want to give points for finding some events, but over some limit we don’t want to reward endless clicking. It’s probably better if the score stops at zero rather than going negative as shown here, as games should never really ‘punish’ people like this!

quantity-score
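The quantity scoring can be sketched like this (the point values and the ‘best’ number of events are placeholder assumptions):

```python
def quantity_score(num_events, best=10, points=5):
    """Score rises by `points` per event up to `best` events, then falls
    off at the same rate for every extra event - so endless clicking is
    not rewarded. Clamped at zero rather than going negative."""
    raw = (num_events - max(0, num_events - best) * 2) * points
    return max(0, raw)
```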

Once we have a bit more data we can start to cluster events to detect whether people agree. This can give us some indication of the confidence in the data for a whole video, or a section of it – and it can also be used to estimate the likelihood of an individual event being valid, using the sum of neighbouring events weighted by distance via a simple drop-off function.

quality-score
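The neighbour-weighted validity estimate can be sketched like this (a linear drop-off is my own choice here; any decreasing function of distance would do):

```python
def event_validity(event_time, other_events, dropoff=5.0):
    """Likelihood that an event is valid: the sum of other players'
    events near it in time, each weighted by a linear drop-off with
    distance (contributing nothing beyond `dropoff` seconds)."""
    total = 0.0
    for t in other_events:
        distance = abs(t - event_time)
        if distance < dropoff:
            total += 1.0 - distance / dropoff
    return total
```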

If we do this for all of a player’s events over a single video, we get an indication of how consistent they are with other players. We could also recursively weight this by a player’s historical scores – so ‘trusted’ players could validate new ones. This is probably a step too far at this point, but it might be an option if we pre-stock some videos with data from the researchers, who are trained in what is important to record.

Bumper Crop released

A release of Bumper Crop is now up on the Play Store, with the source code here. As I reported earlier, this has been about converting a board game designed by farmers in rural India into a software version – partly to make it more easily accessible, and partly to explore the possibilities and restrictions of the two mediums. It’s really still a beta, as some of the cards behave differently to the board game version and a few are not yet implemented – we need to work on that, but it is playable now, with 4 players at the same time.

Screenshot_2014-08-21-23-45-51

The 3D and animation are done using the fluxus engine on android, and the game is written in tinyscheme. Here’s a snippet of the code for one of the board locations; I’ve been experimenting with a very declarative style lately:

;; description of location that allows you to fertilise your crops
;; the player has a choice of wheat/onion or potatoes
(place 26 'fertilise '(wheat onion potato) 
  ;; this function takes a player and a 
  ;; selected choice and returns a new player
  (lambda (player choice)
    (if (player-has-item? player 'ox) ;; do we have an ox?
      ;; if so, a complete a free fertilise task if needed
      (if (player-check-crop-task player choice 'fertilise 0)
        (player-update-crop-task player choice 'fertilise)
        player)
      ;; otherwise it costs 100 Rs
      (if (player-check-crop-task player choice 'fertilise 100)
        (player-update-crop-task
          (player-add-money player -100) ;; remove money
          choice 'fertilise)
        player)))
  (place-interface-crop)) ;; helper to make the interface

Testing the board game, which you can download on this page:

boardgame

The game on tablet:

Screenshot_2014-08-21-23-30-41

Screenshot_2014-08-21-23-40-04

This is the game running on a phone:

Screenshot_2014-08-22-10-25-54

Evolving butterflies game released!

toxic

The Heliconius Butterfly Wing Pattern Evolver game is finished and ready for its debut as part of the Butterfly Evolution Exhibit at the Royal Society Summer Exhibition 2014. Read more about the scientific context on the researchers’ website, and click the image above to play the game.

The source code is here. It’s the first time I’ve used WebGL for a game, and it’s using the browser version of fluxus. It worked out pretty well, even to the extent that the researchers could edit the code themselves to add new explanation screens for the genetics. Like any production code it has its niggles; here’s the function to render a butterfly:

(define (render-butterfly s)
  (with-state
   ;; set tex based on index
   (texture (list-ref test-tex (butterfly-texture s)))  
   ;; move to location
   (translate (butterfly-pos s))                        
   ;; point towards direction
   (maim (vnormalise (butterfly-dir s)) (vector 0 0 1)) 
   (rotate (vector 0 90 90))      ;; angle correctly
   (scale (vector 0.5 0.5 0.5))   ;; make smaller
   (draw-obj 4)                   ;; draw the body
   (with-state          ;; draw the wings in a new state
    (rotate (vector 180 0 0))                         
    (translate (vector 0 0 -0.5))  ;; position and angle right
    ;; calculate the wing angle based on speed
    (let ((a (- 90 (* (butterfly-flap-amount s)         
                      (+ 1 (sin (* (butterfly-speed s)  
                                   (+ (butterfly-fuzz s) 
                                      (time)))))))))
      (with-state
       (rotate (vector 0 0 a))
       (draw-obj 3))              ;; draw left wing
      (with-state
       (scale (vector 1 -1 1))    ;; flip
       (rotate (vector 0 0 a))
       (draw-obj 3))))))          ;; draw right wing

There is only immediate mode rendering at the moment, so the transforms are not optimised, and little things need fixing – for example, draw-obj takes the id of a preloaded chunk of geometry rather than specifying it by name. However it works well, and the most successful part was welding together the Nightjar Game Engine (HTML5 canvas) with fluxus (WebGL) and using them together. This works by having two canvas elements drawn over each other – all the 2D (text, effects and graphs) is drawn using canvas, and the butterflies are drawn in 3D with WebGL. The render loops run simultaneously, with some extra commands to get the canvas pixel coordinates of objects drawn in 3D space.