Cricket Tales released

Cricket Tales is an ambitious citizen science project built around 438 days of CCTV footage from the Wild Crickets Research group – the only record of its kind of the wild behaviour of insects. It turns out that insects have more complex lives and more individuality than we thought, and the game is a way of helping to uncover this more precisely. For Foam Kernow this was also a significant project, as the biggest production all three of us have worked on together.

[image: title]

My favourite aspect of this project is that the movies are a strangely different way of viewing an ecosystem: tiny close-up areas of a perfectly normal field in northern Spain. The footage runs 24 hours a day, with infrared at night, recording a couple of frames a second only when movement is detected. Some of the videos get triggered simply by the movement of shadows, but there are plenty of moments we wouldn’t normally notice: worms and bugs of all kinds going about their lives, sudden appearances of larger animals or swarms of ants, the condensation of dew at dawn. The crickets themselves mostly have tags stuck to them so we can tell which is which, but other than that this is their normal habitat and way of life. Compared to the study of insects in lab conditions, it’s not surprising they act in more complex ways.

[image: movie2]
Screenshots from the Spanish version, which I’m particularly proud of (my first experience using GNU gettext with Django).

We combined the task of watching the one-minute-long movies with the ability to build houses for the crickets – we needed to provide a way for people to leave something behind, something that marks progress on this gigantic collective task. You get to design a little house for each burrow, and your name gets recorded on the meadow until the next person takes over by watching more videos.

[image: map2]

We’ve had plenty of conversations about what kind of people take part in this sort of citizen science activity, and what their motivations may be. We ask a couple of questions when people sign up, and this is something we are interested in researching further across our projects. In this case, we are interested in depth of involvement more than in attracting thousands of brief encounters – it only takes a few motivated people to make the researchers’ jobs much easier and provide some of the data they need.

For me, a bigger objective of Cricket Tales is to present a more diverse and personal view of the world that surrounds us and tends to go unnoticed. Being asked to contemplate a tiny organism’s view of the world for a minute can be quite an eye opener.

AI as puppetry, and rediscovering a long-forgotten game

AI in games is a hot topic at the moment, but most examples of this are attempts to create human-like behaviour in characters – a kind of advanced puppetry. These characters are also generally designed beforehand rather than reacting and learning from player behaviour, let alone being allowed to adapt in an open ended manner.

[image: geo-6]
Rocketing around the gravitational wells.

Geo is a free software game I wrote around 10 years ago and have recently rediscovered. I made it a couple of years after working for William Latham’s Computer Artworks, and it was obviously influenced by that experience. At the time it was a little demanding for graphics hardware, but in the intervening years processing power has caught up with it.

This is a game set in a large section of space inhabited by lifeforms composed of triangles, squares and pentagons. Each lifeform exerts a gravitational pull and has the ability to reproduce. Its structure is defined by a simple genetic code, which is copied to its descendants with small errors, giving rise to evolution. Your role is to collect keys, which orbit around gravitational wells, in order to progress to the next level, which is repopulated with copies of the most successful individuals from the previous level.
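
As a sketch of the reproduction step (in Python rather than the game’s own code – the genome representation and mutation rate here are made up for illustration):

import random

# Hypothetical genome: a string of symbols, one per body segment
# (t = triangle, s = square, p = pentagon).
GENES = "tsp"
MUTATION_RATE = 0.05  # chance of a copying error per gene (illustrative)

def reproduce(genome):
    """Copy a parent genome to a descendant, with small random errors."""
    child = []
    for gene in genome:
        if random.random() < MUTATION_RATE:
            child.append(random.choice(GENES))  # copying error
        else:
            child.append(gene)                  # faithful copy
    return "".join(child)

print(reproduce("tsppst"))  # mostly faithful, occasionally mutated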

A simple first generation lifeform.

Each game starts with a random population, so the first couple of levels are generally quite simple, mostly populated by dormant or self-destructive species – but after 4 or 5 generations the lifeforms start to reproduce, and by level 10 a phenotype (or species) will generally have emerged to become a highly invasive conqueror of space. It becomes a race against the clock to find all the keys before the gravitational effects become too much for your ship’s engines to escape, or for your weapons to ‘prune’ the structure’s growth.
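
Continuing the sketch above (Geo’s actual fitness measure and population size aren’t specified here, so both are placeholders), the between-levels step just ranks the population and refills the next level with error-prone copies of the best:

def next_level(population, fitness, size=32):
    """Repopulate a level with copies of the most successful
    individuals from the previous one, using the mutating
    reproduce() from the sketch above."""
    best = sorted(population, key=fitness, reverse=True)[:size // 4]
    return [reproduce(random.choice(best)) for _ in range(size)]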

I’ve used similar evolutionary strategies in much more recent games, but they’ve required much more effort to get the evolution working (49,000 players have now contributed to egglab’s camouflage evolution for example).

A well defended 'globular' colony – a common phenotype to evolve.

What I like about this form of more humble AI (or artificial life) is that instead of a program trying to imitate another lifeform, it really just represents itself – challenging you to exist in its consistent but utterly alien world. I’ve always wondered why the dominant post-human theme of sentient AI is a supercomputer deliberately designed, usually by a millionaire or a huge company. It seems to me far more likely that some form of life will arise – perhaps it even already exists – in the wild variety of online spambots and malware mainly talking to themselves, and that it will go unnoticed by us, probably forever. We had a little indication of this when the Facebook bots in the Naked on Pluto game started having autonomous conversations with other online spambots on their blog.

A densely packed 'crystalline' colony structure.

Less speculatively, what I’ve enjoyed most about playing this simple game is exploring and attempting to shape the possibilities of the artificial life, while observing and categorising the common solutions that emerge during separate games – cases of parallel evolution. I’ve tweaked the between-levels fitness function a little, but most of the evolution tends to occur ‘darwinistically’ while you are playing: simply, the lifeforms that reproduce most effectively survive.


An efficient and highly structured 'solar array' phenotype which I’ve seen emerge twice with different genotypes.

You can get the updated source here; it only requires GLUT and ALUT (a cross-platform audio API). At one time it compiled on Windows, and it should build on OSX quite easily – I may distribute binaries at some point if I get time.

[image: geo-5]
A ‘block grid’ phenotype which is also common.

Some algorithms for Wild Cricket Tales

On the Wild Cricket Tales citizen science game, one of the tricky problems is grading player-created data in terms of quality. The idea is to get people to help the research by tagging videos to measure the behaviour of the insects – but we need to accept that there will be a lot of ‘noise’ in the data; how can we detect this and filter it away? It would also be great if we could detect and acknowledge players who are successful at hunting out and spotting interesting things, or people who are searching through lots of videos. As we found when making the camouflage citizen science games, you don’t need much to grab people’s attention if the subject matter is interesting (which is very much the case with this project), but a high score table seems to help. We can also have one per cricket or burrow so that players can more easily see their progress – the single egglab high score table became very difficult to feature on after the first few thousand players.

We have two separate but related problems – acknowledging players and filtering the data – so it probably makes sense if they can be linked. A commonly used method, which we used on egglab too (and which appears, for example, in Google’s reCAPTCHA, which crowdsources text digitisation as a side effect), is to compare multiple people’s results on the same video. But then we still need to bootstrap the scoring from something, and make sure we acknowledge people who are watching videos no one has seen yet, as this is also important.

Below is a simple, naive scoring system that calculates a score purely from the quantity of events found on a video – we want to give points for finding some events, but over some limit we don’t want to reward endless clicking. It’s probably better if the score stops at zero rather than going negative as shown here, as games should never really ‘punish’ people like this!

[image: quantity-score]
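
A minimal sketch of this in Python (the limit and per-event points are placeholders, and the score is clamped at zero as suggested above rather than going negative as in the graph):

def quantity_score(num_events, reward_limit=10):
    """One point per event up to a limit; past it, each extra
    click eats into the score, but never below zero."""
    if num_events <= reward_limit:
        return num_events
    return max(reward_limit - (num_events - reward_limit), 0)

print(quantity_score(5))   # 5 - rewarded for finding events
print(quantity_score(25))  # 0 - endless clicking gains nothing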

Once we have a bit more data we can start to cluster events to detect whether people are agreeing. This can give us some indication of the confidence in the data for a whole video, or a section of it – and it can also be used to estimate the likelihood of an individual event being valid, using the sum of neighbouring events weighted by distance via a simple drop-off function.

[image: quality-score]
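
Sketched in Python (the drop-off shape and radius are arbitrary choices for illustration, not the project’s actual parameters):

def drop_off(distance, radius=2.0):
    """Linear drop-off: weight 1.0 at distance 0, falling to 0 at the radius."""
    return max(0.0, 1.0 - distance / radius)

def event_validity(event_time, other_events):
    """Likelihood that an event is valid: the sum of other players'
    neighbouring events, weighted by distance in the video timeline."""
    return sum(drop_off(abs(event_time - t)) for t in other_events)

# two nearby events from other players agree, a distant one adds nothing
print(event_validity(10.0, [9.5, 10.2, 31.0]))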

If we do this for all of a player’s events over a single video, we get an indication of how consistent they are with other players. We could also recursively weight this by a player’s historical scores – so ‘trusted’ players could validate new ones. That is probably a step too far at this point, but it might be an option if we pre-stock some videos with data from the researchers, who are trained in what is important to record.
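
Continuing the sketch, a player’s consistency could simply be the average validity of their events on a video (again hypothetical code, not the deployed system):

def player_consistency(player_events, other_events):
    """Average per-event validity over one player's events on a
    video - higher means better agreement with other players."""
    if not player_events:
        return 0.0
    return sum(event_validity(t, other_events)
               for t in player_events) / len(player_events)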

News from egglab

9,000 players, 20,000 games played and 400,000 tested egg patterns later, we have over 30 generations complete on most of our artificial egg populations. The overall average egg difficulty (measured as the time it takes players to spot an egg) has risen from about 0.4 seconds at the start to 2.5 seconds.

Thank you to everyone who contributed their time to playing the game! We spawned 4 brand new populations last week, and we’ll continue running the game for a while yet.

In the meantime, I’ve started working on ways to visualise the 500Mb of pattern-generating code that we’ve evolved so far – here are all the eggs for one of the 20 populations; each row is a generation of 127 eggs, starting at the top and ordered by fitness score from left to right:

[image: viz-2-cf-0-small]

This tree is perhaps more useful. The ancestor egg at the top is the first generation and you can see how mutations happen and successful variants get selected.

[image: viz-2-cf-1-6-tree-small]

Procedural landscape demo on OUYA/Android

A glitchy procedural, infinite-ish landscape demo running on Android and OUYA. Use the left joystick to move around on OUYA, or swipe on Android devices with touchscreens. Here’s the apk, and the source is here.

[image: Screenshot_2014-01-06-07-18-45]

It’s great to be able to have a single binary that works across all these devices – from OUYA’s TV screen sizes to phones, and using the standard gesture interface at the same time as the OUYA controller.

The graphics are programmed in Jellyfish Lisp, using Perlin noise to create the landscape. The language is probably still a bit too close to the underlying bytecode in places, but the function calling is working and it’s getting easier to write and experiment with the code.

(define terrain
  '(let ((vertex positions-start)
         (flingdamp (vector 0 0 0))
         (world (vector 0 0 0)))

     ;; recycle a triangle which is off the screen
     (define recycle 
       (lambda (dir)         
         ;; shift along x and y coordinates:
         ;; set z to zero for each vertex
         (write! vertex       
                 (+ (*v (read vertex) 
                        (vector 1 1 0)) dir))
         (write! (+ vertex 1) 
                 (+ (*v (read (+ vertex 1)) 
                        (vector 1 1 0)) dir))
         (write! (+ vertex 2) 
                 (+ (*v (read (+ vertex 2)) 
                        (vector 1 1 0)) dir))

         ;; get the perlin noise values for each vertex
         (let ((a (noise (* (- (read vertex) world) 0.2)))
               (b (noise (* (- (read (+ vertex 1)) 
                               world) 0.2)))
               (c (noise (* (- (read (+ vertex 2))
                               world) 0.2))))

           ;; set the z coordinate for height
           (write! vertex 
                   (+ (read vertex) 
                      (+ (*v a (vector 0 0 8)) 
                         (vector 0 0 -4))))
           (write! (+ vertex 1) 
                   (+ (read (+ vertex 1)) 
                      (+ (*v b (vector 0 0 8)) 
                         (vector 0 0 -4))))
           (write! (+ vertex 2) 
                   (+ (read (+ vertex 2)) 
                      (+ (*v c (vector 0 0 8)) 
                         (vector 0 0 -4))))

           ;; recalculate normals
           (define n (normalise 
                      (cross (- (read vertex)
                                (read (+ vertex 2)))
                             (- (read vertex)
                                (read (+ vertex 1))))))

           ;; write to normal data
           (write! (+ vertex 512) n)
           (write! (+ vertex 513) n)
           (write! (+ vertex 514) n)

           ;; write the z height as texture coordinates
           (write! (+ vertex 1536) 
                   (*v (swizzle zzz a) (vector 0 5 0)))          
           (write! (+ vertex 1537) 
                   (*v (swizzle zzz b) (vector 0 5 0)))          
           (write! (+ vertex 1538) 
                   (*v (swizzle zzz c) (vector 0 5 0))))))

     ;; forever
     (loop 1
       ;; add inertia to the fling/gamepad joystick input
       (set! flingdamp (+ (* flingdamp 0.99)
                          (*v
                           (read reg-fling)
                           (vector 0.01 -0.01 0))))

       (define vel (* flingdamp 0.002))
       ;; update the world coordinates
       (set! world (+ world vel))

       ;; for each vertex
       (loop (< vertex positions-end)         

         ;; update the vertex position
         (write! vertex (+ (read vertex) vel))
         (write! (+ vertex 1) (+ (read (+ vertex 1)) vel))
         (write! (+ vertex 2) (+ (read (+ vertex 2)) vel))

         ;; check for out of area polygons to recycle 
         (cond
          ((> (read vertex) 5.0)
           (recycle (vector -10 0 0)))         
          ((< (read vertex) -5.0)
           (recycle (vector 10 0 0))))
         
         (cond
          ((> (swizzle yzz (read vertex)) 4.0)
           (recycle (vector 0 -8 0)))
          ((< (swizzle yzz (read vertex)) -4.0)
           (recycle (vector 0 8 0))))

         (set! vertex (+ vertex 3)))
       (set! vertex positions-start))))

This lisp program compiles to 362 vectors of bytecode at startup, and runs well even on my cheap Android tablet. The speed seems close enough to native C++ to be worth the effort, and it’s much more flexible (i.e. future livecoding/JIT compilation possibilities). The memory layout is shown below: executable instructions and model data are packed into the same address space, and no memory allocation happens while it’s running (no garbage collection, and not even any C mallocs). The memory size is configurable, but the nature of the system is such that it would be possible to put executable data into unused graphics sections (e.g. normals or vertex colours) if appropriate.

[image: jvm]
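
The gist of the layout, sketched in Python (the total size and the position of the bytecode segment are guesses for illustration; the +512 and +1536 offsets come from the terrain listing above):

import numpy as np

# one flat, preallocated block of vec3s shared by everything -
# bytecode and geometry are just regions of the same address space,
# so nothing is allocated (or garbage collected) at runtime
memory = np.zeros((4096, 3), dtype=np.float32)

positions_start = 512    # hypothetical: bytecode occupies vectors 0..511
normals_offset = 512     # normals live at vertex address + 512
texcoords_offset = 1536  # texture coordinates at vertex address + 1536

def write(addr, vec):    # write! in the lisp above
    memory[addr] = vec

def read(addr):          # read in the lisp above
    return memory[addr].copy()

# e.g. set the first vertex position and its normal
write(positions_start, (0.0, 0.0, 1.0))
write(positions_start + normals_offset, (0.0, 0.0, 1.0))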

Project Nightjar: Where is that nest?

We’ve released a new game for Project Nightjar called Where is that nest? This is an adaptation of Where is that nightjar?, but the variety of bird species is greater, and some of the nests are much harder to find than the birds were, so we added two levels – and a hall of fame high score table!

[image: nest1]

[image: nest2]

[image: nest3]

Where is that nightjar?

The first Project Nightjar game is online!

[image: ss3]

It’s a perception test to see how good you are at spotting the camouflaged birds – a great use of the photos the researchers are collecting in the field. We can also use the data as an experiment, comparing our timing when searching for birds under different predators’ perception: monkeys, who see the same colours we do, or mongooses, who, being dichromats, can’t differentiate between red and green.

We also had a great chance to test the game very thoroughly at the Science in the Square event in Falmouth last week, set up by Exeter University to promote science to the public. We had a touchscreen computer that people could use, and a large range of people hunting for nightjars (4,155 attempted spotting “clicks” in total!).

The source code is online here, and makes use of the Scheme->Javascript compiler built for planet fluxus, which has come in really handy for rapidly prototyping this game. Keep up to date with these games (there are more on the ‘drawing board’) on the official Project Nightjar site.

[image: ss1]

Slub at the Deershed festival

Deershed is a music festival designed to accommodate families, with lots of activities for children. Part of this year’s festival was a Machines Tent, including Lego robot building, Meccano constructions, 3D printing and computer games.

Slub’s daily routine in the Machines Tent started with setting up the Al Jazari gamepad livecoding installation, followed by a couple of hours with Martyn Eggleton teaching Scratch programming on an amazing quad Raspberry Pi machine (screens, processors and keyboards all built into a welded cube).

[image: IMG_20130720_102902]

At some point we would switch to Minecraft, trying some experiments livecoding the LAN game world using Martyn’s system, which accesses the Minecraft API through Waterbear – a visual programming language using a similar blocks approach to Scratch and Scheme Bricks.

During the afternoons Alex and I would try some music livecoding experiments. This was a great environment for playful, audience-participatory performances; with families continually passing through the tent, I could use a dancemat to trigger synths in fluxus while Alex livecoded music designed to encourage people to jump up and down.

[image: IMG_20130720_150827]

One of the most interesting things for me was being able to see how lots of children (who mostly didn’t know each other) collaborate and self-organise in a LAN game. There was quite a pattern to it with all the groups:

  1. Mess around with Minecraft as usual (make some blocks, start building a house).
  2. Find something built by someone else, destroy a few bricks.
  3. Snap out of the game to notice that the other kids are complaining.
  4. Realise that there are other people in the world – and they are sat around them!
  5. Attempt to fix the damage.

At this point other people would join in to help fix things, after which some kind of understanding would be reached between them to respect each other’s creations. This has all really inspired me to work on Al Jazari 2, which combines a lot of these ideas.

[image: IMG_20130720_084658]

Visual livecoding environments: big screenshots

Some decent-sized screenshots of Al Jazari and Scheme Bricks rendered with fluxus’s tiled frame dump command. This set includes some satisfyingly glitchy Al Jazari shots – I’m not sure what was causing this; I initially assumed it was the orthographic projection, but the same artefacts occurred in the perspective first-person robot views, so it needs further investigation.

[image: ajbig]

[image: ajbig2]

[image: ajbig5]

[image: ajbig6]

[image: schemebricks]

[image: schemebricks2]

[image: schemebricks3]

Teaching at the Düsseldorf Institute for Music and Media

Last week I was kindly invited by Julian Rohrhuber to give a couple of talks and teach a livecoding workshop alongside Jan-Kees van Kampen at the Düsseldorf Institute for Music and Media. Jan-Kees was demoing /mode +v noise, a SuperCollider chat bot installation using IRC, so it was the perfect opportunity to playtest the work-in-progress slubworld project, including the plutonian botzlang language. It also proved a good chance to try using a Raspberry Pi as a LAN game server.

[image: IMG_20130620_163002]

There wasn’t enough time to get deeply into botzlang, but we were able to test the text-to-sound code that Alex has been working on through a good sound system, along with the projection of the game world that visualises what is happening, based on the Naked on Pluto library installation:

[image: world5]

The Raspberry Pi was useful as a dedicated server that I could set up beforehand and easily plug into the institute’s wireless router. We didn’t need to worry about internet connectivity, and everyone could take part using a browser pointed at the right IP address. With access to the “superuser” commands from the Naked on Pluto game, the participants had quite a bit of fun making objects and dressing each other up in different items, later making and programming their own bots to say things that were sonified through the speakers.