Dazzlebug in Alaska

Some photos of our citizen science game Dazzlebug being exhibited at the Anchorage Museum in Alaska as part of their Camouflage: In Plain Sight exhibition, running from the 28th October 2016 to the 5th February 2017.

The bugs that have evolved throughout this exhibition (and before it) have already been sent to the researchers for processing – more info on that here soon!

Dazzlebug in Alaska


Crab camouflage citizen science game

The Natural History Museum London commissioned us to build a crab-catching camouflage game with the Sensory Ecology Group at the University of Exeter (who we’ve worked with previously on the Nightjar games and Egglab). This citizen science game is running on a touchscreen as part of the Colour and Vision exhibition, which runs through the summer. Read more about it here.

crabtitle

28457014310_4f22f34c39_o

28123741394_5420a5331f_o

28741344715_b972d1edaa_o

Pixelquipu installed at the Open Data Institute

Pixelquipu (Inca hard drives) installed at the Open Data Institute as part of their Thinking Out Loud exhibition. The work comes from the Weaving Codes/Coding with Knots project with Julian Rohrhuber at the Institut für Musik und Medien. Julian also built a sonification installation to play the quipu at different times of the day.

pixelquipu

Here’s a closeup:

bit

Weavecoding performance experiments in Cornwall

Last week the weavecoding group met at Foam Kernow for our Cornish research gathering. As we approach the final stages of the project, our discussions turn to publications and to which ideas from the start need revisiting. While they were here, I wanted to give local artists and researchers working with code and textiles a chance to meet Ellen, Emma and Alex. As we are a non-academic research organisation, I wanted to avoid the usual PowerPoint talks and coffee events and try something more informal and inclusive.

IMG_1650

One of the original ideas we had was to combine weaving and coding in a performance setting, both to make livecoding more inclusive by bringing weaving into it, and to highlight the digital thought processes involved in weaving itself. Amber made vegetarian sushi for our audience and we set up the Jubilee Warehouse with a collection of experiments from the project:

  • The newly warped table loom, with a live camera/projection from underneath the fabric as it was woven, and codes for different weaves on post-it notes for people to try.
  • The tablet/inkle loom to represent ancient weaving techniques.
  • The pattern matrix tangible weavecoding machine and Raspberry Pi.
  • A brand new experiment by Francesca with a dancemat connected to the pattern matrix software for dance code weaving!
  • The slub livecoding setup.

IMG_1634

This provided an opportunity for people to try things out, ask questions and spark discussion. Our audience consisted of craft researchers, anthropological biologists, architects, game designers and technologists – so it all went on quite a lot longer than we anticipated! Alex and I provided some slub livecoded music to weave by, and my favourite part was the live weaving projection – with more projectors we could develop this combination of code and weaving performance further. Thanks to Emma for all the videos and photos!

IMG_1692

IMG_1564

Hungry birds citizen science at the Paris Natural History Museum

Some photos of Mónica Arias running her “Hungry Birds” butterfly catching experiment at the Muséum national d’Histoire naturelle in Paris.

comp

The museum’s internet provision was a challenge, so we ran the game server on a Raspberry Pi with an ad hoc wifi network and handled the data collection ourselves. The project is concerned with analysing pattern recognition and behaviour in predators. We’re using ten different wing patterns (or morphs), assigning one at random to be the toxic one, and looking at how long it takes people to learn which ones are edible.
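As a rough illustration of that experiment design (a minimal sketch with assumed names and file format, not the actual game code), the core of it is just picking one morph to be toxic for the session and timestamping every catch so the learning curve can be reconstructed afterwards:

# Minimal sketch of the Hungry Birds session logic - the names, fields and
# CSV format are assumptions for illustration, not the project's real code.
import csv
import random
import time

MORPHS = ["morph-%d" % i for i in range(10)]    # the ten wing patterns

def start_session(log_path="session.csv"):
    toxic = random.choice(MORPHS)               # one morph is toxic at random
    start = time.time()
    log_file = open(log_path, "w", newline="")
    log = csv.writer(log_file)
    log.writerow(["elapsed_seconds", "morph", "was_toxic"])

    def record_catch(morph):
        # timestamp each catch so we can see how long it takes the player
        # to stop catching the toxic morph
        log.writerow([round(time.time() - start, 2), morph, morph == toxic])
        log_file.flush()

    return toxic, record_catch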

hungrybirds2

Picademy Exeter and Future Thinking for Social Living

Last week I had the chance to help out the Raspberry Pi Foundation at their Picademy in Exeter. It was good to meet up with Sam Aaron again to talk livecoding on Pis, and also to see how they run these events. They are designed for local teachers to get more confident with computers, programming and electronics, to the point where they can start designing their own teaching materials on the second day of the two-day course. This is a model I’m intending to use for the second INSET teacher training day I’m doing next week at Truro School – it’s pretty exciting to see the ideas they have for activities for their pupils, and a good challenge to help find ways to bring them into existence in a day.

IMG_20150605_141147

We also had the ending of Future Thinking for Social Living at the Miners Court summer party last week. We exhibited the map made during the workshops, made lots of tea, and had some fun with the pattern matrix in musical mode out in the garden – I adapted Alex’s music system, which we used with Ellen in Munich, to run on a Raspberry Pi so it didn’t require a laptop or a screen at all – simply a speaker. It was interesting how quickly people got the idea; in many ways music is easier to explain than weaving, as listening while coding is multi-sensory.

IMG_20150603_151747

Pattern matrix – putting it together

Here is a member of staff at Miners Court trying some tangible weave coding in the midst of our crafts area – at the moment it’s simply displaying the weave structure on the simulated warp-weighted loom, with a single colour each for the warp and weft threads. The next thing is to get ‘colour & weave’ patterns working.
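For a sense of what ‘colour & weave’ means, here is a minimal sketch of the general idea (the function and colour names are mine, not the pattern matrix code): the visible colour of each cell is the weft colour where the weft passes over the warp and the warp colour where it passes under, so the colour order of the threads interacts with the weave structure to produce emergent patterns.

# Minimal sketch of the colour-and-weave idea, not the project's code.
# weave[row][col] is True where the weft thread passes over the warp.
def colour_and_weave(weave, warp_colours, weft_colours):
    return [
        [weft_colours[row % len(weft_colours)] if weft_over
         else warp_colours[col % len(warp_colours)]
         for col, weft_over in enumerate(weave_row)]
        for row, weave_row in enumerate(weave)
    ]

# plain weave with dark/light alternating warp and weft gives hairline
# stripes - the building block of 'log cabin' colour-and-weave patterns
plain = [[(row + col) % 2 == 0 for col in range(8)] for row in range(8)]
for line in colour_and_weave(plain, ["dark", "light"], ["dark", "light"]):
    print(line)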

DSC_1064

The pattern matrix is the second generation of tangible programming device from the weavecoding project. It’s been built as an open hardware project in collaboration with Falmouth University’s Makernow fablab, who have designed and built the chassis using many 3D printed parts and assembled the electronics using surface mount components (far beyond my stripboard skills).

Here you can see the aluminium framework supporting the AVR-based row controller boards, with the Raspberry Pi in the corner. The Hall effect sensors detect magnetic fields – this picture was taken before any of the wiring was started.

IMG_20150408_105446

The row controllers are designed to read the sensor data and dispatch it to the Raspberry Pi using i2c serial communication running on their ATmega328 processors. This design was arrived at after the experience of building flotsam, which centralised all of the logic in the Raspberry Pi, resulting in lots of wiring to collect the 128 bits of information and pass it to the GPIO port on the Pi. Using i2c has the advantage that you only need two wires to communicate everything; processing can be distributed, and the design is far more modular and extensible in future. In fact we plan to try different sensors and configurations – so this is a great platform for experimenting with tangible programming.
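On the Pi side, polling a set of i2c slaves like this only takes a few lines. The following is a minimal sketch of the idea, assuming hypothetical slave addresses and a one-byte-per-sensor encoding rather than the project’s actual protocol:

# Minimal sketch of polling i2c row controllers from the Raspberry Pi.
# The addresses and byte layout are assumptions, not the real protocol.
import smbus

bus = smbus.SMBus(1)                        # i2c bus on the Pi's GPIO header
ROW_ADDRESSES = [0x10, 0x11, 0x12, 0x13]    # hypothetical AVR slave addresses

def read_row(address, num_sensors=5):
    # one byte per sensor, e.g. 0 = no block, 1/2 = block orientation
    return bus.read_i2c_block_data(address, 0, num_sensors)

def read_matrix():
    # the whole pattern matrix as a list of rows, one per controller
    return [read_row(address) for address in ROW_ADDRESSES]

if __name__ == "__main__":
    for row in read_matrix():
        print(row)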

This video shows the current operation of the sensors and row controllers. I’ve programmed the board with test code that displays the state of the magnetic field with the status LED, making sure that it can tell the orientation of the programming block:

The row controllers have a set of multiplexers that let you choose between 20 sensor inputs, all routed to a single analogue pin on the AVR. We’re only using digital readings here, but it means we can try totally different combinations of sensors without changing the rest of the hardware.

After getting the first couple of rows working and testing it with elderly people at our Miners Court residency, there were a couple of issues. Firstly, the magnets were really strong, and I worried about leaving it unattended with the programming blocks snapping together so violently (as we plan to use it in museum settings as well as at Miners Court). The other problem was that even with strong magnets, the placement of the blocks needed to be very precise. This is probably to do with the shape of the magnets, and the fact that the fields bend around them and reverse quite a short distance from their edges.

To fix these bugs it was a fairly simple matter to take the blocks apart, remove 2 of the 3 magnets and add some rings to guide placement over the sensors properly:

IMG_20150418_114347

Robot nightjar eggshibition at the Poly, Falmouth

As part of this year’s Fascinate festival we took over the bar at Falmouth’s Poly with visualisations of the camouflage pattern evolution process from the egglab game.

IMG_20140828_103921

tree-125-fs-part

This was a chance to do some detective work on the massive amount of genetic programming data we’ve amassed over the last few months, figure out ways to visualise it and create large prints of the egg pattern generation process. I selected family trees of eggs where mutations caused new features that made them difficult for people to spot, and thus resulted in large numbers of descendants. Then I printed examples of the eggs at different stages to see how they progressed through the generations.
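One way to find those successful family trees (a rough sketch assuming a simple parent-child data layout, not the actual analysis code) is to count descendants for every egg and pick the lineages with the most:

# Minimal sketch of ranking eggs by number of descendants.
# Assumes the data is a mapping of egg id -> parent id (None for founders).
from collections import defaultdict

def count_descendants(parent_of):
    children = defaultdict(list)
    for egg, parent in parent_of.items():
        if parent is not None:
            children[parent].append(egg)
    counts = {}
    def count(egg):
        if egg not in counts:
            counts[egg] = sum(1 + count(child) for child in children[egg])
        return counts[egg]
    for egg in parent_of:
        count(egg)
    return counts

def most_successful(parent_of, top=10):
    # eggs that were hard to spot tend to accumulate the most descendants
    counts = count_descendants(parent_of)
    return sorted(counts, key=counts.get, reverse=True)[:top]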

IMG_20140828_104022

We also ran the egglab game in the gallery on a touch screen, which happened to coincide with some great coverage in the Guardian and Popular Science, but the game kept running (most of the time) despite this.

IMG_20140829_151430

IMG_20140829_152451

IMG_20140830_112644

The Poly (or Royal Cornwall Polytechnic Society) was really the perfect place for this exhibition, with its 175-year history of promoting scientists, engineers and artists and encouraging innovation by getting them together in different ways. Today this seems very modern (and would be given one of our grand titles like ‘cross-disciplinary’), but it’s quite something to see that in a lot of ways the separation between these areas is currently bigger than it has ever been, and all the more urgent because of this. The Poly has some good claims to fame, being the first place Alfred Nobel demonstrated nitroglycerine, in 1865! Here are some pages from the 1914 report, giving a feel for what was going on a century ago amongst other radical world changes:

IMG_20140830_105017

IMG_20140830_104857

Evolving butterflies game released!

toxic

The Heliconius Butterfly Wing Pattern Evolver game is finished and ready for its debut as part of the Butterfly Evolution Exhibit at the Royal Society Summer Exhibition 2014. Read more about the scientific context on the researchers’ website, and click the image above to play the game.

The source code is here – it’s the first time I’ve used WebGL for a game, and it’s using the browser version of fluxus. It worked out pretty well, even to the extent that the researchers could edit the code themselves to add new explanation screens for the genetics. Like any production code it has niggles; here’s the function to render a butterfly:

(define (render-butterfly s)
  (with-state
   ;; set tex based on index
   (texture (list-ref test-tex (butterfly-texture s)))  
   ;; move to location
   (translate (butterfly-pos s))                        
   ;; point towards direction
   (maim (vnormalise (butterfly-dir s)) (vector 0 0 1)) 
   (rotate (vector 0 90 90))      ;; angle correctly
   (scale (vector 0.5 0.5 0.5))   ;; make smaller
   (draw-obj 4)                   ;; draw the body
   (with-state          ;; draw the wings in a new state
    (rotate (vector 180 0 0))                         
    (translate (vector 0 0 -0.5))  ;; position and angle right
    ;; calculate the wing angle based on speed
    (let ((a (- 90 (* (butterfly-flap-amount s)         
                      (+ 1 (sin (* (butterfly-speed s)  
                                   (+ (butterfly-fuzz s) 
                                      (time)))))))))
      (with-state
       (rotate (vector 0 0 a))
       (draw-obj 3))              ;; draw left wing
      (with-state
       (scale (vector 1 -1 1))    ;; flip
       (rotate (vector 0 0 a))
       (draw-obj 3))))))          ;; draw right wing

There is only immediate-mode rendering at the moment, so the transforms are not optimised, and little things need fixing – for example, draw-obj takes the id of a preloaded chunk of geometry rather than letting you specify it by name. However it works well, and the thing that was most successful was welding together the Nightjar Game Engine (HTML5 canvas) with fluxus (WebGL) and using them together. This works by having two canvas elements drawn over each other – all the 2D (text, effects and graphs) is drawn using canvas, and the butterflies are drawn in 3D with WebGL. The render loops run simultaneously, with some extra commands to get the canvas pixel coordinates of objects drawn in 3D space.
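The projection from 3D to canvas pixels mentioned above is just the standard transform chain. Here is a rough sketch of the maths (not the fluxus code) applied to a single point:

# Minimal sketch of mapping a 3D point to 2D canvas pixel coordinates -
# the same maths those extra commands need, not the actual fluxus code.
import numpy as np

def world_to_pixel(point, model, view, projection, width, height):
    # transform to clip space with the usual model/view/projection chain
    clip = projection @ view @ model @ np.append(point, 1.0)
    ndc = clip[:3] / clip[3]                 # perspective divide
    x = (ndc[0] + 1.0) * 0.5 * width         # -1..1 -> 0..width
    y = (1.0 - ndc[1]) * 0.5 * height        # flip y: canvas origin is top-left
    return x, y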