A tanglebots workshop report

I’ve tried a lot of different ways of teaching children programming, starting a few years ago with primary school children in a classroom, then running INSET training days for teachers and finally private tutoring in homes. For the finale to the weavingcodes project we are trying a new approach: teaching families about code, robotics and thread by building “tanglebots”.

The concept is to combine programming with physical objects, concentrating on sensor input and movement as output. It’s important that we incorporate our weavingcodes research process, so we deliberately set goals we don’t yet know the answers to.

The weaving focus allows us to ground the workshop in loom technology and demonstrate the challenges of manipulating thread, with its enormous history of technological development. For the first Cornwall workshop, Ellen started us off with an introduction using FoAM Kernow’s Harris loom and the fundamentals of weaving. We were also joined by Janet and Jon from lovebytes, who are helping us to run these events. When first talking about possible workshops with children, we’d discussed the impossibility of making a functional loom in a couple of hours with only broken toys and Lego – so the focus on tangling was suggested by Alex as a way to turn these difficulties to an advantage. Similarly, we created a series of prizes for different categories such as “Most technical effort with least impressive result” – inspired by Hebocon events.

The workshop format we used is also influenced by Paul Granjon’s wrekshops – wherever possible we’re recycling by pulling apart e-waste, making use of electronics, motors, gears and ideas from the surprising complexity of what’s inside things people are throwing away. This turned out to have a powerful implicit message about recycling: parents I talked to had tried taking things apart to learn about them, but the next step – making use of the parts discovered, as we were doing here – needs a bit more help.

As is normal for FoAM projects, the food was important – in this case tangled by Amber and Francesca to provide both sustenance and inspiration, with cardamom knots, spiralised courgetti and tangle fritters.

The groups ended up a bit lopsided, so in future we plan to pre-arrange them as we did on the machine wilderness workshop. In order to do that we need to ask for more information from participants beforehand such as family ages and backgrounds.

We tried using the small Pi touchscreens – these were a bit too fiddly to use without a mouse, but somehow much less oppressive than larger PC monitors – and being so small, they became incorporated into the tanglebots themselves.

Crocodile clips were the best way to connect to random/plundered electronics as well as the lego motors. These removed the need for soldering (which we had set up anyway, but in a separate space).

A selection of other notes we made:

  • Start with a manual tangling exercise (weaving with rope, tablets etc)
  • Lego has a strange all-or-nothing effect: once you start using it, everything has to work that way – avoiding it may lead to more creative options than including it
  • A first aid kit is needed for these sorts of things
  • The Pimoroni Explorer HATs are good, but needed periodic resets in some cases – the motors seemed to get jammed; not sure if this was short circuits interrupting the I2C comms?
  • The Raspberry Pi docs are riddled with minor errors, e.g. the Scratch GPIO section on the Explorer HATs has a lot of confusing typos.

All our resources are being uploaded to the kairotic github repository so other people can make use of the materials.

As well as being supported by AHRC Digital Transformations, this project was part of British Science Week, supported by the British Science Association.

Sonic Kayaks Hacklab

Part one of our two events for British Science Week was the Sonic Kayak open Hacklab with Kaffe Matthews and Dr. Kirsty Kemp. Amber has reported our findings here; this was the first time we successfully trialled the technology and ideas behind the Sonic Kayak, and in future we will be refining them into instruments for experiencing the marine world. More on that soon!

Artificially evolved camouflage

As the egglab camouflage experiment continues, here are some recent examples after 40 or so generations. If you want to take part in a newer experiment, we are currently seeing if a similar approach can evolve motion dazzle camouflage in Dazzle Bug.

Each population of eggs is being evolved against a lot of background images, so it’s interesting to see the different strategies in use – it seems like colour is one of the first things to match, often with some dazzle to break up the outline. Later as you can see in some of these examples, there is some quite accurate background matching happening.

It’s important to say that all of this is driven entirely by the perception of the tens of thousands of people playing the game – there is no analysis of the images at any point.
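The selection loop can be sketched in miniature. This is a hypothetical toy version, not the egglab code: “perception” is simulated here by colour distance to a single background colour, standing in for the real spot times gathered from players. The point it illustrates is that fitness comes only from how long an egg survives being spotted, never from analysing the images directly.

```python
import random

random.seed(1)

BACKGROUND = (110, 90, 60)  # stand-in for an average background colour

def spot_time(egg):
    """Simulated player perception: eggs closer to the background take
    longer to spot. In egglab this number comes from real click times."""
    error = sum(abs(a - b) for a, b in zip(egg, BACKGROUND))
    return 1000.0 / (1.0 + error)  # bigger = harder to spot

def mutate(egg):
    """Randomly nudge each colour channel, clamped to 0..255."""
    return tuple(min(255, max(0, c + random.randint(-20, 20))) for c in egg)

def evolve(population, generations=40):
    for _ in range(generations):
        # rank by how long the (simulated) players took to find each egg
        population.sort(key=spot_time, reverse=True)
        survivors = population[: len(population) // 2]
        # refill the population with mutated copies of the survivors
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(len(population) - len(survivors))]
    return population

start = [tuple(random.randint(0, 255) for _ in range(3)) for _ in range(20)]
evolved = evolve(list(start))
```

After 40 generations the population has drifted towards the background colour purely through this spot-time ranking, which mirrors the colour-matching-first strategy visible in the real eggs.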

Sonic Bikes to Sonic Kayaks – using puredata

When I first started working on the Sonic Bikes project with Kaffe Matthews in 2013 I had just moved to Cornwall, and I used the Penryn river for developing “The swamp that was” installation we made for Ghent. We’ve always talked about bringing this project here, but the various limitations of cycling (fast roads, stupid drivers and ridiculous hills) were always too much of a problem – so we wondered about sonic kayaks, as a distant vague idea. However now, thanks to help from the British Science Association, Feast Cornwall and the Port Eliot Festival they are fast becoming a reality!

We’re also using this opportunity to convert kayaks into instruments for sensing marine microclimates – an area which is currently lacking in scientific knowledge. In order to do this, we need to expand the sonic potential of our current system – moving it from sample playback to a more open ended synthesis approach. We’re running an open hacklab to trial the use of sensors, and actually get out on the water with Kaffe later in the month.

zones

To do all this – and keep it running on a Raspberry Pi – we’re using Pure Data. For the moment it seemed most appropriate to stick to the concept of audio zones; previously these defined areas associated with samples that would play back when you were inside them. The screenshot above is the sonic bike mapping tool, recently rebuilt by Francesca. Using Pure Data we can associate each zone with a specific patch, which leaves the use of samples (or not), effects, interpretation of sensor data and any other musical decisions completely open.
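To give a feel for the zone-to-patch association, here is a hypothetical sketch – the zone names, patch filenames and circular zone shapes are all my inventions, not the mapping tool’s actual format:

```python
import math

# Hypothetical zone table: each GPS zone triggers a Pure Data patch.
# Zones here are circles (lat, lon, radius in metres) for simplicity.
ZONES = [
    {"name": "harbour", "centre": (50.168, -5.101), "radius": 150, "patch": "harbour.pd"},
    {"name": "open-water", "centre": (50.170, -5.095), "radius": 300, "patch": "synth-drone.pd"},
]

def metres_between(a, b):
    """Equirectangular approximation - fine at zone scale."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371000 * math.hypot(x, y)

def active_patches(position):
    """Patches whose zones contain the current GPS position -
    overlapping zones simply both stay active."""
    return [z["patch"] for z in ZONES
            if metres_between(position, z["centre"]) <= z["radius"]]
```

Because a position can fall inside several zones at once, this naturally supports the layered mixing described below.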

pd

The patch above is the first version of the zone patch mixer – it reads OSC messages from the GPS map system (which is written in Lua), and when a patch is triggered it turns on audio processing for it and gradually fades it up. When the zone is left it fades the patch down and deactivates it – this way we can have multiple overlaid patches, much like the sample mixing we used before. We can also have loads of different patches: as only the active ones are processed at any one time, it won’t stress the Raspberry Pi too much.
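The same enter/fade/exit logic, written out as a Python sketch rather than Pd (the OSC paths and names are hypothetical – the real mixer lives in the patch above):

```python
FADE_STEP = 0.1  # gain change per control-rate tick

class ZoneMixer:
    def __init__(self):
        # per-patch state: fade target, current gain, DSP on/off
        self.patches = {}

    def osc_message(self, path, zone):
        """Handle a zone enter/exit message from the GPS system."""
        p = self.patches.setdefault(zone, {"target": 0.0, "gain": 0.0, "dsp": False})
        if path == "/zone/enter":
            p["dsp"] = True      # switch audio processing on first...
            p["target"] = 1.0    # ...then fade the patch up
        elif path == "/zone/exit":
            p["target"] = 0.0    # fade down; DSP is cut once silent

    def tick(self):
        """One control-rate step: move each gain toward its target."""
        for p in self.patches.values():
            if p["gain"] < p["target"]:
                p["gain"] = min(p["target"], p["gain"] + FADE_STEP)
            elif p["gain"] > p["target"]:
                p["gain"] = max(p["target"], p["gain"] - FADE_STEP)
                if p["gain"] == 0.0:
                    p["dsp"] = False  # deactivate once fully faded out
```

Only patches with DSP switched on cost any processing, which is what keeps a large library of zone patches cheap on the Pi.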

I’ve been testing this today by walking around a lot with headphones on – this is a GPS trace, which gives some idea of the usual problems with GPS (I didn’t actually switch to a kayak halfway through, although it thought I did).

trace
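That apparent switch to a kayak is a classic GPS outlier; one cheap way to catch it is to reject fixes that imply impossible speeds since the last good fix. A minimal sketch – the function names and the 3 m/s walking threshold are my assumptions, not part of the actual system:

```python
import math

def metres_between(a, b):
    """Equirectangular approximation - fine over short hops."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371000 * math.hypot(x, y)

def filter_fixes(fixes, max_speed=3.0):
    """Drop any (time, lat, lon) fix implying we moved faster than
    max_speed (m/s) since the last accepted fix - fast walking pace."""
    accepted = [fixes[0]]
    for t, lat, lon in fixes[1:]:
        t0, lat0, lon0 = accepted[-1]
        dt = t - t0
        if dt > 0 and metres_between((lat0, lon0), (lat, lon)) / dt <= max_speed:
            accepted.append((t, lat, lon))
    return accepted
```

For a kayak the threshold would just be raised to paddling speed – the trade-off is that a threshold too close to real speeds starts eating genuine fixes on downhill stretches.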