Pixelquipu installed at the Open Data Institute

Pixelquipu Inca Harddrives installed at the Open Data Institute (Weaving Codes/Coding with Knots, with Julian Rohrhuber at the Institut Fuer Musik und Medien), as part of their Thinking out Loud exhibition. Julian also built a sonification installation to play the quipu at different times of the day.

pixelquipu

Here’s a closeup:

bit

A tanglebots workshop report

I’ve tried a lot of different ways of teaching children programming, starting a few years ago with primary school children in a classroom, then doing inset training days for teachers and finally private tutoring in homes. For the finale to the weavingcodes project we are trying a new approach, teaching families about code, robotics and thread by building “tanglebots”.

The concept is to combine programming with physical objects, concentrating on sensor input and movement as output. It’s important that we incorporate our weavingcodes research process, so we deliberately set goals we don’t yet know the answers to.

The weaving focus allows us to ground the workshop in loom technology and demonstrate the challenges of manipulating thread, with its enormous history of technological development. For the first Cornwall workshop, Ellen started us off with an introduction using FoAM Kernow’s Harris loom and the fundamentals of weaving. We were also joined by Janet and Jon from lovebytes who are helping us to run these events. When first talking about possible workshops with children, we’d discussed the impossibility of making a functional loom in a couple of hours with only broken toys and lego – and so the focus on tangling was suggested by Alex as a way to turn these difficulties to an advantage. Similarly we created a series of prizes for different categories such as “Most technical effort with least impressive result” – inspired by hebocon events.

The workshop format we used is also influenced by Paul Granjon’s wrekshops – wherever possible we’re recycling by pulling apart e-waste, making use of electronics, motors, gears and ideas from the surprising complexity of what’s inside things people are throwing away. This turned out to have a powerful implicit message about recycling: parents I talked to had tried taking things apart to learn about them, but the next step – making use of the parts discovered, as we were doing here – needs a bit more help.

As is normal for FoAM projects, the food was important too – in this case tangled by Amber and Francesca to provide both sustenance and inspiration, with cardamom knots, spiralised courgetti and tangle fritters.

The groups ended up a bit lopsided, so in future we plan to pre-arrange them as we did on the machine wilderness workshop. In order to do that we need to ask for more information from participants beforehand such as family ages and backgrounds.

We tried using the small Pi touchscreens – these were a bit too fiddly to use without a mouse, but are somehow much less oppressive than larger PC monitors – and because they are so small, they became incorporated into the tanglebots themselves.

Crocodile clips were the best way to connect to random/plundered electronics as well as the lego motors. These removed the need for soldering (which we had set up anyway, but in a separate space).

A selection of other notes we made:

  • Start with a manual tangling exercise (weaving with rope, tablets etc)
  • Lego has a strange all-or-nothing effect: once you start using it, everything has to work that way – avoiding it may lead to more creative options than including it
  • A first aid kit is needed for these sorts of things
  • The Pimoroni Explorer Hats are good but needed periodic resets in some cases – the motors seemed to get jammed; not sure if this was short circuits interrupting the i2c comms
  • The Raspberry Pi docs are riddled with minor errors, e.g. the Scratch GPIO section on the explorer hats has a lot of sometimes confusing typos.

All our resources are being uploaded to the kairotic github repository so other people can make use of the materials.

As well as being supported by AHRC Digital Transformations, this project was part of British Science Week, supported by the British Science Association.

Sonic Bikes to Sonic Kayaks – using puredata

When I first started working on the Sonic Bikes project with Kaffe Matthews in 2013 I had just moved to Cornwall, and I used the Penryn river for developing “The swamp that was” installation we made for Ghent. We’ve always talked about bringing this project here, but the various limitations of cycling (fast roads, stupid drivers and ridiculous hills) were always too much of a problem – so we wondered about sonic kayaks, as a distant vague idea. However now, thanks to help from the British Science Association, Feast Cornwall and the Port Eliot Festival they are fast becoming a reality!

We’re also using this opportunity to convert kayaks into instruments for sensing marine microclimates – an area currently lacking in scientific knowledge. In order to do this, we need to expand the sonic potential of our current system – moving it from sample playback to a more open ended synthesis approach. We’re running an open hacklab to trial the use of sensors, and to actually get out on the water with Kaffe later in the month.

zones

To do all this – and keep it running on a Raspberry Pi – we’re using Pure Data. For the moment it seemed most appropriate to stick to the concept of audio zones: previously these defined areas associated with samples that would play back when you were inside them. The screenshot above is the sonic bike mapping tool, recently rebuilt by Francesca. Using Pure Data we can associate each zone with a specific patch, which leaves the use of samples (or not), effects, interpretation of sensor data and any other musical decisions completely open.

pd

The patch above is the first version of the zone patch mixer – it reads OSC messages from the GPS map system (which is written in Lua) and when a patch is triggered, it turns on audio processing for it and gradually fades it up. When the zone is left it fades it down and deactivates it – this way we can have multiple overlaid patches, much like the sample mixing we used before. We can also have loads of different patches: as only the active ones are processed at any one time, it won’t stress the Raspberry Pi too much.
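
The Pd patch itself doesn’t translate well to text, but here is a rough Python sketch of the same logic for reference – the zone shapes, fade rate and patch names here are placeholders rather than anything from the real system, which splits this work between Lua (GPS side) and Pure Data (audio side).

# Illustrative sketch only: circular zones, a per-zone fade level, and audio
# processing that is only "switched on" while a zone is active.

FADE_STEP = 0.05  # how much to fade per update tick (assumption)

class ZonePatch:
    def __init__(self, name, centre, radius):
        self.name = name        # which patch this zone triggers
        self.centre = centre    # (lat, lon) of the zone centre
        self.radius = radius    # crude circular zone, in degrees
        self.active = False     # is audio processing switched on?
        self.volume = 0.0       # current fade level, 0..1

    def contains(self, lat, lon):
        dx, dy = lat - self.centre[0], lon - self.centre[1]
        return dx * dx + dy * dy < self.radius * self.radius

    def update(self, lat, lon):
        inside = self.contains(lat, lon)
        if inside and not self.active:
            self.active = True              # trigger: switch on processing
        target = 1.0 if inside else 0.0     # fade up inside, down outside
        if self.volume < target:
            self.volume = min(target, self.volume + FADE_STEP)
        elif self.volume > target:
            self.volume = max(target, self.volume - FADE_STEP)
        if not inside and self.volume == 0.0:
            self.active = False             # deactivate once faded out

zones = [ZonePatch("drones", (50.168, -5.102), 0.0010),
         ZonePatch("birdsong", (50.169, -5.104), 0.0015)]

def gps_tick(lat, lon):
    # called for every GPS reading; only active patches cost any processing
    for zone in zones:
        zone.update(lat, lon)
    return [(zone.name, round(zone.volume, 2)) for zone in zones if zone.active]

print(gps_tick(50.168, -5.102))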

I’ve been testing this today by walking around a lot with headphones on – this is a GPS trace, which gives some idea of the usual problems with GPS (I didn’t actually switch to a kayak halfway through, although it thought I did).

trace

Tanglebots workshop preparation

It’s workshop time again at Foam Kernow. We’re running a Sonic Kayak development open hacklab with Kaffe Matthews (more on this soon) and a series of tanglebots workshops which will be the finale to the weavingcodes project.

Instead of using my cobbled together homemade interface board, we’re using the pimoroni explorer hat (pro). This comes with some nice features, especially a built in breadboard but also 8 touch buttons, 4 LEDs and two motor drivers. The only downside is that it uses the same power source as the Pi for the motors, so you need to be a little careful as it can reset the Pi if the power draw is too great.

2hat

We have a good stock of recycled e-waste robotic toys we’re going to be using to build with (along with some secondhand lego mindstorms):

toys

Also lots of recycled building material from the amazing Cornwall Scrap Store.

material

In order to keep the workshop balanced between programming and building, and fun for all age groups, we want to use Scratch – rather than getting bogged down with python or similar. In a big improvement over previous versions of the Pi OS, the recent raspbian version (jessie) supports lots of extension hardware including the explorer hat. Things like firing the built in LEDs work ‘out of the box’ like this:

2scratch-led

While the two motor controllers (with speed control!) work like this:

2scratch-motor

The touch buttons were a bit harder to get working as they are not supported by default, so I had to write a Scratch driver to do this, which you can find here. Once the driver script is running (it launches automatically from the desktop icon), it communicates with Scratch over the network and registers the 8 touch buttons as sensors. This means they can be accessed in Scratch like so:

2scratch-touch
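
For anyone curious about how the driver side of this works: Scratch 1.4’s “remote sensors” feature listens on TCP port 42001 and accepts length-prefixed sensor-update messages. Below is a stripped down Python sketch of that part – read_touch_state() is a placeholder standing in for the actual Explorer HAT touch reading, so don’t take the details as the real driver.

import socket
import struct
import time

def send_sensor_update(sock, name, value):
    # Scratch remote sensor messages are prefixed with a 4 byte big-endian length
    msg = 'sensor-update "%s" %d' % (name, value)
    data = msg.encode("utf-8")
    sock.sendall(struct.pack(">I", len(data)) + data)

def read_touch_state():
    # placeholder: the real driver reads the Explorer HAT's touch chip here
    return [False] * 8

def main():
    sock = socket.create_connection(("127.0.0.1", 42001))
    previous = [None] * 8
    while True:
        state = read_touch_state()
        for i, pressed in enumerate(state):
            if pressed != previous[i]:
                # registers/updates sensors named touch1..touch8 inside Scratch
                send_sensor_update(sock, "touch%d" % (i + 1), int(pressed))
        previous = state
        time.sleep(0.05)

if __name__ == "__main__":
    main()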

Red King: Host/Parasite co-evolution citizen science

A new project begins, on the subject of ecology and evolution of infectious disease. This one is a little different from a lot of Foam Kernow’s citizen science projects in that the subject is theoretical research – and involves mathematical simulations of populations of co-evolving organisms, rather than the direct study of real ones in field sites etc.

The simulation, or model, we are working with is concerned with the co-evolution of parasites and their hosts. Just as in more commonly known simulations of predators and prey, there are complex relationships between hosts and parasites – for example if parasites become too successful and aggressive the hosts start to die out, in turn reducing the parasite populations. Hosts can evolve to resist infection, but this has an overhead that starts to become a disadvantage when most of a population is free of parasites again.

Example evolution processes with different host/parasite trade-offs.
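
As a rough illustration of that feedback loop (and definitely not the model used in the project), here is a toy simulation where each host carries a heritable resistance value that makes infection less deadly but reproduction slightly less likely, while parasite pressure rises and falls with the number of hosts around:

import random

def generation(hosts, pressure, cost=0.3, mutation=0.02):
    # infection and death: resistant hosts are harder to kill
    survivors = [r for r in hosts if random.random() > pressure * (1.0 - r)]
    # reproduction: resistance carries an overhead
    offspring = [min(1.0, max(0.0, r + random.gauss(0.0, mutation)))
                 for r in survivors if random.random() < 1.0 - cost * r]
    population = survivors + offspring
    # crude parasite feedback: pressure tracks host availability
    pressure = min(0.95, max(0.05, pressure * len(population) / max(1, len(hosts))))
    return population, pressure

hosts, pressure = [0.1] * 300, 0.4
for gen in range(40):
    hosts, pressure = generation(hosts, pressure)
    if not hosts:
        print("hosts died out at generation", gen)
        break
    print("gen %2d: hosts=%4d pressure=%.2f mean resistance=%.2f"
          % (gen, len(hosts), pressure, sum(hosts) / len(hosts)))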

Over time these relationships shift and change, and this happens in different patterns depending on the starting conditions. Little is known about the categorisation of these patterns, or even the range of relationships possible. The models used to simulate them are still a research topic in their own right, so in this project we are hoping to explore different ways people can both control a simulation (perhaps with an element of visual live programming) and experience the results in a number of ways – via sonification, or a game world. The eventual, ambitious aim is to provide a way for people to feed their discoveries back into the research.

sketch

Pixel Quipu

The graphviz visualisations we’ve been using for quipu have quite a few limitations: they tend to make very large images, and there is limited control over how they are drawn. It would be better to have more of an overview of the data, and to render the knots in their correct positions with the pendants at the right lengths.

Meet the pixelquipu!

ur018

These are drawn using a python script which reads the Harvard Quipu Database and renders the quipu structure using the correct colours. The knots are shown as a single pixel attached to the pendant, colour coded red for a single knot, green for a long knot and blue for a figure-of-eight knot (yellow means unknown or missing). The value of the knot sets the brightness of the pixel. The colour variations for the pendants are working, but there is no difference yet between twisted and alternating colours, and twist direction is not visualised.
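
The knot-to-pixel mapping is simple enough to sketch – something along these lines, where the knot type names and the brightness scaling are assumptions rather than the exact values the script uses:

def knot_pixel(knot_type, value, max_value=9):
    # colour code: red = single knot, green = long knot, blue = figure of eight
    base = {"single": (255, 0, 0),
            "long": (0, 255, 0),
            "figure-eight": (0, 0, 255)}
    colour = base.get(knot_type, (255, 255, 0))   # yellow = unknown or missing
    # the value of the knot sets the brightness of the pixel
    brightness = max(0.2, min(1.0, value / float(max_value)))
    return tuple(int(channel * brightness) for channel in colour)

print(knot_pixel("long", 4))      # a mid-brightness green pixel
print(knot_pixel("unknown", 0))   # a dim yellow pixel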

hp017

Another advantage of this form of rendering is that we can draw the data entropy within the quipu, providing a different view of how the data is structured and an attempt to uncover hidden complexity. This is done hierarchically, so a pendant’s entropy is that of its own data plus all its sub-pendants – which seemed most appropriate given the non-linear form that the data takes.

ur037

e-ur037
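
A sketch of that hierarchical entropy measure: a pendant’s value set is its own knot values plus everything hanging below it, and the Shannon entropy is taken over the whole set (the nested-dict structure here is just for illustration).

from math import log2
from collections import Counter

def gather_values(pendant):
    # a pendant's value set is its own knot values plus everything below it
    values = list(pendant.get("knots", []))
    for sub in pendant.get("pendants", []):
        values += gather_values(sub)
    return values

def entropy(pendant):
    # Shannon entropy over the gathered values, in bits
    values = gather_values(pendant)
    if not values:
        return 0.0
    counts = Counter(values)
    total = float(len(values))
    return -sum((c / total) * log2(c / total) for c in counts.values())

example = {"knots": [1, 2, 2],
           "pendants": [{"knots": [5]},
                        {"knots": [2, 2], "pendants": []}]}
print(round(entropy(example), 3))   # about 1.252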

We can now look at some quipus in more detail – what was the purpose of the red and grey striped pendants in the quipu below? They contain no knots – are they markers of some kind? This also seems to be a quipu where the knots do not follow the decimal coding pattern that we understand; they are mostly long knots of various values.

ur051

There also seems to be data stored in different kinds of structure in the same quipu – the collection of sub-pendants on the left side presumably groups data in a more hierarchical manner than the right side, which seems much more linear – and a colour change emphasises this too.

ur015

Read left to right, this long quipu below seems very much like you’d expect binary data to look – some kind of header information or preamble, followed by a repeating structure with local variation. The twelve groups of eight grey pendants seem redundant – were these meant to be filled in later? Did they represent something important without containing any knots? We will probably never know.

UR1176

The original thinking behind the pixelquipu was to attempt to fit all the quipus on a single page for viewing, as it represents them with the absolute minimum pixels required. Here are both pendant colour and entropy shown for all 247 quipu we have data for:

all

entropy-local

The UAV toolkit & appropriate technology

The UAV toolkit’s second project phase is now complete. The first development sprint at the start of the year was a bit of research into what we could use an average phone’s sensors for, resulting in a proof-of-concept remote sensing android app that let you visually program different scripts, which we then tested on some drones, a radio controlled plane and a kite.

This time we had a specific focus on environmental agencies: working with Katie Threadgill at the Westcountry Rivers Trust has meant we’ve had to think about how this could be used by real people in an actual setting (farm advisors working with local farmers). Making something cheap, open source and easy to use, yet open ended, has been the focus – and we are now looking at providing WRT with a complete toolkit comprising a drone (for good weather), a kite (for bad weather/no flight licences required) and an android phone, so they don’t need to worry about destroying their own if something goes wrong. Katie has produced this excellent guide on how the app works.

The idea of appropriate technology has become an important philosophy for projects we are developing at Foam Kernow, in conjunction with unlikely connections in livecoding and our wider arts practice. For example the Sonic Bike project – where from the start we restricted the technology so that no ‘cloud’ network connections are required and all the data and hardware has to fit on the bike – with no data “leaking” out.

With the UAV toolkit the open endedness of providing a visual programming system that works on a touchscreen results in an application that is flexible enough to be used in ways and places we can’t predict. For example in crisis situations, where power, networking or hardware is not available to set up remote sensing devices when you need them most. With the UAV toolkit we are working towards a self contained system, and what I’ve found interesting is how many interface and programming ‘guidelines’ I have to bend to make this possible – open endedness is very much against the grain of contemporary software design philosophy.

The “app ecosystem” is ultimately concerned with elevator pitches – to do one thing, and boil it down to the least actions possible to achieve it. This is not a problem in itself, but the assumption that this is the only philosophy worth consideration is wrong. One recent experience that comes to mind is having to make and upload banner images of an exact size to the Play Store before it would allow me to release an important fix needed for Mongoose 2000 – an app only ever intended to have 5 or 6 users.

For the UAV toolkit, our future plans include stitching together photos captured on the phone and producing a single large map without the need to use any other software on a laptop. There are also interesting possibilities regarding distributed networking with bluetooth and similar radio systems – for example sending code to different phones is needed, as currently there is no way to distribute scripts amongst users. This could also be a way of creating distributed processing – controlling one phone in a remote location with another via code sent by adhoc wifi or SMS for example.

Airborne drag-drop programming, the next steps

This autumn we are continuing work on the UAV toolkit with Karen Anderson and her research group at the Environment and Sustainability Institute. This time we have a mission to help the Westcountry Rivers Trust by coming up with fast and cheap ways they can build maps of farms to determine water run-off problems, which gives farmers proof they need to get funding to fix pollution issues.

Flight planning operations and photo checking

In order to make the software usable in this case, we decided on two directions. On the one hand there needs to be a simple way to start and stop programs (or “flight modes”) that read sensor data, as well as a way to define certain global settings, e.g. flight altitude, desired image coverage etc. At the same time, the code defining what these modes do needs to remain programmable in the app – and more complex behaviours need to be possible, to support both kites and UAVs. Our philosophy is that it has to be open ended, as we don’t know where the toolkit might be useful (e.g. crisis mapping situations) or what new sensors will be available on a device in the future.

The new main screen

One specific set of new behaviours we need is for kite mapping. We already have the ability to choose when to take pictures based on GPS and altitude, but with a kite there can be lots of turbulence and the camera is in a much less controlled state, flipping around taking shots of the sky etc. So we need to calculate things like jerk from change in acceleration and use orientation sensors to only take photos when the lens is pointing directly down, within some degree of acceptable margin.

Below is a section of the code that calculates whether we are pointing down, using the magnetometer and accelerometer – the drag-drop visual code can now be used to build normal Scheme functions using a touchscreen (a bit like scheme bricks). In fact I managed to do all of this work on the phone. There are now two types of code: the main programs or “flight modes” that you can run from the front screen, and a library of editable functions which they use. This means there are now three levels at which the software can be used – using it without needing to see any of the code at all, editing the basic behaviour such as which sensors’ data are captured, and finally modifying the more detailed code to make it do completely new things.

Screenshot_2015-10-19-22-08-12
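
The screenshot doesn’t reproduce well as text, so here is a rough Python sketch of the same kind of checks (the app itself builds them as Scheme functions in the drag-drop editor). It only uses the accelerometer, and the axis convention, angle margin and jerk threshold are assumptions:

import math

ANGLE_MARGIN = 15.0     # degrees of tilt we will accept (assumption)
MAX_JERK = 3.0          # m/s^3, skip shots during violent movement (assumption)

def pointing_down(ax, ay, az, margin=ANGLE_MARGIN):
    # true if gravity lies mostly along the device z axis (lens facing the ground)
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag == 0.0:
        return False
    tilt = math.degrees(math.acos(min(1.0, abs(az) / mag)))
    return tilt < margin

def jerk(prev_accel, accel, dt):
    # magnitude of the change in acceleration over time
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(accel, prev_accel))) / dt

# e.g. hanging roughly level under the kite, barely moving:
print(pointing_down(0.3, -0.5, 9.7))                              # True
print(jerk((0.3, -0.5, 9.7), (0.4, -0.4, 9.8), 0.1) < MAX_JERK)   # True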

Coding with knots: Inca Quipu

This week I’m teaching at the IMM Düsseldorf with Julian Rohrhuber, which has given me a chance to follow up a bit on Inca Quipu coding with knots – a dangling thread from the weavecoding project. Quipu are how the Incas organised their society, as they had no written texts or money – things like exchanges (for example from their extensive store houses) were recorded via knots. Researchers have been able to decode the basic numeric system they used, but 20% of the quipu seem to follow a different set of rules, along with extra information encoded via thread material, twist direction, colour and other knot differences. I’ve written a python program for converting the Khipu Database Project excel charts into graphviz files for visualising:

quipu5

The knots are described in ascii art, with S and Z relating to the ply and knot ‘handedness’ direction they are tied in:

O : a single knot 
O/O : two single knots tied in S direction (it's rotated 90 degrees :)
(\\\\) : a long knot of value '4' tied in the Z direction
/8 : end (figure of 8) knot tied S direction

The pendant nodes also have labels describing their ply direction and the side they attach on, so “S R” means S ply and recto attached.
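
As a rough guide to reading the notation back into data, here is a small Python sketch – the parsing rules are my reading of the ascii art above (runs of O for single knots, slashes for direction, brackets of slashes for long knots, /8 or \8 for the end knot), so treat the details as assumptions:

def parse_knot(text):
    if text.startswith("(") and text.endswith(")"):
        body = text[1:-1]
        # long knot: the number of slashes is the value, their lean is the direction
        return {"type": "long", "value": len(body),
                "direction": "Z" if "\\" in body else "S"}
    if "8" in text:
        # end (figure of eight) knot
        return {"type": "figure-of-eight",
                "direction": "S" if text.startswith("/") else "Z"}
    # otherwise a run of single knots, e.g. "O", "O/O", "O\O\O"
    return {"type": "single", "value": text.count("O"),
            "direction": "S" if "/" in text else ("Z" if "\\" in text else "?")}

for knot in ["O", "O/O", r"(\\\\)", "/8"]:
    print(knot, "->", parse_knot(knot))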

The hardest part of this has been a bit of more recent media archaeology to figure out the colour values: I’ve had to cross-reference the original Ascher-Ascher Quipu Databooks, published in 1978, which contain their own colour system that more or less maps to the NBS-ISCC Munsell colour chart originally proposed in 1898. Luckily that site provides hex colour values – hopefully they are vaguely accurate. The current lookup table is here:

colour_lookup = {
    "W":  "#777777",
    "SR": "#BF2233",
    "MB": "#673923",
    "GG": "#575E4E",
    "KB": "#35170C",
    "AB": "#A86540",
    "HB": "#5A3D30",
    "RL": "#AA6651",
    "BG": "#4A545C",
    "PG": "#8D917A",
    "B":  "#7D512D",
    "0B": "#64400F",
    "RM": "#AB343A",
    "PR": "#490005",
    "FR": "#7F180D",
    "DB": "#4D220E",
    "YB": "#BB8B54",
    "MG": "#817066",
    "GA": "#503D33"
}
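
For example, when writing out the graphviz nodes for each pendant the codes just get swapped for their hex values – the node name and grey fallback here are illustrative:

def pendant_node(name, colour_code):
    # fall back to grey for any colour code the table doesn't know about yet
    fill = colour_lookup.get(colour_code, "#777777")
    return '%s [style=filled, fillcolor="%s"]' % (name, fill)

print(pendant_node("pendant_12", "MB"))   # pendant_12 [style=filled, fillcolor="#673923"]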

Tangible programming: detecting flip, rotation and id with magnets

When we started designing the pattern matrix we wanted to include the possibility of encoding more than binary (which side is up) using the magnets. In order to test this, we made the bottom row of sensors with 4 in a square – the rest only have one sensor currently (to avoid blowing the budget on hall effect sensors).

Here are some test blocks with four magnets glued on. The one at the back was easy to make, as the magnets naturally snap together edge to edge in this pattern; the closer one required superglue and lots of patience – I’m still expecting it to fire a magnet off unexpectedly at some point:

IMG_20150429_153706

The orientation seems to work well in our tests so far: as you rotate the blocks, the sensors latch from one state to the other – and it seems like they stick to their previous reading until the block is very nearly aligned straight. I’ve added some sound on the Pi to give audible feedback, which is turning out to be very useful.

The next job was to head back to makernow to make some better blocks with the magnets inside. Oliver Hatfield milled out new holes in some of our spares:

IMG_20150501_114719

Luckily the fit is really tight so with some force the magnets can be placed inside without the need for any gluing – and they don’t rattle around at all:

IMG_20150501_115947

The next thing was to make some visual indication of the polarity and meaning of the patterns, and show how the binary encoding changes with flipping and rotating. Andy Smith designed and laser engraved these new caps and locating rings:

IMG_20150501_180731

The 4 bit binary codes are read in clockwise order from the top left (the same as the notation for tablet weaving), so rotation has the same effect as a bitwise rotate in programming – multiply/divide by 2 with overflow. There are 4 possible different configurations of magnets (which can provide block identification). Two of the configurations are mirrored on both sides but still let you read rotation; with the other two you can also tell which side is up, and one – bottom left in the photo below – can represent 8 states all by itself (flip as well as rotate).
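
In code the rotation reads like this – a minimal sketch of the 4 bit rotate and the readings a single block produces as it turns (flip detection depends on which of the four magnet configurations is used, so it isn’t shown here):

def rotate(code):
    # turning the block 90 degrees clockwise: multiply by 2 with overflow
    return ((code << 1) | (code >> 3)) & 0b1111

def readings(code):
    # all the values a block can produce by rotation alone
    out = []
    for _ in range(4):
        out.append(code)
        code = rotate(code)
    return out

print(readings(0b0001))   # [1, 2, 4, 8] - a single flipped magnet
print(readings(0b0101))   # [5, 10, 5, 10] - only two distinct readings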

In future we’ll make more of these with specific meanings dependent on the language we use them for and what they actually do – at this point they are for debugging and further experimentation.

IMG_20150502_124856