Pattern Matrix at Algomech (part 1)

I’m writing this on the train, with a slightly sleep-deprived brain fizzing and popping with thoughts, ideas and conversations from this year’s Algomech festival in Sheffield. The Penelope project took a significant role in the festival, with the group’s participation in the Unmaking Symposium, the exhibition and also the testing of our latest weavecoding technology at the Algorave. I’ll be writing more on the algorave in a subsequent post.

During the symposium we discussed the critical, liberating and potentially dangerous aspects of Unmaking in a wide variety of contexts – from reverse engineering knitwear and classical Greek dance to discovering the untapped abilities of classical musical instruments when human limitations become a secondary consideration. The symposium also provided us with a good opportunity to take stock of our own group’s current directions and thinking in regard to the Penelope project.

We also had our own corner of the Algomech exhibition, which included Jacquard woven experiments, the Quipu sonification and visualisation, and the first public trials of the new pattern matrix V2.

exhib

As is our usual practice, we used this exhibition to get essential feedback on our new design for the pattern matrix, as well as the interpretation of what we are doing – being there in person talking to people allows you to very quickly determine what works, and adapt the focus based on the responses you get.

We discussed the long view on digital technology, the role of weavers in the foundations of western mathematics, and simply the practicalities of the technology we are developing – the constant stream of visitors represented a wide range of different ages and backgrounds. These aspects of the pattern matrix seemed significant (in no particular order):

  • The construction technique was immediately interesting to most people, specifically the material (spalted beech wood) and the open frame construction.
  • Related to this, there seems to be little association of the pattern matrix with a ‘device’ in the sense of a ‘gizmo’ – which is interesting, as the Raspberry Pi and other PCBs are clearly visible. We noticed this to some extent with the previous version, but with the wooden construction the effect is much more pronounced.
  • It is seen as being game-like, e.g. a “70s educational toy”. There is an expectation that it is something to be ‘played’, and similarly its potential as a musical sequencer is a common observation.
  • The understandability of the magnet sensing seems a key ingredient. There is no other hidden magic like computer vision or RFID involved, and polarity and digital arrangements of magnets are easily explained and experienced by holding the tokens together (see the sketch after this list).
  • Having some extra circuit boards and wood-cut parts to hand (originally intended as backups) was great for people who wanted to know more about what was going on. In future we should also have the token block parts to show.
  • The different shaped blocks were immediately appealing – they seemed to invite experimentation more than flipping the binary tokens to alternate them. To follow this up we need to investigate ways to use different shapes to configure thread colour at the same time as structure, in a better way than we do now. The black/white sides could define structure while the shape could correspond to the colour of the specific thread. It also indicates that using shaped tokens as instructions for tablet weaving is worth experimenting with quite soon.
  • Younger children focused on the blocks alone (and of course tried all sorts of things no one else did, like stacking them), but slightly older children worked out they were having an effect on the weaving process and could generally work it out patiently for themselves without any explanation required.
  • Having Anni Albers’ ‘On Weaving’ book next to the pattern matrix helped with older visitors, perhaps representing a more conventional and authoritative source of information to introduce the concepts of notation, structure and pattern in weaving.
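Since the magnet sensing came up so often, here is a minimal Python sketch of how a magnet-based token might be read, assuming one polarity reading (0/1) per magnet. The actual sensor layout and encoding in the pattern matrix may well differ – the helper names and numbers here are invented purely for illustration:

# A toy decode of a token's magnet polarities into a number, plus a guess
# at which face is up. Flipping a token over reverses the polarity each
# sensor sees, which is what makes the scheme so easy to demonstrate by hand.

def decode_token(polarities):
    """Pack a tuple of polarity bits (1 = north up, 0 = south up) into an ID."""
    value = 0
    for bit in polarities:
        value = (value << 1) | (1 if bit else 0)
    return value

def is_flipped(polarities):
    """Compare a reading with its inverse to decide which face is showing."""
    inverse = tuple(1 - b for b in polarities)
    return decode_token(polarities) < decode_token(inverse)

print(decode_token((1, 0, 1, 1)))  # 11
print(is_flipped((1, 0, 1, 1)))    # False
print(is_flipped((0, 1, 0, 0)))    # True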
An 8-way tangible colour switching instruction

Physical vs digital – a false dichotomy

More general concepts that came up in conversation included a common theme during Algomech, exploring the inescapably fuzzy boundaries of concepts such as digital, physical and analogue. The myth of the “real world” being analogue and the “virtual world” being digital is a troublesome one to a weaver.

The ‘anti-device’ effect of the pattern matrix has the potential to explore this conundrum, as it represents a seemingly acceptable demonstration of the physical nature of the digital, and that forms of digital technology have inhabited the world of the reassuringly physical for many millennia of human invention.

Check your supply chains

One aspect of the pattern matrix I picked up on for the symposium, and which came up in the exhibition as well, was the fact that the beech wood came from a single tree in Cornwall, courtesy of Aaron Moore. I used this as an example of our design philosophy of taking on the myth of collapse rather than the myth of infinite abundance.

Feedback from Penelope team members

A 5×5 grid was initially thought to be excessive, as most ancient weaves can be expressed with a 4×4 matrix. The larger grid turns out to be more important than we first thought, as it means you can demonstrate the importance of odd and even numbers in the mathematics of weaving.

There is a problem with the single colour change block, due to faulty reuse of code from the old version. This has always been a somewhat temporary feature, so we should sort it out properly (e.g. using other shapes for colour across the matrix) before we use it next.

We can also try using an augmented reality approach to show the weave structure directly over the grid, so it’s easier to see how the token block changes relate to specific crossings. This could be displayed alongside the current warp weighted loom rather than as a replacement for it.

Red King: Host/Parasite co-evolution citizen science

A new project begins, on the subject of ecology and evolution of infectious disease. This one is a little different from a lot of Foam Kernow’s citizen science projects in that the subject is theoretical research – and involves mathematical simulations of populations of co-evolving organisms, rather than the direct study of real ones in field sites etc.

The simulation, or model, we are working with is concerned with the co-evolution of parasites and their hosts. Just as in more commonly known simulations of predators and prey, there are complex relationships between hosts and parasites – for example if parasites become too successful and aggressive the hosts start to die out, in turn reducing the parasite populations. Hosts can evolve to resist infection, but this has an overhead that starts to become a disadvantage when most of a population is free of parasites again.
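To make that feedback loop a little more concrete, here is a toy Python sketch of this kind of dynamic. It is not the model we are actually working with in the project – all of the parameter names and values are invented for illustration only:

# A crude host/parasite co-evolution loop: infection knocks hosts back,
# parasites need infections to reproduce, and host resistance creeps up when
# infection is common but carries a reproductive cost. All numbers invented.

def step(hosts, parasites, resistance, dt=0.1):
    infection = 0.001 * hosts * parasites * (1.0 - resistance)
    host_growth = 0.5 * hosts * (1.0 - hosts / 1000.0) - 0.3 * hosts * resistance
    hosts += dt * (host_growth - infection)
    parasites += dt * (0.5 * infection - 0.4 * parasites)
    # resistance pays off when infections per host are common, costs otherwise
    resistance += dt * 0.05 * (infection / max(hosts, 1.0) - 0.02)
    resistance = min(max(resistance, 0.0), 1.0)
    return max(hosts, 0.0), max(parasites, 0.0), resistance

hosts, parasites, resistance = 500.0, 50.0, 0.1
for t in range(2000):
    hosts, parasites, resistance = step(hosts, parasites, resistance)
    if t % 200 == 0:
        print(f"t={t} hosts={hosts:.0f} parasites={parasites:.0f} resistance={resistance:.2f}")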

graph
Example evolution processes with different host/parasite trade-offs.

Over time these relationships shift and change, and this happens in different patterns depending on the starting conditions. Little is known about the categorisation of these patterns, or even the range of relationships possible. The models used to simulate them are still a research topic in their own right, so in this project we are hoping to explore different ways people can both control a simulation (perhaps with an element of visual live programming) and experience the results in a number of ways – via sonification, or a game world. The eventual, ambitious aim is to provide a way for people to feed their discoveries back into the research.

sketch

The UAV toolkit & appropriate technology

The UAV toolkit’s second project phase is now complete. The first development sprint at the start of the year was a bit of research into what we could use an average phone’s sensors for, resulting in a proof-of-concept remote sensing Android app that allowed you to visually program different scripts, which we then tested on some drones, a radio-controlled plane and a kite.

photo-1444743618-504214

This time we had a specific focus on environmental agencies. Working with Katie Threadgill at the Westcountry Rivers Trust has meant we’ve had to think about how this could be used by real people in an actual setting (farm advisors working with local farmers). Making something cheap, open source and easy to use, yet open ended, has been the focus – and we are now looking at providing WRT with a complete toolkit comprising a drone (for good weather), a kite (for bad weather/no flight licences required) and an Android phone, so they don’t need to worry about destroying their own if something goes wrong. Katie has produced this excellent guide on how the app works.

The idea of appropriate technology has become an important philosophy for projects we are developing at Foam Kernow, in conjunction with unlikely connections in livecoding and our wider arts practice. For example the Sonic Bike project – where, from the start, we restricted the technology so that no ‘cloud’ network connections are required and all the data and hardware has to fit on the bike, with no data “leaking” out.

photo-1444742698-352249

With the UAV toolkit the open endedness of providing a visual programming system that works on a touchscreen results in an application that is flexible enough to be used in ways and places we can’t predict. For example in crisis situations, where power, networking or hardware is not available to set up remote sensing devices when you need them most. With the UAV toolkit we are working towards a self contained system, and what I’ve found interesting is how many interface and programming ‘guidelines’ I have to bend to make this possible – open endedness is very much against the grain of contemporary software design philosophy.

The “app ecosystem” is ultimately concerned with elevator pitches – do one thing, and boil it down to the fewest actions possible to achieve it. This is not a problem in itself, but the assumption that this is the only philosophy worth considering is wrong. One recent experience that comes to mind is having to make and upload banner images of an exact size to the Play Store before it would allow me to release an important fix for Mongoose 2000, an app only ever intended to have 5 or 6 users.

photo-1444743271-137365

For the UAV toolkit, our future plans include stitching together photos captured on the phone and producing a single large map without the need for any other software on a laptop. There are also interesting possibilities regarding distributed networking with bluetooth and similar radio systems – for example sending code between phones, as currently there is no way to distribute scripts amongst users. This could also be a way of creating distributed processing – controlling one phone in a remote location with another, via code sent by ad hoc wifi or SMS for example.

Weavecoding performance experiments in Cornwall

Last week the weavecoding group met at Foam Kernow for our Cornish research gathering. As we approach the final stages of the project our discussions turn to publications, and which ideas from the start need revisiting. While they were here, I wanted to give local artists and researchers working with code and textiles a chance to meet Ellen, Emma and Alex. As we are a non-academic research organisation I wanted to avoid the normal powerpoint talks/coffee events and try something more informal and inclusive.

IMG_1650

One of the original ideas we had was to combine weaving and coding in a performance setting, both to provide a way of making livecoding more inclusive through weaving, and at the same time to highlight the digital thought processes involved in weaving. Amber made vegetarian sushi for our audience and we set up the Jubilee Warehouse with a collection of experiments from the project:

  • The newly warped table loom, with a live camera/projection from underneath the fabric as it was woven, and codes for different weaves on post-it notes for people to try.
  • The tablet/inkle loom to represent ancient weaving techniques.
  • The pattern matrix tangible weavecoding machine and Raspberry Pi.
  • A brand new experiment by Francesca with a dancemat connected to the pattern matrix software for dance code weaving!
  • The slub livecoding setup.

IMG_1634

This provided an opportunity for people to try things out and ask questions or raise discussion starting points. Our audience consisted of craft researchers, anthropological biologists, architects, game designers and technologists – so it all went on quite a lot longer than we anticipated! Alex and I provided some slub livecoded music to weave by, and my favourite part was the live weaving projection – with more projectors we could develop this combination of code and weaving performance further. Thanks to Emma for all the videos and photos!

IMG_1692

IMG_1564

Airborne drag-drop programming, the next steps

This autumn we are continuing work on the UAV toolkit with Karen Anderson and her research group at the Environment and Sustainability Institute. This time we have a mission to help the Westcountry Rivers Trust by coming up with fast and cheap ways they can build maps of farms to determine water run-off problems, which gives farmers the proof they need to get funding to fix pollution issues.

IMG_20151014_154601
Flight planning operations and photo checking

In order to make the software usable in this case, we decided on two directions. On the one hand there needs to be a simple way to start and stop programs (or “flight modes”) that read sensor data, as well as to define certain global settings, i.e. flight altitude, desired image coverage etc. At the same time, the code defining what this does needs to remain programmable in the app – and more complex behaviours need to be possible to support both kites and UAVs. Our philosophy is that it has to be open ended, as we don’t know where the toolkit might be useful (e.g. crisis mapping situations) or what new sensors will be available on a device in the future.

Screenshot_2015-10-19-21-48-01
The new main screen

One specific set of new behaviours we need is for kite mapping. We already have the ability to choose when to take pictures based on GPS and altitude, but with a kite there can be lots of turbulence and the camera is in a much less controlled state, flipping around taking shots of the sky etc. So we need to calculate things like jerk from change in acceleration and use orientation sensors to only take photos when the lens is pointing directly down, within some degree of acceptable margin.
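As a rough illustration of that check (the real app builds it out of drag-drop Scheme blocks, described below), here is a Python sketch – the thresholds, axis convention and helper names are all invented for the sake of the example:

import math

# Sketch of the "only shoot when stable and pointing down" logic described
# above. Thresholds and the assumption that the camera looks along -z are
# invented; the real implementation lives in the app's Scheme code.

MAX_JERK = 3.0          # invented stability threshold (m/s^3)
MAX_TILT_DEG = 15.0     # invented acceptable margin from straight down

def jerk(prev_accel, accel, dt):
    """Magnitude of the change in acceleration over time."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(accel, prev_accel))) / dt

def tilt_from_down(accel):
    """Angle between the accelerometer vector and straight down, in degrees.
    When the phone hangs still, the accelerometer reads gravity only."""
    ax, ay, az = accel
    mag = math.sqrt(ax * ax + ay * ay + az * az) or 1.0
    return math.degrees(math.acos(max(-1.0, min(1.0, az / mag))))

def should_take_photo(prev_accel, accel, dt):
    return (jerk(prev_accel, accel, dt) < MAX_JERK
            and tilt_from_down(accel) < MAX_TILT_DEG)

# steady and face-down vs. being thrown around by the kite
print(should_take_photo((0.0, 0.0, 9.8), (0.1, 0.0, 9.7), 0.1))  # True
print(should_take_photo((0.0, 0.0, 9.8), (3.0, 2.0, 5.0), 0.1))  # False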

Below is a section of the code that calculates whether we are pointing down, using the magnetometer and accelerometer – the drag-drop visual code can now be used to build normal Scheme functions using a touchscreen (a bit like Scheme Bricks). In fact I managed to do all of this work on the phone. There are now two types of code: the main programs or “flight modes” that you can run from the front screen, and a library of editable functions which they use. This means there are now three levels at which the software can be used – using it without needing to see any of the code at all, editing the basic behaviour such as which sensors’ data are captured, and finally modifying the more detailed code to make it do completely new things.

Screenshot_2015-10-19-22-08-12

Picademy Exeter and Future Thinking for Social Living

Last week I had the chance to help out the Raspberry Pi foundation at their Picademy in Exeter. It was good to meet up with Sam Aaron again to talk livecoding on Pis, and also see how they run these events. They are designed for local teachers to get more confident with computers, programming and electronics to the point where they can start designing their own teaching materials on the second day of the two day course. This is a model I’m intending to use for the second inset teacher training day I’m doing next week at Truro school – it’s pretty exciting to see the ideas that they have for activities for their pupils, and a good challenge to help find ways to bring them into existence in a day.

IMG_20150605_141147

We also had the ending of Future Thinking for Social Living at the Miners Court summer party last week. We exhibited the map made during the workshops, made lots of tea, and had some fun with the pattern matrix in musical mode out in the garden – I adapted Alex’s music system we used with Ellen in Munich to run on a Raspberry Pi so it didn’t require a laptop, or a screen at all – simply a speaker. It was interesting how quickly people got the idea; in many ways music is easier to explain than weaving, as listening while coding is multi-sensory.

IMG_20150603_151747

Weavecoding Munich

Ellen’s exhibition in Munich was always going to be a pivotal event in the weavecoding project – one of the first opportunities to expose our work to a large audience. The Museum of Casts of Classical Sculptures was the perfect context for the mythical aspects of weaving: overlooked by Penelope and friends, with her subversive woven/unwoven work, we could explore the connections between livecoding and weaving.

IMG_8477 2

Practically we focused on developing the tangible weavecoding exhibit for events later in the week, as well as discussing the many languages we have developed so far for different looms and weaving techniques. One of our discoveries is that none of the models or languages we have created seems sufficient in itself – weaving may simply be too big to be described or solved from a single perspective. We’ve tried approaches describing weave structures from the actions of the weaver, the setup of the loom and the structure of the fabric – perhaps the most promising is to explore the story of weaving from the perspective of the thread itself.

IMG_20150510_062737

IMG_20150508_153211

One of the distinctive things about weaving in antiquity is how multiple technologies were combined to form a single piece of fabric: weaving in different directions, weft becoming warp, the use of tablets vs warp-weighted weaving. Explaining this via the path of a single conceptual thread crossing through itself may make it possible to describe weaving in a more flexible, declarative and abstracted manner than having to explain each method separately, as if in its own world.

IMG_20150508_152045

IMG_20150509_100421

The pattern matrix is now in good shape for explaining the relationship between colour and structure in pattern formation. For the first time we also used all 4 sensors per block on the bottom row, which meant we could use a special “colour” block that the system recognises as distinct from the normal warp/weft ones, using its rotation to choose between 8 preset colour settings. This was quite a breakthrough, as it had all been theoretical before.
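For the curious, here is a sketch of the kind of lookup involved. One plausible way to get 8 states from a 4-sensor reading is 4 rotations on each of 2 faces of the block, though the actual magnet encoding and palette in the pattern matrix may well be different – everything named here is invented for illustration:

# Sketch: map a colour block's 4-bit sensor reading to one of 8 preset
# colours. The lookup table and palette are hypothetical.

PALETTE = ["black", "white", "red", "blue", "green", "yellow", "purple", "orange"]

ORIENTATIONS = {
    (1, 0, 0, 0): 0, (0, 1, 0, 0): 1, (0, 0, 1, 0): 2, (0, 0, 0, 1): 3,  # face A, 4 rotations
    (0, 1, 1, 1): 4, (1, 0, 1, 1): 5, (1, 1, 0, 1): 6, (1, 1, 1, 0): 7,  # face B, 4 rotations
}

def colour_for(reading):
    """Return the preset colour selected by a colour block, or None if the
    reading doesn't match a colour block at all (e.g. it's a warp/weft token)."""
    index = ORIENTATIONS.get(tuple(reading))
    return PALETTE[index] if index is not None else None

print(colour_for([0, 0, 1, 0]))   # "red"
print(colour_for([1, 1, 1, 1]))   # None -> not a colour block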

IMG_20150508_194220

Adding this more complex use of the magnetic patterns meant that Alex could set up the matrix as a tangible interface for his Tidal livecoding software, so Ellen could join us for a collaborative slub weavecoding performance on the Saturday evening. The prospect of performing together was something we had talked about since the very beginning of the project, so it was great to finally reach this point. The reverb in the museum was vast, meaning that we had to play the space a lot, and provide ‘music for looking at sculptures by’.

Loose threads from weavecoding

Midway through the weavecoding project, our research has thrown up a whole load of topics that either don’t quite fit into our framework, or that we simply won’t have time to pursue properly. Here are some of the tangents I’ve collected so far.

Coding with knots: Khipu

One of the cultures I’m increasingly interested in is the Incas. Their empire grew to up to 37 million people without the need for money or a written language. We know that some numeric information was stored using Khipu, a knot-based recording system used in combination with black and white stones to read and calculate. Two thirds of the quipus we have are untranslated and do not fit into the known numeric coding system – what information do they hold?

quipu

Harvard University provides a Khipu Database Project with many surviving examples documented – I’m hoping to run a workshop soon to look through some of this data in a variety of ways.

Tablet weaving NAND gates

Image2
Diagram thanks to Phiala’s String Page – the only place I’ve seen tablet weaving explained properly.

There are logic gates in tablet weaving. I haven’t fully figured this out yet, but while modelling tablet weaving I noticed that you end up essentially mapping combinations of the weaving actions (such as turn direction) and colour as truth tables.

Top face colour based on top left/top right hole yarn in a single card and turn direction (clockwise/counter clockwise)

TL Yarn : TR Yarn : Turn : Top face colour
--------------------------------------------
Black   : Black   : CCW  : Black
Black   : Black   : CW   : Black
Black   : White   : CCW  : Black
Black   : White   : CW   : White
White   : Black   : CCW  : White
White   : Black   : CW   : Black
White   : White   : CCW  : White
White   : White   : CW   : White

Things get stranger when you include twist and combinations of actions with multiple cards. Would it be possible to compile high level programming languages into weaving instructions for carrying out computation? Perhaps this is what the untranslatable quipus are about?
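Encoding the table above directly makes the pattern easier to see – in logic-gate terms the turn direction acts as a selector, choosing which hole’s yarn ends up on the top face:

# The truth table above, encoded directly: the turn direction simply selects
# which hole's yarn shows on the top face (a 2-way multiplexer, in effect).

def top_face_colour(tl_yarn, tr_yarn, turn):
    """tl_yarn/tr_yarn are 'Black' or 'White'; turn is 'CW' or 'CCW'."""
    return tr_yarn if turn == "CW" else tl_yarn

# reproduce the table to check
for tl in ("Black", "White"):
    for tr in ("Black", "White"):
        for turn in ("CCW", "CW"):
            print(tl, ":", tr, ":", turn, ":", top_face_colour(tl, tr, turn))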

Nintendo made a knitting machine

We could really do with some of these; unfortunately they never went beyond the prototype stage.

nintendo

Asemic writing

Asemic writing is a post-literate written form with no semantic content. Miles Visman programs procedural asemic languages and hand weaves them. I think this may be an important connection to livecoding at some point.

aw001