Weavecoding performance experiments in Cornwall

Last week the weavecoding group met at Foam Kernow for our Cornish research gathering. As we approach the final stages of the project, our discussions are turning to publications and to which ideas from the start need revisiting. While they were here, I wanted to give local artists and researchers working with code and textiles a chance to meet Ellen, Emma and Alex. As we are a non-academic research organisation, I wanted to avoid the usual powerpoint talks/coffee events and try something more informal and inclusive.

IMG_1650

One of the original ideas we had was to combine weaving and coding in a performance setting, both to make livecoding more inclusive by bringing in weaving, and to highlight the digital thought processes involved in weaving itself. Amber made vegetarian sushi for our audience and we set up the Jubilee Warehouse with a collection of experiments from the project:

  • The newly warped table loom, with a live camera/projection from underneath the fabric as it was woven, and codes for different weaves on post-it notes for people to try (see the sketch after this list).
  • The tablet/inkle loom to represent ancient weaving techniques.
  • The pattern matrix tangible weavecoding machine and Raspberry Pi.
  • A brand new experiment by Francesca with a dancemat connected to the pattern matrix software for dance code weaving!
  • The slub livecoding setup.
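
To give a flavour of what those weave codes look like (an illustrative sketch, not the actual post-it notes), a weave structure can be written as a small binary grid: each row is one weft pick, each column a warp thread, and a 1 means the warp is lifted over the weft. Tiling the grid across the cloth gives the familiar structures:

```python
# Illustrative sketch of weave structures as binary "codes":
# each row is one weft pick, each column one warp thread, 1 = warp lifted.
# The pattern names and layout are examples, not the actual post-it codes.

PATTERNS = {
    "plain":     [[1, 0],
                  [0, 1]],
    "2/2 twill": [[1, 1, 0, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 1],
                  [1, 0, 0, 1]],
}

def render(pattern, width=16, height=8):
    """Tile the pattern block across a small piece of 'cloth' as ASCII art."""
    rows = []
    for y in range(height):
        block_row = pattern[y % len(pattern)]
        rows.append("".join("#" if block_row[x % len(block_row)] else "."
                            for x in range(width)))
    return "\n".join(rows)

for name, pattern in PATTERNS.items():
    print(name)
    print(render(pattern))
    print()
```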

IMG_1634

This provided an opportunity for people to try things out and ask questions or raise discussion starting points. Our audience consisted of craft researchers, anthropological biologists, architects, game designers and technologists – so it all went on quite a lot longer than we anticipated! Alex and I provided some slub livecoded music to weave by, and my favourite part was the live weaving projection – with more projectors we could develop this combination of code and weaving performance further. Thanks to Emma for all the videos and photos!

IMG_1692

IMG_1564

Skate/BMX ramp projection

Jaye Louis Douce, Ruth Ross-Macdonald and I took to the ramps of Mount Hawke skate park in deepest darkest Cornwall to test the prototype tracker/projection mapper (now known as ‘The Cyber-Dog system’) in its intended environment for the first time. Mount Hawke consists of 20,000 square feet of ramps of all shapes and sizes, an inspiring place for thinking about projections and tracing the flowing movements of skaters and BMX riders.

IMG_20130328_204629

Finding a good place to mount the projector was the first problem: it was difficult to get it far enough away to cover more than a partial area of our chosen test ramp, even with some creative duct tape application. Meanwhile the Kinect camera was happily tracking the entire ramp, so we’ll be able to fix this by replacing my old battered projector with a better model in a more suitable location.

IMG_20130328_203523

The next challenge is calibrating the projection mapping so that it lines up with what the camera is looking at. As the projector and camera are in different places this is quite fiddly and time consuming to get right; some improvements to the fluxus script will make it faster. Here is Jaye testing it once we had it lined up:
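
The actual alignment is done in the fluxus script, but as a rough sketch of the kind of mapping involved, here is a Python/OpenCV version: it treats the ramp as roughly flat and fits a homography from a few hand-picked corresponding points, so tracked camera positions can be mapped into projector coordinates. The point coordinates below are made up for illustration.

```python
# Sketch of camera-to-projector alignment (not the actual fluxus script).
# Assumes four or more correspondences (e.g. corners of the ramp) picked
# by hand in camera pixels and in projector pixels; coordinates invented.
import numpy as np
import cv2

camera_pts = np.float32([[102,  80], [540,  95], [560, 400], [ 90, 410]])
proj_pts   = np.float32([[  0,   0], [1280,  0], [1280, 800], [  0, 800]])

# Homography mapping camera coordinates into projector coordinates.
H, _ = cv2.findHomography(camera_pts, proj_pts)

def camera_to_projector(x, y):
    """Map a tracked position from camera space into projector space."""
    pt = np.float32([[[x, y]]])
    px, py = cv2.perspectiveTransform(pt, H)[0][0]
    return float(px), float(py)

# e.g. a skater tracked at camera pixel (320, 240):
print(camera_to_projector(320, 240))
```

With the projector and camera fixed in place, the correspondences would only need picking once, which is what the calibration improvements are aiming for.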

Next it was time to recruit some BMX test pilots to give it a go:

At higher speeds it needs a bit of linear interpolation to ‘connect the dots’, as the visualisation is running at 60fps while the tracking is more like 20fps:
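
As a rough sketch of that interpolation (the frame rates and callback structure here are illustrative, not lifted from the fluxus script): each rendered frame blends between the two most recent tracked positions, so the projected shape keeps moving smoothly even though new tracking data only arrives every few frames.

```python
# Sketch of the 'connect the dots' step: the Kinect tracking arrives at
# roughly 20fps while the visuals render at 60fps, so each rendered frame
# linearly interpolates between the two most recent tracked positions.

class SmoothedTrack:
    def __init__(self, track_dt=1.0 / 20, render_dt=1.0 / 60):
        self.track_dt = track_dt      # time between tracker samples
        self.render_dt = render_dt    # time between rendered frames
        self.prev = (0.0, 0.0)        # previous tracked position
        self.last = (0.0, 0.0)        # most recent tracked position
        self.elapsed = 0.0            # time since the last tracker sample

    def tracker_update(self, pos):
        """Call at ~20fps with the latest tracked (x, y) position."""
        self.prev, self.last = self.last, pos
        self.elapsed = 0.0

    def render_position(self):
        """Call at 60fps; returns a position blended between samples."""
        self.elapsed += self.render_dt
        t = min(self.elapsed / self.track_dt, 1.0)
        return (self.prev[0] + (self.last[0] - self.prev[0]) * t,
                self.prev[1] + (self.last[1] - self.prev[1]) * t)
```

Blending between the two previous samples rather than extrapolating ahead keeps the visuals from overshooting when a rider suddenly changes direction, at the cost of lagging the tracker by one sample (around 50ms).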

This test proved the fundamental idea and opens up lots of possibilities: different types of visualisations, recording and replaying paths over time, and even identifying individual skaters or BMX riders with computer vision. One great advantage of this setup is that once it’s running it works all the time, with no need for continuous calibration (as with RGB cameras) or any additional tracking devices.