Dagstuhl – Collaboration and learning through live coding

Dagstuhl seminars are week-long, free-form meetings between different disciplines centred on computer science. The venue is a specially designed complex in the German countryside; alongside long walks in the surrounding hills, it offers a well-equipped and beautiful music room and a well-stocked wine cellar.

Our seminar was called ‘Collaboration and learning through live coding’, organised by Alan Blackwell, Alex McLean, James Noble and Julian Rohrhuber, and included people from Software Engineering and Computer Science Education, as well as plenty of practising livecoders and multidisciplinary researchers.


Discussion was wide-ranging and intense at times, and the first job was to explain sufficiently what livecoding actually is – which turned out to require performances in different settings:

1. Explanatory demo-style livecoding: talking through it as you do it.
2. Meeting room coffee break gigs: with a closely attentive audience.
3. The music room: relaxed evening events with beer and wine.

So Dagstuhl’s music room was immediately useful in providing a more ‘normal’ livecoding situation. It was of course more stressful than usual, knowing that you were being critically appraised by world experts in related fields! However, it paid off hugely, as we got some wonderful interpretations from these different viewpoints.

One of the most important for me was the framing of livecoding in terms of the roots of software engineering. Robert Biddle, Professor of Human-Computer Interaction at Carleton University, put it into context for us. In 1968 NATO held a ‘Software Components Conference’ in order to tackle a perceived gap in programming expertise with the Soviet Union.

[photo: slide from the software components lecture]

This conference (attended by many of the ‘big names’ of programming in later years) led to many patterns of thought that still pervade the design of computers and software – a tendency towards deeply hierarchical command structures to keep control of the arising complexity, and a distrust of more ad-hoc solutions or any hint of making things up as we go along. More recently we can see a fight against this in the rise of Agile programming methodologies, and it was interesting to look at livecoding as part of this story too. For example, it provides a way to accept and demonstrate the ‘power to think and feel’ that programming gives us as humans. The big question is accessibility: in a ubiquitously computational world, how can this reach wider groups of people?


Ellen Harlizius-Klück works with three different domains simultaneously – investigating the history of mathematics via weaving in ancient Greece. Her work includes livecoding, using weaving as a performance tool and demonstrating the algorithmic potential of looms and combinations of patterns. It exposes the hidden shared history of textiles and computation, and this made a lot of sense to me: at the lowest level the operations of computers are not the singular 0s and 1s we usually talk about, but transformations of whole patterns of bits.
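To make that concrete (this is only my own illustration, not Ellen’s work): a machine word can be read as a row of threads, and a single bitwise operation re-patterns all of them at once, much as a loom lifts a whole set of warp threads together. A minimal Java sketch:

public class WeaveRow {
    public static void main(String[] args) {
        // one 32-bit word stands for a row of 32 threads; the mask
        // chooses which positions show the warp pattern and which the weft
        int warp = 0b10101010_10101010_10101010_10101010;
        int weft = 0b11110000_11110000_11110000_11110000;
        int mask = 0b11001100_11001100_11001100_11001100;
        // a single operation transforms the whole row at once -
        // the machine never touches the bits individually
        int row = (warp & mask) | (weft & ~mask);
        System.out.println(Integer.toBinaryString(row));
    }
}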

Mark Guzdial examined livecoding through the lens of education, specifically the teaching of computer science. The fact that so many of us involved in the field are also teaching in schools – and already looking at ways of bringing livecoding into this area – is noteworthy, as is the educational potential of livecoding in nightclub-type environments. There it works more on the level of showing people that humans make code – it’s not a matter of pure mathematical black boxes – and that can be a ground-breaking realisation for a lot of people.


Something that interested me was concentrating on livecoding as a specifically musical practice (rather than also a visual one), because perceiving the process with a different sense from the one you describe it with turns out to be important. Julian Rohrhuber pointed out that “you can use sound in order to hear what you are doing” – the sound is the temporal execution of the code, and can be a close representation of what the computer is actually doing. This time-based approach is also part of how livecoding works against the notion that producing an ‘end result’ is what matters. As Juan A. Romero said, “if you’re livecoding, you’re not just coding the final note” – i.e. the process of coding is the artform.


Sound is also powerful in a school teaching situation, as described by Sam Aaron, livecoder and creator of Sonic Pi. A child getting a music program to work for the first time in a classroom is immediately obvious to everyone else, as it is broadcast as sound – inspiring a bit of competition and ending up as a naturally collaborative learning experience.

It’s impossible to cover all the discussions we had – these are just the ones I happened to get down in my notebook – but it was a great opportunity to examine what livecoding is about now in relation to other practices, where it came from, and where it might go in the future.


Sonic Bike Hacklab Part 3: The anti-cloud – towards bike-to-bike mesh networking


[Continued from part 2] One of the philosophies that pre-dates my involvement with the sonic bikes project is a refusal of cloud technologies – avoiding any central server and providing everything required (map, sounds and computation) on board the bikes. As the problems with cloud technology become better known, art projects like this are a good way to creatively prototype alternatives.

The need to ‘get the bikes to talk to one another’, beyond our earlier FM transmission experiments, implies some kind of networking. Mesh networking provides non-hierarchical, peer-to-peer communication – appropriate if you want to form networks between bikes on the street over wifi, which may cluster at times, and break up or reform as people go in different directions. After discussing this a bit with hacklab participant and fellow BeagleBoard enthusiast Adam Parkinson, I thought this would be a good thing to spend some time researching.

The most basic networking information we can detect with wifi is the presence of a particular bike, so we decided to prototype this first. I’d previously got hold of a low-power wifi USB module compatible with the Raspberry Pi (which, I found, could run on power from the bike’s BeagleBoard USB!), and we could use an Android phone on another bike, running fluxa, to plug the signal strength into a synth parameter:

[diagram: the bike-to-bike wifi proximity setup]

It’s fairly simple to make an ad-hoc network on the Raspberry Pi from the command line:

# take the interface down before reconfiguring it
ifconfig wlan0 down
# fixed channel, ad-hoc (peer to peer) mode, and an ESSID
# the other bikes can scan for
iwconfig wlan0 channel 4
iwconfig wlan0 mode ad-hoc
iwconfig wlan0 essid 'bikemesh'
# WEP key ('password' is a placeholder - ASCII keys need the s:
# prefix and must be 5 or 13 characters long)
iwconfig wlan0 key s:password
# static address, which also brings the interface back up;
# each bike needs a unique one
ifconfig wlan0 192.168.2.1

On the Android side, the proximity synth software continuously measures the strength of the wifi network from the other bike, using a WiFiScanReceiver we set up like so:

// grab the wifi service and kick off the first scan; the receiver
// re-triggers scanning each time results arrive
wifi = (WifiManager) getSystemService(Context.WIFI_SERVICE);
wifi.startScan();
registerReceiver(new WiFiScanReceiver(),
                 new IntentFilter(
                     WifiManager.SCAN_RESULTS_AVAILABLE_ACTION));

The WiFiScanReceiver is a subclass of BroadcastReceiver that re-triggers the scan process; this results in reasonably high-frequency scanning, a couple of scans a second or so. We also check the SSID names of the networks around the bike for the correct “bikemesh” node:

import java.util.List;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.wifi.ScanResult;
import android.net.wifi.WifiManager;
import android.util.Log;

public class WiFiScanReceiver extends BroadcastReceiver {
    private static final String TAG = "WiFiScanReceiver";

    // the last signal strength seen for the "bikemesh" node (in dBm),
    // read by the synth to drive its proximity parameter
    static public int Level = 0;

    @Override
    public void onReceive(Context c, Intent intent) {
        // we were registered by the Earlobes activity, so the context
        // can be cast back to it to get at its WifiManager
        List<ScanResult> results = ((Earlobes) c).wifi.getScanResults();

        for (ScanResult result : results) {
            if (result.SSID.equals("bikemesh")) {
                Level = result.level;
                Log.d(TAG, String.format("bikemesh located: strength %d",
                                         Level));
            }
        }
        // immediately kick off the next scan to keep the readings flowing
        ((Earlobes) c).wifi.startScan();
    }
}

The synth was also using the accelerometers, but it ramped up the cutoff frequency of a low-pass filter on some white noise, and increased modulation on the accelerometer-driven sine waves, as you got close to the other bike. The result was quite surprising for such a simple setup, as it immediately turned into a game – bike “hide and seek”. As the rider of the proximity synth bike you wanted to hunt out where the wifi bike was, while its rider tried to escape. The range was surprisingly long, about halfway across London Fields park. Here is an initial test of the setup (we made the sounds a bit more obvious than this in later tests):
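As an aside, the mapping from signal strength to synth parameter can be as simple as normalising the scanned level – a hypothetical sketch rather than the actual fluxa patch, with setLowPassCutoff as an invented stand-in for the real synth binding:

// normalise the raw dBm reading into 0..1 using the standard
// Android helper, then open up the noise filter as the bikes converge
float proximity =
    WifiManager.calculateSignalLevel(WiFiScanReceiver.Level, 100) / 100.0f;
float cutoffHz = 200.0f + proximity * 8000.0f; // closer bike, brighter noise
synth.setLowPassCutoff(cutoffHz);              // hypothetical synth call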

With the hardware and some simple software tested, the next stage would be to run multiple wifi nodes and get them to connect and form a mesh network. I got some way into using Babel for this, which is very self-contained and compiles and runs on both BeagleBoard and Raspberry Pi. The other side to this is what kinds of things we want to do with this sort of “on the road” system – how do we notate and artistically control what happens over a sonic bike mesh network?

Some ideas we had included recording sounds and passing them between bikes, or each bike forming a synth node, so you create and change audio depending on who is around you and what the configuration is. I made a few more notes on the technical stuff here.

Sonic Bike Hacklab Part 2: FM accelerometer transmissions

[Continued from part 1] On day one, after we introduced the project and the themes we wanted to explore, Ryan Jordan had a great idea for prototyping the bike-to-bike communication using FM radio transmissions. He quickly freeform-built a short-range FM transmitter powered by a 9V battery.


The next thing we needed was something to transmit – and another experiment was seeing how accelerometers responded to bike riding on different terrains. I’d been playing with running the fluxa synth code in Android native audio for a while, so I plugged the accelerometer input into the parameters of a simple ring modulation synth to see what would happen. We set off in the following formation:

[diagram: riding formation for the FM transmission test]

The result was that the vibrations and movements of a rider were transmitted to the other bikes for playback, including lots of great distortion and radio interference. As the range was fairly short, it was possible to control how much of the signal you received – as you cycled away from the “source cyclist”, static (and some BBC Radio 2) started to take over.

We needed to tune the sensitivity of the accelerometer inputs – this first attempt was a little too glitchy and overactive, and the only changes really discernible were between the bike moving and standing still (it sounded like a sci-fi laser battle in space). One of the great things about prototyping with Android was that we could share the package around and run it on loads of phones, so we went out again with three bikes, each playing back its own movements with different synth settings:

[diagram: three bikes playing back their own movements]
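For the curious, the phone-side plumbing amounts to something like the following sketch – the ring modulator calls are invented stand-ins for the real fluxa bindings, and the smoothing factor is the kind of sensitivity tuning described above:

SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sm.registerListener(new SensorEventListener() {
    float smooth = 0;

    @Override
    public void onSensorChanged(SensorEvent e) {
        // overall movement energy, smoothed a little to tame the glitchiness
        float mag = Math.abs(e.values[0]) + Math.abs(e.values[1])
                  + Math.abs(e.values[2]);
        smooth = smooth * 0.9f + mag * 0.1f;
        synth.setRingModFrequency(50 + smooth * 30);       // hypothetical
        synth.setRingModDepth(Math.min(1, smooth * 0.1f)); // synth bindings
    }

    @Override
    public void onAccuracyChanged(Sensor s, int accuracy) {}
}, accel, SensorManager.SENSOR_DELAY_GAME);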

MS Stubnitz Algorave #2

Our second offshore Algorave on the MS Stubnitz, during the ship’s final night in London. The crowd was pleasingly diverse, with lots of people new to algorave and livecoding, and although behind the scenes we had some hitches due to the ship’s impending departure for France, the event was relaxed and went smoothly. Our performance was honoured by a guest appearance from Elvi$ Ca$h, as well as a re-compile of the Al Jazari rave bots. As one of those spending the night on the ship afterwards, I had to be careful not to have too much of a lie-in!

Alexandra Cárdenas live coded dark textures and sharply angled beats, while Shelly and some Mandelbrots transmitted multi-layered synthetic tones. In the foreground, Davide Della Casa operated livecodelab, and with Guy John live coded inspiring minimalist geometric expressions to match the music throughout the night.

Better, and considerably wider-angle, photos by Yoshizen can be found here.

Algorave practice

It’s been a huge amount of time since I recorded anything, but I thought I would a) try to do some actual livecoding practice for the upcoming algorave on Thursday and b) record everything. As usual I’m following my foolhardy approach of improvising both musical structure and sound material by livecoding synth graphs from scratch. Sometimes it takes a while longer than I would like to reach a suitable musical complexity (this is faster in a real live situation, with increased adrenaline), and there never seems to be time to sort out some fiddly things, such as stereo! For these recordings, and live on stage with slub, I use the scheme bricks visual programming language. Here are some of my favourites; the complete set is here.

Bike Opera – layering sounds in space

New advances on the bike opera project with Kaffe Matthews include a brand new mapping tool based on – yes, you guessed it – Ushahidi, which I’ve been using for a lot of wildly different projects recently. This time the work has mainly focused on improving the area mapping, adding features for editing polygons so Kaffe can layer her sounds in space:

[screenshot: editing sound zone polygons in the mapping tool]

This work is fairly reusable, as it only concerns changes to the submit_edit_js.php file in the standard Ushahidi install. In the meantime, Kaffe has been collecting sounds from musicians in Porto and building up a work of truly operatic proportions. We keep our fingers crossed that the bike-mounted BeagleBoards can cope with all this material!


Life on an Algorave Tour

Some pictures taken during the recent Algorave Tour, making people dance to algorithms in Brighton, London, Karlsruhe, Cologne and Düsseldorf.

The MS Stubnitz moored in Canary Wharf, surrounded by financial architecture.

Wandering around during soundcheck: a heavy-duty workshop on the Stubnitz.

Sound checking with Andrew Sorensen.

A speaker close-up, one of many.

Norah Lorway shaking the boat’s superstructure with sub bass.

MYK livecoding acid squelch.

Sick Lincoln deploying highly danceable, crowd-pleasing algorithms.

Mico Rex, Mexico’s finest algorave pop duo, closing the night.

Onward to Karlsruhe: a random photo of the slub soundcheck.

Hernani Villaseñor: 8-bit house with visible parentheses.

Fredrik Olofsson livecoding five Arduinos simultaneously for 2-bit grindcore.

New Portuguese Bicycle Operatics

Prepare your bicycle clips! Kaffe Matthews and I are starting work on a new Bicycle Opera piece for the city of Porto. I’m working on a new mapping tool and adding some new zone types to the audio system.

While working on a BeagleBoard from one of the bikes used in the Ghent installation of ‘The swamp that was…’, I found (in true Apple/Google style) 4MB of GPS logs, taken every 10 seconds during the two-month festival, which I had forgotten to turn off. As they were part of a public installation (and therefore reasonably anonymised 🙂) here is the first fifth of the data – about all it was possible to plot in high resolution on an online map:

It’s interesting to see the variability of the precision, as well as being able to identify locations and structures that break up the signal (such as the part underneath a large road bridge).

scheme bricks 2

A new version of scheme bricks is under way, planned to be tested out with slub at the Mozilla Fest Party, then taken across the Atlantic for some more livecoding action in Mexico City! New things include blocks with depth – cosmetic for the moment, but I plan to prototype some new ideas based on this – separately zoomable code blocks, and most importantly a complete rewrite in functional R5RS Scheme for portability: it should now be relatively simple to get it onto Android via Nomadic, which uses TinyScheme.

Using the new temporal recursion, the code produced is much less monolithic. Massively tall structures resulting from plugging together sequences during long performances were a bit of an issue before, but splitting the code into a multitude of functions (which can be shrunk and put in the “background”) seems to be a far easier way of working so far.
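Temporal recursion, for anyone unfamiliar, just means a function that plays some events and then schedules its own next invocation at a musical time in the future – redefine the function between callbacks and the music changes without ever stopping. Scheme bricks does this in Scheme, but the idea translates to any language with a scheduler; here is a rough Java analogue, with all the names invented for illustration:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class TemporalRecursion {
    static final ScheduledExecutorService clock =
        Executors.newSingleThreadScheduledExecutor();

    // play an event, then schedule ourselves again at a time of our own
    // choosing - each pattern becomes a small self-contained function
    // rather than one monolithic structure
    static void pattern(int beat) {
        System.out.println("tick " + beat);       // stand-in for a synth trigger
        long gapMs = (beat % 4 == 0) ? 500 : 250; // vary the gap per beat
        clock.schedule(() -> pattern(beat + 1), gapMs, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) {
        pattern(0); // kick off the recursion
    }
}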

Livenotations gig at Arnolfini – The hair of the horse

Thanks to Farrows Creative we have some great photos of the livenotations performance with Alex McLean, Hester Reeve and me at the Arnolfini a few weeks ago. This was a completely unrehearsed combination of Hester Reeve’s live art and slub’s live coding. A score was made from rocks and stones – using their positions, and also drawing on them with brushes and water, made temporary with a heat gun. A selection of good branches from Alex’s garden provided a tripod for the camera, which allowed us to project the score along with a clock time marker; my code and Alex’s emacs were overlaid with a second projector for a multi-layer image of what we were all doing.

I could see the output from the camera (running via Gabor’s fluxus addon code) underneath a semi-transparent version of scheme bricks, and my original plan was to attempt to read the score in some symbolic way. Instead I found myself using more playful methods – dragging sections of code over particular stones, and switching to using them when Hester worked on the relevant one. Her movements also helped me break out of the normal programming flow more than usual, reminding me of nearby unused bits of code, and I generally took a slower, more considered approach.

As I said in my previous post, this seems like an encouraging direction for livecoding to follow – given how naturally it fits with performance and live art, it feels refreshing. The impulse is to augment this kind of performance with further machine vision and tracking software, but perhaps – much like slub’s preference for listening to each other over complex networking – it’s more interesting to keep the interpretation open-ended, at least to begin with.