Sonic Kayaks: musical instruments for marine exploration

Here is a bit of a writeup of the gubbins going into the Sonic Kayaks project. We only have a few weeks to go until the kayaks’ maiden voyages at the British Science Festival, so we are ramping things up: last week Kirsty Kemp, Kaffe Matthews and Chris Yesson joined us at FoAM Kernow for an intense week of testing and production. You can read Amber’s report on the week here.


setup-annotated

The heart of the system is a Raspberry Pi 2. It is connected to a USB GPS dongle and runs the sonic bike software we have used in many cities over the last couple of years, with some crucial additions such as two water temperature sensors and a hydrophone. We have also switched all audio processing over to Pure Data, so we can do a lot more sound-wise, such as sonifying sensor data directly.

How to do this well has been tricky to get right. There is a trade-off between constant, irritating sound (in a wild environment this is more of a problem than in a city, as we found out in the first workshop) and ‘overcooking’ the sound so that it’s too complex to tell what the sensors are actually reporting.

watertempsine

This is the current Pure Data patch – I settled on cutting the sound out when there is no change in temperature, so you only hear anything when you are paddling through a temperature gradient. The pitch represents the current temperature, normalised to the running minimum and maximum the kayak has observed. This makes it much more sensitive, but it takes a few minutes to self-calibrate at the start. Currently the pitch ranges from 70 to 970 Hz, with a little frequency modulation at 90 Hz to make the lower end more audible.
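To make the mapping explicit, here is a rough sketch of the same logic outside of Pure Data, written as plain Java with invented names. Only the 70 to 970 Hz range and the running min/max normalisation come from the patch above; the change threshold is illustrative, and the 90 Hz frequency modulation is left to the patch.

// Sketch of the temperature sonification described above (not the actual patch).
public class TempSonification {
    private double minTemp = Double.POSITIVE_INFINITY; // running minimum so far
    private double maxTemp = Double.NEGATIVE_INFINITY; // running maximum so far
    private double lastTemp = Double.NaN;

    // Called for each sensor reading; returns the oscillator pitch in Hz, or 0 for silence.
    public double update(double temp) {
        // self-calibrate: track the range the kayak has observed
        minTemp = Math.min(minTemp, temp);
        maxTemp = Math.max(maxTemp, temp);

        // cut the sound when there is no change (the 0.01 degC threshold is invented)
        boolean changed = !Double.isNaN(lastTemp) && Math.abs(temp - lastTemp) > 0.01;
        lastTemp = temp;
        if (!changed || maxTemp - minTemp < 1e-6) return 0;

        // normalise to the running min/max and map to the 70..970 Hz range
        double t = (temp - minTemp) / (maxTemp - minTemp);
        return 70 + t * 900;
    }
}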

Here it is on the water with our brand new multi-kayak compatible mounting system and 3D printed horn, designed in Blender. The horrible sound right at the start is my rubbish phone.

In addition to this we have the hydrophone, which is really the star of the show. Even with a preamp we’re having to boost it in Pure Data by 12 times to hear things, but what we do hear is both mysterious and revealing. It seems that boat sounds are really loud – you can hear engines from quite a long way off, which is useful for expanding your kayak senses if they are behind you. We also heard snapping sounds from underwater creatures, and further up the Penryn river you can hear chains clinking; there also seems to be a general background sound that changes as you move around.

For the Swansea festival we still want to add a layer of additional sounds to this experience, for people to search for out on the water. We are planning different “sonic areas” made up of multiple GPS zones, so you can choose to paddle into or away from them. We spent the last day with Kaffe testing some quick ideas out:

With our focus on sea temperature and sensing the hidden underwater world, climate change is the big subject we keep coming back to, so we are looking for ways to approach this topic with our strange new instrument.

Sonic Kayaks Hacklab

Part one of our two events for British Science Week was the open Sonic Kayak Hacklab with Kaffe Matthews and Dr. Kirsty Kemp. Amber has reported our findings here. This was the first time we successfully trialled the technology and ideas behind the Sonic Kayak; in future we will be refining them into instruments for experiencing the marine world. More on that soon!


Sonic Bikes to Sonic Kayaks – using Pure Data

When I first started working on the Sonic Bikes project with Kaffe Matthews in 2013 I had just moved to Cornwall, and I used the Penryn river while developing “The swamp that was” installation we made for Ghent. We’ve always talked about bringing this project here, but the various limitations of cycling (fast roads, stupid drivers and ridiculous hills) were always too much of a problem – so sonic kayaks remained a distant, vague idea. Now, however, thanks to help from the British Science Association, Feast Cornwall and the Port Eliot Festival, they are fast becoming a reality!

We’re also using this opportunity to convert kayaks into instruments for sensing marine microclimates – an area currently lacking in scientific knowledge. In order to do this, we need to expand the sonic potential of our current system, moving it from sample playback to a more open-ended synthesis approach. We’re running an open hacklab to trial the use of sensors, and we’ll actually get out on the water with Kaffe later in the month.

zones

To do all this – and keep it functioning on a Raspberry Pi – we’re using Pure Data. For the moment it seemed most appropriate to stick with the concept of audio zones; previously these defined areas associated with samples that would play back when you were inside them. The screenshot above is the sonic bike mapping tool, recently rebuilt by Francesca. Using Pure Data we can associate each zone with a specific patch, which leaves the use of samples (or not), effects, interpretation of sensor data and any other musical decisions completely open.
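For anyone curious about the mechanics, the trigger behind an audio zone is just a point-in-polygon test on the current GPS position. In our system that check lives in the Lua GPS mapper rather than in Java, but a minimal sketch of the standard ray-casting approach (with illustrative names) looks like this:

// Sketch of a ray-casting point-in-polygon test behind a zone trigger.
// The zone is given as a list of [lat, lon] vertices.
public class ZoneTest {
    public static boolean insideZone(double lat, double lon, double[][] zone) {
        boolean inside = false;
        // walk each edge (j -> i) and count how many cross a ray from the point
        for (int i = 0, j = zone.length - 1; i < zone.length; j = i++) {
            double latI = zone[i][0], lonI = zone[i][1];
            double latJ = zone[j][0], lonJ = zone[j][1];
            boolean crosses = (lonI > lon) != (lonJ > lon);
            if (crosses && lat < (latJ - latI) * (lon - lonI) / (lonJ - lonI) + latI) {
                inside = !inside; // each crossing toggles inside/outside
            }
        }
        return inside;
    }
}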

pd

The patch above is the first version of the zone patch mixer – it reads OSC messages from the GPS map system (which is written in Lua), and when a patch is triggered it switches on audio processing for it and gradually fades it up. When the zone is left it fades the patch down and deactivates it – this way we can have multiple overlaid patches, much like the sample mixing we used before. We can also have loads of different patches: as only the active ones are processed at any one time, it won’t stress the Raspberry Pi too much.
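Stripping away the OSC plumbing, the per-zone behaviour is simple state plus a gain ramp. Here is a rough sketch of that logic in plain Java, with invented names and an illustrative fade rate – the real version is the Pure Data patch above, driven by zone enter/leave messages from the Lua mapper:

// Sketch of the per-zone fade logic: entering a zone fades its patch in,
// leaving fades it out, and only patches with a gain above zero get DSP time.
import java.util.HashMap;
import java.util.Map;

public class ZoneMixer {
    private final Map<String, Double> gains = new HashMap<>();   // current gain per zone
    private final Map<String, Double> targets = new HashMap<>(); // 1.0 = entered, 0.0 = left
    private static final double FADE_PER_SECOND = 0.5;           // illustrative fade rate

    public void enterZone(String zone) { targets.put(zone, 1.0); gains.putIfAbsent(zone, 0.0); }
    public void leaveZone(String zone) { targets.put(zone, 0.0); }

    // Call regularly (e.g. once per audio block); returns the gains of the active patches only.
    public Map<String, Double> tick(double dtSeconds) {
        Map<String, Double> active = new HashMap<>();
        for (String zone : gains.keySet()) {
            double g = gains.get(zone), target = targets.getOrDefault(zone, 0.0);
            double step = FADE_PER_SECOND * dtSeconds;
            g = target > g ? Math.min(target, g + step) : Math.max(target, g - step);
            gains.put(zone, g);
            if (g > 0.0) active.put(zone, g); // only these patches need processing
        }
        return active;
    }
}

Keeping processing to the active patches is what lets us load lots of zones without stressing the Pi.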

I’ve been testing this today by walking around a lot with headphones on – this is a GPS trace, which gives some idea of the usual problems of GPS (I didn’t actually switch to a kayak halfway through, although it thought I did).

trace

Cornwall “Let’s Get Digital” presentation

Here’s a presentation I gave at the end of last year at a Creative Skills Cornwall meeting at Falmouth University. I introduced the problem of a growing producer/consumer digital divide, the need for more public discourse on the politics of technology, and how free software, Code Club, livecoding, algorithmic weaving and sonic bikes can point to other relationships we can have with technology.

The talk went down really well, but the slides are a little minimal so it might not be super clear what it was all about based just on them :)

slide

Sonic Bike Hacklab Part 3: The anti-cloud – towards bike to bike mesh networking


[Continued from part 2] One of the philosophies that pre-dates my involvement with the sonic bikes project is a refusal of cloud technologies – avoiding the use of a central server and providing everything required (map, sounds and computation) on board the bikes. As the problems with cloud technology become better known, art projects like this are a good way to creatively prototype alternatives.

The need to “get the bikes to talk to one another” beyond our earlier FM transmission experiments implies some kind of networking, and mesh networking provides non-hierarchical peer-to-peer communication – appropriate if you want to form networks between bikes on the street over wifi, networks which may cluster at times and break up or reform as people go in different directions. After discussing this a bit with hacklab participant and fellow Beagleboard enthusiast Adam Parkinson, I thought this would be a good thing to spend some time researching.

The most basic networking information we can detect with wifi is the presence of a particular bike, so we decided to prototype this first. I’d previously got hold of a low-power wifi USB module compatible with the Raspberry Pi (which I found I could run from the bike’s Beagleboard USB power!), and we could use an Android phone on another bike, running fluxa to plug the signal strength into a synth parameter:

mesh

It’s fairly simple to set up an ad-hoc network on the Raspberry Pi from the command line:

# run as root: take the interface down, switch it to ad-hoc mode and configure the network
ifconfig wlan0 down
iwconfig wlan0 channel 4
iwconfig wlan0 mode ad-hoc
iwconfig wlan0 essid 'bikemesh'
iwconfig wlan0 key password       # NB an ASCII WEP key normally needs the s: prefix (key s:...)
ifconfig wlan0 192.168.2.1        # assign a static address, which also brings the interface back up

On the Android side, the proximity synth software continuously measures the strength of the wifi network from the other bike, using a WiFiScanReceiver we set up like so:

// from the activity: grab the wifi service, kick off the first scan and register
// the receiver that is called whenever scan results become available
wifi = (WifiManager) getSystemService(Context.WIFI_SERVICE);
wifi.startScan();
registerReceiver(new WiFiScanReceiver(),
                 new IntentFilter(WifiManager.SCAN_RESULTS_AVAILABLE_ACTION));

The WiFiScanReceiver is a subclass of BroadcastReceiver that re-triggers the scan process. This results in reasonably high frequency scanning (a couple of scans a second or so), and we also check the SSID names of the networks around the bike for the correct “bikemesh” node:

import java.util.List;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.net.wifi.ScanResult;
import android.net.wifi.WifiManager;
import android.util.Log;
import android.widget.Toast;

public class WiFiScanReceiver extends BroadcastReceiver {
    private static final String TAG = "WiFiScanReceiver";

    // latest signal strength (in dBm) of the "bikemesh" network, read by the synth
    static public int Level;

    public WiFiScanReceiver() {
        super();
        Level = 0;
    }

    @Override
    public void onReceive(Context c, Intent intent) {
        // the context is our activity (Earlobes), which holds the WifiManager
        List<ScanResult> results = ((Earlobes) c).wifi.getScanResults();

        for (ScanResult result : results) {
            if (result.SSID.equals("bikemesh")) {
                String message = String.format("bikemesh located: strength %d",
                                               result.level);
                Log.d(TAG, message);
                Level = result.level;
            }
        }
        // kick off the next scan straight away so the signal strength keeps updating
        ((Earlobes) c).wifi.startScan();
    }
}

The synth was also using the accelerometers, but it ramped up the cutoff frequency of a low-pass filter on some white noise and increased the modulation on the accelerometer-driven sine waves when you were close to the other bike. The result was quite surprising for such a simple setup, as it immediately turned into a game of bike “hide and seek”: as the rider of the proximity synth bike you wanted to hunt out where the wifi bike was, while its rider would be trying to escape. The range was surprisingly long, about halfway across London Fields park. Here is an initial test of the setup (we made the sounds a bit more obvious than this in later tests):
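Under the hood, the mapping from signal strength to synth parameters was along these lines – a sketch only, with WiFiScanReceiver.Level supplying the level; the dBm range and the parameter limits here are illustrative rather than the values used in fluxa:

// Sketch: turn the scanned RSSI into a filter cutoff and a modulation depth.
// Android reports wifi level in dBm, typically around -90 (far) to -30 (very close).
public class ProximityMapping {
    public static double normalise(int levelDbm) {
        double t = (levelDbm - (-90.0)) / ((-30.0) - (-90.0));
        return Math.max(0.0, Math.min(1.0, t)); // clamp to 0..1
    }

    // cutoff of the low-pass filter on the white noise (Hz): opens up as the bikes get closer
    public static double noiseCutoffHz(int levelDbm) {
        return 200 + normalise(levelDbm) * 8000;
    }

    // modulation depth on the accelerometer-driven sine waves: deeper when closer
    public static double sineModDepth(int levelDbm) {
        return normalise(levelDbm);
    }
}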

With the hardware and some simple software tested, the next stage would be to run multiple wifi nodes and get them to connect and form a mesh network. I got some way into using Babel for this, which is very self-contained and compiles and runs on the Beagleboard and Raspberry Pi. The other side of this is what kinds of things we want to do with an “on the road” system like this: how do we notate and artistically control what happens over a sonic bike mesh network?

Some ideas we had included recording sounds and passing them between bikes, or each bike forming a synth node, so you create and change audio depending on who is around you and what the configuration is. I made a few more notes on the technical stuff here.

Sonic Bike Hacklab Part 2: FM accelerometer transmissions

[Continued from part 1] On day one, after we introduced the project and the themes we wanted to explore, Ryan Jordan had a great idea for how to prototype the bike-to-bike communication using FM radio transmissions. He quickly freeform-built a short-range FM transmitter powered by a 9V battery.


The next thing we needed was something to transmit – and another experiment was to see how accelerometers responded to bike riding on different terrains. I’d been playing with running the fluxa synth code in Android native audio for a while, so I plugged the accelerometer input into the parameters of a simple ring modulation synth to see what would happen. We set off with the following formation:

fm
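The accelerometer-to-sound mapping itself was deliberately simple: the movement data pushes the parameters of a ring modulator around. Here is a rough sketch of the idea in plain Java, with invented names and values – the real synth is fluxa running through Android native audio:

// Sketch: accelerometer magnitude driving a ring modulation synth.
public class AccelRingMod {
    private static final int SAMPLE_RATE = 44100;
    private double carrierPhase = 0, modPhase = 0;
    private volatile double accelMag = 0; // updated from the SensorEventListener

    // call from onSensorChanged with the raw x/y/z values (m/s^2)
    public void setAccel(float x, float y, float z) {
        // subtract gravity so a still bike gives (roughly) zero input
        accelMag = Math.abs(Math.sqrt(x * x + y * y + z * z) - 9.81);
    }

    // fill an audio buffer with the ring-modulated sine waves
    public void render(float[] out) {
        double carrierHz = 220;                        // fixed carrier pitch
        double modHz = 30 + accelMag * 100;            // vibration shifts the modulator pitch
        double depth = Math.min(1.0, accelMag / 5.0);  // and how much ring mod we hear
        for (int i = 0; i < out.length; i++) {
            double carrier = Math.sin(carrierPhase);
            double mod = Math.sin(modPhase);
            // crossfade between the plain carrier and the ring-modulated signal
            out[i] = (float) ((carrier * (1.0 - depth) + carrier * mod * depth) * 0.5);
            carrierPhase += 2 * Math.PI * carrierHz / SAMPLE_RATE;
            modPhase += 2 * Math.PI * modHz / SAMPLE_RATE;
        }
    }
}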

The result was that the vibrations and movements of a rider were being transmitted to the other bikes for playback, including lots of great distortion and radio interference. As the range was fairly short, it was possible to control how much of the signal you received – as you cycled away from the “source cyclist”, static (and some BBC radio 2) started to take over.

We needed to tune the sensitivity of the accelerometer inputs: this first attempt was a little too glitchy and overactive, and the only changes really discernible were between the bike moving and standing still (it sounded like a sci-fi laser battle in space). One of the great things about prototyping with Android was that we could share the package around and run it on loads of phones, so we went out again with three bikes playing back their own movements with different synth settings.

accel

Sonic Bike Hacklab: Part 1


Time to report on the sonic bike hacklab Kaffe Matthews and I put on at AudRey HQ in Hackney. We had a sunny and stormy week of investigation into sonic bike technology. After producing three installations with sonic bikes, the purpose of the lab was to open the project up to more people with fresh ideas, and to engage with the bikes in a more playful, research-oriented manner without the pressure of an upcoming production.


For each of the three previous installations – in Ghent, on Hailuoto island in Finland, and in Porto – we’ve used the same technology: a Beagleboard using a GPS module to trigger samples that play back over speakers mounted on the handlebars. The musical score is a map, created using Ushahidi, consisting of zones tagged with sample names and playback parameters, which the bikes carry around with them.

swamp

We decided to concentrate on two areas of investigation: using the bike as a musical instrument, and finding ways to get the bikes to talk to each other (rather than being identical, independent clones). We had a bunch of different components to play with, donated by the participants, Kaffe and me, while the bikes already provided power via 12V batteries, amplification and speakers. We focused on tech we could rapidly prototype with minimal fuss.

components

The next few posts will describe the different experiments we carried out using these components.

‘The Marja trio’ – Sonic Bike Experience for Marjaniemi

I’ve been doing more remote install work on Kaffe’s latest piece, which she’s been building while resident at Hai Art in Hailuoto, an island in the north of Finland. The zone building, site-specific sample composing and microscopic Beagleboard log debugging are over, and two new GPS Opera bikes are born! Go to Hai Art or Kaffe’s site for more details.

The Marja trio_score2_KM2013



Bike Opera – layering sounds in space

New advancements on the bike opera project with Kaffe Matthews include a brand new mapping tool based on (yes, you guessed it) Ushahidi, which I’ve been using for a lot of wildly different projects recently. This time the work has mainly focused on improving the area mapping – adding features for editing polygons so Kaffe can layer her sounds in space:

swamp-edit

This work is fairly reusable, as it only concerns changes to the submit_edit_js.php file in the standard Ushahidi install. In the meantime, Kaffe has been collecting sounds from musicians in Porto and building up a work of truly operatic proportions. We keep our fingers crossed that the bike-mounted BeagleBoards can cope with all this material!

Angelika

New Portuguese Bicycle Operatics

Prepare your bicycle clips! Kaffe Matthews and I are starting work on a new Bicycle Opera piece for the city of Porto. I’m working on a new mapping tool and adding some new zone types to the audio system.

While working on a BeagleBoard from one of the bikes used in the Ghent installation of ‘The swamp that was…’, I found (in true Apple/Google style) 4MB of GPS logs, taken every 10 seconds during the 2-month festival by logging I forgot to turn off. Since it was part of a public installation (and therefore reasonably anonymised :), this is the first fifth of the data, and about all it was possible to plot in high resolution on an online map:

It’s interesting to see the variability of the precision, as well as being able to identify locations and structures that break up the signal (such as the part underneath a large road bridge).