Red King progress, and a sonification voting system

We have now launched the Red King simulation website. The fundamental idea of the project is to use music and online participation to help people understand a complex natural process. Working with a mathematical model is more challenging than much of our citizen science work, where the connection to organisms and their environments is clearer. The core problem here is how to explore a vast parameter space in order to find patterns in co-evolution.

After some initial experiments we developed a simple prototype native application (OSX/Ubuntu builds) in order to check that we understood the model properly by running and tweaking it.


The next step was to convert this into a library we could bind to python. With this done we can run the model on a server and have it autonomously update its own website via django. This way we can run the simulation continuously, storing randomly chosen parameters to build up a database and display the results. I also set up a simple filter that runs each simulation for 100 timesteps and discards parameter sets that didn't look so interesting (the ones where the populations went extinct or that didn't result in multiple host or virus strains).
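The filtering loop could be sketched roughly like this. Note that `run_model`, the parameter names, and the result fields are hypothetical stand-ins for the real python binding, which isn't shown here:

```python
import random

def random_parameters():
    # Hypothetical parameter ranges for the co-evolution model
    return {
        "host_mutation_rate": random.uniform(0.0, 1.0),
        "virus_mutation_rate": random.uniform(0.0, 1.0),
        "infection_cost": random.uniform(0.0, 1.0),
    }

def interesting(result):
    # Discard runs that went extinct or stayed monomorphic
    return (result["host_population"] > 0 and
            result["virus_population"] > 0 and
            result["host_strains"] > 1 and
            result["virus_strains"] > 1)

def generate_candidate(run_model, timesteps=100):
    """Keep drawing random parameter sets until one passes the filter,
    then return it for storage in the database."""
    while True:
        params = random_parameters()
        result = run_model(params, timesteps)
        if interesting(result):
            return params, result
```

In practice the accepted parameter set and its results would be saved via the django models and picked up by the site.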

There is also now a twitter bot that posts new simulation/sonifications as they appear. One nice thing I've found with this is that I can use the bot timeline to make notes on changes by tweeting them. It also gives interested people an easy way to approach the project, and people are already starting discussions with the researchers on twitter.


Up to now, this has simply been a presentation of a simulation – how can we involve people so they can help? This is a small project, so we have to be realistic about what is possible, but eventually we need a simple way to test how the perception of a sonification compares with that of a visual display. Amber's been doing research into the sonification side of the project here. More on that soon.

For now I've added a voting system, where anyone can up- or down-vote the simulation music. This is used as a way to tag patterns for further exploration. Parameter sets are ranked by their votes, so the more votes a simulation has, the more likely it is to be picked as the basis for new simulations. When we pick one, we randomise one of its parameters to generate new audio. Simulations store their parents, so you can explore the hierarchy and see which changes cause different patterns. An obvious addition to this is to hook up twitter retweets and favourites for the same purpose.
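The vote-weighted selection and single-parameter mutation could look something like this minimal sketch. The dict schema (`params`, `votes`, `parent`) is an assumption for illustration, not the actual django models:

```python
import random

def pick_parent(simulations):
    """Vote-weighted selection: higher-voted parameter sets are more
    likely to seed the next generation. Weights are shifted so that
    zero- and negatively-voted entries still keep a small chance."""
    min_votes = min(s["votes"] for s in simulations)
    weights = [s["votes"] - min_votes + 1 for s in simulations]
    return random.choices(simulations, weights=weights, k=1)[0]

def mutate(parent):
    """Randomise a single parameter of the parent to make a child,
    recording the parent so the hierarchy can be explored later."""
    child_params = dict(parent["params"])
    key = random.choice(list(child_params))
    child_params[key] = random.uniform(0.0, 1.0)
    return {"params": child_params, "votes": 0, "parent": parent}
```

Because each child records its parent, walking the `parent` links recovers the whole lineage of a pattern.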


Little J – many ears to the ground

The essence of the Little J journalism project with Behaviour is to prototype things that make it as easy as possible for people to report stories and get them seen by journalists working for small local papers. Over the last week we’ve got the site and database up and running, created a new Facebook app for uploading stories, built a twitter scanner for automatically harvesting tweets with a specific hashtag (including geo-reference information) and have an Android app up and running!
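The hashtag-harvesting step might be sketched like this, working over tweet dicts as they come back from the API. The field names loosely follow the twitter API schema, but the exact shapes here are assumptions for illustration:

```python
def harvest(tweets, hashtag):
    """Pick out tweets carrying the given hashtag, keeping any
    geo-reference information that came with them, and reshape
    them into candidate stories for the site."""
    stories = []
    for tweet in tweets:
        tags = [h["text"].lower()
                for h in tweet.get("entities", {}).get("hashtags", [])]
        if hashtag.lower() in tags:
            coords = tweet.get("coordinates")  # None if not geo-tagged
            stories.append({
                "text": tweet["text"],
                "user": tweet["user"]["screen_name"],
                "location": coords["coordinates"] if coords else None,
            })
    return stories
```

The geo-reference, when present, lets a story be pinned straight onto the map view.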

[screenshot: Little J site]

The site is in heavy testing mode so no url yet, but all the code is available here. Next we will be playing with reputation systems and similar ideas to allow our “Little J reporters” to have a form of progression to provide a structure for their work. Hopefully some of our developments will be useful as contributions back to the Ushahidi Platform we are using for rapid prototyping.

[screenshots: Little J Android app and Facebook app]