Evolving musical bytecode

At its core, Betablocker provides a simple virtual machine for playing with music which is ‘unstoppable’ (any combination of instructions can be executed). It started off as a Fluxus Scheme program, was rewritten in C++ for the Nintendo DS, and has now been ported to SuperCollider. I’ve built yet another Linux command-line version, and glued it to the Java Genetic Algorithms Package via Clojure.

This means I can continue with the genetic programming experiments and try evolving programs with a real genetic algorithm framework, using the Betablocker instruction set. This is mainly just for kicks, but it’s also partly a way to “grow” new strategies for livecoding Betablocker and to research instruction sets more generally.

The first test was to find a program that just plays note 99. I would write this in Betablocker like so (using a text representation of the icon interface for convenience):

    pshl 99 ; push the literal value 99 on to the stack
    note    ; play the top of the stack as a note 

First we need a fitness function to score the results from the generated programs:

    (defn fitness [notes]
      (- 255 (Math/abs (- (first notes) 99))))

It takes the list of notes played by a candidate program and measures how far the first element is from 99 – the bigger the number, the fitter the program.
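For example:

    (fitness '(99 46 23)) ;; => 255 - the maximum score
    (fitness '(90 46 23)) ;; => 246 - nine away from the target note

I gave it 8 instructions to work with and a population of 500 individuals. After just a couple of generations, this chromosome/program popped up with a top fitness of 255, meaning it was successfully playing note 99: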

99 46 213 89 7 142 23 168

Which when disassembled looks something like this:

    99       ; <-- address 0 
    nop
    nop
    nop
    pshi 142 ; push indirect - address 142 holds 0, so this pushes the byte at address 0 (99)
    note     ; play the note
    nop

This program has grown to take advantage of the fact that memory is initialised to zero: dereferencing a large address like 142 yields zero, which in turn loads the contents of the first byte - 99. It's this kind of whacked-out approach that genetic algorithms are great at finding and abusing.

The next problem is a bit more general: create a sequence of notes which are all different from one another. The fitness function:

    (defn list-contains? [l i]
      (cond
        (empty? l) false
        (= (first l) i) true
        :else (recur (rest l) i)))

    (defn fitness [a]
      (count
       (reduce
        (fn [r v]
          (if (not (list-contains? r v))
            (cons v r) ; only collect values we haven't seen before
            r))
        '()
        a)))

This just counts the number of unique elements in the list – again, the higher the better. For example:
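    (fitness '(0 1 2 0 1 2)) ;; => 3 - only three unique notes
    (fitness '(0 1 2 3 4 5)) ;; => 6

This one took longer, but eventually evolution came up with this: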

139 246 1 23 12 22 23 12 20 95 22 3

Which when disassembled:

    nop
    nop
    org
start:
    note      ; play a note
    inc       ; increment top stack byte
    dup       ; duplicate top stack byte
    note      ; play a note (consumes 1 stack byte)
    inc       ; increment again
    pip 95    ; increment address 95 (junk - doesn't have any effect)
    dup       ; duplicate again
    jmp start ; return to the start

Which plays a rising sequence of notes with none repeated – the highest fitness achievable:

0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21

This is already doing much better than the genetic programming I was mucking about with before. There seems to be some literature on evolving Java bytecode, but Betablocker has the advantage that you don't need to enforce the correctness of programs before they can be run.

Genetic programming music patterns #2

A bit more from the growing of music programs department. (Part 1 can be found here along with reasons for doing this). I’ve been trying to generate ever so slightly more complicated patterns, such as:

(50 50 0 0 50 50 0 0)

Not too exciting, but it proves quite difficult for my naive system to generate code for. The evolution often becomes stuck on the average-number pattern, e.g. (25 25 25 25 25 25). In genetic programming (and other similar fields) such stuck states are called “local minima”, as the system can't get itself out of a relatively stable state. The problem is that in order to reach a better solution we may need to get worse first. A good way to picture it: you are trying to find the lowest point in some imaginary terrain, and you are stuck in a puddle at the top of a mountain.

Here is a detailed look at exactly what is happening during the evolution process of a program which almost worked. We start with this:

(20 23 1 4 3 7 11 15)

This scored well thanks to the first four elements. The code for this individual (randomly generated, as it's the first generation) is:

(mod (mod (+ (even 31) 97 22) (- 86 53 (clock))) 25)

Notice that bits of this code are unnecessary: (+ (even 31) 97 22) could be replaced by 119. Some genetic programming systems do this simplification automatically; the approach I've taken is to assign a cost to the size of the procedure, which should make shorter programs fitter – but I haven't really tested that this works yet.
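Something along these lines, perhaps (a guess at the mechanism - hypothetical, not the original code):

    ;; hypothetical size penalty: count every atom in the program tree,
    ;; so longer procedures cost more
    (defn size-penalty [program]
      (count (flatten program)))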

This code has a cost of 31684. I'm squaring what I was calling fitness before, and calling it cost instead, which makes a bit more sense – and means lower is now better.
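A sketch of that cost measure, assuming it is the sum-of-absolute-differences fitness from part 1 squared (which matches the numbers here):

    ;; squared sum of absolute differences from the target - lower is better
    (defn cost [pattern target]
      (let [diff (reduce + (map #(Math/abs (- %1 %2)) pattern target))]
        (* diff diff)))

    (cost '(20 23 1 4 3 7 11 15) '(50 50 0 0 50 50 0 0)) ;; => 31684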

The following generation, the cost drops to 15376 with:

(33 23 13 3 29 20 11 2)

Which is starting to approach the periodic nature of the target. This is the code:

(mod (mod (+ (even 31) 97 (+ 97 31 91)) (- (max (/ 56 8 75 92) (mod 21 84)) 53 (clock))) 37)

Considerably more complex – however, if we manually remove all the constant expressions, we get this:

(mod (mod 316 (- -32 (clock))) 37)

It looks like the inner modulo is providing something like the periodic nature we need:

(mod 316 (- -32 (clock))) => (-4 -14 -24 -34 -8 -17 -26 -35)
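Modulo here follows Scheme's floored convention - the result takes the sign of the divisor, hence the negative values. Clojure's mod behaves the same way, so we can check:

    (map (fn [clock] (mod 316 (- -32 clock))) (range 8))
    ;; => (-4 -14 -24 -34 -8 -17 -26 -35)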

The outer modulo then attempts to scale the “rhythm” values into a range closer to the target pattern. This seems like a good strategy – let's see what happens next. Generations 5, 6 and 7 rapidly refine the process by tweaking numbers, a mutation or two at a time:

(mod (mod (+ (even 37) 96 (+ 97 31 91)) (- (max (/ 56 8 75 92) (mod 21 84)) 53 (clock))) 37) => (32 22 12 0 28 19 10 1)

(mod (mod (+ (even 37) 96 (+ 97 31 91)) (- (max (/ 56 8 75 5) (mod 21 84)) 53 (clock))) 62) => (57 47 37 0 53 44 35 26)

(mod (mod (+ (even 37) 96 (+ 97 31 91)) (- (max (/ 62 96 75 5) (mod 21 84)) 53 (clock))) 53) => (48 38 28 0 44 35 26 17)

The cost is now 11236. At generation 13 this drops to 11025 with another minor tweak:

(mod (mod (+ (even 37) 96 (+ 97 31 91)) (- (max (/ 62 96 75 5) (mod 21 84)) 53 (clock))) 56) => (51 41 31 0 47 38 29 20)

Then, on generation 31 we get a big improvement, with the cost dropping to 2601.

(mod (mod (+ (even 37) 96 (+ (clock) 31 91)) (- (max (/ 56 96 75 5) (mod 21 84)) 29 (clock))) 56) => (50 50 0 46 50 45 0 0)

A couple of numbers have been tweaked, and a (clock) has appeared in the previously constant addition expression – a good case for leaving redundant code in place, as mutation can press it into service later on. Removing the constant bits to see what's going on, this simplifies to:

(mod (mod (+ (clock) 218) (- -8 (clock))) 56)

The outer modulo is scaling the inner one as before, which is now providing this “rhythm” pattern:

(mod (+ (clock) 218) (- -8 (clock))) => (-6 -6 0 -10 -6 -11 0 0)
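Running the whole simplified expression over the clock confirms the generation-31 pattern:

    (map (fn [clock] (mod (mod (+ clock 218) (- -8 clock)) 56)) (range 8))
    ;; => (50 50 0 46 50 45 0 0)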

Unfortunately, even after 1000 generations, (50 50 0 46 50 45 0 0) remains the closest we get to (50 50 0 0 50 50 0 0) – close, but no cigar.

Genetic programming music patterns #1

I have a problem when livecoding music: while I'm happy livecoding synth graphs to make sounds, I sometimes get a bit stuck coming up with patterns of notes for them to play. I generally start with something like (note (modulo clock 8)) and work my way fairly randomly from there.

In need of inspiration for making more complicated patterns with the minimum of code, I thought I'd continue my sporadic ongoing mission to explore genetic programming (towards the much-desired future of a machine programming itself). The idea is to use a fitness function to grow programs that make patterns I can specify. I could then use these programs as they are, generate them live, or steal ideas from the evolved code.

To give an example of a pattern, these are the first 13 notes of the famous techno number “mary had a little lamb” in a major scale:

(2 1 0 1 2 2 2 1 1 1 2 4 4)

The first thing we need is an instruction set that can be used in machine-generated programs. These need to be robust to any inputs, and rich enough to provide the sort of patterns we want to generate. I've started with a fairly minimal set (a quick Clojure sketch of the guarded operations follows the list):

(mod a b) : return a modulo b, or zero if b is zero.
(odd a) : return 1 if a is odd, 0 otherwise.
(even a) : return 1 if a is even, 0 otherwise.
(min a b) : return the minimum of a and b.
(max a b) : return the maximum of a and b.
(+ a b ...) : addition.
(- a b ...) : subtraction.
(* a b ...) : multiplication.
(/ a b) : division – returns zero if b is zero.
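The operations that need guarding might look something like this in Clojure (the names are my own, and integer inputs are assumed):

    ;; guarded modulo and division - return zero rather than throwing
    (defn safe-mod [a b] (if (zero? b) 0 (mod a b)))
    (defn safe-div [a b] (if (zero? b) 0 (quot a b)))

    ;; odd/even as numbers rather than booleans
    (defn odd->num [a] (if (odd? a) 1 0))
    (defn even->num [a] (if (even? a) 1 0))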

In order to drive the program we also need a (clock) function that returns the time – or more precisely, where we are in the pattern.

So the program (+ (mod (clock) 3) 2) evaluated 5 times would result in the pattern (2 3 4 2 3).
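This is easy to check in Clojure:

    ;; evaluate the example program for clock values 0..4
    (map (fn [clock] (+ (mod clock 3) 2)) (range 5))
    ;; => (2 3 4 2 3)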

We also need a fitness function to give us a measure of how close a pattern is to the one we are trying to find a program for. This could be the sum of the absolute differences between each element of the generated pattern and that of the target pattern – where 0 is perfect and the bigger the number the worse the fit, e.g.:

    (define (fitness pattern target)
        (foldl
            (lambda (a b r)
                (+ r (abs (- a b))))
            0 pattern target))
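The same thing in Clojure, for consistency with the earlier experiments:

    (defn fitness [pattern target]
      (reduce + (map #(Math/abs (- %1 %2)) pattern target)))

    (fitness '(50 0 50 0 50 0) '(50 0 50 0 50 0)) ;; => 0 - a perfect match
    (fitness '(0 0 0 0 0 0) '(50 0 50 0 50 0))    ;; => 150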

Let's try something simple to begin with – the pattern (50 0 50 0 50 0). We start with a population of 1000 randomly generated programs and pick the best one – hold on to your hats:

21

A complex program that surprisingly results in the pattern (21 21 21 21 21 21). This has a fitness of 150 – and 21 is quite close to the average of 0 and 50, so although it's disappointing, it makes sense.

We now create a new population: some of the individuals are new random programs, others are mutated versions of the previous best attempt (25%/75%, if you really want to know). We also include the previous best attempt without mutation – this stops generations getting worse over time.
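A sketch of that breeding step (random-program and mutate are hypothetical stand-ins for the real implementations):

    (declare random-program mutate) ; hypothetical helpers

    (defn next-population [best size]
      (cons best ; elitism: keep the previous best unchanged
            (repeatedly (dec size)
                        #(if (< (rand) 0.25)
                           (random-program)   ; 25% fresh random programs
                           (mutate best)))))  ; 75% mutations of the best

Unfortunately, in this run it takes 90 generations before a fitter individual is found: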

(* 35 (even (clock)))

Which generates the pattern (35 0 35 0 35 0) with a fitness of 45. Now it's only a matter of time; in fact the next 3 generations slowly home in:

(* 55 (even (clock)))
(* 52 (even (clock)))
(* 49 (even (clock)))

and then on generation 104 we finally get:

(* 50 (even (clock)))

resulting in (50 0 50 0 50 0), with a fitness of 0.

More soon – code here.