HENVI and Finnish Bioarts Society

I spent the morning on Kallvikinniemi with HENVI (the University of Helsinki Centre for the Environment) and the Finnish Bioarts Society, learning about three new projects involving Forests and Climate Change, Sustainable Urban Development through Ecosystem Services and Multidisciplinary Baltic Sea Research. All of these projects are interested in collaborating with artists for public engagement and multidisciplinary work – calls will be made by the Bioarts Society later in the year.

Rainy Sunday

A rainy Sunday, with only the dog for company, so in between walks I thought I’d try and learn some assembler. I’ve been unhappy with triggering samples with Betablocker DS (I prefer synthesis) and I’ve heard good things about ARM asm – so it seemed like a good opportunity to attempt a small, fast and dirty synth.

I found some really nice tutorials here and here. I’ve done a tiny bit of this sort of thing before with microcontrollers, but this is a more respectable flavour of assembler, on a decent RISC processor (which derives from the Acorn Archimedes and is now used in iPhones, Androids and Gameboys). Here is a white noise generator:

; white_noise(r0=*dst, r1=clock, r2=length, r3=freq)
        push    {r4,r5,r6}          ; need to restore registers we use
        mov     r4, r1              ; r4 is the rand state (start with clock)
        ldr     r5, .rnd_data       ; r5 is the multiplier value
        ldr     r6, .rnd_data+4     ; r6 is the addition value
.noise_loop:
        mla     r4, r5, r4, r6      ; the maths bit: r4 = (r6 + (r5 * r4))
        strh    r4, [r0], #2        ; *dst++ = low 16 bits of rand state
        subs    r2, r2, #1          ; length--;
        bne     .noise_loop         ; branch if length not zero
        pop     {r4,r5,r6}
        bx      lr                  ; return
.rnd_data:
        .word   0x000343FD          ; nicked from ansi c rand()
        .word   0x00269EC3          ; need to keep large numbers (>8bit) as data

This code is based on the ANSI C rand() function, which basically looks like this:

randnum = randnum * 214013 + 2531011;

Which we can do in a single instruction – mla (multiply with accumulate). Of course, gcc would presumably produce better optimised code than mine from C++, but there is something more satisfying about doing it this way. I certainly prefer the sound – and over half the CPU remains unused with 5 voices and the interface running. The rest of the code is here.
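For reference, here is a sketch of what the assembler above does, written as plain C (the function and parameter names are mine – the asm has no symbols):

```c
#include <stdint.h>
#include <stddef.h>

/* Fill dst with 16 bit white noise samples: the same linear
   congruential generator as the assembler version, seeded
   from a clock value. */
void white_noise(int16_t *dst, uint32_t clock, size_t length)
{
    uint32_t state = clock;                       /* r4 */
    while (length--) {
        state = state * 0x000343FD + 0x00269EC3;  /* the mla */
        *dst++ = (int16_t)state;                  /* strh keeps the low 16 bits */
    }
}
```

Both versions only ever keep the low 32 bits of the state, so they produce the same sample stream for the same seed.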

Why open licences?

I was recently asked by Wendy Van Wynsberghe from constant to explain how and why I use open licences for a lecture she’s doing on the subject. Like a lot of seemingly straightforward questions – it took me quite a while to work out the answers. With her permission, I thought it might be worth posting here.

So how do you use free & open licences in your work?

I consider myself a software artist – so I am concerned with the process of creating software (via the combination of code and other digital assets). All the software I write is covered by the GPL licence, which means that all derivatives of it (created by me or anyone else) have to also be published as open source. I use similar licences for videos, images and written documents I produce as well.


Firstly, most of my work is publicly funded (either via arts grants or science research funding) so this is a moral issue for me – why should tax money be spent on work which is then removed from public access?

Secondly, and perhaps more interestingly, the use of open licensing changes the way you go about your work in some fundamental ways.

For example, making your working method open immediately makes your work accessible to your peers, encouraging comment and collaboration at all stages. This for me is one of the most important lessons art can learn from the scientific method.

The initial fear is that someone may steal “all your good ideas” – but this is actually less likely if they are published and disseminated widely, as the danger for anyone wanting to borrow without attribution is that they will be found out all the easier.

This is not an absolute position. The embryonic stages of an idea for me need to be carried out and understood to a certain level away from such a public view. However it seems that the earlier you can open the working process for the idea, the faster it will develop and in more interesting directions.

Lirec: Facial and body expressions for companions

As part of my research for Germination X, I’ve been reading a Lirec deliverable report on facial and body expressions for companions (robots and graphical characters). It covers a lot of non-verbal communication, and is useful for me as it concerns displaying the raw values coming from the FAtiMA AI system in a slightly more research-grounded manner than the ad-hoc animations we are initially using in the game. This is a very different approach to character design/animation for me – but it’s great to see Tex Avery being referenced.

The document starts by explaining the work of Paul Ekman and Wallace V. Friesen in categorising ranges of basic emotional expressions and how they can be combined into blends and more complex expressions. They went on to develop the Facial Action Coding System for encoding expressions.

Some important aspects of the art of animation are discussed, including breaking the rules of physics (easier for a graphical character perhaps) in order to achieve exaggerated expressions and movements. Animation has already built up a well-understood set of rules and techniques which are now deeply rooted in our expectations via puppetry as well as traditional animation. Even the way simple robots move can be thought about carefully, using slow in/out to make motion less rigid.

Here is the description of surprise, in text form and its corresponding encoding:

Surprise is described as having the eyebrows raised, eyes wide open, the jaw dropped open, and the head tilted upward. In terms of timing, it is a fast motion. In our notation, we define it as: IB(1,5)+OB(1,5)+UE(1,5)+LE(1,5)+JA(1,4)+HT(1,3)+SPEED(fast)

An area I find really interesting is finding more abstract ways of expressing emotion, for companions where facial animation is not possible. Eva Heller’s work in 1989 (see picture above) on linking combinations of colour to emotional meaning is exciting; apparently this work was the result of asking 1888 people to match colours and abstract feelings. These colours can be expressed easily with lights or simple displays.

Of course sound also has a big role to play. The table above comes from an experiment by Scherer & Oshinsky in 1977, exposing 48 undergraduates to “sawtooth wave bursts” from a Moog synthesiser, ranging from simple tones to Beethoven melodies, and then asking them to rate the sounds in terms of corresponding emotions.

Pink polygons & multitouch

A new version of android fluxus, with pdata, multitouch capability, scene inspection and a very pink test script included (see code below): fluxus-0.0.2.apk & source. Press the trackball button to edit the scheme code.


(define twirl-shape 
  (build-polygons 40 triangle-strip))

(define finger-shapes
  (list (build-polygons 30 triangle-strip)
        (build-polygons 30 triangle-strip)))

(define (spiral)
  (line-width 5)
  (pdata-map! (lambda (c) (vector 1 0.29 0.42)) "c")
  (pdata-index-map!
   (lambda (i p)
     (let ((i (* i 0.8)))
       (vmul (vector (sin i) (cos i) 0) (* 0.02 i i))))
   "p")
  (pdata-copy "p" "pref") ; only really needed for animation
  (pdata-map!
   (lambda (p pref)
     (vadd pref (vmul (crndvec) 0.2)))
   "p" "pref"))

(for-each
 (lambda (finger-shape)
   (with-primitive finger-shape (spiral)))
 finger-shapes)

(with-primitive twirl-shape
 (pdata-map!
  (lambda (c)
    (vector 1 0.29 0.42))
  "c"))

; using with-primitive is really slow, so directly use grab
; returns the distance between the objects
(define (get-pinch)
  (grab (car finger-shapes))
  (let ((a (vtransform (vector 0 0 0) (get-transform))))
    (ungrab)(grab (cadr finger-shapes))
    (let ((b (vtransform (vector 0 0 0) (get-transform))))
      (vdist a b))))

; store pinch as it's slow to calculate
(define pinch 1)

(every-frame
 (begin
   ; do the twirling
   (with-primitive twirl-shape
    (pdata-index-map!
     (lambda (i p)
       (let ((i (* i 0.5)))
         (vmul (vector (sin i) (cos i) (* 2 (cos (* i 10.43)))) 
               (* 5 (sin (* pinch (+ i (* 0.1 (time))) 0.1))))))
     "p"))
   ; check for touch events
   (for-each
    (lambda (touch-id)
      (let ((finger-shape (list-ref finger-shapes touch-id)))
        (with-primitive finger-shape
         (identity)
         (translate (get-point-from-touch touch-id))
         (rotate (vector 0 0 (* (time) 10))))))
    (get-touch-ids)) ; the list of currently active touch ids
   (set! pinch (get-pinch))))

PS2 homebrew #5

Getting stuff to work on the PS2 wasn’t quite as easy as I probably made it sound in the last homebrew post. The problem with loading code from a USB stick is that there is no way to debug anything: no remote debugging, no stdout – not even any way to render text unless you write your own program to do that.

The trick is to use the fact that we are rendering a CRT TV signal and that you can control what gets rendered in the overscan area (think 8bit loading screens). There is a register which directly sets the background colour of the scanline – this macro is all you need:

#define gs_p_bgcolor        0x120000e0    // Set CRTC background color

#define GS_SET_BGCOLOR(r,g,b) \
            *(volatile unsigned long *)gs_p_bgcolor =    \
        (unsigned long)((r) & 0x000000FF) <<  0 | \
        (unsigned long)((g) & 0x000000FF) <<  8 | \
        (unsigned long)((b) & 0x000000FF) << 16

Which you can use to set the background to green for example:
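The example itself hasn’t survived here, but it would just be a call like GS_SET_BGCOLOR(0, 255, 0). Since the macro only packs three 8 bit channels into a word, the packing can be sanity-checked off the console – a sketch, with a plain variable standing in for the hardware register (the stand-in is my assumption for testing; on the PS2, gs_p_bgcolor is the real memory-mapped address):

```c
/* host-side stand-in for the memory-mapped BGCOLOR register */
static unsigned long fake_bgcolor;
#define gs_p_bgcolor (&fake_bgcolor)

#define GS_SET_BGCOLOR(r,g,b) \
    *(volatile unsigned long *)gs_p_bgcolor = \
        (unsigned long)((r) & 0x000000FF) <<  0 | \
        (unsigned long)((g) & 0x000000FF) <<  8 | \
        (unsigned long)((b) & 0x000000FF) << 16
```

GS_SET_BGCOLOR(0, 255, 0) leaves 0x0000ff00 in the register – red in the low byte, green in the next, blue above that.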


It’s a good idea to change this at different points in your program. When you get a crash, the border colour it’s frozen with will tell you what area it was last in, allowing you to track down errors.

There is also a nice side effect that this provides a visual profile of your code at the same time. Rendering is synced to the vertical blank – when the CRT beam shoots back to the top of the screen a new frame is started and you have a 50th of a second (PAL) to get everything done. In the screenshot below you can see how the frame time breaks down rendering 9 animated primitives – and why it might be a good idea to use some of these other processors:

Betablocker machine running on Supercollider

It seems our meetings at the TAI studio are bearing fruit already: Till Bovermann has ported the Betablocker machine as a SuperCollider UGen. What’s interesting is that rather than taking a similar approach to music making, this version runs at audio rate and the processes directly generate sample data. He can also run lots of them at the same time and control them from SCLang. It’s great when people use your code, but even better when it gets used in ways you didn’t think about.

PS2 homebrew #4

Getting things to render on the PS2 is a little more complicated than using OpenGL and it’s also a very different system to a PC. On the right you can see a block diagram of the Emotion Engine – it consists of the EE core, the CPU on the left and the GS – Graphics Synthesiser, on the right. In between are 2 other processors called Vector Units – very fast processors designed to do things to vectors – points, colours etc.

All the Graphics Synthesiser can do is rasterise 2D shapes with points given in screen space and do your texturing & Gouraud shading for you. It can draw points, lines, triangles or quads in various configurations (similar to OpenGL). However, all the 3D transformations and lighting calculations have to happen elsewhere – in one, or both, of the Vector Units or the CPU.

So how do we get the GS to render something? Well you send it chunks of data, called GS Packets, that look a little like this:

GIF Tag 1
Primitive data
GIF Tag 2
Primitive data

The GIF Tags contain information on what sort of primitive it should draw and how the primitive data is laid out. The primitive data is the same as the primitive data in fluxus – vertex positions, colours, texture coordinates, texture data etc.
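As a sketch of what a GIF Tag looks like in memory: the low 64 bits pack the loop count, end-of-packet flag, primitive type and friends into bit fields, and the high 64 bits hold one 4 bit register descriptor per register. The field offsets below are from my reading of the GS manual – treat them as assumptions and check against the real docs:

```c
#include <stdint.h>

/* Build the low 64 bits of a GIF tag. Assumed field layout:
   NLOOP bits 0-14, EOP bit 15, PRE bit 46, PRIM bits 47-57,
   FLG bits 58-59, NREG bits 60-63. */
uint64_t gif_tag_lo(uint64_t nloop, uint64_t eop, uint64_t pre,
                    uint64_t prim, uint64_t flg, uint64_t nreg)
{
    return (nloop & 0x7fff)
         | (eop   & 0x1)   << 15
         | (pre   & 0x1)   << 46
         | (prim  & 0x7ff) << 47
         | (flg   & 0x3)   << 58
         | (nreg  & 0xf)   << 60;
}
```

So a tag for one loop of three registers in PACKED mode with the end-of-packet bit set would be gif_tag_lo(1, 1, 0, 0, 0, 3), followed by the register descriptors in the high half and then the primitive data itself.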

Once I had tinyscheme and the basic scenegraph working that the minimal fluxus build uses, I wrote a very simple renderer running on the EE core to apply the transformation matrices to the primitives (with similar push and pop to OpenGL). It doesn’t calculate lighting at the moment, so it’s just setting the vertex colours to the normals’ values for debugging. This is a literal photographic screenshot of my PS2 running exactly the same test fluxus script as the Android was running:

PS2 homebrew #3

The next thing I wanted to do was see if I could compile the minimal android version of fluxus for the PS2. All the PS2SDK examples are written in C, and when I tried the C++ compiler, at link time I got a bunch of these odd errors:

ps2-main.cpp: undefined reference to `__gxx_personality_v0'

It turns out the C++ compiler does not support exceptions so you need to add this line to the makefile:

EE_CXXFLAGS += -fno-exceptions

The other thing to get used to is that one of the side effects of a machine with so many processors is that you need to send lots of data around between them using DMA transfer, or direct memory access. DMA works on chunks of memory at a time, so your data needs to be aligned on particular byte boundaries. This sounds a lot more complicated than it is in practice (although it does lead to really obscure bugs if you get it wrong).

For instance, when putting arrays inside structs you can do this:

struct my_struct
{
    int an_int;
    float my_array[8] __attribute__((__aligned__(16)));
    float a_float;
};

Which tells gcc to sort it out for you by forcing my_array to fall on a 16 byte boundary.
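You can see gcc doing this with offsetof, and even catch a broken alignment at compile time rather than as an obscure DMA bug at runtime. This uses C11’s _Static_assert, which is newer than the PS2 toolchain, so it’s a host-side sketch only:

```c
#include <stddef.h>   /* offsetof */

struct my_struct
{
    int an_int;
    float my_array[8] __attribute__((__aligned__(16)));
    float a_float;
};

/* gcc pads after an_int so my_array lands on a 16 byte boundary;
   fail the build if that ever stops being true */
_Static_assert(offsetof(struct my_struct, my_array) % 16 == 0,
               "my_array must sit on a 16 byte boundary");
```

With 4 byte ints, the padding puts my_array at offset 16.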

When allocating from the heap the EE kernel provides you with a memalign version of malloc:

float *some_floats=(float*)memalign(128, sizeof(float) * 100);

The pointer some_floats will be aligned to a 128 byte boundary. This works as normal with free().
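A quick way to convince yourself the alignment is right is to look at the address itself. This sketch uses glibc’s memalign so it can run on a PC – assuming the EE kernel’s version behaves the same way:

```c
#include <malloc.h>    /* memalign (glibc version here) */
#include <stdint.h>
#include <stdlib.h>

/* Allocate n floats on a 128 byte boundary, returning NULL
   on failure. The low 7 bits of the address must be zero. */
float *aligned_floats(size_t n)
{
    float *p = (float*)memalign(128, sizeof(float) * n);
    if (p && ((uintptr_t)p % 128) != 0) {
        free(p);       /* should never happen: memalign guarantees it */
        return NULL;
    }
    return p;
}
```

As in the PS2 version, the result is released with a plain free().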

At this point, other than a few changes to tinyscheme for string functions that don’t exist in the PS2 libraries, most of the fluxus code was building. The only problem was the OpenGL ES code – although the PS2 has some attempts at libraries that work a bit like OpenGL, the real point of playing with this machine is to write your own realtime renderer. A bit more on that next…