The UAV toolkit & appropriate technology

The UAV toolkit's second project phase is now complete. The first development sprint at the start of the year was a bit of research into what an average phone's sensors could be used for, resulting in a proof-of-concept remote sensing Android app that let you visually program different scripts, which we then tested on some drones, a radio controlled plane and a kite.

photo-1444743618-504214

This time we had a specific focus on environmental agencies. Working with Katie Threadgill at the Westcountry Rivers Trust has meant we've had to think about how this could be used by real people in an actual setting (farm advisors working with local farmers). Making something cheap, open source and easy to use, yet open ended, has been the focus – and we are now looking at providing WRT with a complete toolkit comprising a drone (for good weather), a kite (for bad weather, with no flight licences required) and an Android phone, so they don't need to worry about destroying their own if something goes wrong. Katie has produced this excellent guide on how the app works.

The idea of appropriate technology has become an important philosophy for the projects we are developing at Foam Kernow, in conjunction with unlikely connections in livecoding and our wider arts practice. One example is the Sonic Bike project, where from the start we restricted the technology so that no 'cloud' network connections are required and all the data and hardware has to fit on the bike – with no data "leaking" out.

photo-1444742698-352249

With the UAV toolkit, the open-endedness of providing a visual programming system that works on a touchscreen results in an application flexible enough to be used in ways and places we can't predict – for example in crisis situations, where power, networking or hardware may not be available to set up remote sensing devices when they are needed most. We are working towards a self contained system, and what I've found interesting is how many interface and programming 'guidelines' I have to bend to make this possible – open-endedness is very much against the grain of contemporary software design philosophy.

The "app ecosystem" is ultimately concerned with elevator pitches – doing one thing, and boiling it down to the fewest actions possible. This is not a problem in itself, but the assumption that it is the only philosophy worth considering is wrong. One recent experience that comes to mind is having to make and upload banner images of an exact size to the Play Store before it would allow me to release an important fix for Mongoose 2000 – an app only ever intended to have 5 or 6 users.

photo-1444743271-137365

For the UAV toolkit, our future plans include stitching together photos captured on the phone to produce a single large map, without the need for any other software on a laptop. There are also interesting possibilities around distributed networking with bluetooth and similar radio systems – for example sending code between phones, as currently there is no way to distribute scripts amongst users. This could also be a way of creating distributed processing – controlling one phone in a remote location from another, via code sent over ad hoc wifi or SMS for example.

Airborne drag-drop programming, the next steps

This autumn we are continuing work on the UAV toolkit with Karen Anderson and her research group at the Environment and Sustainability Institute. This time our mission is to help the Westcountry Rivers Trust by coming up with fast and cheap ways to build maps of farms for spotting water run-off problems, which gives farmers the evidence they need to get funding to fix pollution issues.

IMG_20151014_154601
Flight planning operations and photo checking

In order to make the software usable in this case, we decided on two directions. On the one hand there needs to be a simple way to start and stop programs (or "flight modes") that read sensor data, as well as to define certain global settings, e.g. flight altitude, desired image coverage etc. At the same time, the code that defines what these modes do needs to remain programmable in the app, and more complex behaviours need to be possible to support both kites and UAVs. Our philosophy is that it has to be open ended, as we don't know where the toolkit might be useful (e.g. crisis mapping situations) or what new sensors will be available on a device in the future.

Screenshot_2015-10-19-21-48-01
The new main screen

One specific set of new behaviours we need is for kite mapping. We already have the ability to choose when to take pictures based on GPS and altitude, but with a kite there can be lots of turbulence and the camera is in a much less controlled state, flipping around and taking shots of the sky. So we need to calculate things like jerk from the change in acceleration, and use the orientation sensors to only take photos when the lens is pointing directly down, within some acceptable margin.

Below is a section of the code that calculates whether we are pointing down, using the magnetometer and accelerometer – the drag-drop visual code can now be used to build normal Scheme functions on a touchscreen (a bit like scheme bricks). In fact I managed to do all of this work on the phone. There are now two types of code: the main programs, or "flight modes", that you can run from the front screen, and a library of editable functions which they use. This means there are now three levels at which the software can be used – using it without needing to see any of the code at all, editing basic behaviour such as which sensors' data are captured, and finally modifying the more detailed code to make it do completely new things.

Screenshot_2015-10-19-22-08-12
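For readers who can't see the screenshot, here is a rough illustration of the same checks in ordinary Python rather than the app's generated Scheme – the threshold values and function names are mine, not the toolkit's actual API:

```python
import math

def jerk(accel_prev, accel_now, dt):
    # Jerk is the rate of change of acceleration: the difference of two
    # consecutive accelerometer readings (m/s^2) divided by the time step.
    return tuple((a1 - a0) / dt for a0, a1 in zip(accel_prev, accel_now))

def steady(accel_prev, accel_now, dt, max_jerk=3.0):
    # Reject turbulent moments by thresholding the jerk magnitude.
    jx, jy, jz = jerk(accel_prev, accel_now, dt)
    return math.sqrt(jx * jx + jy * jy + jz * jz) < max_jerk

def pointing_down(accel, max_angle_deg=15.0):
    # When the phone is roughly still, the accelerometer reads gravity.
    # With the lens (on the back of the phone) pointing straight down,
    # gravity appears along the device's +z axis, so accept shots where
    # the reading is within max_angle_deg of that axis.
    x, y, z = accel
    mag = math.sqrt(x * x + y * y + z * z)
    if mag == 0:
        return False
    angle = math.degrees(math.acos(z / mag))
    return angle < max_angle_deg
```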

More kite based UAV toolkit action

Some further kite testing with the UAV Toolkit last week at Gwithian beach, with a strong offshore wind (and rather good looking surf too). We managed to max out the kite's altitude and get some great photographs, including surfers and flocks of birds. See the previous kite post for more details on the kite.

photo-1428496974-445621

photo-1428500394-816255

photo-1428497002-687142

I've also started experimenting with combining the sensor data with the images to produce geo-referenced images in GeoTIFF format. This uses the magnetometer data to orient the image and the GPS for position. It is still a work in progress, with quite a lot of converting between coordinate systems going on, all using the GDAL suite of tools with Python to glue it together:

kitewalk
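For anyone wanting to try something similar, a minimal sketch of what that glue code can look like with the GDAL Python bindings – assuming an illustrative camera field of view and skipping the heading rotation for brevity (the magnetometer heading would go into the rotation terms of the geotransform). The function and file names are mine, not part of the toolkit:

```python
import math
from osgeo import gdal, osr

def georeference(jpeg_path, tiff_path, lat, lon, altitude_m, fov_deg=60.0):
    src = gdal.Open(jpeg_path)
    width, height = src.RasterXSize, src.RasterYSize

    # Ground footprint from altitude and field of view, then metres per
    # pixel, converted to degrees (rough equirectangular approximation,
    # fine for small footprints).
    footprint_m = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    m_per_px = footprint_m / width
    deg_lat = m_per_px / 111320.0
    deg_lon = m_per_px / (111320.0 * math.cos(math.radians(lat)))

    # North-up geotransform centred on the GPS fix:
    # (upper-left x, w-e pixel size, 0, upper-left y, 0, n-s pixel size)
    ulx = lon - deg_lon * width / 2.0
    uly = lat + deg_lat * height / 2.0
    geotransform = (ulx, deg_lon, 0.0, uly, 0.0, -deg_lat)

    dst = gdal.GetDriverByName("GTiff").CreateCopy(tiff_path, src)
    dst.SetGeoTransform(geotransform)
    srs = osr.SpatialReference()
    srs.ImportFromEPSG(4326)  # WGS84 lat/lon, as reported by the phone GPS
    dst.SetProjection(srs.ExportToWkt())
    dst.FlushCache()
```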

Kite mapping with UAV toolkit

Some photos taken by the UAV toolkit on a recent flight at our Gyllyngvase beach test site, using a KAP foil 1.6 kite instead of a drone. Kites have many advantages: no flight licences required, no vibration from engines, and a fully renewable power source!

photo-1427901150-33648

We're using a 3D printed mounting plate for the phone, strung from the top of the single line just below the kite. It needs more wind than we had to reach higher altitudes, but first impressions are good. I've also added a new trigger mode to the UAV toolkit programming language that remembers the GPS coordinates where all the photos have been taken, so it can build up overlapping images even when the movement is harder to control.
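Roughly, the logic of that trigger looks like this – a Python sketch of the idea rather than the toolkit's actual Scheme code, with an illustrative minimum spacing:

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    # Haversine great-circle distance between two GPS fixes, in metres.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class NewLocationTrigger:
    # Remember where every photo was taken; only fire again once the kite
    # has drifted far enough away from all previous photo positions.
    def __init__(self, min_spacing_m=20.0):
        self.min_spacing_m = min_spacing_m
        self.photo_positions = []

    def should_take_photo(self, lat, lon):
        for plat, plon in self.photo_positions:
            if gps_distance_m(lat, lon, plat, plon) < self.min_spacing_m:
                return False
        self.photo_positions.append((lat, lon))
        return True
```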

photo-1427899695-642548

photo-1427900271-271541

The tail of the kite – which turned out to be important for stabilising the flight.

photo-1427900189-120468

photo-1427900242-710290

Here is the code using the when-in-new-location trigger to calculate overlap based on the camera angle, GPS and altitude – which ideally should be driven somehow by the length of the line. As an aside, this screenshot was taken in the Chrome browser, which can now run Android apps.

newlocation
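The overlap calculation itself is a small piece of trigonometry. Here is a rough Python sketch of it with an illustrative field of view (in the app it is assembled from visual blocks and executed as Scheme):

```python
import math

def photo_spacing_m(altitude_m, camera_fov_deg, overlap=0.5):
    # Ground footprint covered by one photo: a camera with vertical field
    # of view fov, pointing straight down from a given altitude, sees a
    # strip 2 * altitude * tan(fov / 2) across.
    footprint_m = 2.0 * altitude_m * math.tan(math.radians(camera_fov_deg) / 2.0)
    # To get the requested overlap between consecutive photos, trigger a
    # new one every (1 - overlap) footprints travelled.
    return footprint_m * (1.0 - overlap)

# e.g. 50% overlap at 300 m altitude with a 60 degree field of view:
# footprint is roughly 346 m, so a new photo about every 173 m.
```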

Test flight day!

View of ground control from a OnePlus phone mounted on a Y6 UAV:

photo-1421927742-36242

A report from the first flight test of the new UAV Android software with the Exeter University UAV science group. We had two aircraft: a nice battle-hardened fixed wing RC plane and a very futuristic 3D Robotics RTF Y6. We also had two phones for testing, an old cheap Acer Liquid Glow E330 (the old lobster phone) and a new, expensive OnePlus One A0001. Both were running the same version (0.2) of the visual programming toolkit, which I quietly released yesterday.

IMG_20150122_104141

Here is the high-tech mounting solution for the RC aircraft. There were a lot of problems with the Acer – I'm not sure if the GPS triggering was happening too fast or if there is a problem with this particular model of phone, but the images appear to be corrupted and overwriting each other (none of which happened in prior testing, of course 🙂). Despite this there were some OK shots, but a lot of vibration from the petrol motor during acceleration.

photo-1421923592-621148

photo-1421923613-98013

We tried two types of programs running on the phones: one triggered photos with a simple timer, the other used GPS distance, altitude and camera angle to calculate an overlapping coverage. Both seemed to work well, although I need to go through the sensor data for each image to check the coverage by positioning the images using the GPS. One thing I was worried about was the pitch and yaw of the aircraft, but the Y6 was extremely stable in this respect, along with its altitude, which can be held automatically at a set height.

The vibration seemed less of a problem on the Y6, but on one of the flights the power button got pressed, bringing up the keylock screen, which annoyingly prevents the camera from working. We did however capture lots of sensor data – accelerometer, magnetometer, orientation and gravity – with no problems on the Acer.

The OnePlus phone worked pretty flawlessly overall, and we left it till last as it’s a bit less expendable! It’s possible to mount phones easily underneath the batteries on the Y6 without the need for tape, which looks a bit more professional:

IMG_20150122_114937

We still have problems with vibration, which seems to cause the bands of fuzziness (see the bottom and top photos) so things to look at next include:

  1. Cushioning for the phone (probably just a small bit of foam).
  2. Reproducing and fixing the Acer camera problem.
  3. Some kind of audio indication from the phone that the camera is working etc.
  4. Try again to lock the keys on the phone or override the key lock screen.
  5. More camera controls, override and lock the exposure.
  6. Output raw files instead of running the jpeg compression in the air! This seems to take longer than actually taking the photo, and we don’t care about space on the sdcard.

IMG_20150122_113543

photo-1421927801-544702

Livecoding UAVs for environmental research

Some screenshots of the UAV livecoding visual programming language. Weather being on our side, we're planning some test flights later this week! The first program uses GPS to take photos with an overlap of 50% at 300 metres altitude, based on the vertical camera angle as reported by the device. It assumes that the flight orientation is level:

Screenshot_2015-01-20-23-17-49

The blocks are all drag and drop and get converted into Scheme code which is run by a modified tinyscheme interpreter. The code can be saved and loaded, and I’m planning to make it possible for people to share code via email.
This is a simpler program which takes a photo every 3 seconds and records a handful of sensor readings to the database:

Screenshot_2015-01-20-23-18-44
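Stripped of the visual blocks, the behaviour of that program amounts to something like the following – a Python sketch of the logic only, with hypothetical take_photo and read_sensors callbacks standing in for the Android camera and sensor APIs:

```python
import sqlite3
import time

def run_timer_mode(take_photo, read_sensors, db_path="sensors.db", period_s=3.0):
    # Every few seconds: take a photo and append the current sensor
    # readings to a local database, so nothing depends on a network.
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS readings
                  (time REAL, sensor TEXT, x REAL, y REAL, z REAL)""")
    while True:
        take_photo()
        now = time.time()
        for name, (x, y, z) in read_sensors().items():
            db.execute("INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
                       (now, name, x, y, z))
        db.commit()
        time.sleep(period_s)
```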

At the bottom you can see a squashed camera preview – I've tried various approaches (hiding it, scaling it to 0 pixels etc.) but Android requires that a preview exists somewhere in order to take a photo properly. You can also view the recorded data on the device, for checking. There is also a 'flight mode' which locks and turns off the screen and ignores all button events. On some phones you need to take out the battery to stop the program running, but unfortunately on others you can still use the power button to close the program.

Screenshot_2015-01-21-01-09-56

Visual programming for environmental research with UAVs

ua3

I've recently begun a new project with Karen Anderson, who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We're looking at using commodity technology like Android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft for gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites in cities to make their own maps, or for farmers wishing to record changes to their own fields themselves?

Enumerating and displaying all the sensors on a phone

This is a more open ended project than our previous environmental and behavioural projects, so we're able to approach it from an R&D perspective in relation to the technology. One of the patterns I've noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they have an understanding of all the possibilities. Adapting to needs 'in the field' is also an important aspect of the kind of work they do, which can be in remote locations anywhere in the world.

Some time ago I had a go at porting my musical livecoding language scheme bricks to Android for the open sauces project. I'm now applying it as a way of configuring sensor data acquisition and recording by drag/dropping a visual programming language. It's early days yet – I'm still debugging the (actually rather amazing) Android drag/drop API – here are some initial screenshots.

Nested drag/droppable code which gets converted to Scheme for the tinyscheme interpreter
A block menu works much like Scratch, allowing you to pick new code blocks (this code is nonsense – just testing!)
The "Hello World" program, displays every 3 seconds  (even when the app is running in the background)
The "Hello World" program, displays every 3 seconds (even when the app is running in the background)