Some further kite testing with UAV Toolkit last week at Gwithian beach, with a strong offshore wind (and rather good looking surf too). We managed to max out the kite altitude and get some great photographs including surfers and flocks of birds. See the previous kite post for more details on the kite.
I’ve also started experimenting with combining the sensor data with the images to produce geo-referenced images in GeoTIFF format. This uses the magnetometer data to orient the image and the GPS for position. It’s still a work in progress, with quite a lot of coordinate conversion going on, all using the GDAL suite of tools with Python to glue it all together:
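To give a flavour of the idea (this is only a sketch, not the actual scripts – the function and parameter names are invented, and it assumes a nadir-pointing camera with a simple pinhole model), the affine geotransform for one image can be built from the GPS fix, the magnetometer heading and the altitude roughly like this:

```python
import math

def build_geotransform(lat, lon, heading_deg, alt_m, fov_deg, img_w, img_h):
    """Build a GDAL-style 6-element affine geotransform for one photo.
    Hypothetical helper -- names and model are illustrative only."""
    # Ground footprint width from altitude and horizontal field of view,
    # assuming a nadir-pointing pinhole camera.
    ground_w = 2.0 * alt_m * math.tan(math.radians(fov_deg) / 2.0)
    gsd = ground_w / img_w  # ground sample distance, metres per pixel
    # Metres per degree at this latitude (equirectangular approximation).
    m_per_deg_lat = 111320.0
    m_per_deg_lon = 111320.0 * math.cos(math.radians(lat))
    # Rotate the pixel axes by the magnetometer heading.
    t = math.radians(heading_deg)
    a, b = gsd * math.cos(t), -gsd * math.sin(t)
    d, e = -gsd * math.sin(t), -gsd * math.cos(t)  # image y grows southwards
    # Shift so the image centre lands on the GPS fix.
    x0 = lon - (a * img_w / 2.0 + b * img_h / 2.0) / m_per_deg_lon
    y0 = lat - (d * img_w / 2.0 + e * img_h / 2.0) / m_per_deg_lat
    # (x origin, x per pixel, x per line, y origin, y per pixel, y per line)
    return (x0, a / m_per_deg_lon, b / m_per_deg_lon,
            y0, d / m_per_deg_lat, e / m_per_deg_lat)
```

The resulting tuple is the shape of thing you hand to SetGeoTransform on a GTiff dataset via GDAL’s python bindings; the real pipeline also has to worry about lens distortion and the aircraft’s pitch and roll, which this ignores.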
Some photos taken by the UAV toolkit on a recent flight at our Gyllyngvase beach test site, using a KAP foil 1.6 kite instead of a drone. Kites have many advantages: no flight licences required, no vibration from engines, and a fully renewable power source!
We’re using a 3D printed mounting plate for the phone strung from the top of the single line just below the kite. It needs more wind than we had to get higher altitudes but the first impressions are good. I’ve also added a new trigger mode to the UAV toolkit programming language that remembers the GPS coordinates where all the photos are taken, so it can build up overlapping images even if the movement is harder to control.
The tail of the kite – which turned out to be important for stabilising the flight.
Here is the code using the when-in-new-location trigger to calculate overlap based on the camera angle, GPS and altitude – which ideally should be driven somehow by the length of the line. As an aside, this screenshot was taken in the Chrome browser, which now runs android apps.
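The same calculation can be sketched outside the visual language – here in Python, as a rough approximation with invented function names, assuming a nadir-pointing camera. The trigger fires when the distance moved since the last photo exceeds the non-overlapping part of one frame’s ground footprint:

```python
import math

def footprint_length(alt_m, fov_deg):
    # Ground distance covered by one photo along the flight direction,
    # assuming a nadir-pointing camera (a simplification).
    return 2.0 * alt_m * math.tan(math.radians(fov_deg) / 2.0)

def should_trigger(last_fix, fix, alt_m, fov_deg, overlap=0.6):
    """Fire the camera when the previous photo would overlap the next
    one by less than `overlap` (60% here). Hypothetical helper -- the
    real trigger is a block in the visual programming language."""
    if last_fix is None:
        return True  # no photo yet, take one
    # Equirectangular approximation is fine over tens of metres.
    lat1, lon1 = map(math.radians, last_fix)
    lat2, lon2 = map(math.radians, fix)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    moved = 6371000.0 * math.hypot(x, y)
    return moved >= (1.0 - overlap) * footprint_length(alt_m, fov_deg)
```

At 100m altitude with a 60 degree field of view, one frame covers about 115m of ground, so with 60% overlap a new photo is wanted every ~46m of movement – which gives a feel for why line length (and so achievable altitude) matters.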
View of ground control from a OnePlus phone mounted on a Y6 UAV:
A report from the first flight test of the new UAV android software with the Exeter University UAV science group. We had two aircraft: a nice battle-hardened fixed-wing RC plane and a very futuristic 3D Robotics RTF Y6. We also had two phones for testing, an old cheap Acer Liquid Glow E330 (the old lobster phone) and a new, expensive OnePlus One A0001. Both were running the same version (0.2) of the visual programming toolkit, which I quietly released yesterday.
Here is the high-tech mounting solution for the RC aircraft. There were a lot of problems with the Acer – I’m not sure if the GPS triggering was happening too fast or if there is a problem with this particular model of phone, but the images appear to be corrupted and overwriting each other (none of this happened in prior testing, of course 🙂). Despite this there were some OK shots, but a lot of vibration from the petrol motor during acceleration.
We tried two types of programs running on the phones: one triggered photos with a simple timer, the other used GPS distance, altitude and camera angle to calculate an overlap coverage. Both seemed to work well, although I need to go through the sensor data for each image to check the coverage by positioning the images using the GPS. One thing I was worried about was the pitch and yaw of the aircraft, but with the Y6 this was extremely stable – along with the altitude, which can be held automatically at a set height.
The vibration seemed less of a problem on the Y6, but on one of the flights the power button got pressed, bringing up the keylock screen – which annoyingly prevents the camera from working. We did however capture lots of sensor data – accelerometer, magnetometer, orientation and gravity – with no problems on the Acer.
The OnePlus phone worked pretty flawlessly overall, and we left it till last as it’s a bit less expendable! It’s possible to mount phones easily underneath the batteries on the Y6 without the need for tape, which looks a bit more professional:
We still have problems with vibration, which seems to cause the bands of fuzziness (see the bottom and top photos) so things to look at next include:
Cushioning for the phone (probably just a small bit of foam).
Reproducing and fixing the Acer camera problem.
Some kind of audio indication from the phone that the camera is working etc.
Try again to lock the keys on the phone or override the key lock screen.
More camera controls, override and lock the exposure.
Output raw files instead of running the JPEG compression in the air! This seems to take longer than actually taking the photo, and we don’t care about space on the sdcard.
I’ve recently begun a new project with Karen Anderson who runs the UAV research group at the Exeter University Environment and Sustainability Institute. We’re looking at using commodity technology like android phones for environmental research with drones. Ecology research groups and environmental agencies have started using drones as a replacement for expensive and risky light aircraft for gathering data on changes to landscapes due to climate change and erosion. How can we make tools that are simpler and cheaper for them to set up and use? Can our software also be relevant for children using kites in cities for making their own maps, or farmers wishing to record changes to their own fields themselves?
This is a more open-ended project than our previous environmental and behavioural projects, so we’re able to approach it with an R&D perspective in relation to the technology. One of the patterns I’ve noticed with this kind of work is that after providing scientists with something that meets their immediate needs, it inspires a ton of new ideas and directions – and I become a bottleneck. Ideally I need to provide something that allows them to build things themselves once they have an understanding of all the possibilities. Adapting to needs ‘in the field’ is also an important aspect of the kind of work they do – which can be in remote locations anywhere in the world.
Some time ago I had a go at porting my musical livecoding language scheme bricks to android for the open sauces project. I’m now applying it as a way of configuring sensor data acquisition and recording by drag/dropping a visual programming language. It’s early days yet, I’m still debugging the (actually rather amazing) android drag/drop API – here are some initial screenshots.
Some photos from Shakti Lamba who is currently testing Symbai in the Chhattisgarh state in north eastern India.
The whole system is solar powered, and provides its own networking via the Raspberry Pi synchronisation node shown here. The android tablets also recharge from the same power source. The Raspberry Pi networking is a direct descendant of the experiments we carried out in London during the Sonic Bike workshop.
Here are three of the tablets syncing their data – photographs for people to identify each other (names are used differently from how they are in Western cultures), audio recordings of verbal agreements (a requirement in preliterate societies), and information about who knows whom.
More on this free software project, links to source etc over at foam kernow.
A couple of screenshots of the Hindi version of Symbai – our solar powered Raspberry Pi/Android anthropological research tool. As usual we’re still having a few issues with the Unicode, but it’s nearly there. We’ve been working on this software for the last few months, making sure the data (including photos and audio recordings of verbal agreements) synchronise properly across multiple devices.
A release of Bumper Crop is now up on the Play Store, with the source code here. As I reported earlier, this has been about converting a board game designed by farmers in rural India into a software version – partly to make it more easily accessible and partly to explore the possibilities and restrictions of the two mediums. It’s still pretty much a beta, as some of the cards behave differently from the board game version and a few are not yet implemented – we need to work on that, but it’s playable now, with 4 players at the same time.
The 3D and animation are done using the fluxus engine on android, and the game is written in tinyscheme. Here’s a snippet of the code for one of the board locations; I’ve been experimenting with a very declarative style lately:
;; description of location that allows you to fertilise your crops
;; the player has a choice of wheat/onion or potatoes
(place 26 'fertilise '(wheat onion potato)
  ;; this function takes a player and a
  ;; selected choice and returns a new player
  (lambda (player choice)
    (if (player-has-item? player 'ox) ;; do we have an ox?
        ;; if so, complete a free fertilise task if needed
        (if (player-check-crop-task player choice 'fertilise 0)
            (player-update-crop-task player choice 'fertilise)
            player)
        ;; otherwise it costs 100 Rs
        (if (player-check-crop-task player choice 'fertilise 100)
            (player-update-crop-task
              (player-add-money player -100) ;; remove money
              choice 'fertilise)
            player)))
  (place-interface-crop)) ;; helper to make the interface
Testing the board game, which you can download on this page:
I’ve recently been building the Mongoose 2000 “group composition” tool that the researchers will use for recording information about a whole pack of mongooses (and synchronising data via a Raspberry Pi providing a local wifi node) at their field site in Uganda. As I wrote a bit about before, one of the interesting things about this project is that the interface design has to focus on long-term speed and flexibility over immediate ease of use. In this way it seems appropriate that it’s moving in the direction of a musical interface rather than a normal touch-screen interface. The different colours in the mongoose selectors show which individuals are present and which already have data recorded for them. The screenshot below is the section where they record relationships between the adult females (at the top) and the adult males that may be guarding – or pestering – them (below). At the same time, they need to be able to record events that may be occurring with the pack as a whole – in this case an interaction with another pack of mongooses.
Bumper crop is an android game I’ve just started working on with Dr Misha Myers as part of the Play to Grow project: “exploring and testing the use of computer games as a method of storytelling and learning to engage urban users in complexities of rural development, agricultural practices and issues facing farmers in India.”
(Warning – contains machine translated Hindi!)
I’m currently working out the details with artist Saswat Mahapatra and Misha, who have been part of the team developing this game based on fieldwork in India working with farmers from different regions. They began by developing a board game, which allowed them to flexibly prototype ideas with lots of people without needing to worry about software related matters. This resulted in a great finished product, super art direction and loads of assets ready to use. I very much like this approach to games design.
From my perspective the project relates very closely to the groworld games, germination x, as well as the more recent farm crap app. I’m attempting to capture the essence of the board game and keep the necessary simplifications to a minimum. Now that the basics are working, the main challenge is fitting an approximation of the bartering and resource management between players that board games are so good at into a simple interface – along with providing AI players.