Over the last few weeks I've spent quite a lot of time hacking on a Python library
for the AR.Drone. The AR.Drone is a nice
toy for nerds. You connect to it via WiFi, and soon you'll realize that
it has four ports open. Reading the specs you'll find that on one port it
listens for AT commands, with which you can remote-control the drone; on
the other two ports it waits for an incoming packet, which triggers
the drone to send the navdata (speed, angles, battery status, etc.) and
the video stream. Heck, you can even telnet into the drone...
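To give a flavor of the AT-command side of the protocol, here is a rough sketch (port numbers and the AT*REF bit layout are taken from the AR.Drone developer documentation; the helper names are my own, not part of the library):

```python
import socket

DRONE_IP = "192.168.1.1"   # default address of the drone's own WiFi network
AT_PORT = 5556             # UDP port listening for AT commands
NAVDATA_PORT = 5554        # sending any packet here triggers the navdata stream

def at_ref(seq, takeoff=False, emergency=False):
    """Build an AT*REF command (takeoff/land/emergency) as a string.

    According to the docs, bits 18, 20, 22, 24 and 28 must always be set;
    bit 9 requests takeoff, bit 8 toggles the emergency state.
    """
    bits = (1 << 18) | (1 << 20) | (1 << 22) | (1 << 24) | (1 << 28)
    if takeoff:
        bits |= 1 << 9
    if emergency:
        bits |= 1 << 8
    return "AT*REF=%d,%d\r" % (seq, bits)

def send_at(command):
    """Fire a single AT command at the drone over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(command.encode("ascii"), (DRONE_IP, AT_PORT))
    sock.close()
```

Every AT command carries an increasing sequence number (`seq`); the drone ignores commands whose number is not greater than the last one it has seen.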
Unfortunately it comes without proper software to control the drone,
only an iPhone app (without the iPhone, of course). But given the documentation,
it should be easy to write your own. While getting the beast to fly was
relatively easy, decoding the "almost"-JPEG video stream was not.
Almost-JPEG, since the images the drone sends are more or less JPEG with a
small difference, which makes it impossible to decode them using
standard multimedia libraries. Anyway, the format is documented and
implementing a decoder was not that hard. The tricky part was to get the
frame rate from an unacceptable 0.5 FPS up to 12-22 FPS -- the whole decoder
is written in Python. I'm cheating a bit by using
psyco, but the code in arvideo.py is
heavily optimized to minimize calculations and to please psyco.
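To give a flavor of the kind of optimization involved -- this is an illustrative sketch, not the actual arvideo.py code -- one common trick is to replace per-pixel arithmetic and clamping with precomputed lookup tables, which both plain CPython and psyco handle much faster than repeated function calls in the inner loop (the YUV-to-RGB coefficients below are the standard fixed-point ones):

```python
# Precompute a clamp table once, so the inner loop avoids min()/max() calls.
# Indices are offset by 512 so slightly out-of-range values stay valid.
CLAMP = [min(max(i - 512, 0), 255) for i in range(1024)]

def yuv_to_rgb(y, u, v):
    """Convert one YUV pixel to RGB using integer math and table lookups."""
    u, v = u - 128, v - 128
    return (CLAMP[512 + y + ((359 * v) >> 8)],
            CLAMP[512 + y - ((88 * u + 183 * v) >> 8)],
            CLAMP[512 + y + ((454 * u) >> 8)])
```

The same idea applies throughout: hoist every computation that can be tabulated out of the per-pixel loop, and keep what remains as simple integer operations.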
The code also contains a small demo app which uses
Pygame to display the video stream and lets you
control the AR.Drone with the keyboard. It should be ready to use as
soon as you are connected to the drone via WiFi.
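A minimal sketch of how keyboard control in such a demo might be wired up (the key bindings and drone method names here are illustrative, not necessarily those of the actual demo app):

```python
# In the real demo, Pygame's event loop would translate pygame.K_* key
# constants into the names used below; here we keep it library-agnostic.
KEYMAP = {
    "RETURN": "takeoff",
    "SPACE": "land",
    "BACKSPACE": "reset",
    "w": "move_forward",
    "s": "move_backward",
    "a": "move_left",
    "d": "move_right",
    "UP": "move_up",
    "DOWN": "move_down",
    "LEFT": "turn_left",
    "RIGHT": "turn_right",
}

def handle_key(drone, key):
    """Dispatch a key press to the corresponding drone method, if bound."""
    action = KEYMAP.get(key)
    if action is not None:
        getattr(drone, action)()
        return action
    return None
```

Keeping the bindings in a plain dict makes them trivial to remap without touching the event loop.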
The git repository is here,
the license is MIT. Suggestions and patches are welcome.
Here is a video of the drone flying
through the office.