Jazoon 2013 – Open Safety For Drones

25. October, 2013

Drones – fully or partially autonomous aircraft – are the second most widespread form of privately owned robots today, challenged only by robotic vacuum cleaners. The current estimate is that 500’000 drones are in use; the Parrot AR alone sold about 250’000 units. In his talk “Open Safety For Drones,” Lorenz Meier asks important questions about safety, regulation and how this area will develop in the future.

When it comes to safety, there are two areas: safety when developing drones and safety when using them. People have started to demonstrate the usual stupidity by flying drones in crowded areas without regard for safety. Lorenz showed a video in which an inexperienced pilot flew a drone over Manhattan until it crashed into a skyscraper and dropped into the streets below. In another example, someone flew a drone through a campus. I say “through” because he went below footbridges that connect the first floors of buildings, even passing a bus on the way. The “pilot” also failed to catch the drone after the stunt, crashing it into a hedge.

It will be interesting to see how legal bodies like the FAA respond to things like this.

Despite this, autonomous drones are already useful. A nice example is the new 3D model of the Matterhorn.

The more pressing issue here is the other area: building safe aircraft. Bugs in a drone can really ruin your day (or life). How can we make sure that hundreds of thousands of lines of code have an acceptably low number of bugs?

The issues here are many:

  • If you want to fly an autonomous aircraft, you need to license it.
  • Regulations for drones are non-existent or immature. Regulators might treat you like a normal aircraft manufacturer (think Boeing or Airbus). Can you even afford to apply for such a license?
  • You’re using an image recognition software. It uses a math library. In that library, a single line of code has changed. What is the impact on the safety of your drone? How do we bring safety and the need for faster iterations together?
  • Regulators might insist that you license the whole thing again.
  • To get real engineers on your project for a new drone, your API must be really simple. As every seasoned developer knows, nothing is harder than “simple.”

A word on the safety of military drones: not a single one of them is allowed in civilian airspace. Certification is also very expensive: Germany spent €562 million trying to certify the EuroHawk drone before the project was cancelled, and the defense minister “referred to the project as being ‘a horror without end’”.

So how can we make a DIY drone ever safe?

There are some approaches:

  1. Involve as many people as possible in a project. Everyone will find something that can be improved.
  2. Allow commercial use of your project. Companies have completely different views on safety than hobbyists.
  3. Collect logs of your flights and share them. Share your project on github and make sure you include the hash of the changeset that was used to run the drone.
    Over time, you can use statistical data to locate even “once in a million years” bugs.
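Point 3 above can be sketched in code. The snippet below is a minimal, hypothetical example of the aggregation step: flight logs shared by many pilots, each tagged with the commit hash that was flying, are folded into a per-commit failure rate. The record format and field names are assumptions for illustration, not any real Droneshare schema.

```python
from collections import defaultdict

# Hypothetical shared flight records: (commit_hash, flight_hours, crashed)
flights = [
    ("a3f9c1d", 2.0, False),
    ("a3f9c1d", 1.5, True),
    ("b7e22aa", 3.0, False),
    ("b7e22aa", 4.0, False),
]

def failure_rate_per_commit(records):
    """Fold shared logs into crashes per flight hour, keyed by the exact
    code state (commit hash) that controlled the drone. Rare bugs only
    surface once enough hours accumulate for a given commit."""
    hours = defaultdict(float)
    crashes = defaultdict(int)
    for commit, flight_hours, crashed in records:
        hours[commit] += flight_hours
        crashes[commit] += int(crashed)
    return {commit: crashes[commit] / hours[commit] for commit in hours}
```

Keying the statistics on the commit hash is what makes the data actionable: a regression introduced by a single changed line shows up as a rate jump between two adjacent commits.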

But there is good news, too. With a plane, you have X people on the aircraft and 0 on the ground (most planes crash in empty areas). With a drone, you have 0 people on the aircraft and X on the ground. Using the laws of physics, however, it’s pretty simple to estimate the maximum radius a drone can still travel if its “brain” shuts down right now. That means we can test drones pretty safely in a virtual cage. Also, since all the components are open, we can run extensive simulations before bringing the thing into the wild.
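The shutdown-radius estimate mentioned above can be sketched in a few lines. This is a simple ballistic bound, assuming the drone keeps its current horizontal speed while free-falling from its current altitude; ignoring air drag overestimates the radius, which is the safe direction when sizing a test cage. The numbers are made up for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_travel_radius(altitude_m: float, speed_m_s: float) -> float:
    """Upper bound on how far a drone can still travel horizontally
    after a total shutdown: free-fall time from its altitude, times
    its horizontal speed at the moment the brain dies."""
    fall_time = math.sqrt(2.0 * altitude_m / G)
    return speed_m_s * fall_time

# A drone at 20 m altitude moving at 10 m/s can land
# at most ~20.2 m from the point of shutdown.
radius = max_travel_radius(20.0, 10.0)
```

A “virtual cage” then just means: keep the drone far enough inside the physical test area that this worst-case radius never reaches a bystander.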

But the OSS approach also has drawbacks. With ordinary OSS, a unit test is easy to write, the test cases are known, and if something goes horribly wrong, the worst case is data loss.

With unmanned systems, the worst case is loss of life. And where do you get your test cases from? How do you plan to test for a broken coil in an actuator? An overheated battery which results in unstable power? Wind gusts? Collisions with birds? Interference with other electrical systems?

Maybe it’s time to open a new field:

Open Safety – Build a safety track record in confined test areas in an open, community-driven process, and only let statistically proven safe systems out of the sandbox. Share validation results on the library and version level across systems.

Open Safety Manifesto

  1. Dropping is Free – Ground impact must only damage vehicle
  2. Anytime Shutdown – A separate shutdown unit is present to enforce a drop
  3. Built to Drop – Battery protected against impact, parachutes, etc.
  4. Every Flight is Logged
  5. Every Log is Reviewed – Can be done on Droneshare

State of the Art

Sites like Droneshare have already collected enough flight hours for some drones to satisfy normal certification rules.

Keep in mind that certification alone doesn’t make things safer; it just helps to catch the most common errors. Simulations and test protocols improve the situation further. Lastly, lots of people mean that every part is tested in thousands of ways. Software developers know how quickly someone else finds problems with their work (one reason why pairing is so unpopular).

A big problem is YouTube, since it teases out the “watch me do something even more stupid” behavior in humans.

Military (Ab)Use

Another issue is the (ab-)use of this work in military drones used to kill. The scene is aware of this but currently feels that the good (like cheap access to aerial photography, or autonomous mesh networks built by drones in disaster areas) outweighs the risk. A more visible discussion about OSS and hardware in military projects would be welcome. Some projects forbid military use (Gnutella). It’s worth noting that the US Navy has started to buy Linux-based drones. Personally, I find things like the Samsung SGR-A1 military robot sentry (with a 5.56 mm robotic machine gun and a grenade launcher) disturbing, especially when it’s advertised as being able to “identify and shoot a target automatically from over two miles (3.2 km) away.”

The Cost of Safety

6. February, 2009

Worried about your safety? The safety of your wife/daughter/son/house/car/whatever? If you did worry about something like that in the past, when considering options to make something safer, did you consider the cost?

Paul Graham wrote a nice essay, “Artists Ship” (after the remark by Steve Jobs). Please ignore his “only programmers love to work hard”; the rest of the argument is very convincing. When people talk about “improving” some situation (crime rate, child abuse, revenue streams), they often propose solutions, but there is little to no discussion about the cost of said “solution”.

So we want to protect our children against molesters. Fair enough. But in this discussion, you can’t argue with reason because the topic is so emotional. People don’t know anything about the reasons why someone becomes a pedophile or how (and if at all) this can be treated. They want a “solution”, completely ignorant of its cost. It’s a fact that “better” solutions (which will catch more violators) will always harm more innocent people.

Let’s look at a related case. Make up your mind about this case: “Julie Amero, a 40-year-old substitute teacher from Connecticut is facing up to 40 years in prison for exposing her seventh grade class to a cascade of pornographic imagery.” (more). Guilty? Innocent? What’s “exposing” supposed to mean here? Did she show them intentionally? Such a simple case and so many questions …

Say I want to write a program that automatically searches the Internet for child pr0n and sends alerts to the authorities. I can’t. It’s no longer possible in any western country, because I could neither test my program nor use it: even the download of child pr0n is illegal. It’s illegal before a human can ever see it. I wonder how all those web filters work … maybe they are built in a country where child abuse is not illegal.

So you like to watch pr0n but don’t want to pay? The Internet is full of “free” ware. But downloading “good.jpg” might get you into jail, depending on what you find in the image afterwards. Guilty? Innocent?

Most computers on the Internet are vulnerable to all kinds of attacks. It’s ridiculously simple to spread viruses and worms which effectively take over your computer. Who is guilty when a cracker puts illegal pictures on your PC? You, because you didn’t understand the technology? You, because it is too hard to catch the cracker? You, because the prosecution doesn’t understand the technology, either? You, because the jury can’t follow the explanations of the experts anymore?

On the other hand, a clever pervert might infect his computer deliberately, so he can always say “it was the virus!” With today’s paint software, how hard is it to replace the head of an adult with that of a child and reduce the cup size? How hard is it to prove that the picture is real? How about pencil drawings? You do know that most paint programs come with “artistic filters”.

Such topics tend to become witch hunts where anyone can potentially be as guilty as we want them to be. Justice isn’t blind to protect the successful criminal, she’s blind in order to protect the innocent against prejudice.

So next time you ask for a new rule, think about the cost first.

Btw., during the research for this article, I googled for “teacher england hacker child porn”. Condemn me.

