P.W. Singer Wastes an Opportunity in the Atlantic

Peter W. Singer is arguably the most famous scholar of drones and robotic warfare today.  His book Wired for War probably did as much to introduce unmanned aircraft to the popular consciousness as any single work.  His article in the Atlantic on July 19th blew a huge opportunity to advance the discussion of unmanned aircraft regulation.  



Singer and his co-author, Jeffrey Lin, spend most of the article arguing that an industry code of conduct released by AUVSI is not adequate to regulate the industry.  To the best of my knowledge as an AUVSI chapter president, AUVSI has never contended that industry ought to self-regulate.  The point of the code of conduct was to acknowledge that the unmanned systems industry takes concerns about unmanned aircraft seriously and has an internal expectation of ethical behavior.  While everyone agrees that this is not a substitute for regulation, it is a really useful complement to it, especially in an industry that is still as small and close-knit as the unmanned systems industry.  The unmanned aircraft industry is dying to be regulated.  Being regulated is a huge improvement on being prohibited, and currently we are prohibited from all applications that are not military or law enforcement.

A Navy Global Hawk UAV soars over Maryland. (Reuters)


Personally, I'm outraged that pilots die every year fighting forest fires (six already this year)--there is not a single technological reason why these flights need to be manned.  There is no cause for laws against farmers monitoring their crops with an RC plane.  And FedEx and UPS should definitely be allowed to robotically deliver my Amazon Prime order across the country between lunch and the time I get home.  The current regulatory regime is outmoded, unproductive, uneconomical, and immoral.  It is literally killing people.

Rather than attacking a straw man, the Atlantic article should have offered a path forward.  To start with, there are four basic concerns that people have around unmanned aircraft.  Sensationalists will bandy about the drone label (technically, a drone is a male insect or an expendable aerial target) and try to conflate these concerns into something scary and incomprehensible, but each issue deserves separate treatment.  I'd also like to call the Hizook audience's attention to the fact that every robotic domain is going to face these same issues.  The laws and norms we develop around unmanned aircraft will influence anything with a sensor and data collection ability, which is to say every robot.

(1) National Security Ethics and Policy:  

I'd like to leave this one aside and focus on domestic unmanned aircraft.  All this talk of missiles scares the shit out of people.  Domestic aircraft do not have weapons, regardless of how they are piloted.

(2) Safety:  

Everyone believes that unmanned aircraft should be held to the same standard of safety for other aircraft and people on the ground as manned aircraft.  Unfortunately, the FAA does not have a way of assessing what constitutes this level of safety.  No system is ever perfect, but the current technology allows unmanned aircraft to be at least as safe to others as manned systems, or safer, by any objective, statistical measure.

(3) Privacy:  

This is the real biggie, and the issue where the other robotic and technology domains have a dog in the fight.  I do not want public or private entities (to include individuals--especially my mom and my ex) to gather data on my personal movements.  Just by going out in the open, I should not have surrendered the ability to control the use of my likeness, have others know my location, or otherwise be personally identifiable to those I haven't given some permission to.  Law enforcement brings up a whole host of other issues.  Law enforcement already has a fair degree of regulation around it, so again I'll set aside the special case and focus the argument on commercial use.

As a basic premise in this area, an astute scholar ought to have proposed platform agnosticism, and further pointed out that what we're afraid of in privacy is not the technology but the people on the other end.  There is nothing special about being spied on from a robotic aerial platform--I don't want to be spied on by an unmanned ground sensor, a manned aircraft, or a guy in plainclothes--unless I've done something egregiously wrong.  Any laws in this area ought to have clear boundaries that depend on information use and the type of data retained.  To accommodate new technology, regulation should not depend on collection method, technology, or platform distinctions.  The unmanned systems industry supports strong privacy policies as long as they are evenly applied across manned and unmanned systems in air, ground, and space.  This is the reason why AUVSI is actively and constructively engaged with the ACLU.  I'm not sure why Singer and Lin failed to include this.  It comes up as the third result if you Google AUVSI and ACLU.

(4) Disruption: 

This is the other big one lurking in the background for every robotic application.  The introduction of any robot is going to change the way that activities around the robot are organized.  I don't want to minimize the psychic and economic impact that changing the status quo has on anyone--I have lots of friends who are pilots.  Allowing unmanned aircraft into the national airspace will change how they earn their living and the way that they think about themselves.  In the robotics industry, we have to show real sensitivity any time we make these kinds of changes.  The fears and insecurities of today should not rob us of the progress in the human condition that robotic technology offers.


Unmanned aircraft are the canaries in the coal mine for robotics.  At some level, these fears drive the resistance to all robotic applications.  Particularly on privacy and data, the rules that come to govern unmanned aircraft will inevitably impact other domains.  The good news is that by being the first through the regulatory wringer, they may break the bureaucracy in on these issues.  When we set aside the sensationalism, it is clear that each of these issues has dedicated (sometimes even redundant) regulators on the federal level:

National Security: Department of State, Department of Defense, and the various agencies under the National Security Council

Safety: Department of Transportation (Federal Aviation Administration)

Privacy: Federal Communications Commission and the Department of Justice

Disruption: Department of Commerce, economic development authorities, state government, market solutions


None of these issues is insurmountable, and none of them is utterly unlike other industries that are very successfully regulated today.  We do not need to be afraid of unmanned or remotely piloted aircraft technology.  There is a path forward where we can all reap the benefits of robotic technology in the sky without having to fear for our privacy and safety.  We have an amazing future if we keep our heads and do this right.



Robert Morris is a former Army officer who led the first RQ-7B Shadow (UAV) platoon in Afghanistan.  Subsequently, he consulted to the Navy, primarily on unmanned systems issues, with Deloitte Consulting.  He is currently pursuing graduate studies at Carnegie Mellon University and serving as the chapter president of AUVSI-Pittsburgh.



A few random thoughts:


Existing standards for autonomous systems typically make device-class distinctions that determine the amount of regulation and the quality assurance that is required.  Hizook previously covered ISO and ANSI standards for robots, which define multiple robot classes subject to different levels of regulation.  The FDA has similar classes for medical devices.  UAVs should probably have something similar, coupled with airspace classifications and restrictions.



UAVs are very different from many other types of ground-based robots.  The privacy implications of a mobile manipulator (say, a PR2) operating in my own home are more akin to a personal computer than a spy satellite.  The fear with UAVs is simply an extension of the latter.  I think a more apt comparison would be with "Google Glasses" and other pervasive media devices.  Frankly... this boondoggle has been festering in the background for some time.  There needs to be some way for individuals to "opt out" of all pervasive monitoring.  People are reluctant to trust government (or industry) in any case... but it doesn't matter.  The genie is out of the bottle.  Incidentally, David Marusek sorta touched on this in his sci-fi books (Counting Heads and Mind Over Ship) with the "media bees."



I am also disappointed by Singer's essay.  It's too much bellyaching without any actionable suggestions.



The "terrorism" concerns are (to a degree) moot.  The fact is... it's too easy for any basic engineer to build a capable UAV, and regulation won't prevent it. Again, the genie is out of the bottle.  If federal regulations are enacted to prevent legit uses, it's just going to cause more long-term harm than good -- exactly what happened with home chemistry and home rocket building -- and I'll be livid!   

—Travis Deyle

Good work - nice piece.  But what about the words we use?  There is a clear conceptual lineage from drones down to whatever other machines are considered robots or robotic, but it's kinda not warranted.  A Reaper or Predator drone isn't a robot; it's a super-tech 1:1 scale R/C plane with weaponry and autopilot.

Shouldn't serious journalists, commentators, and even technosnarky manchildren with digital bullhorns (speaking of myself) make an effort to draw a very clear line between what exactly gets the label drone or robot or robot drone? Wouldn't that be a positive step toward clarifying legislation and avoiding sweeping generalizations of both perception and public policy? 

If I may offer an excerpt to further muddle the issue: WarBot Update: What to Call the Drones Now that They’re here at Home

"A drone is under control, so does a machine become a robot if it has a measure of autonomy? Does the label also depend on function, manner of movement, or distance from the controller? Are Google’s self-driving cars drones or robots? Čapek’s robots were all humanoid, so does form also have some bearing? And what of artificial or non-biological intelligence – how does that fit in? At the end of the day, until we have a standardized system, isn’t the robot label just a flavor enhancer?

No one’s bothered to codify or standardize the lingo, but clearly we’re already operating on some subliminal guidelines. Garbage trucks and 10-story mechanical parking garages are partially autonomous, and a hobbyist's mini helicopter is a UAV, but we can all agree that none of these would be called drones or robots or robot drones." (more here: http://goo.gl/whDQw)


Reno www.anthrobotic.com