Editor's Notebook

The unexpected impacts of collision avoidance


In the U.S. in the 1950s, the skies must have seemed so vast that experts assumed airliners could avoid collisions provided pilots kept their eyes peeled. This belief was proved deadly wrong in 1956, when two airliners collided over the Grand Canyon, killing 128 people (a piece of fuselage from one of the aircraft is pictured above).

That piece of history, discussed in our cover story, explains a lot about today.

No wonder the FAA insists on a step-by-step approach to allowing drones and larger unmanned aircraft to fly regularly in the national airspace. No wonder the FAA and its partners are determined to create versions of collision avoidance software that can handle the anticipated growth in passenger flights and also the exploding demand for drones.

The FAA has learned from history, but the lessons should extend beyond aviation. The space industry might unwittingly be setting itself up for the equivalent of the Grand Canyon collision.

Thousands of small satellites are about to be launched into orbit without clear plans for preventing collisions and debris. A devastating wake-up call in orbit would be far harder to clean up after: it would be as though debris from the Grand Canyon collision were still circulating over the canyon decades later. And once that wake-up call comes, satellites can't land to have new equipment installed.

The history of collision avoidance in aviation also raises questions about the willingness of humans to place trust in technology. In the case of an airliner collision over Germany in 2002, the problem was compounded when the pilot of one of the planes didn't follow the advisory sounded by his collision avoidance software.

Will these kinds of trust issues crop up more often as designers add new levels of artificial intelligence and automation to aircraft, or will pilots and the rest of us learn to accept software in control? If I had to guess, I’d say that humans won’t change and that the best automation software will be written in a manner that recognizes that we have trust issues.

Related Topics

Air Traffic Management and Control · Commercial Aircraft · General Aviation

About Ben Iannotta


Ben became editor-in-chief of Aerospace America in 2013, after two decades as a contributor. He was editor of C4ISR Journal, a military intelligence magazine, and has written for Air & Space Smithsonian, Popular Mechanics and Space News.

