Tesla drives on Autopilot through a regulatory grey zone

The fatal crash of a Tesla with no one apparently behind the wheel has cast a new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain they navigate.

Police in Harris County, Texas, said a Tesla Model S smashed into a tree on Saturday at high speed after failing to negotiate a bend and burst into flames, killing one occupant found in the front passenger seat and the owner in the back seat.

Tesla Chief Executive Elon Musk tweeted on Monday that preliminary data downloaded by Tesla indicated the vehicle was not operating on Autopilot and that the owner had not purchased the automaker’s “Full Self-Driving” (FSD) package.

Tesla’s Autopilot and FSD, as well as the growing number of similar semi-autonomous driving functions in cars made by other automakers, present a challenge to officials responsible for motor vehicle and highway safety.

The U.S. federal road safety authority, the National Highway Traffic Safety Administration (NHTSA), has yet to issue specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).

There are no NHTSA rules requiring carmakers to ensure their systems are used as intended or to stop drivers from misusing them. The only significant federal constraint is that vehicles must have the steering wheels and human controls required under federal rules.

With no performance or technical standards, systems such as Autopilot inhabit a regulatory grey area.

The Texas crash follows a string of accidents involving Tesla cars driven on Autopilot, the company’s partially automated driving system, which performs functions such as keeping cars in their lanes and steering on highways.

Tesla has also rolled out what it describes as a “beta” version of its FSD system to about 2,000 customers since October, effectively allowing them to test how well it works on public roads.
