How a self-driving car should behave when an unavoidable accident occurs has been hotly debated in the automotive world. Should the self-driving technology be held responsible? How does the car choose the “best” way to crash? While companies like Mercedes-Benz have taken stances like “protect the passengers as much as possible,” Tesla has taken a very different approach, one that might appear to free it of legal liability: hand control back to the driver in the second before the crash.

According to a recent report by the National Highway Traffic Safety Administration (NHTSA), Tesla models are aborting self-driving mode moments before crashing. The agency investigated 16 crashes in which Teslas hit stationary emergency vehicles. It found that Autopilot was indeed engaged, but that, “on average,” it relinquished control of the vehicle “less than one second prior to the first impact.” The report also found that the impending collisions would have been identifiable to an attentive driver as early as eight seconds before impact.
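To put those numbers in perspective, here’s a rough back-of-the-envelope sketch in Python. The log format, the field names, and the ~1.5-second reaction time are our own illustrative assumptions (they’re not from the NHTSA report or Tesla’s telemetry); only the eight-second and sub-one-second figures come from the report.

```python
from dataclasses import dataclass

@dataclass
class CrashLog:
    # Hypothetical per-crash record; not Tesla's actual telemetry format.
    hazard_visible_s: float       # hazard first identifiable, in seconds before impact
    autopilot_disengage_s: float  # Autopilot handed back control, in seconds before impact

TYPICAL_REACTION_S = 1.5  # rough driver brake-reaction time (illustrative assumption)

def summarize(log: CrashLog) -> str:
    window = log.autopilot_disengage_s          # time the driver actually had
    head_start = log.hazard_visible_s - window  # warning time lost to the late handoff
    verdict = "insufficient" if window < TYPICAL_REACTION_S else "plausible"
    return (f"Driver got {window:.1f}s to react ({verdict} vs. a "
            f"~{TYPICAL_REACTION_S}s reaction time); the hazard was "
            f"identifiable {head_start:.1f}s before the handoff.")

# A crash matching the report's averages: hazard visible eight seconds out,
# Autopilot disengaging under a second before impact.
print(summarize(CrashLog(hazard_visible_s=8.0, autopilot_disengage_s=0.9)))
```

Even with generous assumptions, a sub-second handoff leaves the driver far less time than a typical braking reaction requires, which is the crux of the whole controversy.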

Before we speculate, some other details are worth noting. The report also stated that in a majority of the incidents, the forward collision warnings and automatic emergency braking systems did in fact activate. While it’s not clear how early those systems kicked in, this suggests the drivers still did not react quickly enough to reclaim control and avoid the crashes.
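For context on how those systems layer, here’s a heavily simplified sketch of the escalation logic in a generic driver-assistance stack. To be clear, this is not Tesla’s implementation; the time-to-collision thresholds and names are purely illustrative.

```python
def assistance_response(time_to_collision_s: float, driver_braking: bool) -> str:
    """Generic escalation ladder for a driver-assistance stack (illustrative only).

    Real systems fuse radar/camera estimates of time-to-collision (TTC) and
    escalate: warn the driver first, then brake automatically as a last resort.
    """
    if driver_braking:
        return "driver has taken over; assistance stands down"
    if time_to_collision_s > 2.5:
        return "monitor"                    # hazard tracked, no alert yet
    if time_to_collision_s > 1.5:
        return "forward collision warning"  # audible/visual alert to the driver
    return "automatic emergency braking"    # last-resort braking by the car

# The pattern the NHTSA describes maps onto the bottom rungs of this ladder:
# warnings fired, but with so little margin that an unprepared driver
# could not brake or steer in time.
for ttc in (3.0, 2.0, 1.0):
    print(f"TTC {ttc:.1f}s -> {assistance_response(ttc, driver_braking=False)}")
```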

So what does all of this mean? A common theory online is that Tesla does this intentionally in order to dodge legal ramifications. If the crash didn’t occur while Autopilot was active, the company can’t be blamed, right?

Wrong, technically.

Anyone who owns the technology should be aware that, despite being marketed as fully self-driving, it is still nothing more than a driver-assistance system, and that overall safety on the road remains the driver’s responsibility. It would also be fairly easy (we imagine) to argue in a court of law that handing back control less than a second before a collision is not sufficient to wash Tesla of all blame, should it be blamed in the first place.

That is not to say that Tesla IS free of all blame. In the reported incidents, Autopilot still failed to brake adequately. And whether or not the company is legally responsible, this remains a major concern if consumers are to trust the system in the future. Tesla has a bit of a history of over-advertising its products and of trying to hide any and all shortcomings from the public.

Tesla, predictably, has not issued any kind of statement or responded to questions at this time.

Regardless of the reason, it’s hard to recommend Tesla’s self-driving technology right now. As cool a concept as it is, we need to know it’s implemented safely before we can use it with any regularity. And even then, we can’t fully rely on it.