U.S. Prosecutors Focus on Securities and Wire Fraud in Tesla Autopilot Investigation

U.S. authorities are investigating whether Tesla committed securities or wire fraud by deceiving investors and customers about the self-driving capabilities of its electric vehicles, according to a Reuters report.

Takeaway Points:

  • U.S. prosecutors are examining whether Tesla committed securities or wire fraud by misleading investors and consumers about its electric vehicles’ self-driving capabilities.
  • Elon Musk, the CEO of Tesla, initially cautioned drivers to be prepared to take over when driving.
  • According to the report, in 2022, a Tesla engineer provided testimony in a lawsuit about a fatal Autopilot crash.

Tesla’s Autopilot and Full Self-Driving systems

While they can help with lane changes, braking, and steering, Tesla’s Autopilot and Full Self-Driving systems are not entirely autonomous. Although Elon Musk, the CEO of Tesla, has cautioned drivers to be prepared to take over when driving, the Justice Department is looking into additional remarks Musk and Tesla have made indicating that their vehicles are self-driving.

The firm issued a mass recall after U.S. agencies conducted independent investigations into hundreds of collisions involving Teslas that had Autopilot activated, some of which were fatal.

The sources said that by misrepresenting its driver-assistance technologies to customers, Tesla may have committed wire fraud, which involves misrepresentation in interstate communications. Two of the individuals said prosecutors are also examining whether Tesla committed securities fraud by misleading investors.

According to the report, the Securities and Exchange Commission has also been investigating whether Tesla misled investors about its driver-assistance technologies.

The company said in a filing last October that the Justice Department had requested information regarding Autopilot and Full Self-Driving.

Musk has aggressively promoted the capabilities of Tesla’s driver-assistance technology.

A Tesla video demonstrating the technology, still archived on its website, states: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

Tesla Fatal Autopilot Crash

According to the report, in 2022, a Tesla engineer provided testimony in a lawsuit over a fatal Autopilot crash. The engineer stated that one of the videos, uploaded in October 2016, was meant to demonstrate the technology’s potential rather than accurately depict its capabilities at the time. Nevertheless, Musk shared the video on social media and wrote: “Tesla drives itself (no human input at all) through urban streets to highway streets, then finds a parking spot.”

During a 2016 conference call with reporters, Elon Musk said Autopilot was “probably better” than a human driver. On a call in October 2022, Musk described an upcoming FSD update that he said would let users travel “to your work, your friend’s house, or the grocery store without you touching the wheel.”

Musk is concentrating more on autonomous vehicle technologies as Tesla’s auto sales and earnings decline. Recently, Tesla cut costs by laying off a large number of employees and abandoned plans to release the much-anticipated $25,000 model, which was expected to boost sales.

“Going balls to the wall for autonomy is a blindingly obvious move,” Musk posted on his social-media platform X in mid-April. Tesla shares, which had dropped more than 28% so far this year, rose sharply as Musk traveled to China and advanced the process of obtaining authorization to sell FSD there.

Musk has been making promises about self-driving Teslas for roughly a decade. “Mere failure to realize a long-term, aspirational goal is not fraud,” Tesla lawyers said in a 2022 court filing.

Legal Disputes

According to those involved with the investigation, prosecutors looking into Tesla’s autonomous car claims are moving cautiously because they are aware of the legal obstacles they must overcome.

According to three legal experts not involved in the investigation, prosecutors will have to show that Tesla’s claims crossed the line from legitimate salesmanship into material, knowing misstatements that unlawfully harmed investors or consumers.

U.S. courts have previously held that product claims amounting to “puffery” or “corporate optimism” do not constitute fraud. A federal appeals court ruled in 2008 that a company official’s expressions of optimism, on their own, do not prove an intent to deceive investors.

Justice Department officials would likely seek internal Tesla communications as evidence that Musk or others knew they were making false claims, according to Daniel Richman, a professor at Columbia Law School and a former federal prosecutor. That is a hurdle, but the safety risk posed by an oversold self-driving system also “speaks to the seriousness with which prosecutors, a judge, and a jury would take the statements,” Richman said.

Tragic Collisions

Lawsuits and government investigations have scrutinized Tesla’s claims about Autopilot and FSD.

In recent months, courts and safety regulators have raised concerns that Tesla’s marketing of the technology, including the names Autopilot and Full Self-Driving, may have given consumers a false sense of security.

According to police records, the driver of a Tesla using Autopilot struck and killed a motorcyclist in April. The Washington State Patrol detained the man on suspicion of vehicular homicide. A trooper noted in a probable-cause statement that the driver had “confessed inattention to driving while in autopilot mode, putting trust in the machine to drive for him.”

Regardless of a car’s technological capabilities, a driver in Washington State is still “responsible for the safe and lawful operation of that vehicle,” a state patrol official said.

Also in April, the U.S. National Highway Traffic Safety Administration began examining whether Tesla’s December recall of more than 2 million cars had adequately addressed safety concerns with Autopilot.

The recall resulted from a lengthy investigation that regulators opened after vehicles operating on Autopilot collided with other vehicles at first-responder emergency scenes. After reviewing hundreds of collisions in which Autopilot was active, regulators identified 14 deaths and 54 injuries.

Tesla accepted the recall, which included over-the-air software changes meant to warn careless drivers, while contesting the NHTSA’s conclusions.

NHTSA records state that the agency’s examination found “a critical safety gap between drivers’ expectations” of Tesla’s technology and “the system’s genuine capabilities. This gap resulted in predictable misuse and preventable collisions.”