Elon Musk admits latest Tesla self-driving kit is ‘not great’ after crashes

08/24/2021

Tech billionaire and Tesla CEO Elon Musk admitted this week that the new software that enables 'Full Self-Driving' in his cars was "actually not that great".

Musk said on Twitter that Tesla's Autopilot/AI team is "rallying" to improve the software as soon as possible, following the announcement of a formal investigation into crashes involving his self-driving cars.

"FSD Beta" refers to a test version of Tesla's "Full Self-Driving" or 'Autopilot' software – which in theory should enable a car to drive itself without a human driver. Full self-driving has been an aspiration for Tesla since the company was founded.

Up until now, though, the vehicles have required a human driver to remain alert at the steering wheel in case of danger while the Autopilot is activated.

However, a string of accidents involving Tesla drivers who took their eyes off the road while Autopilot was engaged prompted the U.S. government to announce an investigation into the safety of the self-driving system last week.

The investigation will cover more than 765,000 vehicles, almost every vehicle Tesla has sold in America since 2014. The US road safety agency, the National Highway Traffic Safety Administration (NHTSA), says it has identified 11 crashes since 2018 involving Tesla's Autopilot, which injured 17 people and killed one.

It also said that Autopilot is being abused by drivers, some of whom have driven drunk, or even sat in the back seat of the car, while the computer does the work.

However, in a tweet, Tesla claimed that its self-driving cars have an accident rate roughly one-ninth of the US average.

Musk's comments follow Tesla's 'AI Day' event, where he announced plans to build an AI-powered servant robot, illustrated on stage by a dancer dressed up as one.
