DEV Community

Ben Halpern


Tesla Pushes Out Dangerous Software Update, Bungles Rollback Attempt

There are a lot of headlines about Tesla that overstate a story for clicks, and I can assure you this is not one of them.

Before I'm accused of being a hater: I am a Tesla owner and shareholder. I'm no Elon Musk fanboy, but I definitely root for Tesla's environmentally driven mission of accelerating the world's transition to sustainable energy. I'll also say that I personally hope this does not result in reactionary regulation of autonomous driving software in the USA, because current rules seem reasonable and I don't necessarily trust that changes would happen in good faith.

And major hugops to all the developers and admins doing their best with all of this — I'm posting this on DEV because it's an interesting software story that deserves attention, not to be critical of the engineering teams. They strike me as very mission-driven over there, and are generally doing incredible things.

With all that said, this was a doozy...

Here's what went down:

I am one of the customers with the FSD beta myself. I am out of town, but my wife described a pretty scary scene that fits in alongside the reports from the rest of the users.

Rolling back software is easier said than done, but that's no excuse for a company building software that is fundamentally unsafe when buggy.

Musk's latest tweet today...

Tesla certainly gives off the impression of a company that rides some of its software teams to meet absurd deadlines, work through weekends, and generally work to exhaustion: that is the story I've heard from everyone I've met who works in Tesla or SpaceX engineering. They definitely hire people who are passionate about the mission and self-select for the journey, but that kind of culture tends to drive these kinds of mistakes.

At the end of the day, mistakes happen, but failure to communicate effectively has consistently caused frustration between Tesla and its customers/users. Tesla no longer has a PR team, which is fine in and of itself, but this is an organization that badly needs to delegate its communication strategy over matters like this. Not to a PR department per se, but a trusted leader within the organization who can communicate directly.

Tesla is not exactly opaque. They are remarkably transparent in many ways, but these problems hurt their mission and they need to level up. If they are going to be rolling out beta software and pushing a lot of good things forward at a great pace, they need to learn from this — starting at the top.

Happy coding ❤️

Oldest comments (10)

Ben Halpern

At the end of the day, beyond thinking about what is responsible, beyond the big picture of it all, I'm definitely curious about the rollout/rollback approach of this sort of thing in general. It's all familiar in software, but also brand new. Q/A in autonomous driving must be fascinating.
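For what it's worth, the rollout/rollback approach in question is usually some variant of a staged (canary) deployment with an automated health gate: ship to a small cohort, watch telemetry, and revert everything if the signal goes bad. A minimal sketch of that pattern — purely illustrative, with every function name hypothetical and no claim about how Tesla actually does it:

```python
# Hypothetical sketch of a staged (canary) rollout with an automatic
# rollback gate. All callables here are stand-ins, not any real fleet API.

def staged_rollout(cohorts, push, health_ok, rollback):
    """Push an update cohort by cohort; halt and roll back on bad telemetry.

    cohorts   -- ordered groups of vehicles/devices, smallest first
    push      -- callable(cohort): deliver the update to that cohort
    health_ok -- callable(cohort): True if post-update telemetry looks sane
    rollback  -- callable(cohort): revert that cohort to the previous build
    """
    updated = []
    for cohort in cohorts:
        push(cohort)
        updated.append(cohort)
        if not health_ok(cohort):
            # Bad signal: revert every cohort updated so far, newest first
            for done in reversed(updated):
                rollback(done)
            return False  # rollout halted
    return True  # all cohorts healthy


# Simulated run: the second cohort reports bad telemetry
pushed, reverted = [], []
ok = staged_rollout(
    cohorts=[["car1"], ["car2", "car3"]],
    push=pushed.extend,
    health_ok=lambda cohort: "car2" not in cohort,
    rollback=reverted.extend,
)
# ok is False; both cohorts were pushed, then both were reverted
```

The interesting part for a vehicle fleet is everything this sketch hides: "health_ok" for a driving stack means aggregating disengagement and intervention telemetry fast enough to gate the next cohort, which is presumably where the Q/A fascination lies.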

GrahamTheDev

It would be super interesting to see the footage from all of the erroneous manoeuvres.

From that single clip in the second Reddit post, I would immediately be looking at how it recognised the black and yellow end stop on the barrier, as that would have been right in the car's vision and is the sort of thing I would expect a computer to get confused by (I imagine it can easily result in incorrect depth perception).

It would also be interesting to know whether this was an engineer who botched something or the machine learning (ML) that made the mistake.

As I said, pure speculation and probably complete b****cks, but I thought I would put it out there to see if anyone else has better theories / knows the Tesla software model better, so I can be educated 🤣

Keff

This is actually scary. I don't think there should be betas for this kind of software, as bugs can result in death. This kind of software should be tested extensively, and then tested again. Betas should be run in closed environments IMHO, not in the real world, since accidents don't only affect the driver but could harm other innocent people. I understand that some scenarios are difficult to test in closed environments, but delegating that testing to users is not right.

It's not like a beta for a note-taking app or a fridge; we're talking cars that go >60mph on a highway... they should have a bit more responsibility. Not the engineers, of course, but the company.

Matthieu Cneude

The software industry is more preoccupied with speed than with the limitations of the human mind. Nobody cares about the latter. Yet our brains don't function well under absurd amounts of work, especially when complexity is involved. And it almost always is.

To me, it's how this kind of stuff happens.

Matt Ellen

For anyone else who was wondering: FSD is short for Full Self-Driving

Kyle Stephens

You can’t treat mission critical software like a startup.

“Move fast and break things” just won’t cut it.

Forest Hoffman

especially when the thing that is moving fast is a 2-ton vehicle 😮

Jeffrey Fate

I'm going to take this approach:

"Everyone is speculating like crazy.

I think I’m going to just hang out and wait."

There is way too little information in any of these accounts to confirm what you're saying.

Coming from engineers, I'm surprised to see such a reactionary approach here, of all places.

Jay Jeckel

At the end of the day, mistakes happen

Yep, those words are completely true and they are also why I will never have a self-driving car and why I oppose them being tested on the open roads.

It's sketchy enough when games do their beta testing in the wild; it's downright terrifying when it's done with code running in multi-ton boxes of metal flying around at nearly a hundred miles an hour.

Before I'll even consider supporting self-driving vehicles, the entire stack of software MUST be open sourced. I'm reluctant enough to put my life in the hands of software, but I'm definitely not going to put it in the hands of proprietary enterprise software.

Aaron Eiche

I recently ended a 7+ year career in the automotive industry, working for an automotive OEM. To me, Tesla's biggest blind spot comes from its lack of experience as a car company. To outsiders, it seems like Tesla is light years ahead of the competition. In reality, other carmakers can match its self-driving capabilities, but have been extremely hesitant to roll out the technology, whereas Tesla will throw it into vehicles before it's finished. The automotive industry is one of the most heavily regulated in the United States, and I continue to be appalled at what Elon Musk gets away with promising and delivering catastrophically, both from a product standpoint and from a securities standpoint.

I don't want to dismiss the impact Tesla has had on the industry: it's been great in terms of user experience and as a new competitor entering the market. For what it's worth, I think Tesla is much further from Level 3 self-driving than they're trying to sell. I think the industry as a whole is at least a decade away from Level 4, if ever.