We know bits and pieces, but the specific combination for self-driving cars could play out a lot differently as a whole than what you would get from piecing it together and guessing. Take texting and driving, for example. Texting had been around for a long time before texting-while-driving became a problem. It only became a large enough problem after the iPhone was released and the market shifted to touch-based smartphones. Before that, phones had tactile buttons, so for the most part people could text reliably (e.g. with T9) without taking their eyes off the road. After the market shift, people started getting into a lot more accidents.

Another example: "hoverboards". A lot of them are prone to randomly catching fire, prompting airlines to ban them for obvious reasons. We knew how lithium batteries work. We knew how Segways work. But nobody really foresaw that.
It does not make sense to speculate something into law. We already have laws around electronics, cars (and in fact it is a really difficult process to become an automotive manufacturer), etc. I'm sure we will eventually see some laws around self-driving cars specifically. But the right time to do that is when we know which aspects have proven to be dangerous. Guesses get us nowhere toward real safety. And perhaps speculative safety laws will give us imagined safety, which is even worse.
I don't think our views are actually that different. It's just difficult to communicate effectively in the comments section.
At the level at which you've defined the problem, I agree that preventive legislation would be counter-productive.
I was imagining regulation aimed at a much lower level, like requiring these systems to be programmed in a safe subset of C (if you want to use C at all), because overflows, null references, etc. are dangerous.
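For what it's worth, here's a minimal sketch (the function and names are hypothetical) of the kind of defensive pattern a MISRA-style safe C subset pushes you toward: every pointer and index is validated before use, so a bad input becomes a recoverable error instead of undefined behavior.

```c
#include <stddef.h>
#include <stdbool.h>

/* Hypothetical lookup into a buffer of sensor samples. A safe-subset
 * rule would reject the "obvious" version (*out = samples[index];)
 * because it has no null check and no bounds check. */
bool read_sample(const int *samples, size_t len, size_t index, int *out)
{
    /* Validate pointers and the index before dereferencing anything. */
    if ((samples == NULL) || (out == NULL) || (index >= len)) {
        return false;
    }
    *out = samples[index];
    return true;
}
```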