|
I only just made it in 5
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S. Thompson - RIP
|
|
|
|
|
I won't tell if you don't.
Jeremy Falcon
|
|
|
|
|
Wordle 934 6/6*
β¬β¬π¨β¬π¨
β¬β¬β¬π¨π¨
β¬π¨β¬π©β¬
π©π©β¬π©π©
π©π©β¬π©π©
π©π©π©π©π©
Happens to all of us
Happiness will never come to those who fail to appreciate what they already have. -Anon
And those who were seen dancing were thought to be insane by those who could not hear the music. -Friedrich Nietzsche
|
|
|
|
|
Quote: Wordle 934 3/6
β¬π¨β¬π¨π¨
β¬π¨β¬π©π¨
π©π©π©π©π©
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Wordle 934 5/6
β¬π¨β¬π¨β¬
β¬β¬π¨π©π©
β¬π©π¨π©π©
π©π©β¬π©π©
π©π©π©π©π©
Jeremy Falcon
|
|
|
|
|
How long before the self-driving car claims stop?
I am talking about the claim that they will replace all cars, not about autonomous vehicles driving around a warehouse.
Consider this scenario: in 2021, 42,000+ people died in automobile accidents in the US. About 1,000 were children. Note that injuries are far higher.
So let's say self-driving cars worked and deaths dropped by two orders of magnitude: 420 people and 10 children.
Now, in the vast majority of modern accidents, a driver is found to be at fault: drunk, texting, distracted, reckless, etc.
So in the scenario above, with a self-driving car, no person can be at fault, because no one was driving.
Now in some of those cases, especially with children, someone is going to blame the car. Not the specific car, but the manufacturer of the car.
And then they will sue them for 10 million. Or 100 million.
Consider that just in the past week a door (sort of) blew off an airplane and all planes of that type were grounded.
Is the government going to ground a couple million cars? It might even be possible with self-driving cars; just send a signal.
One of the self-driving car companies is likely going out of business because its car drove to the side of the road with a pedestrian underneath.
Now if a person had been driving, the driver presumably would have been at fault - if anyone could have determined the correct behavior in that bizarre case. Slamming on the brakes in the middle of a highway might not be the best action. So what is right? Who gets to decide that?
And even if the action was exactly right, is a lawsuit against the company still going to happen?
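To put rough numbers on the liability math above, here is a quick back-of-the-envelope sketch. The reduction factor and the settlement figure are the post's own assumptions, not real data:

```python
# Back-of-the-envelope liability math (all inputs are the post's assumptions, not data).
us_road_deaths_2021 = 42_000       # approximate US traffic deaths in 2021
child_deaths_2021 = 1_000          # approximate child deaths, per the post

reduction_factor = 100             # "two orders of magnitude" improvement -- assumed

remaining_deaths = us_road_deaths_2021 // reduction_factor       # 420
remaining_child_deaths = child_deaths_2021 // reduction_factor   # 10

# Hypothetical $10M settlement per fatality, as floated in the post.
settlement_per_fatality = 10_000_000
annual_exposure = remaining_deaths * settlement_per_fatality      # $4.2 billion/year

print(f"{remaining_deaths} deaths/year, {remaining_child_deaths} children")
print(f"Potential annual exposure: ${annual_exposure:,}")
```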
|
|
|
|
|
I think most programmers would never get in a self driving car. We all know how buggy our own code is.
Heh, just thought buggy Buggy code.
Iβve given up trying to be calm. However, I am open to feeling slightly less agitated.
Iβm begging you for the benefit of everyone, donβt be STUPID.
|
|
|
|
|
MarkTJohnson wrote: most programmers would never get in a self driving car.
20 CEOs of avionics companies were all on a plane, just before taxiing away from the terminal. The purser came down the aisle, and whispered to each CEO that their company's avionics are controlling the plane. 19 out of 20 CEOs got off the plane immediately, while the 20th stayed in place.
The purser said that the remaining CEO must be very confident in the company's avionics, to which the answer was: "With the programmers we employ, the plane won't even take off!"
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I was expecting a punchline that takes a shot at CEOs. I wasn't expecting a dig at programmers.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
Buggy2
Software Zen: delete this;
|
|
|
|
|
I agree that automatic driving has the potential to drastically reduce the number of car accidents. However, given the litigious climate in the U.S. (and, increasingly, in the rest of the world), I doubt whether any car manufacturer will actually advertise "automatic driving" as a feature.
The only way that I see this happening is for car manufacturers to be required to submit their cars for rigorous external testing, in return for receiving legal indemnity from lawsuits. Something similar exists for vaccines.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Daniel Pfeffer wrote: The only way that I see this happening is that car manufacturers be required to submit their cars for rigorous external tests
I'm pretty sure the NHTSA is already responsible for that. For testing the self-driving features? Probably not so much. I don't really see a government agency keeping up.
Daniel Pfeffer wrote: , in return for receiving legal indemnity from lawsuits. Something similar exists for vaccines.
Reagan indemnified pharmaceuticals back in the 80s. Sure, there's plenty of testing going on, but holding Big Pharma accountable should be a thing.
[Edit]
Worse, it's actually called the National Childhood Vaccine Injury Act of 1986. Can't sue for harming your kids with a bad vaccine. That sounds so wrong...
|
|
|
|
|
The problem is not the car or the software, it's the environment.
My understanding is that the auto-drive vehicles continuously scan the environment, looking at the road, lines on the road, signs, other vehicles, etc. This works great if everything is marked well and signs exist.
However, if the road is unmarked (fresh pavement), the lines are badly worn, or there are old-n-new lines, things get dicey.
Recently I drove through construction where there were three sets of lines; two looked fresh, and all ended abruptly at different places. *I* had a problem figuring out where my lane was. How is software going to do it?
Self-driving vehicles will continue until someone connected to money gets seriously injured or killed. THEN the house of cards will come crashing down.
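As a rough illustration of why that construction-zone case is hard, here is a toy sketch of the kind of heuristic a lane-keeping system might use to choose between conflicting line candidates. The scoring weights and the confidence threshold are entirely made up for illustration; real systems fuse camera, map, and radar data and work nothing like this simple version:

```python
from dataclasses import dataclass

@dataclass
class LaneLineCandidate:
    offset_m: float    # lateral offset from the vehicle centreline, in metres
    contrast: float    # 0..1, how fresh/visible the paint appears
    length_m: float    # how far ahead the line can be tracked

def score(c: LaneLineCandidate) -> float:
    # Toy scoring: favour high-contrast, long lines (weights are invented).
    return 0.7 * c.contrast + 0.3 * min(c.length_m / 50.0, 1.0)

def pick_lane_line(candidates: list[LaneLineCandidate],
                   min_confidence: float = 0.6) -> LaneLineCandidate | None:
    """Return the most plausible lane line, or None if nothing is trustworthy."""
    if not candidates:
        return None
    best = max(candidates, key=score)
    # With three overlapping sets of markings, the best candidate may still fall
    # below the threshold -- the software equivalent of "where is my lane?"
    return best if score(best) >= min_confidence else None

# Fresh paint, badly worn paint, and a leftover line from the old lane layout:
lines = [LaneLineCandidate(1.8, 0.9, 20.0),
         LaneLineCandidate(1.7, 0.4, 60.0),
         LaneLineCandidate(2.5, 0.5, 15.0)]
print(pick_lane_line(lines))   # picks the fresh line here; lower its contrast and it returns None
```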
|
|
|
|
|
|
These safety problems are addressed in autonomous driving cars.
I work in the automotive industry, and I can assure you that I would buy an (electric) AD vehicle from any European or Japanese car manufacturer without hesitation.
I still have my own doubts regarding quality of EVs from China. And do not get me started about the Musk company - nice toy, best car to die in.
|
|
|
|
|
Rage wrote: These safety problems are addressed in autonomous driving cars.
Not sure I understand your statement.
For example, self-driving cars are not going to prevent heated seats from catching on fire. That is one of the recalls.
And Tesla has a recall in effect to reduce the ability of their cars to self-drive. So very specific to self-driving.
|
|
|
|
|
I don't see how any of that leads to OP's claim that self-driving cars will never happen. Recalls happen. All the time. My dad was a mechanic for over 40 years, and recalls have provided plenty of work, even for the silliest things. Buggy self-driving software? That's an over-the-air update; I don't see that as a big deal.
I suppose retro-fitting an existing car with new sensors would be something else. But then, if there was a need for that, the manufacturers would just take the feature away and claim it was never sold as "fully self-driving" anyway.
|
|
|
|
|
Just a small note on the self driving car that dragged the person when it attempted to pull off to the side of the road. The person was first hit by a car driven by a human driver. Their body was thrown in front of the self driving car. The self driving car could not stop. (Physics). The human driver ran from the accident and is still being looked for.
Unfortunately, the person fell into a spot that was outside the range of the car sensors. The car proceeded to try and pull over to wait for help with the accident and made things worse by running them over.
Yes, a human being would get out of the car and look to aid the injured person before trying to move their car (unless they panicked and simply drove away.)
Yes, the self-driving car needs to have its software upgraded to include the case where a body is thrown in front of it, it collides with said body, and it cannot locate the body after hitting it. In that case it needs to simply stop, call 911, and wait for human assistance.
Yes, there are many unique things that a car can encounter. Will software ever be up to the challenge? I honestly don't know. But I do know that the current carnage on our highways will continue, with or without automated driving help. Software can be upgraded; people, not so much.
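For what it's worth, that "stop, call 911, wait" fallback is essentially a small state machine. Here is a minimal sketch; the states, sensor flags, and policy are all invented for illustration and don't reflect any real vendor's software:

```python
from enum import Enum, auto

class AVState(Enum):
    DRIVING = auto()
    COLLISION_DETECTED = auto()
    STOPPED_AWAITING_HELP = auto()

def next_state(state: AVState,
               impact_detected: bool,
               struck_object_tracked: bool) -> AVState:
    """Hypothetical post-collision policy: if we hit something and can no longer
    see it, stop in place and summon human help instead of trying to pull over."""
    if state is AVState.DRIVING and impact_detected:
        return AVState.COLLISION_DETECTED
    if state is AVState.COLLISION_DETECTED:
        if not struck_object_tracked:
            # Lost sight of whatever we hit (it may be under the vehicle): do not move.
            # Brake, turn on hazards, notify emergency services, wait for a human.
            return AVState.STOPPED_AWAITING_HELP
        # Object is still tracked and clear of the vehicle: normal logic can resume.
        return AVState.DRIVING
    return state

# The scenario from the post: impact, then the body disappears from the sensors.
s = next_state(AVState.DRIVING, impact_detected=True, struck_object_tracked=False)
s = next_state(s, impact_detected=False, struck_object_tracked=False)
print(s)   # AVState.STOPPED_AWAITING_HELP
```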
|
|
|
|
|
Gary Stachelski 2021 wrote: Yes, a human being would get out of the car and look to aid the injured person before trying to move their car (unless they panicked and simply drove away.)
Not sure I agree with that.
Not even sure I agree that that is the best action to take.
As I noted, that was on a highway. And at night.
Not sure about you, but for me, slamming on the brakes at any time on a highway is not something that I consider safe. Not for me and not for the cars behind me.
Also as a driver I have been in an accident where I had no idea what had happened. Also on a highway. So the 'correct' behavior becomes much less clear.
Gary Stachelski 2021 wrote: the self driving car needs to have it's software upgraded to include the case where a body is thrown in front of it,
For a driverless vehicle that means programming every possible scenario. That is just not going to happen.
Some examples.
I have been on a higher-speed road where the car in front of me hit a bumper that had fallen off another car and launched it into my car.
I have seen a car get sideswiped (literally knocked off the road) because it came from behind and sped up in a turn lane beside a long line of stopped cars, and one of the cars in the stopped line decided to change lanes abruptly. I saw the car speed up because I was further down the line of cars. I am not sure it was even visible to the car that changed lanes.
I have seen a bicyclist going the wrong way down a one-way street at night with no lights, moving quickly. I actually know that person, and he had previously been in an accident doing exactly the same thing, except that time he was hit and went flying over the car.
Note that in these scenarios it is not only that the car must be programmed to handle it but that the car maker must be able to show that what it did was the correct and best way to handle it.
Gary Stachelski 2021 wrote: Will software ever be up to the challenge? I honestly don't know.
That however is the point. When those accidents do occur the car maker will be sued for large amounts of money.
|
|
|
|
|
I don't think it will - instead, I think public opinion will shift to revulsion at the whole idea of manually driving a car. Think about existing legislation: seat belts, ABS, speed limiters, the recent whole-of-Wales reduction of the default speed limit from 30mph to 20 - it's all about increasingly small reductions in death and serious injury. Self-driving offers the possibility of a large reduction (which will be touted as total prevention), and there isn't a politician who dares fight that! Car companies being sued as a result of their products failing? It happens already, and they probably have a budget for it because it's cheaper to be sued than to do the job properly ...
And as the number of self-driving cars increases and the communication between them (to increase safety and economy) rises as well, the accident rate will plummet. When humans realise that they can do what they want (legally) while the car does the work, they will leap at the chance: social media, messages, phone calls, alcohol, drugs, TV, pr0n, ... Stuff they do at the moment anyway, while they are supposed to be in control!
I don't commute any more, but my regular commute was an hour each way with the lemmings on a motorcycle, and the things I've seen while traffic was moving at 70mph were horrific: phones, texts, newspapers, even one guy with his lappie propped open on the dashboard, typing away and steering with his elbows!
Self driving cars will (eventually) be safer: and they are - probably - the future whether we like it or not.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
OriginalGriff wrote: It happens already and they probably have a budget for it because it's cheaper to be sued than to do the job properly ...
Didn't this famously happen in the '70s with the Pinto, which had a design flaw making it liable to burst into flames in an accident, but it would have been more costly to retool than to pay out when one caught fire?
|
|
|
|
|
|
I've had almost the same thoughts about autonomous vehicles. There's a whole range of accidents that occur because meat sacks are in control. Have you ever arrived at your destination and realized you have no clear memory of the journey? There are other things our brain does to edit reality. There's a well-known example of a group of people asked to watch a basketball game, count the passes, and answer questions afterwards. The first question is "Did you notice the guy in the gorilla suit?" Most people miss him, because their brain edits him out as "not important" to the task. Similarly with driving - or really any activity.
My thought is that several things are going to happen. Firstly, insurance companies are going to look at the numbers and raise the rates on non-autonomous vehicles, to the point where the average Joe is going to be motivated to move to an AV. Then, as non-AVs move into the minority and communication between AVs becomes standardized, NAVs will be required to have transponders that alert AVs to their presence. Eventually, NAVs will be banned, except in tightly controlled situations (e.g. parades).
I expect that as the technology grows, there will be some terrible incidents. But like the airline industry, investigations and recommendations will continue to make AVs safer over time.
"A little song, a little dance, a little seltzer down your pants"
Chuckles the clown
|
|
|
|
|
It isn't a matter of whether they can be safer.
It is what will happen every single time that any sort of accident does occur.
|
|
|
|
|
Slightly distracting from your main point, maybe, but what I don't understand about self-driving cars is that everybody is doing his own thing.
Why not make this a collaborative effort? So when one unanticipated scenario comes up, someone writes a fix once, the community at large tests it (like bug fixes in open source - in theory) and every manufacturer gets to benefit from it. It seems to me things would evolve a lot more quickly than having everyone roll his own version, no?
Is this a matter of patents? Or each car manufacturer using different types of sensors, so there isn't one common/re-usable source of data that can be acted upon?
|
|
|
|
|