|
That didn't help us with chess and Go. In a battle between engineering and evolution, my short-term bet is on evolution, and my long-term bet is on engineering.
Resistance is futile, robots will assimilate you AND your cat.
|
|
|
|
|
Evolution doesn't have anything to do with the ability to brake in time when a pedestrian jumps into your way in an unexpected place while you're controlling a 1500 kg mobile object at 38 mph. Or with pretty much any other situation we have to deal with when controlling a car. If anything, the instincts evolution gave us will make us behave inappropriately.
If anything, most of evolution taught us that it's best to run over any pedestrian who's stupid enough to run into our path - one less competitor in our hunt for food! In that respect, most autonomous systems are already better than that before they even start training!
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
|
|
|
|
|
Stefan_Lang wrote: Evolution doesn't have anything to do with the ability to brake in time when a pedestrian jumps into your way Indeed? So you needed someone to teach you how to detect a pedestrian jumping into your way? You did not have a naturally evolved image-processing system (among other things) in that grey matter between your ears? And a neural net that is smaller by orders of magnitude, with only a tiny fraction of the training time (no matter how you measure it), will do the job better?
I wish I could share your optimism.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
I did also say "at 38 mph". Typically, a human moving at 38 mph through pretty much all of evolution was seeing only one thing: the ground he was about to hit - not the kind of experience that goes into the genes, except into the genes of the onlookers. If evolution taught us anything, it is that moving at 38 mph is fatal.
Now, of course, if your forefathers were running through the jungle, they certainly did learn to react to a creature moving into their path. But, depending on the number of claws and teeth (or raised clubs) on that creature, stopping might not have been the preferred reaction.
I'm not saying this is not an important bit of information when deciding that you need to slow down when something moves into your path, but it's so different from the evolutionary training that the lesson learned can pretty much be reduced to: if something moves into your path, slow down. And that is trivial to learn for any autonomous system, no matter how small.
In the case of this accident, that raises the question of why the car's sensors did not detect the woman, or did not identify her as an actual obstacle. Apparently the driver didn't either, or at least not in time, and his millions of years of evolution didn't help him in any way there. But the car's systems should have been able both to detect the woman (using the LiDAR sensors) and to react to her (thanks to super-human reaction times). The investigation should focus on these questions.
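The difference those super-human reaction times make can be put in numbers with a back-of-the-envelope stopping-distance calculation. This is only a sketch: the reaction times and the deceleration figure are assumed round numbers for illustration, not measured values for any particular car or driver.

```python
# Rough stopping-distance comparison at 38 mph.
# All figures below are assumptions for illustration, not measured data.

MPH_TO_MS = 0.44704    # metres per second per mile per hour
DECELERATION = 7.0     # m/s^2, rough full-braking value on dry asphalt (assumed)

def stopping_distance(speed_mph, reaction_time_s, decel=DECELERATION):
    """Distance covered while reacting, plus distance to brake to a stop."""
    v = speed_mph * MPH_TO_MS
    reaction_distance = v * reaction_time_s          # travelled before braking starts
    braking_distance = v * v / (2 * decel)           # v^2 / (2a) to a full stop
    return reaction_distance + braking_distance

human = stopping_distance(38, reaction_time_s=1.0)   # ~1 s: perceive, decide, move foot
machine = stopping_distance(38, reaction_time_s=0.1) # ~0.1 s sensor-to-brake (assumed)
print(f"human:   {human:.1f} m")    # ~37.6 m
print(f"machine: {machine:.1f} m")  # ~22.3 m
```

The braking distance is identical in both cases; the entire difference comes from the distance travelled during the reaction delay.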
|
|
|
|
|
Stefan_Lang wrote: I did also say "at 38 mph". Typically, a human moving at 38 mph through pretty much all of evolution was seeing only one thing: the ground he was about to hit - not the kind of experience that goes into the genes, except into the genes of the onlookers. If evolution taught us anything, it is that moving at 38 mph is fatal.
Now, of course, if your forefathers were running through the jungle, they certainly did learn to react to a creature moving into their path. But, depending on the number of claws and teeth (or raised clubs) on that creature, stopping might not have been the preferred reaction.
I would say your well-thought-out logic is interfering with an ill-thought-out rant.
|
|
|
|
|
That's exactly the reason why I am going to stick to driving my car myself rather than handing it over to an AI. If a fool suddenly jumps in front of my car, I am going to run over the guy. I don't want the AI to brake hard and send my head into the steering wheel.
|
|
|
|
|
Homo sapiens has only been around for 200k years. We have only been using faster-than-human modes of transportation, starting with horses(?), for around 6k years. Regardless of all that, with a human you are still counting on that person's physical limitations (age, reaction time, visual acuity, etc.), their attention span, and the skills they have acquired to be a good driver. With an AI you have (hopefully) a system that pays attention 100% of the time, can aggregate and build upon the past experiences of multiple individual systems, and can have sensors that surpass what humans can see. Look at it this way: think of the quality of cars before robots were used in mainstream production. Tolerances were tightened and quality improved by using them. Over time, I would think we would get to a point where cars could talk to each other and even help avoid accidents altogether.
|
|
|
|
|
That's very nice, but falls short of the mark.
milo-xml wrote: you are still counting on that person's physical limitations (age, reaction time, visual acuity, etc), their attention span, and the skills they have acquired to be a good driver Quite so. Since when is any AI capable of foreseeing future events by drawing on experience? So far only we have been able to do that - not even our closest relatives can.
Here we have stretches of highway without any speed limit. I really enjoy a ride at the maximum speed my car is capable of, usually while keeping a good eye on what happens in the lanes to the right. Most people see you coming and wait until you have passed, but there is always a 'kamikaze' who pulls out right in front of your nose at a fraction of your current speed. An AI would not react to them until they actually pull out, but by then it may already be too late. How do I notice them ahead of time? I don't know. It must be something in the way they behave prior to changing lanes, but I notice them and hit the brakes before they actually do it.
|
|
|
|
|
CodeWraith wrote: How do I notice them ahead of time?
I would suspect that you see the person looking at the lane to see if there's room before moving over.
Normal human reaction time is around a quarter of a second. I think most of the self-driving cars react quite a bit faster than that, although I don't have the numbers in front of me. Think of it this way, though: if that other car had AI, it would see you and not pull out in front of you, or would at least speed up before doing so. You're looking at it as an individual; try looking at it from a collective standpoint, and I think the advantages tip heavily toward the machine.
|
|
|
|
|
|
Exactly what I mean. It can only react to a situation, but possesses no foresight.
|
|
|
|
|
CodeWraith wrote: Really? How will they do that? How do you unit test the AI? How do you prove that your AI can deal with any circumstances a very complex world throws at it?
You must drive somewhere other than where I live - some other country, even.
There's nothing like watching, every single time there is a major storm, the videos of the many cars crashed off the side of the road because people failed at driving in it. Not to mention the multiple-car pile-ups where people were going too fast for conditions.
Then there are the accidents where someone hits the wrong pedal and ends up inside a building. Or the actual clubs whose sole purpose is to race, actually race, down normal streets late at night. Hundreds of people show up at these meet-ups.
Not to mention drunks, the high, the medicated (prescribed, by the way), those falling asleep, and a huge variety of other distractions.
I was once on the highway and looked over to see a car with no driver. It turned out the driver was completely prone, reaching for something on the passenger seat.
CodeWraith wrote: Look at how miserably we fail at testing normal code made up of simple, limited functions.
That, however, proves the very point. You are claiming that human programmers are fallible. But so are human drivers. And the code IS tested. Are you claiming that every human driver is tested as extensively? Especially on an ongoing basis?
|
|
|
|
|
No. All I am saying is that you are making a deal with the devil. The good part is that the devil likes to honor agreements to the letter, but usually in a way you are not going to like at all.
I have played enough with AI to tell you that exactly this is going to happen. It already happens in simple scenarios and complex real world scenarios just beg for this behavior. It's the very nature of any AI to explore the possibilities within the frame you have set with your directives.
I wish you good luck when someone wants to hold you accountable for the actions of your product and you have to explain everything to a judge.
|
|
|
|
|
CodeWraith wrote: No. All I am saying is that you are making a deal with the devil. The good part is that the devil likes to honor agreements to the letter, but usually in a way you are not going to like at all.
I doubt that. For example, I would expect that a self-driving car would always stop at a red light. I too always attempt to stop at red lights. Always. Very occasionally that is a bad decision, because I end up sliding through the intersection on ice. And that is something I am very ill-equipped to deal with. I suspect a self-driving car would be better able to handle it.
CodeWraith wrote: I have played enough with AI to tell you that exactly this is going to happen
You mean versus my last three cars, which were totaled by the illegal actions of other drivers? So the AI is not going to obey the traffic laws and would not be better at detecting and avoiding collisions?
CodeWraith wrote: and you have to explain everything to a judge.
Versus the multiple drivers whose cars have already unexpectedly accelerated or refused to stop?
Versus the drivers who are still driving with multiple DUI convictions? Versus the drivers whose licenses are suspended immediately by a judge and then who leave the court and get into their car and drive away?
|
|
|
|
|
I doubt that 'all' bugs will be sorted out, but that is not the point. At all. The point is that the system works better than most human drivers. Judging by the very few reports of autonomous vehicles involved in accidents, these systems have already surpassed that mark!
I'm sure that if, today, all vehicles were equipped with the latest autonomous systems, the number of accidents would be drastically reduced, and the main cause of the accidents still happening would be pedestrians, cyclists, and other road users who, for whatever reason, are not equipped with such a system and behave in erratic ways.
The only good reason against such a step would be indications that autonomous systems can cause crashes among themselves - so far I am not aware of a single incident of that kind, but of course there are too few autonomous vehicles around for that to be a useful statement at this time.
|
|
|
|
|
Stefan_Lang wrote: Judging by the very few reports of autonomous vehicles involved in accidents, these systems have already surpassed that mark!
Read up on how many vehicles are being tested; it's not that many, especially in comparison to the average number of cars on the road in the test areas. Besides, you are basing your statement on reported incidents -- I have little doubt that a lot of "minor" incidents have not been reported.
Let's put this in IT terms: testing so far has been unit testing. One report mentioned 600 cars. OK ... over what time span, and how many on the road at once? What are the recorded incident types, and how many of each occurred during testing?
So far there's been no system testing, and certainly not load testing. Before we can do these the following critical question must be answered:
How many deaths of innocent persons during testing are an acceptable number?
More deaths will occur as the number of autonomous vehicles on the road grows.
|
|
|
|
|
You're assuming that the dead woman was seen. From the initial police statement, it seems likely that she managed to hide herself in shadow prior to stepping out onto the road.
There is video of the crash, which investigators are examining but which has not been released to the public. "It’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway," Moir said. Police have previously said Herzberg was not using a crosswalk.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
It's one thing when you see other road users well ahead who, judging by their current behaviour, can be expected to act erratically. But it's simply impractical, even impossible, to give everyone a wide berth and slow down, just in case.
I am not aware of autonomous systems that in fact watch other road users and try to predict their behaviour, but that doesn't mean they couldn't learn to do just that in the future. The question is whether we want them to have that capability - they may eventually decide that it's not a good idea to give us a ride at all: "I'm sorry Dave, I'm afraid I can't do that."
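For what it's worth, the simplest form of "watching other road users and predicting their behaviour" is plain constant-velocity extrapolation: assume everyone keeps moving the way they are moving now, and check whether their extrapolated path enters your lane within the next few seconds. Here is a minimal sketch with invented numbers and a hypothetical `Track` type; real systems fuse noisy sensor tracks and use far richer motion models.

```python
# Constant-velocity behaviour prediction, heavily simplified.
# Coordinates are relative to our car: x metres ahead along the road,
# y metres to the side of our lane centre. All figures are invented.
from dataclasses import dataclass

@dataclass
class Track:
    x: float   # metres ahead of us
    y: float   # metres left (+) / right (-) of our lane centre
    vx: float  # m/s along the road, relative to us
    vy: float  # m/s lateral

def crosses_our_path(t: Track, horizon_s=3.0, lane_half_width=1.8, step=0.1):
    """Extrapolate at constant velocity; True if the track enters our
    lane corridor while still ahead of us within the time horizon."""
    for i in range(int(horizon_s / step) + 1):
        dt = i * step
        x = t.x + t.vx * dt
        y = t.y + t.vy * dt
        if x > 0 and abs(y) < lane_half_width:
            return True
    return False

# A pedestrian 20 m ahead and 4 m to the side, moving toward the road at
# 2.5 m/s while we close at 17 m/s (~38 mph): flagged as a future conflict.
print(crosses_our_path(Track(x=20, y=4.0, vx=-17.0, vy=-2.5)))   # True
# The same pedestrian standing still laterally never enters our corridor.
print(crosses_our_path(Track(x=20, y=4.0, vx=-17.0, vy=0.0)))    # False
```

Even this crude model flags the "kamikaze" only once they start moving laterally, which is exactly the limitation discussed above: it reacts to motion it can measure, not to intent.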
|
|
|
|
|
lopati: roaming wrote: If this woman "walked out in front of it so close that nothing could have prevented the collision", it seems likely she was already close to the edge of the road. Most humans would (1) gently nudge the car away from that lane/road edge before reaching her (I'm sure in AZ the lanes are wide enough), and (2) pay extra attention to watch for a change of direction.
Most? Exactly what percentage is "most"?
I know for a fact that where I live there is a law, a specific law, that says that people must move into another lane when approaching emergency vehicles on the side of the road. They even recently made a special effort to look for and ticket people that did not (so effectively like speed traps.)
So clearly what some people believe to be "most", whatever percentage that is, is not enough even in the situation where the need is most apparent - the one with the big flashing police lights.
That suggests to me that even fewer than "most" are going to do that when there is merely some obstacle at the side of the road.
I will note that I do slow down and give extra room. Whereupon other people sometimes pass me, sometimes illegally, at a speed that exceeds the speed limit.
lopati: roaming wrote: You see a drunk on the road do you pass within inches or wait till a nice big gap appears...
You think that is an argument for not having self driving cars?
|
|
|
|
|
OriginalGriff wrote: she walked out in front of it so close that nothing could have prevented the collision
Common sense and experience would probably have saved her. I don't know how you guys drive, but as soon as I see someone behaving like they might jump onto the road, I drive slower and focus so I can react as fast as possible. Same logic as with a ball bouncing onto the road: expect a kid to jump after it and you'll save a life.
And another thing: the car was obviously speeding. Brilliant technology.
Rules for the FOSW ![ ^]
if(!string.IsNullOrWhiteSpace(_signature))
{
MessageBox.Show("This is my signature: " + Environment.NewLine + _signature);
}
else
{
MessageBox.Show("404-Signature not found");
}
|
|
|
|
|
I haven't seen any video of it, so I can't comment - it's entirely possible she was engrossed in her phone and just turned sharply and walked out; I don't know.
If the car was speeding - and none of the reports I've seen have mentioned it - I'd be surprised. Some cars have come with "up to the limit" cruise control for years, so I suspect that Uber would have limited it to the posted limit. Heck, my GPS goes "bong, bong, bong" when I exceed a speed limit!
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
You're right about that; I just wanted to point it out.
Well, according to German news the car was driving ~39.5 mph on a 35 mph road; that's speeding, I guess.
And yeah, my GPS does the same.
EDIT:
BTW, found a link: Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam - The New York Times[^]
I don't know how much you like the NYTimes, but there are plenty of other papers saying the car was speeding.
|
|
|
|
|
39.5 in a 35 is within the UK "unofficial tolerance" applied by the police: posted speed + 10% + 2.
So they don't worry about 35 in a 30, 46 in a 40, 57 in a 50, ... and in a 35 zone the threshold would be 40.5.
It's to allow for inaccuracies in speedometers and/or tire wear affecting the speedo reading, I understand. I'd suspect other countries do the same thing.
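The "+10% + 2" guideline is easy to sanity-check. This is just the informal tolerance as described in this thread, not a statutory rule:

```python
# Informal UK speed-enforcement tolerance as quoted above:
# posted limit plus 10 percent plus 2 mph (a guideline, not law).

def enforcement_threshold(posted_mph: float) -> float:
    """Speed below which police reportedly don't bother: limit * 1.10 + 2."""
    return posted_mph * 1.10 + 2

for limit in (30, 35, 40, 50):
    print(f"{limit} mph limit -> tolerated up to {enforcement_threshold(limit):.1f} mph")
# 30 -> 35.0, 35 -> 40.5, 40 -> 46.0, 50 -> 57.0
```

The 35 mph case gives 40.5 mph, so the reported ~39.5 mph would indeed fall inside this tolerance.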
|
|
|
|
|
I know; we have the same "rule" here, though it's not 10% but more like 7% + 0. But would you expect an autonomous car that is designed to drive BETTER than a human driver to drive faster than officially allowed?
I think if the limit is 50 or 35, the AI car should drive 50 or 35.
|
|
|
|
|
I know what you mean, but ... that may be a problem in "real world" traffic which isn't doing that.
Just how annoyed is Joe BMW going to be when driving six inches from a self drive car with the headlights on full beam doesn't speed it up?
|
|
|
|
|