Self-driving cars

You've been warned

Tuesday, April 9, 2019 by Iben | Discussion: Everything Else

Will a self-driving car be able to see in dense fog in the daytime?
I use fog lights and slow down.
What happens when the detector can't see but the brain box thinks it still can?
Human drivers know from instinct what other human drivers might do in many
driving situations. Human drivers can't predict what self-driving cars will do,
adding another layer of uncertainty and possible chaos to a human's drive.
How many miles have the people driven who think self-driving cars are a good idea?
When given a split-second choice, will the self-driving car hit a cow or the bus?
I'll take the cow and aim for anything but dead center.
Will a self-driving car even know one is a bus and one is a cow, and take into
account the speed, direction, weight, and possible passengers of each choice?
Cruise control for the gas is all I would use and have used; I would never feel safe in a self-driving car.
They are crazy.

Jafo
Reply #21 Wednesday, July 17, 2019 10:27 AM

I've met some of the world's most capable drivers.  The holders of a 'Super Licence' - that means Formula One [in spite of what the locals might say...it IS the pinnacle of Motorsport] and yet I can say quite unequivocally that the current World Champion [and thus in theory the best driver on the planet] is an utter dickhead.  Doing burnouts in a rental car outside the circuit in a public city street?....there are eff-wit HOONS that do that.

Provided Hamilton doesn't program the self-drive cars there's a fair to middle chance they will end up superior/safer than him....

Sadly, as an FIA Observer I'm required to be impartial....but I do tend to smile a wee bit when he sticks it into a barrier...

Dcrew57
Reply #23 Saturday, August 3, 2019 9:07 AM


At least self-driving cars don't read mobile/cell phone text messages while driving....

I am in total agreement there. I commute on the interstate back and forth to work every day. At least a third of drivers are distracted with something. I have had a car totaled because of a distracted driver. She totaled two cars but got to drive hers home. That's not right.

MindlessMe
Reply #24 Saturday, August 3, 2019 8:14 PM

At the end of the day, we have to look at the many ways the car has to get information. Visually there are simple cameras, multi-spectrum cameras, OCR technologies, etc. to help "see" what is around it. With tech like that, something like fog is irrelevant. When you add an advanced AI into the mix, you have the ability to make a car far safer than if a human were behind the wheel. The problem comes when the AI is required to make a decision that has potential impacts on the vehicles around it. It's the classic trolley problem. How do you teach AI to rationalize the potential loss of life?
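The earlier question about a detector that "can't see but the brain box thinks it still can" is essentially a sensor plausibility check. Here is a minimal sketch, with invented class names and thresholds (no real AV stack works exactly like this), of cross-checking sensor modalities so a fog-blinded camera gets flagged even when its self-reported confidence stays high:

```python
# Hedged sketch: cross-checking sensor modalities so the control software
# notices when one detector is degraded (e.g. a camera blinded by fog).
# Class and field names here are illustrative, not from any real AV stack.

from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    confidence: float   # sensor's self-reported confidence, 0.0..1.0
    objects_seen: int   # detections this frame

def degraded_sensors(readings, min_confidence=0.5):
    """Flag sensors that report low confidence or disagree sharply
    with the other modalities.

    If the camera reports zero objects while radar sees several, the
    camera is likely occluded (fog, dirt) even if its own confidence
    stays high -- the "brain box thinks it still can see" failure mode.
    """
    max_seen = max(r.objects_seen for r in readings)
    flagged = []
    for r in readings:
        low_confidence = r.confidence < min_confidence
        disagrees = max_seen >= 3 and r.objects_seen == 0
        if low_confidence or disagrees:
            flagged.append(r.name)
    return flagged

readings = [
    SensorReading("camera", confidence=0.9, objects_seen=0),  # fog-blinded
    SensorReading("radar",  confidence=0.8, objects_seen=4),
    SensorReading("lidar",  confidence=0.4, objects_seen=3),  # rain-degraded
]
print(degraded_sensors(readings))  # camera disagrees; lidar is unsure of itself
```

A real system would slow down or hand back control when its primary sensor is flagged, much like a human switching on the fog lights and easing off the gas.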

Autonomous vehicles are something we won't escape, and in fact, the tech is growing rapidly. 

Franton
Reply #25 Sunday, August 4, 2019 5:11 AM

That there are a few cases of bad accidents with self-driving cars is no proof of them being unsafe. In fact, when you count the number of cases, it's more evidence that they are much safer than human drivers! And that's before figuring out whether the self-driving car was actually to blame in the reported cases - most I've read were about SDCs merely being involved, not causing the accident.
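The "count the cases" point only works if you normalize by miles driven. A back-of-envelope sketch, with entirely made-up numbers, of comparing crash rates per million miles rather than raw accident counts:

```python
# Back-of-envelope sketch of the "count the cases" argument.
# All numbers are invented for illustration; real rates vary by study
# and by how "accident" and "at fault" are defined.

human_crashes = 41           # reported crashes (hypothetical)
human_miles_millions = 10.0  # miles driven, in millions (hypothetical)

sdc_crashes = 9              # reported SDC-involved crashes (hypothetical)
sdc_miles_millions = 10.0

human_rate = human_crashes / human_miles_millions
sdc_rate = sdc_crashes / sdc_miles_millions

print(f"human: {human_rate:.1f} crashes per million miles")
print(f"SDC:   {sdc_rate:.1f} crashes per million miles")
# Nine SDC crashes sounds alarming as a raw count, yet the per-mile rate
# can still sit far below the human baseline - and "involved" is not "at fault".
```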

Mind you, I agree that there are still many situations human drivers can handle better than SDCs. Or I should say, could handle better, if they pay attention and are experienced enough to react in time and in an appropriate manner. SDCs come to the rescue when the human driver doesn't pay attention or is too inexperienced to react properly and in time.

Unfortunately, when you ask human drivers, they'll only consider how good they can be at the wheel under optimal conditions, and therefore choose not to let the car drive in their stead. They won't consider the off chance of being inattentive at the wrong time, or of facing a situation they've never experienced before. It's a psychological problem, not a technical one.

Franton
Reply #26 Sunday, August 4, 2019 5:28 AM

MindlessMe

At the end of the day, we have to look at the many ways the car has to get information. Visually there are simple cameras, multi-spectrum cameras, OCR technologies, etc. to help "see" what is around it. With tech like that, something like fog is irrelevant. When you add an advanced AI into the mix, you have the ability to make a car far safer than if a human were behind the wheel. The problem comes when the AI is required to make a decision that has potential impacts on the vehicles around it. It's the classic trolley problem. How do you teach AI to rationalize the potential loss of life?

Autonomous vehicles are something we won't escape, and in fact, the tech is growing rapidly. 
This is not a problem at all, because if you ask three humans, you'll get four opinions on how to handle such a situation! No matter how an autonomous system answers this question, the answer will always be unsatisfactory, because we humans cannot answer the question ourselves!

Anyway, it's a constructed situation that will likely never occur to anyone in our lifetime; so even if an autonomous system won't react the way we'd like it to - whatever that is - the sheer number of avoided accidents, injuries, and deaths in other situations will easily outweigh it. Besides, only an autonomous system would even be fast enough to estimate the probable outcome: a human will almost always have to make an uninformed decision within a fraction of a second. That's why nobody thinks of blaming a human for making a wrong decision. Why blame an autonomous system that makes a different decision based on facts and calculations? Why should we even want it not to?
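The outweighing argument above is an expected-value comparison. A toy sketch with invented probabilities, purely to show the arithmetic:

```python
# Toy expected-value sketch: a rare trolley-style dilemma versus the
# routine crashes an autonomous system avoids. Probabilities are invented
# purely to show the arithmetic, not estimates of real risk.

p_trolley = 1e-9       # chance per trip of a genuine sacrifice dilemma
p_fatal_error = 1e-7   # chance per trip of a fatal human-error crash
avoidance = 0.9        # fraction of those crashes the system avoids

expected_cost = p_trolley * 1.0           # worst case: the system sacrifices you
expected_benefit = p_fatal_error * avoidance

ratio = expected_benefit / expected_cost
print(f"benefit/cost ratio: {ratio:.0f}")  # benefit dominates by ~90x here
```

With these (made-up) numbers, the routine crashes avoided dwarf the constructed dilemma even before asking whether the dilemma ever actually occurs.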

Or, to put it another way: the chance of dying in such a situation because an autonomous system decided to sacrifice you is much lower than the chance of dying from pretty much anything else we decide on every day, e.g. taking a plane that might crash, or simply crossing a street at the wrong moment. Nobody will say, afterwards, that it was your fault because you made the 'wrong' decision.

You can't blame the autonomous system for making a decision, as long as it was a reasonable one at the time it was made. If everyone used one, I'd feel much safer on the streets. The off chance of my autonomous system killing me to save other people's lives is acceptable under the premise that the autonomous systems in other cars will do the same to protect me.

Franton
Reply #27 Sunday, August 4, 2019 5:48 AM

starkers

Besides, how lazy is the human race getting? I mean, why can't people learn to drive, be tested and licensed to drive properly, or is that too much effort and concentration?

Nope, self-driving cars are like sleeping with a cow pat under your pillow. You're better off not doing it.
If I had to decide between a human who doesn't understand the difference between laziness and safety, or who makes inadequate analogies, and an autonomous system that can make educated decisions before a human driver's senses even register that there is a dangerous situation, I'll take the latter.




