G.M. Unveils Its Driverless Cars, Aiming to Lead the Pack (nytimes.com)
150 points by elsewhen on Nov 29, 2017 | 224 comments



Cruise vehicles practically swarm the area where my office in South SF is. I have to say that they're very annoying to drive around, because as the article states, they do not react to surroundings in the same way a human driver probably would. I've nearly rear-ended a couple myself. While I wouldn't say their vehicles make driving more dangerous for human vehicle operators around them, I do believe that they likely increase the risk of minor collisions.


I'm going to argue that this is a feature, and you're already less likely to die on the road, thanks to those annoyances. We've known that humans tailgate like hell for decades, and haven't found a cure. Cruise vehicles have pointed out that you (like almost everyone else) tailgate to a dangerous degree, and have at least slightly changed that behavior. If that saves you from a violent high-speed collision in a year or two, you owe them your life. Having a sprinkling of more "ethical" vehicles around might be a good thing.


On one hand I totally agree with you in that it's your fault if you're tailgating and can't safely stop, regardless of what the vehicle in front of you is doing.

However, I totally sympathize with him, living in Pittsburgh and driving in proximity to Uber's and a few other companies' cars. There have been times when I've been following an autonomous car and it makes a turn onto a completely clear road, then suddenly slams on the brakes for no observable reason. Yes, I should be ready to handle this situation, but it's certainly not expected.

Knowing that Uber safety drivers intervene an average of about once a mile certainly makes me wary of these cars and not enjoy driving around them.


Interesting, wonder if it will become necessary to build automated car coordination networks past a certain density. Systems like these could get into funny feedback loops where one car stops dead, causing the next one to stop dead, take off, rinse and repeat. Or, my favorite part of living near Boston, the traffic circles; I can't wait for the day a convoy of automated cars deadlocks a rotary :-)


Feedback loops are how we get traffic jams right now. A well-built synchronized system should be less susceptible to them than the current, unsynchronized one.


I don’t think so. Humans employ more advanced techniques when they’re driving, and as autonomous systems improve they should be able to do these “risky” moves too. Right now the primary concern is safety, so V1 will be overly cautious to the point of causing problems. Things will start to get interesting when the system is hyper-accurate and can route cars around at high speeds within seconds of hitting each other for maximum efficiency.

Humans employ these driving techniques not because they’re stupid, but because they’re confident that they can pull off these moves safely (even though that is sometimes not true). In the future I’d expect the software to also advance to a state where it can confidently speed and pull off risky moves.


I think you've missed my point, which was narrow - the stated "problem" is a much better problem to have than a fiery collision.

Of course, there are other problems (bugs really) still extant that don't have an upside I can see - such as remaining stationary and letting another vehicle back into you from 20 ft away, out of "caution"; which I think is a rough description of the Las Vegas crash.

I don't doubt that AI that isn't ethical (say, causes others to crash sometimes, but saves you ten seconds) could be created and might appeal to many, but I don't look forward to this development.


> We've known that humans tailgate like hell for decades, and haven't found a cure.

Haven't found a solution, or have found one and haven't implemented it? For example, I've been recently pondering how to resolve tailgating. How about a system like this:

Vehicles will be mandated to have distance sensors in both front and rear (some already have this for the sake of back-up cameras and auto-braking), essentially a car sonar system. If your car is moving, it detects the distance between you and those in front and in back. If you are traveling too closely to the car in front of you (tailgating), your vehicle will be mandated to emit a warning buzz sound in the cabin, which intensifies in strength the smaller the gap. For someone tailgating you, perhaps a visual indication behind.

This way the closer one tailgates, the more audible (and possibly visual) feedback a driver gets. Hopefully the audio is annoying enough that most sensible tailgaters (oxymoron?) would back off.

Strengthen this with a visual indicator on the roof of the vehicle, perhaps, so that if someone is tailgating, a light on the roof turns on indicating such. That would make it trivial for police officers and traffic cams to notice. I think part of bad driving behavior comes from the anonymity cars today offer: tinted glass and the feeling that you're in a safe, sound-proof cocoon. This visual indicator may help shatter that in a good way.
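A minimal sketch of the escalating-feedback logic (the two-second threshold, function name, and outputs are my own illustrative choices, not part of the proposal above):

    def gap_severity(gap_m: float, speed_mps: float, min_gap_s: float = 2.0) -> float:
        """Return 0.0 (safe) .. 1.0 (dangerously close) for a measured gap.

        Applied to the front sensor, it could drive the in-cabin buzzer volume;
        applied to the rear sensor, it could drive the roof indicator light.
        """
        if speed_mps <= 0:
            return 0.0
        time_gap_s = gap_m / speed_mps
        if time_gap_s >= min_gap_s:
            return 0.0
        return 1.0 - time_gap_s / min_gap_s


    # Example: 15 m behind the car ahead at 30 m/s (~108 km/h) is a 0.5 s gap.
    print(gap_severity(gap_m=15.0, speed_mps=30.0))  # 0.75 -> loud buzz / roof light on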


I agree, we are at the point where this should be done. To update an old saying: "A stricture by a nanny-state in time saves nine."


I find turning on my Antilag is a good solution to tail-gaters. Nothing like shooting fire at the car behind under deceleration to give you a bit of room.


Let's just make it legal to equip your car with a flamethrower on the back as long as the flames don't go out further than 20 feet while your car is stationary, and it only activates when an inbuilt radar detects a tailgater behind you.


There is a general rule here that I forgot to state: we should probably be building scare-factor variability into automatic safety systems of all kinds, to ward off risk homeostasis, so people don't just press their luck until the safety cushion is gone.


Agree. I’m looking forward to being able to walk / bike without taking on other drivers’ externalities in the form of risk.


Ah, optimizing for maximal sanctimony. Interesting concept.


No, it's all of us, nobody's a saint, we all need reminders.


If you rear-end a vehicle that's established in your lane, it's always your own fault, no exceptions. It's your job to follow at a safe stopping distance. The other cars can be blamed for disrupting traffic, sure, but you as a driver are responsible for always following at a safe distance and assuming every other car could slam on their brakes at any time.


> every other car could slam on their brakes at any time

Or veer out of the way of an obstacle you can't see, leaving you to either similarly veer or collide with it. IOW, do not assume you only need to account for your own reaction time; rather assume the vehicle in front of you can instantly stop moving and you must account for both your reaction time and your vehicle's stopping distance.


Your safety advice is completely unreasonable to follow on highways. When you give that sorta advice, you make safety worse not better.

It takes about 100 yards -- a full football field -- for a normal passenger car to react and come to a complete stop.

https://www.qld.gov.au/transport/safety/road-safety/driving-...

This is equivalent to 4 full seconds of following distance, which is double the typical 2-second-rule recommended by transportation governing bodies.

https://en.wikipedia.org/wiki/Two-second_rule

If your suggestion was taken seriously, it would literally halve the carrying capacity of our highways.

Luckily, situations where the vehicle in front of you comes to an instantaneous stop are extraordinarily rare, basically requiring a head-on collision.
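For what it's worth, the ~100-yard figure is easy to reproduce with the usual reaction-distance-plus-braking-distance arithmetic; the reaction time and deceleration below are common textbook assumptions, not numbers taken from the linked page:

    def stopping_distance_m(speed_kmh: float,
                            reaction_s: float = 1.5,
                            decel_mps2: float = 7.0) -> float:
        """Reaction distance plus braking distance on dry pavement."""
        v = speed_kmh / 3.6  # convert to m/s
        return v * reaction_s + v ** 2 / (2 * decel_mps2)


    d = stopping_distance_m(100)  # 100 km/h, roughly highway speed
    print(f"{d:.0f} m (~{d * 1.094:.0f} yd), ~{d / (100 / 3.6):.1f} s of travel")
    # -> roughly 97 m (~106 yd), about 3.5 s: close to the football-field figure,
    #    and well above the 2-second rule.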


From your own link:

> The two-second rule tells a defensive driver the minimum distance needed to reduce the risk of collision under ideal driving conditions. The allotted two-seconds is a safety buffer, to allow the following driver time to respond. [emphasis mine]

> Some authorities regard two seconds as inadequate, and recommend a three-second rule. […] The United States National Safety Council suggests that a three-second rule—with increases of one second per factor of driving difficulty—is more appropriate. Factors that make driving more difficult include poor lighting conditions (dawn and dusk are the most common); inclement weather (ice, rain, snow, fog, etc.), adverse traffic mix (heavy vehicles, slow vehicles, impaired drivers, pedestrians, bicyclists, etc.), and personal condition (fatigue, sleepiness, drug-related loss of response time, distracting thoughts, etc.). For example, a fatigued driver piloting a car in rainy weather at dusk would do well to observe a six-second following distance, rather than the basic three-second gap.

I've had to veer to avoid accidents. At least three situations I can remember on the interstate: a semi truck tire blowout in front of me leaving large parts of the tire in my lane; a bike rack with several bikes falling off the back of a car in front of me; a truck in front of me veering around an obstacle (it appeared to be a pickup truck bed liner) I was blind to because I couldn't see beyond the truck in front of me.

I was lucky to avoid all of these because I had a parallel lane or shoulder I could escape into. When I drive now, if it's congested I leave extra following distance. If that reduces the road capacity, so be it. Compared to drivers who follow much too closely (which itself causes sudden slowdowns since they tend to over-brake) or who drive in the wrong lane for their speed, I don't think I'm the person you need to worry about.

FWIW, I haven't been in an accident in over 25 years.


Nowhere in my link does anyone recommend a general 4-second rule, and this doesn't even account for the fact that we should expect safety organizations to err on the side of too much safety. Note that your recommendation is specifically not a tailored approach designed to adjust to risky situations, but rather a fixed criterion.

Your strategy to take all possible precautions regardless of large efficiency loss doesn't align with the reasonable tradeoffs that the vast majority of humans make.


I disagree. On a congested road, If you can't see two cars ahead of you and react to the actions of that second car, you're not driving safely.

On a non-congested road, there's no excuse for not leaving sufficient distance between you and the car in front.


Your recommendation on non-congested roads is vague and I don't think your recommendation for congested roads conflicts with my critique of js2. Certainly, it's often impossible to see 200 yards ahead on various stretches of the highway, so any reasonable driver will be violating js2's advice if they follow yours.


If you think putting sufficient distance between you and the car in front is vague, you are a poorly educated driver and most likely a dangerous one.

If you can't see 200 yards ahead, you should not be travelling at a speed which precludes you from stopping within 200 yards.

On a single-lane uncongested road, there's no excuse for not leaving a reasonable (at least 3-second) gap between you and the vehicle in front.


> If you think putting sufficient distance between you and the car in front is vague, you are a poorly educated driver and most likely a dangerous one.

Huh? The whole context of this discussion starting from js2's comment is about exactly how far it should be, and in particular whether 2 seconds or 4 seconds (which at highway speeds is 50 yards vs. 100 yards) is more reasonable as a rule of thumb. As you can see from my link, most government safety organizations recommend 2 or 3 seconds in good weather.


A well known insurance scam is changing lanes immediately in front of another car then coming to a sudden stop, forcing a rear end collision. It's hard to prove unless you have a dash cam and even then you might be found at fault if the other party is convincing enough to the officer who investigates the accident.


> A well known insurance scam is changing lanes immediately in front of another car then coming to a sudden stop, forcing a rear end collision.

It's an even more brilliant way to defend barely-working robot cars. Program them to err toward the letter of the law rather than road safety, then blame other drivers when they slam on their brakes because they mistake plastic bags for pedestrians.


> If you rear-end a vehicle that's established in your lane, it's always your own fault, no exceptions.

Of course there are exceptions. There are any number of well-established circumstances that, if proven, place responsibility on the driver in front.

It is not legal to suddenly panic-brake to a stop, for instance, without exigent circumstances which make it necessary to do so. If a driver causes a collision by doing so they bear liability. There are any number of insurance scams that are thwarted by such exceptions.


Citation? I haven't been able to find any instance where this is true except where the driver was found to have braked suddenly with the intent to cause an accident, which would clearly not have been the case with a self-driving car.


Are you saying brake-checking a tailgater is not illegal? It is the person's fault if they rear-end a vehicle, but it's also the first driver's responsibility to drive responsibly (i.e. not braking or stopping for things that are imaginary).


Similar to how being an asshole and flipping someone off while driving is not illegal, or how being an asshole and tailgating someone is not illegal, brake checking a tailgater certainly makes you an asshole, but it is not illegal.

Social responsibility be damned.


It is illegal in many states under 'aggressive driving' or 'road rage'.


Feel free to feel smug while dying in a ditch by the side of a road. Or, you know, maintain a reasonable following distance so sudden stops are a non-issue...

Really, when it's ego vs. several tons at highway speeds, it's not ego that wins. Further, 'I thought I saw something' is a blanket exception for sudden braking.


I'm just saying self-driving cars should be required to drive within a reasonable expectation of a normal, safe, predictable manner. If they are not there yet, maybe they shouldn't be on the roads yet.


They are. If this was about sudden evasive maneuvers and lane changes then sure, but simply braking more frequently is a non-issue, because again any car can brake at any time and you need to deal with such reasonable and normal behavior.


Based on the number of comments from people that live in these towns, I do not believe this is simply more frequent braking. It sounds like it happens a lot and unnecessarily, which adds risk and frustration for humans also using the road.


People are more likely to notice and remember self-driving cars, so you can't assume they are accurately reporting frequency rather than just remembering it more often, or drawing from a biased sample.


...someone brake checking is the one running on ego. And they're the one likely to get hit despite being "technically correct".


If someone is dangerously tailgating me, I will very lightly brake-check them (barely slowing down) because the vast majority of tailgaters seem to be mediocre drivers who aren't thinking deeply about their actions. If they persist, I'll slow down and let them overtake.

When I'm driving, I'm constantly scanning for bad (i.e. unsafe or unpredictable) drivers in all directions. Once spotted, the choice is either to be well past them or well behind them. Usually the choice is easy.


Tailgating is a sign of gross incompetence. I never said you should brake-check, but if you're going to have an accident it's better to be rear-ended than to hit something and then be rear-ended with a much larger speed differential.

PS: Tailgaters are creating a highly dangerous situation, so slowing down without jamming the brakes is the correct response.


I've been rear-ended a couple of times. In all cases, I talked to their insurance and said, "They had the responsibility to maintain a safe driving distance, it doesn't matter what I did, they can't rear end me," and that has been an argument that has always completely worked: they've always taken full responsibility for the accident.

(To be clear, I wasn't intentionally provoking anything: in one case I hit my brakes because the car in front of me panic-stopped, in another I was in a right-turn-yield-into-another lane situation and I checked my blindspot and found that there was a car who had right-of-way, so I hit my brakes to keep from encroaching on them.)


My car got rear ended in the snow. It was parked outside my house and I was in bed at the time.

It was a good couple of months before they took responsibility for the accident.


I believe "right-turn-yield-into-another-lane" is forbidden in many places for this reason, although I might have misunderstood you.


I think you probably did. I was making a right turn at a controlled intersection. Specifically, the dedicated right turn lane was controlled by a yield sign. I had slowed and was preparing to merge into the other lane, checked my blind spot over my shoulder, and found that there was a car in my blind spot, so I braked to let it pass me (since the yield sign gave it right of way). The car behind me failed to react to my braking and rear-ended me.


> It is the person's fault if they rear-end a vehicle, but it's also the first driver's responsibility to drive responsibly (i.e. not braking or stopping for things that are imaginary)

Pitch this line to your insurance company if you ever end up causing a rear ending. It'll be entertaining for the rest of us at least.

Doesn't matter what the car in front of you is up to or not. If you do not leave enough space to be able to handle them stopping, you are at fault. You should be utilizing defensive driving, and not depending on the driver in front of you ever acting responsibly or rationally.


You will be liable but the first driver has a responsibility to drive responsibly. Stopping for no reason and causing an accident should be considered aggressive driving and in many states is illegal.


How do you propose I tell the difference between something I imagine seeing and something that is really there? In both cases there's an observable phenomenon, and it is only through collision with the imagined object that I can tell the difference.

If I think I see a child in the road I'm sure as hell not going to stop and wonder if I'm just imagining it before I brake.


He didn't cause an accident, though. If you crashed into him, you were too close and caused the collision.

It's not about intent, it's about behavior.


Your insurance carrier will not care what the other driver should have been doing, if you rear end someone you are at FAULT as you failed to keep a safe distance from the car in front of you. It doesn't matter if they slow or stop for any reason ahead of you, if you run into them it's your fault.

> "For purposes of insurance and policing, the driver of the car that rear-ends the other car is almost always considered to be at fault due to not leaving enough stopping distance or lack of attention." [1]

I'm glad we have drivers on the road that hope for the best in other drivers, but you need to sink this in your head today:

You should treat everyone on the road as an accident waiting to happen, and should practice defensive driving at all times in response.

[1] https://en.wikipedia.org/wiki/Rear-end_collision


I might need to slam on the brakes at any moment because a child or deer or something ran onto the road. You always have the obligation to maintain a safe stopping distance. No exceptions.


Stopping for something imaginary and brake-checking a tailgater are not the same thing. Another person's irresponsibility does not justify your own. Your responsibility does not change based on your value judgement of another person's actions.


Braking with the intention to cause an accident is illegal in some places. I can't find any law making it illegal to suddenly brake because you mistakenly believe that it is necessary to do so.


> I've nearly rear-ended a couple myself.

Then clearly you need to get a car with better proximity detection systems. That's easily the simplest collision to detect and avoid for any automated system.

I know that sounds glib, but you need to recognize that if your ability to drive safely is so dependent on the "expected" behavior of other drivers, that maybe driving the way you are (which is to say, in the state of being a human) is the wrong way to solve the problem.

> I do believe that they likely increase the risk of minor collisions.

Pretty sure to the extent that this has been studied, exactly the opposite is true. For every "annoyance glitch" you posit, these things are saving 6-7 "loss of attention" fender benders.


You can't just tell every person to "get a car with better proximity detection systems". The product should fit as seamlessly into the world as possible, without forcing less wealthy people to deal with a higher chance of collisions.


> You can't just tell every person to "get a car with better proximity detection systems".

Then it's a good thing I didn't.

I was replying (with a bit of humor) to the upthread poster's notion that because these cars behave in a way that s/he personally finds confusing, they are more dangerous (they aren't) and by extension shouldn't be on the road. And that's ridiculous. And the proof to the contrary is that exactly the kind of unsafe situation the poster was complaining about is something that automated systems can almost completely eliminate.


Abnormal behavior on the road increases risk by creating unknowns. I don't think we should ban autonomous cars from new behavior, but I think it's just as odd to say that unpredictable but legal behavior is somehow just as safe as predictable legal behavior.

Legal != Safe.


But in this case we know that there are cases where a reasonable driver will be forced to do the behavior he is calling unexpected. Thus even though it is unexpected, it is something that should be accounted for. If you cannot react in time then you need to adjust your behavior. I'm always amazed at how few people follow the 2-second following distance experts tell me is the minimum humans need for reaction time (which works out to about 0.2 seconds to realize there is a problem and 1.5 seconds to physically move your foot to the brake).


Not even just less wealthy people, but anyone who tends to be financially conservative. I don't think there's an economically sensible option to purchase a car like this right now.


https://goo.gl/KYATXX

That's a $22 backup camera.

I haven't seen it yet, but I don't know why somebody couldn't develop a similar kit to put on the front bumper that could warn about traffic ahead, lane drifting, etc.

A <$100 kit that can retrofit my old beater with some modern features would be money very well spent to protect a multi-thousand-dollar asset and guard against personal injury and liability.

I believe that all manufacturers are required to build in backup cameras now. Why not require all registered vehicles to be retrofitted?

Next up, auto-braking.


Damn dynamic pricing... now it's up to $30. :-)


Not if the system has so many bad drivers, and it does. We don't want automated cars driving badly too, just because some people are impatient or are just no good at driving.


I might do that when I can buy a reliable one used for less than $5000. Personally I don't have enough money where financing vehicles is a negligible expense. Did it once...not planning on doing it again.


>> I've nearly rear-ended a couple myself.

>Then clearly you need to get a car with better proximity detection systems.

As it was nearly a rear-end collision but not actually a rear-end collision, doesn't that mean that sithadmin's current collision detection system (whatever that may be) is adequate and functioning properly?


Yep, I called it. Humans will be the bugs in your code. https://news.ycombinator.com/item?id=14652382


Imagine how annoying cars would have been to horse riders when they were first introduced. Startling the horses, likely increasing the risk of minor collisions... What a pain!


Except that we have the value of hindsight and presumably an advanced understanding of risk.

Back then it might have been perfectly acceptable to throw a bunch of horses and cars together and risk them crashing into each other.

Today it should be unacceptable to throw automated cars in with human drivers and put the onus on the humans to alter their innate understanding of 100 years of traffic flow.

Your comment is cheeky and annoyingly inapplicable.


Automation will take over very quickly, humans and machines won't share for long.

There's a tipping point where only people who want to drive will still do so which will likely make the insurance costs for those drivers very expensive because they're driving among automated vehicles which increases consequences of bad driving. (Fully automated, machines can talk to each other to avoid collisions).

This increased cost of insurance will encourage more people to switch to automated which will further encourage switching.

That tipping point is probably a relatively small percentage before it starts snowballing. All it takes is to get to a point where automated drivers are safer enough than driver-based that they can be insured significantly cheaper.

It won't take long after 95%+ of people are automated with the remaining 5% causing 90%+ of accidents for manual driving to be banned or shunned as a socially unacceptable risk like smoking.

I predict this will happen within 30 years.


>costs for those drivers very expensive because they're driving among automated vehicles which increases consequences of bad driving.

Or very cheap because the people who self select to drive will be the people who care enough about driving to not do things like text and drive and when they do slip up the machines around them will prevent it from causing problems.

>shunned as a socially unacceptable risk like smoking.

Among the middle and upper class in Bubble Valley maybe. Everywhere else it's more of a "you do you" thing.


Completely agree.


> put the onus on the humans to alter their innate understanding of 100 years of traffic flow

That is wildly overstating the complexity here. A teenager can learn to drive and pass their test with zero prior experience and maybe 10 hours of practice. "Altering" that intuition to account for the quite a bit simpler behavior of automated cars isn't going to be any harder.

This is just ludditism. The simple truth is that automated cars drive more safely, with fewer mistakes and much less velocity changing (i.e. lane switches, passing attempts, "oops" late braking, missed exit swerves...). They are easier to predict and understand, not harder.


> quite a bit simpler behavior of automated cars

Simple behavior is often more predictable, but not always. And it's predictability that matters.

> The simple truth is that automated cars drive more safely, with fewer mistakes and much less velocity changing (i.e. lane switches, passing attempts, "oops" late braking, missed exit swerves...).

> "oops" late braking

I'm not 100% sure what you mean here, so I'll split apart "oops" and "late".

A self-driving car might be less likely to brake late.

But everything I've heard says that a self-driving car is far more likely to "oops" in the form of sudden hard braking. Are you disagreeing? Can you back it up?


> presumably an advanced understanding of risk....Back then it might have been perfectly acceptable to throw a bunch of horses and cars together and risk them crashing into each other.

> Today it should be unacceptable to throw automated cars in with human drivers and put the onus on the humans to alter their innate understanding of 100 years of traffic flow.

No one who drove a car in 1917 is still driving today, and horse behavior had far more than 100 years to become innately understood.

But I think that it's wrong to assume that our modern attitude towards risk is 'advanced'. It's different, for sure. Thousands of people were likely killed or suffered as a result of automobiles mixing with horses. Today, 1.25 million people die annually as a result of automobile accidents - that's more than one person every 30 seconds. And many people also died as a result of taking risks while developing air travel, which is now one of the safest ways to travel. An attitude towards risk which suggests that equivalent risks are unacceptable would not have resulted in the billions of lives that have been improved by and saved by advancements in transportation. The modern economy is completely dependent on automotive transportation, and the world is unimaginably improved because we have cars instead of horses.

If equivalent progress would be made 100 years from now at the cost of thousands of lives lost due to adding automated cars to our existing roads and traffic flows, how could you possibly argue that we should not take this risk?


If you almost rear-end the autonomous car, it's because you are a bad driver. There is no "innate understanding of 100 years of traffic flow", there is just a socially accepted carelessness and recklessness with which people drive that is killing thousands every year, many of them kids. It's pretty much the leading cause of death for teenagers.


I doubt he meant it literally, probably more like he had to slow uncomfortably fast because the self driving car behaved in a somewhat jerky manner. Think about what you might do if you had to slow down in this situation -- you'd try to balance leaving room ahead of you as you did it with not stopping so quickly as to startle the driver behind you. There's a lot of nuance to that, and it wouldn't surprise me if computers weren't very good at it yet.


Britain did at one time have "red flag laws" that mandated that a person precede the car waving a flag to alert people (and horses).

https://www.autoevolution.com/news/road-traffic-history-befo...


Nice QI clip about this too. https://www.youtube.com/watch?v=--8bqcCJ0L4


Agreed. There were 391,000 distracted driving injuries in the US in 2015 [1]. Experiencing some fender benders is worth it, if we can decimate the number of injuries. Let's make this transition asap.

[1] https://www.cdc.gov/motorvehiclesafety/distracted_driving/in...


That's actually a legit criticism of putting the first cars on a road, and a valid reason to impose some kind of rules on cars to mitigate "spooking" behavior induced in horses, although obviously not as strong as the so-called "red flag" laws.


Seems like it will be trivial to DDOS an autonomous car by just putting a garbage can or two in front of it and behind it. If the vehicle is empty nobody will be able to get out and move or bump them.


Yes, let's impede the traffic flow of the vehicle that sees using cameras that are probably recording all the time. That's a smart plan right there.

The car will escalate to an operator, the operator will call the police.


Pre-programmed spy cars. Ya, that's going to go over great.


Cameras mounted in cars because the populace can't be trusted to respect the vehicle or its operator is already standard in some countries.

If people in the US want to go down that road, it can become standard here too.


Yes yes, the plebs better respect their robot masters.


Not standing in someone's way is them being a master over you?

Uhh, remind me not to be on a sidewalk near you...


By all means, keep pretending robots are people and that I was talking about people.


That's an even weirder position! I would think a robot is less likely to be a master over you. But you're saying that getting out of the way of a person is fine and doesn't make them a master, while getting out of the way of a robot does make them a master?


I'm obviously not talking about robots containing humans.

We should not pretend robots are people under any circumstances. If the robot needs me to move for it to continue on, then it better decide to wait.

Maybe you think it should call in a robo cop;)

This should be obvious, but I'll be more verbose, my original comment was in ref to robots with wheels, possessing possibly >MJ of KE, recording their surroundings and uploading it in real time to whatever the skynet equiv is at the time. It's an awful idea, and you are living in a bubble if you think the general population will accept it in the US. Sure, it will happen in more tightly controlled societies.


People don't collectively object to dash cams, and they let non-robot cars do something very similar.

And there's no need for it to be constantly uploading video to report a lane blockage. Heck, it doesn't need to upload any video to report a lane blockage.

As far as you being in the way: roads are not specifically for humans. A robo car has plenty of right to be there. If you wouldn't stand in the way of a taxi, you shouldn't stand in the way of a robo car.


"roads are not specifically for humans. A robo car has plenty of right to be there"

And there it is. Robots having rights too. As if they are sentient. Next, those that don't anthropomorphize machines will be labeled roboist.

Machines are tools. _For_ humans. Nothing more. It's like saying a hammer has rights because I use it.

Making it an autonomous hammer does not change that; in fact it makes it much more important to not pretend it's sentient.


Oh my god you're being so pedantic.

The robo car has no rights, but the person that owns the robo car has roughly the same right to make use of the road whether or not they happen to be seated inside.

Is that wording satisfactory?

Edit: And I mean, it's not because I'm giving robots special treatment. It's colloquial english. I'd say a potted plant has "the right" to be somewhere too.


You are mixing two fundamentally different things. I made it clear I am not referring to robots containing humans. A person wielding a hammer is fundamentally different than a pre-programmed hammer some human unleashed on its own. I consider the latter a serious threat.

If you don't mean "right" then don't use that word. As is, you have redefined it to be practically meaningless.

I'm not convinced you did re-define it. Your edit makes it more clear:

"it's not because I'm giving robots special treatment"

Yes, you really are. You are arguing that since it's owned by someone it has "plenty of right" to use the road. That's wrong. Par with humans is as special as treatment gets.

  F(r) = -F(h)
  ma(r) = -ma(h)
  a(r) = -a(h)
Which a is more important? Should we treat them even remotely the same when one a ends a life and the other a causes a financial loss?

I mean... oops. Sorry for the hammer malfunction. We sent out a patch and fired someone though.

Admit it, the end solution is to just remove humans from having direct control over their KE and direction? Right? Programmers are always smarter aren't we?

I often wonder if the pre-programmed car proponents realize what side they are on in the war on general purpose computing.


> If you don't mean "right" then don't use that word.

If you so desire, I won't use it, sure.

> A person wielding a hammer is fundamentally different than a pre-programmed hammer some human unleashed on its own. I consider the latter a serious threat.

> Which a is more important? Should we treat them even remotely the same when one a ends a life and the other a causes a financial loss?

It's more complicated than that. You can drive a car remotely, and you can have a robocar with a human in it.

A car with a human in it deserves more protection, but there's no reason it has to have priority toward getting to use lanes. A car en route to pick up a human might as well have the same priority as one delivering a human.

Whether a car is being controlled by computer or human being shouldn't matter at all. Whether it's carrying a human should matter in some ways but not in others. One case where it shouldn't matter is blocking it; neither should be blocked.


"Whether a car is being controlled by computer or human being shouldn't matter at all."

There is human-equivalent software? That's astonishing. Where can I find it? Does it run on ARM?

Edit: Bummer. I called Denso, they were adamant that their human-level code v3.11 is a trade secret.

"A car with a human in it deserves more protection"

We agree. _please_ explain how to accomplish that.

Remember, KE is relative, therefore velocity and mass limits for the pre-programmed machine are not relevant.

Clearly my empty pre-programmed human carrying drone should enjoy the same lanes as a loaded 777. Lets set aside the distraction about remote control which totally ignores the "skin in the physics game" which transportation of life requires.


> There is human-equivalent software?

Did I imply that? I don't see how that's relevant.

> We agree. _please_ explain how to accomplish that.

If you agree with me then why do I have to explain anything?

You're being confusing with all your talk of kinetic energy. Are you trying to imply that robocars inherently make the roads more dangerous? I don't think that's true at all. As an extreme example, even with today's tech you could flood the roads with 20mph robocars and make things safer overall.

> Clearly my empty pre-programmed human carrying drone should enjoy the same lanes as a loaded 777. Lets set aside the distraction about remote control which totally ignores the "skin in the physics game" which transportation of life requires.

If you want to pay the same airport fees, go for it. Seems fair to me.

If you're worried about congestion and road funding when it comes to cars then limit priority to one robocar per person. But when I have exactly one car, my ability to use the roads to drive it to the store shouldn't depend on whether my butt is inside of it.

Why would "skin in the game" be necessary? A taxi with a driver inside can act exactly the same as a taxi without a driver inside.


"Whether a car is being controlled by computer or human being shouldn't matter at all."

"A taxi with a driver inside can act exactly the same as a taxi without a driver inside."

Again, where is this software that enables the pre-programmed taxi to "act exactly the same"?

I assume you are making an actual argument, not "it can drive in a line and stop at a stop sign".


I believe a vehicle akin to the one you're describing ("act exactly the same" is a bit unnecessarily specific, but "drive safely on a road under its own direction from a point A to a point B chosen by a human") is definitely what Waymo is prototyping, and such vehicles have already driven on public roads (with human occupants, but without human occupants operating the control surfaces).


There will never be conventional digital pre-programmed cars that "act exactly the same" or even remotely as "same" as a human driver. What is going to happen is that, for a bit, the control writers will blame humans for their failure, followed shortly by the population's understanding and an eventual ban in many places on this unworkable idea.

Tightly restricted societies that put "safety" over "freedom" will just end up banning non-elite direct human control of KE>nJ. The repercussions of that mistake have ugly ends.


Is my argument invalid if that software doesn't exist in 2017?

And you could remote control a car today if you wanted to. Nobody inside in that case.


I'm not saying it's a good responsible thing to do, I'm just saying it's possible and a new type of vandalism that hasn't existed before. Your solution doesn't sound like it will scale well. If the police need to get involved for this every time, they're going to be very busy.


> If the police need to get involved for this every time, they're going to be very busy.

Well, they'll need something to do when speed traps become useless and there are far fewer motor vehicle accidents.


I've often thought this - but more in the context of pedestrians in crowded thoroughfares ignoring crosswalks. Sure, the police could be automatically summoned to let this car go through the mass of hundreds of people, but one block over... (Obviously, pedestrian thoroughfares separate from autos would help - but in effect, thats a subway system.)


>>I've nearly rear-ended a couple myself

How is this possible? I think you may be driving too close to cars.


:-) You haven't driven around SF much perhaps? But notice he said "almost" not "did" so clearly he is leaving enough space to avoid collisions.

The interesting thing for me at various demos has been the car/pedestrian relationship thing. Many humans won't slow for pedestrians unless they are clearly coming off the sidewalk and into the traffic lane, and even then some of them won't slow if they feel like they will be past them before the pedestrian would intersect with their car. Robo cars on the other hand immediately slow down when their algorithms suggest that the human might be about to cross the street.

Because of this, another human driver won't expect the car ahead to slow suddenly because of the loitering pedestrian (expected behavior of another human driver) and so will be "surprised" by the braking of the car in front of them. There will be a moment of cognitive dissonance while the human is trying to normalize their perception that slowing down is not 'normal' and the fact that the car ahead is now rapidly approaching their front bumper. At which point the impending collision overrides the dissonance with a self preservation reaction to brake hard and stop.


>>:-) You haven't driven around SF much perhaps? But notice he said "almost" not "did" so clearly he is leaving enough space to avoid collisions.

Have driven plenty in big cities and small towns. You should always leave enough space in front of you in case somebody suddenly stops. How much space you leave in front of you depends on how fast you are going. At least that was drilled into me by more experienced drivers. I think it was even part of the DMV exam. I have not driven in San Francisco. How bad is it? Do they drive bumper to bumper?


I have driven in San Francisco. As a rural Midwesterner accustomed to people leaving space between cars and slowing to allow people to merge onto the freeway, SF drivers are among the rudest I've had the misfortune to mingle with, and I've driven everywhere from my home in Michigan to Canada, the East coast, the deep South, the mountain states (one of the nicest places to drive, in my experience), Mexico, China, and California. Only in the outskirts of Shanghai have I experienced drivers that were less cautious and less respectful of my car's right to the road than those in San Francisco.

Fortunately, when I was in San Francisco, I was on vacation and therefore content to simply get to my destination a little later, taking back roads and continuously rebuilding the space in front of me, avoiding rush hour like the plague - but wow, those automated cars are getting a stress test. I just hope they're not programmed to drive like the Californians building them!


Come to Los Angeles / Orange County, where a blinker is a signal for the driver behind you in the lane you are merging into to floor it. Coupled with 12-16 lane freeways this makes for good times.


Yep. Leaving adequate stopping distance is laughable when driving in LA. If you leave more than a few body lengths, cars from adjacent lanes will just swerve in front to gain a bit of ground until the gaps are gone. Maybe all self-driving cars should be required to display "student driver" signs for the next couple of decades.


New Orleans and South Florida have much more aggressive driving cultures. The Bay Area is also kind of average in comparison to much of the northeast.


> How much space you leave in front of you depends on how fast you are going.

Right?

In city traffic of any appreciable density the type of following distance that you can discuss on the internet without getting flak from the holier than thou types will quickly fill itself with cars pulling off of side streets or merging in front of you.

"Your safe following distance is my opportunity." -Jeff Bezos


I'll be honest - I don't respect people who tailgate or follow too closely. I think they're selfish and self-conceited.

People who think: I'm more important than you, I don't care if I kill you, as long as you don't inconvenience me.

The law (and insurance companies) are pretty clear on this (at least in Australia) - if you rear end somebody, it's nearly always your fault, for being silly enough and not leaving enough gap.

They have educational campaigns on leaving enough gap (it's meant to be 3 seconds). Things happen, and people have to emergency stop.

It's taught to every driver.


The problem is that a 3 second gap is large enough for someone else to put their car there, so that's what happens.


And no matter how many times you build that 3 second gap, someone will keep taking the space. What ends up happening is that your 3 second gap means people will try to go around you (if this is the freeway) and then get right in front of you and eliminate your 3 second gap.

I think eventually you'll get into a steady state solution where you'll be limited to 20mph on a 65mph freeway because that's all the room you'll be given - if you follow the rule of law exactly.

Instead, most bay area drivers keep about 40-70ft between cars when going 70mph. Juuuust enough to react in time to an emergency stop. It's a little silly but it's the only way you won't get cut off constantly. Only other way to not get cut off is to go quite slow and be behind traffic speeds.


My car has adaptive cruise control. It automatically paces the car ahead with about a 2-second gap. It is absolutely wonderful for reducing stress and workload in heavy traffic. But the other cars... Someone will see the gap and race around me to fill it, as if I wasn't driving exactly the same speed as the car ahead. Causing my car to slow a bit to rebuild the gap. The next car back sees that and decides the first car passed because I'm an obstacle, and copies them. And so on. Until I'm driving 5mph slower than traffic, with a constant stream flowing around me.
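The constant gap-keeping described here is essentially a constant-time-headway controller; a rough sketch of the idea (the gains and limits are made-up illustrative values, not any manufacturer's tuning):

    def acc_acceleration(own_speed: float, lead_speed: float, gap_m: float,
                         headway_s: float = 2.0,
                         k_gap: float = 0.2, k_speed: float = 0.5,
                         max_accel: float = 1.5, max_brake: float = -3.0) -> float:
        """Constant-time-headway cruise control: command an acceleration (m/s^2)
        that nudges the gap toward `headway_s` seconds behind the lead car."""
        desired_gap = headway_s * own_speed
        gap_error = gap_m - desired_gap          # negative when someone cuts in close
        speed_error = lead_speed - own_speed     # negative when closing on the lead
        accel = k_gap * gap_error + k_speed * speed_error
        return max(max_brake, min(max_accel, accel))


    # A car cuts in 20 m ahead while both are doing 30 m/s: the desired gap is 60 m,
    # so the controller backs off to slowly rebuild it (exactly the behavior above).
    print(acc_acceleration(own_speed=30.0, lead_speed=30.0, gap_m=20.0))  # -> -3.0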


That is not the steady state. When a car gets in front of you and eliminates your three-second gap, it does so so it can continue at the speed of the traffic ahead. For you to be limited to 20 mph, the other cars would have to get in front and slow down. I feel that in practice I never have to do more than take my foot off the gas once in a while.


So, you're applying your experience in Australia to a city that has a population of 870,000 residents and a total daytime population of over 2,000,000, all in a 46-square-mile area, also with nearly every single intersection controlled in some fashion - no joke, pretty much every intersection has a stop sign or signal lights.

So based on this you give us your throw-away "3 second gap" comment, helpfully telling us that this is the law and something something emergency stop etc drive safe?

With an average speed of less than 10mph, and a very dense city, your normal traffic "rules" don't apply here. Don't even pretend you can understand how things are here without experiencing them.


Your post essentially means "it's okay to hit people sometimes; our city wouldn't work otherwise".


> Your post essentially means "it's okay to hit people sometimes; our city wouldn't work otherwise".

This is trivially true! Otherwise cars would not be allowed on public roads.

When it comes to cars, it's always about tradeoffs and acceptable risk.


>When it comes to literally EVERYTHING civilization does, it's always about tradeoffs and acceptable risk, regardless if whether you personally can identify those trades and risks.

FTFY


This, so much this. Forget America or even the West: in Beijing you have to be prepared for anything to happen at any time. So everyone drives “on edge”, so to speak, because you never know when some idiot will stop and back up on the expressway because they missed their exit, or some electric trike will barge straight into oncoming traffic. On the other hand, yielding is almost unheard of, and traffic will rather try to swerve around pedestrians.


You'd think a relatively authoritarian government would be able to stop people from doing this sort of thing. It's kind of amusing, from that perspective


China is authoritarian until it’s not. Most authoritarian countries are weak on rule of law and more aptly practice rule by law, where laws are just used to control the people vs. applied fairly. So laws are just used to get what officials want (e.g. get rid of a pesky dissident), and most of them don’t care about traffic safety.


> and even then some of them won't slow if they feel like they will be past them before the pedestrian would intersect with their car

I see this all the time. I'd like to point out that this is illegal driver behavior. You can be ticketed for it. It is also very dangerous for pedestrians, since it encourages people to cut the tolerances closer and closer. I've had people nearly run over my feet doing this.


Humans will have to adapt to the AIs. Large corporations generally get what they want, eventually.

Good thing is, humans are good at the adaptation bit.


The problem with autonomous vehicles at the moment is that they act like paranoid schizophrenics with the reactions of an F1 driver, anxious to apply full braking force for no discernible reason in less than 5 ms, placing everyone in and around the car in danger.


> I've nearly rear-ended a couple myself.

I have never come close to rear-ending anyone... But that may be because I keep a reasonable following distance from the car in front of me.

Tailgating is what adds danger to our roadways.


We'll get used to driving around them. You can cut off the Waymo cars and know they won't bump you.


Where in South SF? I regularly pickup/drop off my wife from work in South SF and am yet to see one


I see them in Mission Bay all the time near AT&T.


Oyster Point area. See quite a few of them puttering around near all the biotech office parks.


I see them a lot along the 3rd St corridor. Recently saw one almost come to a complete stop at a green light on 3rd in Dogpatch, before the driver took over and gunned the gas pedal.


Tons along 4th near UCSF and over across 18th in Potrero Hill.


I see one 2-3 times a week as I commute by bike up/down Bayshore Blvd.


11th street and Howard in the heart of SOMA.


WIRED talks a lot more about the drive itself, and isn't as positive about it.

https://www.wired.com/story/ride-general-motors-self-driving...


This gets at something I was taught in driving school. Slow =/= safe. Sudden stopping can be just as dangerous as any other tactic.


It depends what aspect of safe you are looking at. Higher speed = worse accident.

https://i.imgur.com/TFBOnG1.png

Hopefully we can move to 100% autonomous cars, and then crank up the speed because they are safer, have better reaction times, and are predictable.


Wow, just 10 km/h faster significantly increases fatality rates. That's crazy.

Edit: oh it's for pedestrians, not drivers.
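Part of why a small speed difference matters so much for pedestrians: over the distance at which the slower car just barely stops, the faster car still arrives carrying real speed. A rough sketch, assuming a 1.5 s reaction time and ~7 m/s² braking (textbook-ish numbers, not taken from the chart):

    import math

    def impact_speed_kmh(initial_kmh: float, obstacle_at_m: float,
                         reaction_s: float = 1.5, decel_mps2: float = 7.0) -> float:
        """Speed remaining when reaching an obstacle that appears obstacle_at_m ahead."""
        v = initial_kmh / 3.6
        braking_room = obstacle_at_m - v * reaction_s    # distance left after reacting
        if braking_room <= 0:
            return initial_kmh                           # hit at full speed
        v_impact_sq = v ** 2 - 2 * decel_mps2 * braking_room
        return 0.0 if v_impact_sq <= 0 else math.sqrt(v_impact_sq) * 3.6


    # A pedestrian steps out 35 m ahead: at 50 km/h you just stop in time,
    # at 60 km/h you still hit them at roughly 40 km/h.
    print(impact_speed_kmh(50, 35), impact_speed_kmh(60, 35))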


There was a time when my regular errands took me down El Camino in Mountain View, and at that time, NOT seeing a Google car was notable. Seeing 2 to 4 during a single outing was normal. I vividly remember when they started rolling out what I call the "jelly bean car". Sharing the road with the sensor-decorated SUV's was no problem at all. They really drove well and blended into traffic very well. The early bean cars were slow and annoying. They drove like somebody's grandparent that needs "the talk" about giving up the car keys. Very frustrating to be around. Hopefully, Waymo has addressed that by now.


Not been addressed in my experience. It's more like the other thread about rear-ending: the Waymo is typically the only car going the speed limit in traffic besides some commercial vehicles, and it's slow off the line and slows at odd times. But I do see fewer of them.


“move below the speed limit and ensure you don’t kill anyone.”

Sounds good to me!


Which brings up the question: if an AV can identify the speed limit, will it always obey it, or will operators of the vehicle be allowed to go faster?


I remember this being an issue with the Google cars on the highway. The cars, if software limited to the posted speed limit, were dangerously slow compared to the surrounding traffic. Some limited law breaking had to be introduced in order to operate safely.


My guess: If the vehicle allows manual override, and legislation does not change and force manufacturers of autonomous vehicles to prevent transgressions, the human operator will be able to go faster.


>Sounds good to me!

That's the half-lie you trot out at the beginning of a rant that ends with "think of the children".

The generalized version is "don't go too much faster or slower than the majority of drivers around you"

The one person doing 55 in a 60 (or even 60) where everyone goes 80 may reduce their own legal liability if things go bad, but they're also greatly increasing the likelihood that things go bad.

I really hope there was an implied /s that I missed.


To be fair "think of the children" can work pretty well to change the way people and governments think about cars. It got the Netherlands country wide bike infrastructure.

https://usa.streetsblog.org/2013/02/20/the-origins-of-hollan...


If you're actually right your side shouldn't have to play dirty with rhetoric and appeals to emotion to get your way (except maybe to counter an opposition that's doing so).

People as a group might not make great decisions the first time around, but broadly speaking, society figures it out eventually. Usually the group that's promoting education and telling people to draw their own conclusions is the one that's right in the long term.


The opposition is always doing so, no matter who the opposition is.


In that case it's not the person obeying the law that's endangering anyone, it's everybody else.

Your argument is an excuse irresponsible drivers use to justify being reckless and endangering others.

Safety aside, you couldn't expect people to consistently drive over the speed limit due to the constant speeding fines and licence loss alone. Speed cameras don't care if you were "going with the traffic".


The speed limit is less important to me than the "ensure you don't kill anyone". Any deviation from the normal flow of traffic will decrease as self-driving cars get increasing numbers of miles under their wheels.


I've read half a dozen articles on Cruise's press event, and they all tell similar stories. I guess whether you view them as positive depends on your expectations. Downtown SF is a very challenging driving environment, and Cruise is doing some pretty awesome stuff relative to the other established and well-capitalized efforts.


> But this chaos—this unpremeditated waltz of oops, no, you go and nope, buster, me first—is reality. It’s how cities work. Which means that if a car is going to drive itself, no humans drivers involved, it must get really good at interpreting and anticipating the behavior of human non-drivers.

I'm not sure why it's always pushed as a near-unattainable problem to have cars react in a human-ish way in situations like these.

All it needs is machine learning of thousands of situations of purely human drivers or watching other cars around driverless ones, and adapting the models to be able to apply them in particular situations. Such as a 4-way stop or someone rolling through a stop-sign.
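In its simplest form that's behavior cloning: learn a mapping from observed situations to the actions human drivers actually took, then query it in similar situations. A toy sketch (the feature names, logged numbers, and the scikit-learn model choice are all mine, purely illustrative):

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Hypothetical logged examples: [speed_mps, gap_to_lead_m, pedestrian_nearby,
    # cars_waiting_at_4way] -> acceleration (m/s^2) the human driver applied.
    X = np.array([
        [12.0, 40.0, 0, 0],
        [12.0,  8.0, 0, 0],
        [ 5.0, 30.0, 1, 0],
        [ 3.0, 25.0, 0, 3],
    ])
    y = np.array([0.5, -2.5, -1.5, -0.5])

    # Fit a "drive like the humans in the log" policy.
    policy = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

    # Ask the cloned policy what a human would likely do in a new situation:
    # 10 m/s with only 12 m of gap -> expect it to ease off rather than panic-brake.
    print(policy.predict(np.array([[10.0, 12.0, 0, 0]])))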

There seems to be an obsession with defaulting to the idealistic but unrealistic safety-first behavior (as designed, say, by idealistic traffic engineering and law enforcement), rather than focusing on learning to drive like humans with the fewest errors, where rules are more flexible and risk is evaluated without much loss of safety.

We live in a very tightly controlled rules based society on paper, which seems like a major roadblock these systems have to confront, as they seem to pretend real safety is following these existing rules exactly. Yet it sounds like it would be far better if it broke the rules about 10-20% of the time... like most humans do for the sake of efficiency and merely adapting to the situation rationally.

We should be giving the machines as much leeway as humans, or more, much like how we hardly give jaywalking or going 10 km/h over the speed limit a second thought - but applied to a much larger group of compromises.

This may be a bit of my own libertarian bias, a worldview where chaos is okay, and even encouraged, as long as no one gets hurt (to a reasonable degree). But the actual problem sounds to me like the traffic rules aren't well adapted to machines yet, which tend to take them far more literally than humans do, rather than this idea being pushed that the machines are navigating the rules poorly.


“We will not launch until we have safety perfect,” General Motors President Dan Ammann said

So they are never going to launch?


'perfect' probably ties more to being legally safer than standard cars in the public eye than anything else.


Yet I wonder how much the ride quality will matter. If anyone can operate a ride service at scale that's significantly cheaper than Uber (without drivers to pay) and with a better safety record than most human drivers, I see these quirks being quickly overlooked. Like many bleeding edge innovations, early adopters and young adults will lead the way as it will make sense for many of them to go without owning a car.

Also, when such a service launches more widely, most of the miles will probably be driven on highways, which will probably be far less prone to surprises like jaywalking pedestrians.


Only if there isn't competition. Odds are that they won't be competing with Uber, but some other self driving service that would likely have a higher quality ride. Perception is important - just look at how people have switched from Uber to Lyft where they can.


Thanks, I was reading the NYT article looking for evidence of pack leading, but discovered that they haven't even started real-world urban tests...


They've been doing real-world urban tests for about 18 months now. What do you mean?


Ok I should have said "dense urban" or something, since downtown or Manhattan is a whole different ballgame from suburban & highway.

The article appeared to imply that they have not done actual urban driving yet:

""" The company plans to begin tests in Manhattan early next year.

Mr. Vogt is keen to prove that self-driving models can navigate complex urban environments such as downtown San Francisco, rather than just highways and suburban streets. """

Curious if anyone got a different reading from the article... (or perhaps if the journalism was just sloppy? idk)


They're heavily deployed in downtown San Francisco. That's where these test drives occurred, and they've posted tons of video over the last few months of downtown SF drives. Manhattan will be a new market, but downtown SF is indisputably dense and urban.


So, G.M. has a decades-long history of often doing the wrong thing when it comes to safety, in the name of profits. Ask yourself: would you trust these folks to auto-drive you anywhere?

http://money.cnn.com/2015/12/10/news/companies/gm-recall-ign...

https://en.wikipedia.org/wiki/General_Motors_ignition_switch...


Well, at least they don't have the gall to call it Autopilot, a name with connotations sufficient to lull a lot of people into believing the product already delivers more than it can, and likely ever will.

The issue we face now is who is leading and/or taking charge of this space, how they are actually doing it, and which solution is the best.

With more players we might get lucky and see Level 5 within a decade or two. AVs are perceived by many as being in the same category as any other safety equipment, be it as simple as a seat belt or an airbag, or as complex as ABS and traction control. As such, they have to meet a much higher standard.

GM has recently shown they can move in any direction they want, and very quickly. They managed to launch a relatively affordable 200-mile-range EV in short order, with no availability issues other than a deliberately staggered release.

Why would any of the existing automakers cede any innovative technologies, and why would we not celebrate all of them trying?


I don't know where those connotations came from. Any basic understanding of autopilot functions in existing vehicles, like airplanes, makes it immediately obvious how limited in function autopilot is.


Most people don't have a basic understanding of how autopilot works in airplanes.


Honestly, probably, because the bar is so very low. Humans are terrible at driving safely. We just accept it because until now we've had no alternative. I don't think a newcomer would have to execute perfectly, or even particularly well, to do better than the status quo.


The bar is actually pretty high. Making a system that fails less than once per hundred thousand kilometers is not trivial.
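As a rough, hedged illustration of why (my own back-of-the-envelope numbers, not figures from the article): with zero observed failures, the classic rule-of-three bound puts the 95% upper confidence limit on the failure rate at about 3/n, so even demonstrating "under one failure per 100,000 km" already takes on the order of 300,000 failure-free test kilometers, and far more once failures do occur.

    # Rule-of-three estimate (illustrative only): test distance needed before a
    # "fails less than once per 100,000 km" claim is statistically credible.
    target_rate = 1 / 100_000            # claimed failure rate per km

    # With zero failures over n km, the approximate 95% upper confidence bound
    # on the true failure rate is 3 / n, so we need 3 / n <= target_rate.
    required_km = 3 / target_rate
    print(f"~{required_km:,.0f} failure-free km needed")   # ~300,000 km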


Paraphrasing someone else's excellent take:

Grandparent, to child, riding in an automated vehicle, "Can you believe, once upon a time people steered these?"

"What? How did that work? Was that safe?"

"Oh, no, 30,000 people died every year!"


1 mistake in 109 years does not seem too bad. Is there data to compare with other auto manufacturers?


Well there was the Corvair (unsafe at any speed) and the Vega (exploded if hit from behind) and I'm sure others. Where did your 1 in 109 stat come from?

Edit: I take back the Vega. While that was an awful car, I was actually thinking of the Ford Pinto.


> Edit: I take back the Vega. While that was an awful car, I was actually thinking of the Ford Pinto.

Turns out a lot of cars from that era had the same problem, and in general subcompact cars of the time were rather dangerous.

The famous cost benefit analysis memo was also misquoted, and taken out of context.

Economy sub-compact cars have always met, and always will meet, exactly the minimum safety requirements put forth by the government. That is the entire point of an economy car: to make the cheapest car legally possible.

Want safety without regard for price? Drive a Volvo.


Curious to know how long folks here think it will be till we see AVs rolling around cities in America? I'm (literally) banking on less than 5 years, but curious what others think.



I think there's no chance we'll see them everywhere in less than 20 years. Right now they don't even do great in sandbox-controlled places with constant climate like Phoenix, AZ.


Source?


https://www.wired.com/2017/02/self-driving-cars-cant-even-co...

Every major city has construction zones. Right now this is a no-go for self-driving cars.


w00t! Thankfully that's the point of my startup: the city can tell the car about things in advance of them happening (construction, 911 responses, etc.).


Why would they? Traffic signs and sensors should be all you need to navigate in traffic. If it takes some service to keep you safe, then suddenly traffic is only as safe as that service is reliable.

And lots of this stuff is reactive or generated on the spot and the city will have no knowledge of it.


Even when the safety issues are resolved to the point where self-driving cars are at least as safe as humans in construction zones, they may still be less safe there than the same cars are elsewhere, so it may benefit safety both to use self-driving cars without an advance-notice system and to use advance notice when it is available.

OTOH, even if it becomes a safety non-issue, advance knowledge of construction, 911 responses in progress, and other events on the potential routes can be useful to route planning in computer navigation systems whether or not they are used in self-driving vehicles.


Urs Holzle of Google said the following at Structure a couple weeks back: "You don't want to have a self-driving car that depends on an uplink for its driving ability. Vehicle to vehicle communication has to be a plus not a required."

As with detailed maps, other types of information would probably improve the overall safety and reliability of an autonomous vehicle. But, at some level, it needs to be able to drive on its own based on the environment it can directly sense.
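To make the "plus, not required" idea concrete, here is a minimal sketch under my own assumptions (the Advisory fields and helper functions are invented; this is not any real municipal or vehicle API): the planner works from sensing alone, and advisories only bias the result, so a missing or stale feed costs convenience, never safety.

    # Hypothetical advisory feed: every name here is invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Advisory:
        kind: str          # e.g. "construction", "emergency_response"
        street: str        # affected street
        expires_at: float  # unix timestamp

    def plan_from_sensors(candidate_streets):
        # Stand-in for the onboard planner: it must work with no feed at all.
        return list(candidate_streets)

    def plan_route(candidate_streets, advisories=(), now=0.0):
        route = plan_from_sensors(candidate_streets)
        # Advisories only bias the plan; they are never load-bearing for safety.
        avoid = {a.street for a in advisories if a.expires_at > now}
        return [s for s in route if s not in avoid] or route

    print(plan_route(["Market St", "Mission St"],
                     [Advisory("construction", "Market St", expires_at=10.0)],
                     now=5.0))    # -> ['Mission St']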


That's basically what I was saying upthread: self-driving cars need to get safe enough in all conditions to replace humans without relying on data broadcasts, but once they reach that point, data broadcasts that let them identify and avoid situations that are relatively more dangerous, and/or improve routing based on dynamic conditions, can still have a valuable role.


Your utopia sounds wonderful. However, that's not how bureaucracy and public safety work.


I think you have that backwards.

Bureaucracy and public safety are mis-aligned to the point where a service relying on bureaucracy with a direct impact on public safety is going to be an uphill battle.

It also comes with some very interesting liability issues, such as: if the input data to that service is faulty and you then broadcast it to a bunch of AVs, which react to it, leading to an accident, where would you assign blame?


To me, an autonomous vehicle means the car can drive me from point A to point B in any weather condition a normal person would drive in. I doubt it will be 5 years.

If you are talking about a lower level of AV, then I would expect some cool stuff in 5 years.


Agreed. I believe the jump from "fantastic driver's aid" to "fully autonomous" in the consumer auto market will take a decade or more.

And even when there are true AVs available, adoption won't come without some pains and legal changes. If my AV runs over a pedestrian, who is at fault?

I hope we see AVs available for limited uses in the 5 year time-frame. Possibly limited to restricted areas and designated roadways. Or, maybe faster adoption in the commercial space, as there are potentially huge cost savings for shipping/transport companies.


I'd be totally satisfied with just an advanced cruise control for highways that gets you within a kilometer of your exit before handing back control.


In daylight and dry weather. As a CA resident this would allow the AI to take over the vast majority of the miles I drive throughout the year.


They will be there but they won't be autonomous. Not now, probably not ever.

They will be remote monitored and remote controllable.

Instead of having a driver per car you will have a team in some control center that monitors all the driverless cars and can take over if any get into trouble.

Full autonomy is a pipe dream, but cars are a symbol of freedom in America, and just the idea of giving up control to a computer is enough to put off most people.

If people knew that autonomous cars will require giving full remote control to a corporation or government agency the idea would be dead in the water, so this part is kept in the fine print.


The aspect of freedom derived from cars is based on one's ability to get from where they live to their destination at any time, no matter where that location may be. Self-driving cars only increase the degree to which that freedom operates, as they will always be able to let you out at your final destination; you will no longer have to spend 30 minutes finding parking, as is the case in many cities today.

Yes, people will be skeptical of shared self-driving cars, but to suggest the problem is the self-driving aspect seems a bit off.


>The aspect of freedom derived from cars is based on one's ability to get from where they live to their destination at any time, no matter where that location may be.

Not necessarily. Part of the freedom is that you don't have to have a destination in mind. You can simply drive to wherever you want, whenever; you can explore. This also does not mention issues about surveillance, corporate control, privacy, etc.


You are thinking of things only from a utilitarian perspective, not an emotional and psychological perspective.

Driving provides a very real sense of freedom and power to people that they may not find in many other aspects of their life.

Most people will also remember being teenagers and having their parents controlling the keys.

Autonomous cars are like going back to being a teenager except now the government or some corporation owns the keys and tells you what is allowed.

Tech people don't always think about these things but they will have a major impact on how the technology is adopted or not adopted.


Personally speaking, I feel very differently, as I grew up in NYC but later moved to the suburbs during high school. In NYC a 6th grader has the same level of freedom that, in the suburbs, you only get once you have your license, and it's very possible that with the advent of self-driving cars such freedom will later exist in the suburbs too.

Additionally, I just think we'll have to see about that latter part. For me, the freedom is freedom of transport; the cars are just an expression of that.


For ownership purposes, closer to 10-15 years; but for an on-demand service where the car can decline certain routes it isn't confident in, 5 is definitely reasonable.


Do they work in icy or snow-related low-visibility conditions? I live in Wisconsin, so they would have to tick both boxes for me to consider them.


Waymo has them launched in Arizona already.

So, -1 months?


More like they expanded their testing in Arizona. They are not commercially available.

And if we are counting test vehicles, they have been rolling around American cities for decades.


I hope soon - I have better things to do with my commute time than stare at the road.


I see it every day, actually, though I'm in a dev city.


"(literally) banking"

How?


My startup builds city infrastructure that allows cities to define certain things via API; AV regulation will likely be one of those things.


Might be putting off buying a (new) car with hopes of ride-share networks being price competitive. That's what I'm doing.


Funny statement: “You don’t see any start-ups building iPhones,” he said.

Well, Apple don't license their software and would sue into oblivion anybody who tried copying them. It's not hard to find start-ups building Android phones, however.


Apple is also sometimes called "the world’s biggest startup," and Tesla is borrowing from the same playbook…

If I were at GM, I'd worry about that line from Palm CEO Ed Colligan: “We’ve learned and struggled for a few years here figuring out how to make a decent phone. PC guys are not going to just figure this out. They’re not going to just walk in.”


This comment is unfortunately not about the article, because I can’t view it on my iPhone. The link automatically opens up the app, but then the app doesn’t go to the article. Does anyone know how to get around this issue?


Uninstall the app since it’s just doing the job of a webpage anyway?


Why is GM (and others) allowed to test driverless cars in SF?

Didn't Uber get kicked out of SF for its driverless program and have to relocate to AZ?


It's legal to test them, but you have to register them as such. Uber refused to register them (as I understand it) because it felt they didn't meet the legal definition of autonomous.

https://www.nytimes.com/2016/12/14/technology/uber-self-driv...


Uber didn't get the required permit (which literally cost < $100) from the state DMV.

Rules? Who needs rules? -- Uber's motto


I don't know why I'm being downvoted. Please read this article [1]. Quote: "The move comes after Uber refused to apply for a $150 permit that would designate the cars as test vehicles, and allow them to be used on Californian roads, with the company arguing that the documentation didn't apply to its specific self-driving cars."

[1] https://www.theverge.com/2016/12/21/14049070/uber-san-franci...


It's less about flouting rules and more about wanting their cars to be classified a different way.


You mean it's more complex than not having the foresight to get a permit? You don't say... So much reductionist criticism of these companies whenever these issues come up on HN and elsewhere.

The issues are far more complicated. These companies, including AirBnB et al., are not simply pushing the idea that "all regulations are bad" or "we're above regulations". It's about pushing back in an effort to get better regulations appropriate for a modern industry. Typically existing regulations were designed for a previous era or are inefficiently pigeonholed onto entirely new concepts.


I've seen tons of Cruise vehicles on the roads of SF, clearly being driven by a human.


They always have a human in the driver's seat with their hands close to the wheel, but the one time I was close enough to see, the wheel was turning on its own and their hands were just hovering over it.


Hope they are built in the US by union labor.


Could you please avoid generic ideological tangents, like the guidelines ask? They don't lead to the kinds of discussions we're here for.

https://news.ycombinator.com/newsguidelines.html


Do robots have unions yet? If not, then probably not. :)


GM is one of the last American companies that still has a union. Hope they don’t use this as an excuse to throw that proud tradition away.


The proud tradition that drove them to bankruptcy not too long ago?


You might want to check your facts if that’s what you believe. Or else I’ve got a bridge to sell you.


Seems to be an article about an upcoming event? Not really unveiling anything.


Nah - this happened yesterday (the day before your post). It's unusual for media to be allowed inside autonomous cars, and the conditions are usually controlled. The conditions in this demo were fairly loose (difficult public streets in San Francisco) and also involved relatively little intervention by safety drivers. (Uber did a demo in SF once but it involved very frequent human takeovers. Alternatively, Waymo had a recent demo on a closed course (I think the one at Castle Air Force Base) and others in Arizona suburbs where conditions are less challenging.)

Fwiw as a human driver I find driving in San Francisco to be pretty challenging/stressful.


Not going to lie, I'm not sure how much I would trust a GM driver-less car.


OK, so you are telling the truth; care to elaborate on the why part?


The quality of many GM products has been questionable for ~20+ years. Remember how poorly they handled the ignition switch recall[1]?

[1] https://en.wikipedia.org/wiki/General_Motors_ignition_switch...


GM invests in puff piece PR.

Waymo invests in self-driving tech. We'll see which one wins out.


On the other hand, GM actually makes vehicles...


Remember kids: when it’s about cool companies like Google, it’s HN-worthy, but when it’s about old dad companies like GM, then it’s PR fluff to be downvoted - even when they’re doing the same thing.



