
Self driving vehicles

Discussion in 'General Technology Discussions' started by classic33, 27 Oct 2015.

  1. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    Well with the right software and comms all the vehicles would have known what was around the corner. Problem solved.
     
  2. EddyP

    EddyP Well-Known Geek

    Sounds like they could have made it much worse for him, I'd say he was pretty lucky!
     
  3. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
  4. classic33

    classic33 Über Geek

  5. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    https://www.bloomberg.com/news/arti...tepped-suddenly-in-front-of-self-driving-uber

    Well it's finally happened. A pedestrian has been killed by a self-driving car. Though it appears from initial reports that it was probably unavoidable regardless of who was driving, as they seem to have stepped out suddenly in front of the car before either the machine or the backup driver could do anything about it.

    It will be almost impossible to reduce deaths to zero, but stopping all automated testing for the time being, until all the facts are known, is the right move; then they can work out whether any improvements can be made before restarting. I guess this will also be the test case for the legal implications of a self-driving car causing someone's death. Do they get off without paying anyone if it's proved to be her fault? Will they have to pay a fine? Any other criminal charges?

    My take would be that they should have to pay something, if nothing else as a financial incentive to ensure this sort of thing doesn't happen that often. If you are reducing death or injury dramatically, then you can afford to pay top dollar when it fails. It seems the only way to make sure they keep safety at the front of their minds... Then again, you don't want the victim's family to get all the money, or you will find people throwing themselves under cars so the people they leave behind get a nice payout. Like the old life insurance / suicide thing.

    Will be interesting to see how this plays out.
     
  6. classic33

    classic33 Über Geek

    "Pedestrian" was actually a cyclist pushing a bicycle (a road vehicle) at the time.
     
  7. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    Yer, I've seen the video now of the lead-up to the 'crash', and a few more bits of information are out.

    The first thing I think they should be pulled up on is that the car was going 5mph over the speed limit. I don't know if the outcome would have been any different if it had been going the speed limit, but limits are there for a reason. I can't believe this is the self-driving algorithm learning to go faster, unless it's learning from how fast others drive down those roads. It seems more likely they have designed the cars to push the speed limit to the max and try to keep journey times shorter. Well, that just seems wrong to me. Self-driving cars should obey the rules of the road, and that includes the speed limit.

    The second thing is something that doesn't really surprise me at all. The car was reported as having someone behind the wheel, and this is often a safety fallback for a lot of self-driving tests. Though when you watch the video of what that person was doing, it seems like he is looking at the dashboard screen more often than the road. If they are there for any kind of safety reason, then they should be watching the road all the time like a real driver, ready to hit the brake or take control of the steering wheel at a moment's notice. This event just shows how having a human 'watching a machine do its task' usually leads to that human believing the machine can cope, and switching off.

    Another quite worrying thing is that the person and their bike seem to be quite an obvious obstacle. You have a person, who should show up on infrared, and a highly reflective metal bike. Both should be easy to spot. They also don't seem to have been travelling that fast, so it's not like they jumped out in front of the car with no notice.

    The final worrying thing is the reports, or conspiracy theory, rolling around some of the web that the police might be painting a picture that lays more blame on the human outside the car than on the car itself. Some point to the theory that pressure or financial incentives may have been exchanged to get people believing it was just some careless person crossing in the wrong place who was to blame, rather than having a big, well-funded corporation blamed.

    I've also heard some people say we should only cross the road at designated places, wear bright clothes, or all carry chips so the cars can spot us. While crossing safely and wearing bright clothes at night are always good safety considerations, I think it's totally unrealistic to expect everyone to adopt this. So it's a non-starter: self-driving cars need to be able to cope with people in dark clothes crossing at unexpected places, just like a human driver may have to.

    I think there are a few things that could be done to help prevent these types of incidents, or at least reduce the risk. Firstly, self-driving cars should have their speeds capped: at the very least to within the speed limit, or even better to 5mph under the speed limit until they are proved to be safer. Many studies have shown that dropping your speed by 5mph can drastically reduce the likelihood of death in many cases. This accident happened in a 40mph zone with the car doing 45mph. I reckon if the car had instead been doing 35mph, the outcome could have been different.
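
    The capping idea above can be sketched in a few lines. This is a minimal illustration of the poster's suggestion, not any real vehicle's control code; the function name and the 5mph margin are assumptions taken from the post.

```python
def capped_target_speed(posted_limit_mph: float, margin_mph: float = 5.0) -> float:
    """Hypothetical trial-period speed cap: command at most the posted
    limit minus a safety margin, never below zero."""
    return max(posted_limit_mph - margin_mph, 0.0)
```

    In the 40mph zone described above, a 5mph margin would give a 35mph target instead of the 45mph the car was reportedly doing; setting the margin to zero still never allows the car above the limit.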

    Another thing I've championed for a long time is a self-driving car standard, something akin to HTML, where all the major players have to come up with a set of rules they all follow but are allowed to implement themselves. One of the beauties of this is that it encourages openness and sharing: everyone involved has to pitch the ideas they want added and sell them to the rest of the industry. It also creates competition over who implements the rules best. Chrome took over the browser market partly because it did such a good job of implementing HTML in a correct and fast way. If Apple made a self-driving car as bad as its HTML implementation, only an idiot would get in it.
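
    In software terms, the "shared standard, competing implementations" idea is just a common interface that each vendor implements in their own way. A minimal sketch, with entirely hypothetical class and method names:

```python
from abc import ABC, abstractmethod

class DrivingPolicy(ABC):
    """Hypothetical shared 'standard' every vendor must implement,
    analogous to browsers each implementing the same HTML spec."""

    @abstractmethod
    def plan(self, sensor_frame: dict) -> dict:
        """Map one frame of sensor data to control outputs."""

class CautiousPolicy(DrivingPolicy):
    """One vendor's implementation: brake whenever an obstacle is near."""

    def plan(self, sensor_frame: dict) -> dict:
        if sensor_frame.get("obstacle_distance_m", float("inf")) < 30:
            return {"throttle": 0.0, "brake": 1.0}
        return {"throttle": 0.3, "brake": 0.0}
```

    Vendors would then compete on how well their `plan` behaves, while regulators and other cars only need to understand the common interface.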

    I can't see this one accident changing the timeline of the roll-out of self-driving cars much, but I do hope we learn some lessons and it helps us get to a good set of self-driving rules quicker. Though we will have to see how it plays out when all the facts are known.
     
    Last edited: 24 Mar 2018
  8. classic33

    classic33 Über Geek

    Goes back to the cyclist and tanker scenario set earlier. Seems cyclists and anyone else outside the vehicle come second to those inside.

    With regard to the speeding, you'd think, with all the satnavs in use and a decent one being part of any self-driving car, that the speed limits would either be picked up by the satnav and relayed to the control system, or "inform" the control system through being pre-programmed.
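
    The two data sources mentioned above could be combined with a simple lookup that prefers the satnav's live map data and falls back to pre-programmed limits. A sketch with made-up data structures, purely to illustrate the idea:

```python
def current_speed_limit(satnav_limits: dict, preprogrammed: dict, road_id: str):
    """Hypothetical limit lookup: prefer live satnav map data, fall back
    to limits pre-programmed into the control system, else None."""
    if road_id in satnav_limits:
        return satnav_limits[road_id]
    return preprogrammed.get(road_id)
```

    Either way the control system ends up knowing the limit, which makes the 5mph-over reading in the Uber case harder to excuse.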

    None of this should take anything away from the "failsafe" that failed, though.
     
  9. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
  10. Yellow Fang

    Yellow Fang Veteran Geek

    Location:
    Reading
    I had an interview with a company that is developing a driverless car. Actually, what they are doing is putting some sensors and an NVIDIA control box into a Renault Twizy. Apparently there are a lot of open-source car-driving algorithms out there. They have to develop some of their own firmware and middleware, but I don't think they have to develop all the algorithms. I doubt I'll get the job, but it was an interesting place to visit.
     
  11. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    Yer, as I see it... self-driving cars are like early computers, where you had custom OSes until Windows / OS X / *nix took over as the dominant platforms. A lot of the self-driving cars run on the same platforms, and how well those are implemented will determine how well they work. In the case of Uber, it sounds like it was a really bad implementation of the self-driving platform they were using.

    I was reminded today of why I want this to succeed as soon as possible, when a taxi in my local city centre sped round a corner blaring its horn at people casually walking across the road in the sunshine, then narrowly missed me as it swung at the last minute, without indicating, into the side street I was crossing. Humans seem far too hot-headed to responsibly drive lethal machines around pedestrian-filled streets, and I can't wait till they are all replaced with predictable self-driving vehicles.

    Hopefully lessons learnt from the Uber mess will drive (pun intended) people to settle on one or two platforms that work, so those get better quicker, rather than everyone trying to develop their own self-driving systems.
     
  12. classic33

    classic33 Über Geek

    College Student Turned His Honda Civic Into a Self-Driving Car For $700
    Using plans he found online.

    Apparently you don't need to fork out for a brand new Tesla model if you want the benefits of a self-driving car, because a college student from the University of Nebraska, Omaha, has figured out how to transform his Honda Civic into a self-driving car for only $700.

    Brevan Jorgenson used open source plans and software that he downloaded from the Internet to build a self-driving car kit that's capable of controlling his vehicle's brakes, steering, and accelerator, and can sense obstructions and other cars on the road around it.

    Jorgenson used to be an early beta tester for Comma.ai - an ill-fated self-driving technology startup in San Francisco that became mired in controversy last year.

    He says he took his self-driving Honda for its first test-drive back in January.

    "It was dark on the interstate, and I tested it by myself because I figured if anything went wrong I didn't want anybody else in the car," he told Tom Simonite at MIT Technology Review.

    "It worked phenomenally."

    The software and plans Jorgenson used to make his self-driving kit are from Comma.ai, which put them online last year after failing to deliver on promises that it would sell self-driving car technology to consumers for less than $999 by the end of 2016.

    The kit, called Comma One, was immediately disappointing, because it ended up being compatible with just two car models on release.



    https://www.sciencealert.com/this-college-student-made-his-honda-civic-a-self-driving-car-for-700
     
  13. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    Yer, I think someone will create something like Android, which, like Android, will be mostly open source and will also be snapped up by a big player. Like when Google bought Android; now something like 90% of phones run on it, with niche alternatives like the iPhone.

    I think cars will be similar. There will be one dominant system, considered safe, secure and well funded, that most cars run on. But technically you could build one yourself quite easily. How it copes with tricky decisions may vary, but the simple act of driving a car on quiet roads is relatively easy to do. I think, like with drones, you will eventually need to pass expensive tests before that car is road legal.
     
  14. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    I've recently been back on the road both as an occasional car driver and cyclist. Now I'm quite a polite road user and will often wave to another road user or pedestrian signalling that they can go first, even if technically I might have right of way.

    Which has got me wondering how self-driving cars deal with this. If you have someone waiting near a pedestrian crossing with no intention of crossing the road, or cars waiting to turn onto the road when you can see you're going to get caught by the upcoming traffic lights, and similar situations. I'd imagine current-gen technology is probably not very good at working out these situations.

    It also often amazes me how good people are at subtle sign language with each other: a small wave of a hand or finger, a nod of the head and the like. With a self-driving car there seem to be a couple of problems. Firstly, the non-driver in the driving-seat position could make a gesture that gets misinterpreted by other road users. Also, you will have the frustration that the car can't easily signal to other users; yes, it could flash its lights, but without eye contact a flash of the lights can be hard to interpret.

    I'm sure eventually we will work out a system to solve these problems, maybe something like a display on the front to give information to external road users. Obviously, once almost all cars are self-driving, they will probably talk via a wireless link, letting other cars around them know where they are going, what they intend to do, and whether another car can filter into the traffic from a side road. Till then it could be a bit messy, with a mix of humans and an increasing number of robotic vehicles.
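
    The car-to-car broadcast described above boils down to a small intent message that every vehicle can serialise and read. A sketch of what such a message might look like; the field names, the JSON wire format, and the intent vocabulary are all invented for illustration:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    """Hypothetical vehicle-to-vehicle broadcast: where I am, how fast
    I'm going, and what I intend to do next."""
    vehicle_id: str
    lat: float
    lon: float
    speed_mph: float
    intent: str  # e.g. "turn_left", "yield", "merging"

    def to_wire(self) -> str:
        # Serialise to JSON for broadcast over the wireless link.
        return json.dumps(asdict(self))
```

    A car waiting at a side road could then read a nearby car's "yield" intent directly, instead of guessing from a flash of the lights.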
     
    classic33 likes this.
  15. amusicsite

    amusicsite dn ʎɐʍ sᴉɥ┴ Staff Member

    Location:
    UK
    https://arstechnica.com/?p=1582773

    Well, after almost a year of self-driving cars with a safety driver, it appears Waymo is ready to try out a fully self-driving car. The cars will turn up with no one inside and be remotely monitored from a central base, along with a panic button to talk to a human back at base.

    As I understand it, this is restricted to an area where they have driven many trips, and to people who have signed up to be part of the test, where you book your trip before you go. So I'd imagine they cherry-pick trips the car's A.I. can cope with well, possibly at times of day with light traffic and few pedestrians.

    So probably still a long way from a full roll-out. I do quite like this bit-by-bit approach. A bit like the new feature from Tesla which can summon your car. For this you need to be close to the car; within visual distance is their guideline. You have to keep pressing a button on the app for the service to continue; if you lift off, the car stops dead. The car can only go about 6mph, so it's not likely to do much damage if it hits you. So far it has been used about a million times, with a few dozen reported problems.
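
    The hold-to-operate control described above is a classic dead-man switch: the car may only move while recent button presses keep arriving. A minimal sketch of the pattern, with an assumed timeout value; this is not Tesla's actual implementation:

```python
class DeadManSwitch:
    """Hold-to-operate control: movement is allowed only while button
    presses keep arriving within the timeout window."""

    def __init__(self, timeout_s: float = 0.5):
        self.timeout_s = timeout_s
        self.last_press = None  # timestamp of the most recent press

    def press(self, now: float) -> None:
        self.last_press = now

    def may_move(self, now: float) -> bool:
        # Stop dead unless the button was pressed within the timeout.
        return (self.last_press is not None
                and now - self.last_press <= self.timeout_s)
```

    The design choice here is fail-safe defaults: losing the app connection or releasing the button looks identical to the car, and both immediately stop it.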

    Data is king in these trials. Does it work as expected? If not, do the tweaks to improve it work, or do they break other things? It's a confidence game: you have to get to a point where people feel confident in the technology.

    I heard a good analogy recently with early elevators/lifts. Originally you had a lift operator; then automation came in. People were nervous about these new systems, and I'm sure there were a few fatal mistakes. I remember getting into some of the new fast lifts in the 70s/80s that were a little scary, and there were plenty of films and books about demonic or malfunctioning automated lifts, or accidents in them. Obviously it's a lot easier to move a box up and down than to drive in a continuously changing environment, but I think the same basic pattern holds. People will be nervous of this new technology, and it will not be perfect to start with. With an increasing number of these cars, though, probably driving all day every day, the amount of experience gained will grow exponentially. Quite possibly someone born today will not think twice about getting into a self-driving car by the time they are old enough to drive, because by then we may have clocked up trillions of safe trips with automated cars.

    For now I think small trial areas, starting with a safety driver until they are sure the cars know the routes and the dangers, then slowly rolling out the self-driving cars. Once sure it's all good, move on to the next area, quite possibly reducing the time needed from the current year to months or weeks as the technology develops.