• WorseDoughnut 🍩@lemdro.id · 1 year ago

    I think what it really boils down to is that the vast majority of drivers who run red lights choose to do so out of stupidity, whereas someone trusting Tesla’s claims about their new “self-driving” car might not have the chance to stop the vehicle as it hurtles through a red light. So yes, in terms of raw numbers it will cause fewer accidents in some cases, but the fact that it can happen at all, when the average trusting consumer/user would never expect that from a normal car, should be a huge issue.

    Also, as far as liability goes, I’m horrified to think about what the future of vehicle injury lawsuits will look like in the US when the driver can blame the software and the company providing the software is run by a grifter asshole.

    • LittleLordLimerick@lemm.ee · 1 year ago

      Your concern seems to be for the pilot of the car that causes the accident. What about the victims? They don’t care if the car was being driven by a person or a computer, only that they were struck by it.

      A car is a giant metal death machine, and by choosing to drive one, you are responsible not only for yourself, but also the people around you. If self-driving cars can substantially reduce the number of victims, then as a potential victim, I don’t care if you feel safer as the driver. I want to feel less threatened by the cars around me.

      • assassin_aragorn@lemmy.world · 1 year ago

        This is exactly the issue. The driver isn’t at fault, because they’re not even driving the thing. The program is. But are we going to prosecute a programmer who genuinely tried their best to make a good product?

        Unless we hold the corporations themselves liable for this, there is no recourse. And we should hold them liable. If they can be sued for accidents caused by self-driving cars, they’re sure as hell going to make them as safe as technologically possible.

      • WorseDoughnut 🍩@lemdro.id · 1 year ago

        You’ve managed to make up an angle that I took absolutely zero stance on, and then get mad about it. Congrats.

        Everyone on the road near these stupid things is at risk when Tesla pretends it’s road-safe and moronic drivers trust it to obey traffic laws. The concern is obviously for everyone involved; not sure why you’re pretending I said otherwise.

        If I know I’m mostly surrounded by humans who, on average, don’t accelerate through red lights, I can make certain assumptions on the road. Yes, the car next to me could randomly swerve into my lane, but on average you can assume it won’t unless you also see something happening farther ahead in its lane. When you add the combination of a bad-faith company and a terminally naive driver I described above, you drastically increase uncertainty and risk for everyone within range of the nearest Tesla. The unknown in that equation is always going to be the stupid fucking Tesla.

        • LittleLordLimerick@lemm.ee · 1 year ago

          So then you’ve just circled back around to what I originally said: is it actually true that you’re at more risk near a Tesla than you are near a human driver? Do you have any evidence for this assertion? Random anecdotes about a Tesla running a light don’t mean anything because humans also run red lights all the time. Human drivers are a constant unknown. I have never and will never trust a human driver.

          • assassin_aragorn@lemmy.world · 1 year ago

            I would actually say yes, the Tesla poses more risk. Driving safety is all about anticipating what other drivers are going to do. After commuting in Houston for 2-3 years, I became quite good at identifying scenarios where something dangerous could happen. I wasn’t always right about whether they would actually happen, but I was always prepared to take action in case they did. For instance, if the positioning is right for someone to suddenly cut you off, you can hang back and see if they’ll actually do it. If a larger car is next to you and you’re both making a turn, you can be wary of it spilling into your lane. I actually avoided a collision today because of that.

            We have a sense of what human drivers might do. We don’t have that sense for self-driving cars. I can’t adequately predict when I need to take defensive action, because their behavior is totally foreign to me. They may run a red light well after it’s turned red, while I would expect a human to do that only if it had recently changed. It’s very rare for someone to run a red when they pull up to a light they’ve only ever seen as red.

            This same concept is why you can’t make a 100% safe self-driving car. Driving safety is a function of everyone on the road. You could drive as safely as possible, but you’re still at the mercy of everyone else’s decisions. Introducing a system that people aren’t familiar with will create a disruption, and disruptions cause accidents.

            Everyone has to adopt self-driving technology at about the same time. When it’s mostly self-driving cars on the road, it can be incredibly safe. But the in-between period, where it isn’t fully adopted, is an increase in risk.

            • LittleLordLimerick@lemm.ee · 1 year ago

              > This same concept is why you can’t make a 100% safe self-driving car. Driving safety is a function of everyone on the road. You could drive as safely as possible, but you’re still at the mercy of everyone else’s decisions. Introducing a system that people aren’t familiar with will create a disruption, and disruptions cause accidents.

              Again, we don’t need a 100% safe self-driving car; we just need one that’s at least as safe as a human driver.

              I disagree with the premise that humans are entirely predictable on the road, and I also disagree that self-driving cars are less predictable. Computers are pretty much the definition of predictable: they follow the rules and never make last-minute decisions (unless their programming is faulty), and they can be trained to always err on the side of caution.

          • WorseDoughnut 🍩@lemdro.id · 1 year ago (edited)

            You’re still missing the point. It’s not about how much the drivers around the Tesla should “feel safer” (and they absolutely shouldn’t); it’s about the misguided trust the Tesla driver has in its capability to operate autonomously. Their assumptions about what the car can or will do without the need for human intervention make them an insane risk to everyone around them.

            Also, the vast majority of Tesla owners are weird fanboys who deny every issue and critique; do you really think this is an anecdotal edge case? They wouldn’t be caught dead admitting buyer’s remorse every time their early-access car software messes up. We’re lucky the person in the article was annoyed enough to actually record the incident.

            I would never fully trust a machine to operate a moving vehicle; pretending it’s any less of an unknown is absurd. Anecdotal fanboying about how great the tech “should be” or “will be someday” doesn’t mean anything either.

            • LittleLordLimerick@lemm.ee · 1 year ago

              > Their assumptions about what the car can or will do without the need for human intervention make them an insane risk to everyone around them.

              Do you have statistics to back this up? Are Teslas actually more likely to get into accidents and cause damage/injury compared to a human driver?

              I mean, maybe they are. My point is not that Teslas are safer, only that you can’t determine that from a few videos. People like to post videos of Teslas running a light or getting into an accident, but they don’t prove anything. The criterion for self-driving cars to be allowed on the road shouldn’t be that they’re 100% safe, only that they’re as safe as or safer than human drivers. Because human drivers are really, really bad, and get into accidents all the time.

              • WorseDoughnut 🍩@lemdro.id · 1 year ago

                > The criterion for self-driving cars to be allowed on the road shouldn’t be that they’re 100% safe,

                This is where our complete disconnect is. IMO, when you put something on the road that has the capacity to take control away from the driver, it absolutely needs to be 100% reliable. To me, there is no justifiable percentage of acceptable losses for this kind of thing. It either needs to be fully compliant or not allowed on the road around other drivers at all. Humans being more likely to cause accidents and requiring automated systems not to endanger the lives of those in and around the vehicle are not mutually exclusive concepts.

                • LittleLordLimerick@lemm.ee · 1 year ago

                  If 100% safety is your criterion, then humans shouldn’t be allowed to drive. Humans suck at it. We’re really, really bad at driving. We get in accidents all the time. Tens of thousands of people die every year, and hundreds of thousands are seriously injured. You are holding self-driving cars to standards that human drivers could never hope to meet.