Safe Streets Rebel’s protest comes after autonomous vehicles were blamed for incidents including crashing into a bus and running over a dog. City officials in June said…

    • @over_clox@lemmy.world (-7 points, 2 years ago)

      Story time…

      I once had a crazy accident driving only 15-20 MPH or so down a side road, when some idiot backed out of his parking spot about 20 feet in front of me.

      Broad daylight, overcast skies, no other vehicles blocking his view even. Dude just backed up without looking, like a freaking idiot.

      I responded in a split second. I did not hit the brakes, as I knew I didn’t have enough time or distance to stop (a rough stopping-distance check is sketched after this comment). If I had hit the brakes, his car would have had more time to back out further and I would have smacked straight into the passenger side of his car.

      Instead of hitting the brakes, I quickly jerked the steering wheel hard and fast to the left. See, I knew an impact was inevitable at that point, so I made that move to clip his bumper instead of smacking into the passenger side and ruining both vehicles.

      Would an AI do that? 🤔
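
      A quick back-of-the-envelope check of the stopping-distance claim above. The 1 second reaction time and ~0.7 g braking deceleration are generic textbook assumptions, not figures from the comment; this is a rough sketch, not an accident reconstruction.

      ```python
      # Back-of-the-envelope stopping distance: reaction travel + braking travel.
      # Assumptions (not from the comment): 1.0 s reaction time, ~0.7 g braking.

      G = 32.2          # gravitational acceleration, ft/s^2
      REACTION_S = 1.0  # seconds before the brakes actually bite
      MU = 0.7          # effective friction coefficient while braking

      def stopping_distance_ft(speed_mph: float) -> float:
          """Distance covered during reaction time plus braking to a stop."""
          v = speed_mph * 5280 / 3600         # mph -> ft/s
          reaction_ft = v * REACTION_S        # travelled before braking starts
          braking_ft = v ** 2 / (2 * MU * G)  # kinematics: v^2 / (2 * a)
          return reaction_ft + braking_ft

      for mph in (15, 20):
          print(f"{mph} mph: ~{stopping_distance_ft(mph):.0f} ft to stop")
      # 15 mph: ~33 ft, 20 mph: ~48 ft -- both well beyond a 20 ft gap,
      # consistent with "not enough time or distance to stop".
      ```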

      • @SCB@lemmy.world (6 points, 2 years ago)

        It’s weird that you think this isn’t the suggested driving practice in such an instance.

        • @over_clox@lemmy.world (-10 points, 2 years ago)

          Autonomous vehicles tend to work on basic sensors and simplified logic. They don’t tend to consider forward momentum or a vehicle pulling out perpendicular in front of you (see the crossing-prediction sketch after this comment).

          I believe half the programmers of autonomous vehicles have never even driven a vehicle in their lives.
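
          For what it’s worth, “considering forward momentum” of a crossing vehicle usually means projecting both vehicles forward in time and checking for overlap, rather than only reacting to what is in the lane right now. Below is a minimal constant-velocity sketch of that idea; the geometry, numbers, and thresholds are illustrative assumptions, not how any particular vendor’s stack actually works. A purely reactive check (“is anything in my lane right now?”) would flag nothing until the other car is already across the path.

          ```python
          # Minimal constant-velocity conflict prediction for a car pulling out
          # perpendicular to our path. Purely illustrative; real AV planners use
          # far richer motion models, but the idea of projecting both tracks
          # forward in time is the same.

          def predicts_conflict(ego_speed_fps: float,
                                gap_ahead_ft: float,
                                other_offset_ft: float,
                                other_speed_fps: float,
                                horizon_s: float = 3.0,
                                dt: float = 0.05,
                                clearance_ft: float = 6.0) -> bool:
              """Project both vehicles forward; flag any predicted overlap.

              Ego drives along +x. The other car starts `other_offset_ft` off our
              path, `gap_ahead_ft` down the road, and backs across our lane.
              """
              t = 0.0
              while t <= horizon_s:
                  ego_x = ego_speed_fps * t                               # where we will be
                  other_lateral = other_offset_ft - other_speed_fps * t   # distance it still is from our lane
                  if abs(ego_x - gap_ahead_ft) < clearance_ft and other_lateral < clearance_ft:
                      return True                                         # both occupy the crossing zone
                  t += dt
              return False

          # ~20 mph ego, a car backing out 20 ft ahead from 8 ft off our path at 5 ft/s:
          print(predicts_conflict(ego_speed_fps=29.3, gap_ahead_ft=20.0,
                                  other_offset_ft=8.0, other_speed_fps=5.0))  # True
          ```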

    • @kewjo@lemmy.world (2 points, 2 years ago)

      Are there actual datasets to look at, and info on how the data was collected? All the sources on that page are just domain links and don’t appear to point to the data behind the claims.

      4.7 accidents per million miles doesn’t mean much if the cars are limited to specific roads, or if the figure includes test tracks that give them an advantage. The variance across environments would also need to be measured: weather effects, road conditions, traffic patterns (a toy illustration of how the aggregate number can mislead follows this comment).

      I’m all for autonomous driving, but it’s not like companies don’t fudge numbers all the time for their benefit.
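
      To illustrate the environment-mix point with made-up numbers (purely hypothetical, not drawn from any real dataset): a fleet that logs most of its miles on easy suburban roads can post a lower overall accidents-per-million-miles figure than human drivers even if it is no better, or worse, in each individual environment.

      ```python
      # Hypothetical accident counts and mileage, split by environment.
      # The aggregate comparison flatters the fleet because its miles skew
      # toward the easy environment (a Simpson's-paradox-style effect).

      fleet = {                    # (accidents, millions of miles)
          "suburban": (6, 4.0),    # rate 1.5 per million miles
          "dense urban": (7, 1.0), # rate 7.0 per million miles
      }
      human = {
          "suburban": (3, 2.0),     # rate 1.5 per million miles
          "dense urban": (18, 3.0), # rate 6.0 per million miles
      }

      def overall_rate(data):
          accidents = sum(a for a, _ in data.values())
          miles = sum(m for _, m in data.values())
          return accidents / miles

      print(f"fleet overall: {overall_rate(fleet):.1f} per million miles")  # 2.6
      print(f"human overall: {overall_rate(human):.1f} per million miles")  # 4.2
      for env in fleet:
          f_a, f_m = fleet[env]
          h_a, h_m = human[env]
          print(f"{env}: fleet {f_a/f_m:.1f} vs human {h_a/h_m:.1f} per million miles")
      # Per environment the fleet is no better (and worse in dense urban),
      # yet its aggregate rate looks much lower than the human aggregate.
      ```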

    • @HedonismB0t@lemmy.ml (36 points, 2 years ago)

      That opinion puts a lot of blind faith in the companies developing self-driving cars and their infinitely altruistic motives.

      • @biddy@feddit.nl (7 points, 2 years ago)

        That wasn’t an opinion, it’s a statistic.

        No (large public) company ever has altruistic motives. They aren’t inherently good or bad, just machines driven by profit.

        • @rtxn@lemmy.world (34 points, 2 years ago)

          It’s not a strawman argument, it is a fact. Without the ability to audit the entire codebase of self-driving cars, there’s no way to know whether the manufacturer has knowingly hidden something in the code that might have caused accidents and fatalities, too numerous to recount but too important to ignore, linked to a fault in the self-driving technology.

          I was actually trying to find an article I’d read about Tesla’s self-driving software reverting to manual control moments before impact, but I was literally flooded by fatality reports.

          • @kep@lemmy.world (11 points, 2 years ago)

            Strawman arguments can be factual. The entire point is that you’re responding to something that wasn’t the argument. You’re putting words in their mouth to defeat them instead of addressing their words at face value. It is the definition of a strawman argument.

          • @vinnymac@lemmy.world (-4 points, 2 years ago)

            It may be true that not every line of code in every self-driving vehicle is available for public audit. But neither is the instruction set of every human who was taught to drive and is on the road today.

            I would hope that through protest and new legislation we will see the industry become safer over time, which is something we will simply never achieve with human drivers.

          • @donalonzo@lemmy.world (3 points, 2 years ago)

            It is most definitely a strawman to frame my comment as considering the companies “infinitely altruistic”, no matter what lies behind the strawman. It doesn’t refute my statistics; it just tries to make it look like I’m making an extremely silly argument that I’m not, which is the definition of a strawman argument.

            • @rambaroo@lemmy.world (7 points, 2 years ago)

              The data you cited comes straight from manufacturers, who’ve repeatedly been shown to lie and cherry-pick their data to intentionally mislead people about driverless car safety.

              So no, it’s not a strawman argument at all to say that you’re putting inordinate faith in manufacturers, because that’s exactly what you did. It’s actually incredible to me how many of you are so irresponsible that you’re not even willing to do basic cross-checking against an industry that is known for blatantly lying about safety issues.

      • @IntoDaLagoon@lemmygrad.ml (0 points, 2 years ago)

        What do you mean, I’m sure the industry whose standard practices include having the self-driving function turn itself off nanoseconds before a crash to avoid liability is totally motivated to spend the time and money it would take to fix the problem. After all, we live in a time of such advanced AI that all the news sites and magazines tell me we’re on the verge of the Singularity, and they’ve never misled me before.

        • Red Wizard 🪄 (-1 point, 2 years ago)

          I feel like I’m taking crazy pills, because no one seems to know or give a shit that Tesla was caught red-handed doing this. They effectively murdered those drivers.

    • @over_clox@lemmy.world (2 points, 2 years ago)

      So…

      Your car is at fault. Their kid is dead.

      Who pays for the funeral?

      Does your insurance cover programming glitches?

      • @HumbertTetere@feddit.de (11 points, 2 years ago)

        If your insurer determines that an autonomous vehicle will cause less damage over time than a human driver, then yes, they will cover it (a toy expected-cost comparison is sketched below).
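
        A toy version of the actuarial comparison an insurer might run. All figures here are invented for illustration; real underwriting involves far more than one expected-cost number.

        ```python
        # Hypothetical expected annual claim cost: claims per million miles
        # * miles driven * average cost per claim. Numbers are invented.

        def expected_annual_cost(claims_per_mmi: float,
                                 annual_miles: float,
                                 avg_claim_cost: float) -> float:
            return claims_per_mmi * (annual_miles / 1_000_000) * avg_claim_cost

        human = expected_annual_cost(claims_per_mmi=4.2, annual_miles=12_000,
                                     avg_claim_cost=18_000)
        autonomous = expected_annual_cost(claims_per_mmi=2.6, annual_miles=12_000,
                                          avg_claim_cost=18_000)

        print(f"human driver:    ~${human:,.0f} per year")       # ~$907
        print(f"autonomous ride: ~${autonomous:,.0f} per year")  # ~$562
        # If (and only if) the insurer trusts the lower claim rate, the premium
        # it needs to charge for the autonomous vehicle is correspondingly lower.
        ```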

        • @over_clox@lemmy.world (-4 points, 2 years ago)

          Autonomous logic doesn’t pay insurance, does it?

          If so, who TF is paying the insurance behind the scenes, and who is responsible?

          • @HumbertTetere@feddit.de (9 points, 2 years ago)

            If so, who TF is paying the insurance behind the scenes

            The owner of the vehicle is probably very openly paying.

            • Flying Squid (1 point, 2 years ago)

              Here’s a question: if you have to agree to terms of service for the vehicle to function, and I’m guessing you would, is it really your vehicle?

              • @over_clox@lemmy.world (-4 points, 2 years ago)

              We’re talking about autonomous vehicles here, no driver, company owned.

              So is Alphabet responsible?

                Do your homework: these vehicles are owned by the parent company of Google and Apple, Alphabet. These vehicles have no private owner. So again, who TF is responsible?

                  • @CoderKat@lemm.ee (3 points, 2 years ago)

                    That’s not a good example. Courts move slowly, that just barely happened, and AFAIK it’s still being investigated (plus, from what I found searching, the participants signed waivers, though waivers don’t give immunity from legal negligence).

                    There are plenty of examples of companies being punished for negligence. It happens all the time: when their poorly constructed building collapses, when cutting corners causes an oil spill in the Gulf of Mexico, when they falsify their vehicle emissions reports, or when they abuse their market dominance.

                    Corporations totally do get away with a lot, but I don’t see why you’d expect self-driving cars to be an area where that happens, especially since manually driven cars are already so heavily regulated and require insurance. And everyone knows that driving is dangerous. Nobody is under the false impression that self-driving cars don’t carry at least some of that same danger. I mean, even if the AI were utterly perfect, they’d still need insurance to cover cases that aren’t the AI’s fault.

              • @Whirlybird@aussie.zone (4 points, 2 years ago)

                these vehicles are owned by the parent company of Google and Apple, Alphabet.

                Alphabet don’t own Apple.

      • @CoderKat@lemm.ee (6 points, 2 years ago)

        I mean, why shouldn’t it? Is a programming glitch in a self-driving car all that different from a mechanical issue in a manually driven car?

        • @over_clox@lemmy.world (3 points, 2 years ago)

          AI-driven cars are just as prone to mechanical issues. Is the AI smart enough to deal with a flat tire? Will it pull over to the side of the road before calling for a mechanic, or will it just ignorantly hard-stop in the middle of the interstate?

          What does the AI do when there’s a police officer directing traffic around an accident or through a faulty red-light intersection? I’ve literally seen videos of this before; the AI couldn’t give two shits about a cop’s orders as to which way to drive the vehicle.