What an odd thing to say…

  • pedz@lemmy.ca · 2 days ago

    Not that odd. Deaths by car are easily accepted by society. They are “accidents” and a “necessary evil” for society to function.

    There’s around a million people dying from cars every year and we just shrug and normalize them. Human or not, we just have to have cars and “accidents” are just that.

    According to the World Health Organization (WHO), road traffic injuries caused an estimated 1.35 million deaths worldwide in 2016. That works out to roughly one person killed every 23 seconds on average.
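
    A quick back-of-the-envelope check of that rate (a minimal sketch, assuming the 1.35 million figure is spread evenly over one year):

    ```python
    # Rough sanity check: average interval between road deaths,
    # assuming 1.35 million deaths spread evenly over one year.
    deaths_per_year = 1.35e6
    seconds_per_year = 365 * 24 * 60 * 60  # 31,536,000

    interval = seconds_per_year / deaths_per_year
    print(f"one death every {interval:.0f} seconds on average")  # ~23 seconds
    ```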

    Nobody cares about cars killing people and animals. So she’s probably right.

    • AwesomeLowlander@sh.itjust.works · 2 days ago

      More so when you take her actual statement in context: that these cars are reducing deaths by being safer than human drivers. The comments on Lemmy are turning out to be just as biased and ungrounded in reality as they were on Reddit.

      Waymo robotaxis are so safe that, according to the company’s data, its driverless vehicles are involved in 91 percent fewer crashes compared to human-operated vehicles.

      And yet the company is bracing for the first time a Waymo does kill somebody, a moment its CEO says society will accept in exchange for access to its relatively safer driverless cars.
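
      To make the quoted figure concrete, here is a small illustrative sketch; the human-driver baseline rate below is a made-up number for illustration only, not something from the article or from Waymo’s data:

      ```python
      # Illustrative only: what "91% fewer crashes" implies relative to a
      # hypothetical human-driver baseline (the baseline value is made up).
      human_crashes_per_million_miles = 4.0  # hypothetical baseline, not real data
      reduction = 0.91                       # the figure quoted from Waymo's data

      waymo_rate = human_crashes_per_million_miles * (1 - reduction)
      print(f"{waymo_rate:.2f} crashes per million miles")  # 0.36, about 1/11 of the baseline
      ```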

      • pedz@lemmy.ca · 2 days ago

        However, I’m pretty sure that a standard transit system, one not made up of single cars that can only transport one or two people at a time and spy on them, is also much safer.

      • P03 Locke@lemmy.dbzer0.com · 2 days ago

        Waymo robotaxis are so safe that, according to the company’s data, its driverless vehicles are involved in 91 percent fewer crashes compared to human-operated vehicles.

        Wow, you think the “company’s data” is a trustworthy source? Where are your critical thinking skills?

          • P03 Locke@lemmy.dbzer0.com · 2 days ago

            If the data is falsified that’d be illegal.

            Oh no! It would be illegal!

            And what would be the punishment if it was found out that they released illegal data? A fine that could amount to hundreds of thousands of dollars? On top of their tens of millions of dollars of profits?

            Do you have a reason to think otherwise?

            Yes, they are directly incentivized to either push their data in a biased direction or outright falsify their numbers, in order to facilitate the marketing strategy of these taxis being a “safe” technology, and increase their profit margin.

            Fuck… have we learned nothing from the tobacco industry?!

    • P03 Locke@lemmy.dbzer0.com · 2 days ago

      There’s around a million people dying from cars every year and we just shrug and normalize them. Human or not, we just have to have cars and “accidents” are just that.

      The difference is accountability. If a human kills another human because of a car accident, they are liable, even criminally liable, given the right circumstances. If a driverless car kills another human because of a car accident, you’re presented with a lose-lose scenario, depending on the legal implementation:

      1. If the car manufacturer says that somebody must be behind the wheel, even though the car is doing all of the driving, that person is suddenly liable for the accident. They are expected to just sit there and watch for a potential accident, but what the AI model will do is undefined. Is the model going to stop in front of that pedestrian as expected? How long do they wait before taking back control? It’s not like cruise control, a feature that only controls part of the car, where they know exactly how it behaves and when to take over. It’s the equivalent of asking a person to watch a panel with a single red light for an hour, and push a button as fast as possible when it blinks for half a second.

      2. If the model is truly driverless (like these taxis), then NOBODY is liable for the accident. The company behind it might get sued, or might end up in a class-action lawsuit, but there is no criminal liability, and none of these lawsuits will result in enough financial impact to facilitate change. The companies have no incentive to fix their software, and will continue to parrot this shitty line about how it’s somehow better than humans at driving, despite these easily hackable scenarios and zero accountability.

      Humans have an incentive to not kill people, since nobody wants to have that on their conscience, and nobody wants to go to prison over it.

      Corporations don’t. In fact, they have an incentive to kill people over profits, if the choice presents itself!

    • Buffalox@lemmy.world · 2 days ago

      Nobody cares about cars killing people and animals.

      I think that’s overstating it a bit. Of course many care, and we have people who are responsible for setting safety standards.
      Just because accidents are unavoidable doesn’t mean we aren’t trying to minimize them and avoid fatalities.

      Mandatory seat belts are an example of this. Beyond that, there are actual scientific studies into road safety, and even city-wide implementations of their findings. At least in Europe there are, but I’m guessing the USA has them too.

      Just because traffic accidents happen, and we obviously need “traffic” to be able to move around, doesn’t mean nobody cares.

      As an anecdotal example, here in Denmark the speed limit was increased from 110 to 130 km/h on our equivalent of the Autobahn, which may seem like accepting more accidents for convenience or efficiency. But in reality it was done to divert more traffic to the safer “Autobahn” and actually reduce the number of accidents on smaller roads.

      Traffic safety is as much about psychology as it is about making safer systems.

      PPS:
      Regarding the animals, we have just had warnings about deer, and some places have small tunnels made for frogs.
      And there are warning signs where deer tend to cross in almost any country that has them.

    • But_my_mom_says_im_cool@lemmy.world · 2 days ago

      Self-driving cars will have far fewer accidents and deaths than human-driven cars. But the idea of being killed by human error is acceptable to us, while the idea of a machine fucking up and killing us is terrifying, even if one self-driving accident will lead to algorithm changes that avoid the same incident on all cars, whereas human error can happen over and over in the same situation.