New research shows driverless car software is significantly more accurate at detecting adults and light-skinned people than children and dark-skinned people.

  • Ganondorf@kbin.social · 1 year ago

    a form of whataboutism

    Agreed. The argument that autonomous vehicle perception only needs to match human perception should be completely irrelevant. When an autonomous vehicle has such a significant margin of error, who ends up being responsible for the accident? When humans are involved, the driver is responsible. Is a manufacturer liable for every accident an autonomous vehicle causes? Guaranteed, corporations will rally and lobby to make sure that never happens. The situations aren’t the same, and a huge selling point of autonomous vehicles has always been that they should be the safest form of piloting a vehicle.

    • quirzle@kbin.social · 1 year ago

      When an autonomous vehicle has that significant of a margin of error, who ends up being responsible for the accident?

      There are some details to be sorted out, of course, but this isn’t the major question people make it out to be.

      When humans are involved, the driver is responsible.

      As is the owner, at least in the US. People will remain responsible for their vehicles (and, more relevantly, for insuring them).

      Is a manufacturer liable in the event of all autonomous vehicle caused accidents?

      If it turns out to be a defect, of course they are. They are even when the vehicle has no autonomy. If they take on responsibility for more of the vehicle’s performance, it stands to reason they’ll be responsible for more of the outcomes as well.

      a huge selling point of autonomous vehicles has always been that they should be the safest form of piloting a vehicle.

      Which is exactly why it is relevant to compare their safety to that of human drivers.