Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • gamer@lemm.ee
    2 years ago

    I remember reading about the ethical question posed by the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question with no right answer, but it must be answered by anybody implementing a self-driving car.

    I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.

    • CmdrShepard@lemmy.one
      2 years ago

      I think the whole premise is flawed, because the car would have to suffer numerous failures before ever reaching a point where it needs to make this decision. The dilemma applies to humans because we have free will; a computer does not.