• Lost_My_Mind@lemmy.world
    5 days ago

    Targets for what??? Being brown? They need AI to confirm their racism? Was the old Peter Griffin white/not-white chart not doing it for them anymore???

    • Shady_Shiroe@lemmy.world
      5 days ago

      You can’t be called a racist if a machine decides it for you. Same as how, if a drone does the killing, you can’t get convicted of murder even if you hit the button to fire.

      /s

      • mfed1122@discuss.tchncs.de
        4 days ago

        If I point the gun, the gun evaluates the target, the gun automatically fires - then I’m not even doing anything!

        (I should rewatch psycho-pass…)

    • BrianTheeBiscuiteer@lemmy.world
      5 days ago

      I’m sure that’s what it will be: proprietary and questionable algorithms to identify anyone who might not be a 3+ generation American of European origin.

      • Mirshe@lemmy.world
        5 days ago

        And to shove the justification off onto a black box. Awful hard to put an algorithm on trial.

        • SaneMartigan@lemmy.world
          5 days ago

          They’re gonna fall back on the algo being racist, not the storm troopers with the Nazi tattoos; those guys are just doing what the AI tells them.

    • IratePirate@feddit.org
      5 days ago

      > They need AI to confirm their racism?

      Not confirm, but justify. “Hey, it’s not me — it’s this little machine that says you’re subhuman. I’m just doing my job here, man.”