WhatsApp’s AI shows gun-wielding children when prompted with ‘Palestine’::By contrast, prompts for ‘Israeli’ do not generate images of people wielding guns, even in response to a prompt for ‘Israel army’

    • theyoyomaster@lemmy.world · 11 months ago

      This isn’t anything they actively did, though. The whole point of AI is that it learns on its own and produces its own responses without human intervention. Meta very likely added code specifically to try to prevent this, but it fell short of overcoming the bias in the overwhelming majority of the training content, which led the model to associate Hamas with Palestine.

    • pete_the_cat@lemmy.world · edit-2 · 11 months ago

      I forget if it was on here or Reddit, but I remember seeing an article a week or so ago about Facebook’s translation feature “accidentally” labeling Palestinians as terrorists. I pointed out that Mark is Jewish, and probably so are a lot of the people who work there, and that the US is largely pro-Israel, so it was probably less an accidental bug and more an intentional “fuck Palestine.” I got downvoted to hell and called a conspiracy theorist. I think this confirms I had the right idea.