Grok, the AI chatbot launched by Elon Musk after his takeover of X, unhesitatingly fulfilled a user’s request on Wednesday to generate a bikini image of Renee Nicole Good, the woman shot and killed by an ICE agent that morning in Minneapolis, as noted by CNN correspondent Hadas Gold and confirmed by the chatbot itself.

“I just saw someone request Grok on X put the image of the woman shot by ICE in MN, slumped over in her car, in a bikini. It complied,” Gold wrote on the social media platform on Thursday. “This is where we’re at.”

Grok created the images after an account made the request in reply to a photo of Good slumped in the driver’s seat of her car, unmoving and apparently covered in her own blood. Good was shot multiple times by federal immigration officer Jonathan Ross, who was identified by the Minnesota Star Tribune.

After Grok complied, the account replied, “Never. Deleting. This. App.”

  • horse@feddit.org · 1 day ago

    Surely it should deny any requests to put people in bikinis, regardless of whether they’re dead or not. And based on my attempts to push the limits of various AI models into creating weird stuff (admittedly I haven’t tried recently), getting a refusal is definitely the norm, at least without some creative jailbreaking. I haven’t seen this guy’s prompt, but if he straight up requested “put this woman in a bikini,” it should absolutely refuse. That’s not to let the guy making the request off the hook, but X is clearly to blame too.