Pak ’n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

  • diffuselight@lemmy.world · 1 year ago

    Nothing to do with AI: garbage in, garbage out.

    LLMs are tools that satisfy requests. The developers decided to allow people to put the ingredients for chlorine gas into the input; the LLM never stood a chance but to comply with the instruction to combine them into an end product.

    A clear indication we are in the witch-hunt phase of the hype cycle, where people expect the technology to have magical induction capabilities.

    We could discuss liability for the developer, but somehow I don’t think a judge would react favorably to “So you put razor blades into your bread mixer, and now you want to sue the developer because they allowed you to put razor blades into the bread mixer.”

    • Hobo@lemmy.world · 1 year ago

      I think it was more poking fun at the fact that the developers, not the LLM, basically didn’t do any checks for edible ingredients and just piped user input straight to the LLM. What I find kind of funny is that you could probably have offloaded the input validation to the LLM itself by asking a few specific questions about whether each item is safe for human consumption and/or traditionally edible. Aside from that, you’d think the devs would have access to a database of food items to check against, since the app was developed by a grocery store… A rough sketch of that kind of check is below.
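
      Something like this would be enough; it’s purely illustrative, and EDIBLE_ITEMS, call_llm, and suggest_recipe are made-up names standing in for the store’s catalogue and whatever LLM API the real app uses, not anything from the actual Savey Meal-bot:

      ```python
      # Hypothetical sketch: check ingredients against the store's own
      # product list before anything reaches the LLM.

      EDIBLE_ITEMS = {"chicken breast", "rice", "broccoli", "soy sauce"}

      def call_llm(prompt: str) -> str:
          """Stand-in for whatever LLM API the real app calls."""
          raise NotImplementedError

      def suggest_recipe(ingredients: list[str]) -> str:
          # Reject anything that isn't a known food item before prompting.
          rejected = [i for i in ingredients if i.lower() not in EDIBLE_ITEMS]
          if rejected:
              return "Sorry, these aren't food items we stock: " + ", ".join(rejected)
          return call_llm("Suggest a short recipe using only: " + ", ".join(ingredients))
      ```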

      I do agree that people are trying to shoehorn LLMs into places they really don’t belong. There also seem to be a lot of developers just piping user input straight into a ChatGPT prompt and spitting the output back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.

      On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory input validation. I’m pretty sure it wouldn’t take a ton of work to avoid completely horrendous results like the “aromatic water mix” or the “rat poison sandwich” called out in the article; the LLM itself could even do the screening, as in the sketch below.
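
      For instance, a minimal sketch of that LLM-side screening, with call_llm again a stand-in rather than any real API, and the prompt wording my own invention:

      ```python
      # Hypothetical sketch: ask the LLM a cheap yes/no safety question per
      # item before ever building the recipe prompt.

      def call_llm(prompt: str) -> str:
          """Stand-in for whatever LLM API the real app calls."""
          raise NotImplementedError

      def is_plausibly_edible(item: str) -> bool:
          answer = call_llm(
              f"Answer strictly YES or NO: is '{item}' a food item "
              "that is safe for human consumption?"
          )
          return answer.strip().upper().startswith("YES")

      def guarded_recipe(ingredients: list[str]) -> str:
          # Screen every item first; refuse rather than improvise.
          bad = [i for i in ingredients if not is_plausibly_edible(i)]
          if bad:
              return "Refusing to build a recipe around: " + ", ".join(bad)
          return call_llm("Write a short recipe using: " + ", ".join(ingredients))
      ```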