Pak ‘n’ Save’s Savey Meal-bot cheerfully created unappealing recipes when customers experimented with non-grocery household items

  • mateomaui@reddthat.com · 1 year ago

    A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.

    oh come on, it’s predictable and hilarious

  • 👁️👄👁️@lemm.ee · 1 year ago

    This big AI rush is soon going to figure out that LLMs are horrible at verifying any sort of factual accuracy.

    • Raltoid@lemmy.world · edited 1 year ago

      Part of the problem is that they slap “AI” on everything, and many people think it’s actually intelligent rather than what amounts to an old-school chatbot with more power.

  • mo_ztt ✅@lemmy.world · 1 year ago

    A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”.

    “You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot.”

    I can’t stop laughing

  • PorkRollWobbly@lemmy.ml · 1 year ago

    Another way to look at this is that AI figured out a recipe that would end hunger for the rest of our lives.

  • diffuselight@lemmy.world · 1 year ago

    Nothing to do with AI: garbage in, garbage out.

    LLMs are tools that satisfy requests. The developers decided to allow people to put the ingredients for chlorine gas into the input; the LLM never had any option but to comply with the instructions and combine them into the end product.

    Clear indication we are in the magical witch hunt phase of the hype cycle where people expect the technology to have magical induction capabilities.

    We could discuss liability for the developer but somehow I don’t think a judge would react favorably to “So you put razor blades into your bread mixer and want to sue the developer because they allowed you to put razor blades into the bread mixer”

    • Hobo@lemmy.world · 1 year ago

      I think it was more poking fun at the fact that the developers, not the LLM, basically didn’t do any checks for edible ingredients and just piped the input straight to an LLM. What I find kind of funny is that you could probably have offloaded the input validation to the LLM itself by asking a few specific questions about whether each item was safe for human consumption and/or traditionally edible. Aside from that, it seems like the devs would have had access to a database of food items to check against, since it was developed by a grocery store…

      I do agree, people are trying to shoehorn LLMs into places they really don’t belong. There also seem to be a lot of developers just straight piping input into a custom ChatGPT query and spitting the output back to the user. It really does turn into a garbage-in, garbage-out situation for a lot of those apps.

      On the other hand, I think this might be a somewhat reasonable use for LLMs if you spent a lot of time training it and did even the most cursory input validation. I’m pretty sure it wouldn’t even take a ton of work to avoid completely horrendous results like the “aromatic water mix” or “rat poison sandwich” called out in the article.
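The “most cursory input validation” the comment above describes could be as little as checking each user-supplied item against the store’s own product catalogue before any text reaches the LLM. A minimal sketch of that idea, assuming a hypothetical catalogue and hypothetical function names (none of this reflects Pak ‘n’ Save’s actual implementation):

```python
# Hypothetical pre-LLM ingredient check: reject anything not in a
# known-edible catalogue instead of forwarding it to the model.
# The catalogue contents and function names are illustrative assumptions.

EDIBLE_CATALOGUE = {
    "rice", "chicken breast", "onion", "garlic", "rhubarb",
    "lemon juice", "butter", "flour", "sugar", "eggs",
}

def validate_ingredients(items):
    """Split user input into accepted and rejected ingredients."""
    accepted, rejected = [], []
    for item in items:
        normalized = item.strip().lower()
        if normalized in EDIBLE_CATALOGUE:
            accepted.append(normalized)
        else:
            rejected.append(normalized)
    return accepted, rejected

def build_recipe_prompt(items):
    """Return a recipe prompt for the LLM, or None if any item fails."""
    accepted, rejected = validate_ingredients(items)
    if rejected:
        return None  # refuse instead of forwarding bleach, ammonia, etc.
    return "Suggest a recipe using only: " + ", ".join(accepted)
```

With a check like this, `build_recipe_prompt(["bleach", "ammonia"])` returns `None` and the chlorine-gas “recipe” never gets generated; a grocery chain presumably already has the product database to back such an allowlist.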

  • gerryflap@feddit.nl · 1 year ago

    This is actually hilarious, but unfortunately we can’t have stuff like this, because at least one person will lack common sense and actually die from making something like this

      • gerryflap@feddit.nl · 1 year ago

        I guess we don’t really. Though personally I could have a lot of fun entering some weird combinations of ingredients and then cooking whatever it comes up with (as long as it’s safe to eat ofc). As I said, it’s funny and maybe sometimes useful. But it’s probably better for the world if they stop doing this

      • exscape@kbin.social · 1 year ago

        Because it might work if you don’t enter that your leftover ingredients are bleach and ammonia.

        • thrawn@lemmy.world · 1 year ago

          If it can’t discern bleach from ingredients, the actual recipes are probably quite bad too. Another example recipe in the article suggests it’s just throwing any ingredient you give it into a random recipe, substituting in literally anything indiscriminately. So it doesn’t work for the purpose of generating useful recipes.

          Fun automated madlibs game tho

  • fakeman_pretendname@feddit.uk · 1 year ago

    This thing (saveymeal-bot.co.nz) is hilarious. I think I could genuinely use it to finish up leftovers and things that are about to go off, but for right now it’s given me “boiling water poured over toasted bread, inspired by contemporary dance” and “weetabix and oatmeal with toothpaste and soap”. Fun for now, but I might use it for real at dinner time.

    • samus12345@lemmy.world · 1 year ago

      “One 18.25 ounce package chocolate cake mix.”

      “One can prepared coconut pecan frosting.”

      “Three slash four cup vegetable oil.”

      “Four large eggs. One cup semi-sweet chocolate chips.”

      “Three slash four cups butter or margarine.”

      “One and two third cups granulated sugar.”

      “Two cups all purpose flour.”

      “Don’t forget garnishes such as:”

      “Fish shaped crackers.”

      “Fish shaped candies.”

      “Fish shaped solid waste.”

      “Fish shaped dirt.”

      “Fish shaped ethyl benzene.”

      “Pull and peel licorice.”

      “Fish shaped volatile organic compounds and sediment shaped sediment.”

      “Candy coated peanut butter pieces. Shaped like fish.”

      “One cup lemon juice.”

      “Alpha resins.”

      “Unsaturated polyester resin.”

      “Fiberglass surface resins.”

      “And volatile malted milk impoundments.”

      “Nine large egg yolks.”

      “Twelve medium geosynthetic membranes.”

      “One cup granulated sugar.”

      “An entry called ‘how to kill someone with your bare hands’.”

      “Two cups rhubarb, sliced.”

      “Two slash three cups granulated rhubarb.”

      “One tablespoon all-purpose rhubarb.”

      “One teaspoon grated orange rhubarb.”

      “Three tablespoons rhubarb, on fire.”

      “One large rhubarb.”

      “One cross borehole electro-magnetic imaging rhubarb.”

      “Two tablespoons rhubarb juice.”

      “Adjustable aluminum head positioner.”

      “Slaughter electric needle injector.”

      “Cordless electric needle injector.”

      “Injector needle driver.”

      “Injector needle gun.”

      “Cranial caps.”

      “And it contains proven preservatives, deep penetration agents, and gas and odor control chemicals.”

      “That will deodorize and preserve putrid tissue.”

      • kent_eh@lemmy.ca · 1 year ago

        At that point a suit of armor walks in and slaps the commenter with a rubber chicken.

  • FaceDeer@kbin.social · 1 year ago

    Upon asking an AI to make recipes with poisonous ingredients, the AI generated recipes with poisonous ingredients.

    Shocking! Put that headline up!