• ignirtoq@fedia.io
    22 · 5 days ago

    Several years ago I created a Slack bot that ran something like a Jupyter notebook in a container: it would execute Python code you sent to it and respond with the results. It worked in channels you invited it to as well as in private messages, and if you edited the message containing your code, it would edit its response to always match the latest input. It was a fun exercise to learn the Slack API and to build something non-trivial and marginally useful in the Slack environment. I knew the horrible security implications of such a bot, even with the Python environment containerized, and never considered opening it up beyond my own personal use.
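    For illustration, here is a minimal sketch of the execute-and-reply core such a bot needs. This is my own hypothetical reconstruction, not the commenter's actual code: the function name `run_untrusted` is invented, the real bot wrapped this in the Slack API, and it ran the interpreter inside a container, whereas this sketch uses a bare subprocess, which is explicitly NOT a security boundary.

    ```python
    import subprocess
    import sys

    def run_untrusted(code: str, timeout: float = 5.0) -> str:
        """Execute a Python snippet in a separate interpreter process
        and return its combined stdout/stderr, i.e. the text the bot
        would post as its reply.

        NOTE: a plain subprocess stands in for the container here; it
        provides no real isolation from the host.
        """
        try:
            proc = subprocess.run(
                [sys.executable, "-c", code],
                capture_output=True,
                text=True,
                timeout=timeout,  # kill runaway snippets
            )
            return proc.stdout + proc.stderr
        except subprocess.TimeoutExpired:
            return "[timed out]"

    # Example: what the bot would reply with for a simple message.
    print(run_untrusted("print(1 + 1)"))  # -> "2"
    ```

    The "edit your message, bot edits its reply" behavior would then just be re-running this function on the new text and calling Slack's `chat.update` on the original reply instead of posting a new one.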

    Looks like the AI companies have decided that exact architecture is perfectly safe and secure as long as you obfuscate the input pathway by routing it through a chatbot. Brilliant.

  • BaroqueInMind@piefed.social
    16 · 5 days ago

    And so Microsoft decided this wasn’t a big enough vulnerability to pay out a bounty. Why the fuck would you ever share it with them, then, when you could sell it to a black-hat hacking org for thousands?

  • Bubbey@lemmy.world
    2 · 4 days ago

    I’m sure nothing will go wrong with tons of critical business documents being routed through Copilot for organizations…