• iegod@lemmy.zip · +23/-6 · 2 days ago

    I don’t see how you police/enforce this. The technology is out of the bag, people will find ways to access. Do we need age/location verification for this now too? What if I’m running a local agent? I don’t agree with this.

    • cmnybo@discuss.tchncs.de · +28/-2 · 2 days ago

      The law would allow you to sue whoever is running the chatbot. If you run your own LLM locally and take bad advice from it, then it’s your own fault.

      • iegod@lemmy.zip · +4/-10 · 2 days ago

        Walk me through how a company based and operating outside New York would be subject to any action under this law.

        • altkey (he\him)@lemmy.dbzer0.com · +10 · 2 days ago

          I do agree it’s limited to the small scope of New York-based smaller LLMs, but if you read the news you know exactly why this bill occurred: just now Mamdani gave up on a useless chatbot built with city funds by his predecessor Adams: https://www.thecity.nyc/2026/01/30/mamdani-unusable-ai-chatbot-budget/ It was indeed giving inaccurate legal recommendations on the city’s website. I think the best outcome for this bill would be for it to become a trend across cities and states, since, I suspect, the New York administration wasn’t the only one falling for this scam.

      • how_we_burned@lemmy.zip · +2/-3 · 2 days ago

        So who gets sued? The guy who put the chatbot on the server and runs it, or the developer of the chatbot software themselves?

        Or both?