Veteran journalists Nicholas Gage, 84, and Nicholas Basbanes, 81, who live near each other in the same Massachusetts town, have each devoted decades to reporting, writing and authoring books.

Gage poured his tragic family story and search for the truth about his mother’s death into a bestselling memoir that led John Malkovich to play him in the 1985 film “Eleni.” Basbanes turned his skills as a daily newspaper reporter to writing widely read books about literary culture.

Basbanes was the first of the duo to try fiddling with AI chatbots, finding them impressive but prone to falsehoods and a lack of attribution. The friends commiserated and filed their lawsuit earlier this year, seeking to represent a class of writers whose copyrighted work they allege “has been systematically pilfered by” OpenAI and its business partner Microsoft.

“It’s highway robbery,” Gage said in an interview in his office next to the 18th-century farmhouse where he lives in central Massachusetts.

  • MagicShel@programming.dev

    As someone who uses AI all the time to write fiction just for my own entertainment, I can say AI in no way replaces actual authors, because while it might be technically capable, it’s garbage at big-picture stuff. No theme or plot or foreshadowing that spans more than a handful of pages.

    AI cannot do the craft of writing no matter how good it is at prose.

    Not that there aren’t valid concerns and all, but I think this is a fading fad.

    • FireTower@lemmy.world

      > Not that there aren’t valid concerns and all, but I think this is a fading fad.

      I’m worried authors are 1920s horses. Sure, those cars seem unreliable and impractical now, but we can’t see around the corner. The least they deserve is compensation for their works being used without a proper license.

    • subignition@fedia.io

      With ever-growing context windows, I have a feeling it will only be a matter of time before this tech forces us to adapt. ChatGPT-4o is somewhat intimidating already, though I haven’t used it as extensively as you have.

      But at the same time, I really would prefer to be wrong about that.

      • MagicShel@programming.dev

        I’ve used it a fair bit. The extra context helps with things like getting facts straight, but it doesn’t help with coming back to themes or the things that really make a story hit, you know? Even with the extra context, I still find the stories get worse and worse as they get longer.

        I do think a skilled author (better than me - I’m not awful, but I’m no professional) could get better output, but that still doesn’t cut the author out of the loop.

        • FaceDeer@fedia.io

          That sort of thing can be handled by the framework outside of the AI’s literal context window. I did some tinkering with some automated story-writing stuff a while back, just to get some experience with LLM APIs, and even working with an AI that had only a few thousand tokens’ context I was able to get some pretty decent large-scale story structure. The key is to have the AI work in the same way that some human authors do; have them first generate an outline for the story, write up some character biographies, do revisions of those things, and only once a bunch of that stuff is done should it start writing actual prose.
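          For illustration, a minimal sketch of that outline-first approach might look like the following. This isn’t FaceDeer’s actual code; call_llm() just wraps whatever chat-style LLM API is available (the OpenAI Python client is used here as a stand-in), and the model name and prompts are placeholders.

```python
# Minimal sketch (not production code): plan first, then write prose chapter
# by chapter, carrying a short running summary so a small context window can
# still "remember" the big picture.
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

def call_llm(prompt: str) -> str:
    """Send one prompt to the model and return its reply text."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-completion model works
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def write_story(premise: str, num_chapters: int = 5) -> str:
    # 1. Plan before prose: outline and character bios, then a revision pass.
    outline = call_llm(f"Write a chapter-by-chapter outline for this premise:\n{premise}")
    bios = call_llm(f"Write short biographies for the main characters in this outline:\n{outline}")
    outline = call_llm(
        f"Revise this outline for pacing and recurring themes.\nOutline:\n{outline}\nCharacters:\n{bios}"
    )

    # 2. Only now generate prose, one chapter at a time.
    chapters, summary = [], "Nothing has happened yet."
    for i in range(1, num_chapters + 1):
        chapter = call_llm(
            f"Outline:\n{outline}\nCharacters:\n{bios}\n"
            f"Story so far (summary):\n{summary}\n"
            f"Write chapter {i} in full prose."
        )
        chapters.append(chapter)
        # 3. Compress what just happened so later chapters stay consistent.
        summary = call_llm(f"Summarize the story so far in a few sentences:\n{summary}\n{chapter}")
    return "\n\n".join(chapters)
```

          The same shape works with any client; the point is just that the structure (outline, bios, running summary) lives outside the prose-generation step rather than inside one long prompt.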

          • MagicShel@programming.dev

            I’m familiar with that. Not in quite that way, since our app is for roleplaying, where there isn’t a prewritten story, but we do use a database to pull relevant info into context. You can definitely help it, but you need author chops to do it well.

            Which means maybe this is a tool that could help good writers write faster, but it won’t make a poor writer into a good one, if only because you need to know how to steer and correct the output.
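            As a rough illustration of the “pull relevant info into context” idea (a generic sketch, not their app; the lore entries and keyword matching are made up for the example):

```python
# Generic sketch: keyword-match stored "lore" against the latest player
# message and prepend the hits to the prompt, so the model sees relevant
# facts without needing the whole history in its context window.
LORE = [
    {"keywords": {"tavern", "innkeeper"}, "fact": "The innkeeper secretly works for the thieves' guild."},
    {"keywords": {"sword", "blade"}, "fact": "The player's sword is cursed and whispers at night."},
]

def retrieve(message: str, limit: int = 3) -> list[str]:
    """Return up to `limit` stored facts whose keywords appear in the message."""
    words = set(message.lower().split())
    return [entry["fact"] for entry in LORE if entry["keywords"] & words][:limit]

def build_prompt(history: str, player_message: str) -> str:
    facts = "\n".join(f"- {fact}" for fact in retrieve(player_message))
    return (
        f"Relevant facts:\n{facts}\n\n"
        f"Conversation so far:\n{history}\n\n"
        f"Player: {player_message}\nGame master:"
    )
```

            A real app would likely use embeddings or a proper database query rather than keyword sets, but the shape is the same: retrieve, then stuff the hits into the prompt.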

    • Shiggles@sh.itjust.works

      “AI” will probably get there someday, but I agree the tech is nowhere near that point yet. Calling what we have now “intelligence” is quite a stretch at best.

    • kaosof@lemmy.world

      These models can’t write satisfyingly/convincingly enough yet.

      But they will.

      • MagicShel@programming.dev

        I’ve been using AI for about five years, and I understand fairly well what these models can and can’t do. I think you are wrong, and I would bet money on it. They can’t reason or plan, no matter how much context or training you give them, because that’s not what they do at all. They predict the next word; that’s all.

        • criitz@reddthat.com

          I would bet against this. It’s not that hard to imagine machine learning being able to digest and reproduce plot-level architecture, and then handing the wording off to an LLM…

          • MagicShel@programming.dev

            I mean, AI can produce a plot, but the real craft - like the hero’s journey, or a theme that comes back again and again in subplots, things like that. Humor. Irony and satire. Pacing - OMG pacing. It’s just not very good at those things.

            If you want to write a Dick and Jane book with AI writing and art, yeah, probably. But something like Asimov or Heinlein (or much less well-known authors who nevertheless know their craft)? I don’t think an AI will ever be able to speak to the human spirit that way.

            Even at the most low-brow level, I can generate AI porn, but it’s never as good as art created by humans.

            • subignition@fedia.io

              I wonder if the bigger concern isn’t AI being able to imitate good writers, but rather it being able to imitate poor ones.

        • FaceDeer@fedia.io

          Have them predict what a reasonable plan would look like. Then they can start working from that.

    • Boozilla@lemmy.world

      I mostly agree with you, but I don’t think it’s a fading fad. There was way too much AI hype, way too early. However, it gets gradually but noticeably better with each new release. It’s been a game changer for my coworkers and me at work.

      Our merciless greedy overlords will always choose software over human employees whenever they can. Software doesn’t sleep, take breaks, call out sick, etc. Right now it makes too many mistakes. That will change.

      • MagicShel@programming.dev

        Fair enough. I’ve spent enough words making my point and anything else would be redundant. Time will tell. Probably within a couple of years - whenever venture capital gets antsy for actual results/profits instead of promising leads.

    • anti-idpol action@programming.dev

      Yes, even for technical writing it’s absolute shit. I once stumbled upon a book about PostgreSQL with repetitive summaries and a generally algorithmic, article-like pattern on literally every page.