• Feathercrown@lemmy.world

    Hmm, interesting theory. However:

1. We know this is a known issue with language models; it happens all the time with weaker ones. So there is an alternative explanation.

2. LLMs are running at a loss right now; the company would lose more money than it gains from you. So there is no motive.