• skip0110@lemm.ee · 4 days ago

    This is not new knowledge and predates the current LLM fad.

    See the Hutter Prize, which has had “machine learning”-based compressors leading its ranking for some time: http://prize.hutter1.net/

    It’s important to note that when the model is used as a compressor, it produces a code (i.e., an encoding) that exactly reproduces the input. But on a different input, the same model is unlikely to achieve an impressive compression ratio.
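
    To make that concrete, here is a minimal sketch (not the Hutter Prize machinery) of the idea: a probability model plus arithmetic coding yields a code that reconstructs the input exactly, but the achievable size depends entirely on how well the model fits that particular input. The order-0 character model and the sample strings are my own illustrative assumptions.

    ```python
    from collections import Counter
    from fractions import Fraction
    import math

    def build_model(text):
        """Static order-0 model: symbol -> exact probability."""
        counts = Counter(text)
        total = sum(counts.values())
        return {s: Fraction(c, total) for s, c in counts.items()}

    def intervals(model):
        """Cumulative intervals [lo, hi) per symbol, in a fixed order."""
        lo, table = Fraction(0), {}
        for s in sorted(model):
            table[s] = (lo, lo + model[s])
            lo += model[s]
        return table

    def encode(text, model):
        """Narrow [low, low+width) per symbol; return a point in the final interval."""
        table = intervals(model)
        low, width = Fraction(0), Fraction(1)
        for s in text:
            a, b = table[s]
            low, width = low + width * a, width * (b - a)
        # Any value in [low, low+width) identifies the message; ~ -log2(width) bits.
        return low, math.ceil(-math.log2(width))

    def decode(code, length, model):
        """Replay the interval narrowing to recover the exact input."""
        table = intervals(model)
        out = []
        for _ in range(length):
            for s, (a, b) in table.items():
                if a <= code < b:
                    out.append(s)
                    code = (code - a) / (b - a)  # rescale into the chosen interval
                    break
        return "".join(out)

    text = "abracadabra alakazam abracadabra"
    model = build_model(text)
    code, bits = encode(text, model)
    assert decode(code, len(text), model) == text  # exact, lossless reconstruction
    print(f"{len(text)} symbols -> ~{bits} bits with a model fit to this text")

    other = "the quick brown fox"  # a different input
    # Symbols missing from the model cannot be coded at all; shared ones code poorly.
    print("symbols the model cannot encode:", set(other) - set(model))
    ```

    The same principle scales up: the better the model predicts the data, the shorter the arithmetic code, which is why strong predictive models dominate compression benchmarks like the Hutter Prize.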