beebarfbadger@lemmy.world to Showerthoughts@lemmy.world · 1 year ago

There was a time when the entirety of the internet would have fit onto the device you're currently browsing on.

60 comments
Possibly linux@lemmy.zip · 1 year ago
The Mistral language model is 3.8 GB and has a crazy amount of knowledge.

  [deleted by creator]

    Possibly linux@lemmy.zip · 1 year ago
    I wouldn't say a lot. Llama2 is way worse in my experience. Mistral gives fairly factual information. Regardless, it is still wild that 3.8 GB can go so far.

      trolololol@lemmy.world · 1 year ago
      Then it will pass the Turing test better. It's a feature.