beebarfbadger@lemmy.world to Showerthoughts@lemmy.world · 3 months ago
There was a time when the entirety of the internet would have fit onto the device you're currently browsing on.
Possibly linux@lemmy.zip · 3 months ago:
The Mistral language model is 3.8 GB and has a crazy amount of knowledge.
It lies a lot
Possibly linux@lemmy.zip:
I wouldn't say a lot. Llama2 is way worse in my experience. Mistral gives fairly factual information. Regardless, it is still wild that 3.8 GB can go so far.
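The 3.8 GB figure is plausible if it refers to a 4-bit-quantized build of Mistral 7B (e.g. a Q4 GGUF file); a rough back-of-the-envelope sketch, where the parameter count and effective bits-per-weight are assumptions, not figures from the thread:

```python
# Sanity check on model file size (assumption: 3.8 GB is a ~4-bit
# quantization of Mistral 7B; numbers below are illustrative).
params = 7.24e9          # approximate Mistral 7B parameter count
bits_per_weight = 4.5    # rough effective rate for a Q4 quantization, incl. overhead
size_gb = params * bits_per_weight / 8 / 1e9
print(f"~{size_gb:.1f} GB")  # lands in the same few-GB range as the quoted 3.8 GB
```

The point of the arithmetic: billions of weights squeezed to a handful of bits each is exactly how "a crazy amount of knowledge" ends up fitting in a file smaller than many video games.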
trolololol@lemmy.world:
Then it will pass the Turing test better. It's a feature.