• Deceptichum@quokk.au
    2 months ago

    Fuck it, I use local LLMs enough, will give this a crack.

    Edit: it’s doing 6 paragraphs in 8.2 seconds, the last model I used was doing like 1 paragraph in 12 seconds. Crazy fast in my experience.

    • yeehaw@lemmy.ca
      2 months ago

      How are they to run, how useful are they, and are there any you can recommend?

      • Deceptichum@quokk.au
        2 months ago

        Dead simple to run. I use Ollama to run local models, and it’s like 3 words to set up from the command line.
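        For anyone curious, the whole setup really is about that short. A sketch of the usual Ollama flow (the model name here is just an example; swap in whichever model you want):

        ```shell
        # Download a model from the Ollama library (example model name)
        ollama pull llama3

        # Start an interactive chat with it in the terminal
        ollama run llama3
        ```

        `ollama run` will also pull the model automatically if you haven’t already, so it can genuinely be a one-liner.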

        Useful is entirely relative. I use mine personally and somewhat professionally, but I only use it to draft text and manually alter it. AI is amazing, but it’s also crap. You gotta work it a bit.

        Umm, as for this model: I’m using the 8B version and it’s fast to generate. Time will tell how good the quality is, but I’m impressed after a few minutes of play.