If even half of Intel’s claims are true, this could be a big shake-up in the midrange market, which has been all but abandoned by both Nvidia and AMD.

  • sugar_in_your_tea@sh.itjust.works · 9 points · 9 days ago

    Seems like a decent card, but here are my issues:

    • 12 GB VRAM - not quite enough to play w/ LLMs
    • a little better than my 6650 XT, but not amazingly so
    • $250 - a little better value than the RX 7600 and RTX 4060, I guess? Probably?

    If it offered more RAM (16GB or ideally 24GB) and stayed under $300, I’d be very interested because it would open up LLMs for me. Or if it had a bit better performance than my current GPU and, again, stayed under $300 (any meaningful step-up from AMD or Nvidia is $350+).

    But this is just another low-to-mid-range card, so I guess it would be interesting for new PC builds, but not really an interesting upgrade option. So, pretty big meh to me. I guess I’ll check out independent benchmarks in case there’s something there. I am considering building a PC for my kids using old parts, so I might get this instead of reusing my old GTX 960. The board I’d use only has PCIe 3.0, though, so I worry performance would suffer and the GTX 960 may be the better stop-gap.

    • brucethemoose@lemmy.world · 5 points · edited · 9 days ago

      It’s weird that Intel/AMD seem so disinterested in the LLM self-hosting market. I get it’s not massive, but it seems plenty big enough for niche SKUs like the ones they were making for blockchain, and they’re otherwise tripping over themselves to brand everything with AI.

      • sugar_in_your_tea@sh.itjust.works · 5 points · 9 days ago

        Exactly. Nvidia’s thing is RTX, and Intel/AMD don’t seem interested in chasing that. So their thing could be high memory for LLMs and whatnot. It wouldn’t cost them much (certainly not as much as chasing RTX), and it could grow their market share. Maybe make an extra high-mem SKU with the exact same chip and bump the price a bit.

        • brucethemoose@lemmy.world · 7 points · edited · 9 days ago

          Well, AMD ostensibly won’t do it because they have a high-mem workstation card market to protect, but the running joke is that they only sell about a dozen of those a month, lol.

          Intel literally had nothing to lose, though… I don’t get it. And yes, this would be a very cheap thing for them to try: just a new PCB (and firmware?), which they can absolutely afford.

          • sugar_in_your_tea@sh.itjust.works · 4 points · edited · 8 days ago

            They might not even need a new PCB; they might be able to just double the capacity of the memory chips. So yeah, I don’t understand why they don’t do it; it sounds like a really easy win. It probably wouldn’t add up to a ton of revenue, but it would make for a good publicity stunt, which could help a bit down the road.

            AMD got a bunch of publicity w/ their 3D V-Cache chips, and that cost a lot more than adding a bit more memory to a GPU.

            • brucethemoose@lemmy.world · 1 point · 8 days ago

              Are the double-capacity GDDR6X ICs actually being sold yet? I thought they weren’t, and that “double” cards like the 4060 Ti were using two chips per channel.

              • sugar_in_your_tea@sh.itjust.works · 1 point · 8 days ago

                I’m really not sure, and I don’t think we have a teardown yet either, so we don’t know what capacities they’re running.

                Regardless, whether it’s a new PCB or just swapping chips shouldn’t matter too much for overall costs.
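The arithmetic behind the two options above can be sketched quickly. This is a back-of-the-envelope illustration only (it assumes a 192-bit bus, 32-bit GDDR6 channels, and 2 GB ICs, which matches the card's advertised 12 GB but is not confirmed by a teardown):

```python
# Rough GDDR6 capacity arithmetic (assumed figures, not from a teardown).
bus_width_bits = 192      # assumed total memory bus width
channel_width_bits = 32   # GDDR6 devices present a 32-bit interface
channels = bus_width_bits // channel_width_bits  # -> 6 channels

# One 2 GB (16 Gbit) IC per channel gives the stock capacity.
gb_per_ic = 2
base_capacity_gb = channels * gb_per_ic          # -> 12 GB

# "Clamshell" mode puts two ICs on each channel (as on the
# 16 GB RTX 4060 Ti), doubling capacity without a wider bus
# or denser ICs -- but it needs pads on both sides of the PCB.
clamshell_capacity_gb = 2 * base_capacity_gb     # -> 24 GB

print(channels, base_capacity_gb, clamshell_capacity_gb)  # 6 12 24
```

Either route (denser ICs or clamshell) lands on the same doubled number; the difference is whether the existing PCB already has the second set of pads.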

    • bluemellophone@lemmy.world · 3 points · 8 days ago

      There are some smaller Llama 3.2 models on Ollama that would fit in 12GB. I’ve run some of the smaller Llama 3.1 models in under 10GB on NVIDIA GPUs.
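A rough sketch of why those smaller models fit, assuming ~4-bit quantization (Ollama's common default for these tags) and a loosely estimated flat overhead for KV cache and runtime buffers. The overhead figure is an assumption and grows with context length:

```python
def approx_vram_gb(params_billion, bits_per_weight=4, overhead_gb=1.5):
    """Very rough VRAM estimate: quantized weights plus a flat
    allowance for KV cache and runtime buffers (assumed figure)."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# Llama 3.2 3B at ~4-bit: fits comfortably in 12 GB.
print(approx_vram_gb(3))  # 3.0
# Llama 3.1 8B at ~4-bit: under 10 GB, matching the comment above.
print(approx_vram_gb(8))  # 5.5
```

By the same estimate, a 4-bit 70B-class model needs ~35 GB of weights alone, which is why the thread keeps circling back to 16 GB or 24 GB cards.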