

I’m running a 4B model on one of my machines, an old Surface Book 1.
It’s a brutal machine: heat issues, and the GPU doesn’t work in Linux. But pick a minimal enough model and it’s good enough for me to have LLM access in my Nextcloud if for some reason I wanted it.
The biggest constraint really seems to be memory: most cheaper GPUs don’t have enough VRAM to run a big model, and CPUs are dreadfully slow on larger models even if you can put enough RAM in one of them.
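To make the memory point concrete, here’s a back-of-envelope sketch of how much memory a model needs at different quantization levels. The 20% overhead figure for KV cache and activations is my own rough assumption, not a measured number:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 0.20) -> float:
    """Rough memory estimate for running a local LLM.

    The overhead fraction (KV cache, activations, runtime buffers)
    is an assumed ballpark, not a measured figure.
    """
    # 1B params at 8 bits/weight is roughly 1 GB of weights.
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * (1 + overhead)

for bits in (16, 8, 4):
    print(f"4B model @ {bits}-bit: ~{model_memory_gb(4, bits):.1f} GB")
```

At 4-bit quantization a 4B model fits comfortably in the RAM of even an old laptop, which is why minimal models are viable on hardware like this while anything much bigger isn’t.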






Insurance in particular is something I really feel regulators should be looking into. During the pandemic especially, people were driving a lot less and staying in their homes more, so there should have been fewer robberies and less catastrophic destruction, yet premiums soared.
It was a long time ago now, but my predictions also ended up playing out in the data: insurers were paying out far less and still cranked up premiums.