Hmm, not meaning to get my conspiracy hat on here, but do we think this could relate to the fact that Microsoft now has a quantum computing chip they can hype to their investors, showing they have the next big thing in the bag?
AI has served its purpose and is no longer strategically necessary?
Since they’re only spending investors’ money, it doesn’t matter if they burn billions leading the industry down the wrong path. Now they can let it rot on the vine and rake in the next round of funding while the competition scrambles to catch up.
WYEA? 🌝🤡
AI is a tool, like a hammer. Useful when used for its purpose. Unfortunately, every tech company under the sun is using it for the wrong fucking thing. I don’t need AI in my operating system or my browser or my search engine. Just let it work on protein folding, chemical synthesis, and other more useful applications. Honestly can’t wait for the AI hype to calm the fuck down.
You forgot mass surveillance. It’s great at that.
The only way it’s going to die down is if it gets replaced with the next tech bro buzzword.
The previous one was “smart”, and it stuck around for a very long time.
Blockchain was one
VR/AR
Preach it. I have been so sick of AI hype, rolling my eyes any time a business advertises it and in some cases just moving on. I don’t care about your glorified chat bot or search engine.
Yeah I mean, when has Microsoft of all companies ever been wrong about the future of technology…
Hmmm, let me just pull this up in Internet Explorer on my Windows Phone.
Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.
Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.
I’m gonna disagree - it’s not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI since it allows you to scale even further.
This means that MS isn’t expecting these data centers to generate enough revenue to be profitable, and they’re not willing to bet on further advancements that might make them profitable. In other words, MS doesn’t have a positive outlook for AI.
Exactly. If AI were to scale like the people at OpenAI hoped, they would be celebrating like crazy, because their scaling goal was literally infinity. Seriously, the plan OpenAI had a year ago was to scale their AI compute into the biggest energy consumer in the world, with many dedicated nuclear power plants just for their data centers. That means if they don’t grab onto any and every opportunity for more energy, they have lost faith in their original plan.
> More efficient hardware use should be amazing for AI since it allows you to scale even further.
If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave you RTX 5090 performance on GTX 1080-class hardware, would you still buy a new video card this year?
When all the Telcos scaled back on building fiber in 2000, that was because they didn’t have a positive outlook for the Internet?
Or when video game companies went bankrupt in the 1980s, it was because video games were over as entertainment?
There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that’s the reason AI is over.
> If a new driver came out that gave you RTX 5090 performance on GTX 1080-class hardware, would you still buy a new video card this year?
It doesn’t make any sense to compare games and AI. Games have a well-defined upper bound for performance. Even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it should continually improve it.
So: yes, in your analogy, MS would still buy a new video card this year if they believed in the progress being possible and reasonably likely.
If buying a new video card made me money, yes.
This doesn’t really work, because the goal when you buy a video card isn’t to have the most processing power possible, and playing video games doesn’t scale linearly, so having an additional card doesn’t add anything.
If I were mining crypto or selling GPU compute (which is basically what AI companies are doing), and the existing card got an update that made it perform on par with new cards, I would buy out the existing cards, and when there were no more, I would buy up the newer cards. Both would still be generating revenue.
> If buying a new video card made me money, yes.
But that rests on the supposition that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once the software improvements stop.
And that’s assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy: MS may just be putting off spending because businesses will be slowing down due to the tariff war.
> Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient.
Sorry, but that makes no sense in multiple ways.

First of all, single mode fiber provides orders of magnitude higher capacity than multi mode.

Secondly, the modal patterns depend on the physics of the cable, specifically its core diameter. Single mode fiber has a 9 micrometer core, multi mode 50 or 62.5 micrometers. So you can’t change the light modes on existing fiber.

Thirdly, multi mode fiber existed first, so it couldn’t be the improvement. And single mode fiber was already becoming the way forward for long distance transmission in 1982, and the first transatlantic cable using it was laid in 1988. So it couldn’t be the improvement of 2000 either.

You must mean something else entirely.
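If anyone wants to sanity check the core-diameter point themselves: whether a fiber is single mode comes down to the normalized frequency V = (2πa/λ)·NA staying below ~2.405. Here’s a quick Python sketch using typical datasheet-style values; the NA figures are my own assumptions, and note that the ~9 µm number usually quoted for single mode fiber is the mode field diameter, the actual core is closer to 8.2 µm:

```python
import math

V_CUTOFF = 2.405  # below this, only the fundamental mode propagates

def v_number(core_diameter_um: float, wavelength_um: float, na: float) -> float:
    """Normalized frequency V = (2*pi*a / lambda) * NA, where a is the core radius."""
    return 2 * math.pi * (core_diameter_um / 2) / wavelength_um * na

def report(name: str, core_um: float, wl_um: float, na: float) -> None:
    v = v_number(core_um, wl_um, na)
    if v < V_CUTOFF:
        print(f"{name}: V = {v:.2f} -> single mode")
    else:
        # step-index mode count is roughly V^2 / 2
        print(f"{name}: V = {v:.2f} -> ~{v * v / 2:.0f} modes")

# typical datasheet-style values (my assumptions, not exact specs)
report("SMF (8.2 um core, 1310 nm, NA ~0.12)", 8.2, 1.31, 0.12)
report("MMF (50 um core, 850 nm, NA ~0.20)", 50.0, 0.85, 0.20)
```

The point being: mode count is baked into the glass geometry, so no upgrade at the endpoints turns a multi mode run into a single mode one.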
I think they conflated multimode with DWDM. Dense wavelength-division multiplexing was what made already-laid fiber massively more efficient around 2000, by running many wavelengths down a single strand.
yeah, genai as a technology and field of study may not disappear. genai as an overinflated product marketed as the be-all-end-all that would solve all of humanity’s problems may. the bubble can’t burst soon enough
Exactly. It’s not as if this tech is going in the dumpster, but all of these companies basing their multi-trillion-dollar market cap on it are in for a rude awakening. Kinda like how the 2008 housing market crash didn’t mean that people no longer owned homes, but we all felt the effects of it.
Yeah, you echo my thoughts actually: that the efficiency gains could be found in multiple areas, DeepSeek included, and that perhaps some other political things are a bit more uncertain too.
There’s been talk for a while that “AI” has reached a point where merely scaling up compute power is yielding diminishing returns; perhaps Microsoft agrees with that assessment.
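For a rough picture of what “diminishing returns” means here: the Chinchilla paper (Hoffmann et al., 2022) fit training loss as L(N, D) = E + A/N^α + B/D^β. Plugging in their published coefficients plus the usual compute-optimal rules of thumb (C ≈ 6ND, D ≈ 20N), each extra 10x of compute buys a smaller loss improvement than the last. A back-of-the-envelope sketch, not anything from Microsoft:

```python
# Chinchilla-style loss fit, coefficients from Hoffmann et al. (2022).
# The 20-tokens-per-parameter split and C ~= 6*N*D are standard approximations.
E, A, B, ALPHA, BETA = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    """L(N, D) = E + A/N^alpha + B/D^beta."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

prev = None
for exp in range(21, 26):          # compute budgets 1e21 .. 1e25 FLOPs
    c = 10.0 ** exp
    n = (c / 120) ** 0.5           # from C = 6*N*D with D = 20*N
    d = 20 * n
    cur = loss(n, d)
    delta = "" if prev is None else f"  (improvement: {prev - cur:.3f})"
    print(f"C = 1e{exp} FLOPs: N = {n:.2e} params, loss = {cur:.3f}{delta}")
    prev = cur
```

The improvement per decade of compute keeps shrinking (~0.19, then 0.13, 0.09, 0.07), which is the curve people are gesturing at when they say pure scale-up is running out of steam.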
My guess is that, given Lemmy’s software developer demographic, I’m not the only person here who is close to this space and these players.
From what I’m seeing in my day to day work, MS is still aggressively dedicated to AI internally.
Why would a company or government use Azure or Windows if MS is compromising it with AI?
Pick a lane
That’s compatible with a lack of faith in profitable growth opportunity.
So far they have gone big with what I’d characterize as more evolutionary enhancements to tech. While that may find some acceptance, it’s not worth quite enough to pay off the capital investment in this generation of compute. If they overinvest and hope to eventually recoup by not upgrading, they are at severe risk of being superseded by another company that saved some expenditure to build a more modest, but more up-to-date, compute infrastructure.
Another possibility is that they predicted a huge boom of other companies spending on Azure hosting for AI stuff, and they are now predicting those companies won’t have that growth either.
I am sure the internal stakeholders of Micro$oft’s AI strategies will be the very last to know. Probably as they are instructed to clean out their desks.
There are a few of us here who are closer to Satya’s strategic roadmap than you might think.
I’m sure, but they’re not going to hedge on a roadmap. Roadmaps are always full-steam-ahead.
I think DeepSeek shook them enough to realize what should have been obvious for a while… brute force doesn’t beat new techniques, and spending the most might not be the safest bet.
There are a ton of new techniques being developed all the time to do things more efficiently, and if you don’t need a crazy context window, in many use cases you can get away with much smaller models that don’t need massive datacenters.
Because investors expect it, whether it generates profit or not. I guess we will see how it changes workflows, or whether people continue to do things like they always have.
Same. Big tech is still going whole hog on generative AI.
Context is king, which is why even the biggest models get tied in knots when I try them on my niche coding problems. I’ve been playing a bit with NotebookLM, which promises to be interesting with enough reference material, but unfortunately when I tried to add the Vulkan specs it complained it couldn’t accept them (copyright, maybe?).
We have recently been given clearance to use the Gemini Pro tools with Google’s office suite at work. While we are still not using them for code generation, I have found the transcription and meeting summary tools very useful, and certainly a time saver.
“Microsoft stopped building AI data center infrastructure, therefore Microsoft signals that there’s not enough demand” is a valid point in itself, but not enough to merit a blog post that’s this long.
I’m getting the impression that minor fame and success went to Ed Zitron’s head, because he now constantly brags about those word counts and other pretentious shit on Bluesky.
Really? That’s disappointing.
I was hoping for more credible people to be pointing out the wank that is generative AI.
I don’t buy that. The article covers a lot more than the cancellation of data center leases.
It also talks about Stargate, SoftBank, and a lot of other related elements that point to the original premise as the correct one - M$ is aware they’re not going to get the bike for Christmas.
I had a feeling this was coming.