![](https://lemmy.world/pictrs/image/63faf4de-6dc1-41b7-801e-b30548bec9cd.png)
![](https://lemmy.world/pictrs/image/eb9cfeb5-4eb5-4b1b-a75c-8d9e04c3f856.png)
On the website you can zoom.
If the internet isn’t fun for you, find a community on the internet that you actually enjoy being in. Easier said than done, I know, but the internet is a big place.
Yeah, I wouldn’t really say his channel has gotten objectively worse, just that he has realized that 90% of the people who watch it are children and has aimed the channel at them. I just have to accept that I’m no longer the target audience.
It’s hard to tell if the YouTubers are getting more annoying or if it’s my tastes that are changing. Probably both, tbh.
Yeah, it’s funny to see xkcd.com/792 from 2010, back when Google wasn’t evil.
I stopped being a Musk supporter when I found out he had a twitter.
I still think all of his companies are doing (to varying extents) genuinely cool and useful things, but in recent years Musk has shown himself to be a terrible manager (not to mention a terrible person).
Because that’s expensive and can be done with a camera.
Expensive, as in probably less than $600? Compared to the $35,000 cost of a Tesla?
(Comparing the prices of the iPhone 12 (without lidar) and the iPhone 12 Pro (with lidar), we can guess that the sensor probably costs less than $200, so three of them (for left, right, and front) would probably cost less than $600.)
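A minimal back-of-the-envelope sketch of that estimate. The US launch prices ($799 for the iPhone 12, $999 for the iPhone 12 Pro) are my assumption, not from the comment above:

```python
# Back-of-the-envelope lidar cost estimate, as a fraction of a Tesla's price.
# Assumed (not from the comment above): iPhone 12 launched at $799 and the
# iPhone 12 Pro at $999, so their price gap bounds the lidar module's cost.

iphone_12 = 799       # USD, no lidar
iphone_12_pro = 999   # USD, with lidar

sensor_cost = iphone_12_pro - iphone_12   # <= $200 per sensor
num_sensors = 3                           # left, right, and front

total_cost = sensor_cost * num_sensors    # <= $600
tesla_price = 35_000                      # USD, from the comment above

print(f"Estimated lidar cost: <= ${total_cost}")
print(f"Fraction of Tesla price: {total_cost / tesla_price:.1%}")  # 1.7%
```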
Lidar can actually be very cheap and small. Unfortunately, Apple bought the only company that seems to make sensors like that (aside from some other, super-high-end models).
There have been a lot of promising research papers on the technology lately, though, so I expect more, higher-resolution, and cheaper lidar sensors to be available relatively soon (probably within the next couple of years).
Depends, I think, but definitely within the same order of magnitude.
Edit: this makes me wonder, is it possible to get an orthographic projection with an ordinarily sized (but maybe not standard) lens on a normal camera?
Their CPUs are actually really good now, at least when apps are optimized for them. Especially in single-core, they are very competitive with top Intel or AMD chips while being way more power efficient.
For example, in Geekbench 5.1 single-core the M2 Max scores 1967 points (85%) compared to 2311 points for the 7950X3D and 2369 for the 14900K. The M2 Max (12 cores (8 P + 4 E), 12 threads) can draw a maximum of about 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts and the 14900K (24 cores (8 P + 16 E), 32 threads) can draw around 350 watts.
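A quick sketch that turns the figures above into relative score and rough points-per-watt. Note that the power numbers are maximum package draw, not what a single-core run actually pulls, so the efficiency ratios are only illustrative:

```python
# Compare the M2 Max against the other chips using only the Geekbench 5.1
# single-core scores and maximum power draws quoted above. Max package power
# is not single-core draw, so points-per-watt here is illustrative only.

m2_score, m2_watts = 1967, 36
others = {
    "7950X3D": (2311, 250),
    "14900K": (2369, 350),
}

for name, (score, watts) in others.items():
    rel_score = m2_score / score                              # 85% vs the 7950X3D
    rel_efficiency = (m2_score / m2_watts) / (score / watts)  # ~6-8x
    print(f"M2 Max vs {name}: {rel_score:.0%} of its score, "
          f"{rel_efficiency:.1f}x its points per watt")
```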
Apple’s GPUs are definitely lacking in terms of performance, though.
Time machines don’t exist and (as far as we know) cannot exist. Therefore, we can say they work however we want. If you can travel back in time, surely you can do that while remaining close to an arbitrary point of reference.
Birds are dinosaurs in the way tomatoes are fruits.