IMHO, data centers need to be somewhat close to important population areas to ensure low latency.
You need a spot with attainable land, room to scale, proximity to users, and decent infrastructure for power and connectivity. You can't just plop one down in the middle of BFE.
> need to be somewhat close to important population areas
They really don't. I live in regional Australia - the nearest data center is 1300 miles away, and it's perfectly fine. I work in tech, and we had a small data center (50 servers) in our office with a data-center-grade fibre link - we got rid of it because it was a waste of money. Even the latency difference between 1300 miles and 20 feet wasn't worth the cost.
To be clear, having 0.1 ms of latency was noticeable for some things, but nothing that really matters - and certainly not for AI, where you're often waiting 5 seconds or even a full minute.
For the majority of applications you need data centers for, latency just doesn't matter. Bandwidth, storage space, and energy costs, for example, are all generally far more important.
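A rough sketch of why that is (assuming light in fibre travels at about two-thirds of c, and ignoring routing and switching overhead entirely - the real numbers are my own back-of-envelope figures, not from any source):

    # Minimum propagation delay from distance, assuming light in fibre
    # moves at roughly two-thirds the speed of light in vacuum. Real
    # latency is higher once routing, switching, and serialization are added.

    C_VACUUM_KM_S = 299_792      # speed of light in vacuum, km/s
    FIBRE_FACTOR = 2 / 3         # typical slowdown from the glass's refractive index
    MILES_TO_KM = 1.609344

    def one_way_latency_ms(miles: float) -> float:
        """Theoretical one-way propagation delay over fibre, in milliseconds."""
        return miles * MILES_TO_KM / (C_VACUUM_KM_S * FIBRE_FACTOR) * 1000

    for label, miles in [("20 feet", 20 / 5280), ("1300 miles", 1300)]:
        ms = one_way_latency_ms(miles)
        print(f"{label:>10}: {ms:.6f} ms one way, {2 * ms:.6f} ms round trip")

That puts 1300 miles at a theoretical floor of roughly 21 ms round trip - imperceptible for most workloads, and a rounding error next to a multi-second AI response.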
I remember reading a story about an email server that couldn't send emails farther than about 500 miles. After a lot of digging, they traced it to a timeout that had been reset to zero, which in practice gave connections only about 3 ms to complete - so any server more than roughly 500 miles away took too long to respond and got rejected.
In case anyone wants to read that: https://www.ibiblio.org/harris/500milemail.html
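The back-of-envelope math in that write-up is easy to verify (the ~3 ms effective timeout is the figure the story itself arrives at):

    # Sanity check on the "500-mile email": with an effective ~3 ms
    # connect timeout, mail could only reach servers within roughly the
    # distance light travels in 3 ms (in vacuum; fibre slowdown and
    # routing overhead shrink the real radius somewhat).

    C_MILES_PER_SEC = 186_282    # speed of light in vacuum, miles per second
    timeout_s = 0.003            # effective connect timeout from the story

    print(f"~{C_MILES_PER_SEC * timeout_s:.0f} miles")   # -> ~559 miles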