I wasn't trying to imply anything other than that the response about the servers not having current copies yet is one that needs to be put to bed. If, after two full days, your servers don't have the most recent files, then it is quite possibly the worst hosting service I've heard of.
And that's where you completely missed the whole explanation. OUR servers ARE perfectly updated, to the second. The issue is that you are NEVER connected directly to them!
If you could really connect directly, the server would crash within a couple of minutes of release, if not less. In that case, the result would have been that *nobody* would be able to download *anything*, and the moment we restarted the server, it would crash again. And you wouldn't even be able to come here to get help, because the forum would have been down together with the server.
To make something like this work, we MUST use a CDN (Content Delivery Network). We use Cloudflare, which is by far the most popular. They have about 260 nodes worldwide, so users connect to the one closest to their location. Each node likely has hundreds if not thousands of servers, so it's a fairly complex system. It has also become so ubiquitous that, when they have issues, millions of sites are affected at the same time, as if the sky were falling, the end of the world. The last one happened very recently, in June:
https://www.theverge.com/2022/6/21/23176519/cloudflare-outage-june-2022-discord-shopify-fitbit-peleton

Cloudflare's job is to REPLICATE files from individual hosts onto its own servers, but it's not as if it just "copies" the whole server at once; that would be highly inefficient. It downloads a file from our server as soon as the first user asks to download it: Cloudflare checks whether it already has the file and, if it doesn't, or only has an older version, it downloads the current version from our site, serves *that* to the user, and stores the file until it changes again. When the next user asks for that file, Cloudflare sees the file hasn't changed, so it just serves the stored copy without downloading it again from us.
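If it helps to visualize it, here is a minimal sketch of that check-then-serve logic, the way a single edge node might apply it, assuming standard HTTP revalidation (If-Modified-Since / 304 Not Modified). This is an illustration of the general caching idea, not Cloudflare's actual internals, and ORIGIN is just a placeholder:

```python
import urllib.error
import urllib.request

ORIGIN = "https://www.example-origin.com"  # placeholder for our own server
cache = {}  # path -> (last_modified_header, file_bytes)

def serve(path: str) -> bytes:
    """Serve one file roughly the way a single edge node would."""
    cached = cache.get(path)
    req = urllib.request.Request(ORIGIN + path)
    if cached:
        # Ask the origin only whether the file changed since we stored it.
        req.add_header("If-Modified-Since", cached[0])
    try:
        with urllib.request.urlopen(req) as resp:
            # Cache miss, or the file changed: download the current version,
            # store it, and serve it to this user.
            body = resp.read()
            cache[path] = (resp.headers.get("Last-Modified", ""), body)
            return body
    except urllib.error.HTTPError as err:
        if err.code == 304 and cached:
            # Origin says "not modified": serve the stored copy, no download.
            return cached[1]
        raise
```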
This happens automatically; we don't have much control over which node gets what. The only thing we can do is ask for a forced "purge" of a particular file, so Cloudflare will simply remove it, and the next user asking for it is guaranteed to get it from our server, because Cloudflare simply doesn't have it anymore.
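For reference, this is roughly what such a purge request looks like against Cloudflare's public purge_cache API endpoint. It's a minimal sketch, where ZONE_ID, API_TOKEN, and the example URL are placeholders, not our real setup:

```python
import json
import urllib.request

ZONE_ID = "your_zone_id"      # placeholder, not a real zone
API_TOKEN = "your_api_token"  # placeholder, not a real token

def purge(file_url: str) -> None:
    """Ask Cloudflare to drop one cached file from its edge caches."""
    req = urllib.request.Request(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
        data=json.dumps({"files": [file_url]}).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode())  # the response includes a success flag

# e.g. purge("https://www.example.com/gsx/changed_file")
```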
Obviously, we have a fully automated and tested system that, as soon as we change a file, sends a request to purge it from Cloudflare's cache. So the challenges are:
- The purge command *itself* must be replicated. If I send a purge command to Cloudflare to remove a specific file from the cache, that command must be replicated on ALL Cloudflare servers, removing the file from all of them, so that regardless of where the next user asking for that file comes from, every node behaves the same way and downloads the file from our server, ready to serve it directly to the users after that, and so on.
- When many users are downloading, the servers will likely process the purge commands with some delay.
- GSX is very large, over 25,000 files, and all of them, eventually, must arrive on all 260 nodes. That also means purges can't go out as a single request; they have to be issued in batches (a rough sketch follows this list).
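To give an idea of the scale, here's a hedged sketch of what batch-purging a file set that large might look like. The 30-URLs-per-call limit is my reading of Cloudflare's purge-by-URL docs and may change, so treat it as an assumption; all identifiers are placeholders as before:

```python
import json
import urllib.request

ZONE_ID = "your_zone_id"      # placeholder
API_TOKEN = "your_api_token"  # placeholder
BATCH_SIZE = 30  # assumed per-call URL limit; check Cloudflare's current docs

def purge_batch(urls: list[str]) -> None:
    """Purge one batch of URLs in a single API call."""
    req = urllib.request.Request(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
        data=json.dumps({"files": urls}).encode(),
        headers={"Authorization": f"Bearer {API_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req).close()

def purge_all(changed_files: list[str]) -> None:
    # 25,000+ files means hundreds of purge calls, and every single purge
    # must then propagate to all ~260 nodes before each user is guaranteed
    # the new version: that is where the release-day delay comes from.
    for i in range(0, len(changed_files), BATCH_SIZE):
        purge_batch(changed_files[i:i + BATCH_SIZE])
```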
So yes, the server issues are real. Or, I should have said: the challenges are real, of releasing a popular product with a large number of files over a CDN with 260 nodes, which is the only way to distribute something like this in a way that won't choke a single origin server, no matter how powerful it is.