Asset Download Overhaul #737
Replies: 3 comments 4 replies
-
This is great news! As you know, the download speed issues, both in-game and with the repair option, have been making it difficult for me to convince friends to give Uru a try here in Europe. Some added security also just makes sense (and I had already expected it to be using HTTPS). I mentioned it briefly in IRC, but I'll go into more detail here: I would consider adding an optional feature for region-specific servers. While a better protocol would solve many of the issues, it's not unforeseeable that bandwidth costs as well as latency might play enough of a role that server admins may wish to offer one (or several) servers per region. Specifically, this would allow the client (by choice of the user or automatically) to fetch from a server near the user.

Another option is to use a single update URL that redirects the user based on GeoIP lookup or something similar. Personally, I have mixed experiences with these kinds of GeoIP methods, especially when IPs are moved between RIPE and ARIN regions, which is not that uncommon with the decreasing number of IPv4 blocks available. Beyond that, I think the user should have some way of influencing the server they download from, either from the interface or a config file.

It's also relevant to note that S3 buckets are always region-specific. I couldn't immediately find a way on AWS to distribute files to several regions where the right region is automatically detected when someone wants to fetch data. I think it would make the most sense for the server to announce/distribute a list of servers with names that state the region, and for some simple latency and/or speed tests to pick the right one for the user if no explicit choice/override is in place. I would perhaps also consider making it possible to fetch multiple files at the same time, or to combine them into one stream, so fewer requests are required when fetching all files for one or all Ages.

I think the option to use a separate service from the game server itself, using REST, makes the most sense. There really is no connection between asset fetching and the other aspects of the game, so in my opinion these should be separate.
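To illustrate the "announce a list, let the client probe it" idea, here is a rough sketch of how the client side could time a small request against each announced mirror and keep the fastest one. The URLs and function names are made up, and this assumes the client keeps using libcurl:

```cpp
// Rough sketch only: probe each announced mirror with a small HEAD request
// and keep the fastest responder. All URLs below are placeholders.
#include <curl/curl.h>
#include <cstdio>
#include <limits>
#include <string>
#include <vector>

static double ProbeMirror(const std::string& url)
{
    CURL* curl = curl_easy_init();
    if (!curl)
        return std::numeric_limits<double>::infinity();

    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);       // HEAD only, no body transfer
    curl_easy_setopt(curl, CURLOPT_TIMEOUT, 5L);      // skip dead or unreachable mirrors
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);

    double elapsed = std::numeric_limits<double>::infinity();
    if (curl_easy_perform(curl) == CURLE_OK)
        curl_easy_getinfo(curl, CURLINFO_TOTAL_TIME, &elapsed);

    curl_easy_cleanup(curl);
    return elapsed;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);

    // Hypothetical list the game server might announce.
    const std::vector<std::string> mirrors = {
        "https://eu.assets.example.org/ping",
        "https://us.assets.example.org/ping",
        "https://ap.assets.example.org/ping",
    };

    std::string best;
    double bestTime = std::numeric_limits<double>::infinity();
    for (const auto& mirror : mirrors) {
        double t = ProbeMirror(mirror);
        if (t < bestTime) {
            bestTime = t;
            best = mirror;
        }
    }

    std::printf("Preferred mirror: %s (%.3f s)\n", best.c_str(), bestTime);
    curl_global_cleanup();
    return 0;
}
```

A user-facing override (config file or UI option) would simply skip the probing and use the chosen URL directly.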
-
Do we actually need a microservice providing an API here, or could the "API" be a static JSON manifest file listing the URIs and hashes? That could be generated once by a tool, and then served from the same storage system as the files themselves. The most reliable API is one with no moving parts.
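For illustration only, such a manifest might look something like this; the layout, field names, and hostnames are guesses rather than an existing format:

```json
{
  "generated": "2023-01-01T00:00:00Z",
  "files": [
    {
      "path": "dat/Cleft.prp",
      "uri": "https://assets.example.org/dat/Cleft.prp.gz",
      "size": 1482133,
      "hash": "sha256:<hex digest>"
    },
    {
      "path": "sfx/CleftWind.ogg",
      "uri": "https://assets.example.org/sfx/CleftWind.ogg",
      "size": 203511,
      "hash": "sha256:<hex digest>"
    }
  ]
}
```

The patcher would then fetch this one file over HTTPS, compare hashes against what is on disk, and download only the entries that differ.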
-
I'm still not ready to write any code, but this has been floating around in my mind for a few days now. We currently use curl for all HTTP interaction, and it seems to me that we will continue to do so, even in this asset download overhaul. Currently, assets are generally downloaded in the gzip container (zlib compression). Curl seems to natively support HTTP responses compressed with brotli, gzip, and zstd. Would we like to rely on curl and the download server to negotiate a compressed transfer, or continue to serve explicitly compressed assets (e.g. gzipped files)? On nginx, it's possible to, for example, create
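On the curl side, opting in to whatever encodings the library was built with is a single option. A minimal sketch, assuming we stay on libcurl (the function name here is just for illustration):

```cpp
// Sketch: let libcurl negotiate Content-Encoding (br/gzip/zstd/...) and
// decompress the response transparently before handing us the bytes.
#include <curl/curl.h>
#include <cstdio>

bool DownloadAsset(const char* url, FILE* out)
{
    CURL* curl = curl_easy_init();
    if (!curl)
        return false;

    curl_easy_setopt(curl, CURLOPT_URL, url);
    // Empty string = "offer every encoding this curl build supports".
    curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, "");
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);   // default write callback writes to the FILE*
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);

    CURLcode res = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return res == CURLE_OK;
}
```

That still leaves the server-side question of whether to compress on the fly or keep serving files that were compressed once at upload time.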
-
Problem
Currently, Plasma downloads all files, including executable code, from the Plasma file server, which is the only connection in the protocol that is unencrypted. The architecture of the protocol and the lack of the modified DH exchange pose two main problems:
Further, all downloads are based on a single "named" manifest download. For Ages, this is not problematic; however, once we begin offering multiple clients (e.g. Windows 64-bit, Windows 32-bit, macOS, Linux), each combination would require its own named manifest.
Proposal
To solve these issues, I propose that all downloads be served over HTTPS using standard web services. This could be either a self-hosted nginx or cloud storage such as an Amazon S3 bucket. These services provide transport layer security and avoid the MOUL protocol design issues that may impact download speed.
To acquire the download URI and hash, there are a few options:
IMO, the REST method would be more flexible and future-proof.
Regardless of which method is selected, there should be separate ways to request "Age" and "code" assets. For the former, a simple dataset (formerly "manifest") name should suffice. For code assets, the request would need more detail, such as the dataset, OS, and architecture at a bare minimum; a rough example follows below. If a more unified approach is desired, a "universal" option could be offered for those categories.
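To make that concrete, a code-asset request and response could look roughly like this; the path, query parameters, and field names are purely illustrative, not a proposed final API:

```http
GET /api/v1/assets?dataset=stable&os=windows&arch=x64 HTTP/1.1
Host: assets.example.org

HTTP/1.1 200 OK
Content-Type: application/json

{
  "files": [
    {
      "name": "plClient.exe",
      "uri": "https://assets.example.org/code/stable/windows-x64/plClient.exe.gz",
      "size": 12345678,
      "hash": "sha256:<hex digest>"
    }
  ]
}
```

An Age request would be the same shape with just the dataset name, and the static-manifest approach discussed above could return the exact same JSON from a pre-generated file instead of a service.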
These are mostly just thoughts at the moment, because this means rewriting the patcher (again). Input is certainly welcome 😄