DeeDeeG's February 2024 refactor of the Download microservice (complementary to PR 132) #133

Merged: 4 commits into download/refactor on Feb 3, 2024

Conversation

@DeeDeeG (Member) commented Feb 2, 2024

This one is my changes, which were originally implemented independently on another branch (main...DeeDeeG:package-frontend:download-refactor-plus-more-error-handling) (current tip of branch: DeeDeeG@007e7a2), but are now re-done on top of PR 132, since those changes are cool too!!

All stemmed from some comments in (late, sorry!) review of #130 and #131.

Two people independently thought of ways to address that. This PR now targets my work onto confused's work (PR 132) to super-saiyan-fuse them together, or something.

See also: #132 which this is targeting to be merged into directly.

@confused-Techie (Member) left a comment

Overall I love these changes!

The safety added by them is fantastic, and especially invaluable.

The one thing I will comment on is that I'm going back and forth on the max length check of the queryParams. While this is 100% valid, it does feel brittle, since it could break when new parameters are supported. Although, at the same time, these parameters haven't changed at all since the implementation of this microservice, so it may be a total moot point.

If you think we can rely on the existing parameters not being changed too much, then let's go right ahead and merge this one into my branch, since besides that I don't have a single concern about the contents of the PR.

@DeeDeeG (Member, Author) commented Feb 3, 2024

tl;dr: we can maybe raise the limit to 100 or 250 characters? Or drop the limit altogether? Whichever you prefer.

My feeling on the length is, we own the length check and the set of valid parameters in a couple of files contained in this microservice, so we can update the max length allowed at the same time we add any longer valid combo of parameters... if we remember to update it.

That said, maybe we'd forget to update it. (I know I might easily forget to, now that you mention it.)

My motivation was to stop us from spending a tiny amount of extra resources trying to parse some ridiculous megabytes-long set of query parameters -- I'm unsure how the app would respond to that; I guess I can test it relatively easily...

(Also as a fairly weak guard against unexpected behavior when duplicate or bogus arguments are provided, but it doesn't really do much for that, TBH.)

On the other hand, we could pretty easily raise this to, say, 100 characters and have it do mostly the same job. It just means we'd be sorting through up to ~25 bogus params max per request, instead of ~9. I imagine this is blazingly fast ™️ regardless. (Heck, I bet parsing through 10,000 bogus params is pretty darn fast.)

Anyway, how's raising it to 100 or 250 characters as a compromise? That gives us nearly 3x (100) or nearly 7x (250) the wiggle room for future params to be added. Or we could just drop the length limit. Whichever you prefer, I can easily go with.
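(For reference, here's a minimal sketch of the kind of early length guard under discussion, with the limit set to the proposed 100 characters. The constant name MAX_QUERY_LENGTH, the helper, and the exact placement are illustrative assumptions, not the actual code in this microservice.)

```js
// Illustrative sketch only: reject obviously oversized query strings early,
// before spending any effort parsing individual parameters.
const MAX_QUERY_LENGTH = 100; // hypothetical constant; proposed compromise value

function queryTooLong(req) {
  // In Node's http server, req.url is the path plus query, e.g. "/?os=linux&type=linux_deb"
  const queryStart = req.url.indexOf("?");
  const queryLength = queryStart === -1 ? 0 : req.url.length - queryStart - 1;
  return queryLength > MAX_QUERY_LENGTH;
}
```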

@DeeDeeG (Member, Author) commented Feb 3, 2024

Research:

% curl --verbose "localhost:8080?os=linux`printf '&p=v%.0s' {1..1000}`&type=linux_deb"
[ . . . ]
< HTTP/1.1 307 Temporary Redirect
< Location: https://github.com/pulsar-edit/pulsar-rolling-releases/releases/download/v1.113.0-dev/pulsar_1.113.2024012906_amd64.deb
[ . . . ]

^ Repeating a parameter &p=v 1,000 times is okay if we're not imposing a (for example) 36-character limit on the query string.


% curl --verbose "localhost:8080?os=linux`printf '&p=v%.0s' {1..10000}`&type=linux_deb"
[ . . . ]
< HTTP/1.1 431 Request Header Fields Too Large
< Connection: close

^ Node or its HTTP module apparently has its own limits here, at least by default; I get an error message. Repeating the parameter &p=v 10,000 times gets the request rejected by Node's HTTP module by default.


% curl --verbose "localhost:8080?p=v`printf '&p=v%.0s' {1..4082}`"
[ . . . ]
< HTTP/1.1 307 Temporary Redirect
< Location: https://github.com/pulsar-edit/pulsar-rolling-releases/releases/download/v1.113.0-dev/pulsar_1.113.2024012906_amd64.deb
[ . . . ]

Upon further testing, I can repeat the &p=v parameter up to 4082 times at most, before getting the "Request Header Fields Too Large" error from NodeJS itself. This is around 16,328 characters.

% printf '&p=v%.0s' {1..4082} | wc -c
   16328

EDIT: Yeah, this is a NodeJS thing specifically to combat DDoSes, which is what I was going for here. Context (see answers here): https://stackoverflow.com/questions/24167656/nodejs-max-header-size-in-http-request
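(That ~16,328-character limit lines up with Node's default maximum total header size of 16 KB on recent Node versions. For completeness, here's a hedged sketch of how that limit could be adjusted per server if it ever mattered; we have no need to change it here, and the exact default varies by Node version.)

```js
// Sketch: Node's HTTP parser caps total request header size (16 KB by default
// on recent Node versions, which matches the ~16,328-character finding above).
const http = require("http");

const server = http.createServer(
  { maxHeaderSize: 32 * 1024 }, // double the default; purely illustrative
  (req, res) => {
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("ok\n");
  }
);

server.listen(8080);
// The same limit can also be raised process-wide with the --max-http-header-size
// CLI flag when launching Node.
```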

@confused-Techie (Member)

@DeeDeeG This is some amazing research here. I'm seriously impressed.

Also, I love seeing that there's an established standard for simply denying requests with overly large headers.

So with that said, I'd be totally up for just giving us some wiggle room -- just enough that forgetting to adjust this once or twice doesn't cause any issues, which I think a 100-character limit would be perfect for. Honestly, I bet we could give ourselves appropriate wiggle room with a limit of even 50. I just would hate for it to break immediately on a single parameter change.

Neither risks much DDoS exposure, and this lets us update parameters in the future without having to remember this limit is here and update the limit as well.

Prevents us from accidentally breaking the microservice, although the error messages if this got in our way would be extremely specific!

But not much downside to increasing this a bit, either.
@DeeDeeG (Member, Author) commented Feb 3, 2024

Increased it to 100 for now. 👍


Slightly unrelated, but there's an error code or two that bug me.

```js
if (!params.os || !params.type) {
  await utils.displayError(req, res, {
    code: 503,
    msg: "Missing Required Download Parameters"
  });
  console.log("Download Returned 503 due to missing os or type");
  return;
}
```

It's sending 503 when "Required Download Parameters [are] Missing or Invalid", whereas I think this should be code 400 or similar.

See: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status#server_error_responses for standard meanings of these codes.

If people use certain software, like curl, that understands the standard meanings of these status codes, they may see a message like < HTTP/1.1 503 Service Unavailable when hitting this error, even though we don't send that string: curl natively understands status code 503 to mean "Service Unavailable".

400 would be "400 Bad Request", which is more accurate for the user supplying wrong parameters, IMO.
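(In other words, roughly this; only the status code and the log line change relative to the snippet above.)

```js
if (!params.os || !params.type) {
  await utils.displayError(req, res, {
    code: 400, // "Bad Request": the client supplied missing or invalid parameters
    msg: "Missing Required Download Parameters"
  });
  console.log("Download Returned 400 due to missing os or type");
  return;
}
```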


EDIT to add:

Also, in utils.js, we use error code 505 as a sort of "catch-all" error for findLink() (literally the content of the catch {} block), whereas the standard meaning of 505 is "505 HTTP Version Not Supported". I believe we should just return status code 500, since that's the most applicable to our situation while not carrying some separate meaning per the standards.

```js
} catch(err) {
  console.log(err);
  return {
    ok: false,
    code: 505,
    msg: "Server Error"
  };
}
```
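(So the catch block would end up roughly like this, using the more specific message that was ultimately adopted, per the follow-up comment below.)

```js
} catch(err) {
  console.log(err);
  return {
    ok: false,
    code: 500, // plain "Internal Server Error"; no conflicting standard meaning
    msg: "Server Error While Finding Link"
  };
}
```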

@confused-Techie (Member)

@DeeDeeG That's a valid point. Feel free to make that change as well, then we can go ahead and merge this one! Thanks for catching the weird HTTP status being returned, and for going with the suggestion of enlarging the accepted URI length.

Also, give an error that's now a plain '500' a more specific message.
@DeeDeeG (Member, Author) commented Feb 3, 2024

I updated the codes, and I felt a bit bad about turning a 505 (easy to search for in the code) into a plain 500 with the generic message "Server Error", so I made the message more specific: "Server Error While Finding Link", since it serves as the catch-all (in the catch {} block) for the findLink() function.

As a bonus, seeing capital "L" "Link" reminds me of Legend of Zelda, so that's kinda neat, arguably.

But anyway, this should be good to go now, probably!

@confused-Techie (Member) left a comment
This is beautiful @DeeDeeG!
Thanks a ton for contributing and helping improve this microservice!

Which is still one of my favourites due to our work to make it have zero dependencies. Idk why that fact just tickles my brain in this world of NodeJS and a million deps.

But I think everything looks good now; just one final look and let's get it merged into my branch.


Which, btw, I do love the Zelda tidbit in here, and the fact that we can now test this microservice without any auth, which is a really good thing.

@confused-Techie confused-Techie merged commit 4ae8954 into download/refactor Feb 3, 2024
DeeDeeG pushed a commit that referenced this pull request on Feb 3, 2024: DeeDeeG's February 2024 refactor of the Download microservice (complementary to PR 132)