Is your feature request related to a problem? Please describe.
By accident, we submitted duplicate upload jobs twice this week: once through the website (we're not sure how this happened) and once via HTTP request (we ran the same request twice).
We then weren't able to cancel the duplicate jobs - for one of them, we didn't even realize it was running until Jon alerted us.
In the end, running two uploads of the same data simultaneously didn't cause a problem: the second operation on each file must have seen that the data already existed and skipped it. However, a duplicate run of the sorting capsule was started after upload, and it would have been wasteful had Jon not canceled it.
I can't think of a case where someone would want to upload the same session multiple times simultaneously, so I propose that the server prevent this from happening.
Describe the solution you'd like
Before allowing a new upload job to be submitted, check that there isn't already an upload job in progress for that session.
If there is a reason to allow multiple uploads with the same session ID, then compare the csv/job upload parameters instead and reject only exact duplicates.
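The check described above could be sketched roughly as follows. This is a minimal illustration, not the service's actual code: the `UploadJob` shape and the status names (`"pending"`, `"running"`) are assumptions.

```python
# Sketch of the proposed server-side duplicate check.
# UploadJob and the status strings are hypothetical stand-ins for
# whatever the real job store uses.
from dataclasses import dataclass, field


@dataclass
class UploadJob:
    session_id: str
    status: str  # assumed values: "pending", "running", "finished", "canceled"
    params: dict = field(default_factory=dict)


def can_submit(new_job: UploadJob, existing_jobs: list[UploadJob]) -> bool:
    """Reject a new upload job if one for the same session is in progress.

    If duplicate sessions turn out to be legitimate sometimes, this could
    instead compare new_job.params against each in-progress job's params
    and reject only exact duplicates.
    """
    in_progress = {"pending", "running"}
    return not any(
        job.session_id == new_job.session_id and job.status in in_progress
        for job in existing_jobs
    )
```

Jobs that have already finished or been canceled don't block resubmission, so re-uploading a session later (e.g. after a failure) would still work.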
Describe alternatives you've considered
The user can check for in-progress jobs on the site or via HTTP request before submitting.
Problems with this are:
- the user has to be aware that duplicate upload jobs are possible
- the user has to remember to check every time, or implement their own check in code (resulting in multiple people solving the same problem)
Alternatively, "submit job" helper code could be created with this check built in, and everybody would use that.
Potential problem with this: it may trade the customizability of the current API for usability with a subset of features, and then the full API would be used anyway when extra features are needed.
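The helper-code alternative could look something like the sketch below. The `list_jobs`/`post_job` callables stand in for whatever HTTP calls the real service exposes; their names, the job-dict fields, and the status values are all assumptions for illustration.

```python
# Hypothetical client-side "submit job" helper with the duplicate
# check built in. list_jobs and post_job are injected so the helper
# stays independent of the actual HTTP endpoints.
from typing import Callable


def submit_upload_job(
    session_id: str,
    params: dict,
    list_jobs: Callable[[], list[dict]],
    post_job: Callable[[dict], dict],
) -> dict:
    """Submit an upload job unless one for this session is in progress."""
    in_progress = {"pending", "running"}
    for job in list_jobs():
        if job.get("session_id") == session_id and job.get("status") in in_progress:
            raise RuntimeError(
                f"upload job already in progress for session {session_id}"
            )
    return post_job({"session_id": session_id, **params})
```

Injecting the two callables also makes the helper easy to test without a live server, though the "everybody must remember to use the helper" problem remains.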