Getting the API spec takes ~10s on my machine. This is only mildly an issue for actual use, because it isn't a frequently used endpoint. However, it is an issue for the tests, and slows down the integration tests quite a bit.
A poke around with py-spy shows that the overwhelming majority of the slowness is not in our code, but in the validation of the spec performed by prance. In principle, we could skip the validation altogether, because we only need prance to resolve the references. Alternatively, we would likely get a pretty big gain from some judicious caching: e.g. we could cache the result of parsing the dumped-to-string API spec, which would skip multiple validation and resolution passes, and wouldn't break if for some reason the spec changed without flowapi restarting. (We probably only want to cache one entry here.)
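A minimal sketch of the cache-one-entry idea: `functools.lru_cache` with `maxsize=1`, keyed on the dumped-to-string spec. The `parse_spec` name and its body are hypothetical stand-ins for the real prance parse step, shown only to illustrate the caching shape.

```python
from functools import lru_cache


@lru_cache(maxsize=1)  # cache a single entry, keyed by the spec string
def parse_spec(spec_string: str) -> dict:
    # Stand-in for the expensive parse + validate + resolve step.
    # In flowapi this would be roughly:
    #   parser = prance.ResolvingParser(spec_string=spec_string)
    #   return parser.specification
    return {"parsed": spec_string}
```

Because the cache key is the spec string itself, a changed spec simply misses the cache and is re-parsed, so this stays correct even if the spec somehow changes without flowapi restarting.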
Hmm. Actually, caching won't help with our flaky tests, because the API spec is only fetched once per fixture initialisation; maybe we'd be better off using the resolver directly?
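To make "using the resolver directly" concrete: all we really need is to replace local `$ref` objects with their targets, with no validation pass at all. This is a toy, self-contained sketch of that resolution step (the real code would presumably reach for prance's resolver machinery rather than hand-rolling it); both function names here are made up for illustration.

```python
def _lookup(root: dict, pointer: str):
    """Follow a local JSON pointer like '#/components/schemas/Foo'."""
    node = root
    for part in pointer.lstrip("#/").split("/"):
        node = node[part]
    return node


def resolve_refs(node, root=None):
    """Recursively replace {'$ref': '#/...'} mappings with their targets.

    No validation is performed: we just walk the structure and splice in
    the referenced sub-documents.
    """
    root = root if root is not None else node
    if isinstance(node, dict):
        if "$ref" in node:
            # Resolve the target, then keep resolving in case it
            # itself contains further references.
            return resolve_refs(_lookup(root, node["$ref"]), root)
        return {key: resolve_refs(value, root) for key, value in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, root) for item in node]
    return node
```

Skipping validation this way is what makes the approach fast: the walk is linear in the size of the spec, whereas the validation prance performs is where py-spy showed the time going.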