How to stream an HTTP response with connexion 3 and AsyncApp? #1928
Comments
Hi @mr-flannery, in the meantime, can you try using the underlying Starlette StreamingResponse?
Hi @Ruwann, thanks for the quick response! Your suggestion did solve the problem!

```python
from starlette.responses import StreamingResponse

# ...

async def streaming():
    return StreamingResponse(async_gen_numbers(), status_code=200, media_type='text/plain')
```
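The `async_gen_numbers` generator is not shown in the thread; a minimal sketch of what it might look like (name, chunk format, and the one-second pacing are assumptions based on the 10-second delay mentioned below) would be:

```python
import asyncio

async def async_gen_numbers():
    # Hypothetical generator: yields one chunk per second, so the full
    # response takes about 10 seconds to produce.
    for i in range(10):
        await asyncio.sleep(1)
        yield f"{i}\n".encode()
```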
Hi @Ruwann, unfortunately, my response was a bit premature. It looks like it doesn't actually work as expected. While the code does run, it does not actually stream; from a caller's perspective it's still synchronous. When calling the endpoint with curl, it still blocks for 10 seconds before anything happens.
I wrote an Express app to make sure this was actually the server's fault, not curl's:

```js
const Express = require('express');

const app = new Express();

async function* generateNumbers() {
  for (const i of [1, 2, 3, 4, 5]) {
    await new Promise(resolve => setTimeout(resolve, 1000));
    yield await Promise.resolve(i);
  }
}

app.get('/streaming', async (req, res) => {
  for await (const i of generateNumbers()) {
    res.write(`data: ${i}\n\n`);
  }
  res.end();
});

app.listen(3456, () => {
  console.log('Server is running on port 3456');
});
```

When using curl here, this works as expected, i.e. it sends a chunk of data every second. Therefore, I believe there might actually be a bug with sending StreamingResponses in connexion 3 + AsyncApp.
I cannot seem to reproduce the issue. I used the following sample code: https://github.com/Ruwann/connexion-streaming
Hi @Ruwann, thanks for the repo, I was able to track it down. This is what causes the behavior to change: `app.add_api("openapi.yaml", validate_responses=True)`. Once I add `validate_responses=True`, the response is no longer streamed.
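If it helps anyone else reproduce this, the difference can be confirmed by comparing the two calls against the same app and spec (a sketch, not a fix):

```python
# Either: response streams chunk by chunk as expected.
app.add_api("openapi.yaml")

# Or: response is buffered until complete before anything is sent.
app.add_api("openapi.yaml", validate_responses=True)
```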
Hi @Ruwann, just wanted to check in with you and see if there are any plans to fix this? Best regards!
An easy way to handle that would be to have the possibility to disable response body validation on a per-route basis.
This seems to work, though it is probably not future-proof:

```python
import typing as t

# Import paths below are for connexion 3.x; they may differ in other versions.
from connexion.datastructures import MediaTypeDict
from connexion.validators import AbstractResponseBodyValidator


class StreamResponseBodyValidator(AbstractResponseBodyValidator):
    def wrap_send(self, send):
        """Disable validation, leaving the stream untouched."""
        return send

    def _parse(self, stream: t.Generator[bytes, None, None]) -> t.Any:  # type: ignore
        return stream

    def _validate(self, body: dict):
        pass


validator_map = {
    "response": MediaTypeDict(
        {
            "application/jsonlines+json": StreamResponseBodyValidator,
        }
    ),
}
```
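Wiring it up might look like the following sketch, assuming connexion 3's `validator_map` argument on `add_api` (the spec file name is a placeholder):

```python
from connexion import AsyncApp

app = AsyncApp(__name__)

# Response validation stays enabled, but the streaming media type is handled
# by the pass-through validator registered in validator_map above.
app.add_api("openapi.yaml", validate_responses=True, validator_map=validator_map)
```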
Description
I want to stream HTTP responses with connexion 3 + AsyncApp.
Expected behaviour
I would expect something like this to work:
Actual behaviour
Neither of the two approaches listed above works. In both cases there's an error like this:
Also I couldn't find any docs on how to do this with AsyncApp, only with FlaskApp.
Steps to reproduce
Additional info:
Output of the commands:
python --version: 3.11.9
pip show connexion | grep "^Version\:": 3.0.6