-
The problem is almost certainly that you close the session in one task while the other tasks are in the middle of making a request with that session. If you're going to close a session inside a fetch function, I'm not sure there's an obvious, clean way to replace it like that. For this example, I'd probably just use one session per fetch.
But I'm not sure why creating a new session would help with a 504 error anyway; that status just means the upstream server is timing out (maybe because it's under too much load).
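A minimal sketch of the session-per-fetch idea, with a simple retry on 504 instead of swapping out a shared session (the `url` parameter and `MAX_RETRIES` limit are my additions, not from your original code):

```python
import asyncio
import aiohttp

MAX_RETRIES = 3  # assumed retry budget, tune as needed

async def fetch(data, url):
    # A fresh session per request means no task can ever see a session
    # that another task has closed mid-flight.
    for attempt in range(MAX_RETRIES):
        async with aiohttp.ClientSession() as session:
            async with session.post(url, json=data) as resp:
                if resp.status == 504 and attempt < MAX_RETRIES - 1:
                    continue  # upstream timed out; retry with a new session
                return await resp.text()

async def fetch_all(datas, url):
    return await asyncio.gather(*(fetch(d, url) for d in datas))
```

A session per request loses connection pooling, so for high volume you'd normally keep one long-lived session and only retry the request, but it sidesteps the "Connector is closed" problem entirely.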
-
I want to make concurrent requests to one URL like the following:
```python
import asyncio
import aiohttp

async def fetch(data):
    url = ''
    async with session.post(url=url, json=data) as resp:
        text = await resp.text()
        return text

async def fetch_all(datas):
    return await asyncio.gather(*[fetch(data) for data in datas])

loop = asyncio.get_event_loop()
datas = []
session = aiohttp.ClientSession()
results = loop.run_until_complete(fetch_all(datas))
loop.run_until_complete(session.close())
loop.run_until_complete(asyncio.sleep(3))
loop.close()
```
but I always get a 504 in the response, so I want to recreate the session inside `fetch` when the status equals 504:
```python
import asyncio
import aiohttp

session = None

async def fetch(data):
    global session
    url = ''
    async with session.post(url=url, json=data) as resp:
        if resp.status == 504:
            await session.close()
            session = aiohttp.ClientSession()
            return await fetch(data)
        else:
            text = await resp.text()
            return text

async def fetch_all(datas):
    return await asyncio.gather(*[fetch(data) for data in datas])

loop = asyncio.get_event_loop()
datas = []
session = aiohttp.ClientSession()
results = loop.run_until_complete(fetch_all(datas))
loop.run_until_complete(session.close())
loop.run_until_complete(asyncio.sleep(3))
loop.close()
```
But then I get: `Connector is closed.`