
[Bug] docsum return HTTP status 500 #1129

Open
3 of 6 tasks
lianhao opened this issue Nov 14, 2024 · 2 comments
Assignees
Labels
bug Something isn't working

Comments

Collaborator

lianhao commented Nov 14, 2024

Priority

P1-Stopper

OS type

Ubuntu

Hardware type

Xeon-SPR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source

Deploy method

  • Docker compose
  • Docker
  • Kubernetes
  • Helm

Running nodes

Single Node

What's the version?

73879d3

Description

When launching the DocSum example via Docker Compose and/or Helm with the latest images built from source, sending a curl request to the DocSum mega gateway service results in the following error:

$ curl http://${host_ip}:8888/v1/docsum   -H "Content-Type: multipart/form-data"     -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5."     -F "max_tokens=32"     -F "language=en"     -F "stream=false"
Internal Server Error

Reproduce steps

Follow the DocSum Xeon Readme:

  1. export host_ip=
  2. source ../../../set_env.sh
  3. docker compose up -d
  4. curl http://${host_ip}:8888/v1/docsum -H "Content-Type: multipart/form-data" -F "messages=Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5." -F "max_tokens=32" -F "language=en" -F "stream=false"

Raw log

$ docker compose logs docsum-xeon-backend-server
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "HUGGINGFACEHUB_API_TOKEN" variable is not set. Defaulting to a blank string.
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "HUGGINGFACEHUB_API_TOKEN" variable is not set. Defaulting to a blank string.
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "no_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "https_proxy" variable is not set. Defaulting to a blank string.
WARN[0000] The "http_proxy" variable is not set. Defaulting to a blank string.
docsum-xeon-backend-server  | /usr/local/lib/python3.11/site-packages/pydantic/_internal/_fields.py:132: UserWarning: Field "model_name_or_path" in Audio2TextDoc has conflict with protected namespace "model_".
docsum-xeon-backend-server  |
docsum-xeon-backend-server  | You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
docsum-xeon-backend-server  |   warnings.warn(
docsum-xeon-backend-server  | [2024-11-14 02:37:39,377] [    INFO] - Base service - CORS is enabled.
docsum-xeon-backend-server  | [2024-11-14 02:37:39,378] [    INFO] - Base service - Setting up HTTP server
docsum-xeon-backend-server  | [2024-11-14 02:37:39,379] [    INFO] - Base service - Uvicorn server setup on port 8888
docsum-xeon-backend-server  | INFO:     Waiting for application startup.
docsum-xeon-backend-server  | INFO:     Application startup complete.
docsum-xeon-backend-server  | INFO:     Uvicorn running on http://0.0.0.0:8888 (Press CTRL+C to quit)
docsum-xeon-backend-server  | [2024-11-14 02:37:39,392] [    INFO] - Base service - HTTP server setup successful
docsum-xeon-backend-server  | INFO:     100.80.243.121:59802 - "POST /v1/docsum HTTP/1.1" 500 Internal Server Error
docsum-xeon-backend-server  | ERROR:    Exception in ASGI application
docsum-xeon-backend-server  | Traceback (most recent call last):
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 406, in run_asgi
docsum-xeon-backend-server  |     result = await app(  # type: ignore[func-returns-value]
docsum-xeon-backend-server  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
docsum-xeon-backend-server  |     return await self.app(scope, receive, send)
docsum-xeon-backend-server  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
docsum-xeon-backend-server  |     await super().__call__(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
docsum-xeon-backend-server  |     await self.middleware_stack(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
docsum-xeon-backend-server  |     raise exc
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
docsum-xeon-backend-server  |     await self.app(scope, receive, _send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/prometheus_fastapi_instrumentator/middleware.py", line 174, in __call__
docsum-xeon-backend-server  |     raise exc
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/prometheus_fastapi_instrumentator/middleware.py", line 172, in __call__
docsum-xeon-backend-server  |     await self.app(scope, receive, send_wrapper)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
docsum-xeon-backend-server  |     await self.app(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
docsum-xeon-backend-server  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
docsum-xeon-backend-server  |     raise exc
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
docsum-xeon-backend-server  |     await app(scope, receive, sender)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
docsum-xeon-backend-server  |     await self.middleware_stack(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
docsum-xeon-backend-server  |     await route.handle(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
docsum-xeon-backend-server  |     await self.app(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
docsum-xeon-backend-server  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
docsum-xeon-backend-server  |     raise exc
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
docsum-xeon-backend-server  |     await app(scope, receive, sender)
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
docsum-xeon-backend-server  |     response = await f(request)
docsum-xeon-backend-server  |                ^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 301, in app
docsum-xeon-backend-server  |     raw_response = await run_endpoint_function(
docsum-xeon-backend-server  |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
docsum-xeon-backend-server  |     return await dependant.call(**values)
docsum-xeon-backend-server  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/home/user/GenAIComps/comps/cores/mega/gateway.py", line 423, in handle_request
docsum-xeon-backend-server  |     data = await request.json()
docsum-xeon-backend-server  |            ^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/site-packages/starlette/requests.py", line 249, in json
docsum-xeon-backend-server  |     self._json = json.loads(body)
docsum-xeon-backend-server  |                  ^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
docsum-xeon-backend-server  |     return _default_decoder.decode(s)
docsum-xeon-backend-server  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
docsum-xeon-backend-server  |     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
docsum-xeon-backend-server  |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
docsum-xeon-backend-server  |   File "/usr/local/lib/python3.11/json/decoder.py", line 355, in raw_decode
docsum-xeon-backend-server  |     raise JSONDecodeError("Expecting value", s, err.value) from None
docsum-xeon-backend-server  | json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
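The traceback pinpoints the failure: `handle_request` in `gateway.py` calls `request.json()` unconditionally, so the `multipart/form-data` body from the curl `-F` flags reaches `json.loads()` and raises `JSONDecodeError` at the first byte. A minimal stdlib sketch of the failure mode and the content-type dispatch that would avoid it (the function name `parse_request_body` is hypothetical, not the project's API; a real gateway would hand multipart bodies to a form parser such as Starlette's `request.form()`):

```python
import json

def parse_request_body(content_type: str, body: bytes) -> dict:
    """Dispatch on Content-Type instead of assuming JSON.

    The gateway called request.json() for every request, so a
    multipart/form-data body hit json.loads() and raised
    JSONDecodeError: Expecting value: line 1 column 1 (char 0).
    """
    if content_type.startswith("application/json"):
        return json.loads(body)
    if content_type.startswith("multipart/form-data"):
        # Signal the mismatch explicitly rather than letting
        # json.loads() fail on the multipart boundary line.
        raise ValueError("multipart body: use a form parser, not json.loads()")
    raise ValueError(f"unsupported Content-Type: {content_type}")

# A multipart body starts with a boundary marker like b"--x\r\n",
# which is not valid JSON, hence the 500 from the gateway.
```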
Collaborator Author

lianhao commented Nov 14, 2024

@lvliang-intel investigated and found that PR opea-project/GenAIComps/pull/865 caused this issue.

@chensuyue chensuyue mentioned this issue Nov 14, 2024
4 tasks
@ashahba ashahba self-assigned this Nov 14, 2024
@joshuayao joshuayao added this to OPEA Nov 15, 2024
@joshuayao joshuayao added this to the v1.1 milestone Nov 15, 2024
@joshuayao joshuayao added the bug Something isn't working label Nov 15, 2024
@joshuayao joshuayao moved this to In progress in OPEA Nov 15, 2024
Collaborator

ashahba commented Nov 15, 2024

Here's the fix for this issue:
opea-project/GenAIComps#902

@joshuayao joshuayao removed this from the v1.1 milestone Nov 15, 2024