
Object of type list_iterator is not JSON serializable #972

Open
leoterry-ulrica opened this issue Aug 31, 2024 · 1 comment
Comments

@leoterry-ulrica

  • This is actually a bug report.
  • I am not getting good LLM Results
  • I have tried asking for help in the community on Discord or Discussions and have not received a response.
  • I have tried searching the documentation and have not found an answer.

What Model are you using?

  • gpt-3.5-turbo
  • gpt-4-turbo
  • gpt-4
  • qwen2-72b
  • Other (please specify)

Describe the bug

2024-08-31 13:28:58,102 xinference.core.model 3678673 DEBUG    Enter wrapped_func, args: (<xinference.core.model.ModelActor object at 0x7a7d98a89ad0>, 'John Doe is 30 years old.', '', [], {'tool_choice': {'function': {'name': 'UserInfo'}, 'type': 'function'}, 'tools': <list_iterator object at 0x7a7d92bb3400>}), kwargs: {'raw_params': {'tool_choice': {'type': 'function', 'function': {'name': 'UserInfo'}}, 'tools': [{'type': 'function', 'function': {'name': 'UserInfo', 'description': 'Correctly extracted `UserInfo` with all the required parameters with correct types', 'parameters': {'properties': {'name': {'title': 'Name', 'type': 'string'}, 'age': {'title': 'Age', 'type': 'integer'}}, 'required': ['age', 'name'], 'type': 'object'}}}]}}
2024-08-31 13:28:58,102 xinference.core.model 3678673 DEBUG    Request chat, current serve request count: 0, request limit: None for the model qwen2-instruct
2024-08-31 13:28:58,102 xinference.model.llm.sglang.core 3678673 DEBUG    Enter generate, prompt: <|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
John Doe is 30 years old.<|im_end|>
<|im_start|>assistant
, generate config: {'tool_choice': {'function': {'name': 'UserInfo'}, 'type': 'function'}, 'tools': <list_iterator object at 0x7a7d92bb3400>, 'stop': ['<|endoftext|>', '<|im_start|>', '<|im_end|>'], 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'temperature': 1.0, 'top_p': 1.0, 'top_k': -1, 'max_new_tokens': 256, 'stream': False, 'stream_options': None, 'ignore_eos': False}
2024-08-31 13:28:58,103 xinference.core.model 3678673 DEBUG    After request chat, current serve request count: 0 for the model qwen2-instruct
2024-08-31 13:28:58,110 xinference.api.restful_api 3472060 ERROR    [address=0.0.0.0:46209, pid=3678673] Object of type list_iterator is not JSON serializable
Traceback (most recent call last):
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/api/restful_api.py", line 1752, in create_chat_completion
    data = await model.chat(
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/backends/context.py", line 231, in send
    return self._process_result_message(result)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/backends/context.py", line 102, in _process_result_message
    raise message.as_instanceof_cause()
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/backends/pool.py", line 656, in send
    result = await self._run_coro(message.message_id, coro)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/backends/pool.py", line 367, in _run_coro
    return await coro
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/api.py", line 384, in __on_receive__
    return await super().__on_receive__(message)  # type: ignore
  File "xoscar/core.pyx", line 558, in __on_receive__
    raise ex
  File "xoscar/core.pyx", line 520, in xoscar.core._BaseActor.__on_receive__
    async with self._lock:
  File "xoscar/core.pyx", line 521, in xoscar.core._BaseActor.__on_receive__
    with debug_async_timeout('actor_lock_timeout',
  File "xoscar/core.pyx", line 526, in xoscar.core._BaseActor.__on_receive__
    result = await result
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/utils.py", line 45, in wrapped
    ret = await func(*args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/model.py", line 90, in wrapped_func
    ret = await fn(self, *args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xoscar/api.py", line 462, in _wrapper
    r = await func(self, *args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/model.py", line 536, in chat
    response = await self._call_wrapper_json(
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/model.py", line 401, in _call_wrapper_json
    return await self._call_wrapper("json", fn, *args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/model.py", line 114, in _async_wrapper
    return await fn(*args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/core/model.py", line 410, in _call_wrapper
    ret = await fn(*args, **kwargs)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/model/llm/sglang/core.py", line 440, in async_chat
    c = await self.async_generate(full_prompt, generate_config)  # type: ignore
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/model/llm/sglang/core.py", line 338, in async_generate
    state = await self._non_stream_generate(prompt, **sanitized_generate_config)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/xinference/model/llm/sglang/core.py", line 313, in _non_stream_generate
    async with session.post(
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/aiohttp/client.py", line 1353, in __aenter__
    self._resp = await self._coro
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/aiohttp/client.py", line 499, in _request
    data = payload.JsonPayload(json, dumps=self._json_serialize)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/site-packages/aiohttp/payload.py", line 397, in __init__
    dumps(value).encode(encoding),
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/json/__init__.py", line 231, in dumps
    return _default_encoder.encode(obj)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/root/miniconda3/envs/xin-0.14.4/lib/python3.10/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: [address=0.0.0.0:46209, pid=3678673] Object of type list_iterator is not JSON serializable
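The root cause is visible in the generate config logged above: `tools` arrives as `<list_iterator object at 0x7a7d92bb3400>` instead of a plain list, and Python's standard `json` encoder cannot serialize iterators. A minimal sketch reproducing the failure outside of xinference, and showing that materializing the iterator with `list()` fixes serialization:

```python
import json

tools = [{"type": "function", "function": {"name": "UserInfo"}}]

# Passing an iterator over the tools list fails exactly like the traceback above.
try:
    json.dumps({"tools": iter(tools)})
except TypeError as e:
    print(e)  # Object of type list_iterator is not JSON serializable

# Materializing the iterator back into a list makes the payload serializable.
payload = json.dumps({"tools": list(iter(tools))})
print(payload)
```

So the fix is to convert the iterator to a list somewhere before the request payload reaches `json.dumps` (here, inside aiohttp's `JsonPayload`).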

To Reproduce

import instructor
from pydantic import BaseModel
from openai import OpenAI


# Define your desired output structure
class UserInfo(BaseModel):
    name: str
    age: int


# Patch the OpenAI client
client = instructor.from_openai(OpenAI(base_url="http://ip:port/v1"), mode=instructor.Mode.TOOLS)

# Extract structured data from natural language
user_info = client.chat.completions.create(
    model="qwen2-instruct",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
    max_retries=2
)

print(user_info.name)
#> John Doe
print(user_info.age)
#> 30
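Judging from the traceback, the iterator is introduced server-side before the sglang backend posts the generate config. A minimal sketch of a defensive fix, assuming a hypothetical `sanitize_config` helper (this is not xinference's actual code) that materializes any iterator values before the dict is JSON-encoded:

```python
from collections.abc import Iterator

def sanitize_config(config: dict) -> dict:
    # Materialize iterator values (e.g. a list_iterator under "tools")
    # so that json.dumps can encode the resulting dict.
    return {k: list(v) if isinstance(v, Iterator) else v for k, v in config.items()}

cfg = sanitize_config({"tools": iter([{"type": "function"}]), "stream": False})
print(cfg)  # {'tools': [{'type': 'function'}], 'stream': False}
```

Plain lists and other values pass through unchanged, so applying this to every outgoing generate config would be harmless.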

Expected behavior

print(user_info.name)
#> John Doe
print(user_info.age)
#> 30


@ivanleomk
Collaborator

@leoterry-ulrica what server are you hosting this on? Are you using Ollama, vLLM, or some other host?
