
Invalid role when using chat history with Google Gemini LLM model #1183

Open
Stanley00 opened this issue Oct 1, 2024 · 0 comments
When using Gemini as the LLM model for an Assistant, I got this error:

Traceback (most recent call last):
  File "./example", line 11, in <module>
    assistant.print_response("What if I replace 10 by 8 in previous message?")
  File "$VENV_PATH/lib/python3.10/site-packages/phi/assistant/assistant.py", line 1476, in print_response
    for resp in self.run(message=message, messages=messages, stream=True, **kwargs):
  File "$VENV_PATH/lib/python3.10/site-packages/phi/assistant/assistant.py", line 894, in _run
    for response_chunk in self.llm.response_stream(messages=llm_messages):
  File "$VENV_PATH/lib/python3.10/site-packages/phi/llm/google/gemini.py", line 273, in response_stream
    for response in self.invoke_stream(messages=messages):
  File "$VENV_PATH/lib/python3.10/site-packages/phi/llm/google/gemini.py", line 138, in invoke_stream
    yield from self.client.generate_content(
  File "$VENV_PATH/lib/python3.10/site-packages/google/generativeai/generative_models.py", line 325, in generate_content
    iterator = self._client.stream_generate_content(
  File "$VENV_PATH/lib/python3.10/site-packages/google/ai/generativelanguage_v1beta/services/generative_service/client.py", line 1134, in stream_generate_content
    response = rpc(
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
    return wrapped_func(*args, **kwargs)
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py", line 293, in retry_wrapped_func
    return retry_target(
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py", line 153, in retry_target
    _retry_error_helper(
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/retry/retry_base.py", line 212, in _retry_error_helper
    raise final_exc from source_exc
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/retry/retry_unary.py", line 144, in retry_target
    result = target()
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/timeout.py", line 120, in func_with_timeout
    return func(*args, **kwargs)
  File "$VENV_PATH/lib/python3.10/site-packages/google/api_core/grpc_helpers.py", line 174, in error_remapped_callable
    raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.InvalidArgument: 400 Please use a valid role: user, model.

Example code:

from phi.assistant import Assistant
from phi.llm.google import Gemini

assistant = Assistant(
    llm=Gemini(model="gemini-1.5-flash"),
    add_chat_history_to_messages=True,
    debug_mode=True,
)

assistant.print_response("What is 2+10?")
assistant.print_response("What if I replace 10 by 8 in previous message?")

A potential fix is:

role = "model" if msg.role == "system" else "user" if msg.role == "tool" else msg.role
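As the error message says, Gemini only accepts the roles `user` and `model`, so other roles stored in the chat history have to be remapped before the request is sent. A minimal standalone sketch of the mapping the one-liner above proposes (the function name `to_gemini_role` is illustrative, not part of phidata):

```python
def to_gemini_role(role: str) -> str:
    """Remap a stored chat-history role to one Gemini accepts.

    Mirrors the proposed fix: "system" becomes "model", "tool" becomes
    "user", and "user"/"model" pass through unchanged.
    """
    if role == "system":
        return "model"
    if role == "tool":
        return "user"
    return role

history_roles = ["system", "user", "model", "tool"]
print([to_gemini_role(r) for r in history_roles])
# → ['model', 'user', 'model', 'user']
```

With this mapping applied to every message before calling `generate_content`, the replayed history should no longer contain a role the API rejects.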
