Iterable support not working in Anthropic Tool Calling mode #1107

Open · 1 of 4 tasks
anmolsood21 opened this issue Oct 21, 2024 · 1 comment

anmolsood21 commented Oct 21, 2024

  • This is actually a bug report.
  • I am not getting good LLM Results
  • I have tried asking for help in the community on discord or discussions and have not received a response.
  • I have tried searching the documentation and have not found an answer.

What Model are you using?

claude-3-5-sonnet-20240620

Describe the bug

When using ANTHROPIC_TOOLS mode, the Iterable example here (https://python.useinstructor.com/concepts/lists/#extracting-tasks-using-iterable) doesn't work. However, switching to JSON mode for Anthropic makes it work.

To Reproduce
Follow the example at https://python.useinstructor.com/concepts/lists/#extracting-tasks-using-iterable with an Anthropic model in tool calling mode. The full response is not returned in chunks when using Iterable.
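
Roughly, a sketch of the reproduction (the Task model and prompt are placeholder stand-ins for the linked docs example; only the model name and mode are from this setup):

import instructor
import anthropic
from typing import Iterable
from pydantic import BaseModel

# Sketch of the failing case: streaming an Iterable response_model in
# ANTHROPIC_TOOLS mode. Switching the mode to ANTHROPIC_JSON works.
client = instructor.from_anthropic(
    anthropic.Anthropic(),
    mode=instructor.Mode.ANTHROPIC_TOOLS,
)


class Task(BaseModel):
    title: str
    priority: str


tasks = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1024,
    stream=True,  # the parsed Task chunks are not returned in this mode
    response_model=Iterable[Task],
    messages=[
        {
            "role": "user",
            "content": "Plan my day: write the report, email Bob, book flights.",
        },
    ],
)
for task in tasks:
    print(task)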

Expected behavior
Iterable support should work in tool calling mode for Anthropic as well.

@ivanbelenky
Contributor

Hey @anmolsood21, thanks for raising the issue. It would be great if you could share more explicit reproduction steps. Based on your description, I tested the following without any exceptions:

import instructor
import anthropic
from openai import OpenAI
from typing import Iterable
from pydantic import BaseModel

client = instructor.from_anthropic(
    anthropic.Anthropic(), 
    mode=instructor.Mode.ANTHROPIC_TOOLS
)


class User(BaseModel):
    name: str
    age: int


users = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    temperature=0.1,
    response_model=Iterable[User],
    max_tokens=1024,
    stream=False,
    messages=[
        {
            "role": "user",
            "content": "Consider this data: Jason is 10 and John is 30.\
                         Correctly segment it into entitites\
                        Make sure the JSON is correct",
        },
    ],
)
for user in users:
    print(user)
    #> name='Jason' age=10
    #> name='John' age=30

I suspect you are instantiating the client incorrectly, and this may be what is causing the problem. There is a dedicated instructor constructor for the Anthropic provider, instructor.from_anthropic, as used above. If the issue persists, I suggest once again that you paste the exact code that raises the error. Cheers.
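
To illustrate what an incorrect instantiation could look like (hypothetical examples, not taken from your report), compare against the constructor used in the snippet above:

# Hypothetical instantiations that would NOT be expected to work for
# Anthropic tool calling (illustrative only):
#   client = instructor.from_openai(anthropic.Anthropic())  # OpenAI constructor
#   client = instructor.from_anthropic(
#       anthropic.Anthropic(), mode=instructor.Mode.TOOLS,  # OpenAI tools mode
#   )

# The dedicated constructor for the Anthropic provider, as used above:
client = instructor.from_anthropic(
    anthropic.Anthropic(),
    mode=instructor.Mode.ANTHROPIC_TOOLS,
)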
