
Commit

chore: update _extract.py (#70)
reponse -> response
eltociear authored Dec 26, 2024
1 parent bf598dc commit 4a5439e
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/raglite/_extract.py
@@ -41,7 +41,7 @@ class MyNameResponse(BaseModel):
     system_prompt = getattr(return_type, "system_prompt", "").strip()
     if not llm_supports_response_format or config.llm.startswith("llama-cpp-python"):
         system_prompt += f"\n\nFormat your response according to this JSON schema:\n{return_type.model_json_schema()!s}"
-    # Constrain the reponse format to the JSON schema if it's supported by the LLM [1]. Strict mode
+    # Constrain the response format to the JSON schema if it's supported by the LLM [1]. Strict mode
     # is disabled by default because it only supports a subset of JSON schema features [2].
     # [1] https://docs.litellm.ai/docs/completion/json_mode
     # [2] https://platform.openai.com/docs/guides/structured-outputs#some-type-specific-keywords-are-not-yet-supported
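For context (this is not part of the commit), a minimal sketch of what the comment above describes: constraining an LLM's response to a Pydantic model's JSON schema via litellm's response_format parameter. The model name, prompt, and strict flag below are illustrative assumptions, not raglite's actual implementation; MyNameResponse is borrowed from the hunk header.

# Illustrative sketch (assumptions noted inline), not raglite's code.
import litellm
from pydantic import BaseModel


class MyNameResponse(BaseModel):
    my_name: str


response = litellm.completion(
    model="gpt-4o-mini",  # assumed model; any litellm-supported LLM with JSON schema support works
    messages=[{"role": "user", "content": "What is your name? Reply in JSON."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": MyNameResponse.__name__,
            "schema": MyNameResponse.model_json_schema(),
            "strict": False,  # strict mode only supports a subset of JSON schema features
        },
    },
)
my_name_response = MyNameResponse.model_validate_json(response.choices[0].message.content)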

