Add sources to completions APIs and UI #1206
Conversation
On note 2, we could maybe add an input parameter to the call, same as "stream", that is "include_context_sources" or something like that.
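A minimal sketch of what that flag could look like on the request body, assuming a Pydantic model in the style of the existing ones; `include_context_sources` is the name suggested above, not an existing parameter:

```python
from pydantic import BaseModel


class CompletionsBody(BaseModel):
    # Illustrative request body; only "prompt", "use_context" and "stream"
    # mirror fields discussed in this PR, the rest is a sketch.
    prompt: str
    use_context: bool = False
    stream: bool = False
    # Suggested opt-in flag: attach the source Chunks to the response
    # only when the client explicitly asks for them.
    include_context_sources: bool = False
```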
LGTM, even though I'm not 100% able to validate what is happening 😅
@classmethod
def from_node(cls: type["Chunk"], node: NodeWithScore) -> "Chunk":
Couldn't you use the Chunk type here? Circular reference?
The reason is that this classmethod is defined within the Chunk class definition, so the Chunk type is not available yet at that point.
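For illustration, a stripped-down version of the pattern (the fields and the node access are placeholders, not the merged code):

```python
from pydantic import BaseModel


class Chunk(BaseModel):
    text: str
    score: float | None = None

    @classmethod
    def from_node(cls: type["Chunk"], node) -> "Chunk":
        # "Chunk" has to be a string annotation here: the class object is not
        # bound to the name Chunk yet while its own body is being executed,
        # so a bare reference would raise a NameError at definition time.
        return cls(text=node.get_content(), score=node.score)
```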
private_gpt/open_ai/openai_models.py
@@ -48,8 +52,11 @@ class OpenAICompletion(BaseModel):
     choices: list[OpenAIChoice]

     @classmethod
-    def from_text(
-        cls, text: str | None, finish_reason: str | None = None
+    def from_text_and_sources(
Tiny nit, but if you name the function from_text_and_sources I would expect the sources to be mandatory, and from_text to still be there but with the sources as an optional arg: from_text(all options nullable), from_text_and_sources(options, non-null sources) -> calls from_text.
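A rough sketch of the split being suggested, with simplified fields (the real OpenAICompletion carries id, created, model, choices, etc.):

```python
from pydantic import BaseModel


class Chunk(BaseModel):
    text: str  # minimal stand-in for the real Chunk model


class OpenAICompletion(BaseModel):
    # Simplified, illustrative fields.
    text: str | None = None
    finish_reason: str | None = None
    sources: list[Chunk] | None = None

    @classmethod
    def from_text(
        cls,
        text: str | None,
        finish_reason: str | None = None,
        sources: list[Chunk] | None = None,
    ) -> "OpenAICompletion":
        # Everything nullable, sources included.
        return cls(text=text, finish_reason=finish_reason, sources=sources)

    @classmethod
    def from_text_and_sources(
        cls,
        text: str | None,
        sources: list[Chunk],
        finish_reason: str | None = None,
    ) -> "OpenAICompletion":
        # Sources are mandatory here; delegate to the general constructor.
        return cls.from_text(text, finish_reason=finish_reason, sources=sources)
```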
Done!
Reuse the Chunk object to append the sources used during completions to the completions response. Affects both chat/completions and plain /completions.

It has been integrated in the API and UI (see attachments).

Note 1: Chunks returned as sources will always have their previous_texts and next_texts set to None, given those are intended to be used only in the /chunks API. Therefore we could eventually separate both objects (Chunks used for source representation, and the /chunks API response).

Note 2: this information is also being added to the stream API, making the streaming chunks too big in my opinion, potentially affecting performance; should we remove sources from the streaming API?
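For reference, a simplified view of the Chunk fields the notes refer to; only previous_texts and next_texts are taken from the discussion above, the rest of the field set is illustrative:

```python
from pydantic import BaseModel


class Chunk(BaseModel):
    # Illustrative fields; not the exact merged model.
    text: str
    score: float | None = None
    # Populated only by the /chunks API; always None when the Chunk is
    # attached to a completions response as a source (see Note 1).
    previous_texts: list[str] | None = None
    next_texts: list[str] | None = None
```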