[Security solution] Elastic Assistant, OpenAI stream #169729
Conversation
```ts
reader?: ReadableStreamDefaultReader<Uint8Array>;
content?: string;
```
Is this interface intended to stay the same through part 2? Will we support streaming turned on/off (even if just technically and not exposed via the UI)? If so, we might want to rework these types so that one of the two is required. Does a message make sense without content now? I suppose the new empty 'loading' message would fit this interface?
If a message has both content + reader, we know to show the "Regenerate response" button. @jamesspi thinks we should turn streaming on 100% of the time. Even with streaming, once a reader is complete the message will get its content property.
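For context, here is a minimal sketch of the shape under discussion; the extra fields and the helper are illustrative, not the PR's actual types:

```ts
// Sketch only: a streaming message carries a `reader` while the response is
// in flight, and gains `content` once the stream completes.
interface AssistantMessage {
  role: 'user' | 'assistant';
  timestamp: string;
  reader?: ReadableStreamDefaultReader<Uint8Array>;
  content?: string;
}

// Hypothetical helper matching the comment above: a message with both a
// reader and content has finished streaming, so "Regenerate response" can be
// shown for it.
const canRegenerateResponse = (message: AssistantMessage): boolean =>
  message.reader != null && message.content != null;
```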
```ts
  jest.clearAllMocks();
});

it('Should stream response. isLoading/isStreaming are true while streaming, isLoading/isStreaming are false when streaming completes', async () => {
```
The description of this test appears to contain three assertions, and the implementation contains a total of 5 `expect`s. Consider (generally) one assertion per test.
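As a rough illustration of the suggestion, the single test could be split so each behaviour has its own `it`. The `setupStream` helper below is a made-up stand-in for the real test setup, not code from the PR:

```ts
// Hypothetical sketch: one assertion per test for the streaming flags.
const setupStream = () => {
  // Minimal fake just to keep the sketch self-contained.
  const state = { isLoading: true, isStreaming: true };
  const finishStream = async () => {
    state.isLoading = false;
    state.isStreaming = false;
  };
  return { state, finishStream };
};

describe('streaming state', () => {
  it('isLoading is true while streaming', () => {
    expect(setupStream().state.isLoading).toBe(true);
  });

  it('isStreaming is true while streaming', () => {
    expect(setupStream().state.isStreaming).toBe(true);
  });

  it('isLoading is false once streaming completes', async () => {
    const { state, finishStream } = setupStream();
    await finishStream();
    expect(state.isLoading).toBe(false);
  });

  it('isStreaming is false once streaming completes', async () => {
    const { state, finishStream } = setupStream();
    await finishStream();
    expect(state.isStreaming).toBe(false);
  });
});
```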
```tsx
customCodeBlock: (props) => {
  return (
    <>
      <CustomCodeBlock value={props.value} />
      <EuiSpacer size="m" />
    </>
  );
},
```
Looks like everything around the markdown parsing/processing got moved over correctly; however, I'm not seeing the codeblock actions show up anymore.
I confirmed the custom codeblock is rendering as expected, so it is probably something downstream w/ augmentMessageCodeBlocks:
kibana/x-pack/plugins/security_solution/public/assistant/helpers.tsx, lines 80 to 97 in 480ef87:

```ts
/**
 * Augments the messages in a conversation with code block details, including
 * the start and end indices of the code block in the message, the type of the
 * code block, and the button to add the code block to the timeline.
 *
 * @param currentConversation
 */
export const augmentMessageCodeBlocks = (
  currentConversation: Conversation
): CodeBlockDetails[][] => {
  const cbd = currentConversation.messages.map(({ content }) =>
    analyzeMarkdown(
      getMessageContentWithReplacements({
        messageContent: content ?? '',
        replacements: currentConversation.replacements,
      })
    )
  );
```
You had mentioned some changes around getMessageContentWithReplacements(), which this ends up calling -- perhaps it's coming from there?
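For reference, a loose sketch of what a replacements helper like this typically does, assuming `replacements` maps anonymized tokens back to their original values; the real helper and its signature may differ:

```ts
// Hypothetical sketch, not the actual helper from the PR: swap each
// anonymized token in the message content back to its original value.
const getMessageContentWithReplacements = ({
  messageContent,
  replacements,
}: {
  messageContent: string;
  replacements?: Record<string, string>;
}): string =>
  Object.entries(replacements ?? {}).reduce(
    (content, [anonymized, original]) => content.replaceAll(anonymized, original),
    messageContent
  );
```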
OK, I've got this sort of figured out. First off, I dropped the className that getControlContainer was looking for, className={`message-${index}`}. I put the class back in place, and it is working again for NON-streaming.

Putting a debugger in getControlContainer while streaming is turned on, when our comment changes from reader to content there seems to be a flash where the .euiCodeBlock__controls disappears. As soon as the debugger is resumed, the controls appear again. ❓

Playing with this, I tried switching from useLayoutEffect to useEffect, but the same issue happened. I added a setTimeout and experimented with the timeout length; even a timeout of 0 seems to resolve the issue. It's kind of a render bandaid that appears to be holding... what do you think? Any better ideas?
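A minimal sketch of the workaround being described, assuming the controls container is found via the per-message class; this is illustrative rather than the exact PR code:

```ts
import { useEffect } from 'react';

// Defer the controls lookup by one macrotask (setTimeout of 0) so it runs
// after the render in which `.euiCodeBlock__controls` briefly disappears
// when a message switches from `reader` to `content`.
const useCodeBlockControls = (
  index: number,
  attachControls: (container: Element) => void
) => {
  useEffect(() => {
    const timeout = setTimeout(() => {
      const container = document
        .querySelector(`.message-${index}`)
        ?.querySelector('.euiCodeBlock__controls');
      if (container) {
        attachControls(container);
      }
    }, 0);

    return () => clearTimeout(timeout);
  }, [index, attachControls]);
};
```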
Discussed offline w/ Steph -- this all looks good and even improves the inconsistencies with the current experience, so all the better! 🙂
I don't have exact repro steps for getting it consistently stuck, but at the time I was testing closing/opening the assistant mid-stream and then later opening/closing the timeline instance of the assistant while it was streaming as well.
Shit, ok, I've not seen this. I bet calling useStream's setComplete callback on modal close will prevent this.
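A minimal sketch of that idea, assuming `setComplete` is the callback returned by `useStream`; the unmount wiring is illustrative only:

```ts
import { useEffect } from 'react';

// On unmount (e.g. the assistant modal or its timeline instance closing
// mid-stream), mark the stream as complete so it cannot be left stuck.
const useCompleteStreamOnClose = (setComplete: () => void) => {
  useEffect(() => {
    return () => {
      setComplete();
    };
  }, [setComplete]);
};
```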
I believe this should resolve the issue: cae753a
Thanks for this first phase of streaming @stephmilovic! 🙏
LGTM 🚀
@elasticmachine merge upstream
💚 Build Succeeded
Part 1 of streaming in Security's AI Assistant
Implements streaming for OpenAI in a non-langchain environment
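As a rough sketch of the approach, assuming an RxJS-style observable like the `stream_observable.ts` file name suggests (this is illustrative, not the PR's exact implementation): decode chunks from the response reader as they arrive and emit the accumulated text on each one.

```ts
import { Observable } from 'rxjs';

// Sketch: turn a ReadableStreamDefaultReader of UTF-8 chunks into an
// Observable that emits the accumulated message text as it streams in.
export const streamObservable = (
  reader: ReadableStreamDefaultReader<Uint8Array>
): Observable<string> =>
  new Observable<string>((observer) => {
    const decoder = new TextDecoder();
    let accumulated = '';

    const read = async () => {
      try {
        while (true) {
          const { done, value } = await reader.read();
          if (done) {
            observer.complete();
            return;
          }
          if (value) {
            accumulated += decoder.decode(value, { stream: true });
            observer.next(accumulated);
          }
        }
      } catch (err) {
        observer.error(err);
      }
    };

    void read();

    // Cancel the underlying stream if the subscriber unsubscribes mid-stream.
    return () => {
      void reader.cancel();
    };
  });
```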
To test:
- Toggle the flag named `isStream` in `kbn-elastic-assistant/impl/assistant/api.tsx`

To do in follow up:
- `isStream` feature flag