
Add optional prompt template transformation in input transform #379

Merged · 3 commits into opensearch-project:main on Sep 16, 2024

Conversation

@ohltyler (Member) commented Sep 13, 2024

Description

For RAG use cases, users may want to configure prompt templates directly in the ML processors, dynamically injecting data inputs into the prompt. For example, a user might collect a list of documents as context to pass to an LLM, which summarizes them and returns a human-readable response.

This PR adds a component in the input transform modal to view the original prompt template, and the prompt template with any injected input values (if applicable).

More details:

  • adds prompt-related state vars and custom hooks when a prompt and/or input parameters are populated in InputTransformModal
  • dynamically renders the new components if a valid prompt field is found in the processor's model_config field
  • allows toggling between showing the prompt or not
  • allows toggling between showing the original prompt, or the transformed version, with injected parameter values
  • disables toggling and auto-selects an option depending on whether the transformed prompt is populated
  • also: defaults all of the JSON fields to have wrap=true so long values (e.g., prompts) don't stretch out on one line
  • also: minor var name updates to be consistent based on input/output contexts
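The core transformation the modal previews is substituting input parameter values into the prompt template. A minimal sketch of that substitution, assuming OpenSearch-style `${parameters.<name>}` placeholders (the function name `injectParameters` is illustrative, not the plugin's actual API):

```typescript
// Substitute ${parameters.<name>} placeholders in a prompt template with
// concrete values. Placeholders with no matching value are left untouched,
// so the "original" vs. "transformed" views can be diffed meaningfully.
function injectParameters(
  template: string,
  params: Record<string, string>
): string {
  return template.replace(
    /\$\{parameters\.([a-zA-Z0-9_]+)\}/g,
    (match, name: string) => (name in params ? params[name] : match)
  );
}

const template =
  'Summarize the following documents:\n${parameters.context}';
const transformed = injectParameters(template, {
  context: 'doc1 text...\ndoc2 text...',
});
// transformed now embeds the document text in place of ${parameters.context}
```

When no parameters are populated, the transformed output equals the original template, which is why the modal can auto-select which view to show based on whether any values were injected.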

Note that this is just a first iteration of exposing the prompt template in the transform modal and viewing the final template the model will receive. Future iterations will likely change it, handle more edge-case states, support direct prompt manipulation, and add more detail.

Demo video, showing the prompt template being populated with parameter values:

screen-capture.22.webm

Issues resolved

Makes progress on #380

Check List

  • Commits are signed per the DCO using --signoff

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.

@ohltyler ohltyler marked this pull request as ready for review September 16, 2024 16:17
Signed-off-by: Tyler Ohlsen <[email protected]>
@saimedhi saimedhi merged commit 50cd402 into opensearch-project:main Sep 16, 2024
6 checks passed
opensearch-trigger-bot bot pushed a commit that referenced this pull request Sep 16, 2024
* Add toggle-able prompt template transformation in input transform

Signed-off-by: Tyler Ohlsen <[email protected]>

* auto toggling based on parameters empty or not; var name updates

Signed-off-by: Tyler Ohlsen <[email protected]>

* remove TODO

Signed-off-by: Tyler Ohlsen <[email protected]>

---------

Signed-off-by: Tyler Ohlsen <[email protected]>
(cherry picked from commit 50cd402)
@ohltyler ohltyler deleted the prompt-template branch September 16, 2024 16:40
ohltyler added a commit that referenced this pull request Sep 16, 2024
…#381)


Signed-off-by: Tyler Ohlsen <[email protected]>
(cherry picked from commit 50cd402)

Co-authored-by: Tyler Ohlsen <[email protected]>