
BYO LLM Image Update #6368

Open
wants to merge 1 commit into main
Conversation

Contributor
@benironside commented Dec 18, 2024

Fixes #6276 by updating an outdated image in the BYO LLM guide.

Preview: ESS, serverless


A documentation preview will be available soon.

Request a new doc build by commenting:
  • Rebuild this PR: `run docs-build`
  • Rebuild this PR and all Elastic docs: `run docs-build rebuild`

`run docs-build` is much faster than `run docs-build rebuild`. A rebuild should only be needed in rare situations.

If your PR continues to fail for an unknown reason, the doc build pipeline may be broken. Elastic employees can check the pipeline status here.

Contributor
mergify bot commented Dec 18, 2024

This pull request now has conflicts. Could you fix it, @benironside? 🙏
To fix this pull request, you can check it out locally. See documentation: https://help.github.com/articles/checking-out-pull-requests-locally/

git fetch upstream                  # fetch the latest refs from upstream
git checkout -b 6276-update-image-byo-llm upstream/6276-update-image-byo-llm
git merge upstream/main             # merge main and resolve any conflicts
git push upstream 6276-update-image-byo-llm

Contributor
@jmikell821 left a comment


Left an alt suggestion, but will approve.

@@ -135,7 +135,7 @@ Use the following commands in your CLI:

image::images/lms-cli-welcome.png[The CLI interface during execution of initial LM Studio commands]

-After the model loads, you should see a `Model loaded successfully` message in the CLI.
+After the model loads, you should see a `Model loaded successfully` message in the CLI. Select a model using the arrow keys and Enter.
Contributor

Suggested change
After the model loads, you should see a `Model loaded successfully` message in the CLI. Select a model using the arrow keys and Enter.
After the model loads, you should see a `Model loaded successfully` message in the CLI. Select a model using the arrow and **Enter** keys.
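For context on what the step being documented enables: once a model is loaded through the LM Studio CLI, LM Studio can serve it over an OpenAI-compatible local API (by default at http://localhost:1234/v1), which is what a "bring your own LLM" connector would call. A minimal sketch of building a chat request body for such an endpoint — the helper name and model identifier below are placeholders, not from the guide:

```python
import json

def build_chat_request(model, prompt, temperature=0.2):
    """Build an OpenAI-style chat-completions request body.

    `model` is whatever identifier was loaded in the LM Studio CLI;
    this helper is illustrative, not part of the Elastic docs.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

body = build_chat_request("my-local-model", "Summarize this alert")
print(json.dumps(body, sort_keys=True))
```

The serialized body would be POSTed to `<base-url>/chat/completions` with any HTTP client; the response follows the OpenAI chat-completions shape.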

Successfully merging this pull request may close these issues.

[Request] Update Docs for Connecting Local LLM
2 participants