Commit
feat(docs): update doc for ipex-llm (#1968)
shane-huang authored Jul 8, 2024
1 parent b687dc8 commit 19a7c06
Showing 1 changed file with 6 additions and 0 deletions.
6 changes: 6 additions & 0 deletions fern/docs/pages/manual/llms.mdx
@@ -193,3 +193,9 @@ or

When the server starts, it will print the log message *Application startup complete*.
Navigate to http://localhost:8001 to use the Gradio UI or to http://localhost:8001/docs (API section) to try the API.

### Using IPEX-LLM

For a fully private setup on Intel GPUs (such as a local PC with an iGPU, or discrete GPUs like Arc, Flex, and Max), you can use [IPEX-LLM](https://github.com/intel-analytics/ipex-llm).

To deploy Ollama and pull models using IPEX-LLM, please refer to [this guide](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/ollama_quickstart.html). Then, follow the same steps outlined in the [Using Ollama](#using-ollama) section to create a `settings-ollama.yaml` profile and run the PrivateGPT server, as sketched below.
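
As a rough illustration only (the exact model name and profile contents depend on your environment and on the [Using Ollama](#using-ollama) section), the flow looks roughly like this:

```shell
# Pull a model through the IPEX-LLM-enabled Ollama build
# ("mistral" here is just an example model name)
ollama pull mistral

# Start PrivateGPT with the Ollama settings profile
# (PGPT_PROFILES selects settings-ollama.yaml; adjust to your setup)
PGPT_PROFILES=ollama make run
```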
