From bad4e12d63469cb90f6e734c8e1a5cf10a1493a0 Mon Sep 17 00:00:00 2001
From: Erik Dunteman <44653944+erik-dunteman@users.noreply.github.com>
Date: Sat, 9 Mar 2024 17:42:29 -0800
Subject: [PATCH] Update README.md

---
 README.md | 7 ++-----
 1 file changed, 2 insertions(+), 5 deletions(-)

diff --git a/README.md b/README.md
index 92c1166..84ac5f0 100644
--- a/README.md
+++ b/README.md
@@ -132,7 +132,7 @@ ai = Fructose(model = "gpt-3.5-turbo")
 ```
 
-## Custom Clients
+## Custom Clients and Alternative APIs
 
 You can configure your own OpenAI client and use it with Fructose. This allows you to do things like route your calls through proxies or OpenAI-compatible LLM APIs.
 
 ```python
@@ -151,10 +151,7 @@ client = OpenAI(
 
 ai = Fructose(client = client)
 ```
 
-### Alternative LLM APIs:
-Note that Fructose uses OpenAI's `json` return mode for all calls, and `tools` API for the `uses` function-calling.
-
-Not all alternative LLM providers support these features, and those that do likely exhibit subtle differences in the API.
+Note that Fructose uses OpenAI's `json` return mode for all calls, and the `tools` API for `uses` function-calling. Not all alternative LLM providers support these features, and those that do likely exhibit subtle differences in their APIs.
 
 You're free to point to alternative APIs using a custom client (above) or the `OPENAI_BASE_URL` environment variable, but Fructose does not officially support it.