
Commit

Sync with upstream
Крестников Константин Николаевич authored and committed Nov 9, 2023
2 parents 90e5b04 + e188f2e commit 82677ca
Showing 43 changed files with 1,908 additions and 1,007 deletions.
4 changes: 4 additions & 0 deletions .clabot
@@ -0,0 +1,4 @@
{
"contributors": ["eyurtsev", "hwchase17", "nfcampos", "efriis", "jacoblee93", "dqbd", "kreneskyp", "adarsh-jha-dev", "harris", "baskaryan", "hinthornw", "bracesproul", "jakerachleff"],
"message": "Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have the username {{usersWithoutCLA}} on file. In order for us to review and merge your code, please complete the Individual Contributor License Agreement here https://forms.gle/Ljhqvt9Gdi1N385W6 .\n\nThis process is done manually on our side, so after signing the form one of the maintainers will add you to the contributors list.\n\nFor more details about why we have a CLA and other contribution guidelines please see: https://github.com/langchain-ai/langserve/blob/main/CONTRIBUTING.md."
}
1 change: 1 addition & 0 deletions .github/workflows/_pydantic_compatibility.yml
@@ -13,6 +13,7 @@ env:

jobs:
build:
timeout-minutes: 5
defaults:
run:
working-directory: ${{ inputs.working-directory }}
9 changes: 8 additions & 1 deletion .github/workflows/langserve_ci.yml
@@ -39,7 +39,14 @@ jobs:
working-directory: .
secrets: inherit

pydantic-compatibility:
uses:
./.github/workflows/_pydantic_compatibility.yml
with:
working-directory: .
secrets: inherit
test:
timeout-minutes: 5
runs-on: ubuntu-latest
defaults:
run:
@@ -51,7 +58,7 @@ jobs:
- "3.9"
- "3.10"
- "3.11"
name: Python ${{ matrix.python-version }} extended tests
name: Python ${{ matrix.python-version }} tests
steps:
- uses: actions/checkout@v3

4 changes: 2 additions & 2 deletions Makefile
@@ -33,10 +33,10 @@ lint_diff format_diff: PYTHON_FILES=$(shell git diff --relative=. --name-only --

lint lint_diff:
poetry run ruff .
poetry run black $(PYTHON_FILES) --check
poetry run ruff format $(PYTHON_FILES) --check

format format_diff:
poetry run black $(PYTHON_FILES)
poetry run ruff format $(PYTHON_FILES)
poetry run ruff --select I --fix $(PYTHON_FILES)

spell_check:
27 changes: 22 additions & 5 deletions README.md
@@ -1,5 +1,10 @@
# GigaServe 🦜️🏓 = LangServe + GigaChat

[![Release Notes](https://img.shields.io/github/release/langchain-ai/langserve)](https://github.com/langchain-ai/langserve/releases)
[![Downloads](https://static.pepy.tech/badge/langserve/month)](https://pepy.tech/project/langserve)
[![Open Issues](https://img.shields.io/github/issues-raw/langchain-ai/langserve)](https://github.com/langchain-ai/langserve/issues)
[![](https://dcbadge.vercel.app/api/server/6adMQxSpJS?compact=true&style=flat)](https://discord.com/channels/1038097195422978059/1170024642245832774)

🚩 We will be releasing a hosted version of LangServe for one-click deployments of LangChain applications. [Sign up here](https://airtable.com/app0hN6sd93QcKubv/shrAjst60xXa6quV2) to get on the waitlist.

## Overview
@@ -26,7 +31,7 @@ A javascript client is available in [LangChainJS](https://js.langchain.com/docs/
### Limitations

- Client callbacks are not yet supported for events that originate on the server
- Does not work with [pydantic v2 yet](https://github.com/tiangolo/fastapi/issues/10360)
- OpenAPI docs will not be generated when using Pydantic V2. FastAPI does not support [mixing pydantic v1 and v2 namespaces](https://github.com/tiangolo/fastapi/issues/10360). See section below for more details.

## Hosted LangServe

@@ -38,10 +43,10 @@ We will be releasing a hosted version of LangServe for one-click deployments of

## LangChain CLI 🛠️

Use the `LangChain` CLI to bootstrap a `LangServe` project quickly.
Use the `GigaChain` CLI to bootstrap a `GigaChain` project quickly.

To use the langchain CLI make sure that you have a recent version of `gigachain-cli`
installed. You can install it with `pip install -U "gigachain-cli[serve]"`.
To use the gigachain CLI make sure that you have a recent version of `gigachain-cli`
installed. You can install it with `pip install -U gigachain-cli`.

```sh
langchain app new ../path/to/directory
@@ -257,6 +262,15 @@ You can deploy to GCP Cloud Run using the following command:
gcloud run deploy [your-service-name] --source . --port 8001 --allow-unauthenticated --region us-central1 --set-env-vars=OPENAI_API_KEY=your_key
```

## Pydantic

LangServe provides support for Pydantic 2 with some limitations.

1. OpenAPI docs will not be generated for invoke/batch/stream/stream_log when using Pydantic V2. FastAPI does not support [mixing pydantic v1 and v2 namespaces](https://github.com/tiangolo/fastapi/issues/10360).
2. LangChain uses the v1 namespace in Pydantic v2. Please read the [following guidelines to ensure compatibility with LangChain](https://github.com/langchain-ai/langchain/discussions/9337).

Except for these limitations, we expect the API endpoints, the playground and any other features to work as expected.
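
For example, here is a minimal sketch of keeping a server-side model in the v1 namespace so it works whether Pydantic 1 or Pydantic 2 is installed (the `Question` model is only a hypothetical illustration; the same import pattern appears in the snippet further below):

```python
# Minimal sketch, assuming Pydantic 2 may be installed alongside LangChain:
# import models from the v1 namespace so they stay compatible with LangChain,
# falling back to plain pydantic when Pydantic 1 is installed.
try:
    from pydantic.v1 import BaseModel, Field
except ImportError:
    from pydantic import BaseModel, Field


class Question(BaseModel):
    """Hypothetical input schema for a chain served with LangServe."""

    text: str = Field(..., description="The question to ask the model")
```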

## Advanced

### Files
@@ -381,7 +395,10 @@ that are uploaded as base64 encoded strings. Here's the full [example](https://g
Snippet:

```python
from pydantic import Field
try:
from pydantic.v1 import Field
except ImportError:
from pydantic import Field

from langserve import CustomUserType

2 changes: 1 addition & 1 deletion examples/agent/server.py
@@ -7,9 +7,9 @@
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.pydantic_v1 import BaseModel
from langchain.tools.render import format_tool_to_openai_function
from langchain.vectorstores import FAISS
from pydantic import BaseModel

from langserve import add_routes

120 changes: 103 additions & 17 deletions examples/configurable_chain/client.ipynb
@@ -18,28 +18,16 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"{'output': \"Why don't scientists trust atoms who play sports?\\n\\nBecause they make up everything!\",\n",
" 'callback_events': []}"
]
},
"execution_count": 1,
"metadata": {},
"output_type": "execute_result"
}
],
"outputs": [],
"source": [
"import requests\n",
"\n",
"inputs = {\"input\": {\"topic\": \"sports\"}}\n",
"response = requests.post(\"http://localhost:8000/invoke\", json=inputs)\n",
"response = requests.post(\"http://localhost:8000/configurable_temp/invoke\", json=inputs)\n",
"\n",
"response.json()"
]
@@ -61,7 +49,7 @@
"source": [
"from langserve import RemoteRunnable\n",
"\n",
"remote_runnable = RemoteRunnable(\"http://localhost:8000/\")"
"remote_runnable = RemoteRunnable(\"http://localhost:8000/configurable_temp\")"
]
},
{
@@ -177,6 +165,104 @@
" },\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Configurability Based on Request Properties\n",
"\n",
"If you want to change your chain invocation based on your request's properties,\n",
"you can do so with `add_routes`'s `per_req_config_modifier` method as follows:\n",
"\n",
"```python \n",
"\n",
"# Add another example route where you can configure the model based\n",
"# on properties of the request. This is useful for passing in API\n",
"# keys from request headers (WITH CAUTION) or using other properties\n",
"# of the request to configure the model.\n",
"def fetch_api_key_from_header(config: Dict[str, Any], req: Request) -> Dict[str, Any]:\n",
" if \"x-api-key\" in req.headers:\n",
" config[\"configurable\"][\"openai_api_key\"] = req.headers[\"x-api-key\"]\n",
" return config\n",
"\n",
"dynamic_auth_model = ChatOpenAI(openai_api_key='placeholder').configurable_fields(\n",
" openai_api_key=ConfigurableField(\n",
" id=\"openai_api_key\",\n",
" name=\"OpenAI API Key\",\n",
" description=(\n",
" \"API Key for OpenAI interactions\"\n",
" ),\n",
" ),\n",
")\n",
"\n",
"dynamic_auth_chain = dynamic_auth_model | StrOutputParser()\n",
"\n",
"add_routes(\n",
" app, \n",
" dynamic_auth_chain, \n",
" path=\"/auth_from_header\",\n",
" config_keys=[\"configurable\"], \n",
" per_req_config_modifier=fetch_api_key_from_header\n",
")\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we can see that our request to the model will only work if we have a specific request\n",
"header set:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The model will fail with an auth error\n",
"unauthenticated_response = requests.post(\n",
" \"http://localhost:8000/auth_from_header/invoke\", json={\"input\": \"hello\"}\n",
")\n",
"unauthenticated_response.json()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, ensure that you have run the following locally on your shell\n",
"```bash\n",
"export TEST_API_KEY=<INSERT MY KEY HERE>\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The model will succeed as long as the above shell script is run previously\n",
"import os\n",
"\n",
"test_key = os.environ[\"TEST_API_KEY\"]\n",
"authenticated_response = requests.post(\n",
" \"http://localhost:8000/auth_from_header/invoke\",\n",
" json={\"input\": \"hello\"},\n",
" headers={\"x-api-key\": test_key},\n",
")\n",
"authenticated_response.json()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@@ -195,7 +281,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.6"
"version": "3.9.18"
}
},
"nbformat": 4,
75 changes: 58 additions & 17 deletions examples/configurable_chain/server.py
@@ -6,7 +6,9 @@
1) Configurable Fields: Use this to specify values for a given initialization parameter
2) Configurable Alternatives: Use this to specify complete alternative runnables
"""
from fastapi import FastAPI
from typing import Any, Dict

from fastapi import FastAPI, HTTPException, Request
from fastapi.middleware.cors import CORSMiddleware
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
@@ -15,6 +17,25 @@

from langserve import add_routes

app = FastAPI(
title="LangChain Server",
version="1.0",
description="Spin up a simple api server using Langchain's Runnable interfaces",
)

# Set all CORS enabled origins
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
expose_headers=["*"],
)

###############################################################################
# EXAMPLE 1: Configure fields based on RunnableConfig #
###############################################################################
model = ChatOpenAI(temperature=0.5).configurable_alternatives(
ConfigurableField(
id="llm",
@@ -38,26 +59,46 @@
)
chain = prompt | model | StrOutputParser()

app = FastAPI(
title="GigaChain Server",
version="1.0",
description="Spin up a simple api server using Langchain's Runnable interfaces",
)
# Add routes requires you to specify which config keys are accepted
# specifically, you must accept `configurable` as a config key.
add_routes(app, chain, path="/configurable_temp", config_keys=["configurable"])

# Set all CORS enabled origins
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
expose_headers=["*"],

###############################################################################
# EXAMPLE 2: Configure fields based on Request metadata #
###############################################################################


# Add another example route where you can configure the model based
# on properties of the request. This is useful for passing in API
# keys from request headers (WITH CAUTION) or using other properties
# of the request to configure the model.
def fetch_api_key_from_header(config: Dict[str, Any], req: Request) -> Dict[str, Any]:
if "x-api-key" in req.headers:
config["configurable"]["openai_api_key"] = req.headers["x-api-key"]
else:
raise HTTPException(401, "No API key provided")

return config


dynamic_auth_model = ChatOpenAI(openai_api_key="placeholder").configurable_fields(
openai_api_key=ConfigurableField(
id="openai_api_key",
name="OpenAI API Key",
description=("API Key for OpenAI interactions"),
),
)

dynamic_auth_chain = dynamic_auth_model | StrOutputParser()

# Add routes requires you to specify which config keys are accepted
# specifically, you must accept `configurable` as a config key.
add_routes(app, chain, config_keys=["configurable"])
add_routes(
app,
dynamic_auth_chain,
path="/auth_from_header",
config_keys=["configurable"],
per_req_config_modifier=fetch_api_key_from_header,
)

if __name__ == "__main__":
import uvicorn
2 changes: 1 addition & 1 deletion examples/conversational_retrieval_chain/server.py
@@ -6,8 +6,8 @@
from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.pydantic_v1 import BaseModel, Field
from langchain.vectorstores import FAISS
from pydantic import BaseModel, Field

from langserve import add_routes

2 changes: 1 addition & 1 deletion examples/file_processing/server.py
@@ -17,8 +17,8 @@
from fastapi import FastAPI
from langchain.document_loaders.blob_loaders import Blob
from langchain.document_loaders.parsers.pdf import PDFMinerParser
from langchain.pydantic_v1 import Field
from langchain.schema.runnable import RunnableLambda
from pydantic import Field

from langserve import CustomUserType, add_routes

