____ ____ ______ _ __ ______ __ __ ___ ______
/ __ \ / __ \ / ____// | / / / ____// / / // | /_ __/
/ / / // /_/ // __/ / |/ / / / / /_/ // /| | / /
/ /_/ // ____// /___ / /| / / /___ / __ // ___ | / /
\____//_/ /_____//_/ |_/ \____//_/ /_//_/ |_|/_/
- OpenChat is an easy-to-use open-source chatting framework.
- OpenChat supports 40+ dialogue models based on neural networks.
- You can talk with an AI with only one line of code.
bot_id: The name of the bot.
text: The message to send to the bot.
topic: The topic of the chat.
agent: The model type of the bot you want to chat with. One of:
["blender.small", "blender.medium", "dialogpt.small",
 "dialogpt.medium", "gptneo.small", "gptneo.large"]
Output: {"0": [AI's chat message - string]}
- Input
※ "S2lt" is the user ID to use when chatting.
※ You can change the "S2lt" of the url to any name you want.
bot_id: Mr.Bot
text: Hey, What are you going to do?
topic: weekend
agent: DIALOGPT.MEDIUM
curl -X POST "https://main-openchat-fpem123.endpoint.ainize.ai/send/S2lt" -H "accept: application/json" -H "Content-Type: multipart/form-data" -F "bot_id=Mr.Bot" -F "text=Hey, What are you going to do?" -F "topic=weekend" -F "agent=DIALOGPT.MEDIUM"
- Output
{ "output": "I don't know." }
Demo page: End-point
API page: In Ainize
pip install openchat
- OpenChat supports 40+ dialogue models based on neural networks.
- Use these names as the model='name' parameter when you create OpenChat. Click here if you want to check the supported models.
- gptneo.small
- gptneo.medium
- gptneo.large
- gptneo.xlarge
- blender.small
- blender.medium
- blender.large
- blender.xlarge
- blender.xxlarge
- dialogpt.small
- dialogpt.medium
- dialogpt.large
- dodecathlon.all_tasks_mt
- dodecathlon.convai2
- dodecathlon.wizard_of_wikipedia
- dodecathlon.empathetic_dialogues
- dodecathlon.eli5
- dodecathlon.reddit
- dodecathlon.twitter
- dodecathlon.ubuntu
- dodecathlon.image_chat
- dodecathlon.cornell_movie
- dodecathlon.light_dialog
- dodecathlon.daily_dialog
- reddit.xlarge
- reddit.xxlarge
- safety.offensive
- safety.sensitive
- unlikelihood.wizard_of_wikipedia.context_and_label
- unlikelihood.wizard_of_wikipedia.context
- unlikelihood.wizard_of_wikipedia.label
- unlikelihood.convai2.context_and_label
- unlikelihood.convai2.context
- unlikelihood.convai2.label
- unlikelihood.convai2.vocab.alpha.1e-0
- unlikelihood.convai2.vocab.alpha.1e-1
- unlikelihood.convai2.vocab.alpha.1e-2
- unlikelihood.convai2.vocab.alpha.1e-3
- unlikelihood.eli5.context_and_label
- unlikelihood.eli5.context
- unlikelihood.eli5.label
- wizard_of_wikipedia.end2end_generator
- Just import and create an object. That's all.
>>> from openchat import OpenChat
>>> OpenChat(model="blender.medium", device="cpu")
- Set the parameter device='cuda' if you want to use GPU acceleration.
>>> from openchat import OpenChat
>>> OpenChat(model="blender.medium", device="cuda")
- Set **kwargs if you want to change decoding options.
  - method (str): one of ["greedy", "beam", "top_k", "nucleus"]
  - num_beams (int): size of beam search
  - top_k (int): K value for top-k sampling
  - top_p (float): P value for nucleus sampling
  - no_repeat_ngram_size (int): beam search n-gram blocking size for removing repetition
  - length_penalty (float): length penalty (1.0=None, UP=Longer, DOWN=Shorter)
- Decoding options must be passed as keyword arguments, not positional arguments.
>>> from openchat import OpenChat
>>> OpenChat(
... model="blender.medium",
... device="cpu",
... method="top_k",
... top_k=20,
... no_repeat_ngram_size=3,
... length_penalty=0.6,
... )
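- For example, beam search can be selected with the same keyword arguments. This is a sketch; the values below are illustrative, not recommended defaults:
>>> from openchat import OpenChat
>>> OpenChat(
...     model="blender.medium",
...     device="cpu",
...     method="beam",
...     num_beams=5,
...     no_repeat_ngram_size=3,
...     length_penalty=1.2,
... )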
- For the safety.offensive model, the parameter method must be one of ["both", "string-match", "bert"].
>>> from openchat import OpenChat
>>> OpenChat(
... model="safety.offensive",
...     device="cpu",
...     method="both"  # ---> both, string-match, bert
... )
- The GPT-Neo model was released in the EleutherAI/gpt-neo repository.
- It is a GPT-2-like causal language model trained on the Pile dataset.
- OpenChat supports prompt-based dialogues, as shown above, via GPT-Neo.
- The models below provide custom prompt setting (* means all models):
  - gptneo.*
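- A minimal sketch of loading one of these models (the custom prompt itself is assumed to be entered when the chat starts, which is not shown here):
>>> from openchat import OpenChat
>>> OpenChat(model="gptneo.small", device="cpu")  # any gptneo.* model supports custom prompts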
- ConvAI2 is one of the most famous conversational AI challenges about persona.
- OpenChat provides custom persona setting like the above image.
- The models below provide custom persona setting (* means all models):
  - blender.*
  - dodecathlon.convai2
  - unlikelihood.convai2.*
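- Loading a persona-capable model works the same way (a sketch using one of the models listed above; the persona itself is assumed to be set at chat time):
>>> from openchat import OpenChat
>>> OpenChat(model="dodecathlon.convai2", device="cpu")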
- Wizard of Wikipedia is one of the most famous knowledge-grounded dialogue datasets.
- OpenChat provides custom topic setting like the above image.
- The models below provide custom topic setting (* means all models):
  - wizard_of_wikipedia.end2end_generator
  - dodecathlon.wizard_of_wikipedia
  - unlikelihood.wizard_of_wikipedia.*
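- Likewise, a topic-capable model is loaded with the same constructor (a sketch; the topic is assumed to be chosen at chat time):
>>> from openchat import OpenChat
>>> OpenChat(model="wizard_of_wikipedia.end2end_generator", device="cpu")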
- OpenChat provides dialog safety models to help you design conversation models.
- The models below provide dialog safety features:
  - safety.offensive: offensive words classification
  - safety.sensitive: sensitive topic classification
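- The safety.offensive example shown earlier applies here as well; the sensitive-topic classifier is loaded the same way (a sketch):
>>> from openchat import OpenChat
>>> OpenChat(model="safety.sensitive", device="cpu")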
- OpenChat is not a finished library, but a growing one.
- I plan to add the following features in the near future.
- v1.0: Support huggingface transformers for DialoGPT and Blender.
- v1.1: Support parlai for various dialogue generation tasks.
- v1.2: Support pytorch-lightning for fine-tuning using GPU & TPU.
- v1.3: Support deepspeed for huge model inference like Reddit 9.4B.
- v1.4: Add retrieval-based dialogue models.
- v1.5: Add non-parlai models (e.g. Baidu PLATO-2, ...)
- v1.6: Easy deployment to messengers (e.g. Facebook, Whatsapp, ...)
- v1.7: Support database (e.g. PostgreSQL, MySQL, ...)
Copyright 2021 Hyunwoong Ko.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.