Pyroserenus/tgi-kai-bridge
tgi-kai-bridge

Minimal API translation layer that makes text-generation-inference (TGI) accessible to KoboldAI API clients, including KoboldAI, SillyTavern, and AI-Horde-Worker
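At its core, the translation maps a KoboldAI-style generation request onto TGI's `/generate` request schema. A minimal sketch of that mapping, assuming a typical set of KoboldAI United fields (`prompt`, `max_length`, `temperature`, `top_p`, `rep_pen`); the parameters the bridge actually translates may differ:

```python
def kai_to_tgi(kai_req: dict) -> dict:
    """Map a KoboldAI /api/v1/generate payload to a TGI /generate payload.

    Only a few common sampling options are shown here; this is an
    illustration of the idea, not the bridge's actual code.
    """
    return {
        "inputs": kai_req["prompt"],
        "parameters": {
            "max_new_tokens": kai_req.get("max_length", 80),
            "temperature": kai_req.get("temperature", 0.7),
            "top_p": kai_req.get("top_p", 0.9),
            "repetition_penalty": kai_req.get("rep_pen", 1.1),
        },
    }

# Example: a KAI client request becomes a TGI request body
tgi_body = kai_to_tgi({"prompt": "Once upon a time", "max_length": 120})
```

The reverse direction is similar: TGI's `generated_text` field is wrapped back into the `{"results": [{"text": ...}]}` shape KAI clients expect.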

The included Dockerfile (not tested) bundles TGI and connects it to the AI Horde

Configuration

Environment Variables:

  • KAI_PORT - port to listen on for KAI clients (default 5000)
  • KAI_HOST - hostname to listen on (default 127.0.0.1)
  • TGI_ENDPOINT - URL of the TGI REST API (default http://127.0.0.1:3000)
  • TGI_MODE - additional information to add to the model name (default "")
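The variables above can be read with `os.environ`, falling back to the documented defaults. A sketch of that lookup (not the bridge's actual code):

```python
import os

# Documented defaults from the Configuration section above
KAI_PORT = int(os.environ.get("KAI_PORT", "5000"))
KAI_HOST = os.environ.get("KAI_HOST", "127.0.0.1")
TGI_ENDPOINT = os.environ.get("TGI_ENDPOINT", "http://127.0.0.1:3000")
TGI_MODE = os.environ.get("TGI_MODE", "")

print(f"listening on {KAI_HOST}:{KAI_PORT}, forwarding to {TGI_ENDPOINT}")
```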

Known Issues

  • some KAI options have no equivalent in the TGI API
  • no EOS token ban
  • outputs are whitespace-trimmed, which can cause words to run together (e.g. when doing story writing)
