LLamaStack.Web has no heavy dependencies and no extra frameworks beyond Bootstrap and jQuery, keeping the examples clean and easy to copy into your own project.
LLamaStack.Web uses SignalR websockets, which simplifies streaming responses and managing a model per connection.
You can set up models, prompts, and inference parameters in appsettings.json (see the sketch at the end of this section).
Models: You can add multiple models to the options for quick selection in the UI. The options are based on LLamaSharp's ModelParams, so each model is fully configurable.
Manage and interact with all your models in one simple UI.
Update inference parameters between each question or instruction.
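Below is a minimal sketch of what the model section of appsettings.json might look like. The section and key names (`LLamaStackConfig`, `Models`, `Name`) and the file paths are illustrative assumptions, not the project's exact schema; the `ModelPath`, `ContextSize`, and `GpuLayerCount` fields mirror properties of LLamaSharp's ModelParams. Check the appsettings.json shipped with LLamaStack.Web for the authoritative layout.

```json
{
  "LLamaStackConfig": {
    // Each entry appears in the UI's model selector.
    // Per-model properties map onto LLamaSharp ModelParams.
    "Models": [
      {
        "Name": "WizardLM-7B",
        "ModelPath": "D:\\Models\\wizardLM-7B.ggmlv3.q4_0.bin",
        "ContextSize": 2048,
        "GpuLayerCount": 16
      },
      {
        "Name": "Llama-2-13B",
        "ModelPath": "D:\\Models\\llama-2-13b.Q4_K_M.gguf",
        "ContextSize": 4096,
        "GpuLayerCount": 32
      }
    ]
  }
}
```

Prompt presets and default inference parameters are configured in the same file in a similar fashion; the defaults simply pre-fill the UI and can be changed before each request.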