Releases: 3x3cut0r/llama-cpp-python-streamlit
v0.4.2
v0.4.1
v0.4.0
HIGHLIGHT: added context support for all endpoints, fixed markdown styling
- Added "enable context?" toggle to all endpoints to turn context on/off (default = True)
- Fixed styling for answers containing code blocks and markdown elements
- Changed default values for stop, system_content and prompt
- Default model settings can now be stored inside src/config.json
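The notes above mention storing default model settings in src/config.json; a minimal sketch of what such a file could look like is shown below. The api_url and page_title keys are taken from the v0.2.0 notes, and n_ctx from v0.3.0; the exact values and the remaining keys (stop, system_content, prompt) are assumptions for illustration.

```json
{
    "api_url": "http://localhost:8000",
    "page_title": "Model: Llama-2-7b-Chat",
    "n_ctx": 2048,
    "stop": ["</s>"],
    "system_content": "You are a helpful assistant.",
    "prompt": ""
}
```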
v0.3.0
HIGHLIGHT: added context support for /v1/chat/completions
- Added "enable context?" toggle to Model Settings to turn on/off context (default = True)
- Added n_ctx to config.json to set the context size when context is enabled
- The config is now loaded into session_state and read from session_state instead of directly from config
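The load-once pattern described in the last bullet can be sketched as follows. This is a hypothetical reconstruction, not the repository's actual code: a plain dict stands in for Streamlit's st.session_state (which behaves like a mutable mapping), and the function name and config path are assumptions.

```python
import json

def load_config(session_state, path="src/config.json"):
    """Load the JSON config into session_state on the first run;
    all later reads go through session_state instead of re-reading
    the file on every Streamlit rerun."""
    if "config" not in session_state:  # only hit the disk once
        with open(path) as f:
            session_state["config"] = json.load(f)
    return session_state["config"]
```

In a real Streamlit app the call would be `load_config(st.session_state)`, so the parsed config survives across reruns of the script.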
v0.2.0
UPDATED: Streamlit App with Model Settings and new Endpoints
- Updated the title of the streamlit app to "Model: Llama-2-7b-Chat"
- Updated the src/config.json file with the API URL and page title
- Refactored the streamlit app code to use the sidebar, request, and context functions
- Context is now handled within a Python dictionary
- The chat output now looks more like ChatGPT
- Updated README.md with an index section covering installation, configuration, usage, find me, and license information
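Handling context within a Python dictionary, as the v0.2.0 notes describe, could look roughly like the sketch below. The function names are hypothetical; the messages structure matches the role/content format that /v1/chat/completions-style endpoints expect.

```python
def new_context(system_content="You are a helpful assistant."):
    """Create the context as a plain Python dictionary: a system
    message plus the running list of user/assistant turns."""
    return {"messages": [{"role": "system", "content": system_content}]}

def add_turn(context, role, content):
    """Append one chat turn; the messages list maps directly onto
    the body of a /v1/chat/completions request."""
    context["messages"].append({"role": role, "content": content})
    return context
```

With context enabled, each new request would send the full messages list; with it disabled, only the latest user message would be sent.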