
Releases: 3x3cut0r/llama-cpp-python-streamlit

v0.4.2

14 Nov 12:49
  • Changed text_input to text_area to support multi-line input
  • Restructured src.header
  • Updated README.md

v0.4.1

14 Nov 11:20
e5a22e7
  • Fixed error: Connection broken: InvalidChunkLength(got length b'', 0 bytes read)
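This error is raised by the requests library when a chunked HTTP stream ends with an empty chunk. A minimal sketch of one way to tolerate it when streaming from the server (the function name, endpoint handling, and payload shape here are assumptions, not the repository's actual code):

```python
import requests

def stream_completion(url, payload):
    """Stream response lines from an HTTP endpoint, treating a truncated
    final chunk (the 'Connection broken: InvalidChunkLength' case) as a
    normal end of stream instead of an error."""
    try:
        with requests.post(url, json=payload, stream=True, timeout=120) as resp:
            resp.raise_for_status()
            for line in resp.iter_lines():
                if line:
                    yield line.decode("utf-8")
    except requests.exceptions.ChunkedEncodingError:
        # The server closed the connection with an empty chunk;
        # everything yielded so far is still valid output.
        return
```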

v0.4.0

08 Nov 13:32

HIGHLIGHT: added context support for all endpoints, fixed Markdown styling

  • Added an "enable context?" toggle to all endpoints to turn context on/off (default = True)
  • Fixed styling for answers containing code blocks and Markdown elements
  • Changed the default values for stop, system_content, and prompt
  • Default model settings can now be stored in src/config.json
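A minimal sketch of how defaults from src/config.json might be merged with built-in fallbacks; the specific keys (temperature, max_tokens, stop) are illustrative assumptions, not the file's actual contents:

```python
import json
from pathlib import Path

# Hypothetical built-in defaults; the real keys in src/config.json may differ.
DEFAULTS = {
    "temperature": 0.7,
    "max_tokens": 256,
    "stop": ["###"],
}

def load_settings(path="src/config.json"):
    """Return the built-in defaults, overridden by any values
    present in the config file (if it exists)."""
    settings = dict(DEFAULTS)
    cfg_file = Path(path)
    if cfg_file.exists():
        settings.update(json.loads(cfg_file.read_text()))
    return settings
```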

v0.3.0

07 Nov 17:24

HIGHLIGHT: added context support for /v1/chat/completions

  • Added an "enable context?" toggle to Model Settings to turn context on/off (default = True)
  • Added n_ctx to config.json to set the context size when context is enabled
  • The config is now loaded into session_state and read from there instead of from the config file directly
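The load-once pattern described above can be sketched as follows, using a plain dict as a stand-in for Streamlit's st.session_state (which behaves like a dict); the key names are illustrative assumptions:

```python
import json

def init_state(session_state, config_text):
    """Copy config values into session state only if they are not
    already set, so later reads (and any user edits made in the UI)
    come from session state rather than the config file."""
    config = json.loads(config_text)
    for key, value in config.items():
        session_state.setdefault(key, value)
    return session_state
```

In the actual app, `session_state` would be `st.session_state` and `config_text` the contents of src/config.json.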

v0.2.0

07 Nov 12:24
aa32cda

UPDATED: Streamlit App with Model Settings and new Endpoints

  • Updated the title of the streamlit app to "Model: Llama-2-7b-Chat"
  • Updated the src/config.json file with the API URL and page title
  • Refactored the streamlit app code to use the sidebar, request, and context functions
  • Context is now handled within a Python dictionary
  • The output now looks more like ChatGPT
  • Updated README.md with an index covering installation, configuration, usage, find me, and license sections
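One plausible shape for the dictionary-based context handling mentioned above (the function name, message format, and trimming policy are assumptions for illustration):

```python
def append_turn(context, role, content, n_keep=20):
    """Record one chat turn in a plain dict holding the message history,
    trimming old turns so the prompt stays within the context window."""
    context.setdefault("messages", []).append({"role": role, "content": content})
    # Keep only the most recent n_keep messages.
    context["messages"] = context["messages"][-n_keep:]
    return context
```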

v0.1.0

07 Nov 06:26
9c1a821

Initial release