Releases: ParisNeo/lollms-webui

V13 (feather)

07 Oct 22:03
  • Light as a feather
  • Removed the conda dependency

What's Changed

Full Changelog: v12...v13

V12 (Strawberry)

01 Sep 01:34
  • New RAG system
  • New version of Smart routing for enhanced LLM selection based on the complexity of the prompt
  • Upgraded UI
  • Added Apps zoo: apps can be built, tested, and shared with friends. There are already many apps (all free)
  • Added a dedicated page for the Personalities zoo
  • New services
  • New endpoints
  • A complete RESTful backend

v9.8

20 May 00:53
  • Fixed some security issues
  • Enhanced the UI:
    • The mounted personalities UI is more accessible; more personas can be loaded and viewed
    • Personalities can talk to each other easily
    • Fixed some errors in the webui
  • Enhanced settings:
    • The services zoo replaces the servers configuration
    • Services are now well organized by type (tts, stt, tti, ttt)
  • Third-generation personalities with access to function calls make it easy to build highly sophisticated personalities with unmatched capabilities.
  • Fixed the macOS installer

v9.6

21 Apr 22:01

Enhanced UI
Enhanced information about the generation process (number of tokens, token rate, warmup time, etc.)
Enhanced security
Code compression tools
Personalities can now load models (useful for model comparison and optimization)

v9.5

31 Mar 19:25

Enhanced security
Added ComfyUI interface
Added Ollama installer

v9.4

18 Mar 23:35

Release Notes - LoLLMs v9.4

We are excited to announce the release of LoLLMs v9.4, which brings significant enhancements and new features to our powerful language model system. This release focuses on expanding capabilities, improving user experience, and ensuring a more streamlined installation process.

Key Features and Improvements:

  • Skills Library Management:
    • Introduced a new skills library management system that allows users to easily manage and organize their skills.
    • Users can now categorize, search, and activate/deactivate skills with ease.
    • The skills library provides a centralized location for managing all installed skills.
  • Multiple New Bindings:
    • Added support for multiple new bindings, enabling seamless integration with various platforms and services.
    • Users can now connect LoLLMs with a wider range of external systems and APIs.
    • The new bindings expand the possibilities for automating tasks and enhancing the functionality of LoLLMs.
  • Enhanced Internet Usage:
    • Revamped the way LoLLMs interacts with the internet, providing a more efficient and secure experience.
    • Improved internet connectivity and data retrieval capabilities for faster and more reliable results.
    • Implemented robust security measures to protect user data and maintain privacy.
  • ComfyUI and Additional Services:
    • Integrated ComfyUI, a powerful and user-friendly interface for interacting with LoLLMs.
    • Added support for several new services, expanding the range of tasks and functionalities available to users.
    • These additions provide users with more options and flexibility in leveraging the capabilities of LoLLMs.

Important Changes:

  • Manual installation of LoLLMs is no longer supported or allowed.
  • Users must use the provided installation script to set up LoLLMs correctly.
  • The installation script handles the necessary dependencies, environment setup, and configuration.
  • This change ensures a consistent and reliable installation process for all users.
We highly recommend that all users upgrade to LoLLMs v9.4 to take advantage of the new features, improvements, and enhanced performance. Please refer to the updated documentation and installation instructions for a smooth transition.

Enhanced Security and Corrected Vulnerabilities:
LoLLMs v9.4 prioritizes security enhancements and vulnerability mitigation. We have conducted thorough audits, implemented multi-layered protection, strengthened authentication, applied security patches, and employed advanced encryption. These measures ensure the safety and integrity of user data and interactions within LoLLMs.

Thank you for choosing LoLLMs!

The LoLLMs Team (ParisNeo + Lollms)

v9.3

29 Feb 09:47
  • Enhanced security by adding stricter typing and sanitization of endpoints
  • Added a news section so users can stay informed about updates directly in the webui
  • Upgraded bindings libraries.
  • Fixed multimodal models and image imports
  • Moved all data to the discussion folder in order to simplify locating the discussion files
  • Added tokenization visualization to the playground
  • Support for markdown as a text input
  • CSVs are now usable in RAG thanks to a small hack that gives the AI access to the column names, so you can load a CSV and ask the AI about a specific entry (a sketch of the idea follows this list)
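
As an illustration of that idea (not the actual lollms code), the sketch below shows one way to make CSV rows self-describing for RAG by repeating the column names in every text chunk; the helper name and chunk format are assumptions:

```python
# Minimal sketch, not the lollms implementation: turn each CSV row into a
# self-describing text chunk ("column: value" pairs) so a RAG pipeline can
# relate values to their columns when answering questions about an entry.
import csv

def csv_to_rag_chunks(path: str) -> list[str]:
    """Return one text chunk per CSV row, with column names repeated inline."""
    chunks = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)  # the header row supplies the column names
        for row in reader:
            chunks.append(", ".join(f"{col}: {val}" for col, val in row.items()))
    return chunks

# Each chunk can then be embedded and indexed like any other document,
# so a question about a specific entry retrieves the matching row.
```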

If you appreciate this work, please consider starring the repository.

9.2

17 Feb 02:22

More security fixes
Added full audio-to-audio transcription to the playground
A few enhancements to the UI

v9.1

15 Feb 00:23

Vulnerability fixes:

  • CORS security fixed: the only allowed origin is now the webui itself (users can allow additional origins, but the default is to refuse access from any other website); a sketch of the idea follows this list

  • Path traversal: all endpoints that receive data from the user are now sanitized to prevent path traversal attacks

  • Code injection: sanitization has been added to the endpoints to prevent code injection, except for the execute_code endpoint, which is now automatically turned off if you expose lollms to the outside world

  • All os.system calls that use data from the user have been replaced with a more secure method

  • Users can deactivate code execution

  • Users can activate code validation, which makes the backend ask for confirmation whenever it receives a code execution command

  • Added the possibility to use HTTPS: just put your certificates in the certs subfolder of your personal folder

  • New configurations in the settings folder

  • New personas
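
A minimal sketch of how such an origin restriction can be wired in a FastAPI backend (lollms moved to FastAPI in v9.0); the exact lollms code may differ, and the localhost port shown here is an assumption:

```python
# Minimal sketch, not the exact lollms implementation: restrict CORS so that
# only the webui's own origin is accepted, refusing requests from any other
# website unless the user explicitly adds more origins.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Assumed default webui origin; any extra origins would come from user settings.
allowed_origins = ["http://localhost:9600"]

app.add_middleware(
    CORSMiddleware,
    allow_origins=allowed_origins,  # requests from any other origin are rejected
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```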

macOS users: after installing, go to the settings and select your hardware configuration.

Have fun

v9.0

27 Jan 21:10

Code fully moved to FastAPI

New bindings. Faster than ever.
Better binding installation parameters with fixed versions.

Created a new install method for Hugging Face, ExLlamaV2, and llama-cpp-python.
Added the conda library so that more complex dependencies can be installed directly from lollms.

New multi-tool paradigm to solve library version conflicts and incompatibilities between them.
Added Ollama client and server
Added vLLM client and server
Added Petals client and server

Full compatibility with the OpenAI API, allowing you to use any client application with lollms. For example, you can mount the Google Gemini binding and have lollms serve it to AutoGen or any other OpenAI-compatible client: just configure that client to use the lollms server instead. A minimal sketch follows.
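
A minimal sketch of pointing an OpenAI-compatible client at a lollms server; the base URL, port, and model name below are assumptions, so adjust them to your own setup:

```python
# Minimal sketch, assuming lollms exposes an OpenAI-compatible endpoint locally.
# The base_url, port, and model name are placeholders, not confirmed values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9600/v1",  # assumed address of the lollms server
    api_key="lollms",                     # placeholder; a local server may ignore it
)

response = client.chat.completions.create(
    model="lollms",  # hypothetical name; lollms routes to the mounted binding
    messages=[{"role": "user", "content": "Hello from an OpenAI-compatible client"}],
)
print(response.choices[0].message.content)
```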

New lollms generation interface that allows you to build your own apps using raw generation or persona-augmented generation through lollms.

An Unreal Engine plugin will be released to give life to your lollms characters.
An unreal engine plugin is gonna be released to give life to your lollms characters