v0.6.0
Pre-release
What's Changed
- Fixed bugs related to focused inputs (#6) and model selection (#5), along with other performance issues.
- Dropped the 'Mac'. Just HuggingChat. It's cleaner. (#8)
- Added basic local model inference support for GGUF models, with the ability to dynamically switch between the local and server model using a new keyboard shortcut, ⌘+shift+\ (modifiable in Settings); see the sketch after this list.
- Added a 'Components' pane in Settings for model download/import.
- Added the ability to import local text files (e.g. source code and plain text)
- Conversations can now be set to clear at preset intervals
- Added an acknowledgments section to the About page highlighting the open-source tools used within the app.
- Fixed app update server.
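
For context, here is a minimal sketch of how the local/server toggle and its default ⌘+shift+\ shortcut could be wired up in SwiftUI. The names (`InferenceBackend`, `ModelRouter`, `BackendCommands`) are hypothetical and purely illustrative; the app's actual implementation may differ.

```swift
import SwiftUI

// Hypothetical sketch of the local/server model toggle; type and property
// names are illustrative, not the app's actual code.
enum InferenceBackend {
    case localGGUF   // a locally downloaded GGUF model
    case server      // a server-side model
}

final class ModelRouter: ObservableObject {
    @Published var backend: InferenceBackend = .server

    func toggleBackend() {
        backend = (backend == .localGGUF) ? .server : .localGGUF
    }
}

struct BackendCommands: Commands {
    let router: ModelRouter

    var body: some Commands {
        CommandMenu("Model") {
            Button("Toggle Local/Server Model") {
                router.toggleBackend()
            }
            // Default binding; in the app the shortcut is user-configurable in Settings.
            .keyboardShortcut("\\", modifiers: [.command, .shift])
        }
    }
}
```

In a SwiftUI app this kind of command group would be attached to the main scene via `.commands { BackendCommands(router: router) }`.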
What To Test
- Test local generation with various models, with a special focus on user experience.
- Please test model management (e.g. import and deletion) in the dedicated Components pane.
- Import text and source code files into both local and server-side models
- Test long-context local inference
What's Next
- Codebase cleanup
- Add two more themes
- Quality of life improvements and stability fixes
- Open sourcing the code