v6.0.12
Highlights
This release contains many fixes and improvements, as well as new LLMs. Much of the work went into improving the development environment, code quality, error handling, and testing (both unit and integration).
What's Changed
- Add new LLMs.
- Add support for local Ollama models.
- Add a warning notification when deleting a file that is being generated by Generate Unit Tests.
- Improve the accessibility and discoverability of Cody actions.
- Improve Inline Edit stability and performance.
- Improve Cody authentication stability.
- Improve invalid/removed/expired access token handling.
- Improve the stability and performance of chat history and Inline Edit history.
- Deprecate some old LLMs.
- Fix the My Account tab getting stuck in "Loading...".
- Fix `ResponseErrorException: No default chat model found` in some scenarios.
- Fix `AlreadyDisposedException: Container is already disposed` in some scenarios.
- Fix manual triggering of autocomplete when automatic autocompletions are disabled.
- Fix invalid branch resolution in Sourcegraph actions.
Full Changelog: v6.0.3...v6.0.12