Commit

docs(changelog): add v0.21.0 release notes and remove unreleased entry (#3497)

* docs(changelog): add v0.21.0 release notes and remove unreleased entry

* update
wsxiaoys authored Dec 3, 2024
1 parent f6ec8f7 commit b4fba58
Showing 3 changed files with 39 additions and 3 deletions.

This file was deleted.

19 changes: 19 additions & 0 deletions .changes/v0.21.0.md
@@ -0,0 +1,19 @@
## v0.21.0 (2024-12-02)

### Notice

* Due to changes in the indexing format, the `~/.tabby/index` directory will be automatically removed before any further indexing jobs are run. It is expected that the indexing jobs will be re-run (instead of incrementally) after the upgrade.

### Features

* Support connecting to llamafile model backend.
* Display **Open** / **Closed** state for issues / pull requests in Answer Engine context card.
* Support deleting the entire thread in Answer Engine.
* Add rate limiter options for HTTP-powered model backends.

### Fixes and Improvements

* Fixed a panic that occurred when specifying a local model ([#3464](https://github.com/TabbyML/tabby/issues/3464)).
* Add pagination to Answer Engine threads.
* Fix Vulkan binary distributions.
* Improve the retry logic for chunk embedding computation in indexing job.
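The rate limiter feature above can be pictured with a minimal sketch of an HTTP-powered model backend entry in Tabby's `~/.tabby/config.toml`. This is an illustration only: the `rate_limit` table name and its `request_per_minute` field are assumptions inferred from the release note, not confirmed configuration keys.

```toml
# Hypothetical sketch of an HTTP-powered chat backend with a rate limit.
# The [model.chat.http] layout follows Tabby's HTTP backend convention;
# the rate_limit key and its field are assumed names for illustration.
[model.chat.http]
kind = "openai/chat"
model_name = "gpt-4o-mini"
api_endpoint = "https://api.openai.com/v1"
api_key = "sk-..."

# Assumed shape: cap outbound requests to the backend per minute.
rate_limit = { request_per_minute = 600 }
```

If a key like this exists, it would let deployments throttle traffic to third-party model APIs without changing application code.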
20 changes: 20 additions & 0 deletions CHANGELOG.md
Expand Up @@ -5,6 +5,26 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html),
and is generated by [Changie](https://github.com/miniscruff/changie).

## v0.21.0 (2024-12-02)

### Notice

* Due to changes in the indexing format, the `~/.tabby/index` directory will be automatically removed before any further indexing jobs are run. It is expected that the indexing jobs will be re-run (instead of incrementally) after the upgrade.

### Features

* Support connecting to llamafile model backend.
* Display **Open** / **Closed** state for issues / pull requests in Answer Engine context card.
* Support deleting the entire thread in Answer Engine.
* Add rate limiter options for HTTP-powered model backends.

### Fixes and Improvements

* Fixed a panic that occurred when specifying a local model ([#3464](https://github.com/TabbyML/tabby/issues/3464)).
* Add pagination to Answer Engine threads.
* Fix Vulkan binary distributions.
* Improve the retry logic for chunk embedding computation in indexing job.

## v0.20.0 (2024-11-08)

### Features