Adding Prompt Tuning to Missing Tables (#607)
Adding Prompt Tuning to all tables where it was missing and fixing
broken links in the README.
lenglaender authored Nov 23, 2023
1 parent c521436 commit 3a44848
Showing 3 changed files with 17 additions and 15 deletions.
13 changes: 7 additions & 6 deletions README.md
@@ -142,12 +142,13 @@ Currently, adapters integrates all architectures and methods listed below:
| MAD-X,<br> Invertible adapters | [Pfeiffer et al. (2020)](https://aclanthology.org/2020.emnlp-main.617/) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/04_Cross_Lingual_Transfer.ipynb) |
| AdapterDrop | [Rücklé et al. (2021)](https://arxiv.org/pdf/2010.11918.pdf) | [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/05_Adapter_Drop_Training.ipynb) |
| MAD-X 2.0,<br> Embedding training | [Pfeiffer et al. (2021)](https://arxiv.org/pdf/2012.15562.pdf) | [Docs: Embeddings](https://docs.adapterhub.ml/embeddings.html), [Notebook](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/08_NER_Wikiann.ipynb) |
-| Prefix Tuning | [Li and Liang (2021)](https://arxiv.org/pdf/2101.00190.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#prefix-tuning) |
-| Parallel adapters,<br> Mix-and-Match adapters | [He et al. (2021)](https://arxiv.org/pdf/2110.04366.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#mix-and-match-adapters) |
-| Compacter | [Mahabadi et al. (2021)](https://arxiv.org/pdf/2106.04647.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#compacter) |
-| LoRA | [Hu et al. (2021)](https://arxiv.org/pdf/2106.09685.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#lora) |
-| (IA)^3 | [Liu et al. (2022)](https://arxiv.org/pdf/2205.05638.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#ia-3) |
-| UniPELT | [Mao et al. (2022)](https://arxiv.org/pdf/2110.07577.pdf) | [Docs](https://docs.adapterhub.ml/overview.html#unipelt) |
+| Prefix Tuning | [Li and Liang (2021)](https://arxiv.org/pdf/2101.00190.pdf) | [Docs](https://docs.adapterhub.ml/methods.html#prefix-tuning) |
+| Parallel adapters,<br> Mix-and-Match adapters | [He et al. (2021)](https://arxiv.org/pdf/2110.04366.pdf) | [Docs](https://docs.adapterhub.ml/method_combinations.html#mix-and-match-adapters) |
+| Compacter | [Mahabadi et al. (2021)](https://arxiv.org/pdf/2106.04647.pdf) | [Docs](https://docs.adapterhub.ml/methods.html#compacter) |
+| LoRA | [Hu et al. (2021)](https://arxiv.org/pdf/2106.09685.pdf) | [Docs](https://docs.adapterhub.ml/methods.html#lora) |
+| (IA)^3 | [Liu et al. (2022)](https://arxiv.org/pdf/2205.05638.pdf) | [Docs](https://docs.adapterhub.ml/methods.html#ia-3) |
+| UniPELT | [Mao et al. (2022)](https://arxiv.org/pdf/2110.07577.pdf) | [Docs](https://docs.adapterhub.ml/method_combinations.html#unipelt) |
+| Prompt Tuning | [Lester et al. (2021)](https://aclanthology.org/2021.emnlp-main.243/) | [Docs](https://docs.adapterhub.ml/methods.html#prompt-tuning) |

## Supported Models

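As a usage note on the Prompt Tuning row added to the README table above: a minimal sketch of activating the method with the `adapters` library, assuming a BERT checkpoint; the adapter name `ptune` and `prompt_length=10` are illustrative choices, not values taken from this commit.

```python
# Sketch: train a soft prompt with the adapters library (names/values illustrative).
import adapters
from adapters import PromptTuningConfig
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
adapters.init(model)  # enable adapter support on a plain Transformers model

# Prompt Tuning (Lester et al., 2021): prepend trainable soft-prompt vectors
# to the embedded input and train only those vectors.
model.add_adapter("ptune", config=PromptTuningConfig(prompt_length=10))
model.train_adapter("ptune")  # freezes the base weights, trains only the soft prompt
```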
18 changes: 9 additions & 9 deletions docs/adapter_composition.md
@@ -40,15 +40,15 @@ The basic building blocks of the more advanced setups are objects derived from `AdapterCompositionBlock`,
each representing a different possibility to combine single adapters.
The following table gives an overview of the supported composition blocks and their support by different adapter methods.

-| Block | Bottleneck<br> Adapters | Prefix<br> Tuning | Compacter | LoRA | (IA)³ |
-| --- | --- | --- | --- | --- | --- |
-| [`Stack`](#stack) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) |
-| [`Fuse`](#fuse) | ✅ | | ✅ | | |
-| [`Split`](#split) | ✅ | | ✅ | | |
-| [`BatchSplit`](#batchsplit) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) |
-| [`Parallel`](#parallel) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) |
-| [Output averaging](#output-averaging) | ✅ | | ✅ | ✅(*) | ✅(*) |
-| [Parameter averaging](#parameter-averaging) | ✅ | ✅ | ✅ | ✅ | ✅ |
+| Block | Bottleneck<br> Adapters | Prefix<br> Tuning | Compacter | LoRA | (IA)³ | Prompt Tuning |
+| --- | --- | --- | --- | --- | --- | --- |
+| [`Stack`](#stack) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) | |
+| [`Fuse`](#fuse) | ✅ | | ✅ | | | |
+| [`Split`](#split) | ✅ | | ✅ | | | |
+| [`BatchSplit`](#batchsplit) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) | |
+| [`Parallel`](#parallel) | ✅ | ✅ | ✅ | ✅(*) | ✅(*) | |
+| [Output averaging](#output-averaging) | ✅ | | ✅ | ✅(*) | ✅(*) | |
+| [Parameter averaging](#parameter-averaging) | ✅ | ✅ | ✅ | ✅ | ✅ | |

(*) except for Deberta-v1, GPT-2.
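The blocks in the table above are activated by passing a composition object to `set_active_adapters`; a minimal sketch for `Stack` and `Parallel`, assuming two bottleneck adapters with the illustrative names `a` and `b` on an already-initialized `model`:

```python
# Sketch: compose two bottleneck adapters (adapter names are illustrative).
import adapters.composition as ac

model.add_adapter("a")  # default bottleneck adapter configuration
model.add_adapter("b")

# Stack: feed the output of adapter "a" into adapter "b" in every layer.
model.set_active_adapters(ac.Stack("a", "b"))

# Parallel: run "a" and "b" side by side on the same input, replicating the batch.
model.set_active_adapters(ac.Parallel("a", "b"))
```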

1 change: 1 addition & 0 deletions docs/overview.md
@@ -60,6 +60,7 @@ Identifiers and configuration classes are explained in more detail in the [next
| `ia3` | `IA3Config()` | [IA³](methods.html#ia-3) |
| `mam` | `MAMConfig()` | [Mix-and-Match Adapters](method_combinations.html#mix-and-match-adapters) |
| `unipelt` | `UniPELTConfig()` | [UniPELT](method_combinations.html#unipelt) |
+| `prompt_tuning` | `PromptTuningConfig()` | [Prompt Tuning](methods.html#prompt-tuning) |
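Per the table above, the string identifier and the configuration class are interchangeable when adding an adapter; a minimal sketch for the new `prompt_tuning` entry (the adapter names are illustrative):

```python
from adapters import PromptTuningConfig

# Equivalent ways to add a Prompt Tuning adapter: the identifier string
# resolves to the default configuration class listed in the table above.
model.add_adapter("p1", config="prompt_tuning")
model.add_adapter("p2", config=PromptTuningConfig())
```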

## Configuration

