
Implement Constrained Beam Search (Disjunctive Positive Constraint Decoding) #931

Open
JOHW85 opened this issue Mar 18, 2022 · 1 comment

Comments


JOHW85 commented Mar 18, 2022

Feature description

Constrained beam search lets the user exert control over the output of text generation by forcing certain terms (like phrase table entries) to appear in it.

Currently, beam search limits the user to just the highest-probability outputs. Implementing this feature would allow the user to force diverse outputs by requiring the model to include specified tokens across multiple generations.

This method is called "Disjunctive Positive Constraint Decoding", and it forces the generation process to generate sequences with the highest probabilities under the constraint of needing to include a set of provided tokens.

This "disjunctive" method is powerful in that it can handle morphological variants (e.g. lemmatized forms) of the forced tokens. For instance, when asking the model to autoregressively complete "Babies cry because" while forcing the generation to include the word "lonely", it can induce the model to generate sequences like "Babies cry because they are lonely", as well as "Babies cry because of their loneliness".
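The core bookkeeping behind a disjunctive constraint can be sketched in plain Python: the constraint is satisfied when any one of several alternative token sequences appears in the output. The class and method names below are purely illustrative, not the actual Hugging Face API (which lives in `generation_beam_constraints.py`):

```python
class DisjunctiveConstraint:
    """Sketch of a disjunctive positive constraint: satisfied when ANY one of
    several token-id sequences (e.g. the ids for "lonely" or "loneliness")
    appears in the generated output. Illustrative, not the HF implementation."""

    def __init__(self, alternatives):
        # alternatives: list of token-id sequences; matching any one of them
        # in full satisfies the constraint.
        self.alternatives = [list(a) for a in alternatives]
        self.reset()

    def reset(self):
        # Partial matches in progress, as (alternative index, next position).
        self.progress = []
        self.completed = False

    def advance(self, token_id):
        """Consume one generated token and update match progress."""
        if self.completed:
            return
        new_progress = []
        # Try to extend partial matches already in progress.
        for alt_idx, pos in self.progress:
            if self.alternatives[alt_idx][pos] == token_id:
                if pos + 1 == len(self.alternatives[alt_idx]):
                    self.completed = True
                    return
                new_progress.append((alt_idx, pos + 1))
        # Try to start a new match at this token.
        for alt_idx, alt in enumerate(self.alternatives):
            if alt[0] == token_id:
                if len(alt) == 1:
                    self.completed = True
                    return
                new_progress.append((alt_idx, 1))
        self.progress = new_progress

    def next_allowed_tokens(self):
        """Token ids that would make progress toward satisfying the constraint;
        constrained beam search boosts beams that emit one of these."""
        tokens = {alt[0] for alt in self.alternatives}
        for alt_idx, pos in self.progress:
            tokens.add(self.alternatives[alt_idx][pos])
        return tokens
```

During decoding, one such tracker per beam would be advanced with each emitted token, and beam allocation would reserve slots for beams that have made the most constraint progress.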

Relevant papers:
Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation
Improved Lexically Constrained Decoding for Translation and Monolingual Rewriting
Guided Generation of Cause and Effect

Example

More details can be found in the blog post:
https://huggingface.co/blog/constrained-beam-search

Implementation on Huggingface:
https://github.com/huggingface/transformers/blob/master/src/transformers/generation_beam_constraints.py

Original Feature request on Huggingface:
huggingface/transformers#14081 (comment)


kpu commented Mar 18, 2022

Aren't beam-search-based approaches deprecated in favor of model-based approaches?
See https://aclanthology.org/P19-1294/
Here's Marian's implementation of the above paper: https://github.com/marian-nmt/marian-examples/tree/master/forced-translation

Regarding "disjunctive" constraints, it would seem the natural way to do this is to provide all the options in the source (factors, special tokens, or a second input) and train. One can create such data by sampling from the target side and shuffling.
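The data-creation step suggested here (sample constraint tokens from the target side, shuffle, and attach them to the source) might be sketched as follows. The function name, `<sep>` token, and parameters are hypothetical, not Marian's actual pipeline:

```python
import random

def make_constrained_pair(src_tokens, tgt_tokens, max_hints=2, seed=None):
    """Augment a source sentence with target-side 'hint' tokens so a model
    can be trained to copy constraints into its output.
    Illustrative sketch only, not the marian-examples data pipeline."""
    rng = random.Random(seed)
    k = min(max_hints, len(tgt_tokens))
    hints = rng.sample(tgt_tokens, k)  # sample constraint tokens from target
    rng.shuffle(hints)                 # shuffle so ordering carries no signal
    # Append hints after a (hypothetical) separator token; at training time
    # the model learns to include them somewhere in its translation.
    return src_tokens + ["<sep>"] + hints, tgt_tokens

src, tgt = make_constrained_pair(
    ["Babys", "weinen", "weil"], ["babies", "cry", "because"],
    max_hints=1, seed=0,
)
```

At inference time the same separator format would carry the user's constraints, including disjunctive alternatives listed side by side.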
