marian-nmt / marian-dev

Fast Neural Machine Translation in C++ - development repository

Home Page: https://marian-nmt.github.io

Implement Constrained Beam Search (Disjunctive Positive Constraint Decoding)

JOHW85 opened this issue · comments

Feature description

Constrained beam search lets the user exert control over the output of text generation by forcing it to include certain terms (such as phrase table entries).

Currently, beam search limits the user to just the highest-probability outputs. Implementing this feature would allow the user to obtain diverse outputs by requiring the model to include different tokens across multiple generations.

This method is called "Disjunctive Positive Constraint Decoding": it forces the generation process to produce the highest-probability sequences that still satisfy the constraint of including a given set of tokens.

This "disjunctive" method is powerful in that it can handle lemmatizing these forced tokens. For instance, when asking the model to autoregressively generate the completion tokens from "Babies cry because" and want to force the generation to include the word "lonely", it can induce the model to generate sequences like "Babies cry because they are lonely", as well as "Babies cry because of their loneliness".

Relevant papers:
Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation
Improved Lexically Constrained Decoding for Translation and Monolingual Rewriting
Guided Generation of Cause and Effect

Example

More details can be found in the blog post:
https://huggingface.co/blog/constrained-beam-search
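
For illustration, here is a minimal sketch of the "lonely"/"loneliness" example above, written against the Huggingface `generate` API described in that blog post (assuming a `transformers` release that supports `force_words_ids`; the model choice and decoding settings are just placeholders):

```python
# Sketch of disjunctive constrained generation with Huggingface transformers.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One disjunctive constraint: the output must contain *one of* these surface forms.
force_flexible = ["lonely", "loneliness"]
force_words_ids = [
    tokenizer(force_flexible, add_prefix_space=True, add_special_tokens=False).input_ids,
]

input_ids = tokenizer("Babies cry because", return_tensors="pt").input_ids

outputs = model.generate(
    input_ids,
    force_words_ids=force_words_ids,
    num_beams=5,
    num_return_sequences=1,
    no_repeat_ngram_size=1,
    remove_invalid_values=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```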

Implementation on Huggingface:
https://github.com/huggingface/transformers/blob/master/src/transformers/generation_beam_constraints.py
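
That file implements per-hypothesis constraint objects that track how far each allowed token sequence has been matched so far. A simplified, self-contained sketch of the idea (names and structure are illustrative only, not the actual transformers code):

```python
# Simplified sketch of a disjunctive constraint tracker (illustrative only).
# The constraint is satisfied once any one of the candidate token sequences
# appears contiguously in the hypothesis.
class DisjunctiveConstraintSketch:
    def __init__(self, candidate_seqs):
        # e.g. [[token ids of "lonely"], [token ids of "loneliness"]]
        self.candidates = [list(seq) for seq in candidate_seqs]
        self.progress = [0] * len(self.candidates)  # matched prefix length per candidate
        self.completed = False

    def allowed_next_tokens(self):
        # Token ids that would advance at least one candidate sequence.
        if self.completed:
            return set()
        return {seq[p] for seq, p in zip(self.candidates, self.progress)}

    def update(self, token_id):
        # Advance candidates whose next expected token matches; reset the rest.
        if self.completed:
            return
        for i, seq in enumerate(self.candidates):
            if seq[self.progress[i]] == token_id:
                self.progress[i] += 1
                if self.progress[i] == len(seq):
                    self.completed = True
            else:
                self.progress[i] = 0
```

During decoding, a constrained beam search boosts or forces tokens from allowed_next_tokens() for unfinished constraints and groups hypotheses by how many constraints they have completed when pruning, along the lines of the dynamic beam allocation paper listed above.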

Original Feature request on Huggingface:
huggingface/transformers#14081 (comment)

Aren't beam-search-based approaches deprecated in favor of model-based approaches?
See https://aclanthology.org/P19-1294/
Here's Marian's implementation of the above paper: https://github.com/marian-nmt/marian-examples/tree/master/forced-translation

Regarding "disjunctive" constraints, it would seem the natural way to do this is provide all the options in the source (factors, special tokens, or second input) and train. One can create such data by sampling from the target side and shuffling.