ematvey / tensorflow-seq2seq-tutorials

Dynamic seq2seq in TensorFlow, step by step


[Suggestion] PAD added twice

opened this issue · comments

Your first tutorial has this code snippet:

decoder_targets_, _ = helpers.batch(
    [sequence + [EOS] + [PAD] * 2 for sequence in batch]
)
decoder_inputs_, _ = helpers.batch(
    [[EOS] + sequence + [PAD] * 2 for sequence in batch]
)

I'm opening this issue to suggest that, if you have time, you add an explanation of why PAD is appended twice, since I don't understand this part myself. Thanks for these tutorials; I'm finding them very helpful!

This bit was meant to illustrate that the decoder rollout length is somewhat arbitrary and is controlled by us. It's probably not an important implementation detail given the scope of tutorial 1, so I've dropped it.
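To make the point concrete, here is a minimal sketch of what the snippet produces. The `batch` function below is a stand-in for the tutorial's `helpers.batch` (it pads sequences to equal length and returns a time-major array); the token values assume the tutorial's convention of `PAD=0`, `EOS=1`. The extra PADs simply extend the rollout by two steps on which the decoder is trained to emit PAD:

```python
import numpy as np

PAD, EOS = 0, 1

def batch(sequences):
    # Stand-in for helpers.batch: pad to the longest sequence with PAD
    # and transpose to time-major shape [max_time, batch_size].
    max_len = max(len(s) for s in sequences)
    padded = [list(s) + [PAD] * (max_len - len(s)) for s in sequences]
    return np.array(padded).T

seqs = [[2, 3, 4], [5, 6]]

# Targets: the sequence, then EOS, then two extra PAD steps.
decoder_targets_ = batch([s + [EOS] + [PAD] * 2 for s in seqs])
# Inputs: EOS acting as the <GO> token, then the sequence, then two PADs.
decoder_inputs_ = batch([[EOS] + s + [PAD] * 2 for s in seqs])

# At step t the decoder reads decoder_inputs_[t] and is trained to emit
# decoder_targets_[t]; the two extra PADs just add trailing steps where
# the correct output is PAD. Both arrays have the same rollout length.
print(decoder_inputs_[:, 0])   # first sequence, time-major
print(decoder_targets_[:, 0])
```

Running this, the first column comes out as inputs `[1, 2, 3, 4, 0, 0]` against targets `[2, 3, 4, 1, 0, 0]`: the number of appended PADs only changes how many trailing "emit PAD" steps the rollout has, which is why the choice of two is arbitrary.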