ArdalanM / nlp-benchmarks

Some questions about the input layer.

zoeonly opened this issue · comments

commented

Hello, I have been reading the paper recently, and your code helps me a lot. But I have some questions about the input layer. I saw that embedding_dim is 16 and maxlen is 1024, and both are basically hard-coded. What if my embedding_dim is 64 or larger? Or, since my texts are really short, can I set maxlen to 200? P.S. the code links in the README may be confusing. ^_^
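For context on the question: both values are input-layer hyperparameters rather than fixed requirements. Below is a minimal sketch (plain NumPy, with hypothetical values, not taken from this repo's code) of how maxlen and embedding_dim interact at the input layer, assuming character IDs are padded or truncated to maxlen before an embedding lookup:

```python
import numpy as np

def pad_or_truncate(seq, maxlen, pad_id=0):
    """Truncate sequences longer than maxlen; pad shorter ones with pad_id."""
    return seq[:maxlen] + [pad_id] * max(0, maxlen - len(seq))

# Hypothetical hyperparameters: a shorter maxlen and a larger embedding_dim.
maxlen, vocab_size, embedding_dim = 200, 70, 64

# Embedding lookup table: one embedding_dim-sized vector per vocabulary ID.
embedding = np.random.randn(vocab_size, embedding_dim)

ids = pad_or_truncate([5, 12, 3], maxlen)   # list of length maxlen
vectors = embedding[np.array(ids)]          # shape (maxlen, embedding_dim)
```

Changing either value only changes the shapes flowing into the convolutional layers, so the rest of the network must be sized accordingly (e.g. any fully connected layer whose input size depends on maxlen).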