jayparks / tf-seq2seq

Sequence to sequence learning using TensorFlow.

How can I use pre-embedded data in this model?

andymogul opened this issue · comments

commented

I'm trying to use pre-trained embeddings as input, which means I don't want to use the model's embedding layer.
How should I do this? I need some help.

Hi andy!

Try adding this:

```python
# Non-trainable variable that will hold the pre-trained embeddings
embedding_variable = tf.Variable(
    tf.constant(0.0, shape=[self.encoder_vocab_size, embedding_size]),
    trainable=False, name='embedding')
# Placeholder to feed the pre-trained matrix, plus an op that copies it into the variable
self.encoder_embedding_placeholder = tf.placeholder(
    tf.float32, shape=[self.encoder_vocab_size, embedding_size], name='embedding_placeholder')
self.encoder_embeddings = embedding_variable.assign(self.encoder_embedding_placeholder)
# Look up the input token ids in the (frozen) embedding matrix
self.encoder_inputs_embedded = tf.nn.embedding_lookup(self.encoder_embeddings, self.encoder_inputs)
```
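To actually load the vectors, you would run the assign op once after building the graph, feeding your matrix through the placeholder, e.g. `sess.run(self.encoder_embeddings, feed_dict={self.encoder_embedding_placeholder: pretrained_matrix})` (TF 1.x session API; `pretrained_matrix` is whatever array holds your pre-trained vectors). Conceptually, `tf.nn.embedding_lookup` on a frozen matrix is just row indexing. A minimal NumPy sketch with a made-up 4-token vocabulary (all names and values here are hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical pre-trained embeddings: vocab of 4 tokens, embedding dim 3
pretrained_matrix = np.array([
    [0.0, 0.0, 0.0],   # id 0: <pad>
    [0.1, 0.2, 0.3],   # id 1
    [0.4, 0.5, 0.6],   # id 2
    [0.7, 0.8, 0.9],   # id 3: <eos>
], dtype=np.float32)

# encoder_inputs holds token ids; the lookup selects the matching rows
encoder_inputs = np.array([[1, 2, 3]])              # batch of one sentence, length 3
encoder_inputs_embedded = pretrained_matrix[encoder_inputs]

print(encoder_inputs_embedded.shape)                # (batch, time, embedding_size)
```

Because the variable is created with `trainable=False`, the optimizer never updates these rows, so the pre-trained vectors stay fixed throughout training.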