DemoVersion / theanolm

TheanoLM is a recurrent neural network language modeling tool implemented using Theano

TheanoLM

Introduction

TheanoLM is a recurrent neural network language modeling tool implemented using the Python library Theano. Theano allows the user to customize and extend the neural network very conveniently, while still generating highly efficient code that can utilize multiple GPUs or CPUs for parallel computation. TheanoLM allows the user to specify an arbitrary network architecture, and new layer types and optimization methods can be implemented easily.
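
As a rough illustration, a network architecture is given to TheanoLM as a description that lists the layers and how they connect to each other. The sketch below is only indicative; the exact keywords, layer types, and parameters should be checked against the TheanoLM documentation.

    input type=word name=word_input
    layer type=projection name=projection_layer input=word_input size=100
    layer type=lstm name=hidden_layer input=projection_layer size=300
    layer type=softmax name=output_layer input=hidden_layer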

TheanoLM can be used for rescoring n-best lists, decoding word lattices, or generating text. It can be called from the command line or from a Python script.
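
Training, evaluation, and text generation are exposed as subcommands of the theanolm command. The calls below are a hedged sketch: the file arguments are placeholders and the option names are assumptions, so verify them with theanolm --help or the documentation.

    # Train a model; --training-set and --validation-file are assumed option names.
    theanolm train model.h5 --training-set train.txt --validation-file valid.txt

    # Compute the perplexity of a test set with the trained model.
    theanolm score model.h5 test.txt

    # Generate text by sampling from the model.
    theanolm sample model.h5

Word lattices are rescored with a decode subcommand; its lattice and weight options are best taken from the documentation rather than guessed here.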

Implementations of many currently popular layer types are provided, such as long short-term memory (LSTM), gated recurrent units (GRU), bidirectional recurrent networks, and highway networks. Several Stochastic Gradient Descent (SGD) based optimizers are implemented, including RMSProp, AdaGrad, ADADELTA, and Adam. In addition to the standard cross-entropy cost, one can use sampling-based noise-contrastive estimation (NCE) or BlackOut.
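
As a hedged example of how an optimizer and training cost might be selected at training time, the command below assumes --optimization-method and --cost options; these names are assumptions and should be verified against theanolm train --help.

    # Train with the Adam optimizer and the NCE cost instead of the defaults.
    # Option names are assumptions, not confirmed from the documentation.
    theanolm train model.h5 --training-set train.txt --validation-file valid.txt \
        --optimization-method adam --cost nce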

About the project

TheanoLM is open source and licensed under the Apache License, Version 2.0. The source code is available on GitHub. Documentation can be read online on Read the Docs.

Author

About

License: Apache License 2.0


Languages

Python 92.6%
Shell 7.4%