sudharsan13296 / Hands-On-Deep-Learning-Algorithms-with-Python

Master Deep Learning Algorithms with Extensive Math by Implementing them using TensorFlow

Home Page: https://www.amazon.com/gp/product/B07LH43V8P/ref=dbs_a_def_rwt_hsch_vapi_tkin_p1_i3

About the book

Book Cover

Deep learning is one of the most popular domains in the artificial intelligence (AI) space, allowing you to develop multi-layered models of varying complexity. This book is designed to help you grasp everything from basic deep learning algorithms to the more advanced ones. Each algorithm is first explained intuitively; once you have that basic understanding, you will master the underlying math behind it and then learn how to implement it in TensorFlow, step by step.

The book covers almost all of the state-of-the-art deep learning algorithms. First, you will get a good understanding of the fundamentals of neural networks and several variants of the gradient descent algorithm. Later, you will explore RNNs, bidirectional RNNs, LSTM, GRU, seq2seq models, CNNs, capsule networks, and more. Then, you will master GANs and their many variants, along with several different types of autoencoders.

By the end of this book, you will be equipped with the skills you need to implement deep learning in your projects.

Get the book


Check out my Deep Reinforcement Learning Repo here.

Table of contents

  • 5.1. LSTM to the Rescue
  • 5.2. Understanding the LSTM cell
  • 5.3. Forward propagation in LSTM
  • 5.4. Backpropagation in LSTM
  • 5.5. Deriving backpropagation of LSTM Step by step
  • 5.6. Predicting Bitcoin's price using LSTM
  • 5.7. Gated Recurrent Units
  • 5.8. Understanding GRU cell
  • 5.9. Forward propagation in GRU cell
  • 5.10. Deriving backpropagation in GRU cell
  • 5.11. Implementing GRU cell in TensorFlow (an illustrative sketch follows this list)
  • 5.12. Bidirectional RNN
  • 5.13. Going Deep with Deep RNN
  • 5.14. Language translation with seq2seq models
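
As a taste of what section 5.11 covers, below is a minimal sketch of a single GRU cell forward step in TensorFlow 2. This is not the book's code: the class and parameter names (`SimpleGRUCell`, `W_z`, `U_z`, and so on) are illustrative assumptions following the standard GRU equations, and the book's own notation and implementation may differ.

```python
import tensorflow as tf

class SimpleGRUCell(tf.Module):
    """Minimal GRU cell: update gate z, reset gate r, candidate state h_tilde."""

    def __init__(self, input_dim, hidden_dim):
        init = tf.random.normal
        # Update gate parameters
        self.W_z = tf.Variable(init([input_dim, hidden_dim], stddev=0.1))
        self.U_z = tf.Variable(init([hidden_dim, hidden_dim], stddev=0.1))
        self.b_z = tf.Variable(tf.zeros([hidden_dim]))
        # Reset gate parameters
        self.W_r = tf.Variable(init([input_dim, hidden_dim], stddev=0.1))
        self.U_r = tf.Variable(init([hidden_dim, hidden_dim], stddev=0.1))
        self.b_r = tf.Variable(tf.zeros([hidden_dim]))
        # Candidate hidden state parameters
        self.W_h = tf.Variable(init([input_dim, hidden_dim], stddev=0.1))
        self.U_h = tf.Variable(init([hidden_dim, hidden_dim], stddev=0.1))
        self.b_h = tf.Variable(tf.zeros([hidden_dim]))

    def __call__(self, x_t, h_prev):
        # z_t = sigmoid(x_t W_z + h_{t-1} U_z + b_z)            (update gate)
        z_t = tf.sigmoid(tf.matmul(x_t, self.W_z) + tf.matmul(h_prev, self.U_z) + self.b_z)
        # r_t = sigmoid(x_t W_r + h_{t-1} U_r + b_r)            (reset gate)
        r_t = tf.sigmoid(tf.matmul(x_t, self.W_r) + tf.matmul(h_prev, self.U_r) + self.b_r)
        # h_tilde = tanh(x_t W_h + (r_t * h_{t-1}) U_h + b_h)   (candidate state)
        h_tilde = tf.tanh(tf.matmul(x_t, self.W_h) + tf.matmul(r_t * h_prev, self.U_h) + self.b_h)
        # h_t = (1 - z_t) * h_{t-1} + z_t * h_tilde             (new hidden state)
        return (1.0 - z_t) * h_prev + z_t * h_tilde

# Example: one forward step on a batch of 2 inputs
cell = SimpleGRUCell(input_dim=4, hidden_dim=8)
x_t = tf.random.normal([2, 4])     # batch of input vectors at time t
h_prev = tf.zeros([2, 8])          # previous hidden state
h_t = cell(x_t, h_prev)
print(h_t.shape)                   # (2, 8)
```

Note that some texts write the final interpolation as h_t = z_t * h_{t-1} + (1 - z_t) * h_tilde; both conventions appear in the literature, so check which one the chapter uses before comparing derivations.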