kris-singh / ReadingList

Papers To Read

  1. BREAKING THE SOFTMAX BOTTLENECK: A HIGH-RANK RNN LANGUAGE MODEL
  2. Visual Curiosity: Learning to Ask Questions to Learn Visual Recognition
  3. TARMAC: TARGETED MULTI-AGENT COMMUNICATION
  4. Towards Understanding Linear Word Analogies
  5. Understanding the impact of entropy on policy optimization
  6. Do explanations make VQA models more predictable to a human?
  7. How agents see things: On visual representations in an emergent language game
  8. Semantic Parsing for Task Oriented Dialog using Hierarchical Representations
  9. [https://openreview.net/pdf?id=ryQu7f-RZ] On the Convergence of Adam and Beyond (AMSGrad)

Projects Roadmap

  1. Visdial Research

    - RNN in PyTorch
    - HRED in PyTorch
    - LMFUSION in PyTorch
    - Topical HRED: Augmenting Neural Response Generation with Context-Aware Topical Attention
    - A Hierarchical Latent Variable Encoder-Decoder Model for Generating Dialogues
    - Implementation of co-operating games: GuessWhat?! Visual object discovery through multi-modal dialogue

  2. Presentation for PRML group

    - Reading the VI paper
    - Reading the VAE paper
      0. [https://arxiv.org/abs/1312.6114] Original paper
      1. [https://arxiv.org/abs/1606.05908] Tutorial
      2. Implementation in PyTorch (IPython notebook preferred for the tutorial); a minimal sketch appears after this roadmap list
    - VARIATIONAL INFERENCE: FOUNDATIONS AND INNOVATIONS tutorial

  3. Fashion Image Tagging

    - Reading the Google Open Images []
    - Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels (Tags Riken Noisy Fashion)
    - Masking: A New Perspective of Noisy Supervision (Tags Riken Noisy Fashion)

  4. Autograd & Computational Graph
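
Project 2 above includes implementing the VAE paper in PyTorch. The block below is a minimal sketch under assumed settings (flat 784-dimensional inputs such as MNIST, a Gaussian encoder, a Bernoulli decoder), not the final tutorial notebook.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Minimal VAE: Gaussian encoder q(z|x), Bernoulli decoder p(x|z)."""
    def __init__(self, input_dim=784, hidden_dim=400, latent_dim=20):
        super().__init__()
        self.fc1 = nn.Linear(input_dim, hidden_dim)
        self.fc_mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z|x)
        self.fc_logvar = nn.Linear(hidden_dim, latent_dim)  # log-variance of q(z|x)
        self.fc2 = nn.Linear(latent_dim, hidden_dim)
        self.fc3 = nn.Linear(hidden_dim, input_dim)

    def encode(self, x):
        h = F.relu(self.fc1(x))
        return self.fc_mu(h), self.fc_logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps keeps sampling differentiable w.r.t. mu and logvar
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.fc3(F.relu(self.fc2(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon_x, x, mu, logvar):
    # Negative ELBO: reconstruction term + KL(q(z|x) || N(0, I))
    bce = F.binary_cross_entropy(recon_x, x, reduction='sum')
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld
```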

Mostly CUDA Learning

  1. Efficient Large-scale Approximate Nearest Neighbor Search on OpenCL FPGA
  2. Billion-scale similarity search with GPUs (FAIR) (Tags CUDA FAISS LOPQ Image Search); see the faiss sketch after this list
  3. Locally Optimized Product Quantization for Approximate Nearest Neighbor Search (Tags CUDA LOPQ Image Search)
  4. Sparse Tensor tutorial (Tags SparseTensor CUDA OpenAI)

    - Blog
    - Implementation in PyTorch
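
Items 2 and 3 above are about large-scale approximate nearest-neighbour search (FAISS, LOPQ). Below is a minimal sketch of building and querying an IVF-PQ index with the faiss library; the data and the parameter choices (d, nlist, m, nprobe) are illustrative assumptions, not the papers' settings.

```python
import numpy as np
import faiss  # pip install faiss-cpu (or faiss-gpu for the GPU setting)

d, nb, nq = 128, 100_000, 10                    # vector dim, database size, query count
xb = np.random.rand(nb, d).astype('float32')    # database vectors
xq = np.random.rand(nq, d).astype('float32')    # query vectors

# Inverted file + product quantization: nlist coarse cells,
# each residual compressed by m sub-quantizers of 8 bits.
nlist, m = 1024, 16
quantizer = faiss.IndexFlatL2(d)
index = faiss.IndexIVFPQ(quantizer, d, nlist, m, 8)

index.train(xb)      # learn coarse centroids and PQ codebooks
index.add(xb)        # encode and store the database vectors
index.nprobe = 8     # cells visited per query (recall/speed trade-off)

D, I = index.search(xq, 5)   # distances and ids of the 5 nearest neighbours
print(I[0])
```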

RL Resources

  1. https://spinningup.openai.com/en/latest/spinningup/keypapers.html

Papers to Implement

  1. Poincaré Embeddings for Learning Hierarchical Representations (see the distance-function sketch after this list)
  2. A DISCIPLINED APPROACH TO NEURAL NETWORK HYPER-PARAMETERS: PART 1 – LEARNING RATE, BATCH SIZE, MOMENTUM, AND WEIGHT DECAY
  3. [https://github.com/rusty1s/pytorch_geometric.git] PyTorch Geometric
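
Item 1 above embeds a hierarchy in the open unit ball and scores pairs by hyperbolic distance. A small PyTorch sketch of that distance function follows (the names and the clamping epsilon are my assumptions, not the reference implementation; the paper additionally trains with Riemannian SGD).

```python
import torch

def poincare_distance(u, v, eps=1e-5):
    """Distance in the Poincaré ball model:
    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    sq_u = torch.clamp(torch.sum(u * u, dim=-1), 0, 1 - eps)   # keep points inside the ball
    sq_v = torch.clamp(torch.sum(v * v, dim=-1), 0, 1 - eps)
    sq_diff = torch.sum((u - v) ** 2, dim=-1)
    return torch.acosh(1 + 2 * sq_diff / ((1 - sq_u) * (1 - sq_v)))

# Distances blow up as points approach the boundary of the ball.
u = torch.tensor([[0.1, 0.0], [0.9, 0.0]])
v = torch.tensor([[0.0, 0.1], [0.0, 0.9]])
print(poincare_distance(u, v))
```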

Blogs to Read

  1. Horovod
  2. [https://towardsdatascience.com/how-to-build-a-gated-convolutional-neural-network-gcnn-for-natural-language-processing-nlp-5ba3ee730bfb] Gated Convolutional Neural Network (see the gated-convolution sketch after this list)
  3. [https://www.fast.ai/2018/07/02/adam-weight-decay/] AdamW and Super Convergence
  4. [http://www.phontron.com/class/nn4nlp2017/schedule.html] NLP course
  5. [https://stats.stackexchange.com/questions/281240/why-is-the-cost-function-of-neural-networks-non-convex] Why the loss function is convex but the loss surface of a neural network is not
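
Blog 2 above walks through gated convolutional networks for NLP. The sketch below shows the gated linear unit at the core of that architecture with causal (left-only) padding; the channel count and kernel size are placeholder choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedConv1d(nn.Module):
    """One gated convolution block: h = (x * W + b) * sigmoid(x * V + c)."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        self.pad = kernel_size - 1
        # A single conv produces both the linear half and the gate half.
        self.conv = nn.Conv1d(channels, 2 * channels, kernel_size)

    def forward(self, x):                    # x: (batch, channels, seq_len)
        x = F.pad(x, (self.pad, 0))          # left-pad so the conv never sees future tokens
        return F.glu(self.conv(x), dim=1)    # split channels, gate with sigmoid

# Placeholder sizes: batch of 4, 64 channels, sequence length 10.
x = torch.randn(4, 64, 10)
print(GatedConv1d(64)(x).shape)              # torch.Size([4, 64, 10])
```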
