Deep Learning Study Group


We are a diverse group of software engineering and data science professionals from industry and academia.

Inspired by the collaborative culture of artificial neural network research, we hold regular, chilled beverage-enhanced study sessions in midtown Manhattan. At these meetings, we summarise prescribed preparatory material and leverage our individual strengths in computer science, mathematics, statistics, neuroscience, and venture capital to cement our comprehension of concepts and to implement effective deep learning models.

Over the course of our sessions, we follow three parallel paths:

  1. Theory: We study academic textbooks, exercises, and coursework so that we command strong theoretical foundations for neural networks and deep learning. Broadly, we cover calculus, algebra, probability, and computer science, with a focus on their intersection in machine learning.
  2. Application: We practice deep learning in the real world. We typically commence by collectively following tutorials, then move on to solving novel and illustrative data problems with a broad range of techniques. In addition to incorporating deep learning into our respective academic and commercial applications, we commit code to this public repository where possible.
  3. Presentations: Study group members regularly share their progress on deep learning projects and their areas of expertise. This elicits novel discourse outside of the relatively formal paths 1 and 2, playfully encouraging serendipity.

Theory

Thus far, Jon Krohn has led coverage of:

  1. Michael Nielsen's introductory text Neural Networks and Deep Learning (covered in sessions I through V)
  2. Fei-Fei Li, Andrej Karpathy, and Justin Johnson's CS231n on Convolutional Neural Networks for Visual Recognition (sessions VI through VIII)
  3. Richard Socher's CS224d on Deep Learning for Natural Language Processing (sessions IX through XIII)

Application

Our applications have largely involved convolutional neural networks built in Python with NumPy, TensorFlow, or Keras.
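
For a flavour of what those notebooks look like, here is a minimal convolutional classifier sketched in Keras. It is a hypothetical example, assuming 28×28 greyscale inputs and ten classes (à la MNIST), not code from any particular session:

```python
# Minimal convolutional classifier sketch in Keras (hypothetical example;
# assumes 28x28 greyscale inputs and ten classes, as in MNIST).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    # Two conv/pool blocks learn increasingly abstract visual features.
    Conv2D(32, kernel_size=(3, 3), activation="relu", input_shape=(28, 28, 1)),
    MaxPooling2D(pool_size=(2, 2)),
    Conv2D(64, kernel_size=(3, 3), activation="relu"),
    MaxPooling2D(pool_size=(2, 2)),
    # A dense head maps the flattened feature maps to class probabilities.
    Flatten(),
    Dense(128, activation="relu"),
    Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Training such a model is then a single call, e.g. `model.fit(x_train, y_train, epochs=5)` with one-hot-encoded labels.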

Presentations

In chronological order, we have experienced the joy of being enlightened by:

  1. Katya Vasilaky on the mathematics of deep learning (session II) and on regularization (session VIII)
  2. Thomas Balestri on countless machine-learning underpinnings (sessions III and IV)
  3. Gabe Rives-Corbett on Keras implementations of deep learning deployed at untapt (session III)
  4. Dmitri Nesterenko on his NumPy implementation of k-Nearest Neighbours (session VI)
  5. Raphaela Sapire on the deep learning start-up investment atmosphere (session VIII)
  6. Grant Beyleveld on his implementation of the U-Net Convolutional Network (session IX)
  7. Jessica Graves on applications of Deep Learning to the fashion industry (session IX)
  8. VT Rajan on deriving the word2vec algorithm (session X)
  9. Karl Habermas on his NumPy implementation of the word2vec algorithm (session X; a toy version of the idea is sketched after this list)
  10. David Epstein on Generative Adversarial Networks (session X)
  11. Claudia Perlich on predictability and how it creates biases when your target is created by mixtures (session XI)
  12. Brian Dalessandro on generating text with Keras LSTM models (session XI)
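
To give a flavour of items 8 and 9, here is a toy skip-gram word2vec in NumPy. It is a hypothetical illustration, not Karl's actual implementation: it uses the full softmax on a tiny corpus, with none of the sampling tricks a practical implementation needs.

```python
# Toy skip-gram word2vec in NumPy: full-softmax gradients on a tiny corpus,
# with no negative sampling or other practical speed-ups.
# Hypothetical illustration only, not the implementation from session X.
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word_to_idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 10                         # vocabulary size, embedding dim

rng = np.random.default_rng(42)
W_in = rng.normal(scale=0.1, size=(V, D))     # centre-word vectors
W_out = rng.normal(scale=0.1, size=(V, D))    # context-word vectors

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr, window = 0.1, 2
for epoch in range(250):
    for pos, centre in enumerate(corpus):
        c = word_to_idx[centre]
        for offset in range(-window, window + 1):
            ctx = pos + offset
            if offset == 0 or not 0 <= ctx < len(corpus):
                continue
            o = word_to_idx[corpus[ctx]]
            # Forward pass: predicted distribution over context words.
            p = softmax(W_out @ W_in[c])
            # Backward pass: cross-entropy gradient, then SGD updates.
            grad = p
            grad[o] -= 1.0
            grad_in = W_out.T @ grad
            W_out -= lr * np.outer(grad, W_in[c])
            W_in[c] -= lr * grad_in

# Words that share contexts end up with similar rows in W_in.
print(W_in[word_to_idx["fox"]])
```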

Session Notes

See the weekly work subdirectories for details of what we covered each session, including summary notes.

  1. August 17th, 2016: Perceptrons and Sigmoid Neurons
  2. September 6th, 2016: The Backpropagation Algorithm
  3. September 28th, 2016: Improving Neural Networks
  4. October 20th, 2016: Proofs of Key Properties
  5. November 10th, 2016: Deep (Conv)Nets
  6. November 30th, 2016: Convolutional Neural Networks for Visual Recognition
  7. January 12th, 2017: Implementing Convolutional Nets
  8. February 7th, 2017: Unsupervised Learning, Regularisation, and Venture Capital
  9. March 6th, 2017: Word Vectors, AI x Fashion, and U-Net
  10. March 27th, 2017: [coming soon]

Acknowledgements

Thank you to untapt and its visionary, neural net-loving CEO Ed Donner for hosting all meetings of the Deep Learning Study Group.


With a desire to remain intimately sized, our study group has reached its capacity. If you'd like to be added to our waiting list, please contact the organiser, Jon Krohn, describing your relevant experience as well as your interest in deep learning. We don't expect you to necessarily be a deep learning expert already :)
