pooja-k-swamy / Deeplearning-lab

My notes of DS-GA 1008 GY lab sessions


Deeplearning-lab

My notes of DS-GA 1008 GY. Updating as the course progresses.

Why deeper networks?

  • More trainable modules: depth composes more feature-learning stages.
  • Instead of spreading, say, 1000 neurons across one exponentially wide hidden layer, it is better to stack them over multiple layers.
  • More computationally efficient for the same expressive power.
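The bullets above can be made concrete with a quick parameter count: spreading a fixed neuron budget over several narrower layers uses far fewer parameters than one very wide layer, while (by the depth-efficiency argument) keeping comparable expressive power. A minimal sketch; the layer sizes below are illustrative, not from the course:

```python
def mlp_params(sizes):
    """Parameter count of a fully connected net with the given layer
    sizes: weights (in * out) plus biases (out) per linear layer."""
    return sum(sizes[i] * sizes[i + 1] + sizes[i + 1]
               for i in range(len(sizes) - 1))

# One wide hidden layer of 1000 units vs. three hidden layers of 100,
# for a 100-dimensional input and 10 outputs (sizes are illustrative).
wide = mlp_params([100, 1000, 10])           # 111_010 parameters
deep = mlp_params([100, 100, 100, 100, 10])  #  31_310 parameters
print(wide, deep)
```

The deep variant here has roughly a quarter of the parameters of the wide one, which is the kind of saving the second and third bullets are pointing at.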

DL Lab 1: Implementation of non-linearity functions on randomly created data points in 2D space.

Homework 1: Implement a deep neural network with two linear layers and activation functions f and g, each of which can be sigmoid, ReLU, or the identity function. The loss function is either BCE or MSE loss. We had to implement this WITHOUT autograd in PyTorch (a good exercise, by the way: you get to know how autograd works internally). mlp.py is the file for hw1.
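As a sketch of what "without autograd" means for this homework, the chain rule can be written out by hand for one configuration (f = ReLU, g = identity, MSE loss). Function and variable names below are my own illustration, not the course's mlp.py:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, W1, b1, W2, b2):
    # y_hat = g(W2 f(W1 x + b1) + b2), with f = ReLU and g = identity
    s1 = W1 @ x + b1
    a1 = relu(s1)
    y_hat = W2 @ a1 + b2
    return s1, a1, y_hat

def backward_mse(x, y, W1, b1, W2, b2):
    # Hand-derived gradients of L = mean((y_hat - y)^2), no autograd.
    s1, a1, y_hat = forward(x, W1, b1, W2, b2)
    d_yhat = 2.0 * (y_hat - y) / y_hat.size  # dL/dy_hat
    dW2 = np.outer(d_yhat, a1)               # dL/dW2
    db2 = d_yhat                             # dL/db2
    da1 = W2.T @ d_yhat                      # backprop through linear 2
    ds1 = da1 * (s1 > 0)                     # ReLU local derivative
    dW1 = np.outer(ds1, x)                   # dL/dW1
    db1 = ds1                                # dL/db1
    return dW1, db1, dW2, db2
```

A finite-difference check on a single weight is a good way to validate hand-written gradients like these before using them for training.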
