adityabantwal / Complexity-of-Deep-Networks

Investigating how closely the theoretical predictions are obeyed by concrete real-world networks.


A comprehensive research project comprising experiments inspired by the paper "Random deep neural networks are biased towards simple functions" by G. De Palma et al. (2019). The project included:

- Implementation of random neural classifiers with the specific network architectures described in the paper.
- A replication of the Hamming-distance experiment, exploring both ReLU and tanh nonlinearities (a minimal sketch appears after this list).
- An examination of depth dependence by estimating h(n) for deeper networks, providing insight into the theory's claimed depth independence.
- Confirmation of the linear scaling law for random bit flips using tanh activations.
- Experiments with trained neural networks, analogous to the paper's MNIST analysis but on a customized dataset.
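The following is a minimal sketch of the kind of experiment described above: sample a random fully connected classifier and estimate how often flipping bits of a random input string changes its output. The architecture widths, Gaussian weight scaling, and sample counts here are illustrative assumptions, not the repository's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_classifier(n_in, widths, nonlinearity=np.tanh, rng=rng):
    """Sample a random fully connected binary classifier with i.i.d.
    Gaussian weights (He-style 2/fan_in scaling; an assumption here)."""
    dims = [n_in] + list(widths) + [1]
    layers = [(rng.normal(0.0, np.sqrt(2.0 / d_in), size=(d_out, d_in)),
               rng.normal(0.0, 1.0, size=d_out))
              for d_in, d_out in zip(dims[:-1], dims[1:])]

    def f(x):
        for i, (W, b) in enumerate(layers):
            x = W @ x + b
            if i < len(layers) - 1:   # no nonlinearity on the output layer
                x = nonlinearity(x)
        return np.sign(x[0])          # class label in {-1, +1}

    return f

def flip_probability(f, n_in, k, n_samples=2000, rng=rng):
    """Estimate the probability that flipping k random bits of a uniformly
    random input string in {-1, +1}^n changes the predicted class."""
    changed = 0
    for _ in range(n_samples):
        x = rng.choice([-1.0, 1.0], size=n_in)
        y = x.copy()
        idx = rng.choice(n_in, size=k, replace=False)
        y[idx] *= -1.0                # flip k bits
        changed += f(x) != f(y)
    return changed / n_samples

# Compare ReLU and tanh networks, as in the Hamming-distance experiment.
relu = lambda z: np.maximum(z, 0.0)
for name, phi in [("relu", relu), ("tanh", np.tanh)]:
    f = random_classifier(n_in=50, widths=[100, 100], nonlinearity=phi)
    probs = [flip_probability(f, 50, k) for k in (1, 2, 4, 8)]
    print(name, probs)
```

Under the paper's simplicity-bias prediction, the estimated flip probability should grow roughly linearly in the number of flipped bits k for small k, which is the linear scaling law the project confirms for tanh activations.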

Languages

Jupyter Notebook 100.0%