NumPy implementation of neural networks with various solvers.
- Capable of handling multivariate function approximation tasks ( $\mathbb{R}^{N} \rightarrow \mathbb{R}$ ).
- This repository implements a two-stage optimization method that is popular in the scientific machine learning community and outperforms SGD-based methods such as Adam and SGDM on various scientific computing tasks (a sketch of the strategy follows this list).
- A more generalized and sophisticated version ( $\mathbb{R}^{N} \rightarrow \mathbb{R}^{M}$ ) can be found on my MATLAB File Exchange.
- "CompareWithTorch" provides a comparison between a pure SGD-based method (Adam) and the two-stage optimization strategy.
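The README does not spell out the two stages; in the scientific ML literature this pattern usually means a first-order warm-up (Adam) followed by a quasi-Newton refinement (e.g., L-BFGS). Below is a minimal sketch of that interpretation on a toy $\mathbb{R}^{2} \rightarrow \mathbb{R}$ problem. It leans on `scipy.optimize.minimize` for stage two, and all helper names (`loss_and_grads`, `pack`, `unpack`) are illustrative assumptions, not this repository's API.

```python
# Sketch of a two-stage strategy: Adam warm-up, then L-BFGS refinement.
# Network, hyperparameters, and helper names are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy R^2 -> R regression target: f(x) = sin(x1) + x2^2
X = rng.uniform(-2.0, 2.0, size=(256, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2

def init_params(n_in, n_hidden):
    return {"W1": rng.normal(0, n_in ** -0.5, (n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(0, n_hidden ** -0.5, (n_hidden, 1)),
            "b2": np.zeros(1)}

def loss_and_grads(p, X, y):
    """MSE loss and gradients for a one-hidden-layer tanh network."""
    h = np.tanh(X @ p["W1"] + p["b1"])
    r = (h @ p["W2"] + p["b2"]).ravel() - y
    d_out = (r / len(y))[:, None]                 # dL/d(output pre-activation)
    d_h = (d_out @ p["W2"].T) * (1.0 - h ** 2)    # backprop through tanh
    grads = {"W1": X.T @ d_h, "b1": d_h.sum(0),
             "W2": h.T @ d_out, "b2": d_out.sum(0)}
    return 0.5 * np.mean(r ** 2), grads

# ---- Stage 1: Adam warm-up (cheap first-order steps through the rough early landscape) ----
p = init_params(2, 32)
lr, beta1, beta2, eps = 1e-2, 0.9, 0.999, 1e-8
m = {k: np.zeros_like(v) for k, v in p.items()}   # first-moment estimates
s = {k: np.zeros_like(v) for k, v in p.items()}   # second-moment estimates
for t in range(1, 501):
    loss, g = loss_and_grads(p, X, y)
    for k in p:
        m[k] = beta1 * m[k] + (1 - beta1) * g[k]
        s[k] = beta2 * s[k] + (1 - beta2) * g[k] ** 2
        m_hat = m[k] / (1 - beta1 ** t)           # bias-corrected moments
        s_hat = s[k] / (1 - beta2 ** t)
        p[k] -= lr * m_hat / (np.sqrt(s_hat) + eps)
print("loss after Adam warm-up:", loss)

# ---- Stage 2: quasi-Newton refinement (L-BFGS on the flattened parameters) ----
keys, shapes = list(p), {k: p[k].shape for k in p}

def pack(d):
    """Flatten a parameter dict into one vector, in a fixed key order."""
    return np.concatenate([d[k].ravel() for k in keys])

def unpack(theta):
    """Inverse of pack: rebuild the parameter dict from a flat vector."""
    out, i = {}, 0
    for k in keys:
        size = int(np.prod(shapes[k]))
        out[k] = theta[i:i + size].reshape(shapes[k])
        i += size
    return out

def objective(theta):
    loss, g = loss_and_grads(unpack(theta), X, y)
    return loss, pack(g)

res = minimize(objective, pack(p), jac=True, method="L-BFGS-B",
               options={"maxiter": 500})
print("loss after L-BFGS refinement:", res.fun)
```

The usual rationale for this split is that Adam handles the noisy early loss landscape cheaply, while L-BFGS exploits curvature information near the optimum, which is where SGD-style methods tend to stall on smooth scientific-computing objectives.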
- *Numerical Optimization*, Nocedal & Wright.
- *Practical Quasi-Newton Methods for Training Deep Neural Networks*, Goldfarb et al.
- *Kronecker-factored Quasi-Newton Methods for Deep Learning*, Ren et al.