kvignesh1420 / shallow_nc1

Exploring "variability collapse" in shallow neural networks

Home Page: https://arxiv.org/abs/2406.02105

Neural Collapse in Shallow Neural Networks

This repository provides a comprehensive analysis of neural collapse in shallow neural networks through a kernel-based approach. Our study focuses on:

  • Limiting Neural Network Gaussian Process (NNGP)
  • Limiting Neural Tangent Kernel (NTK)
  • Adaptive Kernels (derived from NNGP)

We use these kernel characterizations of a 2-layer fully connected neural network (2L-FCN) to investigate NC1, i.e., the variability collapse of the hidden-layer activations.
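NC1 is commonly quantified as a trace ratio between the within-class and between-class covariances of the hidden-layer activations. As a rough illustration, here is a minimal sketch of one common variant, tr(Sigma_W)/tr(Sigma_B); the exact metric used in the paper may differ:

```python
import numpy as np

def nc1_metric(H, y):
    """Variability-collapse metric for activations H (n_samples x dim)
    with integer labels y: trace ratio of within-class to between-class
    covariance. Collapsed features (zero within-class spread) give 0."""
    mu_g = H.mean(axis=0)                                  # global mean
    d = H.shape[1]
    sigma_w = np.zeros((d, d))
    sigma_b = np.zeros((d, d))
    for c in np.unique(y):
        Hc = H[y == c]
        mu_c = Hc.mean(axis=0)
        sigma_w += (Hc - mu_c).T @ (Hc - mu_c) / H.shape[0]
        sigma_b += len(Hc) * np.outer(mu_c - mu_g, mu_c - mu_g) / H.shape[0]
    return np.trace(sigma_w) / np.trace(sigma_b)
```

Fully collapsed activations (every sample equal to its class mean) yield an NC1 value of exactly zero.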

Setup

To set up the environment, follow these steps:

$ python3.9 -m virtualenv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt

Experiments

The shallow_collapse package includes the core library code required to run the experiments.

Training the Fully Connected Network (FCN)

  • Single run: Train a 2L-FCN using the following command:

    (.venv) $ python fcn.py configs/fcn.yml
  • Bulk run (balanced): Train multiple 2L-FCNs across various dataset sizes and data dimensions under balanced class conditions:

    (.venv) $ python fcn_bulk_balanced.py configs/fcn_bulk_balanced.yml
  • Bulk run (imbalanced): Train multiple 2L-FCNs across various dataset sizes and data dimensions under imbalanced class conditions:

    (.venv) $ python fcn_bulk_imbalanced.py configs/fcn_bulk_imbalanced.yml
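For reference, a 2L-FCN is a single-hidden-layer network, and its hidden activations are what NC1 is measured on. A minimal NumPy forward-pass sketch (the actual model, widths, and activation are set by the config files, and the repo may use a different framework):

```python
import numpy as np

def fcn_forward(X, W1, W2):
    """2L-FCN forward pass: returns the hidden activations H (whose
    variability NC1 measures) and the network outputs. Hypothetical
    sketch with a ReLU hidden layer and no biases."""
    H = np.maximum(X @ W1, 0.0)   # hidden layer activations
    return H, H @ W2              # outputs
```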

Limiting Kernels

  • Bulk run (balanced): Conduct multiple experiments using NNGP/NTK with balanced datasets:

    (.venv) $ python limiting_kernels_bulk_balanced.py configs/limiting_kernels_bulk_balanced.yml
  • Bulk run (imbalanced): Execute multiple experiments using NNGP/NTK with imbalanced datasets:

    (.venv) $ python limiting_kernels_bulk_imbalanced.py configs/limiting_kernels_bulk_imbalanced.yml
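For a 2-layer ReLU network (standard parametrization, unit weight variance, no biases), the limiting NNGP and NTK have well-known closed forms in terms of the arc-cosine kernel. A sketch under those assumptions; the parametrization in the paper and configs may differ:

```python
import numpy as np

def relu_nngp_ntk(X):
    """Limiting NNGP and NTK Gram matrices for a 2-layer ReLU network
    on inputs X (n_samples x dim), assuming unit weight variance and
    no biases. Uses the degree-1 arc-cosine kernel closed form."""
    G = X @ X.T                                   # input Gram matrix
    norms = np.sqrt(np.diag(G))
    cos = np.clip(G / np.outer(norms, norms), -1.0, 1.0)
    theta = np.arccos(cos)                        # pairwise angles
    nngp = np.outer(norms, norms) * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta)
    ) / (2 * np.pi)
    # NTK adds the derivative-kernel term E[relu'(u) relu'(v)] = (pi - theta)/(2 pi)
    ntk = nngp + G * (np.pi - theta) / (2 * np.pi)
    return nngp, ntk
```

On the diagonal this reduces to NNGP(x, x) = ||x||^2 / 2 and NTK(x, x) = ||x||^2, a quick sanity check for the parametrization.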

Adaptive Kernels

  • Single run: Solve the "Equations of State" (EoS) for the adaptive kernels using the following command:

    (.venv) $ python adaptive_kernels.py configs/adaptive_kernels.yml
  • Bulk run (balanced): Perform multiple experiments using EoS-based adaptive kernels with balanced datasets:

    (.venv) $ python adaptive_kernels_bulk_balanced.py configs/adaptive_kernels_bulk_balanced.yml
  • Bulk run (imbalanced): Run multiple experiments using EoS-based adaptive kernels with imbalanced datasets:

    (.venv) $ python adaptive_kernels_bulk_imbalanced.py configs/adaptive_kernels_bulk_imbalanced.yml
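Self-consistent systems like the EoS are typically solved by damped fixed-point iteration. A generic sketch of that pattern (the actual state and update rule in adaptive_kernels.py may differ); the toy usage solves x = cos(x):

```python
import numpy as np

def solve_fixed_point(F, x0, damping=0.5, tol=1e-10, max_iter=10_000):
    """Damped fixed-point iteration x <- (1 - d) x + d F(x).
    Generic solver sketch for self-consistent equations; not
    necessarily the update scheme used in this repo."""
    x = x0
    for _ in range(max_iter):
        x_new = (1 - damping) * x + damping * F(x)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

# Toy usage: the unique solution of x = cos(x) (approx. 0.739085).
root = solve_fixed_point(np.cos, np.array(1.0))
```

Damping trades convergence speed for stability, which matters when F is not a contraction on its own.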

All output files are stored in the out/ directory. Each output is tagged with a unique hash derived from the experiment's context.
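One standard way to derive such hashes is to digest a canonicalized dump of the experiment configuration; `context_hash` below is a hypothetical helper, not necessarily the repo's actual scheme:

```python
import hashlib
import json

def context_hash(context: dict) -> str:
    """Deterministic short hash of an experiment context dict.
    Sorting keys makes the hash independent of insertion order.
    Hypothetical helper; the repo's own hashing may differ."""
    blob = json.dumps(context, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(blob.encode("utf-8")).hexdigest()[:12]

# e.g. out/<hash>/ would group all artifacts from one configuration
h = context_hash({"N": 1024, "d": 8, "balanced": True})
```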

Citation

@article{kothapalli2024kernel,
  title={Kernel vs. Kernel: Exploring How the Data Structure Affects Neural Collapse},
  author={Kothapalli, Vignesh and Tirer, Tom},
  journal={arXiv preprint arXiv:2406.02105},
  year={2024}
}
