4uiiurz1 / pytorch-lars

PyTorch implementation of LARS (Layer-wise Adaptive Rate Scaling)


This repository contains a PyTorch implementation of LARS (Layer-wise Adaptive Rate Scaling), based on the paper Large Batch Training of Convolutional Networks (You et al., 2017).

Requirements

  • Python 3.6
  • PyTorch 1.0

Usage

from lars import LARS

optimizer = LARS(model.parameters(), lr=0.1, momentum=0.9)
optimizer.zero_grad()
loss_fn(model(input), target).backward()
optimizer.step()
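For reference, the core of LARS is a layer-wise "trust ratio" that rescales the global learning rate for each layer by the ratio of the weight norm to the gradient norm. The sketch below shows one such update step, assuming the formulation from the paper; the function name, `trust_coefficient`, and `eps` are illustrative and not necessarily this repository's API.

```python
import torch

def lars_step(param, grad, momentum_buf, lr=0.1, momentum=0.9,
              weight_decay=0.0, trust_coefficient=0.001, eps=1e-8):
    """Apply one illustrative LARS update to `param` in place."""
    w_norm = param.norm()
    g_norm = grad.norm()
    # Layer-wise trust ratio: scales the global LR per layer so that
    # the update magnitude is proportional to the weight magnitude.
    local_lr = trust_coefficient * w_norm / (g_norm + weight_decay * w_norm + eps)
    update = local_lr * (grad + weight_decay * param)
    momentum_buf.mul_(momentum).add_(update)
    param.data.add_(-lr * momentum_buf)
    return momentum_buf

# Usage on a single parameter tensor:
w = torch.ones(4, requires_grad=True)
g = torch.full((4,), 0.5)
buf = torch.zeros_like(w)
lars_step(w, g, buf)
```

In a real optimizer this logic runs per parameter group inside `step()`, which is what lets large-batch training use a single global learning rate while each layer effectively gets its own scale.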

Results

CIFAR-10

Training curves comparing batch size = 4096 and batch size = 8192 (plots not reproduced here).


License: MIT

