kevinykuo / luz

Higher Level API for torch

luz

luz is a higher-level API for torch that provides abstractions for writing much less verbose training loops.

This package is in a very early stage of development. Don't use it for anything meaningful yet.

It's heavily inspired by other higher-level frameworks for deep learning, to cite a few:

  • FastAI: we are heavily inspired by the FastAI library, especially the Learner object and the callbacks API.

  • Keras: we are also heavily inspired by Keras, especially the callback names; the luz_module interface is similar to compile too.

  • PyTorch Lightning: The idea of luz_module being a subclass of nn_module is inspired by the LightningModule object in lightning.

  • HuggingFace Accelerate: The internal device placement API is heavily inspired by Accelerate, but much more modest in features. Currently only CPU and single-GPU training are supported.

Todo

  • 'compiling' and training classification models

  • training and validation data

  • metrics other than the loss

  • callbacks for logging and a progress bar

  • custom optimizer definition

  • custom training and validation steps

  • timings for each part of the model

  • handle device placement

Installation

You can install the released version of luz from CRAN with:

install.packages("luz")

Example

This is a basic example which shows you how to solve a common problem:

library(luz)
## basic example code
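
The example above is still a placeholder. For context, below is the kind of hand-written torch training loop that luz aims to abstract away. This is a minimal sketch using only the torch package (not luz); the linear-regression task and names like `model` are illustrative, not part of luz's API:

```r
library(torch)

# Toy data: learn y = 2x + 1 from noisy samples
x <- torch_randn(100, 1)
y <- 2 * x + 1 + torch_randn(100, 1) * 0.1

# A single linear layer and a plain SGD optimizer
model <- nn_linear(1, 1)
optimizer <- optim_sgd(model$parameters, lr = 0.1)

# The verbose part: zero gradients, forward pass, loss,
# backward pass, and optimizer step, every epoch by hand.
for (epoch in 1:50) {
  optimizer$zero_grad()
  loss <- nnf_mse_loss(model(x), y)
  loss$backward()
  optimizer$step()
}
```

A higher-level API would collapse this loop into a single fit-style call, which is the gap luz intends to fill.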
