Flux.jl

Relax! Flux is the ML library that doesn't make you tensor

Home Page: https://fluxml.ai/

ColPrac: Contributor's Guide on Collaborative Practices for Community Packages

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.

Flux works best with Julia 1.8 or later. Here's a simple example to try it out:

using Flux  # should install everything for you, including CUDA

x = hcat(digits.(0:3, base=2, pad=2)...) |> gpu  # let's solve the XOR problem!
y = Flux.onehotbatch(xor.(eachrow(x)...), 0:1) |> gpu
data = ((Float32.(x), y) for _ in 1:100)  # an iterator making Tuples

model = Chain(Dense(2 => 3, sigmoid), BatchNorm(3), Dense(3 => 2)) |> gpu
optim = Adam(0.1, (0.7, 0.95))
mloss(x, y) = Flux.logitcrossentropy(model(x), y)  # closes over model

Flux.train!(mloss, Flux.params(model), data, optim)  # updates model & optim

all((softmax(model(x)) .> 0.5) .== y)  # usually 100% accuracy.
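The example above uses the implicit-parameter style (`Flux.params` with `train!`). Newer Flux versions also provide an explicit-gradient API via `Flux.setup`, `Flux.gradient`, and `Flux.update!`. A minimal sketch of the same XOR training written as an explicit loop (assuming Flux ≥ 0.14; CPU-only here for simplicity):

```julia
using Flux

# Same data as above: all 2-bit inputs and their XOR, one-hot encoded.
x = Float32.(hcat(digits.(0:3, base=2, pad=2)...))
y = Flux.onehotbatch(xor.(eachrow(x)...), 0:1)

model = Chain(Dense(2 => 3, sigmoid), BatchNorm(3), Dense(3 => 2))
optim = Flux.setup(Adam(0.1, (0.7, 0.95)), model)  # optimiser state per parameter

for _ in 1:100
    # gradient w.r.t. the model itself, not an implicit parameter set
    grads = Flux.gradient(model) do m
        Flux.logitcrossentropy(m(x), y)
    end
    Flux.update!(optim, model, grads[1])  # mutates model and optim state
end
```

The explicit style makes the dependence of the loss on `model` visible in the `do` block, which is why no closure over a global `model` is needed.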

See the documentation for details, or the model zoo for examples. Ask questions on the Julia Discourse or Slack.

If you use Flux in your research, please cite our work.

License: Other