jeremiedb / FluxBench.jl

Benchmarks for the FluxML ecosystem for deep learning, scientific machine learning, differentiable programming, etc., including AD- and CUDA-accelerated workloads

Home Page: https://speed.fluxml.ai


FluxBench.jl


This repository backs the results published at https://speed.fluxml.ai.

It collects benchmark runs for a subset of the modeling done in the FluxML ecosystem and serves as a means of tracking performance over time.

Running Locally

To run the benchmarks locally:

  • clone this repository
  • cd into the local copy via cd FluxBench.jl
  • start Julia with the project activated (julia --project=.) and run ] instantiate to install the dependencies

And finally:

julia> using FluxBench

julia> FluxBench.bench()
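
If you only want a subset of the results, groups registered in the suite can also be run directly with BenchmarkTools. This is a minimal sketch under two assumptions: that the package's top-level SUITE (a BenchmarkGroup) is reachable as FluxBench.SUITE, and that "flux" stands in for an actual group name.

```julia
using FluxBench, BenchmarkTools

# Run a single benchmark group instead of the full suite.
# `FluxBench.SUITE` and the group name "flux" are illustrative assumptions.
results = run(FluxBench.SUITE["flux"]; verbose = true)

# Summarize the collected trials (median time/allocations per benchmark).
BenchmarkTools.median(results)
```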

Adding Benchmarks

To contribute benchmarks, one needs to:

  • add the script(s) to the src/packages directory, along with the dependencies and code needed to run the benchmarks
    • Note: remember to register a group on the SUITE variable via addgroup!(SUITE, "name/of/benchmark/group")
    • group behaves like a dictionary, so new benchmarks are added by assigning to it: group["name_of_benchmark"] = @benchmarkable ...
    • Please use the @benchmarkable macro to set up the benchmarks (see BenchmarkTools.jl for a reference)
    • Please follow the performance, profiling, and benchmarking guides of the packages used in the benchmark, e.g. Julia's, Flux's, CUDA.jl's, and BenchmarkTools.jl's
  • include the benchmark file(s) in the top-level file src/FluxBench.jl
  • call the benchmarks from the bench function in src/bench.jl
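
As a concrete illustration of the steps above, here is a minimal hypothetical benchmark file; the group name, model, and input sizes are made up for the example, and SUITE is the suite variable described above.

```julia
# Hypothetical src/packages/example.jl -- all names and sizes are illustrative.
using BenchmarkTools, Flux

# Register a new group on the shared suite; addgroup! returns the group.
group = addgroup!(SUITE, "flux/dense-forward")

# Build inputs and model outside the benchmark, and interpolate them with $
# so that setup cost is excluded from the measured timings.
x = rand(Float32, 128, 64)
m = Dense(128, 256, relu)

group["forward_128x64"] = @benchmarkable $m($x)
```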

About

Benchmarks for the FluxML ecosystem for deep learning, scientific machine learning, differentiable programming, etc., including AD- and CUDA-accelerated workloads

https://speed.fluxml.ai


Languages

Language: Julia 100.0%