Damien Sicard (altar31)


Location: France


Damien Sicard's repositories

altar31

Config files for my GitHub profile.

Stargazers: 0 · Issues: 0

basalt

A Machine Learning framework from scratch in Pure Mojo 🔥

Language: Mojo · License: NOASSERTION · Stargazers: 0 · Issues: 0

burn

Burn is a new comprehensive dynamic Deep Learning Framework built using Rust with extreme flexibility, compute efficiency and portability as its primary goals.

Language: Rust · License: Apache-2.0 · Stargazers: 0 · Issues: 0

continuiti

Learning function operators with neural networks.

Language: Python · License: LGPL-3.0 · Stargazers: 0 · Issues: 0

dash

Data Apps & Dashboards for Python. No JavaScript Required.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
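
As a rough illustration of the "no JavaScript required" claim, a minimal Dash app might look like the sketch below. This assumes a recent Dash release; the layout contents, component ids, and dropdown values are arbitrary placeholders, not taken from the repository.

    # Minimal Dash app sketch: a heading, a dropdown, and a graph placeholder.
    # Component names and values are illustrative only.
    from dash import Dash, dcc, html

    app = Dash(__name__)
    app.layout = html.Div([
        html.H1("Hello Dash"),
        dcc.Dropdown(options=["A", "B", "C"], value="A", id="choice"),
        dcc.Graph(id="plot"),
    ])

    if __name__ == "__main__":
        app.run(debug=True)  # serves the app locally; Dash renders the UI, no hand-written JS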

deeponet-jax-bench

Benchmarking DeepONet with JAX

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

deepxde

A library for scientific machine learning and physics-informed learning

Language: Python · License: LGPL-2.1 · Stargazers: 0 · Issues: 0

Flux.jl

Relax! Flux is the ML library that doesn't make you tensor

Language: Julia · License: NOASSERTION · Stargazers: 0 · Issues: 0

Gridap.jl

Grid-based approximation of partial differential equations in Julia

Language: Julia · License: MIT · Stargazers: 0 · Issues: 0

Makie.jl

Interactive data visualizations and plotting in Julia

Language: Julia · License: MIT · Stargazers: 0 · Issues: 0

PackageCompiler.jl

Compile your Julia Package

Language: Julia · License: MIT · Stargazers: 0 · Issues: 0

polars

Dataframes powered by a multithreaded, vectorized query engine, written in Rust

Language: Rust · License: MIT · Stargazers: 0 · Issues: 0
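
A small sketch of the kind of query that engine is built for, using the Python bindings of a recent polars version; the column names and values below are made up for illustration.

    # Lazy polars query: the Rust engine plans, parallelises and vectorises the pipeline.
    import polars as pl

    df = pl.DataFrame({
        "group": ["a", "a", "b", "b"],
        "value": [1.0, 2.0, 3.0, 4.0],
    })

    result = (
        df.lazy()
          .filter(pl.col("value") > 1.0)
          .group_by("group")
          .agg(pl.col("value").mean().alias("mean_value"))
          .collect()  # execution happens here, on the multithreaded query engine
    )
    print(result)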

pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0
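
A tiny example of the "tensors and dynamic neural networks" part: the computation graph is built as the operations run, and gradients fall out of autograd. The GPU check is optional.

    # Tensors with autograd: gradient of a small computation, optionally on the GPU.
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    x = torch.randn(3, requires_grad=True, device=device)
    y = (x ** 2).sum()   # dynamic graph is built as the ops execute
    y.backward()         # d(sum(x^2))/dx = 2x
    print(x.grad)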

jax

Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

License: Apache-2.0 · Stargazers: 0 · Issues: 0
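
The "composable transformations" in the description are literal function transforms; the toy function and shapes below are placeholders chosen for the example.

    # Composing jax transformations: grad, vmap and jit applied to a plain Python function.
    import jax
    import jax.numpy as jnp

    def loss(w, x):
        return jnp.sum((x @ w) ** 2)

    # per-example gradients w.r.t. w, vectorised over the batch, then compiled
    grad_fn = jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0)))

    w = jnp.ones(4)
    xs = jnp.arange(12.0).reshape(3, 4)   # batch of 3 inputs
    print(grad_fn(w, xs).shape)           # (3, 4): one gradient per example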

machine_learning_refined

Notes, examples, and Python demos for the 2nd edition of the textbook "Machine Learning Refined" (published by Cambridge University Press).

License: NOASSERTION · Stargazers: 0 · Issues: 0

modulus

Open-source deep-learning framework for building, training, and fine-tuning deep learning models using state-of-the-art Physics-ML methods

License: Apache-2.0 · Stargazers: 0 · Issues: 0

mojo

The Mojo Programming Language

License: NOASSERTION · Stargazers: 0 · Issues: 0

neuraloperator

Learning in infinite dimension with neural operators.

License: MIT · Stargazers: 0 · Issues: 0

PINA

Physics-Informed Neural networks for Advanced modeling

License: MIT · Stargazers: 0 · Issues: 0

pytorch-widedeep

A flexible package for multimodal deep learning that combines tabular data with text and images using Wide and Deep models in PyTorch

License: Apache-2.0 · Stargazers: 0 · Issues: 0
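
The library wraps this pattern behind its own API; purely as an illustration of the underlying Wide and Deep idea (not pytorch-widedeep's interface), a stripped-down version in plain PyTorch could look like this, with all feature sizes as arbitrary placeholders.

    # Wide & Deep sketch: a linear "wide" part on sparse/crossed features plus a
    # "deep" MLP on dense features, summed into one prediction.
    import torch
    import torch.nn as nn

    class WideAndDeep(nn.Module):
        def __init__(self, n_wide=100, n_deep=16, hidden=32):
            super().__init__()
            self.wide = nn.Linear(n_wide, 1)          # memorisation of sparse features
            self.deep = nn.Sequential(                # generalisation from dense features
                nn.Linear(n_deep, hidden), nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x_wide, x_deep):
            return self.wide(x_wide) + self.deep(x_deep)

    model = WideAndDeep()
    out = model(torch.randn(8, 100), torch.randn(8, 16))   # batch of 8
    print(out.shape)  # torch.Size([8, 1])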

ResUNet-DeepONet-Plasticity

Implementation of a ResUNet-based DeepONet for predicting stress distribution on variable input geometries subject to variable loads. A ResUNet is used in the trunk network to encode the variable input geometries, and a feed-forward neural network is used in the branch to encode the loading parameters.

License: MIT · Stargazers: 0 · Issues: 0
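
A hedged sketch of the branch/trunk split described above, with a small convolutional network standing in for the ResUNet and made-up tensor shapes; the per-pixel trunk features and the branch embedding are combined by a dot product over the latent dimension. This is not the repository's code.

    # Simplified stand-in for the ResUNet-DeepONet idea:
    #   trunk:  geometry image        -> p features per pixel
    #   branch: loading parameters    -> p features
    #   output: per-pixel stress = dot product over the p latent features
    import torch
    import torch.nn as nn

    p = 32  # latent dimension (placeholder value)

    trunk = nn.Sequential(            # toy conv net standing in for the ResUNet
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, p, 3, padding=1),
    )
    branch = nn.Sequential(           # feed-forward net for the load parameters
        nn.Linear(4, 64), nn.ReLU(),
        nn.Linear(64, p),
    )

    geometry = torch.randn(8, 1, 64, 64)   # batch of geometry images
    loads = torch.randn(8, 4)              # batch of loading parameters

    stress = torch.einsum("bphw,bp->bhw", trunk(geometry), branch(loads))
    print(stress.shape)  # torch.Size([8, 64, 64])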

S-DeepONet

A sequential DeepONet model implementation that uses a recurrent neural network (GRU and LSTM) in the branch and a feed-forward neural network in the trunk. The branch network efficiently encodes time-dependent input functions, and the trunk network captures the spatial dependence of the full-field data.

Stargazers: 0 · Issues: 0
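
A minimal PyTorch sketch of that branch/trunk layout follows; the GRU choice, layer sizes, and input shapes are placeholders rather than the repository's implementation.

    # Sequential DeepONet sketch: GRU branch for a time-dependent input function,
    # feed-forward trunk for spatial coordinates, combined by a dot product.
    import torch
    import torch.nn as nn

    p = 64  # latent dimension shared by branch and trunk

    class SDeepONet(nn.Module):
        def __init__(self):
            super().__init__()
            self.branch_rnn = nn.GRU(input_size=1, hidden_size=p, batch_first=True)
            self.trunk = nn.Sequential(
                nn.Linear(2, 128), nn.Tanh(),
                nn.Linear(128, p),
            )

        def forward(self, load_history, coords):
            # load_history: (batch, time, 1) samples of the time-dependent input function
            # coords:       (n_points, 2)    spatial locations of the full-field output
            _, h = self.branch_rnn(load_history)   # h: (1, batch, p)
            b = h.squeeze(0)                       # (batch, p)
            t = self.trunk(coords)                 # (n_points, p)
            return b @ t.T                         # (batch, n_points) predicted field

    model = SDeepONet()
    out = model(torch.randn(8, 50, 1), torch.rand(100, 2))
    print(out.shape)  # torch.Size([8, 100])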