Tun Sheng Tan (tunsheng)

Tun Sheng Tan's starred repositories

TorchLitho

Differentiable Computational Lithography Framework

Language: Python · License: GPL-3.0 · Stars: 118 · Issues: 0

rigl

End-to-end training of sparse deep neural networks with little-to-no performance loss.

Language: Python · License: Apache-2.0 · Stars: 313 · Issues: 0

Tutorial-SCADS-Summer-School-2020-Scalable-Deep-Learning

Code for the "Scalable Deep Learning" tutorial at the 6th International (online) Summer School on AI and Big Data, and the "Scalable deep learning: how far is one billion neurons?" tutorial at ECAI 2020.

Language: Python · Stars: 6 · Issues: 0

SoftAdapt

Implementation of the SoftAdapt paper (techniques for adaptive loss balancing in multi-task neural networks)

Language: Python · License: MIT · Stars: 22 · Issues: 0
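The adaptive loss-balancing idea behind SoftAdapt can be illustrated with a minimal sketch (my own simplification, not this repository's API): weight each task loss by a softmax over its recent rate of change, so the task that is currently improving the least receives the largest weight.

```python
import numpy as np

def softadapt_weights(loss_history, beta=1.0):
    """Compute per-task loss weights from recent loss slopes.

    loss_history: array of shape (T, K) holding the last T recorded
    values of K task losses. With beta > 0, tasks whose loss is
    improving the least (largest slope) receive the largest weight.
    """
    # Finite-difference slope of each loss between the last two steps.
    slopes = loss_history[-1] - loss_history[-2]
    # Normalise slopes for numerical stability before the softmax.
    slopes = slopes / (np.abs(slopes).max() + 1e-12)
    logits = beta * slopes
    exp_s = np.exp(logits - logits.max())  # stable softmax
    return exp_s / exp_s.sum()

# Task 0 is plateauing, task 1 is still dropping fast:
hist = np.array([[1.00, 1.00],
                 [0.99, 0.60]])
w = softadapt_weights(hist, beta=5.0)
```

With this weighting, the plateauing task (task 0) gets almost all of the weight, steering the optimizer toward the objective that has stopped improving.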

SMDP

Solving Inverse Physics Problems with Score Matching

Language: Jupyter Notebook · License: MIT · Stars: 18 · Issues: 0

RoBO

RoBO: a Robust Bayesian Optimization framework

Language: Python · License: BSD-3-Clause · Stars: 480 · Issues: 0

Pruning-Weights-with-Biobjective-Optimization-Keras

Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy for reducing network complexity, but it often requires time- and compute-intensive procedures to identify the most important connections and the best-performing hyperparameters. We propose a pruning strategy that is fully integrated into the training process and requires only marginal extra computational cost. The method relies on unstructured weight pruning, re-interpreted in a multiobjective learning approach. A batchwise pruning strategy is compared across different optimization methods, one of which is a multiobjective optimization algorithm. Because this algorithm takes over the choice of the weighting of the objective functions, it greatly reduces the time-consuming hyperparameter search that every neural network training suffers from.

Without any a priori training, post-training, or parameter fine-tuning, we achieve substantial reductions of the dense layers of two commonly used convolutional neural networks (CNNs) with only a marginal loss of performance. Our results empirically demonstrate that dense layers are overparameterized: even with up to 98% of their edges removed, they deliver almost the same results. These findings contradict the view that retraining after pruning is essential, and open new insights into the use of multiobjective optimization techniques in machine learning within a Keras framework.

The Stochastic Multi-Gradient Descent Algorithm implementation in Python 3 is for use with Keras and is adapted from the paper by S. Liu and L. N. Vicente, "The stochastic multi-gradient algorithm for multi-objective optimization and its application to supervised machine learning". It is combined with weight-pruning strategies to reduce network complexity and inference time.

Language: Python · Stars: 7 · Issues: 0
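As a baseline for comparison with the repository's integrated multiobjective approach, plain unstructured magnitude pruning of a dense layer can be sketched as follows (a generic illustration, not this repository's algorithm):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a weight matrix.

    weights:  2-D array of dense-layer weights
    sparsity: fraction of entries to remove, e.g. 0.98
    Returns the pruned copy and the binary mask that was applied.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    # Threshold at the k-th smallest magnitude.
    threshold = np.partition(flat, k)[k] if k > 0 else -np.inf
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned, mask = magnitude_prune(w, sparsity=0.98)
```

Applying the mask after every optimizer step (the "batchwise" idea mentioned above) keeps the network sparse throughout training instead of pruning once at the end.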

fastfeedforward

A repository for log-time feedforward networks

Language: Python · License: MIT · Stars: 202 · Issues: 0

MacroMax

Library for solving the macroscopic Maxwell equations in complex dielectric materials. The materials may be any mixture of isotropic and anisotropic permittivity, permeability, and coupling tensors.

Language: Python · License: MIT · Stars: 21 · Issues: 0

MultirateTrainingOfNNs

Supplementary code for our ICML 2022 paper on multirate training of neural networks

Language: Python · Stars: 4 · Issues: 0

autobound

AutoBound automatically computes upper and lower bounds on functions.

Language: Python · License: Apache-2.0 · Stars: 351 · Issues: 0
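The idea of automatically bounding a function over an input range can be illustrated with toy interval arithmetic (a simplified analogue only; AutoBound itself derives tighter polynomial Taylor-enclosure bounds, and this is not its API):

```python
# Toy interval arithmetic: propagate [lo, hi] bounds through a
# function composed of addition, multiplication, and squaring.
def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def isq(a):
    # x**2 over [lo, hi]: the minimum is 0 if the interval spans 0.
    lo = 0.0 if a[0] <= 0.0 <= a[1] else min(a[0] ** 2, a[1] ** 2)
    return (lo, max(a[0] ** 2, a[1] ** 2))

# Bound f(x) = x**2 + 3*x over x in [-1, 2]:
x = (-1.0, 2.0)
bounds = iadd(isq(x), imul((3.0, 3.0), x))
```

The computed enclosure is guaranteed to contain the true range of f on the interval (here the true range is [-2, 10]), though naive interval propagation can be looser than the polynomial bounds AutoBound produces.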

Koopman-Training-Pytorch-Tools

Tools to perform Koopman training in PyTorch

Language: Python · Stars: 5 · Issues: 0

allegro

Allegro is an open-source code for building highly scalable and accurate equivariant deep learning interatomic potentials

Language: Python · License: MIT · Stars: 305 · Issues: 0

nif

A library for dimensionality reduction on spatial-temporal PDE

Language: Jupyter Notebook · License: LGPL-2.1 · Stars: 52 · Issues: 0

composer

Supercharge Your Model Training

Language: Python · License: Apache-2.0 · Stars: 5077 · Issues: 0

koopman-forecasting

Long-term probabilistic forecasting of quasiperiodic phenomena using Koopman theory

Language: Jupyter Notebook · License: MIT · Stars: 33 · Issues: 0
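The core forecasting step behind Koopman-based methods can be sketched with a plain least-squares Dynamic Mode Decomposition (a generic illustration, not this repository's probabilistic model): fit a linear operator K that maps each state snapshot to the next, then forecast by iterating K.

```python
import numpy as np

def fit_koopman_operator(snapshots):
    """Fit x_{t+1} ~= K @ x_t by least squares on snapshot pairs.

    snapshots: array of shape (T, d) of successive states.
    """
    X, Y = snapshots[:-1], snapshots[1:]
    # Row-wise least squares: x_t @ B = x_{t+1}, so K = B.T.
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B.T

def forecast(K, x0, steps):
    """Roll the fitted operator forward from state x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(K @ xs[-1])
    return np.array(xs[1:])

# A pure rotation (quasiperiodic in each coordinate) is exactly
# linear, so the fitted operator recovers it.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
traj = np.array([np.linalg.matrix_power(R, t) @ np.array([1.0, 0.0])
                 for t in range(20)])
K = fit_koopman_operator(traj)
pred = forecast(K, traj[-1], steps=5)
```

For genuinely nonlinear but quasiperiodic dynamics, the same fit is performed in a lifted observable space, which is where Koopman theory enters.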

RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" context length, and free sentence embeddings.

Language: Python · License: Apache-2.0 · Stars: 12060 · Issues: 0

deep-symbolic-optimization

A deep learning framework for symbolic optimization.

Language: Python · License: BSD-3-Clause · Stars: 549 · Issues: 0

py-metal-compute

A Python library to run Metal compute kernels on macOS

Language: C · License: MIT · Stars: 67 · Issues: 0

sepsis_competition_physionet_2019

Code (rewritten) for our winning submission to the PhysioNet 2019 sepsis challenge. Team name: "Can I get your signature?"

Language: Jupyter Notebook · Stars: 14 · Issues: 0

omnipose

Omnipose: a high-precision solution for morphology-independent cell segmentation

Language: Jupyter Notebook · License: NOASSERTION · Stars: 88 · Issues: 0

leaf-audio

LEAF is a learnable alternative to audio features such as mel-filterbanks. It can be initialized as an approximation of mel-filterbanks and then trained for the task at hand, while using a very small number of parameters.

Language: Python · License: Apache-2.0 · Stars: 491 · Issues: 0

leaf-audio-pytorch

PyTorch port of Google Research's LEAF Audio paper

Language: Python · License: Apache-2.0 · Stars: 92 · Issues: 0

leaf-pytorch

PyTorch implementation of the LEAF audio frontend

Language: Python · Stars: 63 · Issues: 0

Plot2Spec

An automatic plot digitizer for spectroscopy images (e.g. XANES and Raman)

Language: Jupyter Notebook · License: GPL-3.0 · Stars: 29 · Issues: 0

copy-paste-aug

Copy-paste augmentation for segmentation and detection tasks

Language: Jupyter Notebook · License: MIT · Stars: 536 · Issues: 0

LiteMORT

A memory-efficient GBDT on adaptive distributions. Much faster than LightGBM with higher accuracy. Implicit merge operation.

Language: C++ · License: MIT · Stars: 56 · Issues: 0

CDGP

Counterexample-Driven Genetic Programming

Language: Slash · License: MIT · Stars: 16 · Issues: 0