Samsung SAIT AI Lab, Montreal (SamsungSAILMontreal)


0 followers · 0 following · 0 stars

Location: Canada

Home Page: https://www.sait.samsung.co.kr/saithome/about/labs.do


Samsung SAIT AI Lab, Montreal's repositories

ForestDiffusion

Generating and Imputing Tabular Data via Diffusion and Flow XGBoost Models
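
As a rough, self-contained toy (my own simplification, not this package's API): the core idea of training a diffusion/flow model with gradient-boosted trees can be sketched by fitting one XGBoost regressor per time step to predict the flow-matching velocity on interpolated data, then Euler-integrating from noise to generate samples. The data, hyperparameters, and structure below are illustrative assumptions.

```python
# Toy flow-matching-with-trees sketch (illustrative only; not the ForestDiffusion API).
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X1 = rng.normal(loc=[2.0, -1.0], scale=0.5, size=(500, 2))   # stand-in "real" tabular data
n_t, n_features = 20, X1.shape[1]

models = []                                                   # one regressor per (time step, feature)
for t in np.linspace(0.0, 1.0, n_t, endpoint=False):
    X0 = rng.standard_normal(X1.shape)                        # noise endpoints
    Xt = (1 - t) * X0 + t * X1                                # linear interpolation at time t
    V = X1 - X0                                               # flow-matching target (velocity)
    regs = [XGBRegressor(n_estimators=50, max_depth=3).fit(Xt, V[:, j])
            for j in range(n_features)]
    models.append(regs)

# Generation: start from noise and follow the learned velocity field with Euler steps.
Xg = rng.standard_normal((500, n_features))
for regs in models:
    V_hat = np.column_stack([r.predict(Xg) for r in regs])
    Xg = Xg + V_hat / n_t
print(Xg.mean(axis=0))                                        # should move toward [2, -1]
```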

ghn3

Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023]

Language: Shell · License: MIT · Stargazers: 29 · Issues: 4
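
As a loose illustration of the hypernetwork idea (my own toy, far simpler than GHN-3, which conditions on the target architecture's computation graph): a small network maps a description of a layer to a predicted weight tensor of the right shape. The class name and sizes below are made up for illustration.

```python
# Toy hypernetwork sketch (illustrative only; not the ghn3 code).
import torch
import torch.nn as nn

class TinyHyperNet(nn.Module):
    def __init__(self, max_out: int = 64, max_in: int = 64):
        super().__init__()
        self.max_out, self.max_in = max_out, max_in
        self.net = nn.Sequential(nn.Linear(2, 128), nn.ReLU(),
                                 nn.Linear(128, max_out * max_in))

    def predict(self, fan_out: int, fan_in: int) -> torch.Tensor:
        # The "layer description" here is just its normalized fan-out/fan-in.
        shape_desc = torch.tensor([[fan_out / self.max_out, fan_in / self.max_in]])
        full = self.net(shape_desc).view(self.max_out, self.max_in)
        return full[:fan_out, :fan_in]            # crop to the requested layer shape

hyper = TinyHyperNet()
target = nn.Linear(32, 16)
with torch.no_grad():
    target.weight.copy_(hyper.predict(16, 32))    # load the predicted parameters
```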

PAPA

Repository for the PopulAtion Parameter Averaging (PAPA) paper

Language: Python · License: MIT · Stargazers: 25 · Issues: 3 · Issues: 0
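
A minimal sketch of the weight-averaging building block (my own illustration; the paper averages, or partially pushes, a population of networks toward their parameter mean periodically during training rather than only once at the end):

```python
# Uniform parameter averaging over a population of models (illustrative only).
import copy
import torch
import torch.nn as nn

def average_population(models: list[nn.Module]) -> nn.Module:
    """Return a new model whose parameters are the uniform average of the population."""
    avg = copy.deepcopy(models[0])
    avg_state = avg.state_dict()
    for key in avg_state:
        stacked = torch.stack([m.state_dict()[key].float() for m in models], dim=0)
        avg_state[key] = stacked.mean(dim=0).to(avg_state[key].dtype)
    avg.load_state_dict(avg_state)
    return avg

population = [nn.Linear(10, 2) for _ in range(4)]   # stand-ins for independently trained nets
merged = average_population(population)
```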

subpruning

Code for "Data-Efficient Structured Pruning via Submodular Optimization" [NeurIPS 2022]

Language: Jupyter Notebook · License: MIT · Stargazers: 7 · Issues: 1 · Issues: 3
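
A toy greedy-selection sketch in the spirit of data-efficient structured pruning (my own simplification, not the repository's submodular formulation): keep the subset of neurons whose activations on a small calibration batch best reconstruct the full layer's activations. All names and sizes below are assumptions.

```python
# Greedy neuron selection for structured pruning (illustrative only).
import numpy as np

def greedy_select(acts: np.ndarray, k: int) -> list[int]:
    """acts: (n_samples, n_neurons) activations; return indices of k neurons to keep."""
    n_neurons = acts.shape[1]
    selected: list[int] = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(n_neurons):
            if j in selected:
                continue
            cols = acts[:, selected + [j]]
            # Least-squares reconstruction of all activations from the candidate subset.
            coef, *_ = np.linalg.lstsq(cols, acts, rcond=None)
            err = np.linalg.norm(acts - cols @ coef)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
    return selected

acts = np.random.randn(256, 32)          # calibration activations of one hidden layer
keep = greedy_select(acts, k=8)          # neurons to keep; the rest are pruned
```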

difference-submodular-min

Code for "Difference of Submodular Minimization via DC Programming" [ICML 2023]

Language: MATLAB · License: MIT · Stargazers: 3 · Issues: 2 · Issues: 0

hyper-representation

Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights [NeurIPS 2022]
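
As a very rough sketch of the idea (my own simplification; the paper learns an autoencoder over weights and samples in its latent space rather than using PCA): treat the flattened weights of many trained models as data points, fit a simple density model in a reduced space, and sample new weight vectors from it.

```python
# Toy generative model over a population of weight vectors (illustrative only).
import numpy as np
from sklearn.decomposition import PCA

weights = np.random.randn(100, 2000)            # stand-in for 100 flattened trained models
pca = PCA(n_components=16).fit(weights)
z = pca.transform(weights)
mu, cov = z.mean(axis=0), np.cov(z, rowvar=False)
z_new = np.random.multivariate_normal(mu, cov, size=5)
sampled_weights = pca.inverse_transform(z_new)  # new weight vectors to load into a network
```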

l2o_pytorch

Simple Learning to Optimize in PyTorch

Language: Python · License: MIT · Stargazers: 2 · Issues: 2 · Issues: 0
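
A minimal learning-to-optimize sketch (my own illustration; see the repository for its actual setup): a tiny network maps per-coordinate gradients to parameter updates and is meta-trained by backpropagating through a short unrolled inner optimization of random quadratics.

```python
# Learning to optimize on random quadratics (illustrative only).
import torch
import torch.nn as nn

opt_net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))  # learned update rule
meta_opt = torch.optim.Adam(opt_net.parameters(), lr=1e-3)

for step in range(200):
    A = torch.randn(10, 10)
    A = A @ A.T + 10 * torch.eye(10)                             # random convex quadratic
    b = torch.randn(10)
    x = torch.zeros(10, requires_grad=True)
    total_loss = 0.0
    for t in range(20):                                          # unrolled inner loop
        loss = 0.5 * x @ A @ x - b @ x
        grad, = torch.autograd.grad(loss, x, create_graph=True)  # differentiable gradient
        update = opt_net(grad.unsqueeze(-1)).squeeze(-1)         # per-coordinate update
        x = x + 0.01 * update
        total_loss = total_loss + loss
    meta_opt.zero_grad()
    total_loss.backward()                                        # backprop through the unroll
    meta_opt.step()
```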

GGM-metrics

On Evaluation Metrics for Graph Generative Models [ICLR 2022]
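
A small sketch of one metric in the family this work evaluates (my own illustration, not the repository's code): maximum mean discrepancy between degree histograms of a reference and a generated graph set, with a Gaussian kernel.

```python
# Degree-distribution MMD between two graph sets (illustrative only).
import numpy as np
import networkx as nx

def degree_hist(g: nx.Graph, max_deg: int = 20) -> np.ndarray:
    h = np.bincount([d for _, d in g.degree()], minlength=max_deg + 1)[: max_deg + 1]
    return h / max(h.sum(), 1)

def mmd(xs: list[np.ndarray], ys: list[np.ndarray], sigma: float = 1.0) -> float:
    k = lambda a, b: np.exp(-np.linalg.norm(a - b) ** 2 / (2 * sigma ** 2))
    kxx = np.mean([k(a, b) for a in xs for b in xs])
    kyy = np.mean([k(a, b) for a in ys for b in ys])
    kxy = np.mean([k(a, b) for a in xs for b in ys])
    return kxx + kyy - 2 * kxy

ref = [degree_hist(nx.erdos_renyi_graph(30, 0.2, seed=i)) for i in range(16)]
gen = [degree_hist(nx.erdos_renyi_graph(30, 0.3, seed=i)) for i in range(16)]
print(mmd(ref, gen))   # larger values indicate a bigger distribution mismatch
```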

multiset-equivariance

Multiset-Equivariant Set Prediction with Approximate Implicit Differentiation [ICLR 2022]
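
For background only (my own illustration; the paper's point is to avoid explicit matching via approximate implicit differentiation): a standard set-prediction loss matches predictions to targets with the Hungarian algorithm before computing the error, which makes the loss permutation-invariant.

```python
# Permutation-invariant set loss via Hungarian matching (illustrative only).
import torch
from scipy.optimize import linear_sum_assignment

def set_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """pred, target: (n_elements, dim). Return the MSE under the best element matching."""
    cost = torch.cdist(pred, target, p=2).detach().cpu().numpy()
    rows, cols = linear_sum_assignment(cost)
    rows, cols = torch.as_tensor(rows), torch.as_tensor(cols)
    return ((pred[rows] - target[cols]) ** 2).mean()

pred = torch.randn(5, 3, requires_grad=True)
target = torch.randn(5, 3)
loss = set_loss(pred, target)
loss.backward()
```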

fair-matroid-submodular-max

Code for "Fairness in Streaming Submodular Maximization over a Matroid Constraint" [ICML 2023]

Stargazers: 0 · Issues: 0
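
A toy sketch of the offline version of the problem (my own illustration, not the streaming algorithm from the paper): greedy submodular maximization of a coverage function under a partition-matroid "fairness" constraint that caps how many items each group can contribute.

```python
# Greedy coverage maximization under a partition-matroid fairness constraint (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_items, n_elems = 30, 100
covers = [set(rng.choice(n_elems, size=10, replace=False)) for _ in range(n_items)]
groups = rng.integers(0, 3, size=n_items)            # each item belongs to one of 3 groups
cap = 3                                              # fairness: at most 3 items per group

selected, covered, used = [], set(), {g: 0 for g in range(3)}
for _ in range(9):
    gains = [(len(covers[i] - covered), i) for i in range(n_items)
             if i not in selected and used[groups[i]] < cap]
    if not gains:
        break
    gain, best = max(gains)                          # best marginal coverage gain
    selected.append(best)
    covered |= covers[best]
    used[groups[best]] += 1
print(len(covered), "elements covered by", selected)
```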

layer-merge

Code for "LayerMerge: Neural Network Depth Compression through Layer Pruning and Merging" [ICML 2024]

Stargazers: 0 · Issues: 0
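
A tiny sketch of the algebra that depth compression rests on (my own illustration; LayerMerge chooses which activation and convolution layers to drop and then merges the remaining consecutive linear operators): two linear layers with no nonlinearity in between collapse exactly into one.

```python
# Merging two consecutive linear layers into one (illustrative only).
import torch
import torch.nn as nn

l1, l2 = nn.Linear(16, 32), nn.Linear(32, 8)
merged = nn.Linear(16, 8)
with torch.no_grad():
    merged.weight.copy_(l2.weight @ l1.weight)            # W = W2 @ W1
    merged.bias.copy_(l2.weight @ l1.bias + l2.bias)      # b = W2 @ b1 + b2

x = torch.randn(4, 16)
assert torch.allclose(merged(x), l2(l1(x)), atol=1e-5)    # identical function, one layer fewer
```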

LoGAH

LoGAH: Predicting 774-Million-Parameter Transformers using Graph HyperNetworks with 1/100 Parameters.

Stargazers: 0 · Issues: 0