Hasanul Mahmud (max2022)



Location: San Antonio, Texas


Hasanul Mahmud's repositories


Advanced-Deep-Learning-with-Keras

Advanced Deep Learning with Keras, published by Packt

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

Anomaly-ReactionRL

Using reinforcement learning for anomaly detection on the NSL-KDD dataset

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0

Awesome-Deep-Neural-Network-Compression

Summary and code for deep neural network quantization

Language: Python · Stargazers: 0 · Issues: 0

classification_models

Classification models trained on ImageNet, for Keras.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
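
For context, loading one of these pretrained backbones typically looks like the sketch below; the `Classifiers.get` import path and builder signature are assumptions taken from the project's README as I recall it, so verify against the repository.

```python
# Hypothetical usage sketch for the classification_models package (verify against its README).
from classification_models.keras import Classifiers

# Look up an architecture and its matching preprocessing function by name.
ResNet18, preprocess_input = Classifiers.get('resnet18')
model = ResNet18((224, 224, 3), weights='imagenet')
model.summary()
```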

cpu-energy-meter

A tool for measuring the energy consumption of Intel CPUs

Language: C · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

flops-counter.pytorch

FLOPs counter for convolutional networks in the PyTorch framework

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
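
A minimal sketch of how such a counter is typically invoked, assuming the package installs as `ptflops` and exposes `get_model_complexity_info` as in its README:

```python
# Sketch: count MACs and parameters of a torchvision model with ptflops.
import torchvision.models as models
from ptflops import get_model_complexity_info

net = models.resnet18()
macs, params = get_model_complexity_info(
    net, (3, 224, 224), as_strings=True, print_per_layer_stat=False
)
print(f"MACs: {macs}, params: {params}")
```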

keras-surgeon

Pruning and other network surgery for trained Keras models.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0
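
A hedged sketch of channel pruning with this library, assuming it installs as `kerassurgeon` and exposes `delete_channels` as in its README; whether it targets standalone Keras or tf.keras depends on the version/fork, so check the repository before relying on the exact call.

```python
# Sketch of the keras-surgeon channel-pruning API (package and import names assumed).
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from kerassurgeon.operations import delete_channels

model = Sequential([
    Conv2D(32, 3, activation='relu', input_shape=(32, 32, 3)),
    Flatten(),
    Dense(10),
])
# Remove three filters from the conv layer; the surgeon rewires downstream shapes.
pruned = delete_channels(model, model.layers[0], channels=[0, 4, 7])
pruned.summary()
```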

Keras_FLOP_Estimator

A function for estimating the floating-point operations (FLOPs) of deep learning models developed with Keras.

License: GPL-3.0 · Stargazers: 0 · Issues: 0
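
The underlying arithmetic is simple: a Conv2D layer costs roughly 2 · H_out · W_out · C_out · (K_h · K_w · C_in) FLOPs and a Dense layer roughly 2 · fan_in · units. Below is a generic back-of-the-envelope sketch, not this repository's estimator, and it assumes TF2-era Keras layers that still expose `input_shape`/`output_shape`:

```python
import tensorflow as tf

def rough_flops(model: tf.keras.Model) -> int:
    """Very rough FLOPs estimate covering only Conv2D and Dense layers (channels_last assumed)."""
    total = 0
    for layer in model.layers:
        if isinstance(layer, tf.keras.layers.Conv2D):
            kh, kw = layer.kernel_size
            c_in = layer.input_shape[-1]
            _, h_out, w_out, c_out = layer.output_shape
            total += 2 * h_out * w_out * c_out * kh * kw * c_in
        elif isinstance(layer, tf.keras.layers.Dense):
            total += 2 * layer.input_shape[-1] * layer.units
    return total
```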

knowledge-distillation-papers

A collection of knowledge distillation papers

Stargazers: 0 · Issues: 0

Knowledge_distillation_via_TF2.0

Code for recent knowledge distillation algorithms and benchmark results, implemented with the TensorFlow 2.0 low-level API

Language: Python · Stargazers: 0 · Issues: 0
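
As a reference point, the classic soft-target distillation objective (Hinton et al.) that most of these algorithms extend fits in a few lines of low-level TensorFlow; this is a generic sketch, not this repository's exact loss:

```python
import tensorflow as tf

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """Cross-entropy between softened teacher and student distributions
    (equivalent to KL(teacher || student) up to a constant)."""
    t = tf.nn.softmax(teacher_logits / temperature)
    log_s = tf.nn.log_softmax(student_logits / temperature)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return -tf.reduce_mean(tf.reduce_sum(t * log_s, axis=-1)) * temperature ** 2
```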

Lottery-Ticket-Hypothesis-in-Pytorch

This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin, which can be easily adapted to any model/dataset.

Language: Python · Stargazers: 0 · Issues: 0
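
The core operation in iterative magnitude pruning is simply thresholding weights by absolute value and rewinding the survivors to their initial values. A minimal PyTorch sketch of that step (generic, not this repository's code):

```python
import torch

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Return a 0/1 mask keeping the largest-magnitude fraction (1 - sparsity) of weights."""
    k = int(sparsity * weight.numel())
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

# Typical loop per the paper: train -> mask = magnitude_mask(w, s)
# -> w.data = w_init * mask -> retrain, repeated for several rounds.
```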

mae-scalable-vision-learners

A TensorFlow 2.x implementation of "Masked Autoencoders Are Scalable Vision Learners"

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0

mdistiller

The official implementation of Decoupled Knowledge Distillation (CVPR 2022, https://arxiv.org/abs/2203.08679) and DOT: A Distillation-Oriented Trainer (ICCV 2023, https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf)

Language: Python · Stargazers: 0 · Issues: 0

model-optimization

A toolkit for optimizing Keras and TensorFlow ML models for deployment, including quantization and pruning.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0
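
Typical usage of the toolkit's pruning API looks like the following sketch; the `tfmot.sparsity.keras` calls are the toolkit's documented entry points, while the model and hyperparameters here are placeholders:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])

# Wrap the model so its weights are pruned toward a target sparsity during training.
pruned = tfmot.sparsity.keras.prune_low_magnitude(base)
pruned.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# UpdatePruningStep must be passed to fit() for the pruning schedule to advance.
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
```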

pyclustering

pyclustering is a Python/C++ data mining library.

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0
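
A minimal example of the library's clustering API, using k-means with k-means++ initialization (module paths follow pyclustering's documented layout):

```python
from pyclustering.cluster.kmeans import kmeans
from pyclustering.cluster.center_initializer import kmeans_plusplus_initializer

# Toy 2-D data with two obvious groups.
data = [[1.0, 1.0], [1.2, 0.9], [8.0, 8.1], [7.9, 8.3]]

initial_centers = kmeans_plusplus_initializer(data, 2).initialize()
instance = kmeans(data, initial_centers)
instance.process()
print(instance.get_clusters())   # indices of points grouped per cluster
```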

pytorch-cifar100

Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)

Language: Python · Stargazers: 0 · Issues: 0

pytorch_resnet_cifar10

A proper implementation of ResNets for CIFAR-10/100 in PyTorch that matches the description in the original paper.

Language: Python · License: BSD-2-Clause · Stargazers: 0 · Issues: 0

rpi-power-monitor

Power Monitor (for Raspberry Pi)

Language: Python · License: GPL-3.0 · Stargazers: 0 · Issues: 0

SpinalNet

SpinalNet: Deep Neural Network with Gradual Input

Language: Jupyter Notebook · Stargazers: 0 · Issues: 0

tensorflow-deep-learning

All course materials for the Zero to Mastery Deep Learning with TensorFlow course.

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0

Tiny-ImageNet

Image classification on Tiny ImageNet

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

TrojanNN

Trojan attack on neural networks

Language: Python · Stargazers: 0 · Issues: 0

um34c

A small Node.js tool to read out and control the UM34C (or UM24C/UM25C) USB analyzer via Bluetooth

Language: JavaScript · License: GPL-3.0 · Stargazers: 0 · Issues: 0