Allan's repositories
aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
annotated_deep_learning_paper_implementations
🧑🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
argo-scholar
Literature Review Made Easy with Visualization
dsdeliver
Project from the DevSuperior 2.0 week
EvalAI-Starters
How to create a challenge on EvalAI?
markdown-preview-enhanced
One of the 'BEST' markdown preview extensions for Atom editor!
webportfolio
This is my personal web page
control-engineering-with-python
Control Engineering with Python
Grounded-Segment-Anything
Grounded-SAM: Marrying Grounding DINO with Segment Anything & Stable Diffusion & Recognize Anything - Automatically Detect, Segment, and Generate Anything
MAGALI
MAGALI: An AI-powered tool to analyze nutrients and carbs from food photos, aiding metabolic health optimization.
neural-compressor
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
neurosymbolic-representations-for-IR
Resources for Tutorial on Neuro-Symbolic Representations for IR
Non-linear-system
Nonlinear problem solving written in R.
portuguese-bert
Portuguese pre-trained BERT models
Semi-automation-of-systematic-review-of-clinical-trials-in-medical-psychology-with-BERT-models
We employed pre-trained BERT models (DistilBERT, BioBERT, and SciBERT) for text classification of the titles and abstracts of clinical trials in medical psychology. The average AUC score is 0.92. A stacked model was then built featuring the probabilities predicted by DistilBERT and keywords from the search domains. The AUC improved to 0.96, with F1, precision, and recall increasing to 0.95, 0.94, and 0.96, respectively. A training sample size of 100 yields the most cost-effective performance.
systematic-review-datasets
A collection of fully labeled systematic review datasets (title-abstract screening)
tokyo
BSPWM - Aesthetic Dotfiles 🍚
vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
wanda
A simple and effective LLM pruning approach.