elzaksspro / ML-AI-Research-Papers---Solved

ML/AI Research Papers Solved

This repository contains everything you need to become proficient in ML/AI Research and Research Papers.

How to Make the Best Use of ML/DL Research Papers?

[Figure: how to make the best use of ML/DL research papers. Pic credits: ResearchGate]

YouTube for all the implemented projects and tech interview resources - Ignito YouTube Channel

Complete Cheat Sheet for Tech Interviews - How to prepare efficiently

I took these project-based courses to build industry-aligned data science and ML skills

Part 1 - How to Solve Any ML System Design Problem

Link - Complete ML Research Papers Summarized Series

We will cover each research paper using a 10-step framework (a minimal note-taking template is sketched after the list):

  1. Research Paper Name and Authors

  2. Area and field of research

  3. Main Contributions

  4. Main Results

  5. Main Findings

  6. Opportunities

  7. Future Research

  8. Future Projects

  9. Code and Results

  10. Link to the Research Paper
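
To make the framework concrete, here is a minimal, hypothetical sketch of the ten steps as a structured note in Python. The `PaperSummary` class and its field names are illustrative assumptions, not part of this repository's code:

```python
# Hypothetical note-taking template for the 10-step framework above
# (illustrative sketch only; not part of this repository).
from dataclasses import dataclass, field
from typing import List


@dataclass
class PaperSummary:
    """Structured notes for one paper, one field per framework step."""
    name_and_authors: str                                     # 1. Paper name and authors
    area: str                                                 # 2. Area and field of research
    contributions: List[str] = field(default_factory=list)   # 3. Main contributions
    results: List[str] = field(default_factory=list)          # 4. Main results
    findings: List[str] = field(default_factory=list)         # 5. Main findings
    opportunities: List[str] = field(default_factory=list)    # 6. Opportunities
    future_research: List[str] = field(default_factory=list)  # 7. Future research
    future_projects: List[str] = field(default_factory=list)  # 8. Future projects
    code_and_results: str = ""                                 # 9. Code and results
    paper_link: str = ""                                       # 10. Link to the paper
```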


Must-know concepts before you dive into the research papers:

| Model Name | Link |
| --- | --- |
| Transformer | Link |
| Transformer-XL | Link |
| VGG | Link |
| Mask R-CNN | Link |
| Masked Autoencoder | Link |
| BEiT | Link |
| BERT | Link |
| ColD Fusion | Link |
| ConvMixer | Link |
| Deep and Cross Network | Link |
| DenseNet | Link |
| DistilBERT | Link |
| DiT | Link |
| DocFormer | Link |
| Donut | Link |
| EfficientNet | Link |
| ELMo | Link |
| Entity Embeddings | Link |
| ERNIE-Layout | Link |
| FastBERT | Link |
| Fast R-CNN | Link |
| Faster R-CNN | Link |
| MobileBERT | Link |
| MobileNetV1 | Link |
| MobileNetV2 | Link |
| MobileNetV3 | Link |
| R-CNN | Link |
| ResNet | Link |
| ResNeXt | Link |
| Sentence-BERT | Link |
| Single Shot MultiBox Detector (SSD) | Link |
| StructuralLM | Link |
| Swin Transformer | Link |
| TableNet | Link |
| TabTransformer | Link |
| Tabular ResNet | Link |
| TinyBERT | Link |
| Vision Transformer | Link |
| Wide and Deep Learning | Link |
| Xception | Link |
| XLNet | Link |
| AlexNet | Link |
| BART | Link |
| InceptionNetV2 and InceptionNetV3 | Link |
| InceptionNetV4 and InceptionResNet | Link |
| LayoutLM | Link |
| LayoutLMv2 | Link |
| LayoutLMv3 | Link |
| LeNet | Link |
| LiLT | Link |
| Feature Pyramid Network | Link |
| Feature Tokenizer Transformer | Link |
| Focal Loss (RetinaNet) | Link |
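
Many of the models in this table (BERT, DistilBERT, Vision Transformer, Swin Transformer, BEiT, XLNet, and others) have pretrained checkpoints in the Hugging Face transformers library, which is a convenient way to experiment while reading the papers. A minimal sketch, assuming `transformers` and PyTorch are installed (the checkpoint name is just an example):

```python
# Load a pretrained BERT and run one sentence through it.
# Requires: pip install transformers torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Attention is all you need.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```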

NLP Research Papers

| Paper Name | Simplified/Summarized Version |
| --- | --- |
| Fine-mixing: Mitigating Backdoors in Fine-tuned Language Models | Link |
| Bag of Tricks for Efficient Text Classification | Link |
| Visualizing Linguistic Diversity of Text Datasets Synthesized by Large Language Models | Link |
| (QA)²: Question Answering with Questionable Assumptions | Link |
| QueryForm: A Simple Zero-shot Form Entity Query Framework | Link |
| Semi-supervised Sequence Learning | Link |
| Universal Language Model Fine-tuning for Text Classification | Link |
| DARTS: Differentiable Architecture Search | Link |
| RoBERTa: A Robustly Optimized BERT Pretraining Approach | Link |
| Generating Sequences With Recurrent Neural Networks | Link |
| Deep contextualized word representations | Link |
| Regularizing and Optimizing LSTM Language Models | Link |
| End-To-End Memory Networks | Link |
| Listen, Attend and Spell | Link |
| Well-Read Students Learn Better: On the Importance of Pre-training Compact Models | Link |
| Language Models are Few-Shot Learners | Link |
| Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context | Link |
| DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter | Link |
| Harnessing the Power of LLMs in Practice: A Survey on ChatGPT and Beyond | Link |
| LIMA: Less Is More for Alignment | Link |
| Efficient Neural Architecture Search via Parameter Sharing | Link |
| Tree of Thoughts: Deliberate Problem Solving with Large Language Models | Link |
| AudioGPT: Understanding and Generating Speech, Music, Sound, and Talking Head | Link |
| FrugalGPT: How to Use Large Language Models While Reducing Cost and Improving Performance | Link |
| CodeT5+: Open Code Large Language Models for Code Understanding and Generation | Link |
| Unlimiformer: Long-Range Transformers with Unlimited Length Input | Link |
| Speak, Memory: An Archaeology of Books Known to ChatGPT/GPT-4 | Link |
| PaLM: Scaling Language Modeling with Pathways | Link |
| Attention Is All You Need | Link |
| Denoising Diffusion Probabilistic Models | Link |
| ZeRO: Memory Optimizations Toward Training Trillion Parameter Models | Link |
| Wide Residual Networks | Link |
| FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness | Link |
| STaR: Bootstrapping Reasoning With Reasoning | Link |
| Meta-Gradient Reinforcement Learning | Link |
| Distilling the Knowledge in a Neural Network | Link |
| How to Fine-Tune BERT for Text Classification? | Link |
| Primer: Searching for Efficient Transformers for Language Modeling | Link |
| Training Compute-Optimal Large Language Models | Link |
| Learning Transferable Visual Models From Natural Language Supervision | Link |

More coming soon.
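
Several of these papers center on a single mechanism that is short enough to code directly. For example, "Attention Is All You Need" (listed above) defines scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal single-head NumPy sketch, with no masking or dropout:

```python
# Scaled dot-product attention from "Attention Is All You Need"
# (single head, no masking/dropout; illustrative sketch).
import numpy as np


def scaled_dot_product_attention(q, k, v):
    """q, k: (seq_len, d_k); v: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # query-key similarities, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                            # attention-weighted sum of values


# Toy usage: 4 tokens, d_k = d_v = 8
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
print(scaled_dot_product_attention(q, k, v).shape)  # (4, 8)
```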

CV Research Papers

| Paper Name | Summarized and Simplified Version |
| --- | --- |
| NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis | Link |

More coming soon.

License: MIT License