Raj Nath Patel's repositories
pytorch-dl
PyTorch-based BERT, mBART, and NMT training
ACL2022_KnowledgeNLP_Tutorial
Materials for ACL-2022 tutorial: Knowledge-Augmented Methods for Natural Language Processing
allRank
allRank is a PyTorch-based framework for training neural learning-to-rank models.
Anagnorisis
Local recommendation system based on Llama 2
ColBERT
ColBERT: state-of-the-art neural search (SIGIR'20, TACL'21, NeurIPS'21, NAACL'22, CIKM'22, ACL'23, EMNLP'23)
DeepPavlov
An open-source deep learning library for end-to-end dialog systems and chatbots.
degiro-connector
This is yet another library to access Degiro's API.
LlamaRec
[PGAI@CIKM 2023] PyTorch Implementation of LlamaRec: Two-Stage Recommendation using Large Language Models for Ranking
LLMRec
[WSDM'2024 Oral] "LLMRec: Large Language Models with Graph Augmentation for Recommendation"
lm-evaluation-harness
A framework for few-shot evaluation of language models.
mindspore
MindSpore is an open-source deep learning training/inference framework for mobile, edge, and cloud scenarios.
minGPT
A minimal PyTorch re-implementation of OpenAI's GPT (Generative Pretrained Transformer) training
nanochat
The best ChatGPT that $100 can buy.
nanoGPT
The simplest, fastest repository for training/finetuning medium-sized GPTs.
neural-collaborative-filtering
PyTorch implementation of neural collaborative filtering
patelrajnath.github.io
Build a Jekyll blog in minutes, without touching the command line.
pylate
Late Interaction Models Training & Retrieval
pytorch-loss
PyTorch loss implementations: label smoothing, AM-Softmax, partial FC, focal loss, triplet loss, and Lovász-Softmax
txtai
💡 All-in-one open-source AI framework for semantic search, LLM orchestration and language model workflows
unsloth
Finetune Llama 3, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
xm-retrievers
🌏 Modular retrievers for zero-shot multilingual IR.