Cong Duy Vu Hoang's repositories
Transformer-DyNet
An Implementation of Transformer (Attention Is All You Need) in DyNet
duyvuleo.github.io
Cong Duy Vu Hoang's Personal Homepage
allennlp-optuna
⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy
Awesome-Code-LLM
👨‍💻 An awesome and curated list of the best code LLMs for research.
chat-gpt-google-extension
A browser extension to display ChatGPT response alongside search engine results
DeBERTa
The implementation of DeBERTa
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
gpt4all
gpt4all: a chatbot trained on a massive collection of clean assistant data including code, stories and dialogue
mesh-transformer-jax
Model parallel transformers in JAX and Haiku
PaLM-rlhf-pytorch
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
pdfplumber
Plumb a PDF for detailed information about each char, rectangle, line, et cetera — and easily extract text and tables.
Prompt-Engineering-Guide
🐙 Guide and resources for prompt engineering
pynndescent
A Python nearest neighbor descent for approximate nearest neighbors
scrapy
Scrapy, a fast high-level web crawling & scraping framework for Python.
Slurm_tools
My tools for the Slurm HPC workload manager
sqlglot
Python SQL Parser and Transpiler
tensor2struct-public
Semantic parsers based on encoder-decoder framework
test-suite-sql-eval
Semantic Evaluation for Text-to-SQL with Distilled Test Suites
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
x-transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers