Raymond Hernandez's repositories
TheAnimalFarmAutoGardener
An auto-gardener for the gardening game at https://theanimal.farm. Written in Python 3.
UniswapV2Python
My UniswapV2 controller class, written in Python, to power my degeneracy.
audio-datasets
open-source audio datasets
Bard
Reverse engineering of Google's Bard API
Deep-Fake_First_Order_Model
This repo implements the First Order Motion Model for making deepfakes. It is referenced from a YouTube video about deepfakes by Two Minute Papers. Original code by @AliaksandrSiarohin.
EdgeGPT
Reverse engineered API of Microsoft's Bing Chat AI
FasterTransformer
Transformer related optimization, including BERT, GPT
fastT5
⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x.
flash-gpt
Add Flash-Attention to Huggingface Models
IPRotate_Burp_Extension
Extension for Burp Suite which uses AWS API Gateway to rotate your IP on every request.
lightning-text-classification
Minimalist implementation of a BERT Sentence Classifier with PyTorch Lightning, Transformers and PyTorch-NLP.
lightning-transformers
Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
llama-rs
Run LLaMA inference on CPU, with Rust 🦀🚀🦙
llamafile
Distribute and run LLMs with a single file.
LlamaGPTJ-chat
Simple chat program for LLaMa, GPT-J, and MPT models.
mpt-play
Command-line script for running inference with models such as MPT-7B-Chat
nft-armory
Simple tool to display, mint, and modify your Metaplex NFTs
notebook-utils
Notebook functions and classes I use time and time again, packaged so I can clone the repo and have them ready to go. Mainly for Google Colab+ notebook instances.
NVIDIAPyTorchLM
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
open-llm-leaderboard
Open LLM Leaderboard
RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
solana-mass-transfer
Move all your tokens, NFTs, and SOL to a new wallet
stanford_alpaca
Code and documentation to train Stanford's Alpaca models, and generate the data.
transformer-ls
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
TransformerEngine
A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper GPUs, to provide better performance with lower memory utilization in both training and inference.
universal-distillation
🧪Create domain-adapted language models by distilling from many pre-trained LMs