pauldanielconway's repositories

gpt-oss

gpt-oss-120b and gpt-oss-20b are two open-weight language models by OpenAI

License: Apache-2.0 · Stars: 0 · Issues: 0

llama-cpp-python

Python bindings for llama.cpp

License: MIT · Stars: 0 · Issues: 0

FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

License: Apache-2.0 · Stars: 0 · Issues: 0

lm-evaluation-harness

A framework for few-shot evaluation of language models.

License: MIT · Stars: 0 · Issues: 0
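The few-shot evaluation that lm-evaluation-harness automates reduces, at its core, to prepending k labeled demonstrations to each test query before scoring the model's completion. A minimal, dependency-free sketch of that prompt assembly (all names hypothetical, not the harness's actual API):

```python
def build_few_shot_prompt(examples, query, k=2):
    """Prepend k labeled demonstrations to a test query.

    `examples` is a list of (question, answer) pairs; the model
    is expected to complete the final, unanswered question.
    """
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in examples[:k])
    return f"{shots}\n\nQ: {query}\nA:"

prompt = build_few_shot_prompt(
    [("2+2?", "4"), ("3+3?", "6"), ("5+5?", "10")],
    "4+4?",
    k=2,
)
print(prompt)
```

The real harness adds task-specific templates, answer extraction, and metric aggregation on top of this pattern.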

Deep-Live-Cam

Real-time face swap and one-click video deepfakes from only a single image

License: AGPL-3.0 · Stars: 0 · Issues: 0

NeMo

A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)

License: Apache-2.0 · Stars: 0 · Issues: 0

google-research

Google Research

License: Apache-2.0 · Stars: 0 · Issues: 0

trl

Train transformer language models with reinforcement learning.

License: Apache-2.0 · Stars: 0 · Issues: 0
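Training language models with reinforcement learning, as trl does, ultimately rests on policy-gradient updates such as REINFORCE: move logits toward sampled actions in proportion to reward. A toy, dependency-free sketch of one such update on a two-token categorical policy (illustrative only, not trl's API):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def reinforce_step(logits, action, reward, lr=0.5):
    """One REINFORCE update on a categorical policy.

    d/dlogit_i of log softmax(logits)[action] is (1[i==action] - p_i),
    so each logit moves toward the sampled action, scaled by reward.
    """
    probs = softmax(logits)
    return [
        l + lr * reward * ((1.0 if i == action else 0.0) - p)
        for i, (l, p) in enumerate(zip(logits, probs))
    ]

logits = [0.0, 0.0]
before = softmax(logits)[1]
logits = reinforce_step(logits, action=1, reward=1.0)
after = softmax(logits)[1]
print(before, after)  # probability of the rewarded token increases
```

Libraries like trl layer KL penalties, value baselines, and PPO-style clipping on top of this basic gradient.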

vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

License: Apache-2.0 · Stars: 0 · Issues: 0

pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

License: NOASSERTION · Stars: 0 · Issues: 0

flash-attention

Fast and memory-efficient exact attention

License: BSD-3-Clause · Stars: 0 · Issues: 0
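The "exact attention" that flash-attention accelerates is standard scaled dot-product attention, softmax(QK^T / sqrt(d)) V; FlashAttention produces the same result but uses an IO-aware tiled kernel that never materializes the full score matrix. A plain-Python reference of the exact computation (illustrative only):

```python
import math

def attention(Q, K, V):
    """Exact scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Scaled dot-product scores against every key.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        # Stable softmax over the scores.
        m = max(scores)
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        w = [x / z for x in w]
        # Weighted sum of value rows.
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```

Because the query aligns with the first key, the output is pulled toward the first value row.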

triton

Development repository for the Triton language and compiler

License: MIT · Stars: 0 · Issues: 0

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

License: Apache-2.0 · Stars: 0 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

License: Apache-2.0 · Stars: 0 · Issues: 0

scikit-learn

scikit-learn: machine learning in Python

License: BSD-3-Clause · Stars: 0 · Issues: 0
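scikit-learn's estimators share a uniform fit/predict interface, so every model is used the same way. For example, a 1-nearest-neighbor classifier on toy one-dimensional data (illustrative data only):

```python
from sklearn.neighbors import KNeighborsClassifier

# Two well-separated clusters: labels 0 around x=0..1, labels 1 around x=10..11.
X = [[0.0], [1.0], [10.0], [11.0]]
y = [0, 0, 1, 1]

clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
preds = clf.predict([[0.5], [10.5]])
print(preds)
```

Swapping in any other classifier (e.g. `DecisionTreeClassifier`) changes only the constructor line; `fit` and `predict` stay the same.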

accelerate

🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support

License: Apache-2.0 · Stars: 0 · Issues: 0

Qwen2.5-Coder

Qwen2.5-Coder is the code version of Qwen2.5, the large language model series developed by the Qwen team at Alibaba Cloud.

Stars: 0 · Issues: 0

gpt-2-simple

Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts

Language: Python · License: NOASSERTION · Stars: 0 · Issues: 0

peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.

License: Apache-2.0 · Stars: 0 · Issues: 0

Awesome-Code-LLM

πŸ‘¨β€πŸ’» An awesome and curated list of best code-LLM for research.

License: MIT · Stars: 0 · Issues: 0

tqdm

⚡ A Fast, Extensible Progress Bar for Python and CLI

License: NOASSERTION · Stars: 0 · Issues: 0
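tqdm's core idea is to wrap an iterable and redraw a text bar on each step while yielding items unchanged. A minimal, dependency-free imitation of that pattern (not tqdm's actual implementation; names hypothetical):

```python
import sys

def progress_bar(iterable, total, width=20):
    """Yield items from `iterable` while redrawing a text bar on stderr."""
    for i, item in enumerate(iterable, 1):
        filled = width * i // total
        bar = "#" * filled + "-" * (width - filled)
        # Carriage return rewrites the same terminal line each step.
        sys.stderr.write(f"\r|{bar}| {i}/{total}")
        yield item
    sys.stderr.write("\n")

results = [x * x for x in progress_bar(range(5), total=5)]
print(results)
```

Because the wrapper only decorates iteration, it composes with comprehensions and loops exactly like the bare iterable; tqdm adds rate estimation, ETA, and terminal-width handling on top.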

GPT-SoVITS

One minute of voice data is enough to train a good TTS model (few-shot voice cloning).

License: MIT · Stars: 0 · Issues: 0

Awesome-LLM

Awesome-LLM: a curated list of Large Language Model resources

License: CC0-1.0 · Stars: 0 · Issues: 0

datasets

🤗 The largest hub of ready-to-use datasets for ML models, with fast, easy-to-use, and efficient data manipulation tools

License: Apache-2.0 · Stars: 0 · Issues: 0

MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone

License: Apache-2.0 · Stars: 0 · Issues: 0

evals

Evals is a framework for evaluating LLMs and LLM systems, and an open-source registry of benchmarks.

License: NOASSERTION · Stars: 0 · Issues: 0

DeepSeek-Coder-V2

DeepSeek-Coder-V2: Breaking the Barrier of Closed-Source Models in Code Intelligence

License: MIT · Stars: 0 · Issues: 0

trax

Trax: Deep Learning with Clear Code and Speed

License: Apache-2.0 · Stars: 0 · Issues: 0

ERNIE

Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.

Stars: 0 · Issues: 0