Thomas Wang (thomasw21)



Company: MistralAI



Organizations
bigscience-workshop

Thomas Wang's repositories

pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Language: Python · License: NOASSERTION · Stargazers: 2 · Issues: 2

datasets

🤗 The largest hub of ready-to-use NLP datasets for ML models with fast, easy-to-use and efficient data manipulation tools

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 1
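
As a quick illustration of what the datasets library provides, here is a minimal sketch against the upstream 🤗 datasets API (not this fork specifically); the dataset name "squad" is only an example:

    from datasets import load_dataset

    # Download a benchmark dataset from the Hugging Face hub and inspect one example.
    squad = load_dataset("squad", split="validation")
    print(squad[0]["question"])

    # Datasets are backed by Arrow tables, so column-wise transforms are cheap.
    with_lengths = squad.map(lambda ex: {"question_len": len(ex["question"])})
    print(with_lengths[0]["question_len"])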

apex

A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

bigscience

Codebase for the engineering/scaling working group

Language: Shell · License: NOASSERTION · Stargazers: 0 · Issues: 1

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

drjit

Dr.Jit: a just-in-time compiler for differentiable rendering

Language: C++ · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

eval_t0_deepspeed

Evaluate T0 with DeepSpeed

Language: Python · Stargazers: 0 · Issues: 1

FlexFlow

A distributed deep learning framework that supports flexible parallelization strategies.

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 1

lm-evaluation-harness

A framework for few-shot evaluation of autoregressive language models.

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

lxmls-toolkit

Machine learning applied to natural language processing: the toolkit used in the Lisbon Machine Learning Summer School

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

Megatron-DeepSpeed

Ongoing research on training transformer language models at scale, including BERT and GPT-2

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0

Megatron-LM

Ongoing research on training transformer models at scale

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 1

metaseq

Repo for external large-scale work

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

nerfacc

A General NeRF Acceleration Toolbox in PyTorch.

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

nerfstudio

A collaboration-friendly studio for NeRFs

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

promptsource

Toolkit for collecting and applying templates of prompting instances

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

scripts

Dump of all the scripts that are not part of any specific project.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

seqio

Task-based datasets, preprocessing, and evaluation for sequence models.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

svox2

Plenoxels: Radiance Fields without Neural Networks (code release in progress)

License: BSD-2-Clause · Stargazers: 0 · Issues: 0

text-to-text-transfer-transformer

Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

transformers

🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1
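
For context, a minimal sketch of the upstream 🤗 Transformers pipeline API (this fork may differ from upstream; the model used below is whatever default the pipeline downloads):

    from transformers import pipeline

    # Build a sentiment-analysis pipeline; a default pretrained model is fetched on first use.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Training large language models at scale is hard but rewarding."))
    # -> e.g. [{'label': 'POSITIVE', 'score': 0.99...}]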

trl

Train transformer language models with reinforcement learning.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0