Mayank Mishra (mayank31398)

Company: IBM Research

Location: Boston

Home Page: https://mayank31398.github.io/

Mayank Mishra's repositories

GPTQ-for-SantaCoder

4-bit quantization of SantaCoder using GPTQ

Language: Python · Stargazers: 54 · Issues: 2
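GPTQ itself is a more involved algorithm that uses second-order information to minimize quantization error. As a dependency-free illustration of what storing weights in 4 bits means, here is a minimal sketch of plain uniform symmetric quantization with a per-tensor scale (all names and values below are illustrative, not part of the GPTQ-for-SantaCoder codebase):

```python
# Minimal sketch of uniform 4-bit weight quantization. This is NOT the
# GPTQ algorithm (which additionally corrects error layer by layer using
# second-order information); it only shows the basic 4-bit round trip.

def quantize_4bit(weights):
    """Map floats to integers in the signed 4-bit range [-8, 7]."""
    scale = max(abs(w) for w in weights) / 7  # one scale per tensor
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from 4-bit integers."""
    return [x * scale for x in q]

weights = [0.12, -0.53, 0.98, -0.07]
q, scale = quantize_4bit(weights)
approx = dequantize(q, scale)
# per-weight reconstruction error is bounded by scale / 2
```

Each weight now needs only 4 bits plus a shared scale, which is where the memory savings of quantized checkpoints come from.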

Papers-books-and-blogs

This repository contains the research papers, white papers, theses, etc. that I love.

Language: Python · Stargazers: 19 · Issues: 5

pseudo-code-instructions

Pseudo-code Instructions dataset

Language: Python · License: Apache-2.0 · Stargazers: 17 · Issues: 1

BigCode-Megatron-LM

Ongoing research training transformer models at scale

Language: Python · License: NOASSERTION · Stargazers: 1 · Issues: 1

blog

Public repo for HF blog posts

Language: Jupyter Notebook · Stargazers: 0 · Issues: 2

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

IBM-fms-fsdp

Demonstrates the throughput of PyTorch FSDP

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

MIPS-verilog

This repository contains code for a MIPS single cycle architecture written in Verilog.

Language: Verilog · Stargazers: 0 · Issues: 3

optimum

🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1
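PEFT's best-known method, LoRA, keeps the base weight W frozen and trains only a low-rank update scaled by alpha / r. The following is a pure-Python sketch of that arithmetic with toy dimensions — it shows the math, not the `peft` library API:

```python
# Toy sketch of the LoRA idea behind parameter-efficient fine-tuning:
# y = x @ W + (alpha / r) * (x @ A) @ B, where W is frozen and only the
# low-rank factors A (d_in x r) and B (r x d_out) are trained.
# Pure Python with illustrative dimensions; not the peft library API.

def matmul(X, Y):
    """Naive matrix multiply over nested lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(x, W, A, B, alpha, r):
    """Forward pass with an unmerged LoRA adapter."""
    base = matmul(x, W)                 # frozen path
    low = matmul(matmul(x, A), B)       # trainable low-rank path
    s = alpha / r
    return [[b + s * l for b, l in zip(brow, lrow)]
            for brow, lrow in zip(base, low)]
```

Because only A and B are trained, the number of trainable parameters scales with r * (d_in + d_out) instead of d_in * d_out, which is the source of PEFT's memory savings.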

rocm-apex

A PyTorch extension: tools for easy mixed-precision and distributed training in PyTorch

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

vscode-icons

Icons for Visual Studio Code

License: MIT · Stargazers: 0 · Issues: 0