Sourab Mangrulkar (pacman100)


Company: Amazon

Location: 🇮🇳

Home Page: https://pacman100.github.io/

Twitter: @sourab_m


Sourab Mangrulkar's starred repositories


dalle-mini

DALL·E Mini - Generate images from a text prompt

Language: Python · License: Apache-2.0 · Stargazers: 14726 · Issues: 0

legacy-cc

The earliest versions of the very first C compiler known to exist in the wild, written by the late legend himself, dmr (Dennis Ritchie).

Language: C · License: NOASSERTION · Stargazers: 3744 · Issues: 0

pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Language: Python · License: NOASSERTION · Stargazers: 81630 · Issues: 0
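As a tiny illustration of the "tensors and dynamic neural networks" in the description above, here is a minimal sketch of PyTorch's eager autograd; the values are arbitrary and not taken from any repository on this page.

```python
import torch

x = torch.randn(3, 3, requires_grad=True)  # a tensor that records operations for autograd
y = (x * x).sum()
y.backward()                               # gradients are computed dynamically: dy/dx = 2x
print(torch.allclose(x.grad, 2 * x))       # True
```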

aiaiart

Course content and resources for the AIAIART course.

Language: Jupyter Notebook · License: MIT · Stargazers: 564 · Issues: 0

blog

Public repo for HF blog posts

Language: Jupyter Notebook · Stargazers: 2228 · Issues: 0

DALLE2-pytorch

Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in PyTorch

Language: Python · License: MIT · Stargazers: 11028 · Issues: 0

accelerate

🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, mixed-precision

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0
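To illustrate the description above, a minimal sketch of the usual 🤗 Accelerate pattern: the same training loop runs on CPU, multi-GPU or TPU once the objects are passed through `Accelerator.prepare`. The toy model, data and learning rate are placeholders, not anything from this profile.

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()  # reads the launch configuration (devices, mixed precision)

# Toy model and data, purely illustrative
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
dataloader = torch.utils.data.DataLoader(dataset, batch_size=8)

# prepare() moves everything to the right device(s) and shards the dataloader per process
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

model.train()
for inputs, labels in dataloader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(inputs), labels)
    accelerator.backward(loss)  # replaces loss.backward() so mixed-precision scaling is handled
    optimizer.step()
```

Launched via `accelerate launch script.py`, the same file scales from a single GPU to a multi-process setup without code changes.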

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0
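A small sketch of the `pipeline` API from 🤗 Transformers; the task name is the standard sentiment-analysis example and the input string is made up.

```python
from transformers import pipeline

# Downloads a default pretrained model for the task on first use
classifier = pipeline("sentiment-analysis")
print(classifier("Starred repositories make a handy reading list."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```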

datasets

🤗 Fast, efficient, open-access datasets and evaluation metrics in PyTorch, TensorFlow, NumPy and Pandas

License: Apache-2.0 · Stargazers: 1 · Issues: 0
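A quick sketch of the 🤗 Datasets API; "imdb" is just a well-known public dataset used as an example, not something tied to this profile.

```python
from datasets import load_dataset

dataset = load_dataset("imdb", split="train")   # cached, Arrow-backed dataset
print(dataset[0]["text"][:80])

# map() applies a function over every example (optionally batched) and caches the result
with_lengths = dataset.map(lambda example: {"n_chars": len(example["text"])})
print(with_lengths[0]["n_chars"])
```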

approachingalmost

Approaching (Almost) Any Machine Learning Problem

Stargazers: 7188 · Issues: 0

djl

An Engine-Agnostic Deep Learning Framework in Java

Language: Java · License: Apache-2.0 · Stargazers: 4046 · Issues: 0

tflite-android-transformers

DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps

Language: Java · License: Apache-2.0 · Stargazers: 387 · Issues: 0

fashion-compatibility

Learning Type-Aware Embeddings for Fashion Compatibility

Language: Python · License: BSD-3-Clause · Stargazers: 152 · Issues: 0

awesome-fashion-ai

A repository to curate and summarise research papers related to fashion and e-commerce

Stargazers: 1161 · Issues: 0

ML-YouTube-Courses

📺 Discover the latest machine learning / AI courses on YouTube.

License: CC0-1.0 · Stargazers: 14666 · Issues: 0

server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.

Language: Python · License: BSD-3-Clause · Stargazers: 7940 · Issues: 0

polyvore-dataset

Dataset used in paper "Learning Fashion Compatibility with Bidirectional LSTMs"

License: Apache-2.0 · Stargazers: 191 · Issues: 0

UQ360

Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you estimate, communicate and use uncertainty in machine learning model predictions.

Language: Python · License: Apache-2.0 · Stargazers: 254 · Issues: 0

LID-tool

This code provides a word-level language identification tool that identifies the language of each individual word in code-mixed text, e.g. text that mixes words from two languages, such as Hindi written in Roman script interleaved with English.

Language: Python · License: MIT · Stargazers: 49 · Issues: 0

ecco

Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT-2, BERT, RoBERTa, T5, and T0).

Language: Jupyter Notebook · License: BSD-3-Clause · Stargazers: 1958 · Issues: 0

gpt-neo

An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.

Language: Python · License: MIT · Stargazers: 8198 · Issues: 0

transformer-deploy

Efficient, scalable and enterprise-grade CPU/GPU inference server for 🤗 Hugging Face transformer models 🚀

Language: Python · License: Apache-2.0 · Stargazers: 1643 · Issues: 0

mesh-transformer-jax

Model parallel transformers in JAX and Haiku

Language: Python · License: Apache-2.0 · Stargazers: 6265 · Issues: 0

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries

Language: Python · License: Apache-2.0 · Stargazers: 6777 · Issues: 0

DeepSpeed

DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.

Language: Python · License: Apache-2.0 · Stargazers: 34486 · Issues: 0
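A minimal sketch of wrapping a model with DeepSpeed, assuming a recent DeepSpeed version; the config values (ZeRO stage, batch size, fp16) are illustrative only, and in practice such a script is started with the `deepspeed` launcher in a distributed environment.

```python
import torch
import deepspeed

model = torch.nn.Linear(10, 2)  # placeholder model, not from any repo on this page

ds_config = {
    "train_micro_batch_size_per_gpu": 8,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "AdamW", "params": {"lr": 1e-3}},
}

# initialize() returns an engine that handles ZeRO partitioning, fp16 and optimizer steps
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```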

fairscale

PyTorch extensions for high performance and large scale training.

Language: Python · License: NOASSERTION · Stargazers: 3125 · Issues: 0

wit

WIT (Wikipedia-based Image Text) Dataset is a large multimodal multilingual dataset comprising 37M+ image-text sets with 11M+ unique images across 100+ languages.

License: NOASSERTION · Stargazers: 989 · Issues: 0