HPC-AI Tech (hpcaitech)

Organization data from GitHub: https://github.com/hpcaitech

We are a global team helping you train and deploy your AI models

Home Page: https://hpc-ai.com/

GitHub: @hpcaitech

HPC-AI Tech's repositories

ColossalAI

Making large AI models cheaper, faster and more accessible

Language: Python | License: Apache-2.0 | Stargazers: 41228 | Issues: 385 | Issues: 1734
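
For orientation, here is a minimal, illustrative sketch of training with Colossal-AI's Booster API. It is not taken from the repository, and the launch call and plugin choices vary between releases, so treat the names and signatures as approximate:

    # Minimal sketch of the Booster workflow; run under torchrun or the colossalai launcher.
    import colossalai
    import torch
    from colossalai.booster import Booster
    from colossalai.booster.plugin import TorchDDPPlugin

    colossalai.launch_from_torch()  # older releases required a config dict argument
    model = torch.nn.Linear(1024, 1024)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    criterion = torch.nn.MSELoss()

    # The plugin decides the parallelism/memory strategy (plain DDP here; other plugins target larger models).
    booster = Booster(plugin=TorchDDPPlugin())
    model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

    x = torch.randn(8, 1024, device="cuda")
    loss = criterion(model(x), torch.randn_like(x))
    booster.backward(loss, optimizer)  # replaces loss.backward() so the plugin can hook in
    optimizer.step()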

Open-Sora

Open-Sora: Democratizing Efficient Video Production for All

Language: Python | License: Apache-2.0 | Stargazers: 27804 | Issues: 222 | Issues: 622

EnergonAI

Large-scale model inference.

Language: Python | License: Apache-2.0 | Stargazers: 629 | Issues: 21 | Issues: 50

FastFold

Optimizing AlphaFold Training and Inference on GPU Clusters

Language: Python | License: Apache-2.0 | Stargazers: 610 | Issues: 16 | Issues: 80

SwiftInfer

Efficient AI Inference & Serving

Language: Python | License: Apache-2.0 | Stargazers: 477 | Issues: 4 | Issues: 7

ColossalAI-Examples

Examples of training models with hybrid parallelism using ColossalAI

Language: Python | License: Apache-2.0 | Stargazers: 339 | Issues: 17 | Issues: 47

PaLM-colossalai

Scalable PaLM implementation in PyTorch

Language: Python | License: Apache-2.0 | Stargazers: 188 | Issues: 12 | Issues: 13

TensorNVMe

A Python library that transfers PyTorch tensors between CPU and NVMe
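
A minimal usage sketch, assuming the DiskOffloader interface described in the project's README; the class and method names may differ between versions:

    import torch
    from tensornvme import DiskOffloader  # assumed entry point

    offloader = DiskOffloader('./offload')  # directory that backs the offloaded tensors
    x = torch.rand(2, 2)
    offloader.sync_write(x)  # tensor contents are written to NVMe and freed from host memory
    offloader.sync_read(x)   # contents are read back into the same tensor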

CachedEmbedding

A memory-efficient DLRM training solution using ColossalAI

Language: Python | License: Apache-2.0 | Stargazers: 106 | Issues: 6 | Issues: 3

SkyComputing

Sky Computing: Accelerating Geo-distributed Computing in Federated Learning

Language: Python | License: Apache-2.0 | Stargazers: 91 | Issues: 11 | Issues: 1

Titans

A collection of models built with ColossalAI

Language: Python | License: Apache-2.0 | Stargazers: 32 | Issues: 7 | Issues: 7

ColossalAI-Documentation

Documentation for Colossal-AI

Language: JavaScript | License: Apache-2.0 | Stargazers: 23 | Issues: 6 | Issues: 11

Oh-My-Dockerfile

A collection of dockerfiles for various tasks

Language: Dockerfile | License: Apache-2.0 | Stargazers: 22 | Issues: 0 | Issues: 0

Elixir

Elixir: Train a Large Language Model on a Small GPU Cluster

Language: Python | Stargazers: 15 | Issues: 2 | Issues: 0

public_assets

Storing publicly available assets such as images, animations and texts

Language: Python | License: Apache-2.0 | Stargazers: 15 | Issues: 7 | Issues: 2

transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Language: Python | License: Apache-2.0 | Stargazers: 11 | Issues: 1 | Issues: 0
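
Since this is a fork of the upstream Hugging Face library, a one-line pipeline call is the quickest illustration of what it provides; this is a sketch, and the default checkpoint is downloaded on first use:

    from transformers import pipeline

    # Downloads a small default checkpoint the first time it runs.
    classifier = pipeline("sentiment-analysis")
    print(classifier("Colossal-AI makes training large models more affordable."))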

ColossalAI-Platform-CLI

CLI for ColossalAI Platform

Language: Python | License: Apache-2.0 | Stargazers: 10 | Issues: 5 | Issues: 1

GPT-Demo

GPT Demo with hybrid distributed training

Language: Python | License: Apache-2.0 | Stargazers: 10 | Issues: 3 | Issues: 2

mmdetection-examples

Train mmdetection models with ColossalAI.

Language: Python | License: Apache-2.0 | Stargazers: 4 | Issues: 3 | Issues: 7

CANN-Installer

This repository contains Huawei Ascend CANN files

Cloud-Platform-Docs

Documentation for our cloud platform

Language: JavaScript | Stargazers: 1 | Issues: 5 | Issues: 0

graphrag

A modular graph-based Retrieval-Augmented Generation (RAG) system

Language: Python | License: MIT | Stargazers: 1 | Issues: 0 | Issues: 0
Language: Python | License: Apache-2.0 | Stargazers: 1 | Issues: 0 | Issues: 0

torchrec

Pytorch domain library for recommendation systems

Language: Python | License: BSD-3-Clause | Stargazers: 1 | Issues: 1 | Issues: 0
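
For context, a small sketch of TorchRec's core sparse-feature abstractions; the names follow TorchRec's upstream documentation, not anything specific to this fork:

    import torch
    from torchrec import EmbeddingBagCollection, EmbeddingBagConfig, KeyedJaggedTensor

    table = EmbeddingBagConfig(
        name="product_table", embedding_dim=16, num_embeddings=1000, feature_names=["product"]
    )
    ebc = EmbeddingBagCollection(tables=[table], device=torch.device("cpu"))

    # A batch of 3 examples holding 2, 0 and 1 "product" ids respectively.
    features = KeyedJaggedTensor(
        keys=["product"],
        values=torch.tensor([11, 42, 7]),
        lengths=torch.tensor([2, 0, 1]),
    )
    pooled = ebc(features)          # KeyedTensor of pooled embeddings per feature
    print(pooled["product"].shape)  # torch.Size([3, 16])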

pytest-testmon

Selects tests affected by changed files. Executes the right tests first. Continuous test runner when used with pytest-watch.

Language: Python | License: AGPL-3.0 | Stargazers: 0 | Issues: 1 | Issues: 0

TensorRT-LLM

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way.

Language: C++ | License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0
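
The description above refers to the high-level Python API; a hedged sketch of it follows. The LLM and SamplingParams names come from NVIDIA's LLM API documentation, the model name is only an example, and fields vary by release:

    from tensorrt_llm import LLM, SamplingParams  # high-level LLM API

    # Building or loading the TensorRT engine happens inside the LLM constructor.
    llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")
    params = SamplingParams(temperature=0.8, top_p=0.95)
    for output in llm.generate(["What is tensor parallelism?"], params):
        print(output.outputs[0].text)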

TensorRT-Model-Optimizer

A unified library of state-of-the-art model optimization techniques like quantization, pruning, distillation, speculative decoding, etc. It compresses deep learning models for downstream deployment frameworks like TensorRT-LLM or TensorRT to optimize inference speed.

License: Apache-2.0 | Stargazers: 0 | Issues: 0 | Issues: 0
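
As a rough illustration of the post-training-quantization entry point described above, here is a sketch based on NVIDIA's Model Optimizer documentation; the module path, config name, and calibration signature are assumptions and may differ by version:

    import torch
    import modelopt.torch.quantization as mtq  # assumed module path

    model = torch.nn.Sequential(torch.nn.Linear(16, 16), torch.nn.ReLU())
    calib_data = [torch.randn(4, 16) for _ in range(8)]

    def forward_loop(m):
        # Run a handful of calibration batches so quantizer ranges can be collected.
        for batch in calib_data:
            m(batch)

    # INT8_DEFAULT_CFG is one of the predefined quantization configs (assumed name).
    model = mtq.quantize(model, mtq.INT8_DEFAULT_CFG, forward_loop)
    # The quantized model can then be exported for TensorRT / TensorRT-LLM deployment.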