There are 127 repositories under the pretrained-models topic.
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
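As a quick illustration of what the library offers, here is a minimal inference sketch using its pipeline API; the checkpoint name is just one example of the many hosted on the Hugging Face Hub.

```python
# Minimal sketch of the Transformers pipeline API.
# The model name below is an example checkpoint; any compatible one can be substituted.
from transformers import pipeline

# Downloads the pretrained checkpoint on first use, then runs inference locally.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Pretrained models make prototyping fast."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```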
The largest collection of PyTorch image encoders / backbones. Includes train, eval, inference, and export scripts, plus pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more.
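A minimal sketch of loading one of these pretrained backbones with timm; "resnet50" stands in for any of the listed architectures, and the data-config helpers resolve the preprocessing the checkpoint expects.

```python
# Minimal sketch: load a pretrained timm backbone and run a dummy forward pass.
import torch
import timm

model = timm.create_model("resnet50", pretrained=True)  # downloads pretrained weights
model.eval()

# Resolve the input preprocessing associated with this checkpoint (useful for real images).
cfg = timm.data.resolve_data_config({}, model=model)
transform = timm.data.create_transform(**cfg)

with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))  # random tensor in place of a real image
print(logits.shape)  # torch.Size([1, 1000]) for ImageNet-1k classifiers
```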
The official repository of Qwen (通义千问), the chat and pretrained large language models proposed by Alibaba Cloud.
An easy-to-use and powerful LLM and SLM library with an awesome model zoo.
An open source implementation of CLIP.
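A minimal zero-shot image-text matching sketch with open_clip; the model name and pretrained tag are examples of available configurations, and the image path is a placeholder for a local file.

```python
# Minimal sketch: zero-shot image-text similarity with open_clip.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k")
tokenizer = open_clip.get_tokenizer("ViT-B-32")

image = preprocess(Image.open("cat.jpg")).unsqueeze(0)  # placeholder local image
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)
print(probs)  # similarity distribution over the two captions
```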
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Neural building blocks for speaker diarization: speech activity detection, speaker change detection, overlapped speech detection, speaker embedding
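A minimal diarization sketch with pyannote.audio, assuming a local audio file and a Hugging Face access token for the gated checkpoint; the pipeline name reflects a recent release and may differ across versions.

```python
# Minimal sketch: speaker diarization with a pretrained pyannote pipeline.
from pyannote.audio import Pipeline

pipeline = Pipeline.from_pretrained(
    "pyannote/speaker-diarization-3.1",
    use_auth_token="HF_TOKEN",  # gated model: requires accepting terms on the Hub
)

diarization = pipeline("meeting.wav")  # placeholder local audio file
for turn, _, speaker in diarization.itertracks(yield_label=True):
    print(f"{turn.start:.1f}s - {turn.end:.1f}s: {speaker}")
```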
A PyTorch implementation of EfficientNet
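A minimal sketch of loading ImageNet-pretrained weights with this package; "efficientnet-b0" is the smallest variant.

```python
# Minimal sketch: pretrained EfficientNet inference with efficientnet_pytorch.
import torch
from efficientnet_pytorch import EfficientNet

model = EfficientNet.from_pretrained("efficientnet-b0")  # downloads ImageNet weights
model.eval()

with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))  # random tensor in place of a real image
print(logits.shape)  # torch.Size([1, 1000])
```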
Official release of InternLM series (InternLM, InternLM2, InternLM2.5, InternLM3).
Natural Language Processing Best Practices & Examples
A treasure chest for visual classification and recognition powered by PaddlePaddle
A modular framework for vision & language multimodal research from Facebook AI Research (FAIR)
A Chinese version of CLIP, which enables Chinese cross-modal retrieval and representation generation.
Silero Models: pre-trained speech-to-text, text-to-speech and text-enhancement models made embarrassingly simple
Awesome Pretrained Chinese NLP Models: a collection of high-quality Chinese pretrained models, large models, multimodal models, and large language models.
Superduper: End-to-end framework for building custom AI applications and agents.
Pretrained PyTorch face detection (MTCNN) and face recognition (InceptionResnet) models
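A minimal detection-plus-embedding sketch with facenet-pytorch; the image path is a placeholder.

```python
# Minimal sketch: detect a face, then compute a recognition embedding.
from facenet_pytorch import MTCNN, InceptionResnetV1
from PIL import Image

mtcnn = MTCNN(image_size=160)                              # pretrained face detector
resnet = InceptionResnetV1(pretrained="vggface2").eval()   # pretrained embedding model

img = Image.open("person.jpg")       # placeholder local image
face = mtcnn(img)                    # cropped, aligned face tensor (None if no face found)
if face is not None:
    embedding = resnet(face.unsqueeze(0))  # 512-d embedding for face recognition
    print(embedding.shape)           # torch.Size([1, 512])
```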
Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab.
Chinese Language Understanding Evaluation Benchmark (中文语言理解测评基准): datasets, baselines, pre-trained models, corpora, and leaderboard
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Center for Cognitive Computing and Natural Language Research at IDEA, serving as infrastructure for Chinese AIGC and cognitive intelligence.
《大语言模型》 (Large Language Models), a Chinese-language book by 赵鑫, 李军毅, 周昆, 唐天一, and 文继荣.
A repository for storing models that have been inter-converted between various frameworks. Supported frameworks are TensorFlow, PyTorch, ONNX, OpenVINO, TFJS, TFTRT, TensorFlowLite (Float32/16/INT8), EdgeTPU, CoreML.
OpenMMLab Pre-training Toolbox and Benchmark
Chronos: Pretrained Models for Probabilistic Time Series Forecasting
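A minimal probabilistic forecasting sketch, assuming the chronos-forecasting package and the "amazon/chronos-t5-small" checkpoint; exact argument names may differ across versions.

```python
# Minimal sketch: sample-based forecasts from a pretrained Chronos model.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-small", device_map="cpu")

context = torch.tensor([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])  # toy history
forecast = pipeline.predict(context, prediction_length=4)  # samples from the predictive distribution
print(forecast.shape)  # (num_series, num_samples, prediction_length)

# Summarize the samples into quantile forecasts.
quantiles = torch.quantile(
    forecast[0].float(), q=torch.tensor([0.1, 0.5, 0.9]), dim=0)
print(quantiles)
```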
Sparsity-aware deep learning inference runtime for CPUs
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
The official Python client for the Hugging Face Hub.
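A minimal download sketch with huggingface_hub; the repo_id and filename are examples.

```python
# Minimal sketch: fetch files from the Hugging Face Hub (cached locally after first download).
from huggingface_hub import hf_hub_download, snapshot_download

# Download a single file from a model repository.
config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(config_path)

# Or mirror an entire repository snapshot.
local_dir = snapshot_download(repo_id="bert-base-uncased")
print(local_dir)
```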
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
The PyTorch-based audio source separation toolkit for researchers