There are 5 repositories under the pre-train topic.
Multi-label Classification with BERT; Fine Grained Sentiment Analysis from AI challenger
Chinese pre-trained XLNet model: Pre-Trained Chinese XLNet_Large
[MICCAI'23] Foundation Model for Endoscopy Video Analysis via Large-scale Self-supervised Pre-train
Codes and datasets for AAAI-2021 paper "Learning to Pre-train Graph Neural Networks"
📦 Repomix (formerly Repopack) is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, and Gemini.