kemolo's repositories

AetherConverTools

An AI redrawing toolkit for the Aether school

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

axolotl

Go ahead and axolotl questions

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

Bert-VITS2

vits2 backbone with bert

License: AGPL-3.0 · Stargazers: 0 · Issues: 0

bytepiece

A purer tokenizer with a higher compression rate

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

CaMeLS

Codebase for Context-aware Meta-learned Loss Scaling (CaMeLS). https://arxiv.org/abs/2305.15076.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

control-lora-v2

ControlLoRA Version 2: a lightweight neural network for controlling Stable Diffusion spatial information

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

Cutie

[arXiv 2023] Putting the Object Back Into Video Object Segmentation

Language: Python · License: GPL-3.0 · Stargazers: 0 · Issues: 0

Depth-Anything

Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

detect-pretrain-code

This repository provides the original implementation of "Detecting Pretraining Data from Large Language Models" by *Weijia Shi, *Anirudh Ajith, Mengzhou Xia, Yangsibo Huang, Daogao Liu, Terra Blevins, Danqi Chen, and Luke Zettlemoyer.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

DoLa

Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"

Language: Python · Stargazers: 0 · Issues: 0

function_vectors

Function Vectors in Large Language Models

Stargazers: 0 · Issues: 0

Genshin_Datasets

Genshin Datasets For SVC/SVS/TTS

Stargazers: 0 · Issues: 0

grok-1

Grok open release

License: Apache-2.0 · Stargazers: 0 · Issues: 0

h2o-llmstudio

H2O LLM Studio: a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

intel-extension-for-transformers

⚡ Build your chatbot within minutes on your favorite device; offers SOTA compression techniques for LLMs; runs LLMs efficiently on Intel platforms ⚡

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

intercode

Code repository for the InterCode benchmark. https://arxiv.org/abs/2306.14898

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

LLM-Agent-Paper-List

The paper list accompanying "The Rise and Potential of Large Language Model Based Agents: A Survey" by Zhiheng Xi et al.

Stargazers: 0 · Issues: 0

llm-viz

3D visualization of a GPT-style LLM

Language: TypeScript · Stargazers: 0 · Issues: 0

llm_multiagent_debate

Code for "Improving Factuality and Reasoning in Language Models through Multiagent Debate"

Language: Python · Stargazers: 0 · Issues: 0

MathGLM

Official PyTorch implementation of MathGLM

Language: Python · Stargazers: 0 · Issues: 0

Medusa

Medusa: a simple framework for accelerating LLM generation with multiple decoding heads

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

MetaGPT

🌟 The multi-agent framework: given a one-line requirement, return a PRD, design, tasks, and repo

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

Omost

Your image is almost there!

License: Apache-2.0 · Stargazers: 0 · Issues: 0

prose-benchmarks

PROSE Public Benchmark Suite

License: NOASSERTION · Stargazers: 0 · Issues: 0

Skywork

Skywork series models are pre-trained on 3.2 TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sourced the model weights, training data, evaluation data, and evaluation methods.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0