SIONIC AI's repositories
webgpu-llm-loader
A loader that lets you try running LLMs built for WebGPU.
Data_KoSuperNI
A translation of the StrategyQA dataset.
web-stable-diffusion
Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.
unstructured
Open source libraries and APIs to build custom preprocessing pipelines for labeling, training, or production machine learning pipelines.
llama3.cuda-ko
llama3.cuda is a pure C/CUDA implementation of the Llama 3 model.
Flowise
Drag & drop UI to build your customized LLM flow
mlc-llm
Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.
notion-blog
Deploy your own Notion-powered website in minutes with Next.js and Vercel.
storm-api-example
A repository providing example code that uses the Storm API.
vllm
A high-throughput and memory-efficient inference and serving engine for LLMs