SIONIC AI's repositories
vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
llama3.cuda-ko
llama3.cuda is a pure C/CUDA implementation of the Llama 3 model.
kubespray
Deploy a Production Ready Kubernetes Cluster
Data_KoSuperNI
Korean translation of the StrategyQA dataset
CICERO_Ko
This repository introduces new dialogue-level commonsense inference datasets and tasks. We chose dialogues as the data source because they are known to be complex and rich in commonsense.
privacy
Sionic AI Inc. privacy policy
webgpu-llm-loader
A loader for trying out LLMs that run on WebGPU.
notion-blog
Deploy your own Notion-powered website in minutes with Next.js and Vercel.
web-llm
Bringing large-language models and chat to web browsers. Everything runs inside the browser with no server support.
mlc-llm
Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.
web-stable-diffusion
Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.