Baseten (basetenlabs)

Machine learning infrastructure for developers

Home Page: https://baseten.co

Twitter: @basetenco

Baseten's repositories

truss

The simplest way to serve AI/ML models in production

Language: Python · License: MIT · Stargazers: 847 · Issues: 11 · Issues: 112
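
Truss packages a model as a directory whose `model/model.py` exposes a `Model` class with `load` and `predict` hooks: `load` runs once at startup, `predict` runs per request. A minimal sketch of that shape (the class interface follows the Truss docs; the toy uppercase "model" is a placeholder):

```python
# Sketch of a Truss model/model.py. Truss calls load() once when the
# container starts, then predict() for each request.
class Model:
    def __init__(self, **kwargs):
        # Truss passes configuration and secrets via kwargs; unused here
        self._model = None

    def load(self):
        # A real Truss would load model weights here (e.g. from Hugging Face)
        self._model = lambda text: text.upper()

    def predict(self, model_input):
        return {"output": self._model(model_input["text"])}
```

Deployed with the `truss` CLI (e.g. `truss push`), the platform invokes these hooks automatically; the Truss author never writes server code.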

truss-examples

Examples of models deployable with Truss

Language: Python · License: MIT · Stargazers: 99 · Issues: 0

starcoder-truss

Truss for deploying Starcoder to Baseten or other platforms

Language: Python · Stargazers: 12 · Issues: 0

infrastructure-take-home

Baseten infrastructure recruiting take home

Language: Python · Stargazers: 3 · Issues: 0

ControlNet

Let us control diffusion models

Language: Python · License: Apache-2.0 · Stargazers: 1 · Issues: 0

pygmalion-6b-truss

A Truss to deploy Pygmalion 6B on Baseten.

Language: Python · Stargazers: 1 · Issues: 0

chainlit-cookbook

Chainlit's cookbook repo

Language: Python · Stargazers: 0 · Issues: 0

diffusers

🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

gpu-operator

NVIDIA GPU Operator creates/configures/manages GPUs atop Kubernetes

License: Apache-2.0 · Stargazers: 0 · Issues: 0

kaniko

Build Container Images In Kubernetes

License: Apache-2.0 · Stargazers: 0 · Issues: 0

langchain

⚡ Building applications with LLMs through composability ⚡

Language: Python · License: MIT · Stargazers: 0 · Issues: 0

mpt-7b-base-truss

A deployment "truss" for the MPT-7B Base model from MosaicML

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 0

python_backend

Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python.

License: BSD-3-Clause · Stargazers: 0 · Issues: 0
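
The python_backend expects a `model.py` defining a `TritonPythonModel` class with `initialize`, `execute`, and `finalize` methods, where `execute` receives a batch of requests and must return one response per request, in order. A stripped-down sketch of that contract (real implementations wrap tensors in `triton_python_backend_utils` objects; plain dicts stand in for them here):

```python
# Skeleton of a Triton python_backend model.py. In a real deployment the
# requests/responses are pb_utils.InferenceRequest / InferenceResponse
# objects; this sketch substitutes plain dicts to show the control flow.
class TritonPythonModel:
    def initialize(self, args):
        # Called once at model load; args carries the model configuration
        self.scale = 2

    def execute(self, requests):
        # Must return exactly one response per incoming request, in order
        responses = []
        for request in requests:
            values = request["inputs"]
            responses.append({"outputs": [v * self.scale for v in values]})
        return responses

    def finalize(self):
        # Called once at model unload; release resources here
        pass
```

Triton batches requests for you, which is why `execute` operates on a list rather than a single request.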

TensorRT-LLM

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines.

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

tensorrtllm_backend

The Triton TensorRT-LLM Backend

License: Apache-2.0 · Stargazers: 0 · Issues: 0

triton-inference-server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 1 · Issues: 0

truss-public-gh-repo-test

A public GitHub repo for testing the Truss deploy flow

Language: Python · Stargazers: 0 · Issues: 0