Interactions (Interactions-AI)


Location: United States of America

Home Page: https://www.interactions.com


Interactions' repositories

odin

Lightweight Machine Learning Framework for workflows with Kubernetes

Language: Python | License: Apache-2.0 | Stargazers: 4 | Issues: 0

sample-odin-configs

Sample configs for setting up Odin locally

Stargazers: 1 | Issues: 0

sample-odin-pipelines

Sample pipelines built with Odin

Stargazers: 1 | Issues: 0

espnet

End-to-End Speech Processing Toolkit

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0

ggml

Tensor library for machine learning

Language: C++ | License: MIT | Stargazers: 0 | Issues: 0

NeMo

NeMo: a toolkit for conversational AI

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0

NeMo-I

NeMo: a toolkit for conversational AI

Language: Python | License: Apache-2.0 | Stargazers: 0 | Issues: 0

onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator

Language: C++ | License: MIT | Stargazers: 0 | Issues: 0

riva-asrlib-decoder

Standalone implementation of the CUDA-accelerated WFST Decoder available in Riva

Language: Python | Stargazers: 0 | Issues: 0

silero-vad

Silero VAD: pre-trained enterprise-grade Voice Activity Detector

Language: Python | License: MIT | Stargazers: 0 | Issues: 0

TensorRT

NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.

Language: C++ | License: Apache-2.0 | Stargazers: 0 | Issues: 0

TensorRT-LLM

TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines that contain state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that execute those TensorRT engines.

Language: C++ | License: Apache-2.0 | Stargazers: 0 | Issues: 0

triton-client

Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala.

Language: C++ | License: BSD-3-Clause | Stargazers: 0 | Issues: 0

triton-server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.

Language: Python | License: BSD-3-Clause | Stargazers: 0 | Issues: 0