Jan (janhq)

An open source alternative to OpenAI that runs on your own computer or server

Home Page: https://jan.ai

Twitter: @janframework

Jan's repositories

jan

Jan is an open source alternative to ChatGPT that runs 100% offline on your computer, with support for multiple engines (llama.cpp, TensorRT-LLM).

Language: TypeScript · License: AGPL-3.0 · Stargazers: 22788 · Issues: 127 · Issues: 1824
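Jan runs fully offline and, like the other OpenAI-compatible servers in this list, speaks the OpenAI chat-completions wire format. A minimal sketch of building such a request with the standard library; the local host, port, path, and model name below are assumptions for illustration, not values confirmed by this listing.

```python
import json
from urllib import request

# Assumed local endpoint for an OpenAI-compatible server such as Jan's;
# the port and model name are illustrative, not taken from Jan's docs here.
JAN_URL = "http://localhost:1337/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "llama3-8b-instruct") -> request.Request:
    """Build an OpenAI-style chat-completion request aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return request.Request(
        JAN_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Why run an LLM offline?")
print(req.full_url)  # http://localhost:1337/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` would only succeed with a compatible server actually listening on that port.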

cortex.cpp

Run and customize local LLMs.

Language: C++ · License: Apache-2.0 · Stargazers: 1967 · Issues: 16 · Issues: 627

cortex.tensorrt-llm

Cortex.Tensorrt-LLM is a C++ inference library that can be loaded by any server at runtime. It submodules NVIDIA's TensorRT-LLM for GPU-accelerated inference on NVIDIA GPUs.

Language: C++ · License: Apache-2.0 · Stargazers: 39 · Issues: 2 · Issues: 20

docs

Jan.ai Website & Documentation

cortex.llamacpp

cortex.llamacpp is a high-efficiency C++ inference engine for edge computing. It is a dynamic library that can be loaded by any server at runtime.

Language: C++ · License: AGPL-3.0 · Stargazers: 16 · Issues: 5 · Issues: 18
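cortex.llamacpp is described as a dynamic library that a server loads at runtime. Its exported symbols are not documented in this listing, so the following sketch demonstrates the runtime-loading pattern itself using Python's `ctypes` with the C math library standing in for the engine.

```python
import ctypes
import ctypes.util

# Sketch of loading a shared library at runtime, the way a host server
# might load cortex.llamacpp. cortex.llamacpp's actual exported symbols
# are not documented here, so libm stands in for the loaded engine.
path = ctypes.util.find_library("m") or "libm.so.6"
engine = ctypes.CDLL(path)

# Declare the signature of the symbol resolved from the library.
engine.sqrt.restype = ctypes.c_double
engine.sqrt.argtypes = [ctypes.c_double]

print(engine.sqrt(9.0))  # 3.0
```

A real host would resolve the engine's own entry points the same way, then drive inference through them.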

cortex.js

The official Node.js / TypeScript library for the OpenAI API

Language: TypeScript · License: Apache-2.0 · Stargazers: 3 · Issues: 0 · Issues: 0

cortex.python

C++ code that runs Python embedding

Language: C++ · License: AGPL-3.0 · Stargazers: 3 · Issues: 2 · Issues: 3

cortex.py

The official Python library for the OpenAI API

License: Apache-2.0 · Stargazers: 2 · Issues: 0 · Issues: 0

infinity

The AI-native database built for LLM applications, providing incredibly fast vector and full-text search

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0

llama.cpp-avx-vnni

Port of Facebook's LLaMA model in C/C++

Language: C++ · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

openai_trtllm

OpenAI compatible API for TensorRT LLM triton backend

Language: Rust · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

pymaker

Make the py

Stargazers: 0 · Issues: 0 · Issues: 0

tensorrtllm_backend

The Triton TensorRT-LLM Backend

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

trt-llm-as-openai-windows

This reference implementation can be used with any existing OpenAI-integrated app to run TRT-LLM inference locally on a GeForce GPU on Windows instead of in the cloud.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 1 · Issues: 0
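Both openai_trtllm and trt-llm-as-openai-windows rely on the same idea: an app already integrated with the OpenAI API only needs its API base redirected to a local, OpenAI-compatible server, while the OpenAI-style path stays unchanged. A minimal sketch of that URL rewrite; the local address is an assumption for illustration.

```python
from urllib.parse import urlsplit, urlunsplit

# Assumed address of a local OpenAI-compatible server (e.g. a TRT-LLM
# frontend); the port is illustrative, not taken from either repo.
LOCAL_BASE = "http://localhost:8000"

def redirect_to_local(openai_url: str, local_base: str = LOCAL_BASE) -> str:
    """Swap the scheme and host of an OpenAI API URL for a local server's,
    keeping the OpenAI-style path (e.g. /v1/chat/completions) intact."""
    local = urlsplit(local_base)
    remote = urlsplit(openai_url)
    return urlunsplit((local.scheme, local.netloc, remote.path, remote.query, ""))

print(redirect_to_local("https://api.openai.com/v1/chat/completions"))
# http://localhost:8000/v1/chat/completions
```

In practice most OpenAI client libraries expose this as a single base-URL setting, so no per-request rewriting is needed.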

winget-pkgs

The Microsoft community Windows Package Manager manifest repository

License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0