There are 3 repositories under the offline-inference topic.
Run large language models like Qwen and LLaMA locally on Android for offline, private, real-time question answering and chat - powered by ONNX Runtime.
A comprehensive toolkit that streamlines offline LLM inference across various models and libraries.
A multimodal offline counterfeit-detection system (text + image + table).