ljwh / mobllm

An inference engine for high-performance execution of LLMs on mobile devices.

License: Apache License 2.0