AbhinavAdarsh / MobiRnn

Efficient LSTM parallelization on smartphone GPU

MobiRnn on Android

Intro

This repo is for running LSTM models on mobile devices. Currently we support the following modes (a minimal plain-Java sketch of the LSTM cell follows the list):

  • Plain CPU (Java)
  • TensorFlow CPU (Java)
  • Native CPU (C)
  • Eigen CPU (C++)
  • GPU (RenderScript)
  • GPU (OpenCL)
  • GPU (Vulkan)

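For reference, here is a minimal single-step LSTM cell in plain Java, in the spirit of the Plain CPU (Java) mode. The class name PlainLstmCell, the gate ordering (input, forget, candidate, output), and the weight layout are illustrative assumptions, not MobiRNN's actual implementation.

    // Minimal single-step LSTM cell in plain Java (illustrative, not the MobiRNN code).
    // W: [4*H][D] input weights, U: [4*H][H] recurrent weights, b: [4*H] bias.
    // Gate rows are stacked as [input | forget | candidate | output].
    final class PlainLstmCell {
        // One time step: z = W*x + U*h + b, then gate the cell state c and output h in place.
        static void step(float[][] W, float[][] U, float[] b,
                         float[] x, float[] h, float[] c) {
            int H = h.length;
            float[] z = new float[4 * H];
            for (int r = 0; r < 4 * H; r++) {
                float s = b[r];
                for (int j = 0; j < x.length; j++) s += W[r][j] * x[j];  // input projection
                for (int j = 0; j < H; j++)        s += U[r][j] * h[j];  // recurrent projection
                z[r] = s;
            }
            for (int i = 0; i < H; i++) {
                float in   = sigmoid(z[i]);                       // input gate
                float fg   = sigmoid(z[H + i]);                   // forget gate
                float cand = (float) Math.tanh(z[2 * H + i]);     // candidate state
                float out  = sigmoid(z[3 * H + i]);               // output gate
                c[i] = fg * c[i] + in * cand;
                h[i] = out * (float) Math.tanh(c[i]);
            }
        }

        static float sigmoid(float v) { return 1f / (1f + (float) Math.exp(-v)); }
    }

The GPU modes listed above target this same gate arithmetic; the matrix-vector products dominate the per-step cost and are what benefits from parallelization.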
Usage

Just run ./gradlew iR to install MobiRNN on your connected phone.

You can run ./gradlew pu to generate an APK file, or simply download blob/mobile-release.apk from the repo.

About

License: MIT License


Languages

  • Java: 69.8%
  • C++: 17.6%
  • RenderScript: 11.4%
  • C: 0.7%
  • CMake: 0.5%