CoreMLaMa: LaMa for Core ML

This repo contains a script for converting a LaMa (aka cute, fuzzy 🦙) model to Apple's Core ML model format. More specifically, it converts the implementation of LaMa from Lama Cleaner.

This repo also includes a simple example of how to use the Core ML model for prediction. See Sample.
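If you want to try the model from Python instead, coremltools can run the converted package directly. The sketch below is a minimal, hedged example: the input names ("image", "mask") and the input size are assumptions on my part, so check the package's input descriptions (in Xcode, or via `model.get_spec()`) before relying on them.

    import coremltools as ct
    from PIL import Image

    # Load the package produced by convert_lama.py (see Conversion Instructions below)
    model = ct.models.MLModel("LaMa.mlpackage")

    # NOTE: the input names and size here are assumptions -- inspect the
    # model's input descriptions for the real names and expected dimensions.
    size = (800, 800)  # placeholder; use whatever size the converted model expects
    image = Image.open("photo.png").convert("RGB").resize(size)
    mask = Image.open("mask.png").convert("L").resize(size)

    # predict() takes a dict keyed by the model's input names
    result = model.predict({"image": image, "mask": mask})
    print(result.keys())  # one of these keys holds the inpainted output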

iOS Deployment Notes

The Core ML model this script produces was designed for macOS deployment, and it runs well there on the GPU. I have received several reports of unsuccessful attempts to run it on iOS, particularly with fp16 precision on the Neural Engine, and no reports of successful iOS deployments.

It may well be possible to run this model on iOS with some tuning of the conversion process; I simply have not attempted it. I would welcome, and credit, a PR from anyone who converts the model and gets good results on iOS. A possible starting point is sketched below.
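For anyone who wants to experiment, coremltools exposes a couple of generic knobs that are commonly tried when a model misbehaves on the Neural Engine. The sketch below is untested speculation, not a known fix: it shows how to keep the model off the ANE at load time, and notes the conversion-time option for keeping float32 precision.

    import coremltools as ct

    # Load the converted package but keep execution off the Neural Engine.
    # (This is a standard coremltools option, not something CoreMLaMa configures.)
    model = ct.models.MLModel(
        "LaMa.mlpackage",
        compute_units=ct.ComputeUnit.CPU_AND_GPU,
    )

    # At conversion time, ct.convert(...) also accepts
    # compute_precision=ct.precision.FLOAT32, which skips the default fp16
    # weight conversion for ML Program models -- another knob worth trying.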

Conversion Instructions

  1. Create a Conda environment for CoreMLaMa:

    conda create -n coremllama python=3.10 # works with mamba, too
    conda activate coremllama
    pip install -r requirements.txt
  2. Run the conversion script:

    python convert_lama.py

This script will download and convert Big LaMa to a Core ML package named LaMa.mlpackage.
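After the script finishes, a quick way to confirm the package converted cleanly is to load it and print its input and output descriptions. This is a minimal sanity check, assuming a working coremltools installation:

    import coremltools as ct

    # Load the converted package; CPU_ONLY keeps the check free of GPU/ANE variability
    model = ct.models.MLModel("LaMa.mlpackage", compute_units=ct.ComputeUnit.CPU_ONLY)

    # Print the input and output feature descriptions recorded in the package
    spec = model.get_spec()
    print(spec.description.input)
    print(spec.description.output)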

Acknowledgements and Thanks

Thanks to the authors of LaMa:

[Project page] [arXiv] [Supplementary] [BibTeX] [Casual GAN Papers Summary]

CoreMLaMa uses the LaMa model and supporting code from Lama Cleaner. Lama Cleaner makes this project much simpler.


License

Apache License 2.0

