
Run CLIP on iPhone to Search Photos.

Home Page: https://queryable.app


Queryable

This is the open-source code of Queryable, an iOS app that uses the CLIP model to search the Photos album entirely offline.

Unlike the object-recognition-based search built into the iOS Photos app, Queryable lets you search your gallery with natural-language statements such as "a brown dog sitting on a bench". Because it runs entirely offline, your album never leaves your device and cannot be accessed by anyone, including Apple or Google.

Blog | Website | App Store

Performance

*(Demo video: final.mp4)*

How does it work?

  • Process all photos in your album through the CLIP Image Encoder to build a set of local image vectors.
  • When a new text query is entered, convert it into a text vector using the Text Encoder.
  • Compare the text vector against all stored image vectors, computing the similarity between the text query and each image.
  • Sort the results and return the top K most similar images.
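The ranking described in the last two steps can be sketched in plain Python. This is a toy illustration with 3-dimensional vectors instead of CLIP's real embeddings, and the helper names are hypothetical (Queryable itself implements this in Swift):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(text_vector, image_vectors, k=3):
    """Score every stored image vector against the query and return the k best."""
    scored = [(photo_id, cosine_similarity(text_vector, vec))
              for photo_id, vec in image_vectors.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Pretend these image vectors were produced by the CLIP Image Encoder.
images = {
    "IMG_001": [0.9, 0.1, 0.0],
    "IMG_002": [0.0, 1.0, 0.0],
    "IMG_003": [0.7, 0.6, 0.2],
}
# Pretend this text vector was produced by the Text Encoder.
query = [1.0, 0.0, 0.0]
print(top_k(query, images, k=2))
```

In the real app the vectors are 512-dimensional and precomputed once per photo, so a query only costs one Text Encoder pass plus the similarity scan.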

The process is as follows:

For more details, please refer to my article Run CLIP on iPhone to Search Photos.

Run on Xcode

Download ImageEncoder_float32.mlmodelc and TextEncoder_float32.mlmodelc from Google Drive. Clone this repo, place the downloaded models under the CoreMLModels/ directory, and build the project in Xcode; it should work.

Core ML Export

If you only want to run Queryable, you can skip this step and use the exported models from Google Drive directly. If you wish to build a version of Queryable that supports your own native language, or to do model quantization/acceleration work, here are some guidelines.

The trick is to separate the TextEncoder and the ImageEncoder at the architecture level, then load the model weights individually. Queryable uses OpenAI's ViT-B/32 model, and I wrote a Jupyter notebook demonstrating how to separate, load, and export the Core ML models. Note that the exported Core ML ImageEncoder has some precision error, and more appropriate normalization parameters may be needed.
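The separation idea can be sketched as follows. This is a rough illustration using PyTorch, the OpenAI `clip` package, and coremltools, not the repo's actual notebook; the wrapper class, input shapes, and output filename are assumptions:

```python
import numpy as np
import torch
import clip
import coremltools as ct

class TextEncoder(torch.nn.Module):
    """Wraps only the text branch of CLIP so it can be traced and exported alone.
    (Hypothetical wrapper; the repo's notebook is the authoritative version.)"""
    def __init__(self, clip_model):
        super().__init__()
        self.model = clip_model

    def forward(self, tokens):
        return self.model.encode_text(tokens)

# Load the full CLIP model, then expose just the text branch.
model, _ = clip.load("ViT-B/32", device="cpu")
model.eval()
text_encoder = TextEncoder(model)

# Trace with an example tokenized query (CLIP tokenizes to a fixed length of 77).
example_tokens = clip.tokenize(["a brown dog sitting on a bench"])
traced = torch.jit.trace(text_encoder, example_tokens)

# Convert the traced text branch to Core ML on its own.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="text", shape=example_tokens.shape, dtype=np.int32)],
)
mlmodel.save("TextEncoder.mlpackage")
```

The ImageEncoder is exported the same way with a wrapper around `encode_image` and an example image tensor; this is where the precision/normalization caveat above applies.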

Contributions

Disclaimer: I am not a professional iOS engineer, so please forgive my rough Swift code. You may want to focus only on the model's loading, computation, storage, and sorting.

You can apply Queryable to your own product, but I don't recommend simply modifying the appearance and listing it on the App Store. If you are interested in optimizing certain aspects (such as mazzzystar#3, mazzzystar#4, mazzzystar#5, mazzzystar#6), feel free to submit a PR.

If you have any questions or suggestions, here are some ways to reach me: Discord | Twitter | Reddit: r/Queryable.

License

MIT License

Copyright (c) 2023 Ke Fang
