pi314ever / tei-gaudi

A blazing fast inference solution for text embeddings models

Home Page: https://huggingface.co/docs/text-embeddings-inference/quick_tour


Text Embeddings Inference on Habana Gaudi

To use 🤗 text-embeddings-inference on Habana Gaudi/Gaudi2, follow these steps:

  1. Build the Docker image located in this folder with:
    docker build -f Dockerfile-hpu -t tei_gaudi .
  2. Launch a local server instance on 1 Gaudi card:
    model=BAAI/bge-large-en-v1.5
    volume=$PWD/data # share a volume with the Docker container to avoid downloading weights on every run
    
    docker run -p 8080:80 -v $volume:/data --runtime=habana \
        -e HABANA_VISIBLE_DEVICES=all \
        -e OMPI_MCA_btl_vader_single_copy_mechanism=none \
        --cap-add=sys_nice --ipc=host \
        tei_gaudi --model-id $model --pooling cls
  3. You can then send a request:
     curl 127.0.0.1:8080/embed \
         -X POST \
         -d '{"inputs":"What is Deep Learning?"}' \
         -H 'Content-Type: application/json'
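The same request can be made from Python. Below is a minimal sketch of a client for the `/embed` endpoint using only the standard library; it assumes a server launched as in step 2, listening on `127.0.0.1:8080` (the function name `embed` and the batched-input usage are illustrative, not part of TEI itself):

```python
import json
import urllib.request

def embed(texts, url="http://127.0.0.1:8080/embed"):
    """POST inputs to a running TEI /embed endpoint and return the
    parsed JSON response (a list of embedding vectors, one per input)."""
    payload = json.dumps({"inputs": texts}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (requires the server from step 2 to be running):
# vectors = embed(["What is Deep Learning?"])
# print(len(vectors), len(vectors[0]))  # batch size, embedding dimension
```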

For more information and documentation about Text Embeddings Inference, check out the README of the original repo.

Not all features of TEI are currently supported as this is still a work in progress.

The license for using TEI on Habana Gaudi is that of TEI: https://github.com/huggingface/text-embeddings-inference/blob/main/LICENSE

Please reach out to api-enterprise@huggingface.co if you have any questions.

About


License: Apache License 2.0


Languages

Rust 90.7%, Python 5.3%, JavaScript 2.5%, Dockerfile 0.9%, Shell 0.3%, Makefile 0.3%