xlang-ai / instructor-embedding

[ACL 2023] One Embedder, Any Task: Instruction-Finetuned Text Embeddings


Different types of embedding used internally inside the tool?

pradeepdev-1995 opened this issue · comments

Which types of embeddings (e.g., Sentence Transformers, spaCy, BERT, GloVe, OpenAI, Cohere, Hugging Face, etc.) are used internally in instructor-embedding?

Or does instructor-embedding use its own embeddings?

Please explain what is happening internally.

Hi, thanks a lot for your interest in INSTRUCTOR!

The embedding types you list rely on different backend models, with varying architectures and training data. INSTRUCTOR does not wrap any of them: it is its own general-purpose embedding model, built on the T5 architecture and instruction-finetuned so that a single model can adapt efficiently to many tasks and domains. For details, please refer to our paper: https://arxiv.org/abs/2212.09741

Feel free to re-open the issue if you have any further questions or comments!