microsoft / kernel-memory

RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.

Home Page: https://microsoft.github.io/kernel-memory

[Question] can I use kernel memory to generate the embeddings and then query using Semantic Kernel?

roldengarm opened this issue · comments

Context / Scenario

We want to store millions of documents, which works well with KM as it does the heavy lifting for partitioning, queueing, etc.
We also use Semantic Kernel for our chat bot to stream results using Azure OpenAI.

Question

Kernel Memory lacks streaming of results, so I was wondering if we can use KM to generate the embeddings in Azure AI Search and then plug that index into Semantic Kernel. I guess it should work, but I'm just checking whether this is the best approach.

Currently KM and SK use different schemas and different field names, and KM relies on tags, so it's not possible out of the box. You could achieve that by creating a custom copy of the KM Azure AI Search connector and changing it to write records in the SK format.
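To illustrate the kind of translation such a custom connector would perform, here is a minimal, language-neutral sketch of mapping one record shape to another. All field names below (`id`, `payload`, `tags`, `Id`, `Text`, `AdditionalMetadata`, etc.) are assumptions made up for the example, not the actual schemas of either library; the real connectors define their own index layouts.

```python
# Hypothetical sketch: converting a KM-style index document into an
# SK-style memory record. Field names on both sides are illustrative
# assumptions, NOT the real Kernel Memory or Semantic Kernel schemas.

def km_to_sk_record(km_doc: dict) -> dict:
    """Map a KM-style document (hypothetical shape) to an SK-style
    memory record (hypothetical shape)."""
    return {
        "Id": km_doc["id"],
        "Text": km_doc["payload"].get("text", ""),
        "Embedding": km_doc["embedding"],
        # KM keeps metadata as tags (a key -> list-of-values map);
        # here we flatten them into a single metadata string, since
        # the SK-style record in this sketch has no tag concept.
        "AdditionalMetadata": ";".join(
            f"{key}={value}"
            for key, values in km_doc.get("tags", {}).items()
            for value in values
        ),
    }

# Example KM-style document (invented data for the sketch).
km_doc = {
    "id": "doc-001-part-0",
    "embedding": [0.1, 0.2, 0.3],
    "payload": {"text": "hello world"},
    "tags": {"user": ["alice"], "type": ["pdf"]},
}

print(km_to_sk_record(km_doc))
```

The main design question this highlights is what to do with KM's tags: any mapping that flattens them loses the ability to filter by tag on the SK side, which is part of why the two schemas aren't interchangeable out of the box.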

Thanks @dluc that answers my question :)