EricLBuehler / candle-lora

Low rank adaptation (LoRA) for Candle.

How to use a candle-lora model with a Rust axum web server

arthasyou opened this issue

The axum code is below:

// Load the LoRA-adapted Llama model with the given LoRA, linear, and embedding configs.
let model = Llama::load(
        vb,
        &cache,
        &config,
        false,
        loraconfig,
        linearconfig,
        embedconfig,
    )
    .unwrap();

Router::new().layer(Extension(model))

The error is:

`(dyn EmbeddingLayerLike + 'static)` cannot be sent between threads safely
the trait `Send` is not implemented for `(dyn EmbeddingLayerLike + 'static)`
the trait `tower_layer::Layer<S>` is implemented for `Extension<T>`
required for `Arc<(dyn EmbeddingLayerLike + 'static)>` to implement `Send`
required for `Extension<LoraLLM>` to implement `tower_layer::Layer<Route>`
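
For context, axum's Extension layer requires the wrapped value to be Clone + Send + Sync + 'static, and an Arc<dyn Trait> is only Send + Sync when those bounds are part of the trait-object type itself. A minimal sketch of the distinction, using a hypothetical stand-in for the EmbeddingLayerLike trait:

use std::sync::Arc;

// Hypothetical stand-in for candle-lora's EmbeddingLayerLike trait.
trait EmbeddingLayerLike {}

struct Embedding;
impl EmbeddingLayerLike for Embedding {}

// Compile-time check that a value can be shared across threads.
fn assert_send_sync<T: Send + Sync>(_t: &T) {}

fn main() {
    // Without `+ Send + Sync` on the trait object, this fails with the same
    // error as above: the trait `Send` is not implemented for
    // `(dyn EmbeddingLayerLike + 'static)`.
    // let layer: Arc<dyn EmbeddingLayerLike> = Arc::new(Embedding);
    // assert_send_sync(&layer); // error[E0277]

    // With the bounds in the trait-object type, the Arc is Send + Sync and a
    // model holding it can satisfy axum's Extension requirements.
    let layer: Arc<dyn EmbeddingLayerLike + Send + Sync> = Arc::new(Embedding);
    assert_send_sync(&layer);
}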

I found the problem: the "trc" dependency should use "SharedTrc".

Yes, that is true. I will update the code shortly.

I have added some code which adds Send + Sync bounds to the macro generation. Could you please try it again?
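
For illustration only (not the exact patch), the effect of that change is roughly that the macro-generated trait-object fields now carry the bounds, as in this hypothetical sketch:

use std::sync::Arc;

// Hypothetical sketch of where the bounds land in the generated code; the
// real candle-lora macro output may differ.
trait EmbeddingLayerLike {}

#[allow(dead_code)]
struct LoraLLM {
    // Before the fix: no thread-safety bounds on the trait object, so the
    // containing model is not Send and Extension<LoraLLM> cannot be used as
    // a layer.
    // embedding: Arc<dyn EmbeddingLayerLike>,

    // After adding Send + Sync bounds in the macro generation:
    embedding: Arc<dyn EmbeddingLayerLike + Send + Sync>,
}

fn main() {}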

Thanks, it worked!
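
For reference, a minimal sketch of wiring the loaded model into an axum router, assuming axum 0.7+ with tokio; the LoraLLM type here is a placeholder standing in for the model returned by Llama::load, and the /generate route and handler are hypothetical:

use std::sync::Arc;

use axum::{routing::post, Extension, Router};

// Placeholder for the loaded candle-lora model (e.g. the value returned by
// Llama::load above); all that matters here is that it is Send + Sync.
struct LoraLLM;

impl LoraLLM {
    fn generate(&self, prompt: &str) -> String {
        // Real inference would run the model; this is a stub.
        format!("echo: {prompt}")
    }
}

// The handler pulls the shared model out of the request extensions.
async fn generate(Extension(model): Extension<Arc<LoraLLM>>, prompt: String) -> String {
    model.generate(&prompt)
}

#[tokio::main]
async fn main() {
    let model = Arc::new(LoraLLM);

    // Extension<T> requires T: Clone + Send + Sync + 'static. Arc provides
    // Clone; the Send + Sync bounds on the model provide the rest.
    let app = Router::new()
        .route("/generate", post(generate))
        .layer(Extension(model));

    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}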