Noeda / rllama

Rust+OpenCL+AVX2 implementation of LLaMA inference code

Heads up. Doesn't build on Arch Linux due to #![feature(stdsimd)]

trholding opened this issue · comments

RUSTFLAGS="-C target-feature=+sse2,+avx,+fma,+avx2" cargo install rllama --features opencl

error[E0554]: `#![feature]` may not be used on the stable release channel
 --> /home/experiments/.cargo/registry/src/github.com-1ecc6299db9ec823/rllama-0.3.0/src/lib.rs:1:12
  |
1 | #![feature(stdsimd)]
  |            ^^^^^^^

For more information about this error, try `rustc --explain E0554`.
error: could not compile `rllama` due to previous error
warning: build failed, waiting for other jobs to finish...
error: failed to compile `rllama v0.3.0`, intermediate artifacts can be found at `/tmp/cargo-installgHQL8m`

You need nightly Rust to compile and run rllama. Until SIMD is stabilized in Rust, this probably won't change.
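Given that, the practical fix on Arch Linux (or any distro shipping stable Rust) is to build with a nightly toolchain. A sketch of the workaround, assuming rustup is installed:

```shell
# Install the nightly toolchain and use it just for this build
# (the rest of the system keeps using stable).
rustup toolchain install nightly
cargo +nightly install rllama --features opencl
```

The `+nightly` prefix selects the toolchain for a single cargo invocation, so there is no need to change the global default with `rustup default nightly`.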

I believe it's possible to get SIMD in stable Rust with some trickery, but that requires work I don't currently want to take on.
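For context, the "trickery" alluded to is roughly this: stable Rust already exposes the x86 intrinsics through `std::arch`, combined with compile-time `#[target_feature]` attributes and runtime checks via `is_x86_feature_detected!`, so AVX2 code can compile on stable without `#![feature(stdsimd)]`. A minimal illustrative sketch (not rllama's actual code; the function names here are made up for the example):

```rust
#[cfg(target_arch = "x86_64")]
use std::arch::x86_64::*;

// AVX2 path: compiled with AVX2 enabled for this function only,
// so it must only be called after a runtime feature check.
#[cfg(target_arch = "x86_64")]
#[target_feature(enable = "avx2")]
unsafe fn add_avx2(a: &[f32; 8], b: &[f32; 8]) -> [f32; 8] {
    let va = _mm256_loadu_ps(a.as_ptr());
    let vb = _mm256_loadu_ps(b.as_ptr());
    let mut out = [0.0f32; 8];
    _mm256_storeu_ps(out.as_mut_ptr(), _mm256_add_ps(va, vb));
    out
}

// Public entry point: dispatch at runtime, fall back to scalar code.
fn add(a: &[f32; 8], b: &[f32; 8]) -> [f32; 8] {
    #[cfg(target_arch = "x86_64")]
    if is_x86_feature_detected!("avx2") {
        return unsafe { add_avx2(a, b) };
    }
    let mut out = [0.0f32; 8];
    for i in 0..8 {
        out[i] = a[i] + b[i];
    }
    out
}

fn main() {
    let a = [1.0f32; 8];
    let b = [2.0f32; 8];
    println!("{:?}", add(&a, &b));
}
```

The "work" is in restructuring existing nightly-only SIMD code into this detect-and-dispatch shape for every hot loop, which is tedious but mechanical.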