Code Llama support
BohdanPetryshyn opened this issue
Summary
It would be great to support Code Llama tokenization.
Context
I know that the Code Llama model is based on Llama 2, but I haven't found any clear evidence that the two models share the same tokenizer. If the current implementation already supports Code Llama tokenization, it would be great to state this clearly in the README.
Hey there! I checked Code Llama and it seems to use the same tokenizer as Llama 2, so llama-tokenizer-js is compatible with it.
I don't want to maintain a list of compatible models, because there are thousands of them.
Thank you for your reply! Your project will be very helpful to me 🙏