sleepingcat4 / codellama-inference

a complete guide to performing inference on CodeLlama

Introduction

A repo detailing how to load the CodeLlama model both from scratch, using the FAIR repository, and via lazy loading with the Hugging Face library.
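Loading from scratch with the FAIR code can be sketched roughly as below. This is a hedged sketch, not this repo's notebook: the checkpoint directory, tokenizer path, and generation settings are placeholders, and it assumes Meta's `codellama` repository (which provides the `llama` package) is installed. `Llama.build` initialises model parallelism internally, so such a script is normally launched with `torchrun`.

```python
# Sketch: loading a CodeLlama checkpoint directly through Meta's codellama
# repository (https://github.com/facebookresearch/codellama).
# All paths below are placeholders for your local weight download.

def first_generation(results):
    """Pull the generated text out of text_completion()'s result list."""
    return results[0]["generation"]

def main():
    # Import kept local so the helper above is importable without the repo.
    from llama import Llama  # provided by facebookresearch/codellama

    generator = Llama.build(
        ckpt_dir="CodeLlama-7b/",                       # placeholder path
        tokenizer_path="CodeLlama-7b/tokenizer.model",  # placeholder path
        max_seq_len=512,
        max_batch_size=1,
    )
    results = generator.text_completion(
        ["def fibonacci(n):"],
        max_gen_len=64,
        temperature=0.0,  # greedy decoding for reproducible completions
    )
    print(first_generation(results))

if __name__ == "__main__":
    main()
```

The same `generator` object can be reused across prompts, which is the point of building it once rather than re-running Meta's example script each time.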

What's the purpose?

No detailed article, blog post, or guideline exists, whether from Facebook or from third parties, for loading the CodeLlama model from scratch. Loading through Hugging Face is covered by an article from HF, but that article is not complete. As a result, unless you are a seasoned researcher, it is relatively hard to load the model from scratch without resorting to the example.py file provided by Meta. I don't oppose using that file, but why not have some standalone code? Most articles on this topic are spam and clickbait. And once you know the process, you can apply the same formula over and over.

Note: I'm not providing any fine-tuning or training-from-scratch method. It's all pure inference : )
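For the Hugging Face route, a minimal lazy-loading sketch might look like the following. The model id (`codellama/CodeLlama-7b-hf`), half-precision dtype, and generation settings are assumptions for illustration, not details taken from this repo, and it presumes `transformers`, `torch`, and `accelerate` are installed.

```python
# Sketch: lazy-loading CodeLlama with the Hugging Face transformers library.

def strip_prompt(prompt: str, generated: str) -> str:
    """Return only the newly generated text, without the echoed prompt."""
    return generated[len(prompt):] if generated.startswith(prompt) else generated

def run_inference(prompt: str, model_id: str = "codellama/CodeLlama-7b-hf") -> str:
    # Imports kept local so the pure helper above works without torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # half precision to fit on a single GPU
        device_map="auto",          # let accelerate place the weights
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    generated = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return strip_prompt(prompt, generated)

if __name__ == "__main__":
    print(run_inference("def fibonacci(n):"))
```

Because `from_pretrained` downloads and memory-maps the weights on demand, this is the "lazy loading" path: no manual checkpoint handling is needed beyond picking a model id.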

About


License:MIT License


Languages

Language:Jupyter Notebook 100.0%