lyogavin / Anima

33B Chinese LLM, DPO QLORA, 100K context, AirLLM 70B inference with single 4GB GPU


Insufficient disk space

ulisesbussi opened this issue · comments

I was trying to work with AutoModel.from_pretrained("v2ray/Llama-3-70B"), but the space in C:\Users\myuser.cache... was insufficient. So I was wondering whether there is a way to change the cache location for the library. I skimmed the code and didn't find an option.
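For reference, the Hugging Face cache can usually be redirected either globally through the HF_HOME environment variable or per call through the cache_dir argument of from_pretrained. A minimal sketch (the path D:/hf_cache is just a placeholder):

```python
import os

# Option 1: point the whole Hugging Face cache elsewhere.
# Set HF_HOME before importing transformers so the new location is picked up.
os.environ["HF_HOME"] = "D:/hf_cache"  # placeholder path

from transformers import AutoModel

# Option 2: override the cache location for a single call via cache_dir.
model = AutoModel.from_pretrained(
    "v2ray/Llama-3-70B",
    cache_dir="D:/hf_cache",  # placeholder path
)
```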

I would like to know the default location as well. I am using macOS.

I found it in my home directory, under .cache/huggingface/.
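If you want to confirm the resolved path programmatically, huggingface_hub exposes it as a constant; a quick check (assuming a recent huggingface_hub release):

```python
from huggingface_hub import constants

# Prints the hub cache directory, e.g. ~/.cache/huggingface/hub
print(constants.HF_HUB_CACHE)
```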

I moved this folder to my external hard disk and created a soft link from the original location to the folder on the external disk.
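A rough sketch of that move-and-symlink approach in Python (the external-disk path is a placeholder; on Windows, creating symlinks may require elevated privileges):

```python
import shutil
from pathlib import Path

cache = Path.home() / ".cache" / "huggingface"    # original cache location
target = Path("/Volumes/External/huggingface")    # placeholder external-disk path

# Move the existing cache onto the external disk, then leave a symlink
# behind so libraries that expect the default path still find it.
shutil.move(str(cache), str(target))
cache.symlink_to(target, target_is_directory=True)
```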