AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace.

Home Page: https://ai4finance.org


Please, how do I call the model locally?

NanshaNansha opened this issue · comments

@NanshaNansha To load a model locally with the PeftModel class, you need to make sure the base model, the adapter, and the other required files are available on disk.
Try these steps and let me know if it works:

  1. Install the dependencies: pip install transformers peft
  2. Prepare local paths: set the paths to your base model, your local adapter directory, and the cache directory.
    Example snippet:
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Define the local paths to the base model, the adapter, and the cache directory
base_model_path = 'path/to/your/base_model'
model_id = 'path/to/your/local_model_directory'
cache_dir = 'path/to/your/cache_directory'

# Load the base model first; PeftModel.from_pretrained expects a model
# object as its first argument, not a path
base_model = AutoModelForCausalLM.from_pretrained(base_model_path, cache_dir=cache_dir)
model = PeftModel.from_pretrained(base_model, model_id, cache_dir=cache_dir)

# Example: use the model for inference
# Make sure you have the tokenizer and other necessary components
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(base_model_path)
input_text = "Your input text here"

inputs = tokenizer(input_text, return_tensors="pt")
# Use generate() to produce token IDs, then decode them back to text
outputs = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
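One common source of errors here is pointing `from_pretrained` at a directory that is missing the expected files. A small sanity check before loading can give a clearer error message; this is a minimal sketch, and `check_model_dir` is a hypothetical helper, not part of peft or transformers:

```python
from pathlib import Path

# Hypothetical helper: list which required files are missing from a local
# model directory before attempting from_pretrained() on it.
def check_model_dir(path, required=("config.json",)):
    p = Path(path)
    # Collect every required file that is not present in the directory
    return [f for f in required if not (p / f).is_file()]

# Example usage (a base model directory needs config.json and weights;
# a PEFT adapter directory needs adapter_config.json instead):
# missing = check_model_dir('path/to/your/base_model')
# if missing:
#     raise FileNotFoundError(f"Model directory is incomplete, missing: {missing}")
```

An empty return list means the directory looks complete for the files you asked about; for an adapter directory you would pass `required=("adapter_config.json",)`.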

Let me know if it works.
Thanks