lamm-mit / MeLM


Inference notebook not working

grndnl opened this issue · comments

Hello,

I would like to use the model for research.
I was trying to run MechGPT inference.ipynb but I'm seeing the following error:

Downloading shards: 100%|██████████| 3/3 [02:04<00:00, 41.48s/it]
Loading checkpoint shards: 100%|██████████| 3/3 [00:34<00:00, 11.54s/it]
Traceback (most recent call last):
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 197, in _get_peft_type
    config_file = hf_hub_download(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1403, in hf_hub_download
    raise head_call_error
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1261, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1667, in get_hf_file_metadata
    r = _request_wrapper(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
    response = _request_wrapper(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
    hf_raise_for_status(response)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65f4d5e3-1a948b0b62e8109f010f9de5;fdbc9d9b-3f29-42ef-9af9-e62dc7a11647)

Repository Not Found for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/c/Users/grandid/source/repos/MeLM/inference.py", line 30, in <module>
    model = PeftModel.from_pretrained(model_base, peft_model_id,
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/peft_model.py", line 325, in from_pretrained
    PeftConfig._get_peft_type(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 203, in _get_peft_type
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at 'MechGPT-13b_v106C'
ERROR conda.cli.main_run:execute(49): `conda run python /mnt/c/Users/grandid/source/repos/MeLM/inference.py` failed. (See above for error)

Process finished with exit code 1

It looks like this is because the Hugging Face model is not public.
Could you advise how to run the model locally?

Thanks.

After authenticating with the Hugging Face CLI, the issue still persists (now a 404 rather than a 401):

Loading checkpoint shards: 100%|██████████| 3/3 [00:15<00:00,  5.30s/it]
Traceback (most recent call last):
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 197, in _get_peft_type
    config_file = hf_hub_download(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1403, in hf_hub_download
    raise head_call_error
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1261, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1667, in get_hf_file_metadata
    r = _request_wrapper(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
    response = _request_wrapper(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
    hf_raise_for_status(response)
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 404 Client Error. (Request ID: Root=1-65f4d925-49e86f1813d04fa41bce1963;f8780477-1c8c-4ec8-9a27-1aec0a0cd892)

Repository Not Found for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/c/Users/grandid/source/repos/MeLM/inference.py", line 30, in <module>
    model = PeftModel.from_pretrained(model_base, peft_model_id,
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/peft_model.py", line 325, in from_pretrained
    PeftConfig._get_peft_type(
  File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 203, in _get_peft_type
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at 'MechGPT-13b_v106C'
ERROR conda.cli.main_run:execute(49): `conda run python /mnt/c/Users/grandid/source/repos/MeLM/inference.py` failed. (See above for error)

Process finished with exit code 1

The repository is public, you can find it here: https://huggingface.co/lamm-mit/MeLM/tree/main/MechGPT-13b_v106C

The reason your loading command didn't work is likely that the command expects the folder 'MechGPT-13b_v106C' to exist locally.
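To make the failure mode concrete: PEFT first checks whether the string you pass is a local directory containing adapter_config.json, and only then falls back to treating it as a Hub repo_id; that is why the bare name ends up resolving to the nonexistent repo https://huggingface.co/MechGPT-13b_v106C. A minimal illustration of that resolution order (pure stdlib, no download):

```python
import os

peft_model_id = 'MechGPT-13b_v106C'
# PeftModel.from_pretrained looks for this file locally first; if it is
# absent, the string is treated as a Hub repo_id and the lookup 404s,
# exactly as in the traceback above.
config_path = os.path.join(peft_model_id, 'adapter_config.json')
print('local adapter found' if os.path.exists(config_path)
      else 'no local folder; falling back to the Hub')
```

(Recent peft versions also accept a `subfolder` argument, e.g. `PeftModel.from_pretrained(model_base, 'lamm-mit/MeLM', subfolder='MechGPT-13b_v106C')`, though I haven't verified that against this notebook.)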

I suggest downloading the entire folder and storing it in a subdirectory with that name; the model should then load without issue. You can do this as follows:

from huggingface_hub import HfApi, hf_hub_download

def download_subfolder(repo_id, subfolder, local_dir):
    """Download every file under `subfolder` of a Hub repo into `local_dir`."""
    api = HfApi()
    repo_files = api.list_repo_files(repo_id=repo_id)
    subfolder_files = [f for f in repo_files if f.startswith(subfolder)]
    for filename in subfolder_files:
        # local_dir (unlike cache_dir) reproduces the repo layout on disk,
        # so each file lands at local_dir/<subfolder>/<file>
        hf_hub_download(repo_id=repo_id, filename=filename, local_dir=local_dir)
        print(f"Downloaded {filename} to {local_dir}")

# Example
repo_id = 'lamm-mit/MeLM'
subfolder = 'MechGPT-13b_v106C'
local_dir = '.'  # files end up in ./MechGPT-13b_v106C/ since filenames include the subfolder prefix
download_subfolder(repo_id=repo_id, subfolder=subfolder, local_dir=local_dir)
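After the download finishes, a quick sanity check confirms the folder contains the file PEFT was failing to find (I only check adapter_config.json here; the adapter weights filename varies by PEFT version, e.g. adapter_model.bin vs adapter_model.safetensors, so I don't assume it):

```python
import os

local_dir = './MechGPT-13b_v106C'
# adapter_config.json is the file the traceback above could not resolve
required = ['adapter_config.json']
missing = [f for f in required if not os.path.exists(os.path.join(local_dir, f))]
print('ready' if not missing else f"missing: {missing}")
```

Once it prints 'ready', point PeftModel.from_pretrained at './MechGPT-13b_v106C' (a local path this time) instead of the bare repo name.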