echonet / lvh

EchoNet-LVH is a deep learning model that quantifies left ventricular hypertrophy and predicts etiologies of increased wall thickness, such as amyloidosis and HCM.

ValueError: too many values to unpack (expected 2)

xyllq999 opened this issue · comments

File "run_plax_hypertrophy_inference.py", line 376, in
args.update({k.replace('-', '_'): v for k, (v, h) in vars(parser.parse_args()).items()})

Also, after correcting these errors in the code, I get a new error: RuntimeError: CUDA out of memory. Tried to allocate 3.66 GiB (GPU 0; 11.75 GiB total capacity; 9.65 GiB already allocated; 337.06 MiB free; 9.67 GiB reserved in total by PyTorch). My GPU is a 3060; I wonder what GPU is required for this project?

Hi, where did you get the weight file for this model? I could not find any file containing the model's weights.

You need to clear the GPU cache prior to running. The model weights are on GitHub as a release.
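For anyone hitting the same out-of-memory error, here is a minimal sketch of freeing PyTorch's cached GPU memory before running inference (PyTorch is assumed from the error message; the right remedy depends on the script). Reducing the batch size or input resolution, or running inference under torch.no_grad(), are the other usual fixes.

import torch

# Release cached blocks held by PyTorch's caching allocator so a fresh run
# starts with as much free memory as possible; live tensors are unaffected.
torch.cuda.empty_cache()

# Optional: inspect how much memory is allocated vs. reserved on GPU 0.
print(torch.cuda.memory_allocated(0), torch.cuda.memory_reserved(0))

# Running inference without gradient tracking also lowers peak memory use.
with torch.no_grad():
    pass  # call the model here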

Thank you! I found it!

File "run_plax_hypertrophy_inference.py", line 376, in
args.update({k.replace('-', '_'): v for k, (v, h) in vars(parser.parse_args()).items()})

What did you change the line to? @xyllq999

File "run_plax_hypertrophy_inference.py", line 376, in
args.update({k.replace('-', '_'): v for k, (v, h) in vars(parser.parse_args()).items()})

What did you change the line to? @xyllq999

It works for me with these two changes:
args.update({k.replace('-', '_'): v for k, v in vars(parser.parse_args()).items()})
get_args = lambda l: {k: args[k] for k in l}
I hope this helps!
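For context, vars(parser.parse_args()) returns a plain {name: value} dict, so each item unpacks as k, v; the original comprehension apparently expected (value, help) pairs, which is why it raised the ValueError. A self-contained sketch of the fixed pattern (the argument names here are hypothetical, not the script's real flags):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--batch-size', type=int, default=4)  # hypothetical flag for illustration
parser.add_argument('--input-dir', default='videos')      # hypothetical flag for illustration

args = {}
# Each item is k, v; argparse already turns '--batch-size' into 'batch_size',
# so the replace() is redundant but harmless.
args.update({k.replace('-', '_'): v for k, v in vars(parser.parse_args([])).items()})

get_args = lambda l: {k: args[k] for k in l}
print(get_args(['batch_size', 'input_dir']))  # {'batch_size': 4, 'input_dir': 'videos'}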

Thanks, this helped me.
But now it is giving me this error. Thoughts?

TypeError: <lambda>() takes 1 positional argument but 2 were given
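That TypeError points at the one-argument lambda from the fix above: somewhere in run_plax_hypertrophy_inference.py, get_args is apparently being called with a second positional argument. I don't know what that argument is meant to be, so as a hedged workaround you could let the lambda accept and ignore extras, but it is worth checking the call sites first:

get_args = lambda l, *extra: {k: args[k] for k in l}  # accepts and ignores any extra positional arguments

If the second argument is actually meaningful (for example, a default value), the call in the script should be inspected rather than discarded.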