DefTruth / torchlm

💎A high-level pipeline for face landmark detection. It supports training, evaluating, exporting, inference (Python/C++) and 100+ data augmentations, and can be easily installed via pip.

Home Page: https://github.com/DefTruth/torchlm


Fine-tuning Custom Dataset Error

apecundo opened this issue · comments

Hi!

I tried fine-tuning on my dataset (I followed the annotation format), but I am getting this error:

[screenshot of the error]

Here is the code I used:

"model = pipnet(backbone="resnet101", weights = True, num_nb=10, num_lms=68, net_stride=32,
input_size=256, meanface_type="300w", backbone_pretrained=True)

model.apply_freezing(backbone=True)
model.apply_training(
annotation_path=r"\data\TNF\test\train.txt", # or fine-tuning your custom data
num_epochs=10,
learning_rate=0.0001,
save_dir="./save/pipnet",
save_prefix="pipnet-300W-TNFfinetune-resnet101",
save_interval=1,
logging_interval=1,
device="cuda",
coordinates_already_normalized=False,
batch_size=16,
num_workers=4,
shuffle=True)
"

For reference, my folder structure looks like this

[screenshot of the folder structure]
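One thing worth checking before `apply_training` is the annotation file itself. The sketch below is a hypothetical helper (not part of torchlm) that validates each line against the format I assume from torchlm's training examples: an image path followed by 2 × num_lms coordinate values, which with `coordinates_already_normalized=False` should be pixel coordinates. If your image paths contain spaces, the simple split below will miscount, so adjust accordingly.

```python
def check_annotations(path: str, num_lms: int = 68) -> None:
    """Verify every line has an image path plus 2 * num_lms numeric values.

    Assumed line format (based on torchlm's training examples, not verified
    against the library source):
        <image_path> x0 y0 x1 y1 ... x{num_lms-1} y{num_lms-1}
    """
    with open(path) as f:
        for i, line in enumerate(f):
            parts = line.strip().split()
            if not parts:
                continue  # skip blank lines
            coords = parts[1:]  # everything after the image path
            if len(coords) != 2 * num_lms:
                raise ValueError(
                    f"line {i}: expected {2 * num_lms} coordinate values, "
                    f"got {len(coords)}"
                )
            # every value must parse as a number
            [float(v) for v in coords]
```

Running this on the `train.txt` you pass to `apply_training` would quickly reveal whether the error comes from a malformed line (wrong landmark count, stray tokens, or a path with spaces) rather than from the model setup.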

Have you solved this?