A little problem about the batch size
xiaolongcheng opened this issue · comments
Hi, your work is so impressive that I want to reproduce your experiment, but I ran into a problem with the batch size configuration. I use a single GTX 1080 Ti GPU as you did, but a batch size of 8 seems too big for it; I can only set the batch size to 4 or 5. Can you tell me how to solve this problem? Thanks a lot.
Which config file did you use?
Oh! This does remind me about the configs. I use celeba-hq_256.yaml as my configuration. If I want to set the batch size to 8, I should use celeba-hq.yaml, which sets the input size to 128, am I right?
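For reference, a minimal sketch of the two settings being discussed, assuming the config file uses keys named `batch_size` and `input_size` (the actual key names in the repository may differ; check the real celeba-hq_256.yaml):

```yaml
# Hypothetical excerpt from celeba-hq_256.yaml — key names are assumptions.
batch_size: 8      # too large for a single 11 GB GTX 1080 Ti at 256x256
input_size: 256    # celeba-hq.yaml uses 128 here, which fits batch size 8 on one GPU
```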
Yes. The 256 config needs at least two 1080 Ti GPUs.
Thanks a lot for your time and your reply.
Always welcome!