Way to achieve deterministic results?
JackPieCZ opened this issue · comments
Hi! I've noticed that with your proposed method the results always differ slightly between runs due to how DDPM works. I was wondering whether there's a way to make the results more deterministic.
I was thinking about setting a specific seed to control the randomness of the diffusion process.
Alternatively, could I pass in a specific tensor instead of the randomly generated one currently used in the test.py script? Related to that, what does the keyword cond_y, which determines how that tensor is generated, mean? I couldn't find it in the paper or anywhere else in your project.
Or is there already a way to achieve more stable results that I simply failed to notice?
Could you suggest how I should proceed? On a related note, how did you obtain the specific results that you compare against older methods in your paper, given that the results vary slightly between runs?
Thank you very much in advance! Your work is amazing and I am a big fan of it.
For anyone interested, here's the solution I found. To eliminate the randomness in the results, seed all of the random number generators used in the code:
```python
import random

import numpy as np
import torch
import torch.backends.cudnn as cudnn


def init_seeds(seed=0):
    # Initialize random number generator (RNG) seeds
    # https://pytorch.org/docs/stable/notes/randomness.html
    # The cuDNN settings for seed 0 are slower but reproducible;
    # otherwise they are faster but less reproducible.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    cudnn.benchmark, cudnn.deterministic = (False, True) if seed == 0 else (True, False)
```
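As a quick sanity check, here is a minimal, self-contained sketch (it repeats the helper above so it runs on its own; the actual sampling entry point in test.py will differ) showing that re-seeding makes the noise draws, and therefore the DDPM sampling trajectory, repeatable:

```python
import random

import numpy as np
import torch
import torch.backends.cudnn as cudnn


def init_seeds(seed=0):
    # Seed all RNGs the pipeline touches (Python, NumPy, PyTorch).
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    # Seed 0: slower but reproducible cuDNN; otherwise faster, less reproducible.
    cudnn.benchmark, cudnn.deterministic = (False, True) if seed == 0 else (True, False)


# Two noise tensors drawn after identical re-seeding are equal,
# so a sampler fed this noise produces the same output each run.
init_seeds(0)
a = torch.randn(4)
init_seeds(0)
b = torch.randn(4)
print(torch.equal(a, b))  # True
```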