THUDM / SwissArmyTransformer

SwissArmyTransformer is a flexible and powerful library to develop your own Transformer variants.

Home Page: https://THUDM.github.io/SwissArmyTransformer


Can the dataloader shuffle?

XaviLv opened this issue · comments

I noticed that the torch.utils.data.DataLoader() call in site-packages/sat/data_utils/configure_data.py does not specify shuffle, so it uses the default value False. Does this mean the dataset is not shuffled at the start of each epoch? Is there a way to get shuffling without modifying the library implementation?

data_loader = torch.utils.data.DataLoader(dataset, batch_sampler=batch_sampler, num_workers=args.num_workers, pin_memory=True, collate_fn=collate_fn)

No need to modify the library: shuffling is on by default. See the make_dataset_full function.
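For context on why the missing shuffle argument is not a problem: PyTorch's DataLoader forbids combining shuffle=True with batch_sampler, so when a batch_sampler is supplied (as in the line quoted above), any shuffling must happen inside the sampler itself. Below is a minimal, torch-free sketch of what a shuffling batch sampler does; the class name, seeding scheme, and epoch counter are hypothetical illustrations, not the actual sat implementation:

```python
import random

class ShufflingBatchSampler:
    """Yields batches of dataset indices in a fresh random order
    each time it is iterated, mimicking a shuffling batch_sampler
    passed to torch.utils.data.DataLoader."""

    def __init__(self, dataset_size, batch_size, seed=0):
        self.dataset_size = dataset_size
        self.batch_size = batch_size
        self.seed = seed
        self.epoch = 0  # bumped per iteration so each epoch gets a new order

    def __iter__(self):
        # Seed with (base seed + epoch) so the order is reproducible
        # per epoch but differs across epochs.
        rng = random.Random(self.seed + self.epoch)
        indices = list(range(self.dataset_size))
        rng.shuffle(indices)
        self.epoch += 1
        for start in range(0, len(indices), self.batch_size):
            yield indices[start:start + self.batch_size]

    def __len__(self):
        # Number of batches, counting a final partial batch.
        return (self.dataset_size + self.batch_size - 1) // self.batch_size

sampler = ShufflingBatchSampler(dataset_size=10, batch_size=3)
epoch1 = list(sampler)  # 4 batches covering all 10 indices, shuffled
epoch2 = list(sampler)  # same indices, generally in a different order
```

A real DataLoader would then be constructed as torch.utils.data.DataLoader(dataset, batch_sampler=sampler, ...), with shuffle left at its default, exactly as in the configure_data.py line above.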