PyTorch: making two DataLoaders shuffle in the same order

Instead of trying to synchronize two separate DataLoaders, wrap both datasets in a single Dataset so that one index fetches the aligned pair from both; a single shuffling DataLoader then applies the same permutation to A and B.

from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    """Pairs two datasets so that one index fetches aligned samples from both."""
    def __init__(self, datasetA, datasetB):
        self.datasetA = datasetA
        self.datasetB = datasetB

    def __getitem__(self, index):
        # The same index is applied to both datasets, so shuffling the
        # combined dataset shuffles A and B with the same permutation.
        xA = self.datasetA[index]
        xB = self.datasetB[index]
        return xA, xB

    def __len__(self):
        # Assumes datasetA and datasetB have the same length.
        return len(self.datasetA)

datasetA = ...
datasetB = ...
dataset = MyDataset(datasetA, datasetB)
loader = DataLoader(dataset, batch_size=10, shuffle=True)
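
A quick way to convince yourself the pairs stay aligned: a minimal sketch where the two arange tensors are hypothetical stand-ins for datasetA and datasetB, constructed so that aligned pairs always differ by exactly 100.

import torch

# Stand-in datasets: sample i of A is i, sample i of B is i + 100.
datasetA = torch.arange(50)
datasetB = torch.arange(50) + 100

loader = DataLoader(MyDataset(datasetA, datasetB), batch_size=10, shuffle=True)
for xA, xB in loader:
    # Every pair differs by exactly 100, so A and B were shuffled identically.
    assert torch.equal(xB - xA, torch.full_like(xA, 100))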

https://discuss.pytorch.org/t/dataloader-shuffle-same-order-with-multiple-dataset/94800/2
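
If the two loaders must stay separate (for example, because they use different collate functions), a commonly used alternative, sketched here under the assumption that both datasets have the same length, is to give each loader its own torch.Generator seeded with the same value, so both random samplers draw the same permutation each epoch.

import torch
from torch.utils.data import DataLoader

gA = torch.Generator().manual_seed(0)
gB = torch.Generator().manual_seed(0)
loaderA = DataLoader(datasetA, batch_size=10, shuffle=True, generator=gA)
loaderB = DataLoader(datasetB, batch_size=10, shuffle=True, generator=gB)

# Both generators start from the same seed and are consumed identically,
# so the two loaders yield samples in the same shuffled order every epoch.
for xA, xB in zip(loaderA, loaderB):
    ...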

Reposted from blog.csdn.net/aab11235/article/details/116203567