I have a train_loader with multiple inputs:

train_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(input1, input2, input3, label),
    batch_size=batch_size,
    shuffle=True,
)

These inputs are arrays with different shapes, so it is hard to concatenate them into a single tensor. They work fine with plain PyTorch code, but Ensemble-PyTorch cannot handle them.
I hope Ensemble-PyTorch can support DataLoaders with multiple inputs in the future.
Thank you.
I am wondering whether the code snippet below meets your requirement. It creates a dataloader with three input tensors input_1, input_2, and input_3, and passes it into a model whose forward method takes three inputs accordingly. To make the entire workflow run as expected, we can pass the multiple inputs into the model as non-keyword arguments (i.e., *data). There should be no problem as long as the order of the tensors used to create the dataloader matches the order of the arguments of forward.
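The snippet referenced above is not preserved in this thread, so here is a minimal sketch of the pattern being described: a TensorDataset built from three tensors of different feature widths, unpacked in the training loop as *data and forwarded positionally. The model class, layer sizes, and shapes are illustrative assumptions, not Ensemble-PyTorch's actual code; an ensemble's internal loop would unpack batches the same way.

```python
import torch
import torch.nn as nn

# Illustrative model whose forward takes three inputs; the layer
# sizes below are arbitrary assumptions for this sketch.
class MultiInputNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(6, 8)
        self.fc3 = nn.Linear(2, 8)
        self.head = nn.Linear(24, 3)

    def forward(self, x1, x2, x3):
        # Each input is embedded separately, then combined, because
        # the raw tensors have incompatible shapes.
        h = torch.cat([self.fc1(x1), self.fc2(x2), self.fc3(x3)], dim=1)
        return self.head(torch.relu(h))

# Three inputs with different feature dimensions, so they cannot be
# concatenated into one tensor up front.
input_1 = torch.randn(10, 4)
input_2 = torch.randn(10, 6)
input_3 = torch.randn(10, 2)
label = torch.randint(0, 3, (10,))

train_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(input_1, input_2, input_3, label),
    batch_size=5,
    shuffle=True,
)

model = MultiInputNet()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for *data, target in train_loader:
    optimizer.zero_grad()
    output = model(*data)  # the inputs are forwarded positionally as *data
    loss = criterion(output, target)
    loss.backward()
    optimizer.step()
```

The only requirement is that the positional order of tensors in the TensorDataset matches the parameter order of forward; the loop itself never needs to know how many inputs there are.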