
Support training on (non-in-memory) datasets for VectorModel-derived classes #40

Closed
MischaPanch opened this issue Jan 12, 2021 · 5 comments
Labels: enhancement (New feature or request)

Comments

MischaPanch (Collaborator)

This is crucial for datasets that don't fit in RAM. Special care must be taken with feature generators and data frame transformers (DFTs), since they typically cannot be trained batch-wise.

MischaPanch added the enhancement (New feature or request) label Jan 12, 2021
MischaPanch self-assigned this Jan 12, 2021
MischaPanch added this to To do in sensAI Board via automation Jan 12, 2021
MischaPanch (Collaborator, Author) commented Feb 2, 2021

This issue will likely require many significant changes. The requirements I see off the top of my head:

  1. A DataSet abstraction that is compatible with "standard" data set loaders (torch and keras) as well as with pandas and numpy (a rough sketch follows below).
  2. The preprocessing part of VectorModel should be compatible. This is not easy (see the comment above): we need to think about how to deal with non-fitted preprocessors when a DataSet is passed.
  3. Ideally, we would also want it to work with our caching utils. For that, semantic indices would need to be part of DataSet somehow.
  4. Validation set splitting and cross-validation logic need to be adjusted. How do we split a generator? Maybe we instead require a validation loader to be passed (as in pytorch/keras); but what about cross-validation then?

There will be many more difficulties which I cannot anticipate now. Certainly, the issue should be split into multiple issues/PRs when the time comes to address it.
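
To make requirement 1 a bit more concrete, here is a rough sketch of what such a DataSet abstraction could look like. None of this exists in sensAI; all names are hypothetical placeholders.

```python
from abc import ABC, abstractmethod
from typing import Iterator, Tuple

import pandas as pd


class DataSet(ABC):
    """Hypothetical abstraction over in-memory and out-of-memory data sources."""

    @abstractmethod
    def iter_batches(self, batch_size: int) -> Iterator[Tuple[pd.DataFrame, pd.DataFrame]]:
        """Yield (inputs, targets) data frame pairs with at most batch_size rows each."""

    @abstractmethod
    def __len__(self) -> int:
        """Total number of rows (finite data sets only)."""


class InMemoryDataSet(DataSet):
    """Adapter for data that already fits in RAM (pandas/numpy)."""

    def __init__(self, x: pd.DataFrame, y: pd.DataFrame):
        self.x, self.y = x, y

    def iter_batches(self, batch_size: int):
        for start in range(0, len(self.x), batch_size):
            yield self.x.iloc[start:start + batch_size], self.y.iloc[start:start + batch_size]

    def __len__(self):
        return len(self.x)
```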

opcode81 (Collaborator) commented Feb 3, 2021

It might be worthwhile to take a look at existing libraries such as NVTabular, which supports feature engineering and preprocessing with a particular (exclusive) focus on neural networks and which can be integrated with fastai.
I am not saying we shouldn't add more neural network features to sensAI, but if what is needed has already been done by others, we shouldn't reinvent the wheel.

If we go ahead with all this, it might make sense to restrict ourselves to being compatible only with finite datasets, e.g. support only torch.utils.data.Dataset but not IterableDataset (as is already the case for sensai.torch.torch_data.TorchDataSet). It would make many things much easier, and it shouldn't be a practically relevant restriction for the vast majority of users.
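
For reference, the distinction in PyTorch terms (using the real torch.utils.data base classes; the toy data is made up for illustration):

```python
import torch
from torch.utils.data import Dataset, IterableDataset


class MapStyleExample(Dataset):
    """Map-style (finite) data set: random access and a known length,
    which is what validation splitting and cross-validation rely on."""

    def __init__(self, tensors):
        self.tensors = tensors

    def __len__(self):
        return len(self.tensors)

    def __getitem__(self, idx):
        return self.tensors[idx]


class StreamExample(IterableDataset):
    """Iterable-style data set: sequential access only, possibly unbounded;
    this is the kind of data set the restriction above would exclude."""

    def __iter__(self):
        while True:
            yield torch.randn(3)
```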

opcode81 changed the title from Support training on data generators to Support Training on Data Generators for VectorModels Feb 3, 2021
opcode81 changed the title from Support Training on Data Generators for VectorModels to Support Training on (Non-in-Memory) DataSets for VectorModels Feb 3, 2021
opcode81 changed the title from Support Training on (Non-in-Memory) DataSets for VectorModels to Support Training on (Non-in-Memory) Datasets for VectorModels Feb 3, 2021
opcode81 changed the title from Support Training on (Non-in-Memory) Datasets for VectorModels to Support training on (non-in-memory) datasets for VectorModels Feb 3, 2021
opcode81 changed the title from Support training on (non-in-memory) datasets for VectorModels to Support training on (non-in-memory) datasets for VectorModel-derived classes Feb 3, 2021
opcode81 (Collaborator) commented Aug 23, 2021

A reasonable approach to handling this with the current mechanisms (for torch models) is to use DataFrames which contain only meta-data (e.g. filenames/paths or other references to the actual data) and therefore fit fully in memory, and to have the TorchDataSet implementation (injected via a TorchDataSetProviderFactory) dynamically load the actual data in its iterBatches method.
Maybe we do not need additional mechanisms to handle this sort of thing.
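
A rough sketch of that idea (the class is a hypothetical stand-in: it mimics the iterBatches idea mentioned above but does not reproduce the actual sensai.torch.torch_data.TorchDataSet interface):

```python
import pandas as pd
import torch


class LazyFileTorchDataSet:
    """Hypothetical TorchDataSet-style implementation: the data frame holds only
    file paths, and tensors are materialised per batch inside iter_batches."""

    def __init__(self, meta_df: pd.DataFrame, path_column: str = "path"):
        self.meta_df = meta_df
        self.path_column = path_column

    def iter_batches(self, batch_size: int):
        for start in range(0, len(self.meta_df), batch_size):
            batch_meta = self.meta_df.iloc[start:start + batch_size]
            # only now is the actual data read from disk
            tensors = [torch.load(path) for path in batch_meta[self.path_column]]
            yield torch.stack(tensors)
```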

MischaPanch (Collaborator, Author)

Yes, this sounds like a reasonable approach for many applications where normalizers and feature extractors don't need to be fitted on the not-yet-loaded data. What I originally had in mind was support for training on generators of data frames (or arrays). Several libraries help with building such generators, augmenting data along the way, which can come in pretty handy. It may well be that these data augmentation tools can also easily be used within an implementation of iterBatches, which would remove my main motivation for generators.

We could either close this issue or put it on ice until one of us actually uses sensAI for such data sets and shares hands-on experience (I would prefer the latter option).
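
To illustrate the augmentation point: an augmentation step could simply wrap the batch iterator, roughly like this (purely hypothetical; augment stands in for whatever augmentation library one would actually use):

```python
import torch


def augment(batch: torch.Tensor) -> torch.Tensor:
    """Stand-in for a real augmentation library call, here just additive noise."""
    return batch + 0.01 * torch.randn_like(batch)


def iter_augmented_batches(data_set, batch_size: int):
    """Wrap an iterBatches-style iterator and augment each batch on the fly."""
    for batch in data_set.iter_batches(batch_size):
        yield augment(batch)
```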

opcode81 (Collaborator) commented Feb 4, 2022

The approach I described above has recently been added to sensAI with the class TorchDataSetFromDataFramesDynamicallyTensorised. To make use of it in torch vector models, one only needs to inject a TorchDataSetProviderFactory that creates a provider which will, in turn, create such TorchDataSet instances.
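
For illustration, the injection chain (factory → provider → TorchDataSet) looks roughly like this. Everything below is a hypothetical stand-in; it does not reproduce the real sensAI signatures of TorchDataSetProviderFactory or TorchDataSetFromDataFramesDynamicallyTensorised.

```python
import pandas as pd


class LazyTorchDataSet:
    """Stand-in for TorchDataSetFromDataFramesDynamicallyTensorised: keeps only
    meta-data frames and tensorises batches on demand."""

    def __init__(self, input_df: pd.DataFrame, output_df: pd.DataFrame):
        self.input_df, self.output_df = input_df, output_df


class LazyTorchDataSetProvider:
    """Stand-in provider that creates such data sets from meta-data frames."""

    def __init__(self, input_df: pd.DataFrame, output_df: pd.DataFrame):
        self.input_df, self.output_df = input_df, output_df

    def provide(self) -> LazyTorchDataSet:
        return LazyTorchDataSet(self.input_df, self.output_df)


class LazyTorchDataSetProviderFactory:
    """Stand-in for the factory a torch vector model would receive via injection."""

    def create_provider(self, input_df: pd.DataFrame, output_df: pd.DataFrame) -> LazyTorchDataSetProvider:
        return LazyTorchDataSetProvider(input_df, output_df)
```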

opcode81 closed this as completed Aug 5, 2023
sensAI Board automation moved this from To do to Done Aug 5, 2023
opcode81 reopened this Aug 5, 2023
sensAI Board automation moved this from Done to In progress Aug 5, 2023
opcode81 closed this as not planned Aug 5, 2023
sensAI Board automation moved this from In progress to Done Aug 5, 2023