When generating the SVD for (a) compression of the dataset and (b) initialisation of the first network layer, we need to make sure that the basis is sufficiently accurate. The method `SVDBasis.test_basis()` is intended for this purpose; however, at the moment it is cluttered with print statements and saved data. This needs to be cleaned up.
The `test_basis` method should maybe have two modes:

1. A minimal test, testing only with the maximal number of basis elements. It could either print the mismatches, or just check whether they are below a particular threshold. This should be run whenever generating and using a reduced basis, i.e. for (a) and for (b).
2. A detailed test that allows for better analysis. E.g., when generating a dataset with a new waveform model, or a new prior, one does not know in advance how many basis elements are necessary. The detailed test should scan the mismatches over a variety of `n_svd` values (the default could, e.g., be `n_svd in [2**i for i in range(5, i_max)]`). In addition, there should be an option to save the array of mismatches along with the parameters. This would help to analyse in which part of the parameter space the reduced basis fails. It would also make it possible to determine what prior on particular parameters (e.g., chirp mass) is feasible for a specific `n_svd`.
For dataset generation, this could also be used to (automatically?) determine how many basis elements are necessary. One could add an option to `WaveformDataset` to truncate a loaded dataset to a smaller number of basis elements, which would speed up preprocessing.
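A rough sketch of such a truncation option (the function name is hypothetical; it assumes the compressed dataset stores SVD coefficients of shape `(num_waveforms, n_max)` together with the basis matrix `V`):

```python
import numpy as np

def truncate_basis(coefficients, V, n):
    """Hypothetical helper for a WaveformDataset truncation option.

    coefficients: (num_waveforms, n_max) SVD coefficients of the dataset
    V:            (dim, n_max) basis matrix

    Since the columns of V are ordered by singular value, dropping the
    trailing columns keeps the most important basis elements.
    """
    if n > V.shape[1]:
        raise ValueError("Cannot truncate to more basis elements than available.")
    return coefficients[:, :n], V[:, :n]
```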
I agree we should overhaul the `SVDBasis` class. In particular, if we make it a subclass of `DingoDataset`, we can more easily save additional diagnostic information along with the `V` matrix, and we can also output all of this as a dictionary and save it along with a `WaveformDataset`. The new class could also have a method to truncate itself at a smaller number of basis elements.
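As a rough illustration of the idea (the class and attribute names here are hypothetical and this is not the actual `DingoDataset` interface, just a sketch of carrying diagnostics alongside `V` and truncating in place):

```python
import numpy as np

class BasisWithDiagnostics:
    """Illustrative sketch only: a basis object that stores diagnostic
    information next to the V matrix, can be dumped to a dictionary,
    and can truncate itself."""

    def __init__(self, V, mismatches=None):
        self.V = V                    # (dim, n) matrix of basis vectors
        self.mismatches = mismatches  # e.g. per-waveform validation mismatches

    def to_dictionary(self):
        # dictionary form, suitable for saving alongside a WaveformDataset
        return {"V": self.V, "mismatches": self.mismatches}

    def truncate(self, n):
        # keep only the first n basis elements
        # (columns of V are ordered by singular value)
        self.V = self.V[:, :n]
```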