The parameters n_epochs and batch_size cannot fully define the number of batches in negative edge sampling. Since negative edge sampling draws items with a weighted random sampler rather than iterating over the dataset exactly once, it cannot be guaranteed that each item is seen exactly once per epoch. Instead, the user must define something like a batches_per_epoch parameter (rethink the name; maybe check what it is called in UMAP).
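A toy illustration of the problem (this is not the project's actual sampler, just a sketch with standard-library weighted sampling): if one "epoch" is defined as drawing as many items as the dataset contains, a weighted sampler will typically repeat some items and miss others, so an epoch is not a single pass over the data.

```python
import random

random.seed(0)
n_items = 100
# Arbitrary positive sampling weights, one per item.
weights = [random.random() for _ in range(n_items)]

# One "epoch" of draws: n_items samples, drawn with replacement by weight.
epoch_sample = random.choices(range(n_items), weights=weights, k=n_items)

# Fewer distinct items than draws: some repeated, some never seen.
distinct = len(set(epoch_sample))
print(distinct)
```

This is why n_epochs and batch_size alone cannot pin down how many batches to run: the sampler has no notion of having "used up" the dataset.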
This is actually not necessary; it is enough to just set a higher number of epochs. The docstring of the TrainingPhase class now explains how the batch size relates to the number of edges and items in a batch, and also what "epoch" means in each of the two sampling variants.
On second thought, it might be better to have a batches_per_epoch parameter whose default is chosen so that roughly one full dataset's worth of items/edges is sampled per epoch (but which can be overridden).