Hi! I'm using Laplacian Eigenmaps and noticed that the resulting embeddings are not always the same, even though I have explicitly set the seed:
```python
model = LaplacianEigenmaps(dimensions=3, seed=0)
```
Running the same algorithm multiple times in the same Python session yields different embeddings each time. Here is a minimal reproducible example:
```python
import numpy as np
import networkx as nx
from karateclub.node_embedding.neighbourhood import LaplacianEigenmaps

g_undirected = nx.newman_watts_strogatz_graph(1000, 20, 0.05, seed=1)

for _ in range(5):
    model = LaplacianEigenmaps(dimensions=3, seed=0)
    model.fit(g_undirected)
    node_emb_le = model.get_embedding()
    print(np.sum(node_emb_le))
```
It prints the following summed embedding values for me:
How can I control the randomness so that the resulting embeddings are exactly the same every time, no matter how many times I run the algorithm in the same Python session?
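One workaround, sketched below, is to reseed NumPy's global RNG immediately before every `fit` call. This assumes the nondeterminism comes from global NumPy random state consumed inside the fit (for example, SciPy's sparse eigensolvers draw a random starting vector from it when none is supplied) rather than from the `seed` argument itself; the stand-in `stochastic_embedding` function here is hypothetical and only models a routine that draws from the global RNG:

```python
import numpy as np

def stochastic_embedding(dim):
    # Hypothetical stand-in for any fit routine that consumes
    # NumPy's *global* random state internally.
    return np.random.standard_normal(dim)

sums = []
for _ in range(5):
    np.random.seed(0)  # reseed the global RNG before every fit
    emb = stochastic_embedding(3)
    sums.append(float(np.sum(emb)))

print(sums)
# All five summed values are now identical.
```

Applied to the example above, this would mean calling `np.random.seed(0)` just before each `model.fit(g_undirected)`. It is a blunt instrument (it resets global state for the whole process), but it is a quick way to test whether the variation indeed originates from the global RNG.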