log batches of embeddings #43
Is there a way to log batches of embeddings and then interact with all of the batches together in the projector?

I'm currently logging batches with the following code:

When I check the embeddings in the projector I have 10 (batch size * dim) tensors and I can only look at one batch at a time. They are named default:0000 - default:0010. Is there any way to get them all on the same graph?
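The snippet itself didn't survive in the thread. As a hedged guess, a per-batch logging loop like the following reproduces the behavior described (the writer setup, tensor sizes, and labels here are made up for illustration); each add_embedding call with a new global_step creates a separate default:000N tensor in the projector:

    import torch
    from tensorboardX import SummaryWriter

    writer = SummaryWriter()
    for step in range(10):
        emb = torch.randn(32, 128)  # stand-in for one batch of embeddings
        labels = ["item_%d_%d" % (step, i) for i in range(32)]
        # each call creates its own tensor in the projector (default:0000, default:0001, ...)
        writer.add_embedding(emb, metadata=labels, global_step=step)
    writer.close()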
Comments

How about concatenating those tensors into one big tensor and appending a different label prefix for each tensor?
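A minimal sketch of that suggestion, assuming the batches are plain tensors with string labels (the batch contents and label names below are illustrative): concatenate everything, prefix each label with its batch index, and log once.

    import torch
    from tensorboardX import SummaryWriter

    # batches: list of (tensor, labels) pairs collected during the run
    batches = [(torch.randn(32, 128), ["a"] * 32),
               (torch.randn(32, 128), ["b"] * 32)]

    all_embeddings = torch.cat([t for t, _ in batches], dim=0)
    all_labels = ["batch%d/%s" % (i, lab)
                  for i, (_, labs) in enumerate(batches)
                  for lab in labs]

    writer = SummaryWriter()
    writer.add_embedding(all_embeddings, metadata=all_labels)  # one tensor in the projector
    writer.close()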
Unfortunately, I don't think I can. I'm actually logging these on the fly using a hook, and PyTorch hooks can't return anything =(. The only thing I could think of was to write some code to combine the log files, but there should be a better solution than that.
Maybe this is better than dumping to a file. XD

    import torch

    class Log:
        def __init__(self):
            self.tensors = []
            self.labels = []

        def __call__(self, t, l):
            # stash each batch of embeddings with its labels
            self.tensors.append(t)
            self.labels.extend(l)

        def dump(self):
            # concatenate every logged batch into one big tensor
            return torch.cat(self.tensors), self.labels
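A hedged usage sketch for the Log class above, assuming a PyTorch model and the tensorboardX SummaryWriter; the model, layer, and labels are placeholders. The hook only records the output as a side effect, which is why it doesn't need to return anything:

    import torch
    import torch.nn as nn
    from tensorboardX import SummaryWriter

    model = nn.Sequential(nn.Linear(10, 128))  # placeholder model
    log = Log()

    def hook(module, inputs, output):
        # record the embedding batch; the hook's return value is ignored
        log(output.detach(), ["item"] * output.shape[0])

    model[0].register_forward_hook(hook)

    for _ in range(10):  # ten batches, as in the question
        model(torch.randn(32, 10))

    embeddings, labels = log.dump()
    writer = SummaryWriter()
    writer.add_embedding(embeddings, metadata=labels)  # one combined tensor in the projector
    writer.close()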
That's the solution I JUST implemented! lol Thanks!