
Why to create a new tensor? #11

Closed
dragen1860 opened this issue Jun 4, 2018 · 2 comments

Comments

@dragen1860

Hi, dear author:

    def write(self, w, e, a):
        """write to memory (according to section 3.2)."""
        self.prev_mem = self.memory
        self.memory = Variable(torch.Tensor(self.batch_size, self.N, self.M))
        erase = torch.matmul(w.unsqueeze(-1), e.unsqueeze(1))
        add = torch.matmul(w.unsqueeze(-1), a.unsqueeze(1))
        self.memory = self.prev_mem * (1 - erase) + add

In your write method, I don't understand why you create a new `Variable(torch.Tensor(self.batch_size, self.N, self.M))` and then immediately overwrite it with the computed value.
Why not write it directly as follows:

    def write(self, w, e, a):
        """write to memory (according to section 3.2).""" 
        erase = torch.matmul(w.unsqueeze(-1), e.unsqueeze(1))
        add = torch.matmul(w.unsqueeze(-1), a.unsqueeze(1))
        self.memory = self.memory * (1 - erase) + add
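The two versions compute the same result, since the freshly allocated tensor is discarded before it is ever read. A minimal NumPy sketch of the same write equation from section 3.2 of the NTM paper, M_t = M_{t-1} ∘ (1 − w eᵀ) + w aᵀ (the shapes and random inputs below are illustrative, not taken from this repo):

```python
import numpy as np

# Batched NTM write step: M_t = M_{t-1} * (1 - w e^T) + w a^T.
# Illustrative shapes: batch=2, N=4 memory rows, M=3 columns per row.
batch, N, M = 2, 4, 3
rng = np.random.default_rng(0)
memory = rng.random((batch, N, M))   # M_{t-1}: previous memory
w = rng.random((batch, N))           # write weights over memory rows
e = rng.random((batch, M))           # erase vector, elements in (0, 1)
a = rng.random((batch, M))           # add vector

# Per-batch outer products: (batch, N, 1) @ (batch, 1, M) -> (batch, N, M),
# mirroring the unsqueeze(-1) / unsqueeze(1) + matmul in the PyTorch code.
erase = w[:, :, None] @ e[:, None, :]
add = w[:, :, None] @ a[:, None, :]

# One expression, no throwaway intermediate allocation needed.
new_memory = memory * (1 - erase) + add
```

The key point is that `memory * (1 - erase) + add` builds its result tensor on its own, so pre-allocating an empty tensor of the right shape adds nothing.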
@dragen1860
Author

@loudinthecloud please help me.

@loudinthecloud
Owner

Feel free to commit a fix; this is completely unnecessary.
