
Incorrect behavior for atom_messages = True #133

Closed
Varal7 opened this issue Feb 2, 2021 · 5 comments · Fixed by #138

Comments

@Varal7

Varal7 commented Feb 2, 2021

I think there is currently a bug when args.atom_messages = True.
BatchMolGraph uses the get_bond_fdim() function without forwarding the argument, which seems like it will lead to an incorrect bond feature dimension.

@fhvermei
Contributor

fhvermei commented Feb 2, 2021

The bond feature vectors in BatchMolGraph are indeed constructed using both the atom and bond features. When the bond feature vector is requested for the MPNN (BatchMolGraph.get_components(atom_messages=True)), the size of the bond feature vector is corrected and the atom features are removed.

@Varal7
Author

Varal7 commented Feb 2, 2021

I am talking about this line:

self.bond_fdim = get_bond_fdim()

But I can run some tests tomorrow and submit a proper bug report if need be.
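For context, here is a minimal sketch of the dimension logic in question, using the feature lengths quoted later in this thread (atom features of length 133, bond features of length 14). The constants and the exact return expression are stand-ins, not the real chemprop source:

```python
# Stand-in constants based on the lengths quoted in this thread (assumed).
ATOM_FDIM = 133
BOND_FDIM = 14

def get_bond_fdim(atom_messages: bool = False) -> int:
    # With bond messages (the default), each "bond" feature vector also
    # carries the source atom's features; with atom messages it should not.
    return BOND_FDIM + (0 if atom_messages else ATOM_FDIM)

# Calling get_bond_fdim() without forwarding atom_messages always yields
# the larger, concatenated size:
print(get_bond_fdim())                    # 147
print(get_bond_fdim(atom_messages=True))  # 14
```

So if the argument is never forwarded, self.bond_fdim is always set to the larger size.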

@hesther
Contributor

hesther commented Feb 2, 2021

Thank you for your question! I just had a look - I agree with Florence. Upon initialising BatchMolGraph, the bond dimension is set up the same way regardless of whether atom messages are used (line 206), and the bond features are built as the full vector (atom + bond features). But when the atom and bond features are returned in get_components, the bond features get truncated to their correct length if atom messages are used (with the length determined by calling get_bond_fdim again with atom_messages=True):

def get_components(self, atom_messages: bool = False) -> Tuple[torch.FloatTensor, torch.FloatTensor,
                                                               torch.LongTensor, torch.LongTensor, torch.LongTensor,
                                                               List[Tuple[int, int]], List[Tuple[int, int]]]:
    """
    Returns the components of the :class:`BatchMolGraph`.

    The returned components are, in order:

    * :code:`f_atoms`
    * :code:`f_bonds`
    * :code:`a2b`
    * :code:`b2a`
    * :code:`b2revb`
    * :code:`a_scope`
    * :code:`b_scope`

    :param atom_messages: Whether to use atom messages instead of bond messages. This changes the bond feature
                          vector to contain only bond features rather than both atom and bond features.
    :return: A tuple containing PyTorch tensors with the atom features, bond features, graph structure,
             and scope of the atoms and bonds (i.e., the indices of the molecules they belong to).
    """
    if atom_messages:
        f_bonds = self.f_bonds[:, :get_bond_fdim(atom_messages=atom_messages)]
    else:
        f_bonds = self.f_bonds

    return self.f_atoms, f_bonds, self.a2b, self.b2a, self.b2revb, self.a_scope, self.b_scope

This may be a bit clumsy (and there might be faster ways to do this), but correct.

I will close this issue for now, but please feel free to reopen if you have further questions! Yours, Esther

@hesther hesther closed this as completed Feb 2, 2021
@hesther
Contributor

hesther commented Feb 8, 2021

While looking through the code more thoroughly today, I noticed that there actually IS a bug (although a different one than reported):

We set up the bond features in the order atom (length 133) followed by bond (length 14):

self.f_bonds.append(self.f_atoms[a1] + f_bond)
self.f_bonds.append(self.f_atoms[a2] + f_bond)

but then keep the first (!) 14 values, instead of the last 14:

if atom_messages:
    f_bonds = self.f_bonds[:, :get_bond_fdim(atom_messages=atom_messages)]
else:
    f_bonds = self.f_bonds
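A minimal sketch of why this slice grabs the wrong values, using stand-in lengths (3 atom features, 2 bond features instead of 133/14) and mirroring the list concatenation shown above. The fix shown here (slicing from the end) is one possible correction, not necessarily the one in the linked PR:

```python
# Each bond row is built as atom features followed by bond features,
# mirroring self.f_bonds.append(self.f_atoms[a1] + f_bond) (list concatenation).
atom_part = [1.0, 1.0, 1.0]  # stand-in for f_atoms[a1] (really length 133)
bond_part = [2.0, 2.0]       # stand-in for f_bond (really length 14)
f_bonds_row = atom_part + bond_part

bond_fdim = len(bond_part)  # what get_bond_fdim(atom_messages=True) returns

buggy = f_bonds_row[:bond_fdim]   # keeps the FIRST values: atom features
fixed = f_bonds_row[-bond_fdim:]  # keeps the LAST values: bond features

print(buggy)  # [1.0, 1.0] -- wrong, these are atom features
print(fixed)  # [2.0, 2.0] -- correct, these are the bond features
```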

Can someone else confirm this, so I can write a PR?

@hesther hesther reopened this Feb 8, 2021
@cjmcgill
Contributor

cjmcgill commented Feb 8, 2021

@hesther It looks that way to me as well.

@hesther hesther linked a pull request Feb 8, 2021 that will close this issue