MinkaiXu / GeoDiff

Implementation of GeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation (ICLR 2022).

Edge encoder shared between global and local

torfjelde opened this issue · comments

First off, I want to just say that this is some really awesome work!

I'm currently considering applying the approach you've taken in GeoDiff to a particular conformation generation problem, and so I'm having a look through the code + trying to reproduce experiments atm.

While having a look I noticed that

```python
# Encoding global
edge_attr_global = self.edge_encoder_global(
    edge_length=edge_length,
    edge_type=edge_type
)   # Embed edges
```

and
```python
# Encoding local
edge_attr_local = self.edge_encoder_global(
    edge_length=edge_length,
    edge_type=edge_type
)   # Embed edges
```

are the same, leaving
```python
self.edge_encoder_local = get_edge_encoder(config)
```

unused.

Is this sharing of parameters intentional, or just a typo? In case it's the latter, I figured I should make you aware of it :)
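For reference, here is a minimal sketch of what I assume the intended behavior to be (toy stand-in classes, not the actual GeoDiff code): the local branch should call `self.edge_encoder_local` so the two branches learn separate parameters.

```python
# Minimal sketch of the assumed intent, not the actual GeoDiff code:
# the local branch should use its own encoder instance.

class EdgeEncoder:
    """Toy stand-in for get_edge_encoder(config); the real one is an MLP
    embedding (edge_length, edge_type) into a feature vector."""
    def __init__(self, name):
        self.name = name  # each instance represents a separate parameter set

    def __call__(self, edge_length, edge_type):
        # A real encoder would return a learned embedding; here we just tag
        # the output with which encoder produced it.
        return (self.name, edge_length, edge_type)


class ConformationModel:
    def __init__(self):
        # Two separate encoders, as the unused attribute suggests was intended
        self.edge_encoder_global = EdgeEncoder("global")
        self.edge_encoder_local = EdgeEncoder("local")

    def encode_edges(self, edge_length, edge_type):
        # Encoding global
        edge_attr_global = self.edge_encoder_global(
            edge_length=edge_length,
            edge_type=edge_type,
        )
        # Encoding local -- the one-line fix: call the *local* encoder here
        edge_attr_local = self.edge_encoder_local(
            edge_length=edge_length,
            edge_type=edge_type,
        )
        return edge_attr_global, edge_attr_local
```

With this change the two branches no longer share parameters, and `self.edge_encoder_local` is no longer dead code.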

Oh nope, definitely not the case. I think I should have used different networks. Let me check my git history...

Hi Tor, thanks a lot for your nice comment!

I checked my git history, and it seems I did share the parameters in my experiments... But yes, it is a typo, and you can fix it in your own experiments!
As for the empirical performance, this module is relatively simple, so I don't think sharing it affects the results much.

One concern about correcting the repo now, though, is that it would make the currently provided checkpoint not work very well. So for this repo, I may just keep the current code until I find time to re-run the whole model... :)

Thanks again for pointing this out!
Minkai

> As for the empirical performance, this module is relatively simple, so I don't think sharing it affects the results much.

Exactly. I deliberately avoided the word "bug" in my description because it's such a minor thing that it shouldn't matter much, and the empirical performance of the model is clearly already very good :) But I figured I'd at least make you aware of it in case it wasn't intentional 👍 Glad you found it useful!

> One concern about correcting the repo now, though, is that it would make the currently provided checkpoint not work very well. So for this repo, I may just keep the current code until I find time to re-run the whole model... :)

That makes sense :) As a tip: one thing you can always do is create a separate branch with the code used in the publication, and point readers to it in the README on #main.

Thanks, Tor! And please ping me if there is any other question!