
Review Request: C. de la Torre-Ortiz and A. Nioche #10

Closed
c-torre opened this issue Nov 7, 2019 · 58 comments

@c-torre

c-torre commented Nov 7, 2019

Dear ReScience editors,

I request a review for the following replication:

Original article: https://doi.org/10.3389/fncom.2015.00149

PDF URL: https://github.com/c-torre/replication-recanatesi-2015/blob/master/re-neural-network-model-of-memory-retrieval.pdf
Metadata URL: https://github.com/c-torre/replication-recanatesi-2015/blob/master/metadata.tex
Code URL: https://github.com/c-torre/replication-recanatesi-2015/

Scientific domain: Computational Neuroscience
Programming language: Python
Suggested editor: @oliviaguest

@oliviaguest
Member

Happy to take this on. @bobaseb are you interested in reviewing this? 😄

@bobaseb

bobaseb commented Nov 8, 2019

Sure I'm in :)

@oliviaguest
Member

👋 @anne-urai would you be able to review this?

@anne-urai

I don't really have expertise in neural networks, but my colleague @jamesproach does, and he's happy to do this. He'll register as a reviewer first.

@oliviaguest
Member

@anne-urai thank you!

@jamesproach let me know when you are ready and if you have any questions. Same goes for you, @bobaseb, of course. The sooner the better, but let us know when you plan to get this done. 😄

@bobaseb

bobaseb commented Nov 21, 2019

The authors do a good job of transforming the original model, based on individual neurons, into a population model. New parameters are introduced to this effect, and the original model's dynamics (i.e., retrieving a memory while inhibiting others under periodic inhibition) are broadly reproduced.
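To make the inhibition dynamics concrete, here is a minimal sketch of the mechanism; the sinusoidal form and all names and values are my own illustrative assumptions, not the replication's actual code:

```python
import numpy as np

# Illustrative sketch only: periodic inhibition oscillating between two
# bounds; memories tend to be recalled near the inhibition minima,
# where net excitation is strongest. All names and values are assumed.
t = np.arange(0.0, 10.0, 0.001)   # time, arbitrary units
phi_min, phi_max = 0.7, 1.06      # inhibition bounds (assumed)
freq = 1.0                        # oscillation frequency (assumed)

phi = phi_min + 0.5 * (phi_max - phi_min) * (1.0 + np.sin(2.0 * np.pi * freq * t))
```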

The code is clear and human-readable, with enough comments to get the gist of each computation. I was able to run it with ease on a Linux machine.

Three specific questions came to mind when reviewing this work.

  1. Why is the sparsity parameter different? Shouldn't it be 0.1?

  2. Where is the recall activation threshold parameter defined?

  3. Why is there no attempt to plot something similar to figures 3 to 6?

Since the original model has been modified and only the first figure of the original paper seems to have been replicated, perhaps this is best suited as a partial replication.

@oliviaguest
Member

@jamesproach what's a realistic ETA for you to take a look here? Thanks. 😄

@jamesproach

The authors reproduce simulation results from Recanatesi et al. using a model of associative memory retrieval which accounts for the sequential activation of memories through two network features: (1) synaptic connections which enforce sequential associations between particular memories and (2) periodic inhibition. This is an extension of the Hopfield network model of auto-associative memory. In their reproduction, the authors employ a dimensionality reduction proposed in the original article which groups units that present the same activity across all memories, reducing the size of the dynamical system by several orders of magnitude.
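As a rough illustration of that reduction (a sketch under assumed sizes, sparsity, and names, not the authors' code), units can be grouped by their pattern of memory membership:

```python
import numpy as np

# Sketch of the population reduction: neurons sharing the same binary
# "encoding signature" across all memories are merged into a single
# population. Sizes, sparsity, and names are illustrative assumptions.
rng = np.random.default_rng(0)
n_neurons, n_memories, sparsity = 100_000, 16, 0.1

# xi[i, mu] = 1 if neuron i participates in memory mu.
xi = (rng.random((n_neurons, n_memories)) < sparsity).astype(int)

# Each distinct row defines one population; the reduced system has one
# unit per population, weighted by that population's size.
signatures, sizes = np.unique(xi, axis=0, return_counts=True)
print(f"{n_neurons} neurons reduced to {len(signatures)} populations")
```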

The authors were able to reproduce the simulation results from the original article with slight modifications to the parameters. The associated Python code is clearly written, well commented and fully reproduces the figures in the manuscript.

Major Comment:
What accounts for the difference between the results with the original and the reproduction variables? As a work aimed at reproduction, presenting the differences between the dynamics under the original parameters and under the scaled reproduction variables would be useful. In some cases parameters need to be changed by a factor of N (10^5). Presenting these differences would help in understanding this.
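To illustrate the kind of scaling I mean (a generic mean-field convention with assumed notation, not necessarily the article's):

```python
# Generic mean-field convention (assumed notation, not the article's):
# couplings are divided by the network size N so that the summed input
# to a unit stays finite as N grows.
N = 10**5                    # neurons in the original network
J_neuron = 1.0               # per-synapse coupling at the neuron level (assumed)
J_population = J_neuron / N  # rescaled coupling in the reduced model
```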

Minor Comments:
After running the network for several different seeds, I found that there were several periods where the same memory was recalled in sequence. The original article presents a simulation with up to three repetitions while the provided code often produces repetitions of 6 or longer, even 14 in one case. Is this a meaningful difference resulting from the difference in parameters?

Could the authors provide a reproduction of the simulation results in figures 3-6?

@rougier
Member

rougier commented Jan 23, 2020

@oliviaguest @c-torre Gentle reminder

@c-torre
Author

c-torre commented Jan 23, 2020

Dear all,
We thank the reviewers for their comments. We also apologize for our delay. We are still addressing the reviews and the related issues that have arisen (ETA: two weeks from now). Thank you for your patience.
With kind regards,
Carlos

@c-torre
Author

c-torre commented Feb 7, 2020

Dear all,
Thank you for your patience so far. We had to scale up our computing and build a pipeline for the computer cluster due to very expensive computations (i.e., simulating many networks for many time steps, around 8 h per network on the cluster). In that regard, we have been experiencing delays due to particularly long job queues in the system. We are working on this issue and expect to deliver our response in one week's time.

@oliviaguest
Member

Thank you for the update! Sounds like you are making huge progress. 👍

@rougier
Member

rougier commented Mar 9, 2020

Any progress?

@c-torre
Author

c-torre commented Mar 9, 2020

Dear all,
We would like to update on the status of the replication.

Following the reviews, we attempted to reproduce all the figures of the original paper. Although the overall behavior of the network is similar to that of the original network (figure 1), we cannot replicate many of the other results (figures 3 to 6). At this point, our conclusion would be that the original article is not reproducible.

However, while working on the replication, we began a discussion by email with one author of the original article. He has been able to provide scripts from early versions of the code (in a proprietary language). It is very likely that the quality of this replication will benefit from this newfound help, but further investigation is still required. We therefore thank the editorial board and the reviewers in advance for their patience.

Best regards,
Carlos

@oliviaguest
Member

@rougier this seems like a reason to rewrite and re-review this work? What's our policy in this type of situation?

@rougier
Member

rougier commented Mar 10, 2020

I think we have a section on non-reproducibility which advises contacting the original author to try to see where the problem lies. I guess we can now wait for the new input and check whether @c-torre will be able to replicate the work based on it. We can leave this thread open until then.

@rougier
Member

rougier commented May 4, 2020

Gentle reminder

@c-torre
Author

c-torre commented May 4, 2020

Dear everyone,
I hope you are safe in these difficult times.

In this time, we addressed the reviewer comments, including the change to one parameter value.
The model proved to be particularly sensitive to this, and the dynamics were completely distorted.

We then reimplemented the model, making new adjustments to the equations and parameters, until we were able to obtain figure 2 again.
We launched a series of simulations to reproduce the last figures of the original article.
Unfortunately, the results were not as expected.

We contacted the original author, who provided what appears to be an implementation similar to the one used for their article.
They also provided further explanations of the theoretical concepts driving their implementation.
After reimplementing their code in Python, we obtained yet another working version of the network that could replicate figure 2.
We launched the extended simulations to produce the last figures, but again the behavior was far from the intended one.
We were not able to get in touch with the original author again.

On the other hand, we are now able to provide many more details on the theory and implementation.
In addition, we have two working versions of the code, driven by similar but not identical theory:
we have a low-sparsity version that works with the simulation numbers described in the article;
we also have a high-sparsity version that assumes an infinite number of neurons, which is the closest to the original implementation.

We are wondering how we should proceed in this case.

@oliviaguest
Member

oliviaguest commented May 4, 2020

  • We contacted the original author, who provided what appears to be an implementation similar to the one used for their article.

  • We were not able to get in touch with the original author again.

@c-torre Can you please provide a bit more info regarding what has changed and why? I don't blame you in any way; I would just like to know more and hear your opinion.

@c-torre
Author

c-torre commented May 4, 2020

The complete timeline occurred as follows:
I contacted the corresponding author of the original article, but I did not get a reply.
I then contacted the first author. He was very willing to help, provided some initial discussion, and finally shared the partial code that allowed us to reproduce figure 2.

I launched the longer simulations to get results for the last figures, but again could not reproduce the results. At this point I contacted the first author of the original article again, but could not get a reply even after a reminder.

I'm wondering if the current situation may have an effect in some way, but unfortunately it is all guesswork for now. There do not seem to be any other changes that would explain why we could not reach them again.

@oliviaguest
Member

@c-torre Very useful. Thank you. Have you emailed the corresponding author again? Or just once at the start?

@oliviaguest
Member

PS: @c-torre I also forgot to ask... did you email any of the other authors? The original article has four in total.

@c-torre
Author

c-torre commented May 4, 2020

I emailed the corresponding author only at the beginning, as the first author had initially told me that he might not reply to emails often in general. I did not contact any authors other than the first and the corresponding (last) one.

@oliviaguest
Member

@rougier @khinsen @benoit-girard what do we do/think in this case? Is there value in asking for input from (any of) the original authors?

@c-torre
Author

c-torre commented May 28, 2020

Hi all, I hope everything is well. We would like to update you on the state of the replication, especially since many things have happened since we contacted all the authors.

We got replies and clarifications for a few questions we had. The authors spotted and confirmed that several typos had been introduced into the equations, which made the replication impossible.

The first author regained access to the cluster where the original files were stored and found that the parameters used in the cluster simulations were different from the ones in the previously provided script, and also from those reported in the paper.

As of right now, it seems that the first author has finally been able to fully simulate all the figures, which should complete the replication work.
We still have to launch the longer simulations ourselves to confirm the replication on our side, and then translate and clean the code and update the manuscript accordingly.

The first and last authors expressed their willingness to publish a correction to the original article, especially once this replication paper is published, so that they can provide a direct link to an implementation and credit us for helping with the corrections.

We are working on the simulations now and will be back, hopefully with good news.

@oliviaguest
Member

@c-torre: Thank you so much for the update, and well done on all the work in figuring this out! I am very pleased to have pushed (lightly) towards you contacting them; this is highly productive for the journal/this publication... and I hope it has also been useful for you and your work as well as, of course, for science generally! Great work and I am excited to see the next steps. 👏

@c-torre
Author

c-torre commented Aug 4, 2020

Dear all,
Please find an updated version of the code at the link in the original issue. The PDF link in the original issue will take you to our previous manuscript submission, if needed for comparison. The new PDF can be found at: https://github.com/c-torre/replication-recanatesi-2015/blob/full-replication/re-neural-network-model-memory-retrieval.pdf

We are thankful for the reviewers' comments, as they have drastically improved the quality of our work. We are also grateful for all the discussion, as it led the replication work in the appropriate direction. We would encourage you to revisit both the manuscript and the code, as very major changes have taken place. Here we address the review comments:

Review 1

  • Why is the sparsity parameter different? Shouldn't it be 0.1?

A typo was found and corrected; the parameter is now set to its correct value of 0.1.

  • Where is the recall activation threshold parameter defined?

We now follow the approach of the original interpretation, where recalls are consistently observed at every inhibition minimum, with rates always rising above this threshold value. In the implementation, this means that recalls are determined not by a threshold, but by the time iteration at which the peak maximum occurs.
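A minimal sketch of this peak-timing criterion (array names and shapes are illustrative, not our exact code):

```python
import numpy as np
from scipy.signal import argrelextrema

# Sketch of the peak-timing recall criterion: instead of thresholding,
# assign a recall at each inhibition minimum to the memory whose
# average firing rate peaks there. Names and shapes are illustrative.
def recalled_memories(rates, inhibition):
    """rates: (n_memories, n_timesteps); inhibition: (n_timesteps,)."""
    minima = argrelextrema(inhibition, np.less)[0]  # indices of inhibition minima
    return rates[:, minima].argmax(axis=0)          # winning memory per cycle
```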

  • Why is there no attempt to plot something similar to figures 3 to 6?

We have corrected this major issue and now provide the previously missing figures, as well as additional ones.

Review 2

  • Major Comment: What accounts for the difference between the results with the original and the reproduction variables? As a work aimed at reproduction, presenting the differences between the dynamics under the original parameters and under the scaled reproduction variables would be useful. In some cases parameters need to be changed by a factor of N (10^5). Presenting these differences would help in understanding this.

We have been collaborating with the original authors as we discovered errors in the original article. Normalization of parameters was indeed one of the main issues with the original paper. We have addressed this in detail in the new manuscript, as it is the core of the major changes since our original review request.

  • Minor Comments:
    After running the network for several different seeds, I found that there were several periods where the same memory was recalled in sequence. The original article presents a simulation with up to three repetitions while the provided code often produces repetitions of 6 or longer, even 14 in one case. Is this a meaningful difference resulting from the difference in parameters?

A new section has been added with the new figures, which addresses changes to the seed in more detail, as 10,000 networks with different seeds are simulated to produce the recall-analysis figures and conclusions. We agree that figure 2 can be seed- or parameter-dependent, and that many recalls in a row are possible under seed changes, as seen in the later figures.

While you take the time to look at our replies, we intend to continue collaborating on the correction of the original article, as the original authors expressed the wish to continue working together on this issue, and we hope to encourage future citations to also include our work.

Thank you for all your patience and help,
Best,
Carlos

@rougier
Member

rougier commented Sep 8, 2020

@oliviaguest Gentle reminder

@oliviaguest
Member

@jamesproach and @bobaseb would you be able to give feedback on the above (e.g., whether your comments and questions have been addressed), please?

@bobaseb

bobaseb commented Sep 8, 2020

Yes, my comments have been addressed. I'm happy the replication worked out in the end, after contacting the original authors and correcting the errors in the original article (all stated in the current replication).

@jamesproach

jamesproach commented Sep 9, 2020 via email

@oliviaguest
Member

@c-torre OK, amazing!

Can you please update the metadata.md file with the details of each reviewer, editor, etc.? The acceptance date is today; also see ReScience/ReScience#48 (comment)! 🥳

@c-torre
Author

c-torre commented Sep 10, 2020

Hi all,

This is very good news. However, we went the extra step to ensure that everything is absolutely correct this time. Thanks to the original authors, we have spotted the following mistakes in our current submission:

  1. Equation 2 should have the same time scaling as equation 10, adding sqrt(dt) to the noise term (see the sketch after this list).
  2. Equation 4 has a mistake in the constant part.
  3. Equation 6 has a mistake in the contiguity terms.
  4. A parenthesis is missing in the manuscript text.
  5. A parameter value was not updated in the new manuscript text.
  6. The parameter table has some inconsistencies in its scaling.
  7. The code should be re-run to ensure that it is bug-free and that the replication holds after this tweaking.
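For item 1, a minimal sketch of the intended sqrt(dt) noise scaling (a generic Euler-Maruyama step; the symbols are generic, not our article's notation):

```python
import numpy as np

# Euler-Maruyama step: the drift scales with dt, while the noise term
# must scale with sqrt(dt) so its variance is independent of the step
# size. Generic names only; this is not the article's code.
def euler_maruyama_step(x, drift, xi, dt, rng):
    return x + drift(x) * dt + xi * np.sqrt(dt) * rng.standard_normal(x.shape)
```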

While the replication is in a much better place now than our first submission, I believe it is in the community's best interest to address these issues before proceeding to final publication.

@oliviaguest
Member

Sure, @c-torre! I'd assumed you had fixed these. Let me know when it's good to go.

@c-torre
Author

c-torre commented Dec 28, 2020

Hi! The manuscript is ready and the code now has a Zenodo DOI. What's next?

2020_rescience_neural_network_model_of_memory_retrieval.pdf

@rougier
Member

rougier commented Dec 31, 2020

We'll most probably publish the paper next week. Don't hesitate to remind us.

@oliviaguest
Member

[ Indeed @rougier and sorry @c-torre for taking a bit of time. I am dealing with a big chronic illness flare up and will get to this ASAP. Apologies. ]

@c-torre
Author

c-torre commented Jan 5, 2021

Hi! No worries, thanks for letting me know. Take care and proceed when you're feeling better. There's no rush on our side. All the best.

@rougier
Member

rougier commented Jan 11, 2021

@oliviaguest I can publish it for you, just tell me.

@oliviaguest
Member

If you can that would be amazing — thank you, @rougier!

@rougier
Member

rougier commented Jan 13, 2021

@oliviaguest no prob.
@c-torre can you remind me which repository holds your article so that I can make a PR against it? I would especially need the metadata.yaml file.

@c-torre
Author

c-torre commented Jan 18, 2021

Added the article source with metadata.yaml at c-torre/replication-recanatesi-2015-article.

@rougier
Member

rougier commented Jan 20, 2021

Perfect, thanks. Can you update your article to the latest template (and check that it does not break anything)? This is necessary to include a Software Heritage ID for your code, which is now the default way of ensuring the code will be available in the future. Getting your code on Software Heritage is really simple; just check https://www.softwareheritage.org/save-and-reference-research-software/

@c-torre
Author

c-torre commented Jan 22, 2021

Updated the template, corrected LaTeX errors, and added the code to Software Heritage (c-torre/replication-recanatesi-2015-article).

@rougier
Member

rougier commented Jan 25, 2021

Thanks, I'll try to publish it today. For the SWH link, can you send the SWHID (instead of the URL)? You should have received it when you saved the repo. The article template will build the full URL based on this ID.

@rougier
Member

rougier commented Jan 25, 2021

Here is the sandboxed version (not the real one): https://sandbox.zenodo.org/record/719803
Can you check everything looks good?

@c-torre
Author

c-torre commented Jan 25, 2021

Sandboxed version looks good.

For the SWHID, the only thing I seem to find is a "permalink" hidden in an SWH sidebar:

swh:1:dir:0b541aeb4707ecedfbbcdd85adfd0100d748cc03;
origin=https://github.com/c-torre/replication-recanatesi-2015/;
visit=swh:1:snp:0fd6c99fcc42679041cee0c384bfd4a0ffa0a50e;
anchor=swh:1:rev:59a3c88f5e7af41d4bf748ac4c41e03bdb2722da

https://archive.softwareheritage.org/browse/origin/directory/?origin_url=https://github.com/c-torre/replication-recanatesi-2015/

Is it any of these?

@rougier
Member

rougier commented Jan 25, 2021

The first one (swh:1:dir:0b541aeb4707ecedfbbcdd85adfd0100d748cc03) is the right one, yes; I'll include it. Thanks.

@rougier
Member

rougier commented Jan 25, 2021

Your article is published at https://zenodo.org/record/4461767. It will appear soon on the ReScience website as well.
Congratulations!

@rougier
Member

rougier commented Jan 25, 2021

@oliviaguest Feel free to close the issue.
@c-torre I'll make a PR onto your repo with my modifications.

@oliviaguest
Member

Thank you all for this! Nice work. 😊
