Keyword arguments for the forward simulators used in select Protocols #358
Conversation
…ept_ for dealing with "target models". Need to figure out if this is me messing up or if this should be expected for some reason.
… should not have been there.
…xecuted. Added the "simulator" keyword argument to the calling functions in these cases.
…ns that end up executing a GST or ModelTest protocol
…t of ModelTest.run(...)
…itions into an instance of the class. Needed in case some ForwardSimulator classes are stateful.
…stance of a ForwardSimulator.
…seems to be a bug in existing, unrelated test code
Tests are currently failing. Looks like they've been failing for a while. The failures are first observed on this CI run, which was triggered when I pushed this pair of commits. I'll note that the earlier of the two commits in the above pair changed …
Thanks @rileyjmurray! I've actually started looking into the failing tests already, and I agree that this may be a case of old tests being picked up that were failing. I will figure that out and push a commit as part of my review.
I've started looking into this as well. From what I can tell, part of the issue is that we're picking up some base helper classes as testing classes that we shouldn't be. Here is a comparison of the output of … I ran this locally, but the behavior seems to match up with what is happening on the runners in the CI log @rileyjmurray linked to. I don't see any difference in the …

Question for you about the …

Edit: Sorry for the very long output; is there any decent way to put blocks of text side-by-side?

Edit 2: I can confirm this isn't a pytest version thing. Syncing the pytest versions up across the two venvs on my machine didn't change anything.
Ok, I think I've tracked down most of the failing tests. As mentioned above, it looks like we're picking up tests from helper base classes we shouldn't be. This was introduced by commit d922474, which made one of the helper base classes a subclass of …
It looks like this happens in addition to any of the other test discovery that happens. @rileyjmurray, do you remember what the motivation was for this change? Maybe we can achieve it through different means.
@coreyostrove I don't remember what initially prompted my change you highlighted (direct link to relevant file). |
Thanks, @rileyjmurray! I pulled up this PR to read your comment just as the newest round of tests completed. The last failing test is a straightforward one: you have a line in one of your new tests that is referencing the non-existent attribute …; replacing it with … should do the trick.
@sserita @coreyostrove, tests are passing!
Yep, I saw that! Thanks @coreyostrove for the work on fixing the tests. I had reached a similar conclusion, but your answer was much more comprehensive than mine was going to be.
A few minor changes requested, and I'm interested to hear a little more about your thought process on ForwardSimCallable (I think it'll be mildly annoying to maintain) and the new logic for what objects you think will be passed in to cast that need to be callable.
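For readers following the discussion of `ForwardSimCallable` and the new `cast` logic, here is a hedged sketch of what accepting a zero-argument callable in `cast` might look like. The names mirror the PR (`ForwardSimulator.cast`, `ForwardSimCallable`), but the bodies are illustrative assumptions, not pyGSTi's actual implementation.

```python
from typing import Callable, Union

# Illustrative stand-in; pyGSTi's real ForwardSimulator is far richer.
class ForwardSimulator:
    @classmethod
    def cast(cls, obj):
        """Coerce obj to a ForwardSimulator instance (sketch)."""
        if isinstance(obj, ForwardSimulator):
            return obj
        if callable(obj):
            # Covers both ForwardSimulator subclasses and zero-argument
            # factory functions: invoking either must yield a simulator.
            out = obj()
            if isinstance(out, ForwardSimulator):
                return out
        raise ValueError(f"Cannot cast {obj!r} to a ForwardSimulator.")

# Type alias analogous to the one the PR adds in forwardsims/__init__.py:
# values that ForwardSimulator.cast is willing to accept.
ForwardSimCallable = Union[ForwardSimulator, Callable[[], ForwardSimulator]]

class MapForwardSimulator(ForwardSimulator):  # hypothetical subclass
    pass

# A zero-argument factory lets callers defer construction, which matters
# if some ForwardSimulator classes are stateful and must not be shared.
sim = ForwardSimulator.cast(lambda: MapForwardSimulator())
```

One design note on the maintainability concern: a plain `Union` alias has to be kept in sync with whatever `cast` actually accepts, which is presumably the "mildly annoying to maintain" aspect raised above.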
@@ -193,7 +193,7 @@ def copy(self):
                              self._processor_grid, self._pblk_sizes)

     def create_layout(self, circuits, dataset=None, resource_alloc=None, array_types=('E',),
-                      derivative_dimension=None, verbosity=0):
+                      derivative_dimensions=None, verbosity=0):
Note to self: This is also a minimal-exposure API change, probably worth noting in the changelog for 0.9.12.1.
TODOs for me: …
Woohoo!
Riley
On Mon, Jan 15, 2024 at 5:32 PM, Stefan Seritan wrote:
Merged #358 into develop.
There are a few Protocol classes where a call to `Protocol.run(...)` is significantly affected by the forward simulator used by the Protocol's `model`. This PR changes the signature of these `run` functions to allow a `simulator` keyword argument. This argument defaults to None. If it's non-None, then we assign it as the simulator to any relevant `Model` objects.

Correctness of the implementation relies on the casting that happens inside the setter method `Model.sim`, which is handled by the classmethod `ForwardSimulator.cast`. I've made minor changes to this classmethod so that it can accommodate any callable that returns a `ForwardSimulator` object when invoked with zero arguments. I've also created a type alias (in the `__init__.py` file of the `forwardsims` subpackage) for values that are permitted in `ForwardSimulator.cast`.

I made small but nontrivial changes to `GST.run`, `StandardGST.run`, and `ModelTest.run`, and added tests to verify correctness of my changes for each of these functions. I also made trivial changes to a few functions that served as entry points to the methods mentioned above; I didn't create separate tests for these changes.
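As a rough illustration of how a `simulator` keyword could thread through a protocol's `run`, here is a minimal sketch. The `Model`/`ModelTest` classes below are simplified stand-ins, not pyGSTi's actual API; in particular, the setter here only hints at the `ForwardSimulator.cast` routing described above.

```python
# Minimal stand-ins illustrating the run(..., simulator=None) pattern.
class Model:
    def __init__(self):
        self._sim = None

    @property
    def sim(self):
        return self._sim

    @sim.setter
    def sim(self, value):
        # In pyGSTi this assignment routes through ForwardSimulator.cast;
        # here we only mimic the "zero-arg callable" acceptance.
        self._sim = value() if callable(value) else value

class ModelTest:
    def __init__(self, model):
        self.model = model

    def run(self, data=None, simulator=None):
        # New keyword: defaults to None. If non-None, assign it to the
        # relevant Model object(s) before the protocol executes.
        if simulator is not None:
            self.model.sim = simulator
        return self.model.sim  # stand-in for the real protocol results

class FakeSimulator:
    """Hypothetical simulator stand-in."""

# Passing an instance assigns it directly; passing a class (callable)
# would construct a fresh instance, useful for stateful simulators.
result = ModelTest(Model()).run(simulator=FakeSimulator())
```

Passing the class itself (`run(simulator=FakeSimulator)`) also works in this sketch, since the setter invokes any callable it receives.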
@sserita this is ready for review