Also cache undefined Rij values #1634
Merged
Conversation
If we know that the correlation coefficient is undefined for a particular combination of data sets, then we can cache that knowledge and avoid the cost of rediscovering this fact for other symmetry-related combinations of the same data sets. This shouldn't change any results, and is only likely to have a significant performance benefit when dealing with many sparse data sets (i.e. where there are many pairs of data sets for which a valid correlation coefficient cannot be calculated).

Co-authored-by: Daniel Paley <dwpaley@lbl.gov>
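The idea can be sketched as follows. This is a minimal illustration, not the actual ``dials.cosym`` implementation; the names (``pearson``, ``RijCache``) are hypothetical. The key detail is that ``None`` ("undefined") is itself a cacheable result, so a plain ``dict.get`` with a default would not work — a sentinel is needed to distinguish "cached ``None``" from "not yet computed":

```python
import math


def pearson(xs, ys):
    """Pearson correlation of paired observations, or None if undefined
    (fewer than two common pairs, or zero variance in either set)."""
    pairs = [(x, y) for x, y in zip(xs, ys) if x is not None and y is not None]
    if len(pairs) < 2:
        return None
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    syy = sum((y - my) ** 2 for _, y in pairs)
    if sxx == 0 or syy == 0:
        return None  # zero variance: correlation is undefined
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    return sxy / math.sqrt(sxx * syy)


class RijCache:
    """Cache Rij values, including None for undefined combinations, so that
    symmetry-related combinations of the same data sets are not recomputed."""

    _MISSING = object()  # sentinel: distinguishes "cached None" from "absent"

    def __init__(self):
        self._cache = {}

    def get(self, key, compute):
        value = self._cache.get(key, self._MISSING)
        if value is self._MISSING:
            value = compute()          # may legitimately return None
            self._cache[key] = value   # cache None too, to skip recomputation
        return value
```

On a cache hit, ``compute`` is never called, so an expensive "is this correlation even defined?" check runs at most once per equivalent pair of data sets.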
Codecov Report

@@           Coverage Diff           @@
##             main    #1634   +/-   ##
=======================================
  Coverage   66.63%   66.63%
=======================================
  Files         616      616
  Lines       68950    68950
  Branches     9600     9600
=======================================
+ Hits        45944    45946     +2
+ Misses      21070    21068     -2
  Partials     1936     1936
dwpaley approved these changes on Mar 25, 2021
ndevenish pushed a commit that referenced this pull request on Mar 25, 2021
DiamondLightSource-build-server added a commit that referenced this pull request on Mar 30, 2021

Bugfixes
--------
- ``dials.cosym``: Cache cases where Rij is undefined, rather than recalculating each time. This can have significant performance benefits when handling large numbers of sparse data sets. (#1634)
- ``dials.cosym``: Fix factor of 2 error when calculating target weights (#1635)
- ``dials.cosym``: Fix broken ``engine=scipy`` option (#1636)
DiamondLightSource-build-server added a commit that referenced this pull request on Mar 31, 2021
DiamondLightSource-build-server added a commit that referenced this pull request on Mar 31, 2021

Features
--------
- ``dials.cosym``: Significantly faster via improved computation of functional, gradients and curvatures (#1639)
- ``dials.integrate``: Added parameter ``valid_foreground_threshold=``, to require a minimum fraction of valid pixels before profile fitting is attempted (#1640)

Bugfixes
--------
- ``dials.cosym``: Cache cases where Rij is undefined, rather than recalculating each time. This can have significant performance benefits when handling large numbers of sparse data sets. (#1634)
- ``dials.cosym``: Fix factor of 2 error when calculating target weights (#1635)
- ``dials.cosym``: Fix broken ``engine=scipy`` option (#1636)
- ``dials.integrate``: Reject reflections with a high number of invalid pixels, which were being integrated since 3.4.0. This restores better merging statistics, and prevents many reflections being incorrectly profiled as zero-intensity. (#1640)
DiamondLightSource-build-server added a commit that referenced this pull request on Apr 1, 2021
DiamondLightSource-build-server added a commit that referenced this pull request on Apr 1, 2021

Features
--------
- ``dials.cosym``: Significantly faster via improved computation of functional, gradients and curvatures (#1639)
- ``dials.integrate``: Added parameter ``valid_foreground_threshold=``, to require a minimum fraction of valid pixels before profile fitting is attempted (#1640)

Bugfixes
--------
- ``dials.cosym``: Cache cases where Rij is undefined, rather than recalculating each time. This can have significant performance benefits when handling large numbers of sparse data sets. (#1634)
- ``dials.cosym``: Fix factor of 2 error when calculating target weights (#1635)
- ``dials.cosym``: Fix broken ``engine=scipy`` option (#1636)
- ``dials.integrate``: Reject reflections with a high number of invalid pixels, which were being integrated since 3.4.0. This restores better merging statistics, and prevents many reflections being incorrectly profiled as zero-intensity. (#1640)
- Fix rare crash in symmetry calculations when no resolution limit could be calculated (#1641)