
Version 0.14.0 #788

Closed
pc494 opened this issue Jan 6, 2022 · 40 comments
@pc494 (Member) commented Jan 6, 2022

Happy New Year to all! In exciting news, hyperspy/hyperspy#2703 is now merged, so I think we should start thinking about getting the code into a release state for when that comes out. I'll write a list up tomorrow, but in the meantime, if people could drop ideas in the comments, that would help a lot.

(Drag and drops from #765)

@magnunor (Collaborator) commented Jan 6, 2022

I very much agree! Are we skipping the 0.13.x release, and going straight for 0.14.0?

@pc494 (Member, Author) commented Jan 6, 2022

I think so (we are skipping 0.13.4 and going straight to 0.14.0). There is some new functionality floating around, plus some deprecations I would like to get in, which makes a minor release make sense (at least to me). It will also be over a year since 0.13.0.

@magnunor (Collaborator) commented Jan 6, 2022

Sounds good! I'll start on a pull request which makes pyxem work with the hyperspy RELEASE_next_minor branch (i.e. the 1.7.0 version).

@magnunor (Collaborator):

There are a number of tests from my "old" pixstem code which are currently failing; I'll go through those and sort them out, probably during the weekend.

Most of them seem related to various skimage functions, like EllipseModel and peak finding.
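As a rough sense of what those peak-finding tests exercise, here is a minimal sketch in plain NumPy (not pyxem's or skimage's actual code; `local_max_peaks` is a made-up name standing in for something like skimage.feature.peak_local_max):

```python
import numpy as np

def local_max_peaks(image, threshold=0.5):
    """Return (row, col) coordinates of strict local maxima above a threshold.

    A toy stand-in for skimage-style peak finding, used only to illustrate
    the kind of operation the failing tests exercise.
    """
    peaks = []
    for r in range(1, image.shape[0] - 1):
        for c in range(1, image.shape[1] - 1):
            window = image[r - 1:r + 2, c - 1:c + 2]
            # Keep only strict, unique local maxima above the threshold.
            if (image[r, c] > threshold
                    and image[r, c] == window.max()
                    and (window == image[r, c]).sum() == 1):
                peaks.append((r, c))
    return peaks

# Toy diffraction-like pattern with two bright spots
img = np.zeros((9, 9))
img[2, 3] = 1.0
img[6, 6] = 0.8
print(local_max_peaks(img))  # → [(2, 3), (6, 6)]
```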

@pc494 (Member, Author) commented Jan 20, 2022

Pruning (or deprecating) of anything that can go is also encouraged.

Not sure if I've said this already, but there is no huge rush on this.

@magnunor (Collaborator):

With the further map changes sorted out by @CSSFrancis in hyperspy/hyperspy#2878, I think the remaining work includes fixing the various broken functionality in https://github.com/pyxem/pyxem/tree/hyperspy_release_next_minor

I sorted out some of these in #789, but there were a number of issues related to the handling of ragged arrays.

@CSSFrancis started on this in the hyperspy/hyperspy#2877 pull request, where the blockwise work was separated out into its own pull request (hyperspy/hyperspy#2877).

@CSSFrancis: any plans to continue with the "ragged array" part of hyperspy/hyperspy#2877? If yes, we can wait with those specific "bugs" in pyxem until that is resolved. If not, we can sort out the issues in https://github.com/pyxem/pyxem/tree/hyperspy_release_next_minor with some workarounds.

@CSSFrancis (Member):

Just so that there isn't too much overlap with what people are working on, we should maybe comment here before starting to fix a failing test.

That being said, I will figure out the failing vector tests with map.

@magnunor (Collaborator):

@CSSFrancis, sounds good! I got through some of the failing tests in #789, so check whether those solutions can be copied over.

I had planned to "copy" the fixes from #789 to https://github.com/pyxem/pyxem/tree/hyperspy_release_next_minor, taking extra care with regard to the recent changes in HyperSpy, but I haven't had the time yet.

@CSSFrancis (Member) commented Mar 15, 2022

So far we have the following (grouped) failing tests; I'll update this list as new PRs are added:

Failing Tests

  • 0D diffraction vector fixes: Fix Vector Map #811
  • Pearson correlation bug fixes: Bugfix Pearson Correlation #809
  • Subpixelrefinement Generator bug fixes: @magnunor, can you submit your fix for this one when you have the chance? I got a little confused with it. (I think the previous method wasn't doing what it was supposed to anyway.)
  • map function with an iterating signal that is a function rather than an array; fixes the failing affine transformation test (@CSSFrancis to submit fix to hyperspy): BugFix: Map Function With non array signals. hyperspy/hyperspy#2903
  • Index generation fails (test_indexation_generator.test_vector_indexation_generator_index_vectors). This is related to DiffractionVectorLibrary.structures not being properly initialized in that test. @hakonanes might have a better idea about this, as I don't really know much about diffsims. That said, that test suite is kind of confusing, so I wouldn't be opposed to rewriting it. Long term, I plan to fix this with an overhaul of the diffraction vectors class.
  • Radial integration map functions: @magnunor, is the plan still to remove these in 0.14? I think that is probably a decent idea; it fixes those issues and cleans up some duplication. I think the errors there are related to directly calling _map_iterate, which is now discouraged. Let me know if you want me to do this sometime this week. Fix radial_average issues due to HyperSpy map changes #812
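The ragged-array problem several of these bullets refer to can be illustrated without HyperSpy at all: a per-pattern function (such as a peak finder) returns a different number of vectors at each navigation position, so the results cannot be stacked into a rectangular array and must live in an object-dtype ("ragged") array instead. A sketch, with a hypothetical `find_vectors` function:

```python
import numpy as np

def find_vectors(pattern):
    # Hypothetical per-pattern function: returns a variable-length
    # (n_peaks, 2) array, different for each navigation position.
    n_peaks = int(pattern.sum()) % 3 + 1
    return np.zeros((n_peaks, 2))

patterns = [np.full((4, 4), i) for i in range(3)]  # 3 navigation positions

# Results have shapes (1, 2), (2, 2), (3, 2) -> they cannot be stacked
# into one rectangular array, so we store them in an object-dtype array,
# which is roughly what hyperspy's map(..., ragged=True) does internally.
results = np.empty(len(patterns), dtype=object)
for i, p in enumerate(patterns):
    results[i] = find_vectors(p)

print([r.shape for r in results])  # → [(1, 2), (2, 2), (3, 2)]
```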

@magnunor (Collaborator):

> Subpixelrefinement Generator bug fixes: @magnunor, can you submit your fix for this one when you have the chance? I got a little confused with it. (I think the previous method wasn't doing what it was supposed to anyway.)

Indeed! I think I got this all sorted in my old pull request; I'll look at it soonish.


> Radial integration map functions: @magnunor, is the plan still to remove these in 0.14? I think that is probably a decent idea; it fixes those issues and cleans up some duplication. I think the errors there are related to directly calling _map_iterate, which is now discouraged. Let me know if you want me to do this sometime this week.

The failing tests should be fixed now (I cherry-picked the commits which fixed this). With regard to removing radial_integration, I agree!


There also seems to be a bug affecting just the Windows tests: https://github.com/pyxem/pyxem/runs/5560089211?check_suite_focus=true#step:8:60; it seems to be related to dask threading?

I haven't had time to examine it properly, but it does not seem to happen on the master branch, where I just ran the Windows 3.8 workflow, which passed: https://github.com/pyxem/pyxem/runs/5560255963?check_suite_focus=true

@magnunor (Collaborator):

The subpixelrefinement generator and the integration generator issues are now fixed!

That leaves:

@magnunor (Collaborator):

The indexation generator is failing in two tests:

  • pyxem/pyxem/tests/signals/test_indexation_results.py:61: AttributeError: 'dict' object has no attribute 'get_library_entry'
  • pyxem/pyxem/tests/generators/test_indexation_generator.py:237: AttributeError: 'dict' object has no attribute 'structures'

The Windows bug seems to be fixed in #809, so the only thing remaining is the indexation issues.

@pc494 (Member, Author) commented Mar 21, 2022

Okay, these are probably my problems; I should be able to have a look this week :)

@magnunor (Collaborator):

We also need to remember to increase the HyperSpy version in setup.py before the release.
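For reference, this is the kind of pin meant here; the exact bound `>=1.7.0` is an assumption based on the RELEASE_next_minor (1.7.0) discussion above, so check the actual HyperSpy release number before pinning:

```python
# setup.py (fragment) -- hypothetical lower-bound bump for the release.
install_requires = [
    "hyperspy>=1.7.0",  # assumed bound; verify against the actual release
    # ... other dependencies unchanged ...
]
```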

@pc494 (Member, Author) commented Mar 26, 2022

> The indexation generator is failing in two tests:
>
> * `pyxem/pyxem/tests/signals/test_indexation_results.py:61`: `AttributeError: 'dict' object has no attribute 'get_library_entry'`
> * `pyxem/pyxem/tests/generators/test_indexation_generator.py:237`: `AttributeError: 'dict' object has no attribute 'structures'`
>
> The Windows bug seems to be fixed in #809, so the only thing remaining is the indexation issues.

I think both of these relate to the old version of template matching, so I would be in favour of dropping the tests as we move towards using the new, improved version in all future applications.

@CSSFrancis (Member) commented Mar 27, 2022

> I think both of these relate to the old version of template matching, so I would be in favour of dropping the tests as we move towards using the new, improved version in all future applications.

I'm okay with this. After 0.14 is released, it would be good to take a look at what our goals for pyxem are over the next year or so, primarily from the standpoint of what stays and what goes.

Is the plan still to push for a 1.0 release down the road? I know that is probably a fair bit of work, so creating a road map of releases might be a good idea, e.g. defining goals for 0.15, 0.16 and then maybe 1.0.

I have at least the following development goals before anything like a stable 1.0 release:

  • Overhaul Diffraction Vectors class and Matching

  • Overhaul Calibration generator and astigmatism correction

  • Deprecate and remove some generators in favor of better documentation (i.e. remove the virtual dark field generator in favor of using a HyperSpy ROI)

Dropping tests is probably fine in the short run, but we should have a defined way forward for things that are broken and need fixing.

@hakonanes (Member) commented Mar 27, 2022

> remove virtual dark field generator

In kikuchipy, I find it very nice to have a virtual backscatter electron (VBSE) imaging generator (the EBSD equivalent of the VDF/VBF generator) which provides convenience methods for common uses of HyperSpy ROIs (often multiple ROIs). Of course one shouldn't hide the flexibility of ROIs from users, but providing shortcuts doesn't hurt.
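The underlying idea behind such generators is simple enough to sketch in plain NumPy (this is not the pyxem/kikuchipy API; `virtual_image` is a made-up helper): a virtual image is just the integrated intensity inside a detector-plane ROI at each scan position.

```python
import numpy as np

def virtual_image(data_4d, cx, cy, r):
    """Sum intensity inside a circular detector-plane ROI per probe position.

    data_4d : (scan_y, scan_x, det_y, det_x) array.
    cx, cy, r : ROI centre and radius in detector pixels.
    """
    det_y, det_x = data_4d.shape[2:]
    yy, xx = np.ogrid[:det_y, :det_x]
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
    # Boolean mask over the two detector axes; sum the selected pixels.
    return data_4d[..., mask].sum(axis=-1)

data = np.ones((2, 3, 8, 8))          # toy 4D-STEM dataset
vdf = virtual_image(data, cx=4, cy=4, r=2)
print(vdf.shape)  # → (2, 3)
```

A generator class would simply wrap several such ROI reductions behind convenience methods, which is the shortcut being argued for here.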

> Deprecate and remove some generators

The sooner people are notified that some functionality will be removed or replaced, the better. So if we agree that some functionality will be removed and an alternative exists, marking it as deprecated doesn't cost anything.
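Marking something as deprecated really is cheap; a minimal sketch using only the standard library (this is not pyxem's actual deprecation helper, and `get_virtual_dark_field_image` is a hypothetical example name):

```python
import functools
import warnings

def deprecated(alternative):
    """Decorator emitting a DeprecationWarning pointing users to `alternative`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__name__} is deprecated; use {alternative} instead.",
                DeprecationWarning,
                stacklevel=2,  # point the warning at the caller, not the wrapper
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("hyperspy ROIs")
def get_virtual_dark_field_image(signal):
    # Hypothetical old-style generator function kept for one release cycle.
    return signal

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    get_virtual_dark_field_image("dummy")
print(caught[0].category.__name__)  # → DeprecationWarning
```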

@magnunor (Collaborator):

> Is the plan still to push for a 1.0 release down the road? I know that is probably a fair bit of work, so creating a road map of releases might be a good idea, e.g. defining goals for 0.15, 0.16 and then maybe 1.0.

I agree! I think the first step is to go through all the functionality and workflows we have in pyxem, and plan out what we want to keep, what we want to rework, and possibly what we want to remove or spin off.

@CSSFrancis (Member):

With #817 we should be back to all tests passing. The two remaining tests were just ignored, which per @pc494 is probably fine for now.

That code should be rewritten in the next couple of months, at which point I will replace those tests.

@magnunor (Collaborator):

I think the last thing which needs to be sorted out is the changes due to hyperspy/hyperspy#2830: several of the axes_manager attributes will now be protected and can't be changed directly.

We can't see these issues in the unit tests yet, as that pull request hasn't been merged. However, it should be fairly straightforward to fix them.
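The "protected attribute" pattern being referred to is roughly the following: exposing a read-only property so the attribute can no longer be reassigned directly. This is a generic sketch, not HyperSpy's actual implementation:

```python
class AxesManager:
    """Toy stand-in showing why direct assignment now fails."""

    def __init__(self, signal_axes):
        self._signal_axes = tuple(signal_axes)

    @property
    def signal_axes(self):
        # Readable as before ...
        return self._signal_axes

    # ... but there is no setter, so `am.signal_axes = ...` raises
    # AttributeError, and code must go through the supported API instead.

am = AxesManager(["x", "y"])
print(am.signal_axes)  # → ('x', 'y')
try:
    am.signal_axes = ("a", "b")
except AttributeError as exc:
    print("blocked:", type(exc).__name__)  # → blocked: AttributeError
```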

@hakonanes (Member) commented Mar 29, 2022

> We can't see these issues in the unit tests

But you can get an idea by inspecting the failing tests from the extension/integration test run in that PR seven days ago.

@pc494 (Member, Author) commented Mar 30, 2022

Okay, given that there is now published material (#818) on our new OM method, I think we should remove the old method, given that it's basically unusable. Are there any objections to this?

@CSSFrancis (Member):

It seems like the next thing we should look at before this release is the demos. I'll start going through them with HyperSpy 1.7 and 0.14 this weekend and see if everything works fairly smoothly or needs some additional work.

@din14970 (Contributor) commented Apr 7, 2022

@CSSFrancis, just a heads up: the template matching demo will for sure be broken, as I messed with some of the arguments in the functions. I will have to do a clean-up.

@magnunor (Collaborator):

There are currently 3 tests failing in the hyperspy_release_next_minor branch; I'm not 100% sure what caused them to break.

FAILED pyxem/tests/generators/test_subpixelrefinement_generator.py::Test_init_xfails::test_out_of_range_vectors_DiffractionVectors
FAILED pyxem/tests/signals/test_indexation_results.py::test_single_vector_get_crystallographic_map
FAILED pyxem/tests/signals/test_indexation_results.py::test_double_vector_get_crystallographic_map

Full test log:

________ Test_init_xfails.test_out_of_range_vectors_DiffractionVectors _________

self = <pyxem.tests.generators.test_subpixelrefinement_generator.Test_init_xfails object at 0x7fbeca352160>

    def test_out_of_range_vectors_DiffractionVectors(self):
        """Test that putting vectors that lie outside of the
        diffraction patterns raises a ValueError"""
        vectors = DiffractionVectors(np.array([[1, -100]]))
        dp = ElectronDiffraction2D(np.ones((20, 20)))
    
        with pytest.raises(
            ValueError,
            match="Some of your vectors do not lie within your diffraction pattern",
        ):
>           _ = SubpixelrefinementGenerator(dp, vectors)

/home/runner/work/pyxem/pyxem/pyxem/tests/generators/test_subpixelrefinement_generator.py:84: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pyxem.generators.subpixelrefinement_generator.SubpixelrefinementGenerator object at 0x7fbec33f0790>
dp = <ElectronDiffraction2D, title: , dimensions: (|20, 20)>
vectors = <DiffractionVectors, title: , dimensions: (2, 1|)>

    def __init__(self, dp, vectors):
        self.dp = dp
        self.vectors_init = vectors
        self.last_method = None
        sig_ax = dp.axes_manager.signal_axes
        self.calibration = [sig_ax[0].scale, sig_ax[1].scale]
        self.center = [sig_ax[0].size / 2, sig_ax[1].size / 2]
    
>       self.vector_pixels = _get_pixel_vectors(
            dp, vectors, calibration=self.calibration, center=self.center
        )

/home/runner/work/pyxem/pyxem/pyxem/generators/subpixelrefinement_generator.py:274: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

dp = <ElectronDiffraction2D, title: , dimensions: (|20, 20)>
vectors = <DiffractionVectors, title: , dimensions: (2, 1|)>
calibration = [1.0, 1.0], center = [10.0, 10.0]

    def _get_pixel_vectors(dp, vectors, calibration, center):
        """Get the pixel coordinates for the given diffraction
        pattern and vectors.
    
        Parameters
        ----------
        dp: :obj:`pyxem.signals.ElectronDiffraction2D`
            Instance of ElectronDiffraction2D
        vectors : :obj:`pyxem.signals.diffraction_vectors.DiffractionVectors`
            List of diffraction vectors
        calibration : [float, float]
            Calibration values
        center : float, float
            Image origin in pixel coordinates
    
        Returns
        -------
        vector_pixels : :obj:`pyxem.signals.diffraction_vectors.DiffractionVectors`
            Pixel coordinates for given diffraction pattern and vectors.
        """
    
        def _floor(vectors, calibration, center):
            if vectors.shape == (1,) and vectors.dtype == object:
                vectors = vectors[0]
            return np.floor((vectors.astype(np.float64) / calibration) + center).astype(int)
    
        if isinstance(vectors, DiffractionVectors):
            if vectors.axes_manager.navigation_shape != dp.axes_manager.navigation_shape:
>               raise ValueError(
                    "Vectors with shape {} must have the same navigation shape "
                    "as the diffraction patterns which has shape {}.".format(
                        vectors.axes_manager.navigation_shape,
                        dp.axes_manager.navigation_shape,
                    )
                )
E               ValueError: Vectors with shape (2, 1) must have the same navigation shape as the diffraction patterns which has shape ().

/home/runner/work/pyxem/pyxem/pyxem/generators/subpixelrefinement_generator.py:112: ValueError

During handling of the above exception, another exception occurred:

self = <pyxem.tests.generators.test_subpixelrefinement_generator.Test_init_xfails object at 0x7fbeca352160>

    def test_out_of_range_vectors_DiffractionVectors(self):
        """Test that putting vectors that lie outside of the
        diffraction patterns raises a ValueError"""
        vectors = DiffractionVectors(np.array([[1, -100]]))
        dp = ElectronDiffraction2D(np.ones((20, 20)))
    
        with pytest.raises(
            ValueError,
            match="Some of your vectors do not lie within your diffraction pattern",
        ):
>           _ = SubpixelrefinementGenerator(dp, vectors)
E           AssertionError: Regex pattern 'Some of your vectors do not lie within your diffraction pattern' does not match 'Vectors with shape (2, 1) must have the same navigation shape as the diffraction patterns which has shape ().'.

/home/runner/work/pyxem/pyxem/pyxem/tests/generators/test_subpixelrefinement_generator.py:84: AssertionError
_________________ test_single_vector_get_crystallographic_map __________________

single_match_result = OrientationResult(phase_index=0, rotation_matrix=array([[ 6.123234e-17, -1.000000e+00,  0.000000e+00],
       [ 1.0000...00000e+00]]), match_rate=0.5, error_hkls=array([0.1 , 0.05, 0.2 ]), total_error=0.1, scale=1.0, center_x=0, center_y=0)
mode = 'vector', rank = 0, key = 'total_error', descending = False

    def get_nth_best_solution(
        single_match_result, mode, rank=0, key="match_rate", descending=True
    ):
        """Get the nth best solution by match_rate from a pool of solutions
    
        Parameters
        ----------
        single_match_result : VectorMatchingResults, TemplateMatchingResults
            Pool of solutions from the vector matching algorithm
        mode : str
            'vector' or 'template'
        rank : int
            The rank of the solution, i.e. rank=2 returns the third best solution
        key : str
            The key to sort the solutions by, default = match_rate
        descending : bool
            Rank the keys from large to small
    
        Returns
        -------
        VectorMatching:
            best_fit : `OrientationResult`
                Parameters for the best fitting orientation
                Library Number, rotation_matrix, match_rate, error_hkls, total_error
        TemplateMatching: np.array
                Parameters for the best fitting orientation
                Library Number , [z, x, z], Correlation Score
        """
        if mode == "vector":
            try:
                best_fit = sorted(
>                   single_match_result[0].tolist(), key=attrgetter(key), reverse=descending
                )[rank]
E               AttributeError: 'int' object has no attribute 'tolist'

/home/runner/work/pyxem/pyxem/pyxem/utils/indexation_utils.py:98: AttributeError

During handling of the above exception, another exception occurred:

sp_vector_match_result = <VectorMatchingResults, title: , dimensions: (2|)>

    def test_single_vector_get_crystallographic_map(sp_vector_match_result):
>       _ = sp_vector_match_result.get_crystallographic_map()

/home/runner/work/pyxem/pyxem/pyxem/tests/signals/test_indexation_results.py:143: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/home/runner/work/pyxem/pyxem/pyxem/signals/indexation_results.py:253: in get_crystallographic_map
    crystal_map = self.map(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/hyperspy/signal.py:4902: in map
    result = self._map_iterate(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/hyperspy/signal.py:5000: in _map_iterate
    temp_output_signal_size, temp_output_dtype = guess_output_signal_size(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/hyperspy/misc/utils.py:1420: in guess_output_signal_size
    output = function(test_data, **kwargs)
/home/runner/work/pyxem/pyxem/pyxem/signals/indexation_results.py:59: in crystal_from_vector_matching
    best_match = get_nth_best_solution(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

single_match_result = OrientationResult(phase_index=0, rotation_matrix=array([[ 6.123234e-17, -1.000000e+00,  0.000000e+00],
       [ 1.0000...00000e+00]]), match_rate=0.5, error_hkls=array([0.1 , 0.05, 0.2 ]), total_error=0.1, scale=1.0, center_x=0, center_y=0)
mode = 'vector', rank = 0, key = 'total_error', descending = False

    def get_nth_best_solution(
        single_match_result, mode, rank=0, key="match_rate", descending=True
    ):
        """Get the nth best solution by match_rate from a pool of solutions
    
        Parameters
        ----------
        single_match_result : VectorMatchingResults, TemplateMatchingResults
            Pool of solutions from the vector matching algorithm
        mode : str
            'vector' or 'template'
        rank : int
            The rank of the solution, i.e. rank=2 returns the third best solution
        key : str
            The key to sort the solutions by, default = match_rate
        descending : bool
            Rank the keys from large to small
    
        Returns
        -------
        VectorMatching:
            best_fit : `OrientationResult`
                Parameters for the best fitting orientation
                Library Number, rotation_matrix, match_rate, error_hkls, total_error
        TemplateMatching: np.array
                Parameters for the best fitting orientation
                Library Number , [z, x, z], Correlation Score
        """
        if mode == "vector":
            try:
                best_fit = sorted(
                    single_match_result[0].tolist(), key=attrgetter(key), reverse=descending
                )[rank]
            except AttributeError:
                best_fit = sorted(
>                   single_match_result.tolist(), key=attrgetter(key), reverse=descending
                )[rank]
E               AttributeError: 'OrientationResult' object has no attribute 'tolist'

/home/runner/work/pyxem/pyxem/pyxem/utils/indexation_utils.py:102: AttributeError
----------------------------- Captured stderr call -----------------------------
WARNING:hyperspy.signal:The function you applied does not take into account the difference of units and of scales in-between axes.
WARNING:hyperspy.io:`signal_type='vector_matching'` not understood. See `hs.print_known_signal_types()` for a list of installed signal types or https://github.com/hyperspy/hyperspy-extensions-list for the list of all hyperspy extensions providing signals.
------------------------------ Captured log call -------------------------------
WARNING  hyperspy.signal:signal.py:4863 The function you applied does not take into account the difference of units and of scales in-between axes.
WARNING  hyperspy.io:io.py:639 `signal_type='vector_matching'` not understood. See `hs.print_known_signal_types()` for a list of installed signal types or https://github.com/hyperspy/hyperspy-extensions-list for the list of all hyperspy extensions providing signals.
_________________ test_double_vector_get_crystallographic_map __________________

dp_vector_match_result = <VectorMatchingResults, title: , dimensions: (2, 2|)>

    def test_double_vector_get_crystallographic_map(dp_vector_match_result):
>       _ = dp_vector_match_result.get_crystallographic_map()

/home/runner/work/pyxem/pyxem/pyxem/tests/signals/test_indexation_results.py:147: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/home/runner/work/pyxem/pyxem/pyxem/signals/indexation_results.py:253: in get_crystallographic_map
    crystal_map = self.map(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/hyperspy/signal.py:4902: in map
    result = self._map_iterate(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/hyperspy/signal.py:4998: in _map_iterate
    old_sig.inav[(0,) * len(os_am.navigation_shape)].data.compute()
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/base.py:292: in compute
    (result,) = compute(self, traverse=False, **kwargs)
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/base.py:575: in compute
    results = schedule(dsk, keys, **kwargs)
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/threaded.py:81: in get
    results = get_async(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/local.py:508: in get_async
    raise_exception(exc, tb)
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/local.py:316: in reraise
    raise exc
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/local.py:221: in execute_task
    result = _execute_task(task, data)
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/core.py:119: in _execute_task
    return func(*(_execute_task(a, cache) for a in args))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

obj = OrientationResult(phase_index=0, rotation_matrix=array([[ 6.123234e-17, -1.000000e+00,  0.000000e+00],
       [ 1.0000...1.000000e+00]]), match_rate=0.6, error_hkls=array([0.1, 0.1, 0.2]), total_error=0.3, scale=1.0, center_x=0, center_y=0)
index = (None,)

    def getitem(obj, index):
        """Getitem function
    
        This function creates a copy of the desired selection for array-like
        inputs when the selection is smaller than half of the original array. This
        avoids excess memory usage when extracting a small portion from a large array.
        For more information, see
        https://numpy.org/doc/stable/reference/arrays.indexing.html#basic-slicing-and-indexing.
    
        Parameters
        ----------
        obj: ndarray, string, tuple, list
            Object to get item from.
        index: int, list[int], slice()
            Desired selection to extract from obj.
    
        Returns
        -------
        Selection obj[index]
    
        """
>       result = obj[index]
E       TypeError: tuple indices must be integers or slices, not tuple

/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/dask/array/chunk.py:417: TypeError

@CSSFrancis (Member):

@pc494, can we just skip these tests? What do we still need for the 0.14 release?

I've run through a couple of the notebooks and everything seems to be working for the most part. I have some additional information I've been meaning to add there, which I'll do later.

As far as the outstanding PRs go, it would be good to merge:

#829
#827
#826
#823

@magnunor (Collaborator):

Sounds good! I'm going through those pull requests now.

We also need to merge the hyperspy_release_next_minor into the master branch. I can do that.

I suggest we wrap up the outstanding pull requests, merge hyperspy_release_next_minor, check that everything is working, and lastly do a release.

I think @pc494 needs to do the release itself?

@pc494 (Member, Author) commented Apr 27, 2022

Good work everyone! I think it may well be true that I need to perform this release. I'll get to work on my outstanding bits (which appear to account for a fair amount of the problems).

@magnunor (Collaborator):

I made a pull request for merging the hyperspy_release_next_minor branch into the master branch: #832

Provided no further tests fail, that pull request should be ready to merge.

Other than that, there is the indexation pull request (#823), plus fixing any currently failing tests.

@magnunor (Collaborator):

One thing to remember: increasing the HyperSpy version requirement in the conda-forge version of pyxem.

@CSSFrancis (Member):

It might be a good idea to manually add #824 as well before releasing. I can do that later tonight, once everything is merged into the master branch.

@magnunor (Collaborator):

In addition to the 3 unit tests already failing, there are 2 new ones:

FAILED pyxem/tests/signals/test_beam_shift.py::TestFullDirectBeamCentering::test_mask
FAILED pyxem/tests/signals/test_beam_shift.py::TestFullDirectBeamCentering::test_mask_lazy

We should probably have a look at those after everything is merged into the master branch.


@CSSFrancis, feel free to merge #832 after #833 is ready.


Should we aim for a release tomorrow, so that everything is ready for a HyperSpy bundle release?

@pc494 (Member, Author) commented Apr 28, 2022

@magnunor (Collaborator):

Sounds good! I'm trying to figure out why the test_beam_shift.py tests are failing; they're currently passing locally on my computer.

@magnunor (Collaborator) commented Apr 28, 2022

> Sounds good! I'm trying to figure out why the test_beam_shift.py tests are failing; they're currently passing locally on my computer.

This is resolved now! Still 3 tests failing, but I'm not sure how to fix them (or whether we need to fix them...).

FAILED pyxem/tests/generators/test_subpixelrefinement_generator.py::Test_init_xfails::test_out_of_range_vectors_DiffractionVectors
FAILED pyxem/tests/signals/test_indexation_results.py::test_single_vector_get_crystallographic_map
FAILED pyxem/tests/signals/test_indexation_results.py::test_double_vector_get_crystallographic_map

@pc494 (Member, Author) commented Apr 28, 2022

Will deal with the tests and package the release tomorrow :)

@pc494 (Member, Author) commented Apr 29, 2022

This is now underway; closing.

@pc494 pc494 closed this as completed Apr 29, 2022
@magnunor (Collaborator):

@pc494, status on the conda-forge version?

@pc494 (Member, Author) commented Apr 30, 2022

It's failing on a comparison within the testing suite. I've added you as a maintainer in the PR while I was there.

Edit: may well have been a dependencies issue.

@pc494 (Member, Author) commented Apr 30, 2022

conda-forge is now done (it was a dependencies issue)!
