
bug in Gadget frontend #2819

Closed
joostvan opened this issue Aug 6, 2020 · 5 comments · Fixed by #3816
Labels
bug code frontends Things related to specific frontends

Comments

@joostvan

joostvan commented Aug 6, 2020

Bug report

Bug summary

The error message raised when using yt.load reports a different file name than the one the file actually has. Here is my data file-- it could be that testing failed for this type. This code worked for one of the sample datasets from the yt website.
(edit by @neutrinoceros: the sample dataset in question is snapshot_010)

Code for reproduction

import os.path
from os import path

import yt
# Given the traceback below, this is the yt 3.x import location for HaloCatalog
from yt.analysis_modules.halo_analysis.api import HaloCatalog

print("Is it File? " + str(path.isfile('snap_000.11')))
data_ds = yt.load('snap_000.11')
hc = HaloCatalog(data_ds=data_ds, finder_method='hop')
hc.create()

Actual outcome

---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-89-79dabe25cdb0> in <module>
      4 data_ds = yt.load('snap_000.11')
      5 hc = HaloCatalog(data_ds=data_ds, finder_method='hop')
----> 6 hc.create()

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_analysis/halo_catalog.py in create(self, save_halos, save_catalog, njobs, dynamic)
    335 
    336         """
--> 337         self._run(save_halos, save_catalog, njobs=njobs, dynamic=dynamic)
    338 
    339     def load(self, save_halos=True, save_catalog=False, njobs=-1, dynamic=False):

//anaconda3/lib/python3.7/site-packages/yt/utilities/parallel_tools/parallel_analysis_interface.py in barrierize(*args, **kwargs)
    299     def barrierize(*args, **kwargs):
    300         if not parallel_capable:
--> 301             return func(*args, **kwargs)
    302         mylog.debug("Entering barrier before %s", func.__name__)
    303         comm = _get_comm(args)

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_analysis/halo_catalog.py in _run(self, save_halos, save_catalog, njobs, dynamic)
    404         if self.halos_ds is None:
    405             # Find the halos and make a dataset of them
--> 406             self.halos_ds = self.finder_method(self.data_ds)
    407             if self.halos_ds is None:
    408                 mylog.warning('No halos were found for {0}'.format(\

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_analysis/halo_finding_methods.py in __call__(self, ds)
     42 
     43     def __call__(self, ds):
---> 44         return self.function(ds, *self.args, **self.kwargs)
     45 
     46 def _hop_method(ds, **finder_kwargs):

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_analysis/halo_finding_methods.py in _hop_method(ds, **finder_kwargs)
     49     """
     50 
---> 51     halo_list = HOPHaloFinder(ds, **finder_kwargs)
     52     halos_ds = _parse_old_halo_list(ds, halo_list)
     53     return halos_ds

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_finding/halo_objects.py in __init__(self, ds, subvolume, threshold, dm_only, ptype, padding, total_mass)
   1471         self._data_source = ds.all_data()
   1472         GenericHaloFinder.__init__(self, ds, self._data_source, padding,
-> 1473                                    ptype=ptype)
   1474         # do it once with no padding so the total_mass is correct
   1475         # (no duplicated particles), and on the entire volume, even if only

//anaconda3/lib/python3.7/site-packages/yt/analysis_modules/halo_finding/halo_objects.py in __init__(self, ds, data_source, padding, ptype)
   1209         ParallelAnalysisInterface.__init__(self)
   1210         self.ds = ds
-> 1211         self.index = ds.index
   1212         self.center = (np.array(data_source.right_edge) +
   1213                        np.array(data_source.left_edge)) / 2.0

//anaconda3/lib/python3.7/site-packages/yt/data_objects/static_output.py in index(self)
    447                 raise RuntimeError("You should not instantiate Dataset.")
    448             self._instantiated_index = self._index_class(
--> 449                 self, dataset_type=self.dataset_type)
    450             # Now we do things that we need an instantiated index for
    451             # ...first off, we create our field_info now.

//anaconda3/lib/python3.7/site-packages/yt/geometry/particle_geometry_handler.py in __init__(self, ds, dataset_type)
     37         self.directory = os.path.dirname(self.index_filename)
     38         self.float_type = np.float64
---> 39         super(ParticleIndex, self).__init__(ds, dataset_type)
     40 
     41     @property

//anaconda3/lib/python3.7/site-packages/yt/geometry/geometry_handler.py in __init__(self, ds, dataset_type)
     48 
     49         mylog.debug("Setting up domain geometry.")
---> 50         self._setup_geometry()
     51 
     52         mylog.debug("Initializing data grid data IO")

//anaconda3/lib/python3.7/site-packages/yt/geometry/particle_geometry_handler.py in _setup_geometry(self)
     48     def _setup_geometry(self):
     49         mylog.debug("Initializing Particle Geometry Handler.")
---> 50         self._initialize_particle_handler()
     51 
     52     def get_smallest_dx(self):

//anaconda3/lib/python3.7/site-packages/yt/geometry/particle_geometry_handler.py in _initialize_particle_handler(self)
    102                 ds.domain_left_edge, ds.domain_right_edge,
    103                 [N, N, N], len(self.data_files))
--> 104         self._initialize_indices()
    105         self.oct_handler.finalize()
    106         self.max_level = self.oct_handler.max_level

//anaconda3/lib/python3.7/site-packages/yt/geometry/particle_geometry_handler.py in _initialize_indices(self)
    132                 npart = data_file.total_particles[index_ptype]
    133             morton[ind:ind + npart] = \
--> 134                 self.io._initialize_index(data_file, self.regions)
    135             ind += npart
    136         morton.sort()

//anaconda3/lib/python3.7/site-packages/yt/frontends/gadget/io.py in _initialize_index(self, data_file, regions)
    351             count = sum(data_file.total_particles.values())
    352             return self._get_morton_from_position(
--> 353                 data_file, count, 0, regions, DLE, DRE)
    354         else:
    355             idpos = self._ptypes.index(self.index_ptype)

//anaconda3/lib/python3.7/site-packages/yt/frontends/gadget/io.py in _get_morton_from_position(self, data_file, count, offset_count, regions, DLE, DRE)
    330     def _get_morton_from_position(self, data_file, count, offset_count,
    331                                   regions, DLE, DRE):
--> 332         with open(data_file.filename, "rb") as f:
    333             # We add on an additionally 4 for the first record.
    334             f.seek(data_file._position_offset + 4 + offset_count * 12)

FileNotFoundError: [Errno 2] No such file or directory: '/Users/joost/Documents/research/snap_000.0'
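
A likely reading of the traceback (hedged; this is an illustration, not yt source): the Gadget frontend treats a numeric filename suffix as the piece index of a multi-part snapshot, so a name like snap_000.11 can lead the index machinery to derive sibling piece names such as snap_000.0 that do not exist on disk. A minimal sketch of that suffix interpretation, with a hypothetical helper name:

```python
import re

def split_gadget_suffix(filename):
    # Hypothetical helper (not yt API): split a "<base>.<piece>" name
    # the way a multi-part Gadget snapshot filename would be read.
    match = re.match(r"^(?P<base>.+)\.(?P<piece>\d+)$", filename)
    if match:
        return match.group("base"), int(match.group("piece"))
    return filename, None

# "snap_000.11" parses as piece 11 of base "snap_000", so code that
# reconstructs sibling pieces may go looking for "snap_000.0" even
# though the dataset is a single file.
```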

Expected outcome

A halo catalog should be generated.

Version Information

  • Operating System: OSX
  • Python Version: 3.7.3
  • yt version: 3.6
  • Other Libraries (if applicable):

yt was installed via pip.

@welcome

welcome bot commented Aug 6, 2020

Hi, and welcome to yt! Thanks for opening your first issue. We have an issue template that helps us gather the relevant information needed to diagnose and fix the issue.

@triage-new-issues triage-new-issues bot added the triage Triage needed label Aug 6, 2020
@neutrinoceros neutrinoceros added bug code frontends Things related to specific frontends and removed triage Triage needed labels Aug 6, 2020
@neutrinoceros
Member

I have trouble reproducing this on my machine since it relies on the yt_astro_analysis package, which seems to be broken with the most recent version of yt itself. Namely, it fails to import because we stopped vendoring six in yt/extern/:

from yt.extern.six.moves import zip as izip

@brittonsmith, I believe you're the most competent person to answer this? :-)

@neutrinoceros
Member

I'm sorry it has been this long... Now that yt-astro-analysis is stabilised for yt 4.x, I was able to investigate this a little better. What I discovered is that at some point in the back-and-forth dataflow between yt and yt-astro, we iterate over a series of files without checking whether they are present first.
I think the fix is trivial, though I do not understand what's happening well enough to guarantee that it is correct. I'll open a PR and request a review from Britton.
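
The shape of the guard being described might look like this (an illustrative sketch under my own assumptions, not the actual PR; the function and parameter names are made up):

```python
import os

def existing_pieces(basename, n_pieces):
    # Illustrative only: keep just the snapshot pieces that are
    # actually present on disk, instead of assuming every
    # "<base>.<i>" for i in range(n_pieces) exists before opening it.
    candidates = ["{}.{}".format(basename, i) for i in range(n_pieces)]
    return [p for p in candidates if os.path.exists(p)]
```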

@brittonsmith
Member

@neutrinoceros thanks for remembering this issue. I had completely forgotten about it. I will check the PR out when you have it.

@neutrinoceros
Member

@brittonsmith it's there #3816
