Tool to get SFD dustmap values #733
This would be kickass, so I'd be happy to help!
I'd also like to see this happen (although I'm no dust expert). To get the ball rolling, here are some initial thoughts I had about interface and implementation:
Does this belong in …?
@astrofrog I was wondering the same about ….
I think this does not belong in …. I agree with @kbarbary that one function seems fine, as well as …. As for where it should live, that's an excellent question: I don't think it belongs in …. Maybe this needs to be in its own place? In the future there will probably be something for photometry, and this might make sense there. Alternatively, we could make something like ….
Oh, and now that I look a bit more, it's not obvious to me that there's a good way to get the IRSA form of the data - the intent of that site seems to be pretty much for people viewing things on the web, so parsing the output might be needlessly complex.
@eteq I have the same thinking about this being a fundamentally necessary tool for doing other things (and also the same extragalactic bias). Location-wise, I feel like this belongs in a grab-bag of tools (that's where I would look first as a user). There's a programmable interface for IRSA described here: …. It returns an XML document, so you can parse the values out in about five lines of code.
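The parsing step described above can be sketched in a few lines with the standard library. The XML shape and element name below (`refPixelValueSFD`) are assumptions for illustration only; check the actual output of the IRSA dust service before relying on them:

```python
import xml.etree.ElementTree as ET

# A trimmed sample of the kind of XML the IRSA dust service returns.
# Element names here are assumptions for illustration; verify them
# against the real service output.
sample = """<results status="ok">
  <result>
    <statistics>
      <refPixelValueSFD>0.0354 (mag)</refPixelValueSFD>
    </statistics>
  </result>
</results>"""

root = ET.fromstring(sample)
node = root.find(".//refPixelValueSFD")
# The value carries a unit suffix, so split it off before parsing.
ebv = float(node.text.split()[0])
print(ebv)  # -> 0.0354
```

In real use the `sample` string would come from an HTTP request to the service, with the target coordinates encoded in the query URL.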
Small comment: as a non-extragalactic astronomer, I think we would need to make sure it's made obvious that this is the total extinction through the Galaxy; for us galactic observers, dust extinction is a lot more complex. Now for the bigger issue: to me, SFD isn't more fundamental than, say, SIMBAD, which is currently in …. I think there should be a bigger discussion on any kind of online querying in Astropy in future. We need to find ways to make it robust in the long term (e.g. developing an API layer we can control on a server we control, which translates the calls if the APIs ever change). Once we have a good idea how we can manage this, then I'm all for including things like SFD, SIMBAD, and VizieR (imagine …).
Naive question - is there a VO service for the SFD maps? If so, it could potentially go in ….
As to where this could belong once it did make it to the core - how about …?
@astrofrog Pretty reasonable. Just to be clear, you're only talking about the functionality for querying IRSA, not downloading and caching the maps, right? Seems like your comment also applies to downloading the maps directly from an externally hosted site, such as the Princeton site. But if we had the maps on an astropy-controlled server, you'd be OK with the download-and-cache function being in the core package?
For location I'm more concerned with the function that uses the maps directly, and I don't think ….
@kbarbary - yes, if we hosted it ourselves, it would be easier to make sure the API doesn't break, but I'm not necessarily advocating storing everything ourselves, since in some cases the datasets are huge, but we can also provide API translation. Basically, I'm just suggesting that once 0.2 is out, we start a discussion on the mailing list or a Google hangout about how to deal with online services in general.
Regarding location, what do you mean by the function that uses the maps directly? If it's used to correct photometry, then it should go into ….
I mean a function originally described in the issue which would be analogous to, e.g., DUST_GETVAL in IDL, but with automatic download and caching of the maps. I said …. In supernova measurements, we tend to separate "photometry" from these dust corrections: photometry measurements are summing up flux on your images; dust corrections are something you do after you've determined the total observed flux of the source. This distinction is useful: for example, I would use these dust corrections for simulations that don't even go to the image level, and otherwise wouldn't touch ….
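A DUST_GETVAL-style local lookup starts by mapping Galactic coordinates onto SFD map pixels. Below is a minimal sketch of that step, following the Lambert azimuthal equal-area projection convention used by the original dust_getval code; reading the FITS maps and interpolating the dust values is left out, and the function name is just for illustration:

```python
import math

def sfd_pixel(l_deg, b_deg, size=4096):
    """Map Galactic (l, b) in degrees to (x, y) pixel coordinates
    in an SFD dust map.

    Follows the Lambert azimuthal equal-area convention from the
    original dust_getval code: n is +1 for the NGP map (b >= 0)
    and -1 for the SGP map.
    """
    n = 1.0 if b_deg >= 0 else -1.0
    l = math.radians(l_deg)
    b = math.radians(b_deg)
    r = math.sqrt(1.0 - n * math.sin(b))
    x = size / 2 * r * math.cos(l) + (size / 2 - 0.5)
    y = -size / 2 * n * r * math.sin(l) + (size / 2 - 0.5)
    return x, y

# The Galactic pole lands exactly at the map center.
print(sfd_pixel(0.0, 90.0))  # -> (2047.5, 2047.5)
```

The full tool would then open the appropriate hemisphere file (SFD_dust_4096_ngp.fits or SFD_dust_4096_sgp.fits) and interpolate the map at these pixel coordinates.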
IMHO, caching is only a convenience - you're still querying a dataset/database (note, for the reason you suggest, I would not support …).
Agreed. And ….
I still think this isn't quite the same as …. I see @astrofrog's point on the IRSA version, though. That requires an actual API and interfacing, so all the other issues in …. So I think @astrofrog is right that we need a wider discussion on the matter, but I think we need this function in some form no matter what.
Just to clarify my position - I think the base code for SFD should be in ….
Hmm, we might actually be talking across each other, @astrofrog - what do you mean by the "base" code? The main thing I'm thinking about in this issue is code that does the following: …
So which, if any, of those are you thinking should be in …? It might not have been clear from the original post, but I'm thinking of this as a useful convenience function like ….
@eteq - I would personally put 1-4 in ….
A similar tool that would be interesting to include for X-ray astronomy is a function to obtain the total galactic neutral hydrogen column density for given coordinates. A good all-sky survey to use is the Leiden/Argentine/Bonn (LAB) Survey of Galactic HI, found in VizieR at http://vizier.cfa.harvard.edu/viz-bin/Cat?VIII/76 . It is similar to the SFD dustmap in that a simple download of the catalog is not enough; some interpolation would be required to get the value at the given coordinates. There is an FTOOL that does exactly this, called …. Another useful tool would be a calculator of galactic gamma-ray diffuse emission from code such as GALPROP, which is used in Fermi/LAT analysis. However, I believe there is no available table-like output, just the code to generate your own. An all-sky, low-resolution map could be distributed with astropy, or an astropy-hosted higher-resolution map could be used.
@astrofrog - after both forgetting about this and then thinking about it for a little while, what I was trying to say above is that it seems to me that 1, 3, and 4 are not "query" related, because they're all about things like coordinates and mapping, and could apply just as well if you downloaded the dust maps yourself. That is, they are not tied to a specific query or web interface, but rather to a specific dataset (which is very unlikely to be modified ever again, although the ways of querying it might be). The idea is that the dust maps just give you the (for example) E(B-V) values in a given sightline, and after that it's up to you what to do with them (sometimes it's photometric corrections, sometimes it's correcting a spectrum, and sometimes it's computing a column density or something). It may be this is just semantics, though - it sounds like we agree that something like this should be in the core; the question is just exactly which parts go in which subpackages. (As an aside, there are plenty of other sources of dust maps, but most of them make sense to approach as a query rather than actually downloading maps - that's what astropy/astroquery#82 is about.)
@eteq - I wonder whether part of this could be implemented as: given an NDData object with WCS information, and given a list of coordinate objects, extract e.g. the nearest pixel, or interpolate at that position. This is something that would be generically useful and then could be used here. Maybe this is finally a chance to try and get useful functionality into NDData? It would finally also give us an excuse to sub-class NDData into an 'Image/Map' class which is a 2-d map where both WCS coordinates are sky coordinates (maybe 'SkyMap'?).
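The generic piece of that idea - interpolating a 2-D map at a fractional pixel position - can be sketched with a minimal bilinear interpolator. In the NDData scenario the pixel coordinates would come from a WCS world-to-pixel conversion; this example takes pixel coordinates directly to stay self-contained, and the function name is just for illustration:

```python
def bilinear(grid, x, y):
    """Bilinearly interpolate a 2-D grid (list of rows) at pixel (x, y).

    In the NDData/WCS scenario, (x, y) would come from converting a
    sky coordinate to pixel coordinates; here they are given directly.
    """
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    # The four surrounding pixel values.
    v00 = grid[y0][x0]
    v10 = grid[y0][x0 + 1]
    v01 = grid[y0 + 1][x0]
    v11 = grid[y0 + 1][x0 + 1]
    # Weight each corner by the fractional distance to the opposite one.
    return (v00 * (1 - dx) * (1 - dy) + v10 * dx * (1 - dy)
            + v01 * (1 - dx) * dy + v11 * dx * dy)

grid = [[0.0, 1.0],
        [2.0, 3.0]]
print(bilinear(grid, 0.5, 0.5))  # -> 1.5
```

Nearest-pixel extraction is the degenerate case of the same lookup, rounding (x, y) instead of weighting the four neighbours.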
This might also be useful for situations where you have a 3-or-greater dimension grid of data and want to interpolate between data points. For example, suppose you have a grid of galaxy spectra models for various ages and metallicities: I don't know of an easy way to extract a single spectrum, or flux values, for a single given ….
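The grid-of-model-spectra case reduces to multilinear interpolation along the parameter axes. Here is a 1-D sketch with a hypothetical two-age grid (all names and values are illustrative); for an age x metallicity grid, or higher dimensions, the same idea generalizes, e.g. via scipy.interpolate.RegularGridInterpolator:

```python
# Hypothetical grid: model flux at 3 wavelength bins, tabulated at two ages.
ages = [1.0, 5.0]          # Gyr
spectra = {
    1.0: [1.0, 2.0, 3.0],  # flux per wavelength bin at age 1 Gyr
    5.0: [3.0, 4.0, 5.0],  # flux per wavelength bin at age 5 Gyr
}

def spectrum_at(age):
    """Linearly interpolate the tabulated spectra to an arbitrary age.

    This is the 1-D version of the problem; for an age x metallicity
    grid the same weighting is applied along each parameter axis in turn
    (bilinear, trilinear, and so on).
    """
    lo, hi = ages[0], ages[1]
    t = (age - lo) / (hi - lo)  # fractional position between the two models
    return [(1 - t) * a + t * b for a, b in zip(spectra[lo], spectra[hi])]

print(spectrum_at(3.0))  # -> [2.0, 3.0, 4.0]
```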
👍 to @astrofrog's idea here. That's an excellent generalization of this use case. I think the starting point would be to use ….
I just saw this discussion by chance, and while I'm not interested in dustmap values, I am interested in astroquery and world-coordinate-based lookup / interpolation (there's issue #619; it's probably just a few lines of code, but so far it hasn't been clear where to put this in astropy - please comment there and I can do it). Some of the issues mentioned are on the agenda for discussion at the astroquery hangout on Friday (announcement, agenda). Note that in astroquery there are already interfaces to Vizier, IRSA, Fermi.
There is now astroquery.irsa_dust
@eteq - can this be closed now that it's in astroquery?
There's still a need to get values out of the full locally stored FITS images. When you're doing thousands of positions, the web query can take a significant amount of time (it was dominant in a real use case I had).
@astrofrog - It's not quite the same thing as this issue, I think. EDIT: posted this before seeing @kbarbary's, but same answer 😉
Yes, it all depends on your use: for a few objects, you don't want to download the SFD dust maps because they are relatively large (200 MB). On the other hand, for a large number of extinction calls, the query is slow.
@eteq had an excellent method in "astropysics" to return E(B-V) values for multiple sources; it was quick, and one needed to first download SFD_dust_4096_ngp.fits and SFD_dust_4096_sgp.fits (granted). I think astropy + affiliated packages (incl. astroquery) currently do not efficiently support an E(B-V) query for a large number of sources. If true, would it not be worth reconsidering including the astropysics code in astropy (as mentioned by Erik in one of the related tickets)?
I haven't looked at the implementation in astropysics lately, but we have similar code in sncosmo that could be upstreamed: http://sncosmo.readthedocs.org/en/v1.1.x/api/sncosmo.SFD98Map.html (class interface). It also requires the user to download the maps manually, just because I felt dodgy about automatically dumping 200 MB+ into ….
I guess being able to access many E(B-V) values (for many sources) …. Do I understand correctly that the newer 2011 E(B-V) values are just E(B−V)_2011 = 0.86 × E(B−V)_1998?
Please don't use LAB data anymore. Even the main author of the data release paper (Peter Kalberla, a colleague of mine) considers it deprecated. There is a much better alternative now: HI4PI. HI column densities are provided in HEALPix and various other projections.
@eteq - Given that this is available in other packages now (see comment #733 (comment) by @kbarbary above), maybe close this issue from 2013? Or would you still prefer to add / maintain this feature in Astropy core?
Hi, I am triaging and would like to check whether this issue should be kept open. Would opening a new issue and continuing the discussion there, in light of the developments leading up to late Feb 2019, be more favorable? Please advise. Thanks!
I'm going ahead and closing this issue. As @kakirastern pointed out, not much happened in core about it, but in the meantime other packages and affiliated packages have been created to cover the topic.
(This is based in part on the discussion in #700)
We should add a function/class that retrieves the Schlegel, Finkbeiner, and Davis 1998 dustmaps, caches them, and uses them to get E(B-V) at arbitrary locations (specified by objects from astropy.coordinates). We probably also should include the R_X coefficients from the SFD paper, so people can get out extinctions directly in their preferred bandpass. We might also include the improved Schlafly & Finkbeiner 2011 calibrations, although those might not be "standard" enough yet.
Note that there's already a python/numpy implementation of the interpolation scheme in astropysics. So most of the work is already done between that and the astropy data/caching framework.
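For the R_X piece, the usual relation is that the extinction in a bandpass X is A_X = R_X × E(B-V). A minimal sketch, with placeholder coefficients (not the actual SFD98 table values):

```python
# R_X coefficients per bandpass. These numbers are hypothetical
# placeholders for illustration; the real values come from the SFD98
# paper (or the Schlafly & Finkbeiner 2011 recalibration).
R_X = {"U": 5.4, "B": 4.3, "V": 3.3}

def extinction(ebv, band):
    """Return the extinction A_X in magnitudes for a given E(B-V)."""
    return R_X[band] * ebv

print(extinction(0.5, "V"))  # -> 1.65
```

A real implementation would ship the coefficient table with the package, so the user only supplies a coordinate and a bandpass name.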