Add ability to do IRSA dust query on multiple objects at once #684

Open
eteq opened this issue Apr 28, 2016 · 5 comments

Comments

@eteq (Member) commented Apr 28, 2016

The IRSA dust service has an "Upload Table" option, which should make it easy to parallelize IRSA dust queries. Using that should satisfy #682 for IRSA dust.

I think this is probably best implemented as a new function/method, maybe IrsaDust.get_extinctions or similar, but it might instead make sense to make it part of IrsaDust.get_extinction_table when a non-scalar coordinate is input. The output table would look quite different, though, which makes me think a new method is better.
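
Roughly, the usage I have in mind is something like this (a sketch only; IrsaDust.get_extinctions does not exist yet, and the method name and output layout are just placeholders):

# Hypothetical usage of the proposed multi-coordinate method.
# Nothing here exists yet; "get_extinctions" and the returned columns
# are placeholders for whatever we settle on.
from astropy.coordinates import SkyCoord
import astropy.units as u
from astroquery.irsa_dust import IrsaDust

# A non-scalar SkyCoord holding several targets at once.
coords = SkyCoord(ra=[10.68, 83.82, 201.37] * u.deg,
                  dec=[41.27, -5.39, -43.02] * u.deg)

# One server-side "Upload Table" query for all targets, returning a single
# table with one row (or group of rows) per input coordinate.
ext = IrsaDust.get_extinctions(coords)
print(ext)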

I might be able to do this in the next few weeks, but I'm not certain, so if anyone else wants to volunteer, please say so!

@keflavich (Contributor) commented

I agree that a new get_extinction_tables method is probably the right way to go.

@bsipocz (Member) commented May 3, 2016

I assume one can query a single object with the "Upload Table" API, right? Is it possible to get the same bandpass info somehow? (I didn't manage to, but I only had a quick look at it.)

If there isn't any performance penalty in getting the same info, I would vote for deprecating the current IrsaDust.get_extinction_table and keeping only the new upload-table version (with the same name, a new name, or whatever); otherwise, just add a new method.
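
If we do go the deprecation route, the old method could simply be marked with astropy's deprecated decorator, roughly like this (the class layout, version string, and the get_extinctions name are all assumptions, since nothing is decided yet):

# Rough sketch of the deprecation path, assuming a new get_extinctions
# method replaces get_extinction_table (all names here are placeholders).
from astropy.utils.decorators import deprecated

class IrsaDustClass:
    @deprecated(since="0.3", alternative="get_extinctions")
    def get_extinction_table(self, coordinate):
        ...

    def get_extinctions(self, coordinates):
        # New upload-table based query accepting non-scalar coordinates.
        ...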

@guillochon (Contributor) commented

Hi all, in my opinion the IRSA dust functionality would be better done locally on a user's machine, especially when querying many coordinates, since the S&F values (or the original Schlegel values) are just interpolations over a pair of FITS files (I have some Python that does this here: https://github.com/guillochon/dustmap/blob/master/sfd.py). However, the data files are a few hundred MB in size, which suggests the module should download and cache them on the user's machine (just once, as they'll never change). I'm not quite sure how to make astroquery do that (anyone have an example of where this is done?).
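
For the caching part, something like astropy's download_file with cache=True might be enough: the first call downloads the file, and later calls just return the cached local path. A minimal sketch (the URL below is a placeholder, not the real location of the SFD/S&F maps):

# Sketch of one-time download-and-cache of a dust map file using astropy.
# The URL is a placeholder; the real SFD maps are a pair of FITS files
# a few hundred MB in total.
from astropy.utils.data import download_file

SFD_NGP_URL = "https://example.org/SFD_dust_4096_ngp.fits"  # placeholder

# First call downloads and caches; later calls return the cached path.
local_path = download_file(SFD_NGP_URL, cache=True, show_progress=True)
print(local_path)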

Does everyone think this would be useful, or would it be better to create a new astroquery module to do this? I believe the implementation might only require a few dozen lines of Python.

@lpsinger (Contributor) commented Oct 6, 2017

Has anyone figured out how to do this using the service URL, IrsaDust.DUST_SERVICE_URL, rather than scraping the web form?

@lpsinger (Contributor) commented Oct 6, 2017

Ahh, too bad:

<results status="error">
  <message>
    Sorry, table upload is not an option for Program Interface mode.
  </message>
</results>
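
For reference, a single-object query against that endpoint should work as a plain GET with the locstr parameter astroquery already sends; it's only the table upload that the program interface refuses, per the response above. A minimal sketch, assuming the endpoint is directly reachable:

# Single-object query against the program-interface endpoint.
# "locstr" is the parameter astroquery's IrsaDust module sends; treating
# the endpoint as a plain GET target is an assumption of this sketch.
import requests
from astroquery.irsa_dust import IrsaDust

response = requests.get(IrsaDust.DUST_SERVICE_URL, params={"locstr": "M31"})
response.raise_for_status()
print(response.text[:500])  # XML results for a single target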
