Use the same API for convolution models as other models.
It turns out that the load_xsXXX() routines aren't necessary, since
the models can be created using the same interface as the other models
(after this change is made): e.g.

  ui.set_source(ui.xsphabs.gal * ui.xscflux.mdl(ui.xspowerlaw.pl))

This commit leaves in the load_xsXXX() routines, but they will probably
be removed in the next commit.
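The unified interface in the example above relies on two behaviours: accessing an attribute of a model type (e.g. `ui.xsphabs.gal`) creates a named component, and convolution components can be called on another model expression. A minimal standalone sketch of that idea (the `Model` and `ModelType` classes here are illustrative stand-ins, not Sherpa's real implementation):

```python
# Hypothetical sketch of the attribute-style model-creation interface.
# Model and ModelType are illustrative, not Sherpa's actual classes.

class Model:
    def __init__(self, name):
        self.name = name

    def __mul__(self, other):
        # Multiplying two components builds a combined model expression.
        return Model(f"({self.name} * {other.name})")

    def __call__(self, other):
        # Convolution-style components wrap another model expression.
        return Model(f"{self.name}({other.name})")

class ModelType:
    """Accessing an attribute (e.g. xsphabs.gal) creates a named component."""
    def __init__(self, typename):
        self.typename = typename

    def __getattr__(self, instname):
        return Model(f"{self.typename}.{instname}")

xsphabs = ModelType("xsphabs")
xscflux = ModelType("xscflux")
xspowerlaw = ModelType("xspowerlaw")

# Mirrors the expression in the commit message.
expr = xsphabs.gal * xscflux.mdl(xspowerlaw.pl)
print(expr.name)  # → (xsphabs.gal * xscflux.mdl(xspowerlaw.pl))
```

With this pattern, convolution models need no dedicated `load_xsXXX()` entry point: the same attribute access that builds additive and multiplicative components builds convolution components too.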
DougBurke committed Jul 21, 2015
1 parent 6375896 commit 810cae9
Showing 2 changed files with 7 additions and 2 deletions.
6 changes: 5 additions & 1 deletion sherpa/astro/ui/__init__.py
@@ -48,7 +48,9 @@
 if hasattr(sherpa.astro, 'xspec'):
     _session._add_model_types(sherpa.astro.xspec,
                               (sherpa.astro.xspec.XSAdditiveModel,
-                               sherpa.astro.xspec.XSMultiplicativeModel))
+                               sherpa.astro.xspec.XSMultiplicativeModel,
+                               sherpa.astro.xspec.XSConvolutionKernel)
+                              )

 # Perhaps everything exported by the xspec module should be exported
 # here?
@@ -64,6 +66,8 @@
 # names, to make it easier to support different versions of
 # the XSpec library.
 #
+# Is this still needed now that the models are added directly?
+#
 for n in [n for n in dir(sherpa.astro.xspec) if
           n.startswith('load_xs') and n != 'load_xsconvolve']:
     globals()[n] = getattr(sherpa.astro.xspec, n)
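The `globals()` loop in this hunk is a standard dynamic re-export pattern: scan a module's names, filter by prefix, and copy the matching attributes into the importing module's namespace. A self-contained sketch, using a `types.ModuleType` instance as a stand-in for `sherpa.astro.xspec` (the loader functions defined on it are purely illustrative):

```python
import types

# Stand-in module with a few loader-style functions; the names and
# bodies are illustrative, not the real sherpa.astro.xspec contents.
xspec = types.ModuleType("xspec")
xspec.load_xsphabs = lambda name: f"xsphabs component {name!r}"
xspec.load_xsconvolve = lambda name: None  # excluded below, as in the diff
xspec.other_helper = lambda: None          # wrong prefix, also skipped

# Same filter as the diff: prefix match, with one explicit exclusion.
exported = {}
for n in [n for n in dir(xspec) if
          n.startswith('load_xs') and n != 'load_xsconvolve']:
    exported[n] = getattr(xspec, n)  # the real code writes into globals()

print(sorted(exported))  # → ['load_xsphabs']
```

Copying into a plain dict rather than `globals()` keeps the sketch side-effect free, but the lookup and filtering logic is the same.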
3 changes: 2 additions & 1 deletion sherpa/astro/xspec/tests/test_xspec.py
@@ -164,7 +164,8 @@ def test_convolution_models(self):
         # (log 10 of this is -8.8).
         lflux = -5.0
         #xs.load_xscflux("cmdl")
-        ui.load_xscflux("cmdl")
+        #ui.load_xscflux("cmdl")
+        ui.create_model_component('xscflux', 'cmdl')

         # If the test is run directly, this is not needed (i.e. the
         # variable cmdl is defined), but if run via 'python setup.py test'
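The test now goes through the generic `create_model_component(typename, name)` entry point instead of the model-specific `load_xscflux()`. The core of that entry point is a lookup-by-name factory; a hypothetical sketch (the registry and `XScflux` class here are mock-ups, not Sherpa's internals):

```python
# Hypothetical type registry mapping model type names to classes.
class XScflux:
    """Mock stand-in for the XSPEC cflux convolution model class."""
    def __init__(self, name):
        self.name = name

_registry = {'xscflux': XScflux}

def create_model_component(typename, name):
    """Look up a model class by its string type name and instantiate it."""
    cls = _registry[typename.lower()]
    return cls(name)

cmdl = create_model_component('xscflux', 'cmdl')
print(type(cmdl).__name__, cmdl.name)  # → XScflux cmdl
```

Because the lookup is driven by a string, the same call works for any registered model type, which is what lets the convolution models drop their dedicated loader functions.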
