Commit
Refs #11607. Additional parameter for setting threshold.
Also added paragraph to documentation.
Michael Wedel committed Apr 22, 2015
1 parent e85b3a4 commit 4cb086a
Showing 3 changed files with 28 additions and 5 deletions.
@@ -10,6 +10,7 @@ class PoldiLoadRuns(PythonAlgorithm):
_nameTemplate = ""
_mergeCheckEnabled = True
_autoMaskBadDetectors = True
_autoMaskThreshold = 3.0

def category(self):
return "SINQ\\Poldi"
@@ -50,6 +51,11 @@ def PyInit(self):
doc=('Automatically disable detectors with unusually small or large values, in addition'
' to those masked in the instrument definition.'))

self.declareProperty('BadDetectorThreshold', 3.0, direction=Direction.Input,
doc=('Detectors are masked based on how much their intensity (integrated over time) '
'deviates from the median calculated from all detectors. This parameter indicates '
'how many times bigger the intensity needs to be for a detector to be masked.'))

self.declareProperty(WorkspaceProperty(name='OutputWorkspace',
defaultValue='',
direction=Direction.Output),
@@ -100,6 +106,7 @@ def PyExec(self):

# The same for removing additional dead or misbehaving wires
self._autoMaskBadDetectors = self.getProperty('MaskBadDetectors').value
self._autoMaskThreshold = self.getProperty('BadDetectorThreshold').value

# Get a list of output workspace names.
outputWorkspaces = self.getLoadedWorkspaceNames(year, mergeRange, mergeWidth)
@@ -201,8 +208,8 @@ def loadAndTruncateData(self, workspaceName, year, j):
def autoMaskBadDetectors(self, currentTotalWsName):
Integration(currentTotalWsName, OutputWorkspace='integrated')

MedianDetectorTest('integrated', SignificanceTest=3.0, HighThreshold=self._autoMaskThreshold, HighOutlier=200,
                   CorrectForSolidAngle=False, OutputWorkspace='maskWorkspace')

MaskDetectors(Workspace=AnalysisDataService.retrieve(currentTotalWsName), MaskedWorkspace='maskWorkspace')

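The masking chain above (Integration, then MedianDetectorTest with a relative HighThreshold, then MaskDetectors) can be approximated outside Mantid with a short numpy sketch. This is an illustrative approximation based on the property documentation (intensity more than `threshold` times the median is masked), not a reimplementation of MedianDetectorTest, and `mask_bad_detectors` is a hypothetical helper name:

```python
import numpy as np

def mask_bad_detectors(counts, threshold=3.0):
    """Return indices of detectors whose time-integrated intensity
    exceeds `threshold` times the median over all detectors."""
    integrated = counts.sum(axis=1)   # analogous to algm-Integration
    median = np.median(integrated)    # reference level for the test
    return np.flatnonzero(integrated > threshold * median)

# Three ordinary detectors and one outlier with roughly ten times
# the median integrated intensity.
counts = np.array([[10, 12], [11, 9], [9, 11], [100, 110]])
print(mask_bad_detectors(counts, threshold=3.0))  # -> [3]
```

Lowering the threshold masks more detectors, which is what the stress test below exercises with thresholds 2.5 and 2.0.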
@@ -4,6 +4,7 @@
from mantid.api import *
import numpy as np


class POLDILoadRunsTest(stresstesting.MantidStressTest):
"""This assembly of test cases checks that the behavior of PoldiLoadRuns is correct."""

@@ -132,11 +133,22 @@ def loadWorkspacesDontOverwriteOther(self):

def checkRemoveBadDetectors(self):
# Determine bad detectors automatically
twoWorkspacesMerged = PoldiLoadRuns(2013, 6903, 6904, 2, MaskBadDetectors=True,
                                    BadDetectorThreshold=2.5)

wsMerged = AnalysisDataService.retrieve("twoWorkspacesMerged_data_6904")
self.assertEquals(len([True for x in range(wsMerged.getNumberHistograms()) if wsMerged.getDetector(
x).isMasked()]), 36)

self.clearAnalysisDataService()

# Lower threshold, more excluded detectors
twoWorkspacesMerged = PoldiLoadRuns(2013, 6903, 6904, 2, MaskBadDetectors=True,
BadDetectorThreshold=2.0)

wsMerged = AnalysisDataService.retrieve("twoWorkspacesMerged_data_6904")
self.assertEquals(len([True for x in range(wsMerged.getNumberHistograms()) if wsMerged.getDetector(
    x).isMasked()]), 49)

self.clearAnalysisDataService()

6 changes: 5 additions & 1 deletion Code/Mantid/docs/source/algorithms/PoldiLoadRuns-v1.rst
@@ -9,7 +9,11 @@
Description
-----------

This algorithm makes it easier to load POLDI data. Besides importing the raw data (:ref:`algm-LoadSINQ`), it performs the otherwise manual steps of instrument loading (:ref:`algm-LoadInstrument`) and truncation (:ref:`algm-PoldiTruncateData`). To make the algorithm more useful, it is possible to load data from multiple runs by specifying a range. In many cases, data files need to be merged in a systematic manner, which is also covered by this algorithm; a parameter specifies how the files of the given range should be merged. The loaded workspaces are named following the scheme `group_data_run`, where `group` is the name specified in `OutputWorkspace` and `run` is the run number, and they are placed into a WorkspaceGroup with the name given in `OutputWorkspace`.

By default, detectors whose integrated intensity is unusually large compared to the median of the integrated
intensities of all detectors are excluded once they exceed a certain threshold. The threshold can be adjusted
using an additional parameter. Detectors that are masked in the instrument definition are always masked.
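The effect of the threshold (a lower value excludes more detectors) can be illustrated with a small numpy sketch. The data and the median-ratio criterion are illustrative assumptions based on the parameter description; the exact statistic used internally may differ:

```python
import numpy as np

# Integrated intensities for ten hypothetical detectors; two run hot.
intensities = np.array([10.0, 11.0, 9.0, 10.0, 12.0,
                        10.0, 25.0, 11.0, 10.0, 40.0])
median = np.median(intensities)  # 10.5 for this data

for threshold in (3.0, 2.0):
    # A detector is masked once it exceeds threshold * median.
    n_masked = int(np.sum(intensities > threshold * median))
    print(threshold, n_masked)  # 3.0 masks 1 detector, 2.0 masks 2
```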

The data loaded in this way can be used directly for further processing with :ref:`algm-PoldiAutoCorrelation`.
