Merged
Commits
28 commits
ec7ded3
add types for normalization workflow
nvaytet Oct 7, 2025
0731d2f
Merge branch 'main' into image-norm-wf
nvaytet Oct 9, 2025
4ea65e5
start adding providers for the standard normalization part of the wor…
nvaytet Oct 10, 2025
90c8e65
add question about how to perform the normalization by proton charge
nvaytet Oct 10, 2025
362e3b5
Merge branch 'main' into image-norm-wf
nvaytet Oct 20, 2025
d91fa67
start specializing normalize by proton charge part of workflow for or…
nvaytet Oct 20, 2025
238af65
fix proton charge normalization
nvaytet Oct 21, 2025
5f6b893
use the constraints on the workflow to load the proton charge and exp…
nvaytet Oct 21, 2025
4ae5414
cleanup
nvaytet Oct 22, 2025
aca88d2
use FluxNormalized to distinguish between normalized by proton-charge…
nvaytet Nov 4, 2025
f178656
Merge branch 'main' into image-norm-wf
nvaytet Nov 4, 2025
4332a55
fix last parts of the workflow, adding uncertainty broadcast mode
nvaytet Nov 4, 2025
49d1f9c
remove unused import
nvaytet Nov 4, 2025
1a18dde
add notebook that made tbl images from ymir data
nvaytet Nov 7, 2025
876d9c4
Merge branch 'main' into image-norm-wf
nvaytet Nov 7, 2025
b73f6e4
update data registry with new lego files
nvaytet Nov 7, 2025
75bd839
add image normalization workflow tests
nvaytet Nov 7, 2025
c7fd7b0
add notebook to docs
nvaytet Nov 7, 2025
9b88bc5
formatting
nvaytet Nov 7, 2025
591ded1
require essreduce>=25.11.1
nvaytet Nov 7, 2025
c76cfcd
update deps
nvaytet Nov 7, 2025
0a401d5
remove commented code
nvaytet Nov 7, 2025
62f3d8d
fix file paths in notebook
nvaytet Nov 7, 2025
20b610c
use new component_types constraints
nvaytet Nov 11, 2025
bc0c9cd
use load_from_path from most recent essreduce
nvaytet Nov 19, 2025
086e9b6
remove unused import
nvaytet Nov 19, 2025
e6e6b0e
bump essreduce version
nvaytet Nov 19, 2025
d9a06bd
update deps
nvaytet Nov 19, 2025
1 change: 1 addition & 0 deletions docs/tbl/index.md
@@ -6,5 +6,6 @@ maxdepth: 1
---

tbl-data-reduction
orca-image-normalization
tbl-make-tof-lookup-table
```
172 changes: 172 additions & 0 deletions docs/tbl/orca-image-normalization.ipynb
@@ -0,0 +1,172 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "0",
"metadata": {},
"source": [
"# TBL: Orca image normalization workflow\n",
"\n",
"This notebook shows how to use the workflow to compute normalized images recorded by the Orca detector on the TBL instrument."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1",
"metadata": {},
"outputs": [],
"source": [
"import ess.tbl.data # noqa: F401\n",
"from ess import tbl\n",
"from ess.imaging.types import *\n",
"import scipp as sc\n",
"import plopp as pp\n",
"\n",
"%matplotlib widget"
]
},
{
"cell_type": "markdown",
"id": "2",
"metadata": {},
"source": [
"## Workflow setup"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3",
"metadata": {},
"outputs": [],
"source": [
"wf = tbl.OrcaNormalizedImagesWorkflow()\n",
"\n",
"wf[Filename[SampleRun]] = tbl.data.tbl_lego_sample_run()\n",
"wf[Filename[DarkBackgroundRun]] = tbl.data.tbl_lego_dark_run()\n",
"wf[Filename[OpenBeamRun]] = tbl.data.tbl_lego_openbeam_run()\n",
"wf[NeXusDetectorName] = 'orca_detector'\n",
"\n",
"wf[MaskingRules] = {} # No masks to begin with\n",
"wf[UncertaintyBroadcastMode] = UncertaintyBroadcastMode.upper_bound"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4",
"metadata": {},
"outputs": [],
"source": [
"wf.visualize(NormalizedImage, compact=True, graph_attr={\"rankdir\": \"LR\"})"
]
},
{
"cell_type": "markdown",
"id": "5",
"metadata": {},
"source": [
"## Run the workflow\n",
"\n",
"We compute the final normalized image:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "6",
"metadata": {},
"outputs": [],
"source": [
"image = wf.compute(NormalizedImage)\n",
"image"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7",
"metadata": {},
"outputs": [],
"source": [
"pp.slicer(image, autoscale=False)"
]
},
{
"cell_type": "markdown",
"id": "8",
"metadata": {},
"source": [
"## Adding masks\n",
"\n",
"If we want to mask some part of the image, we update the masking rules.\n",
"For example, here we mask the upper part of the image:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9",
"metadata": {},
"outputs": [],
"source": [
"wf[MaskingRules] = {'y_pixel_offset': lambda x: x > sc.scalar(0.082, unit='m')}\n",
"\n",
"pp.slicer(wf.compute(NormalizedImage), autoscale=False)"
]
},
{
"cell_type": "markdown",
"id": "10",
"metadata": {},
"source": [
"## Intermediate results\n",
"\n",
"We can also inspect intermediate results, which is useful for debugging:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "11",
"metadata": {},
"outputs": [],
"source": [
"results = wf.compute([\n",
" RawDetector[SampleRun],\n",
" CorrectedDetector[SampleRun],\n",
" BackgroundSubtractedDetector[SampleRun]\n",
"])\n",
"\n",
"fig = pp.tiled(2, 2, hspace=0.3, wspace=0.3)\n",
"fig[0, 0] = results[RawDetector[SampleRun]]['time', 0].plot(title='Raw data')\n",
"fig[0, 1] = results[CorrectedDetector[SampleRun]]['time', 0].plot(title='Masks applied')\n",
"fig[1, 0] = results[BackgroundSubtractedDetector[SampleRun]]['time', 0].plot(title='Background subtracted')\n",
"fig[1, 1] = image['time', 0].plot(title='Final image')\n",
"fig"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.7"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -38,7 +38,7 @@ dependencies = [
"scippneutron>=24.12.0",
"scippnexus>=23.11.1",
"tifffile>=2024.7.2",
"essreduce>=25.11.0",
"essreduce>=25.11.2",
"scitiff>=25.7",
]

2 changes: 1 addition & 1 deletion requirements/base.in
@@ -10,5 +10,5 @@ scipp>=25.4.0
scippneutron>=24.12.0
scippnexus>=23.11.1
tifffile>=2024.7.2
essreduce>=25.11.0
essreduce>=25.11.2
scitiff>=25.7
25 changes: 12 additions & 13 deletions requirements/base.txt
@@ -1,4 +1,4 @@
# SHA1:da99ba40cef426287cd34dd1a2b56aefd0cb24e6
# SHA1:a1100845ace4b19ad8a759324efbe63c69022f7b
#
# This file was generated by pip-compile-multi.
# To update, run:
@@ -7,13 +7,13 @@
#
annotated-types==0.7.0
# via pydantic
asttokens==3.0.0
asttokens==3.0.1
# via stack-data
attrs==25.4.0
# via
# jsonschema
# referencing
click==8.3.0
click==8.3.1
# via dask
cloudpickle==3.1.2
# via dask
@@ -25,15 +25,15 @@ cyclebane==24.10.0
# via sciline
cycler==0.12.1
# via matplotlib
dask==2025.10.0
dask==2025.11.0
# via -r base.in
decorator==5.2.1
# via ipython
dnspython==2.8.0
# via email-validator
email-validator==2.3.0
# via scippneutron
essreduce==25.11.0
essreduce==25.11.2
# via -r base.in
executing==2.2.1
# via stack-data
@@ -57,7 +57,7 @@ ipydatawidgets==4.3.5
# via pythreejs
ipympl==0.9.8
# via plopp
ipython==9.6.0
ipython==9.7.0
# via
# ipympl
# ipywidgets
@@ -98,7 +98,7 @@ mpltoolbox==25.10.0
# scippneutron
networkx==3.5
# via cyclebane
numpy==2.3.4
numpy==2.3.5
# via
# contourpy
# h5py
@@ -125,7 +125,7 @@ pillow==12.0.0
# via
# ipympl
# matplotlib
plopp[all]==25.10.0
plopp[all]==25.11.0
# via
# -r base.in
# scippneutron
@@ -135,11 +135,11 @@ ptyprocess==0.7.0
# via pexpect
pure-eval==0.2.3
# via stack-data
pydantic==2.12.3
pydantic==2.12.4
# via
# scippneutron
# scitiff
pydantic-core==2.41.4
pydantic-core==2.41.5
# via pydantic
pygments==2.19.2
# via
@@ -151,7 +151,6 @@ python-dateutil==2.9.0.post0
# via
# matplotlib
# scippneutron
# scippnexus
pythreejs==2.4.2
# via plopp
pyyaml==6.0.3
@@ -160,7 +159,7 @@ referencing==0.37.0
# via
# jsonschema
# jsonschema-specifications
rpds-py==0.28.0
rpds-py==0.29.0
# via
# jsonschema
# referencing
@@ -180,7 +179,7 @@ scippneutron==25.7.0
# via
# -r base.in
# essreduce
scippnexus==25.6.0
scippnexus==25.11.0
# via
# -r base.in
# essreduce
23 changes: 11 additions & 12 deletions requirements/basetest.txt
@@ -11,7 +11,7 @@ attrs==25.4.0
# via
# jsonschema
# referencing
certifi==2025.10.5
certifi==2025.11.12
# via requests
charset-normalizer==3.4.4
# via requests
@@ -37,12 +37,11 @@ lazy-loader==0.4
# tof
matplotlib==3.10.7
# via plopp
numpy==2.3.4
numpy==2.3.5
# via
# contourpy
# matplotlib
# scipp
# scipy
# tifffile
packaging==25.0
# via
@@ -54,21 +53,23 @@ pillow==12.0.0
# via matplotlib
platformdirs==4.5.0
# via pooch
plopp==25.10.0
plopp==25.11.0
# via tof
pluggy==1.6.0
# via pytest
pooch==1.8.2
# via -r basetest.in
pydantic==2.12.3
# via
# -r basetest.in
# tof
pydantic==2.12.4
# via scitiff
pydantic-core==2.41.4
pydantic-core==2.41.5
# via pydantic
pygments==2.19.2
# via pytest
pyparsing==3.2.5
# via matplotlib
pytest==8.4.2
pytest==9.0.1
# via -r basetest.in
python-dateutil==2.9.0.post0
# via matplotlib
@@ -78,23 +79,21 @@ referencing==0.37.0
# jsonschema-specifications
requests==2.32.5
# via pooch
rpds-py==0.28.0
rpds-py==0.29.0
# via
# jsonschema
# referencing
scipp==25.11.0
# via
# scitiff
# tof
scipy==1.16.3
# via tof
scitiff==25.7.0
# via -r basetest.in
six==1.17.0
# via python-dateutil
tifffile==2025.10.16
# via scitiff
tof==25.10.1
tof==25.12.0
# via -r basetest.in
typing-extensions==4.15.0
# via
4 changes: 2 additions & 2 deletions requirements/ci.txt
@@ -5,9 +5,9 @@
#
# requirements upgrade
#
cachetools==6.2.1
cachetools==6.2.2
# via tox
certifi==2025.10.5
certifi==2025.11.12
# via requests
chardet==5.2.0
# via tox