feat: extract intensity components #244

Merged
merged 9 commits on Mar 23, 2021
3 changes: 3 additions & 0 deletions cspell.json
@@ -120,6 +120,7 @@
"codemirror",
"colab",
"commitlint",
"darkred",
"doctest",
"doctests",
"dtype",
@@ -138,6 +139,7 @@
"imag",
"iminuit",
"indeterministic",
"isinstance",
"isort",
"jupyterlab",
"keras",
@@ -184,6 +186,7 @@
"seealso",
"sharex",
"sharey",
"startswith",
"subsys",
"suptitle",
"tolist",
3 changes: 2 additions & 1 deletion docs/usage/step1.ipynb
@@ -153,7 +153,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can write the {class}`~expertsystem.amplitude.helicity.HelicityModel` to disk via {mod}`pickle`:"
"Finally, we can write the {class}`~expertsystem.amplitude.helicity.HelicityModel` to disk via {mod}`pickle`, as well as store the {class}`~expertsystem.reaction.Result` as JSON:"
]
},
{
@@ -164,6 +164,7 @@
"source": [
"import pickle\n",
"\n",
"es.io.write(result, \"transitions.json\")\n",
"with open(\"helicity_model.pickle\", \"wb\") as stream:\n",
" pickle.dump(model, stream)"
]
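
For reference, both files written here are read back at the start of the step3 notebook (see the step3.ipynb diff further down). A minimal sketch of that loading step, assuming the same file names and the `es` alias for `expertsystem`:

import pickle

import expertsystem as es

# Result deserialized from the JSON file written above
result = es.io.load("transitions.json")

# HelicityModel restored from the pickle file written above
with open("helicity_model.pickle", "rb") as stream:
    model = pickle.load(stream)
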
11 changes: 11 additions & 0 deletions docs/usage/step2.ipynb
@@ -308,6 +308,17 @@
"plt.legend();"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
":::{seealso}\n",
"\n",
"{ref}`usage/step3:Intensity components`\n",
"\n",
":::"
]
},
{
"cell_type": "markdown",
"metadata": {},
201 changes: 195 additions & 6 deletions docs/usage/step3.ipynb
@@ -48,8 +48,10 @@
"source": [
"import pickle\n",
"\n",
"import expertsystem as es\n",
"from tensorwaves.model import SympyModel\n",
"\n",
"result = es.io.load(\"transitions.json\")\n",
"with open(\"helicity_model.pickle\", \"rb\") as stream:\n",
" model = pickle.load(stream)\n",
"with open(\"data_set.pickle\", \"rb\") as stream:\n",
@@ -389,8 +391,8 @@
" callback=callbacks,\n",
" use_analytic_gradient=False,\n",
")\n",
"result = minuit2.optimize(estimator, initial_parameters)\n",
"result"
"fit_result = minuit2.optimize(estimator, initial_parameters)\n",
"fit_result"
]
},
{
@@ -413,7 +415,7 @@
},
"outputs": [],
"source": [
"optimized_parameters = result[\"parameter_values\"]\n",
"optimized_parameters = fit_result[\"parameter_values\"]\n",
"for p in optimized_parameters:\n",
" print(p)\n",
" print(f\" initial: {initial_parameters[p]:.3}\")\n",
@@ -515,6 +517,193 @@
"compare_model(\"m_12\", data_set, phsp_set, intensity)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Intensity components"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice that the {class}`~expertsystem.amplitude.helicity.HelicityModel` contains {attr}`~expertsystem.amplitude.helicity.HelicityModel.components` attribute. This is {obj}`dict` of component names to {class}`sympy.Expr <sympy.core.expr.Expr>`s helps us to identify amplitudes and intensities in the total amplitude."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sorted(model.components)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model.components[\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(1370)_{0} \\gamma_{+1};f_{0}(1370)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\"\n",
"].subs(model.parameters).doit()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just like in {ref}`usage/step2:2.2 Generate intensity-based sample`, these _intensity components_ can each be expressed in a computational backend. This can be done with the helper function {func}`.create_intensity_component`. After that we, update the parameters with the optimized parameter values we found in {ref}`usage/step3:Perform fit`. The two components in the example below should be the same:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from tensorwaves.physics import create_intensity_component\n",
"\n",
"from_amplitudes = create_intensity_component(\n",
" model,\n",
" components=[\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(500)_{0} \\gamma_{+1};f_{0}(500)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(980)_{0} \\gamma_{+1};f_{0}(980)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(1370)_{0} \\gamma_{+1};f_{0}(1370)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(1500)_{0} \\gamma_{+1};f_{0}(1500)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" R\"A[J/\\psi(1S)_{+1} \\to f_{0}(1710)_{0} \\gamma_{+1};f_{0}(1710)_{0} \\to \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" ],\n",
" backend=\"numpy\",\n",
")\n",
"from_amplitudes.update_parameters(latest_parameters)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from_intensity = create_intensity_component(\n",
" model,\n",
" components=R\"I[J/\\psi(1S)_{+1} \\to \\gamma_{+1} \\pi^{0}_{0} \\pi^{0}_{0}]\",\n",
" backend=\"numpy\",\n",
")\n",
"from_intensity.update_parameters(latest_parameters)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"difference = np.average(from_amplitudes(phsp_set) - from_intensity(phsp_set))\n",
"assert np.round(difference, decimals=15) == 0.0"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The result is a {class}`.LambdifiedFunction` that can be plotted just like in {ref}`usage/step3:Plot optimized model`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"jupyter": {
"source_hidden": true
},
"tags": [
"hide-input"
]
},
"outputs": [],
"source": [
"fig, ax = plt.subplots(1, figsize=(8, 5))\n",
"bins = 150\n",
"ax.hist(\n",
" phsp_set[\"m_12\"],\n",
" weights=intensity(phsp_set),\n",
" bins=bins,\n",
" alpha=0.2,\n",
" label=\"full intensity\",\n",
")\n",
"ax.hist(\n",
" phsp_set[\"m_12\"],\n",
" weights=from_intensity(phsp_set),\n",
" bins=bins,\n",
" histtype=\"step\",\n",
" label=R\"$J/\\psi(1S)_{-1} \\to \\gamma_{-1} \\pi^0 \\pi^0$\",\n",
")\n",
"ax.set_xlim(0.25, 2.5)\n",
"ax.set_xlabel(R\"$m_{\\pi^0\\pi^0}$ [GeV]\")\n",
"ax.set_yticks([])\n",
"plt.legend();"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Or generically, so that we can stack all the sub-intensities:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"jupyter": {
"source_hidden": true
},
"tags": [
"hide-input"
]
},
"outputs": [],
"source": [
"import logging\n",
"\n",
"from tensorwaves.data import generate_data\n",
"\n",
"logging.basicConfig()\n",
"logging.getLogger().setLevel(logging.ERROR)\n",
"intensity_components = [\n",
" create_intensity_component(model, c, backend=\"numpy\")\n",
" for c in model.components\n",
" if c.startswith(\"I\")\n",
"]\n",
"sub_intensities = [\n",
" generate_data(\n",
" size=5_000,\n",
" reaction_info=model.adapter.reaction_info,\n",
" data_transformer=model.adapter,\n",
" intensity=component,\n",
" )\n",
" for component in intensity_components\n",
"]\n",
"\n",
"fig, ax = plt.subplots(1, figsize=(8, 5))\n",
"plt.hist(\n",
" [model.adapter.transform(i)[\"m_12\"] for i in sub_intensities],\n",
" bins=100,\n",
" stacked=True,\n",
" alpha=0.6,\n",
")\n",
"ax.set_xlim(0.25, 2.5)\n",
"ax.set_xlabel(R\"$m_{\\pi^0\\pi^0}$ [GeV]\")\n",
"ax.set_yticks([])\n",
"ax.legend(\n",
" labels=[f\"${c[2:-1]}$\" for c in model.components if c.startswith(\"I\")]\n",
");"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -586,7 +775,7 @@
")\n",
"fit_traceback.plot(\"function_call\", \"estimator_value\", ax=ax1)\n",
"fit_traceback.plot(\"function_call\", sorted(initial_parameters), ax=ax2)\n",
"fig.suptitle(\"Minuit2 optimizer process\", fontsize=16)\n",
"fig.suptitle(\"SciPy optimizer process\", fontsize=16)\n",
"ax1.set_title(\"Negative log likelihood\")\n",
"ax2.set_title(\"Parameter values\")\n",
"ax1.set_xlabel(\"function call\")\n",
@@ -641,8 +830,8 @@
" ),\n",
" use_analytic_gradient=False,\n",
")\n",
"result = scipy.optimize(estimator, initial_parameters)\n",
"result"
"fit_result = scipy.optimize(estimator, initial_parameters)\n",
"fit_result"
]
},
{
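
Building on the component comparison in the notebook above, the incoherent sum over all "I[...]" components should reproduce the full intensity. A minimal sketch, assuming `model`, `intensity`, `phsp_set`, and `latest_parameters` as defined in the notebook, with an illustrative tolerance:

import numpy as np

from tensorwaves.physics import create_intensity_component

# Incoherent sum of all "I[...]" components of the helicity model
total_from_components = create_intensity_component(
    model,
    components=[c for c in model.components if c.startswith("I")],
    backend="numpy",
)
total_from_components.update_parameters(latest_parameters)

# Should agree with the full intensity over the phase-space sample
difference = np.average(total_from_components(phsp_set) - intensity(phsp_set))
assert np.round(difference, decimals=10) == 0.0
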
3 changes: 2 additions & 1 deletion src/tensorwaves/model.py
@@ -5,6 +5,7 @@
as `sympy` expressions only.
"""

import logging
from typing import Callable, Dict, FrozenSet, Mapping, Tuple, Union

import numpy as np
@@ -111,7 +112,7 @@ def __init__(

        if not set(parameters) <= set(expression.free_symbols):
            unused_parameters = set(parameters) - set(expression.free_symbols)
            raise ValueError(
            logging.warning(
                f"Parameters {unused_parameters} are defined but do not appear"
                " in the model!"
            )
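
The hunk above relaxes the parameter check in `SympyModel.__init__`: parameters that never appear in the expression now emit a warning instead of raising a `ValueError`. A stand-alone sketch of that check with hypothetical symbols:

import logging

import sympy as sp

a, b, x = sp.symbols("a b x")
expression = a * x  # b is declared as a parameter but never used
parameters = {a: 1.0, b: 2.0}

if not set(parameters) <= set(expression.free_symbols):
    unused_parameters = set(parameters) - set(expression.free_symbols)
    logging.warning(  # previously: raise ValueError(...)
        f"Parameters {unused_parameters} are defined but do not appear"
        " in the model!"
    )
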
54 changes: 54 additions & 0 deletions src/tensorwaves/physics.py
@@ -0,0 +1,54 @@
"""Extract physics info from data and fit output."""

from typing import Iterable, Sequence, Union

import sympy as sp
from expertsystem.amplitude.helicity import HelicityModel

from tensorwaves.model import LambdifiedFunction, SympyModel


def add_components(
    model: HelicityModel,
    components: Union[str, Iterable[str]],
) -> sp.Expr:
    """Coherently or incoherently add components of a helicity model."""
    if isinstance(components, str):
        components = [components]
    components = list(components)  # allow iterators to be traversed more than once
    for component in components:
        if component not in model.components:
            raise KeyError(
                f'Component "{component}" not in model components',
                list(model.components),
            )
    if any(map(lambda c: c.startswith("I"), components)) and any(
        map(lambda c: c.startswith("A"), components)
    ):
        intensity_sum = add_components(
            model,
            components=filter(lambda c: c.startswith("I"), components),
        )
        amplitude_sum = add_components(
            model,
            components=filter(lambda c: c.startswith("A"), components),
        )
        return intensity_sum + amplitude_sum
    if all(map(lambda c: c.startswith("I"), components)):
        return sum(model.components[c] for c in components)
    if all(map(lambda c: c.startswith("A"), components)):
        return abs(sum(model.components[c] for c in components)) ** 2
    raise ValueError('Not all component names started with either "A" or "I"')


def create_intensity_component(
    model: HelicityModel,
    components: Union[str, Sequence[str]],
    backend: str,
) -> LambdifiedFunction:
    """Create a `.LambdifiedFunction` of a sum of helicity model components."""
    added_components = add_components(model, components)
    sympy_model = SympyModel(
        expression=added_components,
        parameters=model.parameters,
    )
    return LambdifiedFunction(sympy_model, backend=backend)
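
In short, `add_components` sums amplitude components coherently (absolute square of the sum) and intensity components incoherently (plain sum), while `create_intensity_component` lambdifies the result. A minimal sketch of both branches, assuming a `HelicityModel` instance `model` with the "A[...]"/"I[...]" component names used in the notebooks above:

from tensorwaves.physics import add_components, create_intensity_component

amplitude_names = [c for c in model.components if c.startswith("A")][:2]
intensity_names = [c for c in model.components if c.startswith("I")][:2]

# Coherent sum: |A_1 + A_2|^2 as a sympy expression
coherent_sum = add_components(model, amplitude_names)

# Incoherent sum: I_1 + I_2 as a sympy expression
incoherent_sum = add_components(model, intensity_names)

# Same incoherent sum, lambdified so it can be evaluated on a data set
component_intensity = create_intensity_component(
    model, intensity_names, backend="numpy"
)
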