
RFC: proposed structure for field params #332

Closed
wants to merge 61 commits into jrkerns:master from alanphys:fieldparams

Conversation

alanphys
Contributor

@alanphys alanphys commented Dec 7, 2020

Request For Comment

This is the proposed structure for an extensible field parameters module that does the same as the existing flatsym module but can be easily extended for all sorts of parameters, such as FFF. Usage:

    from pylinac import FieldParams
    fs = FieldParams.from_demo_image()
    fs.analyze(protocol='varian')  # or fs.analyze(protocol='all')
    print(fs.results())

I've made changes in core/profile.py as documented in issue #331. Let me know what you think. If you feel it has a place in the pylinac framework I'll carry on working on it and implement the rest of the parameters, write appropriate tests and document it.
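To make the protocol idea concrete, one possible shape for such an extensible structure is a registry mapping protocol names to parameter functions. This is only an illustrative sketch; the names (`PROTOCOLS`, `analyze`, the parameter functions) are hypothetical and not the actual PR code:

```python
# Sketch of a protocol registry: each protocol maps parameter names to
# callables that compute a value from a profile. All names are illustrative.
from typing import Callable, Dict

def flatness_varian(profile) -> float:
    """Placeholder flatness calculation (Varian-style)."""
    return 0.0  # a real implementation would evaluate the in-field area

def symmetry_varian(profile) -> float:
    """Placeholder symmetry calculation (Varian-style)."""
    return 0.0

PROTOCOLS: Dict[str, Dict[str, Callable]] = {
    'varian': {
        'flatness': flatness_varian,
        'symmetry': symmetry_varian,
    },
}

def analyze(profile, protocol: str = 'varian') -> Dict[str, float]:
    """Run every parameter function registered for the chosen protocol."""
    if protocol == 'all':
        funcs = {f'{p}:{name}': fn
                 for p, params in PROTOCOLS.items()
                 for name, fn in params.items()}
    else:
        funcs = PROTOCOLS[protocol]
    return {name: fn(profile) for name, fn in funcs.items()}
```

Adding a new protocol or parameter then only requires registering another function, which is the extensibility the RFC is after.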

Regards
Alan

@coveralls

Coverage Status

Coverage decreased (-2.7%) to 80.887% when pulling 2538743 on alanphys:fieldparams into 863fce0 on jrkerns:master.


@coveralls

coveralls commented Dec 7, 2020

Coverage Status

Coverage decreased (-5.8%) to 77.064% when pulling 18b5645 on alanphys:fieldparams into 0232234 on jrkerns:master.

@jrkerns
Owner

jrkerns commented Feb 9, 2021

Hey Alan,
As usual, I'm pretty behind with this project so I'm finally getting around to this. Sorry. Generally speaking, the goal of this PR seems to be the same as the original flatsym module so splitting into multiple classes seems like an odd design choice from the users' perspective. There's also a lot of repeated code in the new class. I'd be far more in favor of just updating the flatsym class to use the protocol structure suggested. I'll leave comments in the PR. If I find time soon I'll also push up some commits on the branch for styling and some other stuff.

@@ -210,6 +236,57 @@ def fwxm_center(self, x: int=50, interpolate: bool=False) -> Tuple[NumberLike, N

        fwxm_center_idx = int(round(fwxm_center_idx))
        return fwxm_center_idx, self.values[fwxm_center_idx]

    @argue.bounds(x=(0, 100))
    @argue.options(norm=('max', 'max grounded', 'cax', 'cax grounded'), interpolate=(True, False))
    def distance_to_dose(self, x: int=50, norm='max grounded', interpolate=True) -> Tuple[float, float]:
Owner
I don't understand this method name clearly. Distance to dose is usually associated with gamma. Isn't this just a FWXM call with normalization applied? Seems like if we add a normalization parameter to fwxm this would fix the problem. If you included the distance of left/right from the profile center this name might fit better. This also seems like it would simplify the functions later on in the fieldparams.py file.

Contributor Author

Yes, it is just a FWXM, but I didn't want to change the existing FWXM as I didn't know what else used it, so I needed another name, and distance_to_dose is the best I could come up with. It would be great if we can add a normalisation parameter to FWXM, but as you can see from the amount of code in distance_to_dose, it is quite a minefield.

Contributor Author

How do you feel about calling it FWXM_indices as it returns the left and right index of the X dose level?

Owner

This could probably be more easily accomplished with added parameters to field_edges to let the user set the height parameter instead of assuming 50% and adding in your normalization parameter? Setting good defaults would then not change outputs for upgrading users.
https://github.com/jrkerns/pylinac/blob/master/pylinac/core/profile.py#L262
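A hedged sketch of what such a height-parameterised edge search might look like (this is not the actual pylinac `field_edges` signature; the names and the linear-interpolation details are illustrative):

```python
import numpy as np

def field_edges(values: np.ndarray, height: float = 50.0) -> tuple:
    """Find the left/right positions where the profile crosses `height`
    percent of the maximum, with linear interpolation between samples.
    Sketch only -- the real pylinac method takes more options."""
    threshold = values.max() * height / 100.0
    above = values >= threshold
    left_idx = int(np.argmax(above))                           # first sample at/above threshold
    right_idx = len(values) - 1 - int(np.argmax(above[::-1]))  # last sample at/above threshold

    def cross(a, b):
        # linear interpolation of the exact threshold crossing between a and b
        ya, yb = values[a], values[b]
        return a + (threshold - ya) / (yb - ya)

    left = cross(left_idx - 1, left_idx) if left_idx > 0 else float(left_idx)
    right = cross(right_idx, right_idx + 1) if right_idx < len(values) - 1 else float(right_idx)
    return left, right
```

With `height=50.0` and the interpolation left on, this reduces to an interpolated FWHM edge search, which is the "good defaults don't change outputs" idea above.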

Contributor Author

I've implemented this, and it does simplify things considerably, but I got bogged down in small round-off errors that led to one more or one fewer field edge value being selected, which has quite a large effect on the symmetry. In your original field_edges function (now called field_edges_deprecated) you call fwxm_center with interpolation effectively set to False, causing fwhmc to be rounded off. Calling this function with interpolation set to True gives the same results as my new field_edges function. To my mind this gives a more accurate result, but you may have had other reasons for calling fwxm_center with interpolation set to False.
I'll commit the changes and upload so you can take a look.

        left = peak_props['left_ips'][0] if interpolate else int(round(peak_props['left_ips'][0]))
        right = peak_props['right_ips'][0] if interpolate else int(round(peak_props['right_ips'][0]))
        if norm in ['max', 'cax']:
            left = left - 1
Owner

What is the -1 for?

Contributor Author

When the normalisation is "max" or "cax" I force find_peaks to use the full height of the peak from zero by adding a zero array value at the beginning and the end of the profile. This means that any indices returned by find_peaks need to be decremented by 1 to correspond to the original profile.
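The padding and index offset Alan describes can be illustrated roughly like this (a sketch, not the PR's actual code; the example profile is made up):

```python
import numpy as np
from scipy.signal import find_peaks

# A profile whose ends are nowhere near zero, so the peak's prominence
# would otherwise be measured from the shoulders, not from zero.
profile = np.array([60., 80., 100., 80., 60.])

# Pad with zeros so find_peaks measures the peak's full height from zero,
# as described for the 'max'/'cax' normalizations.
padded = np.concatenate(([0.], profile, [0.]))
peaks, props = find_peaks(padded, height=0, width=1, rel_height=0.5)

# Indices refer to the padded array, so shift back to the original profile.
# Interpolated properties such as props['left_ips'] need the same -1 shift.
peak_in_original = peaks[0] - 1
```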

        return right_penum


def flatness_dose_difference(profile: SingleProfile, ifa: float=0.8):
Owner

What is ifa?

Contributor Author

In-Field Area, or the area over which flatness and symmetry (and other parameters) are calculated. One of the huge problems in comparing different protocols is that each defines the IFA differently, and the implementations by manufacturers for the same protocol can also differ. The intention is that the function get_infield_area will return the IFA as defined by the specific protocol, but for now I am going with 80%.
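As a rough illustration of the 80% default, a hypothetical get_infield_area-style helper might look like this (the name and slicing details are assumptions for illustration, not the PR code):

```python
import numpy as np

def infield_area(values: np.ndarray, left_edge: float, right_edge: float,
                 ifa: float = 0.8) -> np.ndarray:
    """Return the slice of the profile covering the central `ifa` fraction
    of the field between the detected edges. Sketch only -- protocols
    define the IFA differently; 80% is just a placeholder default."""
    center = (left_edge + right_edge) / 2
    half_width = (right_edge - left_edge) * ifa / 2
    lo = int(np.ceil(center - half_width))
    hi = int(np.floor(center + half_width))
    return values[lo:hi + 1]
```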

@alanphys
Contributor Author

Hi James

Thanks for looking at this. My original intention was to simply extend the flatsym module, but I immediately ran into problems with its structure: some parameters, such as field edge and size, were hard-coded, while others, such as flatness and symmetry, were protocol-dependent. Having thought about it, I decided it would be easier and simpler to create a new module. This would make it easier to evaluate and would also allow the original flatsym module to be retained for backwards compatibility. I originally called the new module flatsym2, but it soon became obvious that I was moving past flatness and symmetry only, and so I renamed it to fieldparams.

The flatsym module has done excellent service, but in this day and age of FFF machines I certainly need more, and I believe pylinac will gain wider acceptance by supporting FFF.

I've got the inflection points for FFF working and as soon as I get a chance to finish testing I'll commit those changes.

You are welcome to retain, merge or drop this module. For me it is not a wasted effort as I was using the opportunity to develop a new algorithm for BeamScheme. I'd just like to know what you want to do before I go to the trouble of preparing documentation.

Hope this clarifies some of the thinking behind my decisions.
Regards
Alan

@alanphys
Contributor Author

PS: I'm not quite sure what code is redundant, but I was trying to limit the amount of code that needed to be pushed to a lower level. You are welcome to mark any redundant code. I certainly agree there is a lot of scope for improvement in the structure and style of the module.

@jrkerns
Owner

jrkerns commented Feb 16, 2021

Thanks for the responses Alan. I think what you're trying to add and the goal of pylinac is the same; thus I'd like to keep this as one module. You have found the limitations of my algorithm =). Would you be kind enough to give me some of your low-res datasets? I didn't have any when this was built so such scenarios were not on my radar and obviously didn't get tested either. Believe it or not, the velocity of pylinac should pick up here in a bit. 😉

@jrkerns
Owner

jrkerns commented Feb 16, 2021

What may make better sense for both of us is for you to add the relevant methods to the profile module that allow for better calculation of FFF and array datasets. If you'd like to make another profile class that's fine. Underlying modules I'm much looser about updates/changes. Given I want to update the flatsym module to keep it all in sync that's my problem. I will attempt to modify your PR to make it all work. We can work on it together on this branch until we're both happy. I'll start by just making/adjusting the skeleton of the module (class names, protocols, method names and parameters). If that is in line with the needs of FFF/low-res then we/I can do the grunt work after that.

@@ -0,0 +1,46 @@
"""Perform non-linear regression using a Hill function"""
Owner
@jrkerns jrkerns Feb 16, 2021

I'm not familiar with the Hill function. Is this better for inflection points than the second derivative? Can you point me to some resources?

Contributor Author

I'm working from the Dutch NCS Report 33 - Linac QA: Beam Parameters for Non-Wedge High-Energy Photon Beams (attached). The preprint was provided to me by a colleague, Theo van Soest, author of BistroMath. The Netherlands Commission on Radiation Dosimetry is generally quite ahead of the curve when it comes to reports and position papers. The sigmoid (Hill) model is much more resistant to noise than the second derivative.
Prepublication_-_NCS_Report_33_Beam_parameters_V2020-07-29.pdf

Contributor Author

The beauty of this model is of course that it is very simple to add an edge parameter using the second derivative and then compare them all.
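For readers unfamiliar with the model, a sigmoid (Hill-type) penumbra fit along these lines can be sketched with scipy. The exact parameterisation used by NCS-33 and the PR may differ; this is just one common 4-parameter form on synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(x, base, top, x50, slope):
    """4-parameter sigmoid: `base`/`top` plateaus, `x50` is the position
    of the 50% level, `slope` controls penumbra steepness."""
    return base + (top - base) / (1 + (x50 / x) ** slope)

# Fit to a noisy synthetic penumbra and recover the 50% position,
# demonstrating the model's resistance to noise.
x = np.linspace(1, 20, 100)
rng = np.random.default_rng(0)
y = hill(x, 0, 100, 10, 8) + rng.normal(0, 1, x.size)
popt, _ = curve_fit(hill, x, y, p0=[0, 100, 10, 5])
x50_fit = popt[2]  # expected to land close to the true value of 10
```

Because the fit uses all points in the penumbra region rather than a single threshold crossing, individual noisy samples have little influence on the recovered edge position.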

@alanphys
Contributor Author

I have uploaded testsmall.dcm and the MapCheck2 file it was derived from to your DropBox. I have PTW and IBA files available at https://github.com/alanphys/BeamScheme/tree/master/TestFiles but I haven't converted these to a format pylinac can import yet. I suppose at some stage we will have to write import routines, although I think QATrack+ might have already done this.

@alanphys
Contributor Author

Hi James. Sounds good, although for evaluating the NCS33 parameters I am using a sigmoid model for the inflection point, a second order polynomial for the top of the profile and linear interpolation for the dose points. Would you be able to cater for all three of these?
Your use scenario is indeed the case. I am aware of a few people that have compared pylinac to existing software and found differences. Resolving these, I believe, will go a long way to establishing the credibility of pylinac.
Regards
Alan

@kegge13

kegge13 commented Apr 20, 2021

> @HeerNMeester many thanks for your valuable input. If you have more info to share on your experience using the Hill model that would be great.
>
> Regards
> Alan

As the author of BistroMath and a member of the NCS-33 subcommittee, I offer my help to improve pylinac as well. However, I don't speak Python at the moment.

@jrkerns
Owner

jrkerns commented Apr 20, 2021

@alanphys So far, I've changed SingleProfile to have far more constructor parameters and removed a lot of the interpolate=True parameters from the methods, as even before this PR I was unhappy with trying to handle it all. The signature is looking like so:

    def __init__(self, values: np.ndarray, dpmm: float = None,
                 interpolation: Interpolation = Interpolation.LINEAR,
                 ground: bool = True,
                 interpolation_resolution_mm: float=0.1,
                 interpolation_factor: float=10,
                 normalization_method: Normalization = Normalization.BEAM_CENTER,
                 edge_detection_method: Edge = Edge.FWHM,
                 edge_smoothing_ratio: float=0.003):

Interpolation can be none, linear, cubic, normalization can be none, max, beam, geometric (i.e. middle pixel), and edge detection can be FWHM, inflection via derivative (i.e. no fitting but can still handle FFF beams), and inflection via Hill (your PR basically). Between these combinations I think it will be able to handle several scenarios. The methods have been simplified (e.g. penumbra(lower, upper)) but obviously take into account the constructor arguments. Additionally, the methods will return dictionaries for the most part to cut down on the side and kind arguments and just include it all.
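The option sets described above could be sketched as enums like the following (member values are illustrative assumptions; the real pylinac enums may differ):

```python
from enum import Enum

# Option enums mirroring the constructor parameters described above.
class Interpolation(Enum):
    NONE = 'none'
    LINEAR = 'linear'
    CUBIC = 'cubic'

class Normalization(Enum):
    NONE = 'none'
    MAX = 'max'
    BEAM_CENTER = 'beam center'
    GEOMETRIC_CENTER = 'geometric center'  # i.e. the middle pixel

class Edge(Enum):
    FWHM = 'fwhm'
    INFLECTION_DERIVATIVE = 'inflection derivative'  # no fitting, handles FFF
    INFLECTION_HILL = 'inflection hill'              # sigmoid fit per this PR
```

Using enums rather than bare strings lets the constructor validate the combination up front instead of failing deep inside an analysis method.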

As for the "top", I like the 2nd order poly fit so far, but obviously only needs to be called for FFF beams. Additionally, NCS33 says to analyze it across the middle 5cm (is this always at iso? Seems depth, FS, and energy-dependent), but this already assumes a field of at least 5cm (which I cannot assume. Not sure why you'd analyze an FFF <5cm, but still) and that I have the dpmm (probably will be true but again can't assume).

I'm not really a fan of the 20/50/80 dose points as it seems more sensitive to noise. What I've done to try and combine it all is to use an in-field ratio (what NCS calls in-field area; I don't like the term "area" as it can be conflated with the symmetry term) and a "top" region ratio and calculate the slopes on either side of the field but excluding the top region. E.g. for a 10cm field with passed in parameters of an in-field ratio of 0.8 and top region of 0.4, the region from +/-4cm (10cm × 0.8 / 2) to +/-2cm (10cm × 0.4 / 2) respectively on each side will get a linear regression fit of the slope using all points. The slope is unlikely to be a straight line exactly but I don't think it's any worse than "evaluation points" and is less sensitive to noise. The region from -2cm to +2cm will be used to evaluate the "top" using the polynomial fit.
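The region arithmetic in that example can be written out as a small helper (a sketch; the function and parameter names are illustrative, not pylinac's API):

```python
def field_regions(field_width_cm: float, in_field_ratio: float = 0.8,
                  top_ratio: float = 0.4):
    """Compute the slope-evaluation and 'top' regions described above.
    Returns ((left_lo, left_hi), (right_lo, right_hi), (top_lo, top_hi))
    as offsets from the field center in cm."""
    in_edge = field_width_cm * in_field_ratio / 2   # e.g. 4 cm for a 10 cm field, ratio 0.8
    top_edge = field_width_cm * top_ratio / 2       # e.g. 2 cm for a 10 cm field, ratio 0.4
    left_slope = (-in_edge, -top_edge)   # linear regression fit over this region
    right_slope = (top_edge, in_edge)    # ...and this one
    top_region = (-top_edge, top_edge)   # 2nd-order polynomial fit here (FFF "top")
    return left_slope, right_slope, top_region
```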

To answer your original question Alan, I think the three topics you brought up are dealt with above. To retain compatibility with other software, the constructor parameters (which will also be added to the analyze signature of the flatsym/fieldparam module) will allow the user to choose the best parameters for them. W/R/T protocols, using these parameters will mean that only one field center, penumbra, etc are calculated depending on which parameters are chosen. The protocols now just basically include the symmetry and flatness definitions, although I'm making it extensible so that custom calculations can be done on the profiles.

@jrkerns
Owner

jrkerns commented Apr 26, 2021

Little preview utilizing the Hill function and also the derivative inflection:
[images: flatsym analysis preview, Figure_1]

@jrkerns
Owner

jrkerns commented Apr 27, 2021

@HeerNMeester or @kegge13 Is there a permalink to the NCS-33 report so I can link it from pylinac documentation?

@kegge13

kegge13 commented Apr 27, 2021

A link to the NCS-33 prepublication is https://radiationdosimetry.org/files/Prepublication_-_NCS_Report_33_Beam_parameters_V2020-07-29.pdf. We are working on a much better text.
The proposal to use the central 5 cm for the FFF "top" model is a practical one: it represents the curved area quite well. Indeed, this will not work for smaller fields, but those fields lack the FFF shape anyway. In BistroMath there are limits on the minimum field size and the minimum dose drop over the IFA for a beam to be considered "FFF".

@kegge13

kegge13 commented Apr 27, 2021

> Little preview utilizing the Hill function and also the derivative inflection:

Nice presentation of results. Indeed the parameters offer a wide range of strategies. In the images the penumbra area looks quite narrow. How is it defined?
I assume you use the real inflection point, as I discussed earlier with Alan.
In these examples it seems that EPID data are used. When water phantom data or equivalent are used, the origin is usually well defined by the user. I assume such an origin can also be used for normalisation and center definition?
Recently I also introduced a "near origin" definition, which collapses to the origin when sufficiently small and otherwise uses a border-based center.

@kegge13

kegge13 commented Apr 27, 2021

> @alanphys So far, I've changed SingleProfile to have far more constructor parameters and removed a lot of the interpolate=True parameters from the methods [...]

I'm a fan of the derivative function because that offers a base to find penumbra regions. And with more statistical analysis it will also find them for wedged profiles and FFF. This enables a reliable application of the Hill function without assuming anything.

@jrkerns
Owner

jrkerns commented Apr 27, 2021

> I'm a fan of the derivative function because that offers a base to find penumbra regions.

That's exactly what I ended up doing 😄.

> Nice presentation of results. Indeed the parameters do offer a wide range of strategies. In the images the penumbra area looks quite narrow. How is it defined?

Exactly as NCS-33 says. Probably just the zoomed-out view that makes it narrow.

> I assume you use the real inflection point as I discussed earlier with Alan.

I think for that figure it was the Hill function, but again, I will give the option to the user to use Hill (i.e. 4PL), 2nd derivative, or the traditional FWHM.

> When water phantom data or equivalent are used mostly the origin is well defined by the user. I assume such an origin can also be used for normalisation and center definition?

I will have normalization options of None, geometric center (middle pixel), max, and beam center (for offset or wedge fields). I don't have water tank data so I'm unable to verify how the origin works there. Currently, I don't have an option for center definition; it's just the middle point between the field edges.

Here's a better image of an FFF beam using the Hill/4PL function with the left panel zoomed in:
[image: FFF beam profile with the left panel zoomed in]

@jrkerns
Owner

jrkerns commented Apr 29, 2021

Closing this as it is duplicated by #382 and includes the commits here.

@jrkerns jrkerns closed this Apr 29, 2021
jrkerns added a commit that referenced this pull request Jan 12, 2024
RAM-3227 Make bb-finding use the metrics module and use adaptive histogram when at field edge

Approved-by: Randy Taylor