RFC: proposed structure for field params #332
Conversation
Hey Alan,
pylinac/core/profile.py
@@ -210,6 +236,57 @@ def fwxm_center(self, x: int=50, interpolate: bool=False) -> Tuple[NumberLike, N
        fwxm_center_idx = int(round(fwxm_center_idx))
        return fwxm_center_idx, self.values[fwxm_center_idx]

    @argue.bounds(x=(0, 100))
    @argue.options(norm=('max', 'max grounded', 'cax', 'cax grounded'), interpolate=(True, False))
    def distance_to_dose(self, x: int=50, norm='max grounded', interpolate=True) -> Tuple[float, float]:
I don't understand this method name clearly. Distance to dose is usually associated with gamma. Isn't this just a FWXM call with normalization applied? Seems like if we add a normalization parameter to fwxm this would fix the problem. If you included the distance of left/right from the profile center this name might fit better. This also seems like it would simplify the functions later on in the fieldparams.py file.
Yes, it is just a FWXM, but I didn't want to change the existing FWXM since I didn't know what else used it, so I needed another name and distance_to_dose was the best I could come up with. It would be great if we can add a normalisation parameter to FWXM, but as you can see from the amount of code in distance_to_dose it is quite a minefield.
How do you feel about calling it FWXM_indices as it returns the left and right index of the X dose level?
This could probably be accomplished more easily by adding parameters to field_edges that let the user set the height (instead of assuming 50%) and by adding in your normalization parameter. Setting good defaults would then not change outputs for upgrading users. This also seems like it would simplify the functions later on in the fieldparams.py file.
https://github.com/jrkerns/pylinac/blob/master/pylinac/core/profile.py#L262
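As a concrete illustration of this suggestion, here is a minimal sketch of a height-parameterized edge finder; `field_edges_at_height` is a hypothetical name working on a bare array, not pylinac's actual API:

```python
import numpy as np

def field_edges_at_height(values: np.ndarray, height: float = 0.5):
    """Return (left, right) interpolated indices where the profile crosses
    height * max(values). Assumes a single field-shaped peak whose tails fall
    below the threshold on both sides."""
    threshold = height * values.max()
    above = np.where(values >= threshold)[0]
    lo, hi = above[0], above[-1]
    # linear interpolation between the bracketing samples on each side
    left = lo - 1 + (threshold - values[lo - 1]) / (values[lo] - values[lo - 1])
    right = hi + (values[hi] - threshold) / (values[hi] - values[hi + 1])
    return left, right
```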
I've implemented this, and it does simplify things considerably, but I got bogged down in small round-off errors that lead to one more or one fewer field edge value being selected, which has quite a large effect on the symmetry. In your original field_edges function (now called field_edges_deprecated) you call fwxm_center with interpolation effectively set to False, causing fwhmc to be rounded off. Calling this function with interpolation set to True gives the same results as my new field_edge function. To my mind this gives a more accurate result, but you may have had other reasons for calling fwxm_center with interpolation set to False.
I'll commit the changes and upload so you can take a look.
left = peak_props['left_ips'][0] if interpolate else int(round(peak_props['left_ips'][0]))
right = peak_props['right_ips'][0] if interpolate else int(round(peak_props['right_ips'][0]))
if norm in ['max', 'cax']:
    left = left - 1
What is the -1 for?
When the normalisation is "max" or "cax" I force find_peaks to use the full height of the peak from zero by adding a zero array value at the beginning and the end of the profile. This means that any indices returned by find_peaks need to be decremented by 1 to correspond to the original profile.
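For readers following along, an illustrative sketch of that padding trick (not the exact pylinac code), assuming a simple single-peak profile:

```python
import numpy as np
from scipy.signal import find_peaks

profile = np.array([0.1, 0.3, 0.9, 1.0, 0.95, 0.4, 0.1])
# prepend/append zeros so the peak width is measured from zero height
padded = np.concatenate(([0], profile, [0]))

peaks, props = find_peaks(padded, width=1, rel_height=0.5)
# indices refer to the padded array, so shift them back to the original profile
left = props['left_ips'][0] - 1
right = props['right_ips'][0] - 1
```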
    return right_penum


def flatness_dose_difference(profile: SingleProfile, ifa: float=0.8):
What is ifa?
In-Field Area, i.e. the area over which flatness and symmetry (and other parameters) are calculated. One of the huge problems in comparing different protocols is that each defines the IFA differently, and the implementation by the manufacturers for the same protocol can also differ. The intention is that the function get_infield_area will return the IFA as defined by the specific protocol, but for now I am going with 80%.
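To make the idea concrete, a rough sketch of slicing out a central 80% in-field region; `infield_slice` is a hypothetical helper, not the eventual get_infield_area implementation:

```python
import numpy as np

def infield_slice(values: np.ndarray, left_edge: float, right_edge: float,
                  ifa: float = 0.8) -> np.ndarray:
    """Return the samples inside the central `ifa` fraction of the field,
    where left_edge/right_edge are the (possibly interpolated) 50% edges."""
    center = (left_edge + right_edge) / 2
    half_width = (right_edge - left_edge) / 2 * ifa
    lo = int(np.ceil(center - half_width))
    hi = int(np.floor(center + half_width))
    return values[lo:hi + 1]
```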
Hi James, thanks for looking at this. My original intention was to simply extend the flatsym module, but I immediately ran into problems with its structure: some parameters such as field edge and size were hard coded, while others such as flatness and symmetry were protocol dependent. Having thought about it, I decided it would be easier and simpler to create a new module. This would make it easier to evaluate and also allow the original flatsym module to be retained for backwards compatibility.

I originally called the new module flatsym2, but it soon became obvious that I was moving past flatness and symmetry only, and so I renamed it to fieldparams. The flatsym module has done excellent service, but in this day and age of FFF machines I certainly need more, and I believe pylinac will gain wider acceptance by supporting FFF. I've got the inflection points for FFF working, and as soon as I get a chance to finish testing I'll commit those changes.

You are welcome to retain, merge or drop this module. For me it is not a wasted effort, as I was using the opportunity to develop a new algorithm for BeamScheme. I'd just like to know what you want to do before I go to the trouble of preparing documentation. Hope this clarifies some of the thinking behind my decisions.
PS: I'm not quite sure what code is redundant, but I was trying to limit the amount of code that needed to be pushed to a lower level. You are welcome to mark any redundant code. I certainly agree there is a lot of scope for improvement in the structure and style of the module.
Thanks for the responses Alan. I think what you're trying to add and the goal of pylinac are the same; thus I'd like to keep this as one module. You have found the limitations of my algorithm =). Would you be kind enough to give me some of your low-res datasets? I didn't have any when this was built, so such scenarios were not on my radar and obviously didn't get tested either. Believe it or not, the velocity of pylinac should pick up here in a bit. 😉
What may make better sense for both of us is for you to add the relevant methods to the profile module that allow for better calculation of FFF and array datasets. If you'd like to make another profile class that's fine; I'm much looser about updates/changes to underlying modules. Given that I want to update the flatsym module to keep it all in sync, that's my problem. I will attempt to modify your PR to make it all work. We can work on it together on this branch until we're both happy. I'll start by just making/adjusting the skeleton of the module (class names, protocols, method names and parameters). If that is in line with the needs of FFF/low-res then we/I can do the grunt work after that.
pylinac/core/hillreg.py
@@ -0,0 +1,46 @@
"""Perform non-linear regression using a Hill function"""
I'm not familiar with the Hill function. Is this better for inflection points than the second derivative? Can you point me to some resources?
I'm working from the Dutch NCS Report 33 - Linac QA: Beam Parameters for Non-Wedge High-Energy Photon Beams (attached). The preprint was provided to me by a colleague, Theo van Soest, author of BistroMath. The Netherlands Commission on Radiation Dosimetry is generally quite ahead of the curve when it comes to reports and position papers. The sigmoid (Hill) model is much more resistant to noise than the second derivative.
Prepublication_-_NCS_Report_33_Beam_parameters_V2020-07-29.pdf
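For anyone unfamiliar with the model, a hedged example of fitting a basic Hill equation to a synthetic penumbra with scipy; the exact parameterization used in hillreg.py may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(x, top, k, n):
    """Basic Hill equation: rises from 0 to `top` with half-maximum at x = k."""
    return top * x**n / (k**n + x**n)

# synthetic, slightly noisy penumbra samples
rng = np.random.default_rng(0)
x = np.linspace(1, 20, 40)
y = hill(x, 1.0, 8.0, 6.0) + rng.normal(0, 0.02, x.size)

popt, _ = curve_fit(hill, x, y, p0=[1.0, 10.0, 4.0])
top, k, n = popt
# k is the half-maximum position; the inflection point follows analytically from
# the fitted parameters, so noise on individual samples has little influence.
```

Because the whole penumbra is fit at once, a few noisy samples barely move the fitted parameters, which is where the robustness over a point-wise second derivative comes from.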
The beauty of this model is of course that it is very simple to add an edge parameter using the second derivative and then compare them all.
I have uploaded testsmall.dcm and the MapCheck2 file it was derived from to your DropBox. I have PTW and IBA files available at https://github.com/alanphys/BeamScheme/tree/master/TestFiles but I haven't converted these to a format pylinac can import yet. I suppose at some stage we will have to write import routines, although I think QATrack+ might have already done this.
Hi James. Sounds good, although for evaluating the NCS33 parameters I am using a sigmoid model for the inflection point, a second-order polynomial for the top of the profile and linear interpolation for the dose points. Would you be able to cater for all three of these?
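For the dose points, a minimal sketch of linear interpolation on one (monotonically rising) penumbra; the sample values here are purely illustrative:

```python
import numpy as np

penumbra = np.array([0.05, 0.15, 0.35, 0.62, 0.85, 0.97])  # left-edge samples
levels = np.array([0.2, 0.5, 0.8])                          # fractional dose levels
# np.interp requires increasing x-values, which a left penumbra provides
positions = np.interp(levels, penumbra, np.arange(penumbra.size))
```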
As author of BistroMath and a member of the NCS-33 subcommittee I offer my help to improve pyLinac as well, although at the moment I don't speak Python.
@alanphys So far, I've changed SingleProfile to have far more constructor parameters and removed a lot of the
Interpolation can be none, linear, or cubic; normalization can be none, max, beam, or geometric (i.e. middle pixel); and edge detection can be FWHM, inflection via derivative (i.e. no fitting, but can still handle FFF beams), or inflection via Hill (your PR, basically). Between these combinations I think it will be able to handle several scenarios. The methods have been simplified (e.g.

As for the "top", I like the 2nd order poly fit so far, but obviously it only needs to be called for FFF beams. Additionally, NCS33 says to analyze it across the middle 5cm (is this always at iso? Seems depth-, FS-, and energy-dependent), but this already assumes a field of at least 5cm (which I cannot assume; not sure why you'd analyze an FFF field <5cm, but still) and that I have the dpmm (probably true, but again I can't assume). I'm not really a fan of the 20/50/80 dose points as they seem more sensitive to noise.

What I've done to try and combine it all is to use an in-field ratio (what NCS calls in-field area; I don't like the term "area" as it can be conflated with the symmetry term) and a "top" region ratio, and to calculate the slopes on either side of the field but excluding the top region. E.g. for a 10cm field with passed-in parameters of an in-field ratio of 0.8 and a top region of 0.4, the region from +/-4cm (10cm * 0.8 / 2) to +/-2cm (10cm * 0.4 / 2) respectively on each side will get a linear regression fit of the slope using all points. The slope is unlikely to be a straight line exactly, but I don't think it's any worse than "evaluation points" and it is less sensitive to noise. The region from -2cm to +2cm will be used to evaluate the "top" using the polynomial fit.

To answer your original question Alan, I think the three topics you brought up are dealt with above. To retain compatibility with other software, the constructor parameters (which will also be added to the
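A sketch of that region split under the stated ratios; the function name, signature and use of sample indices are illustrative assumptions, not the final pylinac API:

```python
import numpy as np

def analyze_regions(values, left, right, in_field_ratio=0.8, top_ratio=0.4):
    """Fit slopes on the in-field region (excluding the top) and a 2nd-order
    polynomial over the top region. `left`/`right` are field edges in sample
    indices."""
    values = np.asarray(values)
    x = np.arange(values.size)
    center = (left + right) / 2
    width = right - left

    in_lo, in_hi = center - width * in_field_ratio / 2, center + width * in_field_ratio / 2
    top_lo, top_hi = center - width * top_ratio / 2, center + width * top_ratio / 2

    left_mask = (x >= in_lo) & (x < top_lo)
    right_mask = (x > top_hi) & (x <= in_hi)
    top_mask = (x >= top_lo) & (x <= top_hi)

    left_slope = np.polyfit(x[left_mask], values[left_mask], 1)[0]
    right_slope = np.polyfit(x[right_mask], values[right_mask], 1)[0]
    top_poly = np.polyfit(x[top_mask], values[top_mask], 2)
    top_position = -top_poly[1] / (2 * top_poly[0])  # vertex of the fitted parabola
    return left_slope, right_slope, top_poly, top_position
```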
@HeerNMeester or @kegge13 Is there a permalink to the NCS-33 report so I can link it from pylinac documentation? |
A link to the NCS-33 prepublication is https://radiationdosimetry.org/files/Prepublication_-_NCS_Report_33_Beam_parameters_V2020-07-29.pdf. We are working on a much better text. |
I'm a fan of the derivative function because it offers a basis for finding penumbra regions, and with more statistical analysis it will also find them for wedged profiles and FFF. This enables a reliable application of the Hill function without assuming anything.
Closing this as it is duplicated by #382 and includes the commits here. |
Request For Comment
This is the proposed structure for an extensible field parameters module that does the same as the existing flatsym module but can be easily extended for all sorts of parameters, such as FFF. Do:
from pylinac import FieldParams
fs = FieldParams.from_demo_image()
fs.analyze(protocol='varian')
or: fs.analyze(protocol='all')
print(fs.results())
I've made changes in core/profile.py as documented in issue #331. Let me know what you think. If you feel it has a place in the pylinac framework I'll carry on working on it and implement the rest of the parameters, write appropriate tests and document it.
Regards
Alan