Draft
Conversation
…o figure out if I should just instantiate both aux_params and the model, then change the LR after one checkpoint
…sion from Corneel's most recent code). Also implemented the soft constraints; the only thing left to abstract away is the logging. Need to look at auxiliary params, maybe looking at ray instantiation?
Moving Tomo-NeRF HPC stuff to Tomography
Adding background subtraction imaging util
… can start including soft loss and general clean up.
…r to conventional tomography file for conv recons
…eed to track down; scheduler stepping fixed (prior implementations were not stepping schedulers); end-result volume saving, figure out protocol for how to save model weights, volume, etc.; added nvtx wrapper in core/ml; starting implementation of TomographyLite
… move conventional algorithm .forward calls.
…. Also included background_subtract again; did this get moved somewhere?
…ed to figure out a better way to do this. Also, the current way of reinitializing the z1_params is janky; need a better way to reinitialize parameters.
        return train_dataloader, train_sampler, val_dataloader, val_sampler

    def build_model(
Collaborator
shouldn't this be distribute_model or similar? Building the model implies initialization I think.
Comment on lines +140 to +151
        pretrained_weights: dict[str, torch.Tensor] | None = None,
    ) -> nn.Module | nn.parallel.DistributedDataParallel:
        """
        Wraps the model with DistributedDataParallel if multiple GPUs are available.

        Returns the model.
        """
        print("Building Model on device: ", self.device)
        model = model.to(self.device)
        if pretrained_weights is not None:
            model.load_state_dict(pretrained_weights.copy())
Collaborator
Why is loading the weights done here? Feels rather independent of ddp
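A minimal sketch of the refactor suggested in the two comments above: renaming to `distribute_model` and pulling weight loading out of the DDP path. The `device`, `world_size`, and `local_rank` attributes here are assumptions for illustration; the actual mixin API in the PR may differ.

```python
import torch
import torch.nn as nn


def load_pretrained(model: nn.Module, weights: dict[str, torch.Tensor]) -> nn.Module:
    # Weight loading kept independent of any DDP concerns.
    model.load_state_dict(weights)
    return model


class DDPMixinSketch:
    # device, world_size, and local_rank are assumed attributes of the mixin
    def distribute_model(
        self, model: nn.Module
    ) -> nn.Module | nn.parallel.DistributedDataParallel:
        # Only device placement and (optional) DDP wrapping; no initialization.
        model = model.to(self.device)
        if self.world_size > 1:
            model = nn.parallel.DistributedDataParallel(
                model, device_ids=[self.local_rank]
            )
        return model
```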
    @dataclass(slots=True)
    class Constraints(ABC):
        """
        Needs to be implemented in all object models that inherit from BaseConstraints.
Collaborator
I would maybe add a slightly extended docstring here. The idea is that any Model that inherits from BaseConstraints also has .constraints attribute that is of type Constraints, right?
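A hypothetical extended docstring along the lines suggested above; the exact contract (a `.constraints` attribute of type `Constraints` on anything inheriting from `BaseConstraints`) is an assumption drawn from this thread, not the PR's actual text.

```python
from abc import ABC
from dataclasses import dataclass


@dataclass(slots=True)
class Constraints(ABC):
    """
    Container for the constraint settings of an object model.

    Any object model that inherits from BaseConstraints is expected to expose
    a `.constraints` attribute of this (subclassed) type; subclasses define
    the concrete constraint fields and how they are applied during the
    reconstruction loop.
    """
```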
Comment on lines +63 to +64
    def soft_constraint_losses(self) -> list[float]:
        return np.array(self._soft_constraint_losses)
Collaborator
type hinting of this is inconsistent
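The annotation promises `list[float]` while the body returns an `np.ndarray`. One way to make the two agree (a sketch with a hypothetical stand-in class; returning the plain list instead would work equally well):

```python
import numpy as np


class SoftConstraintLog:
    # minimal stand-in for the class under review
    def __init__(self) -> None:
        self._soft_constraint_losses: list[float] = []

    @property
    def soft_constraint_losses(self) -> np.ndarray:
        # Annotation now matches the np.array the body actually returns.
        return np.array(self._soft_constraint_losses)
```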
What does this PR do?
Tutorial notebooks to be written in https://github.com/electronmicroscopy/quantem-tutorials/tree/tomography_tutorials; reorganizing `conventional_tomography.py` and implementing `AutoSerialize` needed, do not review yet (1/30). Tutorials ready; still have some cleanup to do (2/02).
Implementation of conventional (simultaneous iterative reconstruction technique (SIRT) and filtered back projection (FBP)) and machine-learning-enabled (implicit neural representations) tomographic reconstruction methods.
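For context on SIRT, a minimal dense-matrix version of the update x <- x + C Aᵀ R (b - A x), where R and C are the inverse row and column sums of the projection matrix A. This is an illustrative sketch only, not the PR's tilt-series implementation:

```python
import numpy as np


def sirt(A: np.ndarray, b: np.ndarray, n_iters: int = 100) -> np.ndarray:
    """Minimal dense SIRT: x <- x + C * (A.T @ (R * (b - A @ x)))."""
    eps = 1e-8
    R = 1.0 / np.maximum(A.sum(axis=1), eps)  # inverse row sums (per ray)
    C = 1.0 / np.maximum(A.sum(axis=0), eps)  # inverse column sums (per voxel)
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x += C * (A.T @ (R * (b - A @ x)))  # weighted backprojection of residual
    return x
```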
The code follows the overall design patterns of `diffractive_imaging`, i.e., usage of `core/ml` classes and the reconstruction loop design. Hopefully everything is still easily extendable to different types of tomography experiments. Since this is a big PR, I'll provide a brief description of the relevant `.py` files in `tomography`:

- `tomography.py`: Top-level file that contains the reconstruction loop and instantiation of the `Tomography` object through `.from_models`.
- `tomography_lite.py`: Similar to `ptychography_lite.py`, this file abstracts away the object, model, optimizer, and scheduler initialization, reducing the workflow to loading the tomographic dataset and performing an immediate reconstruction.
- `tomography_base.py`: Base class that inherits from `AutoSerialize`, `RNGMixin`, and `DDPMixin` (new) with the appropriate properties needed for every reconstruction.
- `tomography_opt.py`: Contains all the necessary optimizer parameters for reconstructions, i.e., object and pose.
- `object_models.py`: Contains the classes for both pixelated and INR reconstructions. Can directly pretrain the volume from conventional methods.
- `dataset_models.py`: Contains pixelated and INR datasets with their respective `.forward` and `.__getitem__` calls.
- `logger_tomography.py`: Contains the logger for tomography reconstructions.
- `utils.py`: Various functions for helping process tilt-series datasets; also has the tools for performing voxelwise AD reconstructions.
There are also some added functionalities in `core/ml` that were implemented to help initialize distributed computing on HPC platforms (NERSC). The updates are briefly described here (see the sketch after this list for the rank bookkeeping):

- `core/ml/ddp.py`: Initializes all the necessary parameters for distributed computing (defining world size, global rank, and local rank). Also contains helper functions for setting up model parallelization using `DistributedDataParallel` and `DataLoader` distributed sampling using `DistributedSampler`.
- `core/ml/inr.py`: Added Winner initialization to SIREN neural networks.
- `core/ml/loss_functions.py`: Added custom loss functions as `nn.Module`s.
- `core/ml/profiling.py`: Context manager for profiling code using NVIDIA Nsight.
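A sketch of the kind of world-size/rank bookkeeping described for `core/ml/ddp.py`, assuming the usual torchrun-style environment variables; the actual helper names and logic in the PR may differ.

```python
import os

import torch
import torch.distributed as dist


def init_distributed() -> tuple[int, int, int]:
    # torchrun/SLURM-style env vars; defaults fall back to single-process mode
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    global_rank = int(os.environ.get("RANK", "0"))
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))
    if world_size > 1 and not dist.is_initialized():
        dist.init_process_group(backend="nccl", rank=global_rank, world_size=world_size)
        torch.cuda.set_device(local_rank)  # pin this process to its local GPU
    return world_size, global_rank, local_rank
```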
What should the PR reviewer do?
The main points to check for this PR would be:
- `core/ml`, with priority to `loss_functions.py` if we want to turn loss functions into `nn.Module`s.
- `object_models.py` and `dataset_models.py`: check that they follow the same design patterns and do not contain any redundant code that might be inherited from base classes.
- `tomography.py`: check that loss calculations are being performed correctly, i.e., stepping the scheduler/optimizer at the correct places (see the sketch below for the canonical ordering).

Please make note of any design patterns that were not followed or potential bugs.
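On the scheduler/optimizer ordering mentioned above, the canonical PyTorch pattern for a per-iteration scheduler is a useful reference while reviewing. This is a generic sketch, not the PR's actual loop:

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for _ in range(100):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).pow(2).mean()  # dummy objective
    loss.backward()
    optimizer.step()   # optimizer first...
    scheduler.step()   # ...then the scheduler (required since PyTorch 1.1)
```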
@arthurmccray will notify you when examples are ready to be tested.