
Tomography Module #167

Draft
cedriclim1 wants to merge 65 commits into dev from tomography_refactor

Conversation

@cedriclim1 (Collaborator) commented Jan 31, 2026

What does this PR do?

Tutorial notebooks to be written in https://github.com/electronmicroscopy/quantem-tutorials/tree/tomography_tutorials; reorganizing conventional_tomography.py and implementing AutoSerialize are still needed. Do not review yet. (1/30)
Tutorials are ready; still some cleanup to do. (2/02)
Implementation of conventional (simultaneous iterative reconstruction technique (SIRT) and filtered back projection (FBP)) and machine-learning-enabled (implicit neural representation) tomographic reconstruction methods.

The code follows the overall design patterns of diffractive_imaging, i.e., usage of core/ml classes and the reconstruction loop design. Hopefully everything can still be easily extended to different types of tomography experiments.

Since this is a big PR, I'll provide a brief description of the relevant .py files in tomography.

  • tomography.py: Top-level file that contains the reconstruction loop and instantiation of the Tomography object through .from_models (a usage sketch follows this list).
  • tomography_lite.py: Similar to ptychography_lite.py, this file abstracts away the object, model, optimizer, and scheduler initialization, so that loading the tomographic dataset and performing an immediate reconstruction is all that is needed.
  • tomography_base.py: Base class that inherits from AutoSerialize, RNGMixin, and DDPMixin (new), with the properties that are needed for every reconstruction.
  • tomography_opt.py: Contains all the necessary optimizer parameters for reconstructions, i.e., object and pose.
  • object_models.py: Contains the classes for both pixelated and INR reconstructions. Can directly pretrain the volume from conventional methods.
  • dataset_models.py: Contains pixelated and INR datasets with their respective .forward and .__getitem__ calls.
  • logger_tomography.py: Contains the logger for tomography reconstructions.
  • utils.py: Various helper functions for processing tilt-series datasets; also contains the tools for performing voxelwise AD reconstructions.
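
A rough usage sketch assembled from the descriptions above. Every class name, module path, and argument below is an assumption for illustration only and will not match the actual API in this PR exactly:

```python
# Hypothetical usage sketch; module paths, class names, and signatures are assumed.
import torch

from quantem.tomography.dataset_models import PixelatedTomoDataset   # assumed name
from quantem.tomography.object_models import PixelatedObjectModel    # assumed name
from quantem.tomography.tomography import Tomography                 # assumed name

tilt_series = torch.rand(60, 256, 256)           # placeholder tilt-series data
tilt_angles = torch.linspace(-60.0, 60.0, 60)    # placeholder tilt angles (degrees)

# Build the dataset and object model, instantiate via .from_models as described
# for tomography.py, then run the reconstruction loop.
dataset = PixelatedTomoDataset(tilt_series, tilt_angles)              # assumed signature
obj_model = PixelatedObjectModel(volume_shape=(256, 256, 256))        # assumed signature
tomo = Tomography.from_models(dataset=dataset, object_model=obj_model, device="cuda")
tomo.reconstruct(num_iter=100)                                        # assumed method name
```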

There is also some added functionality in core/ml that was implemented to help initialize distributed computing on HPC platforms (NERSC). The updates are briefly described here:

  • core/ml/ddp.py: Initializes all the necessary parameters for distributed computing (defining world size, global rank, and local rank). Also contains helper functions for wrapping models with DistributedDataParallel and for setting up distributed DataLoader sampling with DistributedSampler (a minimal setup sketch follows this list).
  • core/ml/inr.py: Added Winner initialization to SIREN neural networks.
  • core/ml/loss_functions.py: Added custom loss functions as nn.Module subclasses.
  • core/ml/profiling.py: Context manager for profiling code using NVIDIA Nsight.
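
A minimal sketch of the kind of setup ddp.py automates, using generic torch.distributed APIs and torchrun-style environment variables; the helper names below are illustrative and are not the actual functions added in this PR:

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler


def setup_distributed() -> tuple[int, int, int]:
    """Read world size / global rank / local rank and initialize the process group."""
    world_size = int(os.environ.get("WORLD_SIZE", 1))
    global_rank = int(os.environ.get("RANK", 0))
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    if world_size > 1 and not dist.is_initialized():
        dist.init_process_group(backend="nccl", rank=global_rank, world_size=world_size)
    torch.cuda.set_device(local_rank)
    return world_size, global_rank, local_rank


def wrap_for_ddp(model, dataset, batch_size, world_size, global_rank, local_rank):
    """Wrap the model with DDP and give the DataLoader a DistributedSampler."""
    model = model.to(local_rank)
    if world_size > 1:
        model = DDP(model, device_ids=[local_rank])
    sampler = (
        DistributedSampler(dataset, num_replicas=world_size, rank=global_rank)
        if world_size > 1
        else None
    )
    loader = DataLoader(dataset, batch_size=batch_size, sampler=sampler, shuffle=(sampler is None))
    return model, loader, sampler
```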

What should the PR reviewer do?

The main points to check for this PR would be:

  • Ensure that all example scripts and notebooks run easily on local machines or HPC clusters.
  • Check all changes in core/ml, with priority given to loss_functions.py, if we want to turn loss functions into nn.Modules.
  • Confirm that object_models.py and dataset_models.py follow the same design patterns and do not contain any redundant code that might already be inherited from base classes.
  • Check the reconstruction loop in tomography.py and verify that loss calculations are being performed correctly, i.e., that the scheduler/optimizer are stepped at the correct places (see the sketch after this list).
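
For reference, the generic PyTorch ordering to check against; this is a toy sketch of the standard pattern, not the actual loop in tomography.py:

```python
import torch
from torch import nn

# Toy stand-ins so the stepping order itself is runnable end-to-end.
model = nn.Linear(4, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)
data = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(5)]

for epoch in range(3):
    for x, y in data:
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)  # data fidelity (+ soft constraints in the real loop)
        loss.backward()
        optimizer.step()                            # update object / pose parameters
    scheduler.step()                                # typically once per epoch, after optimizer.step()
```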

Please make note of any design patterns that were not followed or potential bugs.

@arthurmccray will notify you when examples are ready to be tested.

Cedric Lim and others added 30 commits September 16, 2025 10:42
…o figure out if I should just instantiate both aux_params and model, then change the LR after one checkpoint
…sion from Corneel's most recent code). Also implemented the soft constraints; the thing left to abstract away is the logging. Need to look at auxiliary params, maybe looking at ray instantiation?
Moving Tomo-NeRF HPC stuff to Tomography
Adding background subtraction imaging util
@cedriclim1 (Collaborator, Author)

…ed to figure out a better way to do this. Also the current way of reinitializing the z1_params is jank, need a better way to reinitialize parameters

        return train_dataloader, train_sampler, val_dataloader, val_sampler

    def build_model(
Collaborator


shouldn't this be distribute_model or similar? Building the model implies initialization I think.

Comment on lines +140 to +151
        pretrained_weights: dict[str, torch.Tensor] | None = None,
    ) -> nn.Module | nn.parallel.DistributedDataParallel:
        """
        Wraps the model with DistributedDataParallel if multiple GPUs are available.

        Returns the model.
        """
        print("Building Model on device: ", self.device)
        model = model.to(self.device)
        if pretrained_weights is not None:
            model.load_state_dict(pretrained_weights.copy())

Collaborator


Why is loading the weights done here? Feels rather independent of ddp
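
One way the split could look; this is an illustrative sketch only, and the helper names (load_pretrained, distribute_model) and signatures are hypothetical:

```python
import torch
from torch import nn


def load_pretrained(model: nn.Module, weights: dict[str, torch.Tensor] | None) -> nn.Module:
    """Keep weight loading separate from distribution."""
    if weights is not None:
        model.load_state_dict(weights)
    return model


def distribute_model(model: nn.Module, device: torch.device,
                     world_size: int, local_rank: int) -> nn.Module:
    """Only move the model and wrap it with DDP when running multi-GPU."""
    model = model.to(device)
    if world_size > 1:
        model = nn.parallel.DistributedDataParallel(model, device_ids=[local_rank])
    return model
```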

@dataclass(slots=True)
class Constraints(ABC):
"""
Needs to be implemented in all object models that inherit from BaseConstraints.
Collaborator


I would maybe add a slightly extended docstring here. The idea is that any Model that inherits from BaseConstraints also has a .constraints attribute that is of type Constraints, right?
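
For example, the docstring could be extended along these lines; the wording is only a proposal, stating the .constraints contract asked about above:

```python
from abc import ABC
from dataclasses import dataclass


@dataclass(slots=True)
class Constraints(ABC):
    """
    Constraint settings for an object model.

    Any model that inherits from BaseConstraints is expected to expose a
    .constraints attribute of type Constraints (or a subclass), which is
    consumed when soft/hard constraints are applied during reconstruction.
    """
```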

Comment on lines +63 to +64
    def soft_constraint_losses(self) -> list[float]:
        return np.array(self._soft_constraint_losses)
Collaborator


type hinting of this is inconsistent
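
Concretely, one way to make it consistent (a sketch only; either hint the ndarray that is actually returned, or return the plain list the current hint promises):

```python
import numpy as np


class Example:
    """Minimal stand-in class to show the corrected hint."""

    def __init__(self) -> None:
        self._soft_constraint_losses: list[float] = []

    def soft_constraint_losses(self) -> np.ndarray:
        # Hint now matches the array that is actually returned;
        # alternatively keep -> list[float] and return a plain list.
        return np.asarray(self._soft_constraint_losses)
```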
