Add toggle instructions on widget
Also updated plugin preview version and napari-hub description page
gschlafly committed May 11, 2023
1 parent 5395c3d commit 29deddb
Showing 4 changed files with 98 additions and 30 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/plugin_preview.yml
Original file line number Diff line number Diff line change
@@ -16,6 +16,6 @@ jobs:
uses: actions/checkout@v2

- name: napari hub Preview Page Builder
uses: chanzuckerberg/napari-hub-preview-action@v0.1.5
uses: chanzuckerberg/napari-hub-preview-action@v0.1.6
with:
hub-ref: main
91 changes: 74 additions & 17 deletions .napari-hub/DESCRIPTION.md
@@ -90,11 +90,14 @@ during conference talks and other public facing events. If you'd like to be cited in
a particular format, or have a DOI you'd like used, you should provide that information here. -->


Deconvolves a 4D light field image into a full 3D focal stack reconstruction
Deconvolves a 4D light field image into a full 3D focus stack reconstruction

https://user-images.githubusercontent.com/23206511/180571940-9500dd19-119b-4d0d-8b33-5ab1705e9b6f.mov
https://user-images.githubusercontent.com/23206511/236919283-d53ca97a-9bdd-4598-b553-34996f688237.mp4

napari-LF provides three basic processes to Calibrate, Rectify, and Deconvolve light field images:
napari-LF contains analytic and neural net analysis methods for light field images. To download example light field images, see our repository [napari-LF-docs-samples](https://github.com/PolarizedLightFieldMicroscopy/napari-LF-docs-samples).

### LF Analyze
**LF Analyze**, the analytic method, provides three basic processes to Calibrate, Rectify, and Deconvolve light field images:

The **Calibrate** process generates a calibration file that represents the optical setup that was used to record the light field images. The same calibration file can be used to rectify and deconvolve all light field images that were recorded with the same optical setup, usually the same microscope and light field camera. The Calibrate process requires as input the radiometry frame, dark frame, optical parameters, and volume parameters to generate the calibration file, which is subsequently used to rectify and deconvolve related light field images. The calibration file includes a point spread function (PSF) derived from the optical and volume parameters and is stored in HDF5 file format.
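For reference, the stored light field geometry can be inspected with `h5py`. The `geometry` group and its `nu`/`nv`/`ns`/`nt` attributes below are taken from the plugin's widget source; the rest of the file layout (e.g. how the PSF is stored) is not shown here, so treat this as a minimal sketch:

```python
# Sketch: read the light field geometry from a napari-LF calibration file.
# The "geometry" group and nu/nv/ns/nt attribute names come from the plugin
# source; other contents of the .lfc file may differ.
import h5py

def read_lf_geometry(calib_path):
    """Return the light field geometry [nu, nv, ns, nt] from a .lfc file."""
    with h5py.File(calib_path, "r") as f:
        geom = f["geometry"]
        return [geom.attrs["nu"], geom.attrs["nv"],
                geom.attrs["ns"], geom.attrs["nt"]]
```
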

@@ -104,28 +107,82 @@ The **Deconvolve** process uses the PSF and a wave optics model to iteratively d

The **Parameter** panels, located in the lower half of the napari-LF widget, allow the user to specify settings for the reconstruction process. Once the appropriate parameters are selected, the Calibrate button followed by the Deconvolve button can be pushed to complete the reconstruction.

### Neural Net
**Neural Net** provides a method of applying a trained neural net model to deconvolve a light field image. Based on PyTorch Lightning and a provided [base class](https://github.com/PolarizedLightFieldMicroscopy/napari-LF/blob/main/src/napari_lf/lfa/neural_nets/LFNeuralNetworkProto.py), you can either create your own network or use one of the pre-shipped networks (LFMNet, VCDNet, ...).
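As a rough illustration of the kind of network involved, here is a minimal PyTorch sketch. It uses a plain `torch.nn.Module` with made-up layer sizes; a real napari-LF network would instead subclass the Lightning-based `LFNeuralNetworkProto` mentioned above, whose exact interface is not reproduced here:

```python
import torch
import torch.nn as nn

# Hypothetical minimal deconvolution network. In napari-LF you would
# subclass LFNeuralNetworkProto (a PyTorch Lightning module) rather than
# nn.Module; the layer sizes below are illustrative only.
class TinyLFNet(nn.Module):
    def __init__(self, n_depths=8):
        super().__init__()
        # Map a 2D light field image to an n_depths-plane focal stack.
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, n_depths, kernel_size=3, padding=1),
        )

    def forward(self, lf_image):
        # lf_image: (batch, 1, H, W) -> volume: (batch, n_depths, H, W)
        return self.net(lf_image)
```
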

## Quickstart
1. Install the napari-LF plugin into your napari environment, as described below under **Installation**.
1. From the napari Plugins menu, select the napari-LF plugin to install its widget into the napari viewer
1. From the napari Plugins menu, select the napari-LF plugin to install its widget into the napari viewer.
### LF Analyze
1. Near the top of the widget, select your project folder containing the following images: light field, radiometry, and dark frame.
1. Write the name of the metadata file you want for recording your reconstruction settings, e.g. metadata.txt. This file will be updated each time a calibration process is started.
1. Calibration
- In the parameters panel, navigate to **Calibrate, Required** (top tab **Calibrate**, bottom tab **Required**), which is the default selection.
- Select **radiometry** and **dark frame** images from pull down menus.
- Write the name of the **calibration file** you would like to produce, e.g. calibration.lfc.
- Enter the appropriate **optical parameters** according to your microscope and sample material.
- Enter the **volume parameters** you would like for your 3D reconstruction.
- Push the `Calibrate` button.
1. In the processing panel, navigate to **Calibrate, Required** (top tab **Calibrate**, bottom tab **Required**), which is the default selection.
1. Select **radiometry** and **dark frame** images from pull down menus.
1. Write the name of the **calibration file** you would like to produce, e.g. calibration.lfc.
1. Enter the appropriate **optical parameters** according to your microscope and sample material.
1. Enter the **volume parameters** you would like for your 3D reconstruction.
1. Push the `Calibrate` button.
1. Deconvolution
- In the parameters panel, navigate to **Deconvolve, Required**.
- Select **light field** image and **calibration file** from pull down menus.
- Write the name of the **output image stack** you would like to produce, e.g. output_stack.tif.
- Push the `Deconvolve` button.
3D focal stack reconstruction will display in the napari viewer and be saved in your original project folder.
1. In the processing panel, navigate to **Deconvolve, Required**.
1. Select **light field** image and **calibration file** from pull down menus.
1. Write the name of the **output image stack** you would like to produce, e.g. output_stack.tif.
1. Push the `Deconvolve` button.
The 3D focal stack reconstruction will display in the napari viewer and be saved in your original project folder.
### Neural Net
1. Click on the **LF Analyze** logo to toggle to the **Neural Net** mode.
1. Near the top of the widget, select your project folder containing the light field image and the trained neural net. If you do not already have a trained model, you can train a model using this [Jupyter notebook](https://github.com/PolarizedLightFieldMicroscopy/napari-LF/blob/main/src/napari_lf/lfa/main_train_neural_net.ipynb).
1. In the processing panel, select your **light field image** and **neural net model**.
1. Write the name of the **output image stack** you would like to produce, e.g. output_stack.tif.
1. Push the `Deconvolve` button.
The 3D focal stack reconstruction will display in the napari viewer and be saved in your original project folder.

## Getting Help
For details about each parameter, hover over each parameter textbox to read the tooltip description.
For additional information about the reconstruction process, see our [User Guide](https://github.com/PolarizedLightFieldMicroscopy/napari-LF/blob/description/docs/napari-LF_UserGuide_5July2022.docx) along with our general documentation on [GitHub](https://github.com/PolarizedLightFieldMicroscopy/napari-LF).
For additional information about the reconstruction process, see our [User Guide](docs/napari-LF_UserGuide_5July2022.docx).

## Installation

After you have [napari] installed, you can use one of the methods below to install `napari-LF`.

Method 1: You can install `napari-LF` via [pip]:

    pip install napari-LF

Method 2: Use the napari plugin menu.

1. Open napari from the command line:

        napari

1. From the napari menu, select **Plugins > Install/uninstall Packages**.

1. Either (a) scroll through the list of available plugins to find `napari-LF`, or (b) drag and drop a downloaded `napari-LF` directory into the bottom bar.

1. Select **Install** to install the light field plugin.

Method 3: Install the latest development version from the command line.

    pip install git+https://github.com/PolarizedLightFieldMicroscopy/napari-LF.git

Lastly, to access the installed plugin, open napari from the command line:

    napari

From the napari menu, select **Plugins > Main Menu (napari-LF)**. Note that you may need to close and reopen napari for `napari-LF` to appear.

### Installation for developers

Create a virtual environment from the command line for napari with the python libraries necessary for the light field plugin:

    conda create --name napari-lf python==3.9
    conda activate napari-lf

Clone the github repository:

    conda install git
    git clone https://github.com/PolarizedLightFieldMicroscopy/napari-LF.git
    cd napari-LF
    pip install -e .

## Contributing

22 changes: 11 additions & 11 deletions src/napari_lf/_widgetLF.py
@@ -223,17 +223,17 @@ def input_model_change_call():

# Define input shape, and extract it either from a calib file or the stored checkpoint
LFshape = None
# Load calib file
if self.gui.gui_elms["lfmnet"]["calibration_file"].value == None:
pass
else:
calibFile_path = str(os.path.join(str(self.gui.gui_elms["main"]["img_folder"].value), self.gui.gui_elms["lfmnet"]["calibration_file"].value))
path = Path(calibFile_path)
if path.is_file():
import h5py
with h5py.File(calibFile_path, "r") as f:
lf = f['geometry']
LFshape = [lf.attrs['nu'], lf.attrs['nv'], lf.attrs['ns'], lf.attrs['nt']]
# # Load calib file
# if self.gui.gui_elms["lfmnet"]["calibration_file"].value == None:
# pass
# else:
# calibFile_path = str(os.path.join(str(self.gui.gui_elms["main"]["img_folder"].value), self.gui.gui_elms["lfmnet"]["calibration_file"].value))
# path = Path(calibFile_path)
# if path.is_file():
# import h5py
# with h5py.File(calibFile_path, "r") as f:
# lf = f['geometry']
# LFshape = [lf.attrs['nu'], lf.attrs['nv'], lf.attrs['ns'], lf.attrs['nt']]

if self.gui.gui_elms["lfmnet"]["input_model"].value == None:
return
13 changes: 12 additions & 1 deletion src/napari_lf/_widgetLF_gui.py
@@ -189,7 +189,17 @@ def __init__(self):
self.NeuralNet_btn.setSizePolicy(QSizePolicy.Fixed, QSizePolicy.Fixed)
self.NeuralNet_btn.clicked.connect(self.NeuralNet_btn_call)
self.NeuralNet_btn_cont.hide()



_toggle_instructions = QWidget()
mylabel = QLabel()
mylabel.setText("Click the button below to toggle between analysis methods.")
mylabel.setAlignment(Qt.AlignCenter)
vbox = QVBoxLayout()
vbox.addWidget(mylabel)
_toggle_instructions.setLayout(vbox)
_toggle_instructions.layout().setContentsMargins(0,0,0,0)

_processing_methods = QWidget()
hBoxLayout = QHBoxLayout()
hBoxLayout.addWidget(self.LFAnalyze_btn_cont.native)
@@ -207,6 +217,7 @@ def __init__(self):
_widget_data.setLayout(_QFormLayoutData)
_QFormLayoutData.addRow(self.gui_elms["main"]["img_folder"].label, self.gui_elms["main"]["img_folder"].native)
_QFormLayoutData.addRow(_cont_img_list_btn.native)
_QFormLayoutData.addRow(_toggle_instructions)
_QFormLayoutData.addRow(_processing_methods)
if self.gui_elms["main"]["metadata_file"].visible:
_QFormLayoutData.addRow(self.gui_elms["main"]["metadata_file"].label, self.gui_elms["main"]["metadata_file"].native)
