diff --git a/BraTS-Toolkit/glioma_segmentation_with_BraTS_Toolkit.ipynb b/BraTS-Toolkit/glioma_segmentation_with_BraTS_Toolkit.ipynb
index 6f58fb8..2d8df8c 100644
--- a/BraTS-Toolkit/glioma_segmentation_with_BraTS_Toolkit.ipynb
+++ b/BraTS-Toolkit/glioma_segmentation_with_BraTS_Toolkit.ipynb
@@ -6,7 +6,7 @@
    "source": [
     "# Warning\n",
     "\n",
-    "This tutorial is a work in progress, it does not cover all the functionalities yet. However, it will give you a good overview how you can generate tumor segmentations with BraTS Toolkit.\n"
+    "This tutorial is a work in progress; it does not cover all functionalities yet. However, it will give you a good overview of how you can generate tumor segmentations with BraTS Toolkit.\n"
    ]
   },
   {
@@ -28,13 +28,13 @@
     "2. **Import** of raw data (cMRI)\n",
     " \n",
     " \n",
-    "3. **Preprocessing via BrainLes preprocessig package** instead of vanilla preprocessing pipeline from BraTS Toolkit. BraTS challenge algorithms expect preprocessed files (co-registered, skullstripped in SRI-24 space)\n",
+    "3. **Preprocessing via the BrainLes preprocessing package** instead of the vanilla preprocessing pipeline from BraTS Toolkit. BraTS challenge algorithms expect preprocessed files (co-registered, skull-stripped, in SRI-24 space)\n",
     " \n",
     " \n",
     "4. **Segmentation with BraTS Toolkit** \n",
     " \n",
     " \n",
-    "5. **Fusion** of generated segementations \n",
+    "5. **Fusion** of generated segmentations \n",
     "\n"
    ]
   },
@@ -55,10 +55,10 @@
     "optional (but recommended):\n",
     "\n",
     "- CUDA drivers\n",
-    "- a GPU that is supported (each algorithms supports different set of GPUs)\n",
+    "- a GPU that is supported (each algorithm supports a different set of GPUs)\n",
     "\n",
     "\n",
-    "Run the subsequent cell to check CUDA availability, your python and docker version. "
+    "Run the subsequent cell to check CUDA availability and your Python and Docker versions. "
    ]
   },
   {
@@ -206,14 +206,14 @@
    "source": [
     "## 2. Import raw input data\n",
     "\n",
-    "Raw imput data require\n",
+    "Raw input data require:\n",
     "- exam with skull and 4 sequences (T1, T1c, T2, T2-FLAIR), only one file each (in total 4 files) \n",
     " - file names must end with \"*t1.nii.gz\", \"*t1c.nii.gz\", \"*t2.nii.gz\" and \"*fla.nii.gz\"\n",
     " - [Nifti \".nii.gz\"](https://brainder.org/2012/09/23/the-nifti-file-format/) file type \n",
     "- each exam (4 files) needs to be places in an individual folder within the data folder \n",
     "```/BraTS-Toolkit/data/```. \n",
     "\n",
     "The structure is shown using two example exams:\n",
     "```\n",
     "BraTS-Toolkit\n",
     "├── data\n",
@@ -297,7 +297,7 @@
    "source": [
     "## 3. Preprocessing\n",
     "\n",
-    "BraTS challenge algorithms expect co-registered, skullstripped files in SRI-24 space, to achieve this preprocessing is required.\n",
+    "BraTS challenge algorithms expect co-registered, skull-stripped files in SRI-24 space; to achieve this, preprocessing is required.\n",
     "Instead of using the vanilla preprocessing pipeline from BraTS Toolkit, we recommend using the new [BrainLes preprocessing package](https://github.com/BrainLesion/preprocessing/tree/main/brainles_preprocessing).\n"
    ]
   },
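
A hedged aside, not part of the patch above: the notebook text only says to run "the subsequent cell" to check CUDA availability and the Python and Docker versions, and that cell is not shown in these hunks. The sketch below is one way such a check could look; it assumes PyTorch is an optional way to query CUDA and that the Docker CLI is on the PATH, neither of which the diff itself confirms.

```python
# Illustrative sketch only (not taken from the notebook): report Python,
# CUDA and Docker availability. PyTorch and the docker CLI are assumptions.
import shutil
import subprocess
import sys

print(f"Python: {sys.version.split()[0]}")

try:
    import torch  # optional; only used here to query CUDA
    print(f"CUDA available: {torch.cuda.is_available()}")
except ImportError:
    print("PyTorch not installed; skipping CUDA check (try running nvidia-smi)")

if shutil.which("docker"):
    result = subprocess.run(["docker", "--version"], capture_output=True, text=True)
    print(result.stdout.strip())
else:
    print("Docker CLI not found on PATH")
```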
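
Equally hedged and also not part of the diff: a small stdlib-only sketch that validates the exam layout described in the "## 2. Import raw input data" hunk (one folder per exam under `BraTS-Toolkit/data/`, exactly one file per required suffix). The `data_dir` value is a placeholder; adjust it to your checkout.

```python
# Illustrative sketch only: check that every exam folder contains exactly one
# file per required suffix (t1, t1c, t2, fla). The path below is a placeholder.
from pathlib import Path

REQUIRED_SUFFIXES = ("t1.nii.gz", "t1c.nii.gz", "t2.nii.gz", "fla.nii.gz")
data_dir = Path("BraTS-Toolkit/data")  # adjust to where you cloned the repo

for exam_dir in sorted(p for p in data_dir.iterdir() if p.is_dir()):
    for suffix in REQUIRED_SUFFIXES:
        matches = [f for f in exam_dir.iterdir() if f.name.endswith(suffix)]
        if len(matches) != 1:
            print(f"{exam_dir.name}: expected one *{suffix}, found {len(matches)}")
```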