diff --git a/workshops/mne_course/.gitkeep b/workshops/mne_course/.gitkeep new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/workshops/mne_course/.gitkeep @@ -0,0 +1 @@ + diff --git a/workshops/mne_course/Day 1/1 - Raw.ipynb b/workshops/mne_course/Day 1/1 - Raw.ipynb new file mode 100644 index 0000000..dc165e2 --- /dev/null +++ b/workshops/mne_course/Day 1/1 - Raw.ipynb @@ -0,0 +1,722 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "id": "a7f32110-e046-4c35-94cd-dfc8aeda9c9d", + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne\n", + "import numpy as np" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Timeseries data in MNE - the `Raw` class\n", + "\n", + "The most basic form of electrophysiological data is timeseries data: a continuous set of voltage values recorded over time for each [channel](https://mne.tools/stable/documentation/glossary.html#term-channels).\n", + "\n", + "Timeseries data in MNE is stored in [`mne.io.Raw`](https://mne.tools/stable/generated/mne.io.Raw.html) and [`mne.io.RawArray`](https://mne.tools/stable/generated/mne.io.RawArray.html) objects.\n", + "\n", + "`Raw` objects can be created through loading data from the disk via one of the various [`mne.io.read_raw_xxx()`](https://mne.tools/stable/api/reading_raw_data.html) functions.\n", + "\n", + "`RawArray` objects can be created from data [arrays](https://numpy.org/doc/stable/reference/arrays.html) directly." + ] + }, + { + "cell_type": "markdown", + "id": "08faf2c6", + "metadata": {}, + "source": [ + "### Part 1 - Reading timeseries data from disk\n", + "\n", + "To familiarise ourselves with `Raw` objects, we will start by loading [MNE's sample dataset](https://mne.tools/stable/documentation/datasets.html#sample)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1be1ad46", + "metadata": {}, + "outputs": [], + "source": [ + "# Filepath to MNE's sample dataset on disk\n", + "sample_data_folder = mne.datasets.sample.data_path()\n", + "\n", + "# Load sample data from disk\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(sample_data_folder, \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "9ff126be", + "metadata": {}, + "source": [ + "`Raw` objects contain:\n", + "- the timeseries data\n", + "- the metadata, stored under the `info` attribute" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f453cf46", + "metadata": {}, + "outputs": [], + "source": [ + "# Show information about the data object\n", + "raw.info" + ] + }, + { + "cell_type": "markdown", + "id": "e8586edd", + "metadata": {}, + "source": [ + "For example, here you can see that the data we have loaded contains a mixture of MEG data (gradiometers and magnetometers) and EEG data sampled at 600 Hz, as well as the timings of stimulus presentation and subject behaviour during the recording ([stimulus channels](https://mne.tools/stable/documentation/glossary.html#term-stim-channel))." + ] + }, + { + "cell_type": "markdown", + "id": "bcb90bb1", + "metadata": {}, + "source": [ + "The data itself can be accessed via the [`get_data()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.get_data) method, which returns an array of shape `(channels, times)`." 
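+ "\n", + "Beyond the basic call shown in the next cell, `get_data()` can also return just a subset of channels together with the matching time vector. A minimal sketch (using the `raw` object loaded above):\n", + "\n", + "```python\n", + "# Restrict to the EEG channels and also return the sample times (in seconds)\n", + "eeg_data, times = raw.get_data(picks=\"eeg\", return_times=True)\n", + "print(eeg_data.shape)  # (n_eeg_channels, n_times)\n", + "```\n"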
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7968a4cf", + "metadata": {}, + "outputs": [], + "source": [ + "# Get data as an array\n", + "data = raw.get_data()\n", + "print(f\"Data has shape: {data.shape} (channels, times)\")" + ] + }, + { + "cell_type": "markdown", + "id": "8bc07090", + "metadata": {}, + "source": [ + "`Raw` objects have various methods for working with timeseries data, such as:\n", + "- isolating specific channels - [`pick()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.pick), [`drop_channels()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.drop_channels)\n", + "- isolating specific windows of time - [`crop()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.crop)\n", + "- spectral filtering of the data - [`filter()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.filter), [`notch_filter()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.notch_filter)\n", + "- plotting the data - [`plot()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.plot)\n", + "- computing the power spectra of the data - [`compute_psd()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.compute_psd)\n", + "\n", + "We will explore some of these methods below, and others in later notebooks." + ] + }, + { + "cell_type": "markdown", + "id": "050840af", + "metadata": {}, + "source": [ + "**Exercises - Manipulating `Raw` objects**\n", + "\n", + "We will start by selecting only a subset of channels to store in our `Raw` object, using the [`pick()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.pick) method.\n", + "\n", + "`pick()` accepts channel names, channel types, or channel indices as input, and retains only those channels that match this criteria.\n", + "\n", + "Below, we select only the MEG channels.\n", + "\n", + "*Hint:* Generally in MNE, methods will modify the object in-place to save memory. Because we want to play around with the data without modifying the original object, we will first make a copy of the `Raw` object using the [`copy()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.copy) method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "dc53d8f0", + "metadata": {}, + "outputs": [], + "source": [ + "# Create a copy of the data so that we can manipulate it\n", + "raw_copy = raw.copy()\n", + "\n", + "# Select the MEG channels only\n", + "raw_copy.pick(\"meg\")\n", + "\n", + "# Verify that we have only MEG channels\n", + "print(raw_copy.get_data().shape)\n", + "raw_copy.info" + ] + }, + { + "cell_type": "markdown", + "id": "763f5cb0", + "metadata": {}, + "source": [ + "As you can see, the new `Raw` object now has only 306 channels, corresponding to the 203 Gradiometers and 102 Magnetometers (as well as 1 'bad' channel where the data was not properly recorded)." + ] + }, + { + "cell_type": "markdown", + "id": "7b513315", + "metadata": {}, + "source": [ + "**Exercise:** Select only the EEG channels from the original `Raw` object, and verify that only these channels remain." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ba1a38be", + "metadata": {}, + "outputs": [], + "source": [ + "# Remember to copy the original data!\n", + "raw_copy = raw.copy()\n", + "\n", + "## CODE GOES HERE\n", + "raw_copy.pick(\"eeg\")\n", + "print(raw_copy.get_data().shape)\n", + "raw_copy.info" + ] + }, + { + "cell_type": "markdown", + "id": "6ffc94b9", + "metadata": {}, + "source": [ + "**Exercise:** Now select the EEG and MEG channels simultaneously, and verify that only these channels remain." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2b7c6b0e", + "metadata": {}, + "outputs": [], + "source": [ + "# Remember to copy the original data\n", + "raw_copy = raw.copy()\n", + "\n", + "## CODE GOES HERE\n", + "raw_copy.pick([\"eeg\", \"meg\"])\n", + "print(raw_copy.get_data().shape)\n", + "raw_copy.info" + ] + }, + { + "cell_type": "markdown", + "id": "84739775", + "metadata": {}, + "source": [ + "\n", + "**Exercise:** Select the EEG and stimulus channels simultaneously, and verify that only these channels remain." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "59e0cc90", + "metadata": {}, + "outputs": [], + "source": [ + "raw_copy = raw.copy()\n", + "\n", + "## CODE GOES HERE\n", + "raw_copy.pick(\"stim\")\n", + "print(raw_copy.get_data().shape)\n", + "raw_copy.info" + ] + }, + { + "cell_type": "markdown", + "id": "57ff82a2", + "metadata": {}, + "source": [ + "**Exercise:** Select three channels of your choice by specifying their names, and verify that only these channels remain.\n", + "\n", + "*Hint:* channel names are stored in the `ch_names` attribute of the `Raw` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2fc3b959", + "metadata": {}, + "outputs": [], + "source": [ + "raw_copy = raw.copy()\n", + "\n", + "## CODE GOES HERE\n", + "raw_copy.pick(raw_copy.ch_names[:3])\n", + "print(raw_copy.get_data().shape)\n", + "print(raw_copy.ch_names)\n", + "raw_copy.info" + ] + }, + { + "cell_type": "markdown", + "id": "4519d91e", + "metadata": {}, + "source": [ + "To proceed, we will select only the EEG data from the original `Raw` object.\n", + "\n", + "Having isolated the EEG data, we can visualise it using the [`plot()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.plot) method.\n", + "\n", + "Navigate through the different channels using the up and down arrow keys, and navigate through the timepoints using the left and right arrow keys.\n", + "\n", + "The home key reduces the time window displayed, and the end key increases the time window displayed." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e2b1a4fa", + "metadata": {}, + "outputs": [], + "source": [ + "# Select EEG data\n", + "raw_eeg = raw.copy().pick(\"eeg\")\n", + "\n", + "# Plot EEG data\n", + "raw_eeg.plot(scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "id": "2f29a9af", + "metadata": {}, + "source": [ + "We can also plot the locations of the EEG sensors using the [`plot_sensors()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.plot_sensors) method.\n", + "\n", + "Why is one of the channels red?" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0e36e7d3", + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "# Plot EEG sensors\n", + "raw_eeg.plot_sensors(show_names=True);\n", + "\n", + "%matplotlib widget" + ] + }, + { + "cell_type": "markdown", + "id": "fbe0e7f4", + "metadata": {}, + "source": [ + "If you go to the end of the recording, you will see that is has a duration of ~280 seconds.\n", + "\n", + "You can find the exact end time using the `times` attribute of the `Raw` object, which contains the times of each sample in the data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2904d6e2", + "metadata": {}, + "outputs": [], + "source": [ + "# Show how many timepoints are in the data\n", + "print(f\"{raw_eeg.times.shape[0]} timepoints in the data\")\n", + "\n", + "# Take the last timepoint to show the duration of the data\n", + "print(f\"{raw_eeg.times[-1] :.0f} seconds of data\")" + ] + }, + { + "cell_type": "markdown", + "id": "c049907a", + "metadata": {}, + "source": [ + "Just like we were able to select only a specific set of channels, we can also select only a specific window of time using the [`crop()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.crop) method.\n", + "\n", + "Below, we omit the first 10 seconds of the recording (as for `pick()`, this modifies the object in-place)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a1ad3dd0", + "metadata": {}, + "outputs": [], + "source": [ + "# Units of time should be in seconds\n", + "raw_eeg.crop(tmin=10, tmax=None)" + ] + }, + { + "cell_type": "markdown", + "id": "e2ded8c3", + "metadata": {}, + "source": [ + "**Exercise:** Verify that the duration of the recording has been reduced by 10 seconds by plotting the data or using the number of timepoints." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f6618fed", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "print(f\"{raw_eeg.times[-1] :.0f} seconds of data\")" + ] + }, + { + "cell_type": "markdown", + "id": "a6ffd302", + "metadata": {}, + "source": [ + "**Exercise:** Omit the last 10 seconds of the same object, and verify that the duration of the recording has been reduced by a further 10 seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "84772f27", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "duration = raw_eeg.times[-1]\n", + "raw_eeg.crop(tmin=0, tmax=duration - 10)\n", + "print(f\"{raw_eeg.times[-1] :.0f} seconds of data\")" + ] + }, + { + "cell_type": "markdown", + "id": "9d565cad", + "metadata": {}, + "source": [ + "**Exercise:** Select only the time between 30 and 60 seconds, and verify that the duration of the recording is now 30 seconds." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e496b76c", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_eeg.crop(tmin=30, tmax=60)\n", + "print(f\"{raw_eeg.times[-1] :.0f} seconds of data\")\n", + "raw_eeg.plot();" + ] + }, + { + "cell_type": "markdown", + "id": "d66e14b8", + "metadata": {}, + "source": [ + "With this brief overview of `Raw` objects, we have seen how timeseries data can loaded and manipulated in MNE.\n", + "\n", + "Generally, timeseries data is loaded from disk using one of the many `mne.read_raw_xxx()` functions tailored to specific data formats, like we have done above.\n", + "\n", + "See also the [MNE-BIDS](https://mne.tools/mne-bids/stable/index.html) package for loading data in the BIDS format.\n", + "\n", + "However, it is sometimes also useful to create `Raw` objects from arrays directly, which we will explore below." + ] + }, + { + "cell_type": "markdown", + "id": "6db6258f", + "metadata": {}, + "source": [ + "### Part 2 - Creating `RawArray` objects from data arrays\n", + "\n", + "Rather than using `Raw` objects to store data from arrays, we must instead use the [`RawArray`](https://mne.tools/stable/generated/mne.io.RawArray.html) class.\n", + "\n", + "But first, this requires having some data to store!\n", + "\n", + "Below we randomly generate some data consisting of 3 channels and 1,000 timepoints. Remember, MNE expects timeseries data to have shape `(channels, times)`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0ba02d9e", + "metadata": {}, + "outputs": [], + "source": [ + "# Define parameters for generating data\n", + "n_channels = 3\n", + "n_times = 1000 # samples\n", + "np.random.seed(44) # set seed for consistency\n", + "\n", + "# Generate the data\n", + "data = np.random.randn(n_channels, n_times)\n", + "print(f\"Data has shape: {data.shape} (channels, times)\")" + ] + }, + { + "cell_type": "markdown", + "id": "cddeb137", + "metadata": {}, + "source": [ + "If we want to store this data in a `RawArray` object, we need to also specify the metadata, so that MNE can keep track of what the data represents.\n", + "\n", + "This information is stored as an [`mne.Info`](https://mne.tools/stable/generated/mne.Info.html) object, which we create using the [`mne.create_info()`](https://mne.tools/stable/generated/mne.create_info.html) function.\n", + "\n", + "For this, we need to specify:\n", + "- the names of the channels - `ch_names` parameter\n", + "- the types of the channels - `ch_types` parameter\n", + "- the sampling frequency - `sfreq` parameter" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b2faff92", + "metadata": {}, + "outputs": [], + "source": [ + "# Create the data information\n", + "info = mne.create_info(ch_names=[\"CH_1\", \"CH_2\", \"CH_3\"], ch_types=\"eeg\", sfreq=100)\n", + "\n", + "# Show what is stored in the information object\n", + "print(f\"Channel names: {info['ch_names']}\")\n", + "info" + ] + }, + { + "cell_type": "markdown", + "id": "2129fa5e", + "metadata": {}, + "source": [ + "As you can see, we have created an `Info` object for 3 EEG channels. We arbitrarily set the sampling frequency to 100 Hz (with 1,000 samples, this corresponds to 10 seconds of data)." 
+ ] + }, + { + "cell_type": "markdown", + "id": "c6a92807", + "metadata": {}, + "source": [ + "**Exercises - Creating `RawArray` objects from arrays**\n", + "\n", + "We can also specify the types of each channel separately using the `ch_types` argument.\n", + "\n", + "**Exercise:** Create an `Info` object for 3 EEG channels, specifying the type of each channel separately." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "44e87423", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info = mne.create_info(ch_names=[\"CH_1\", \"CH_2\", \"CH_3\"], ch_types=[\"eeg\", \"eeg\", \"eeg\"], sfreq=100)\n", + "info" + ] + }, + { + "cell_type": "markdown", + "id": "a0cf65af", + "metadata": {}, + "source": [ + "**Exercise:** Create an `Info` object for 3 channels, where the first is EEG, the second a gradiometer, and the third a magnetometer." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7a91f535", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info = mne.create_info(\n", + " ch_names=[\"CH_1\", \"CH_2\", \"CH_3\"], ch_types=[\"eeg\", \"grad\", \"mag\"], sfreq=100\n", + ")\n", + "info" + ] + }, + { + "cell_type": "markdown", + "id": "ee132d5e", + "metadata": {}, + "source": [ + "As you may have noticed, specific bits of information can be accessed from the `Info` object in the same way you would access information from a [dictionary](https://docs.python.org/3/tutorial/datastructures.html#dictionaries).\n", + "\n", + "**Exercise:** Get the sampling frequency from the `Info` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "360aa365", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info[\"sfreq\"]" + ] + }, + { + "cell_type": "markdown", + "id": "e3880447", + "metadata": {}, + "source": [ + "**Exercise:** Get the channel names from the `Info` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ea627cb5", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info[\"ch_names\"]" + ] + }, + { + "cell_type": "markdown", + "id": "ba1f1300", + "metadata": {}, + "source": [ + "Using an `Info` object where all 3 channels are EEG, we can now create a `RawArray` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "81e53afb", + "metadata": {}, + "outputs": [], + "source": [ + "# Create data information\n", + "info = mne.create_info(ch_names=[\"CH_1\", \"CH_2\", \"CH_3\"], ch_types=\"eeg\", sfreq=100)\n", + "\n", + "# Store the data and information in a RawArray object\n", + "raw = mne.io.RawArray(data=data, info=info)\n", + "\n", + "# Show what is stored in the RawArray object\n", + "raw.info" + ] + }, + { + "cell_type": "markdown", + "id": "34f7bdc6", + "metadata": {}, + "source": [ + "Although the `RawArray` object is of a different class to the `Raw` object we were working with before, is still supports the same methods, e.g. [`pick()`](https://mne.tools/stable/generated/mne.io.RawArray.html#mne.io.RawArray.pick), [`crop()`](https://mne.tools/stable/generated/mne.io.RawArray.html#mne.io.RawArray.crop), [`plot()`](https://mne.tools/stable/generated/mne.io.RawArray.html#mne.io.RawArray.plot), etc...\n", + "\n", + "Here we again use [`plot()`](https://mne.tools/stable/generated/mne.io.RawArray.html#mne.io.RawArray.plot) to visualise the data." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7c390f91", + "metadata": {}, + "outputs": [], + "source": [ + "raw.plot(scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "id": "dc7a7a28", + "metadata": {}, + "source": [ + "The `Info` object we created is now stored under the `info` attribute of the `RawArray` object.\n", + "\n", + "**Exercise:** Get the sampling frequency from the `info` attribute of the `RawArray` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57261fa1", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.info[\"sfreq\"]" + ] + }, + { + "cell_type": "markdown", + "id": "48ba0a80", + "metadata": {}, + "source": [ + "**Exercise:** Get the channel names from the `info` attribute of the `RawArray` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c1aaeaad", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.info[\"ch_names\"]" + ] + }, + { + "cell_type": "markdown", + "id": "8e7bba4e", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "`Raw` class objects are one of the most heavily used items in the MNE package, being the way in which timeseries data is stored.\n", + "\n", + "They can be created easily from data stored on the disk (`mne.read_raw_xxx()` -> `Raw`), or from arrays (`array` -> `RawArray`).\n", + "\n", + "Here we have covered the very basic aspects of handling timeseries data in MNE. In later notebooks, we will explore the more advanced features of `Raw` objects, including filtering activity in particular frequency ranges, and computing the power spectral densities of data." + ] + }, + { + "cell_type": "markdown", + "id": "de913f10", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on `Raw` objects: https://mne.tools/stable/auto_tutorials/raw/10_raw_overview.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/workshops/mne_course/Day 1/2 - Epochs.ipynb b/workshops/mne_course/Day 1/2 - Epochs.ipynb new file mode 100644 index 0000000..da7f5f8 --- /dev/null +++ b/workshops/mne_course/Day 1/2 - Epochs.ipynb @@ -0,0 +1,938 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "id": "a7f32110-e046-4c35-94cd-dfc8aeda9c9d", + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne\n", + "import numpy as np" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Epoching timeseries data in MNE - the `Epochs` class\n", + "\n", + "For many analyses, it is useful to divide timeseries data into discrete chunks of time, called [epochs](https://mne.tools/stable/documentation/glossary.html#term-epochs).\n", + "\n", + "Epochs can take the form of individuals trials (e.g. isolating data around a given stimulus or behaviour), or divide continuous resting-state data into discrete chunks.\n", + "\n", + "Epochs are stored in MNE as [`mne.Epochs`](https://mne.tools/stable/generated/mne.Epochs.html) objects." 
+ ] + }, + { + "cell_type": "markdown", + "id": "aa1dc34a", + "metadata": {}, + "source": [ + "### Part 1 - Epoching data from events\n", + "\n", + "To explore how we can create epochs around events (e.g. stimulus presentation, behaviour), we will reload the example dataset and isolate the EEG data and stimulus channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "96f4425a", + "metadata": {}, + "outputs": [], + "source": [ + "# Load sample data from disk\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "\n", + "# Select only EEG and stimulus channels\n", + "raw.pick([\"eeg\", \"stim\"])\n", + "raw.info" + ] + }, + { + "cell_type": "markdown", + "id": "60f0dfc7", + "metadata": {}, + "source": [ + "Stimulus channels contain information about e.g. when stimuli were presented to subjects, when subjects performed an action, etc...\n", + "\n", + "We can use the `plot()` method to visualise how stimulus data is stored in the `Raw` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "704fbe44", + "metadata": {}, + "outputs": [], + "source": [ + "# Plot data of stimulus channels\n", + "raw.copy().pick(\"stim\").plot();" + ] + }, + { + "cell_type": "markdown", + "id": "e5c6252f", + "metadata": {}, + "source": [ + "The [`mne.find_events()`](https://mne.tools/stable/generated/mne.find_events.html) function can be used to convert this information into discrete timepoints based on changes in the signal.\n", + "\n", + "[Events](https://mne.tools/stable/documentation/glossary.html#term-events) are stored as an array of shape `(events, 3)`, where:\n", + "- the first column is the timepoint of the event (in samples)\n", + "- the second column is the previous type of the event \n", + "- the third column is the new type of the event\n", + "\n", + "An event ID of `0` corresponds to the absence of an event, and event IDs > `0` are stimuli/responses." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5fd3018a", + "metadata": {}, + "outputs": [], + "source": [ + "# Find the events from a given stimulus channel\n", + "events = mne.find_events(raw, stim_channel=\"STI 014\")\n", + "\n", + "# Print a subset of events\n", + "events[:5]" + ] + }, + { + "cell_type": "markdown", + "id": "731a5ada", + "metadata": {}, + "source": [ + "Using these events, we can now create an `Epochs` object.\n", + "\n", + "If we already have a `Raw` object, this is simply a case of passing the `Raw` object and the events array to the `Epochs` class.\n", + "\n", + "Here, we create epochs for all events with an ID > `0`, taking the data from 1 second before to 1 second after each event using the `tmin` and `tmax` parameters (times are relative to the timings of events)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "575fd461", + "metadata": {}, + "outputs": [], + "source": [ + "# Epoch timeseries data from event markers\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=1)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "90c809ed", + "metadata": {}, + "source": [ + "As you can see:\n", + "- this data has 320 events across all event types.\n", + "- we have selected data in the [-1, +1] second window around each event.\n", + "- each epoch was baseline-corrected using the time from the start of each epoch to the event itself (0 seconds)." 
+ ] + }, + { + "cell_type": "markdown", + "id": "2b390df1", + "metadata": {}, + "source": [ + "Similarly to `Raw` objects, we can visualise the data stored in `Epochs` objects using the [`plot()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.plot) method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "46a73297", + "metadata": {}, + "outputs": [], + "source": [ + "# Plot first 3 epochs in the data\n", + "epochs.plot(scalings=\"auto\", n_epochs=3);" + ] + }, + { + "cell_type": "markdown", + "id": "29c36081", + "metadata": {}, + "source": [ + "As for `Raw` objects, the data itself can be accessed using the [`get_data()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.get_data) method, which returns an array of shape `(epochs, channels, times)`.\n", + "\n", + "Having the epochs as the first dimension is convenient for iterating over the data of each epoch, e.g.\n", + "```python\n", + " data = epochs.get_data()\n", + " for epoch_data in data:\n", + " ### Do something with the data of a single epoch...\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c91eccbc", + "metadata": {}, + "outputs": [], + "source": [ + "# Get data as an array\n", + "data = epochs.get_data(copy=False)\n", + "print(f\"Data has shape: {data.shape} (epochs, channels, times)\")" + ] + }, + { + "cell_type": "markdown", + "id": "b5b64d1c", + "metadata": {}, + "source": [ + "**Exercises - Creating epochs around events**\n", + "\n", + "**Exercise:** Create epochs around all events in the window [-2, +2] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "78def82c", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-2, tmax=2)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "933227cb", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around all events in the window [-1, +3] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b8666842", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=3)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "1dfecefc", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around all events in the window [-0.5, +0.5] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d3fe6ce9", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-0.5, tmax=0.5)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "1f35f53e", + "metadata": {}, + "source": [ + "Baseline-correction of epochs involves taking the mean of a given data period and subtracting this value from each data point of the whole epoch.\n", + "\n", + "The `baseline` parameter of `Epochs` is used to control which period is used for baseline correction.\n", + "\n", + "The default when creating an `Epochs` object in MNE is to take the period from the start of the epoch to the event itself as the baseline period, specified as `baseline=(None, 0)`.\n", + "\n", + "Like for `tmin` and `tmax`, the times in `baseline` are relative to the events.\t" + ] + }, + { + "cell_type": "markdown", + "id": "cd87de7d", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around all events in the window [-1, +1], but only use the window [-0.5, 0] seconds as a baseline." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "442ede0e", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=1, baseline=(-0.5, 0))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "131574f5", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around all events in the window [-1, +2], but only use the window [0, +2] seconds as a baseline.\n", + "\n", + "There are two ways you can specify this, so try to include them both." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9c61e8c6", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=2, baseline=(0, 2))\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=2, baseline=(0, None))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "22251a6d", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around all events in the window [-1, +2], and use this whole period as a baseline.\n", + "\n", + "There are again two ways you can specify this, so try to include them both." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0b2f262e", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=2, baseline=(-1, 2))\n", + "epochs = mne.Epochs(raw=raw, events=events, tmin=-1, tmax=2, baseline=(None, None))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "b48e0647", + "metadata": {}, + "source": [ + "Above, we have been creating epochs around all events, however we may wish to only create epochs around a single type of event.\n", + "\n", + "Epochs for only particular event types can be specified using the `event_id` parameter of `Epochs`." + ] + }, + { + "cell_type": "markdown", + "id": "e465a10c", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around only the events with an ID of `1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7ce811db", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, event_id=1)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "db1b749f", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around only the events with an ID of `2`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "60bc6332", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, event_id=2)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "ac6f1ad7", + "metadata": {}, + "source": [ + "**Exercise:** Create epochs around only the events with IDs of `1`, `2`, and `3`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "13794689", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, event_id=[1, 2, 3])\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "1c576703", + "metadata": {}, + "source": [ + "As you can see, the `Epochs` object of MNE is a very convenient way to create epochs of a given duration, with a given baseline, around specific stimuli/behaviours.\n", + "\n", + "However, we can also create continuous epochs unrelated to any events, such as you would do for an analysis of resting-state data (i.e. no stimuli, no behaviour)." 
+ ] + }, + { + "cell_type": "markdown", + "id": "83fc2446", + "metadata": {}, + "source": [ + "### Part 2 - Creating continuous epochs of data\n", + "\n", + "Continuous epochs can be created easily from `Raw` object using the [`mne.make_fixed_length_epochs()`](https://mne.tools/stable/generated/mne.make_fixed_length_epochs.html) function.\n", + "\n", + "This creates an `Epochs` object with epochs of a specified duration directly from a `Raw` object." + ] + }, + { + "cell_type": "markdown", + "id": "dbdca124", + "metadata": {}, + "source": [ + "Here we create an `Epochs` object with 1-second-long epochs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0a577294", + "metadata": {}, + "outputs": [], + "source": [ + "# Create continuous epochs\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=1)\n", + "epochs.plot(scalings=\"auto\", n_epochs=3)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "2afbfb89", + "metadata": {}, + "source": [ + "**Exercises - Creating continuous epochs (specifying the duration)**\n", + "\n", + "**Exercise:** Create an `Epochs` object with 2-second-long epochs and verify the length of epochs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "37a03ae6", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=2)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "0b8d62a1", + "metadata": {}, + "source": [ + "**Exercise:** Create an `Epochs` object with 4-second-long epochs and verify the length of epochs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1aea4ac2", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=4)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "5b6f28fb", + "metadata": {}, + "source": [ + "Continuous epochs do not need to contain data for unique windows of data.\n", + "\n", + "We can artificially increase the amount of data available by having overlapping epochs. By default, there is no overlap between epochs.\n", + "\n", + "Here, we create 1-second-long epochs that have an overlap of 0.5 seconds (50% overlap).\n", + "\n", + "How does the number of epochs compare to that when there was no overlap?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4ed078cc", + "metadata": {}, + "outputs": [], + "source": [ + "# Create continuous epochs with overlap\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=1, overlap=0.5)\n", + "epochs.plot(scalings=\"auto\", n_epochs=3)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "7a4fd0c6", + "metadata": {}, + "source": [ + "**Exercises - Creating continuous epochs (specifying the overlap)**\n", + "\n", + "**Exercise:** Create 2-second-long epochs with 1 second overlap." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4f216e9f", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=2, overlap=1)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "d2345edb", + "metadata": {}, + "source": [ + "**Exercise:** Create 4-second-long epochs with 25% overlap." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "da4a54a3", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=4, overlap=1)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "4efc1790", + "metadata": {}, + "source": [ + "You may have noticed that when creating `Epochs` in this way, we do not specify the baseline settings.\n", + "\n", + "Inspecting the `Epochs` shows that `baseline` is set to `\"off\"`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "613dee61", + "metadata": {}, + "outputs": [], + "source": [ + "# Create continuous epochs\n", + "epochs = mne.make_fixed_length_epochs(raw=raw, duration=2)\n", + "\n", + "# Show information about the epochs\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "b407218a", + "metadata": {}, + "source": [ + "This is no problem, as we can use the [`apply_baseline()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.apply_baseline) method of the `Epochs` object to do this." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f27d0f8f", + "metadata": {}, + "outputs": [], + "source": [ + "# Baseline-correct the epochs\n", + "epochs.apply_baseline(baseline=(None, None))" + ] + }, + { + "cell_type": "markdown", + "id": "66b95a1d", + "metadata": {}, + "source": [ + "Above, we added a baseline based on the whole epoch duration.\n", + "\n", + "But what would happen if we set the baseline to the [-1, 0] second window?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "504d9761", + "metadata": {}, + "outputs": [], + "source": [ + "try:\n", + " epochs.apply_baseline(baseline=(-1, 0))\n", + "except ValueError as error:\n", + " print(f\"ValueError: {error}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7879bfc0", + "metadata": {}, + "source": [ + "We get an error!\n", + "\n", + "This is because `Epochs` created from `make_fixed_length_epochs()` always start at 0 seconds and end at the epoch's duration (in this case, 2 seconds)." + ] + }, + { + "cell_type": "markdown", + "id": "b15ff849", + "metadata": {}, + "source": [ + "**Exercises - Baseline-correcting `Epochs` object**\n", + "\n", + "**Exercise:** Apply a baseline to the first second of the `Epochs` object.\n", + "\n", + "There are two ways you can specify this, so try to include them both." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "770e37de", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.apply_baseline(baseline=(0, 1))\n", + "epochs.apply_baseline(baseline=(None, 1))" + ] + }, + { + "cell_type": "markdown", + "id": "c2cd80fb", + "metadata": {}, + "source": [ + "**Exercise:** Apply a baseline to the last second of the `Epochs` object.\n", + "\n", + "Again, there are two ways you can specify this, so try to include them both." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c385f1f8", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.apply_baseline(baseline=(1, epochs.times[-1]))\n", + "epochs.apply_baseline(baseline=(1, None))" + ] + }, + { + "cell_type": "markdown", + "id": "ff11f537", + "metadata": {}, + "source": [ + "As you can see, MNE has plenty of tools for creating epochs of data, either around event markers or as continuous segments of data." 
+ ] + }, + { + "cell_type": "markdown", + "id": "cc897763", + "metadata": {}, + "source": [ + "### Part 3 - Creating `Epochs` from arrays\n", + "\n", + "Just like for `Raw` objects, we can also create `Epochs` objects from data arrays. Specifically, we create [`mne.EpochsArray`](https://mne.tools/stable/generated/mne.EpochsArray.html) objects.\n", + "\n", + "Again, this requires that we provide some metadata so that MNE can keep track of what the data represents. This is also done as an [`Info`](https://mne.tools/stable/generated/mne.Info.html) object.\n", + "\n", + "Below, we randomly generate some data of 3 channels and 1,000 timepoints, then reshape this into 10 1-second-long epochs.\n", + "\n", + "Remember that MNE expects epoched data to have shape `(epochs, channels, times)`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0fc18a2a", + "metadata": {}, + "outputs": [], + "source": [ + "# Define parameters for generating data\n", + "n_channels = 3\n", + "n_epochs = 10\n", + "n_times = 1000 # samples\n", + "n_times_per_epoch = n_times // n_epochs # samples\n", + "np.random.seed(44) # set seed for consistency\n", + "\n", + "# Generate the data\n", + "data = np.random.randn(n_channels, n_times)\n", + "data = np.reshape(data, (n_channels, n_epochs, n_times_per_epoch))\n", + "data = data.transpose((1, 0, 2))\n", + "print(f\"Data has shape: {data.shape} (epochs, channels, times)\")" + ] + }, + { + "cell_type": "markdown", + "id": "4a7ad916", + "metadata": {}, + "source": [ + "**Exercises - Creating `Epochs` from arrays**\n", + "\n", + "**Exercise:** Create an `Info` object for this data using the [`mne.create_info()`](https://mne.tools/stable/generated/mne.create_info.html) function.\n", + "\n", + "Recall that we need to specify:\n", + "- the names of the channels - `ch_names` parameter\n", + "- the types of the channels - `ch_types` parameter\n", + "- the sampling frequency - `sfreq` parameter\n", + "\n", + "Create the `Info` object for the 3 channels with: names of your choice; of type EEG; and a sampling frequency of 100 Hz." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "22f74b1e", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info = mne.create_info(ch_names=[\"CH_1\", \"CH_2\", \"CH_3\"], ch_types=\"eeg\", sfreq=100)\n", + "info" + ] + }, + { + "cell_type": "markdown", + "id": "451bed6a", + "metadata": {}, + "source": [ + "We can then pass the data array and the `Info` object to the `EpochsArray` class." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3adf6e6a", + "metadata": {}, + "outputs": [], + "source": [ + "# Store the data and information in an EpochsArray object\n", + "epochs = mne.EpochsArray(data=data, info=info)\n", + "\n", + "# Show what is stored in the EpochsArray object\n", + "epochs.plot(scalings=\"auto\", n_epochs=3)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "c091ad0f", + "metadata": {}, + "source": [ + "Just like an `Epochs` object, we can specify the baseline correction to apply (default is no baseline correction)." + ] + }, + { + "cell_type": "markdown", + "id": "8aa75aec", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EpochsArray` object from the data, with baseline correction for the whole epoch duration." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "48b61bd0", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.EpochsArray(data=data, info=info, baseline=(None, None))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "ee3f1dfc", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EpochsArray` object from the data, with baseline correction for the first half of each epoch." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "42dcc9e1", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.EpochsArray(data=data, info=info, baseline=(None, 0.5))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "70868db1", + "metadata": {}, + "source": [ + "Like when we created continuous epochs, the first sample of each epoch is considered to be 0 seconds.\n", + "\n", + "We can change this when creating the `EpochsArray` object, similarly to how we specified `tmin` when creating `Epochs` objects from `Raw` objects." + ] + }, + { + "cell_type": "markdown", + "id": "8d19db0f", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EpochsArray` object from the data with times in the window [-1, 0] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d8091413", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.EpochsArray(data=data, info=info, tmin=-1)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "59115177", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EpochsArray` object from the data with times in the window [-0.5, +0.5] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3332c43b", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.EpochsArray(data=data, info=info, tmin=-0.5)\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "a67f096f", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EpochsArray` object from the data with times in the window [-0.5, +0.5] seconds, and baseline correct it for the first half of the epochs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "45bb4c77", + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.EpochsArray(data=data, info=info, tmin=-0.5, baseline=(None, 0))\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "id": "f988bb9a", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "Alongside `Raw` objects, `Epochs` objects are some of the most heavily used parts of MNE, storing segments of data around experimentally-relevent events, or fixed-length chunks of continuous data.\n", + "\n", + "They can be created from `Raw` objects (`Raw` -> `Epochs`), or from arrays (`array` -> `EpochsArray`).\n", + "\n", + "In the upcoming notebooks, we will continue to build on this foundation for working with epochs as we explore different forms of data analysis." 
+ ] + }, + { + "cell_type": "markdown", + "id": "0aefa91c", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on `Epochs` objects: https://mne.tools/stable/auto_tutorials/epochs/10_epochs_overview.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/workshops/mne_course/Day 2/1 - Exploring Data.ipynb b/workshops/mne_course/Day 2/1 - Exploring Data.ipynb new file mode 100644 index 0000000..78a7787 --- /dev/null +++ b/workshops/mne_course/Day 2/1 - Exploring Data.ipynb @@ -0,0 +1,458 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Exploring data with MNE\n", + "\n", + "The purpose of this exercise is to get you more comfortable with the MNE API, as well as Python programming more generally.\n", + "\n", + "Like yesterday, we will look at the MNE sample data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sample_data_folder = mne.datasets.sample.data_path()\n", + "sample_data_folder" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Open the folder and explore its contents." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Windows\n", + "!explorer {sample_data_folder}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Mac\n", + "!open {sample_data_folder}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Load the data as a `Raw` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw = mne.io.read_raw_fif(\n", + " os.path.join(sample_data_folder, \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the following tasks, use the MNE API as a resource for finding out how to use the following methods: https://mne.tools/stable/generated/mne.io.Raw.html" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Transforming and slicing the data\n", + "\n", + "**N.B.** You may want to create a copy of the data before modifying it." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.crop()`: take only the data between 0 and 180 seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().crop(tmin=0, tmax=180)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.pick()`: select only the EEG channels." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().pick(picks=\"eeg\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.pick()`: select only the MEG channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().pick(picks=[\"grad\", \"mag\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.pick()`: select the MEG and EOG channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().pick(picks=[\"grad\", \"mag\", \"eog\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.rename_channels()`: rename the channel \"EOG 061\" to \"blink_detector\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().rename_channels(mapping={\"EOG 061\": \"blink_detector\"})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.time_as_index()`: find the index of the sample in the data which occurs closest to 1 second." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.time_as_index(times=1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.time_as_index()`: find the index of the samples in the data which occur closest to 0, 3, and 6 seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.time_as_index(times=[0, 3, 6])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - slicing the data**\n", + "\n", + "There are various ways to extract data from `Raw` objects, as shown in this overview: https://mne.tools/stable/auto_tutorials/raw/10_raw_overview.html#summary-of-ways-to-extract-data-from-raw-objects\n", + "\n", + "Let's try some of them out!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Get the first 100 samples of data from the channel \"MEG 0113\"." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "data = raw.get_data(picks=\"MEG 0113\")[:, :100]\n", + "print(f\"Shape of data: {data.shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Get all data from the first 5 channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "data = raw.get_data()[:5]\n", + "print(f\"Shape of data: {data.shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Get all data from the channels \"EEG 030\" and \"EOG 061\"." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "data = raw.get_data(picks=[\"EEG 030\", \"EEG 031\"])\n", + "print(f\"Shape of data: {data.shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Get all the data as an array, as well as the times of each sample." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "data, times = raw[:]\n", + "data, times = raw.get_data(return_times=True)\n", + "print(f\"Shape of data: {data.shape}\")\n", + "print(f\"Shape of times: {times.shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - Plotting the data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.plot()`: plot the data of the EEG channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.copy().pick(picks=\"eeg\").plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.plot_sensors()`: plot the locations of the EEG channels as a topomap." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.plot_sensors(kind=\"topomap\", ch_type=\"eeg\", show_names=True);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.plot_sensors()`: plot the locations of the EEG channels in 3D." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Makes the plot interactive so that you can change the orientation by dragging\n", + "%matplotlib widget" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw.plot_sensors(kind=\"3d\", ch_type=\"eeg\", show_names=True);" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This final section touches on a fundamental part of electrophysiological signal analysis: power spectral densities (PSDs), showing the information content of signals as a function of frequency.\n", + "\n", + "Spectral analysis will be covered in more detail in the next notebook, but we present a preliminary look at the tools offered in MNE here." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.compute_psd().plot()`: compute and plot the power spectral densities of the EEG channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.compute_psd(picks=\"eeg\").plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.compute_psd().plot()`: compute and plot the average power spectral densities of the EEG channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.compute_psd(picks=\"eeg\").plot(average=True);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`Raw.compute_psd().plot_topomap()`: compute and plot the power spectral densities of the EEG channels as a topomap." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw.compute_psd(picks=\"eeg\").plot_topomap();" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.6" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day 2/2 - Filtering and Spectra.ipynb b/workshops/mne_course/Day 2/2 - Filtering and Spectra.ipynb new file mode 100644 index 0000000..7f790f9 --- /dev/null +++ b/workshops/mne_course/Day 2/2 - Filtering and Spectra.ipynb @@ -0,0 +1,819 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne\n", + "import numpy as np\n", + "\n", + "from _helper_functions import plot_psd" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Spectral analysis of data - the `time_frequency` module\n", + "\n", + "Often in neuroscience, we are interested in determining the spectral composition of signals, represented as a power spectral density (PSD).\n", + "\n", + "Furthermore, we may want to isolate the spectral content of a signal in a particular range of frequencies, which involves filtering the data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Filtering data\n", + "\n", + "To examine activity at a limited range of frequencies, we perform spectral filtering.\n", + "\n", + "This can take the form of:\n", + "- Lowpass filtering - retaining information content of a signal below a certain frequency.\n", + "- Highpass filtering - retaining information content of a signal above a certain frequency.\n", + "- Bandpass filtering - retaining information content of a signal within a certain frequency range.\n", + "- Bandstop filtering - retaining information content of a signal outside a certain frequency range.\n", + "\n", + "
\n", + "*[Image: lowpass, highpass, bandpass, and bandstop filter responses]*\n", + "\n", + "Credit: [allaboutcircuits.com](https://www.allaboutcircuits.com/technical-articles/low-pass-filter-tutorial-basics-passive-RC-filter/)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To see how to perform such filtering in MNE, we start by simulating 10 seconds of data sampled at 200 Hz, consisting of sine waves at 5 Hz, 10 Hz, and 20 Hz." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Simulation settings\n", + "duration = 10 # seconds\n", + "sfreq = 200 # sampling rate (Hz)\n", + "\n", + "# Timepoints of the simulated data\n", + "times = np.linspace(start=0, stop=duration, num=sfreq * duration, endpoint=False)\n", + "\n", + "# Simulate data as sine waves of given frequencies\n", + "chan_1 = np.sin(2 * np.pi * times * 5) # 5 Hz signal\n", + "chan_2 = np.sin(2 * np.pi * times * 10) # 10 Hz signal\n", + "chan_3 = np.sin(2 * np.pi * times * 20) # 20 Hz signal\n", + "\n", + "# Combine channels into a single array\n", + "data = np.array([chan_1, chan_2, chan_3])\n", + "ch_names = [\"5Hz\", \"10Hz\", \"20Hz\"] # channel names" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To play around with the data, let us first store it in a `Raw` object." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Spectral filtering**\n", + "\n", + "**Exercise:** Create an [`Info`](https://mne.tools/stable/generated/mne.Info.html) object for the 3 channels, setting the channel types to be EEG, and using the sampling frequency we specified above.\n", + "\n", + "Afterwards, use the `data` array and the `Info` object to create a [`RawArray`](https://mne.tools/stable/generated/mne.io.RawArray.html) object for the signals, called `raw`.\n", + "\n", + "*Hint:* use the [`create_info()`](https://mne.tools/stable/generated/mne.create_info.html) function to create the `Info` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info = mne.create_info(ch_names=ch_names, sfreq=sfreq, ch_types=\"eeg\")\n", + "raw = mne.io.RawArray(data=data, info=info)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now easily plot the data.\n", + "\n", + "Count the number of cycles in each channel per second. Do they match the frequencies we specified?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# The object containing the signals should be called `raw`\n", + "raw.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Computing power spectra - a brief introduction\n", + "\n", + "We can easily compute the power spectral density of channels in `Raw` objects.\n", + "\n", + "This is done by calling the [`compute_psd()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.compute_psd) method.\n", + "\n", + "`compute_psd()` returns a [`mne.time_frequency.Spectrum`](https://mne.tools/stable/generated/mne.time_frequency.Spectrum.html#mne.time_frequency.Spectrum) object containing the power spectra."
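+ "\n", + "*Aside (not part of the original notebook):* for `Raw` objects, `compute_psd()` defaults to Welch's method. A minimal sketch of the same idea using SciPy directly (SciPy is a dependency of MNE) on the simulated 10 Hz channel:\n", + "\n", + "```python\n", + "import scipy.signal\n", + "\n", + "# Welch PSD of the simulated 10 Hz channel; nperseg=sfreq gives ~1 Hz resolution\n", + "welch_freqs, welch_psd = scipy.signal.welch(chan_2, fs=sfreq, nperseg=sfreq)\n", + "print(welch_freqs[welch_psd.argmax()])  # expected to be ~10 Hz\n", + "```"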
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute PSD of the data\n", + "spectrum = raw.compute_psd(fmax=30)\n", + "spectrum" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can plot the PSD using the [`plot()`](https://mne.tools/stable/generated/mne.time_frequency.Spectrum.html#mne.time_frequency.Spectrum.plot) method of the `Spectrum` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Plot the PSD\n", + "spectrum.plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, there are distinct peaks in the power spectrum at 5, 10, and 20 Hz.\n", + "\n", + "We will explore PSD computation in more detail below after examining spectral filtering." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Spectral filtering\n", + "\n", + "Let us now look at how we can filter the data.\n", + "\n", + "##### Lowpass, highpass, bandpass, and bandstop filtering\n", + "\n", + "Lowpass, highpass, bandpass, and bandstop filtering is most easily done using the [`filter()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.filter) method of `Raw` objects.\n", + "\n", + "Frequencies to filter using the `filter()` method are specified using the `l_freq` and `h_freq` parameters:\n", + "- `l_freq` specifies the lowest frequency of information to retain (in Hz).\n", + "- `h_freq` specifies the highest frequency of information to retain (in Hz).\n", + "\n", + "
\n", + "\n", + "In this way:\n", + "- specifying only `l_freq` highpass filters the data.\n", + "- specifying only `h_freq` lowpass filters the data.\n", + "- specifying `l_freq` to be lower than `h_freq` bandpass filters the data.\n", + "- specifying `l_freq` to be higher than `h_freq` bandstop filters the data.\n", + "\n", + "
\n", + "*[Image: lowpass, highpass, bandpass, and bandstop filter responses]*\n", + "\n", + "Adapted from: [allaboutcircuits.com](https://www.allaboutcircuits.com/technical-articles/low-pass-filter-tutorial-basics-passive-RC-filter/)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The example below shows how to highpass filter the data to remove the 5 Hz activity.\n", + "\n", + "**N.B.** The `Raw` object is copied so that the original data is not modified." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Copy to preserve original data\n", + "raw_copy = raw.copy()\n", + "\n", + "# Highpass filter at 8 Hz to exclude 5 Hz activity\n", + "raw_copy.filter(l_freq=8, h_freq=None)\n", + "\n", + "# Compute the PSD of the new data and plot it\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "\n", + "# View the filtered data\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, the activity in the 5 Hz channel has been removed, along with the peak in the power spectrum at 5 Hz.\n", + "\n", + "On the other hand, the 10 and 20 Hz activity remains." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Lowpass, highpass, bandpass, and bandstop filtering**\n", + "\n", + "**Exercise:** Lowpass filter the data at 15 Hz to remove the 20 Hz activity, then plot the PSD and raw data to confirm that only the 20 Hz activity has been removed." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.filter(l_freq=None, h_freq=15)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Lowpass filter the data at 8 Hz to remove the 10 and 20 Hz activity, then plot the PSD and raw data to confirm that only the 5 Hz activity remains." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.filter(l_freq=None, h_freq=8)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Bandstop filter the data between 8 and 15 Hz to remove the 10 Hz activity, then plot the PSD and raw data to confirm that only the 10 Hz activity has been removed." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.filter(l_freq=15, h_freq=8)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Bandpass filter the data between 8 and 15 Hz to remove the 5 and 20 Hz activity, then plot the PSD and raw data to confirm that only the 10 Hz activity remains."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.filter(l_freq=8, h_freq=15)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, the filters from the `filter()` method can operate over a large frequency range.\n", + "\n", + "Sometimes however, the attenuation of a very limited frequency range is desired, for example when removing line noise artefacts, see e.g.:\n", + "- https://pressrelease.brainproducts.com/eeg-artifacts-handling-in-analyzer/#technical\n", + "- https://labeling.ucsd.edu/tutorial/labels\n", + "- https://mne.tools/stable/auto_tutorials/preprocessing/30_filtering_resampling.html#power-line-noise\n", + "\n", + "In these situations, a notch filter is often used." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Notch filtering\n", + "\n", + "Notch filters have their own dedicated [`notch_filter()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.notch_filter) method in the `Raw` object.\n", + "\n", + "Below, we use a notch filter to remove the 5 Hz activity alone." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "raw_copy = raw.copy()\n", + "\n", + "# Apply notch filter at 5 Hz\n", + "raw_copy.notch_filter(freqs=5)\n", + "\n", + "# Plot the PSD and timeseries of the filtered data\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Notch filtering**\n", + "\n", + "**Exercise:** Notch filter the data at 10 Hz, and visualise the PSD and raw data to confirm that only the 10 Hz activity has been removed." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.notch_filter(freqs=10)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Notch filter the data at 20 Hz, and visualise the PSD and raw data to confirm that only the 20 Hz activity has been removed." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.notch_filter(freqs=20)\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Notch filter the data at 10 and 20 Hz in a single call to the `notch_filter()` method, and visualise the PSD and raw data to confirm that only the 5 Hz activity remains." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_copy = raw.copy()\n", + "raw_copy.notch_filter(freqs=[10, 20])\n", + "raw_copy.compute_psd(fmax=30).plot()\n", + "raw_copy.plot(duration=6, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, MNE provides a number of convenient tools for the spectral filtering of data.\n", + "\n", + "There are many options for fine-tuning the filter parameters to your needs, which are discussed in more depth in the following tutorials:\n", + "- Background on filtering: https://mne.tools/stable/auto_tutorials/preprocessing/25_background_filtering.html\n", + "- Filtering and resampling: https://mne.tools/stable/auto_tutorials/preprocessing/30_filtering_resampling.html" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - Computing power spectral densities\n", + "\n", + "Up until now, we have computed PSDs using the `compute_psd()` method of `Raw` objects.\n", + "\n", + "Note that an equivalent method exists for `Epochs` objects: [`mne.Epochs.compute_psd()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.compute_psd).\n", + "\n", + "The `compute_psd()` methods of `Raw` and `Epochs` objects support PSD computations using the Welch and multitaper methods.\n", + "\n", + "There exist equivalent functions for computing PSDs using the Welch and multitaper methods from arrays of data:\n", + "- [`mne.time_frequency.psd_array_welch()`](https://mne.tools/stable/generated/mne.time_frequency.psd_array_welch.html)\n", + "- [`mne.time_frequency.psd_array_multitaper()`](https://mne.tools/stable/generated/mne.time_frequency.psd_array_multitaper.html)\n", + "\n", + "**N.B.** Performing the PSD computations on arrays requires the sampling frequency of the data (`sfreq` parameter) to be specified." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, using the sample data, we explicitly specify the Welch method to use in `compute_psd()`.\n", + "\n", + "Using the `fmax` parameter, we only return the results up to 50 Hz.\n", + "\n", + "We additionally take only the EEG channels and crop to the first 60 seconds to reduce computation time." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "raw.pick(picks=\"eeg\", exclude=\"bads\")\n", + "raw.crop(tmax=60)\n", + "raw.load_data()\n", + "\n", + "# Compute PSD\n", + "spectrum = raw.compute_psd(method=\"welch\", fmax=50, n_fft=2048)\n", + "\n", + "# Plot the PSD\n", + "spectrum.plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can extract the power values and the corresponding frequencies from the `Spectrum` object."
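+ "\n", + "*Aside (not part of the original notebook):* the `plot()` call above shows power on a dB scale by default, whereas `get_data()` (used in the next cell) returns linear power spectral density values, so a manual dB conversion can be useful when comparing the two:\n", + "\n", + "```python\n", + "# Convert the linear PSD values to decibels\n", + "psd_db = 10 * np.log10(spectrum.get_data())\n", + "```"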
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Extract PSD data\n", + "psd = spectrum.get_data()\n", + "\n", + "# Extract frequencies in the PSD\n", + "freqs = spectrum.freqs\n", + "\n", + "print(f\"PSD data has shape: {psd.shape} # channels x frequencies\")\n", + "print(f\"Frequencies has shape: {freqs.shape} # frequencies\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using a custom function `plot_psd()`, we can verify that these values match those plotted using the `plot()` method of the `Spectrum` object.\n", + "\n", + "For the `plot_psd()` function, we pass in the array of power spectral density values for a set of channels, alongside the corresponding frequencies." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Plot PSD from arrays with custom function\n", + "plot_psd(psd=psd, freqs=freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Computing PSDs from standalone functions\n", + "\n", + "**Exercises - Computing PSDs**\n", + "\n", + "**Exercise:** Perform the equivalent computation using the `psd_array_welch()` function on the data array extracted from the `Raw` object.\n", + "\n", + "Remember to specify a maximum frequency of 50 Hz and an FFT length of 2,048.\n", + "\n", + "Use the custom `plot_psd()` function to visualise the results.\n", + "\n", + "Do the results match the output of `compute_psd()`?\n", + "\n", + "*Hint:* data can be extracted from `Raw` objects using the [`get_data()`](https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.get_data) method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], fmax=50, n_fft=2048\n", + ")\n", + "plot_psd(psd=psd, freqs=freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Again using `psd_array_welch()`, compute the PSDs for the frequency range from 5 Hz onwards (i.e. no 50 Hz limit), and visualise the results.\n", + "\n", + "*Hint:* use the `fmin` parameter to specify the starting frequency." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], fmin=5, n_fft=2048\n", + ")\n", + "plot_psd(psd=psd, freqs=freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Using `psd_array_welch()`, compute the PSDs for the frequency range 5 - 50 Hz, and visualise the results."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], fmin=5, fmax=50, n_fft=2048\n", + ")\n", + "plot_psd(psd=psd, freqs=freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Indexing frequencies of results\n", + "\n", + "**Exercise:** Using `psd_array_welch()`, compute the PSDs for the entire frequency range, but only plot the results up to 50 Hz.\n", + "\n", + "*Hint:* Use the [`np.where()`](https://numpy.org/doc/stable/reference/generated/numpy.where.html) function to find where the appropriate frequency values are." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], n_fft=2048\n", + ")\n", + "plot_freqs_idx = np.where(freqs <= 50)[0]\n", + "plot_psd(psd=psd[:, plot_freqs_idx], freqs=freqs[plot_freqs_idx])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Using `psd_array_welch()`, compute the PSDs for the entire frequency range, but only plot the results from 5 Hz onwards." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], n_fft=2048\n", + ")\n", + "plot_freqs_idx = np.where(freqs >= 5)[0]\n", + "plot_psd(psd=psd[:, plot_freqs_idx], freqs=freqs[plot_freqs_idx])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Using `psd_array_welch()`, compute the PSDs for the entire frequency range, but only plot the results from 5 - 50 Hz.\n", + "\n", + "*Hint:* Use the form `np.where((condition1) & (condition2))` when you want to index an array based on multiple conditions." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "psd, freqs = mne.time_frequency.psd_array_welch(\n", + " x=raw.get_data(), sfreq=raw.info[\"sfreq\"], n_fft=2048\n", + ")\n", + "plot_freqs_idx = np.where((freqs >= 5) & (freqs <= 50))[0]\n", + "plot_psd(psd=psd[:, plot_freqs_idx], freqs=freqs[plot_freqs_idx])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Summary of PSD computation\n", + "\n", + "As you can see, the `compute_psd()` methods of `Raw` and `Epochs` objects are very convenient ways of computing PSDs, with equivalent standalone functions for computations on arrays.\n", + "\n", + "However, MNE also offers tools for more advanced time-frequency analyses based on epoched data. 
These include time-frequency representations ([TFRs](https://mne.tools/stable/documentation/glossary.html#term-tfr)) based on the multitaper, Morlet wavelet, or Stockwell transformation methods:\n", + "- Multitaper:\n", + " - From `Epochs` objects: [`mne.time_frequency.tfr_multitaper()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_multitaper.html)\n", + " - From arrays: [`mne.time_frequency.tfr_array_multitaper()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_array_multitaper.html)\n", + "- Morlet wavelet:\n", + " - From `Epochs` objects: [`mne.time_frequency.tfr_morlet()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_morlet.html)\n", + " - From arrays: [`mne.time_frequency.tfr_array_morlet()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_array_morlet.html)\n", + "- Stockwell transformation:\n", + " - From `Epochs` objects: [`mne.time_frequency.tfr_stockwell()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_stockwell.html)\n", + " - From arrays: [`mne.time_frequency.tfr_array_stockwell()`](https://mne.tools/stable/generated/mne.time_frequency.tfr_array_stockwell.html)\n", + "\n", + "Time-frequency analyses are discussed in more detail here: https://mne.tools/stable/auto_tutorials/time-freq/20_sensors_time_frequency.html#time-frequency-analysis-power-and-inter-trial-coherence" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 3 - Spectral filtering to remove artefacts\n", + "\n", + "Spectral filtering is not only useful for isolating activity at some frequencies of interest, but it can also be used to remove artefacts from the data.\n", + "\n", + "The ability to remove technical artefacts was previously mentioned in the context of notch filtering line noise, but biological artefacts such as cardiac activity can also be identified using spectral filtering, and subsequently removed (see e.g. https://labeling.ucsd.edu/tutorial/labels)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Cardiac artefacts can be clearly seen in the MEG channels of MNE's sample data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "raw.del_proj() # delete existing PCA projections\n", + "\n", + "# Pick some channels with strong artefacts and plot them\n", + "artefact_picks = [152, 155, 158, 164, 167, 170, 272, 275, 278, 284, 287, 290]\n", + "raw.plot(order=artefact_picks);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If our desire is to analyse neural data, not removing these non-neural artefacts could of course lead to erroneous conclusions.\n", + "\n", + "Thankfully, MNE has a convenient function for doing just that: [`mne.preprocessing.compute_proj_ecg()`](https://mne.tools/stable/generated/mne.preprocessing.compute_proj_ecg.html).\n", + "\n", + "`compute_proj_ecg()` involves:\n", + "- Filtering the data within a given frequency range to isolate the cardiac activity.\n", + "- Finding the peaks of cardiac activity.\n", + "- Creating epochs around these peaks of activity.\n", + "- Using these epochs to create [projection vectors](https://mne.tools/stable/documentation/glossary.html#term-projector) that can be used to minimise the cardiac artefacts in the data." 
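+ "\n", + "*Aside (not part of the original notebook):* once `projs` has been computed in the next cell, the spatial pattern of each projector can also be inspected, e.g.:\n", + "\n", + "```python\n", + "# Visualise the spatial patterns of the computed ECG projectors\n", + "mne.viz.plot_projs_topomap(projs, info=raw.info);\n", + "```"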
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Find projections to minimise cardiac artefacts\n", + "projs, _ = mne.preprocessing.compute_proj_ecg(raw=raw)\n", + "\n", + "# Apply projections to the data and plot the cleaned data\n", + "raw.add_proj(projs=projs)\n", + "raw.plot(order=artefact_picks);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "An equivalent function exists for removing eye movement artefacts: [`mne.preprocessing.compute_proj_eog()`](https://mne.tools/stable/generated/mne.preprocessing.compute_proj_eog.html)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "Spectral filtering is an important part of many analyses in neuroscience, involving e.g. the extraction of activity at specific frequencies of interest and the removal of artefacts. The `filter()` and `notch_filter()` methods of `Raw` and `Epochs` objects provide convenient ways of performing such filtering, with equivalent standalone functions for working with arrays of data.\n", + "\n", + "Spectral activity can be represented as PSDs, which can be computed using the `compute_psd()` methods of `Raw` and `Epochs` objects, or the equivalent standalone functions for computations on arrays. More advanced spectral analyses are also offered in the form of TFR computations, with MNE's [Time-Frequency module](https://mne.tools/stable/api/time_frequency.html) also offering several other useful tools, such as for computing cross-spectral densities (CSDs)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on spectral analysis: https://mne.tools/stable/auto_tutorials/time-freq/20_sensors_time_frequency.html\n", + "\n", + "MNE tutorial on `Spectrum` and `EpochsSpectrum` classes: https://mne.tools/stable/auto_tutorials/time-freq/10_spectrum_class.html\n", + "\n", + "Video introducing the Fourier transform with some very nice visualisations:\n", + "https://youtu.be/spUNpyF58BY?si=hUC2zG8dG6Zah8tP" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day 2/3 - ICA.ipynb b/workshops/mne_course/Day 2/3 - ICA.ipynb new file mode 100644 index 0000000..06e177b --- /dev/null +++ b/workshops/mne_course/Day 2/3 - ICA.ipynb @@ -0,0 +1,786 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne\n", + "import numpy as np\n", + "import scipy as sp" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Data decomposition and artefact removal with independent component analysis (ICA) - the `ICA` class\n", + "\n", + "ICA is a common approach for breaking a set of signals down into the underlying components, with a mixing matrix explaining how the components are combined to form the observed signals.\n", + "\n", + "In practice, this is often used to isolate artefact sources from electrophysiological recordings (cardiac activity, eye movements, stimulation 
artefacts, etc...) and then reconstruct the signals with these unwanted sources removed.\n", + "\n", + "MNE has a comprehensive toolkit for performing ICA, in particular the [`mne.preprocessing.ICA`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html) class." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Data decomposition with ICA\n", + "\n", + "To explore ICA in MNE, we will generate a set of source signals which we then mix together to form the sensor signals.\n", + "\n", + "The source signals are:\n", + "- A 10 Hz sine wave\n", + "- An 8 Hz sawtooth wave\n", + "- Some randomly generated noise\n", + "\n", + "We create a matrix of random numbers that acts as our mixing matrix, determining how the sources project into the sensor signals.\n", + "\n", + "The mixing matrix has shape `(sensors, sources)`. The (3 x 3) matrix we use here means that we will project our 3 sources into 3 sensor signals." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Simulation settings\n", + "duration = 10 # seconds\n", + "sfreq = 200 # sampling rate (Hz)\n", + "np.random.seed(44) # for reproducibility\n", + "\n", + "# Timepoints of the simulated data\n", + "times = np.linspace(start=0, stop=duration, num=sfreq * duration, endpoint=False)\n", + "\n", + "# Generate source signals\n", + "sources = np.array(\n", + " [\n", + " np.sin(2 * np.pi * times * 10), # 10 Hz sine wave\n", + " sp.signal.sawtooth(2 * np.pi * times * 8), # 8 Hz sawtooth wave\n", + " np.random.normal(0, 1, times.shape), # Noise with normal distribution\n", + " ]\n", + ")\n", + "source_names = [\"sine\", \"sawtooth\", \"noise\"]\n", + "\n", + "# Generate mixing matrix of sources to sensors\n", + "mixing_matrix = np.random.rand(3, 3)\n", + "\n", + "# Combine sources into sensor signals\n", + "sensors = mixing_matrix @ sources # @ is matrix multiplication in Python\n", + "sensor_names = [\"chan_1\", \"chan_2\", \"chan_3\"]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Data decomposition with ICA**\n", + "\n", + "**Exercise:** Create an [`Info`](https://mne.tools/stable/generated/mne.Info.html) object for the source signals, specifying `source_names` as the channel names, the channel types as EEG, and using the sampling frequency we specified above.\n", + "\n", + "Afterwards, use the `sources` array and `Info` object to create a [`RawArray`](https://mne.tools/stable/generated/mne.io.RawArray.html) object for the source signals, called `raw_sources`.\n", + "\n", + "*Hint:* use the [`create_info()`](https://mne.tools/stable/generated/mne.create_info.html) function to create the `Info` object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info_sources = mne.create_info(ch_names=source_names, sfreq=sfreq, ch_types=\"eeg\")\n", + "raw_sources = mne.io.RawArray(data=sources, info=info_sources)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `Info` object for the sensor signals, specifying `sensor_names` as the channel names, the channel types as EEG, and using the sampling frequency we specified above.\n", + "\n", + "Afterwards, use the `sensors` array and `Info` object to create a `RawArray` object for the sensor signals, called `raw_sensors`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info_sensors = mne.create_info(ch_names=sensor_names, sfreq=sfreq, ch_types=\"eeg\")\n", + "raw_sensors = mne.io.RawArray(data=sensors, info=info_sensors)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If we plot the sources, we can clearly see the individual sine wave, sawtooth, and noise channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# The object containing the source data should be called `raw_sources`\n", + "raw_sources.plot(scalings=\"auto\", title=\"Source signals\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In contrast, thanks to the mixing matrix, no one sensor signal resembles any one of the source signals.\n", + "\n", + "Instead, the signals are a mix of all 3 sources." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# The object containing the sensor data should be called `raw_sensors`\n", + "raw_sensors.plot(scalings=\"auto\", title=\"Sensor signals\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now consider that the sine and sawtooth signals are our signals of interest, and the random noise is some activity we are not interested in (i.e. noise!) and want to remove.\n", + "\n", + "This is a perfect use-case for ICA!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Performing ICA\n", + "\n", + "To perform ICA in MNE, we start by instantiating an [`ICA`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html) object.\n", + "\n", + "Here, we set the `random_state` parameter for reproducibility, as ICA fitting is not deterministic (i.e. there can be sign flips, components returned in different orders)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Instantiate ICA object\n", + "ica = mne.preprocessing.ICA(random_state=0)\n", + "ica" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, without specifying anything else, MNE defaults to using the FastICA algorithm (`Method` tab), with a set of default fitting parameters (`Fit parameters` tab).\n", + "\n", + "The algorithm to use is specified with the `method` parameter, and the fitting parameters with the `fit_params` parameter.\n", + "\n", + "You may also notice that we did not supply any data when instantiating the `ICA` object, and that the `Fit` tab is set to `no` (i.e. not fitted to data).\n", + "\n", + "Data is provided to the `ICA` object when we call the [`fit()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.fit) method." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When calling `fit()`:\n", + "1. The data is [whitened](https://mne.tools/stable/documentation/glossary.html#term-whitening).\n", + "2. The ICA algorithm is run to generate an unmixing matrix, with which we can separate the sources in the data." 
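+ "\n", + "*Aside (not part of the original notebook):* the unmixing step itself is not specific to MNE; a minimal conceptual sketch using scikit-learn's FastICA (which MNE's default `method=\"fastica\"` relies on) applied to the simulated `sensors` array:\n", + "\n", + "```python\n", + "from sklearn.decomposition import FastICA\n", + "\n", + "# FastICA expects (samples, features), hence the transposes\n", + "estimated_sources = FastICA(n_components=3, random_state=0).fit_transform(sensors.T).T\n", + "print(estimated_sources.shape)  # (3 sources, n_times)\n", + "```"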
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Fit ICA to the data\n", + "ica.fit(inst=raw_sensors)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now see that the `Fit` tab has changed to show that the ICA object has been fit to the data, and that we have 3 different ICA components available.\n", + "\n", + "We can inspect the extracted ICA sources using the [`plot_sources()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.plot_sources) method.\n", + "\n", + "Comparing these extracted sources to the original sources, we can clearly see that ICA has successfully separated the sine, sawtooth, and noise signals." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Plot extracted sources\n", + "ica.plot_sources(inst=raw_sensors, title=\"ICA sources\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We are able to extract these sources from the sensor signals thanks to the unmixing matrix, which is applied to the data provided to the `plot_sources()` method.\n", + "\n", + "The unmixing matrix can be accessed under the `unmixing_matrix_` attribute (and inverse mixing matrix located under the `mixing_matrix_` attribute)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Visualise unmixing matrix\n", + "ica.unmixing_matrix_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Applying PCA for ICA\n", + "\n", + "Before we explore how to remove the noise source, we will explore the object instantiation and fitting options in more detail.\n", + "\n", + "An important implementation note for ICA in MNE is that when `fit()` is called, principal component analysis (PCA) is performed prior to running the ICA algorithm.\n", + "\n", + "PCA is a well-established algorithm for dimensionality reduction, whereby correlated signals are grouped together into components that are ordered according to how much they explain the variance in the data.\n", + "\n", + "You can then take only the first `n` principal components that contain a desired amount of variance (information) in the data, representing the data in a lower-dimensional space.\n", + "\n", + "If you need a refresher on PCA, check out this short introductory video: https://www.youtube.com/watch?v=FD4DeN81ODY\n", + "\n", + "The benefits of performing PCA prior to ICA include:\n", + "- Reduced computational time for the ICA algorithm.\n", + "- Easier interpretability of the resulting extracted ICA sources." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When instantiating the `ICA` object, MNE gives you the option to specify the degree of dimensionality reduction prior to performing ICA, using the `n_components` parameter." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Using PCA with a fixed number of components\n", + "\n", + "**Exercises - Applying PCA for ICA**\n", + "\n", + "**Exercise:** Instantiate an `ICA` object, specifying that 3 PCA components should be used for ICA.\n", + "\n", + "Fit this to the sensor signals (as above), and plot the extracted sources.\n", + "\n", + "How do the extracted sources compare to those where no PCA components were specified?\n", + "\n", + "*Hint:* Set `random_state=0` for reproducibility." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "ica_3PCA = mne.preprocessing.ICA(n_components=3, random_state=0)\n", + "ica_3PCA.fit(inst=raw_sensors)\n", + "ica_3PCA.plot_sources(inst=raw_sensors, title=\"ICA sources (3 PCA components)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You should see that the extracted sources are identical.\n", + "\n", + "This is because the default behaviour of `n_components=None` means that those PCA components which explain 99.9999% of the variance in the data will be used, which will almost always correspond to the number of sensors in the data (an exception being when you are working with rank-deficient data).\n", + "\n", + "Therefore, not specifying `n_components` is equivalent to specifying `n_components=3` in this case." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Perform the same procedure again, but this time specify that 2 PCA components should be used for ICA.\n", + "\n", + "What do you see when you plot the extracted sources?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "ica_2PCA = mne.preprocessing.ICA(n_components=2, random_state=0)\n", + "ica_2PCA.fit(inst=raw_sensors)\n", + "ica_2PCA.plot_sources(inst=raw_sensors, title=\"ICA sources (2 PCA components)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Using PCA with a proportional number of components\n", + "\n", + "In addition to specifying a particular number of PCA components to use, `n_components` also accepts floats in the range `(0, 1)`.\n", + "\n", + "Providing a float value means that the number of PCA components used will be the minimum number required to explain this proportion of variance.\n", + "\n", + "E.g. `n_components=0.9` means that the number of PCA components used will be the minimum number required to explain 90% of the variance in the data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Perform the same procedure again, but this time specify that 95% of the variance should be explained by the PCA components used for ICA." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "ica_95PCA = mne.preprocessing.ICA(n_components=0.95, random_state=0)\n", + "ica_95PCA.fit(inst=raw_sensors)\n", + "ica_95PCA.plot_sources(inst=raw_sensors, title=\"ICA sources (95% variance PCA components)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this case, only 2 PCA components were passed to the ICA algorithm, meaning the first 2 PCA components explain at least 95% of the variance in the data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To see how much variance each PCA component explains, we can use the `pca_explained_variance_` attribute.\n", + "\n", + "Dividing by the sum of the variances normalises these values so that they sum to 1 (each lying in the range `[0, 1]`).\n", + "\n", + "Here, the first 2 PCA components explain 97% of the variance."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute variance explained by PCA components\n", + "explained_variance = ica.pca_explained_variance_ / np.sum(ica.pca_explained_variance_)\n", + "\n", + "print(f\"Variance explained by PCA components: {explained_variance}\")\n", + "print(f\"Variance explained by first 2 PCA components: {np.sum(explained_variance[:2]) * 100:.2f}%\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Excluding ICA components\n", + "\n", + "Now we will look at how to exclude a given ICA component from the data.\n", + "\n", + "We can remove ICA components to \"clean\" the data using the [`apply()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.apply) method.\n", + "\n", + "Below, we specify the first component (i.e. the random noise source) to be removed from the data, by setting `exclude=[0]`.\n", + "\n", + "Note that we use a copy of the sensor signals, as the `apply()` method operates in-place." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Re-instantiate the ICA object, using all PCA components\n", + "ica = mne.preprocessing.ICA(random_state=0)\n", + "\n", + "# Fit the ICA to the sensor signals\n", + "ica.fit(inst=raw_sensors)\n", + "\n", + "# Remove the first ICA component (the random noise) from the data\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), exclude=[0])\n", + "\n", + "# Plot the cleaned data\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (removed noise)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, the remaining activity in the 3 sensor signals is a combination of our sine and sawtooth waves." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Choosing which sources to retain for data reconstruction**\n", + "\n", + "**Exercise:** Use ICA to remove the sine wave source from the sensor signals with the `exclude` parameter." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), exclude=[1])\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (removed sine wave)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use ICA to remove the sawtooth wave source from the sensor signals with the `exclude` parameter." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), exclude=[2])\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (removed sawtooth wave)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use ICA to remove both the noise and sawtooth wave sources from the sensor signals with the `exclude` parameter." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), exclude=[0, 2])\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (removed noise & sawtooth wave)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For convenience, the `apply()` function also has an `include` parameter which operates in the opposite way." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use ICA to keep only the random noise source in the sensor signals with the `include` parameter." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), include=[0])\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (included noise)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use ICA to keep both the sine and sawtooth wave sources in the sensor signals with the `include` parameter." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "raw_cleaned = ica.apply(inst=raw_sensors.copy(), include=[1, 2])\n", + "raw_cleaned.plot(scalings=\"auto\", title=\"Cleaned sensor signals (included sine & sawtooth wave)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, it is very easy in MNE to apply ICA to data and remove particular sources of unwanted activity.\n", + "\n", + "Now, you will see how this applies for a more typical use-case with the MNE sample dataset." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - Artefact rejection with ICA\n", + "\n", + "Using the MNE sample dataset, we will see how ICA can be used to remove cardiac and ocular artefacts from MEG & EEG data.\n", + "\n", + "As in the previous notebook, we highlight some channels where this activity is particularly strong, with the MEG channels showing strong cardiac activity, and the EEG channels showing strong ocular activity (see e.g. https://labeling.ucsd.edu/tutorial/labels)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "raw.crop(tmax=60)\n", + "raw.load_data()\n", + "raw.del_proj() # delete existing PCA projections\n", + "\n", + "# Highpass filter at 1 Hz for better ICA performance\n", + "raw.filter(l_freq=1, h_freq=None)\n", + "\n", + "# Pick some channels with strong artefacts and plot them\n", + "artefact_picks = [152, 155, 158, 170, 315, 316, 317, 318]\n", + "raw.plot(order=artefact_picks, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Manual exclusion of artefacts\n", + "\n", + "Now we perform ICA on the data, specifying that the first 10 PCA components should be passed to the ICA algorithm.\n", + "\n", + "Plotting the extracted ICA sources, we see that the first source reflects ocular activity, and the second source reflects cardiac activity."
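+ "\n", + "*Aside (not part of the original notebook):* once the ICA has been fitted in the next cell, the scalp topographies of the components can also be viewed, which often makes artefactual components easy to spot:\n", + "\n", + "```python\n", + "# Topographic maps of the fitted ICA components\n", + "ica.plot_components();\n", + "```"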
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Fit ICA to the data\n", + "ica = mne.preprocessing.ICA(n_components=10, random_state=0)\n", + "ica.fit(inst=raw)\n", + "ica.plot_sources(inst=raw, title=\"ICA sources (first 10 PCA components)\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the [`plot_overlay()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.plot_overlay) method, we can see how excluding the ocular artefact source (the first ICA component) would affect the EEG data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Visualise effects of ocular artefact on EEG data\n", + "ica.plot_overlay(inst=raw, exclude=[0], picks=\"eeg\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also see how excluding the cardiac artefact source (the second ICA component) would affect the MEG data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Visualise effects of cardiac artefact on MEG data\n", + "ica.plot_overlay(inst=raw, exclude=[1], picks=\"mag\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After this, it is simply a case of excluding the first 2 ICA components to clean the data of cardiac and ocular artefacts." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Remove artefact sources from the data\n", + "raw_cleaned = ica.apply(inst=raw.copy(), exclude=[0, 1])\n", + "raw_cleaned.plot(order=artefact_picks, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Automatic exclusion of artefacts\n", + "\n", + "Although manually selecting the components is quite simple, when dealing with a large number of recordings, this can be a time-consuming process.\n", + "\n", + "Thankfully, the `ICA` class has the [`find_bads_eog()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.find_bads_eog) and [`find_bads_ecg()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.find_bads_ecg) methods, which automatically detect the ICA components corresponding to ocular and cardiac artefacts, respectively.\n", + "\n", + "**N.B.** There are also the [`find_bads_muscle()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.find_bads_muscle) and [`find_bads_ref()`](https://mne.tools/stable/generated/mne.preprocessing.ICA.html#mne.preprocessing.ICA.find_bads_ref) methods for automatically detecting muscle and MEG reference artefacts, respectively." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using `find_bads_eog()`, we can see that the first ICA component is detected as an ocular artefact." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Automatically identify ocular artefact sources\n", + "eog_bads, _ = ica.find_bads_eog(inst=raw, threshold=1)\n", + "print(f\"Ocular artefact ICA component(s): {eog_bads}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similarly, using `find_bads_ecg()`, we can see that the second ICA component is detected as a cardiac artefact."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Automatically identify cardiac artefact sources\n", + "ecg_bads, _ = ica.find_bads_ecg(inst=raw, threshold=0.5)\n", + "print(f\"Cardiac artefact ICA component(s): {ecg_bads}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can then pass the ICA components that were automatically identified to `apply()` to achieve the same result as a manual selection of the artefact components." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Remove artefact sources from the data\n", + "raw_cleaned = ica.apply(inst=raw.copy(), exclude=[*eog_bads, *ecg_bads])\n", + "raw_cleaned.plot(order=artefact_picks, scalings=\"auto\");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "ICA is a common approach for artefact rejection with electrophysiological data. MNE's `ICA` class provides a comprehensive set of tools for:\n", + "- Isolating unwanted sources of activity.\n", + "- Visualising the effects of removing particular sources.\n", + "- Visualising the spatial topographies of the extracted sources.\n", + "- Removing (manually or automatically) artefact activity." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on ICA for artefact correction: https://mne.tools/stable/auto_tutorials/preprocessing/40_artifact_correction_ica.html\n", + "\n", + "arXiv paper discussing the maths behind ICA: https://arxiv.org/pdf/1404.2986.pdf" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day 2/4 - Evoked.ipynb b/workshops/mne_course/Day 2/4 - Evoked.ipynb new file mode 100644 index 0000000..a5c5297 --- /dev/null +++ b/workshops/mne_course/Day 2/4 - Evoked.ipynb @@ -0,0 +1,937 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne\n", + "import numpy as np" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Analysis of event-related potentials (ERPs) - the `Evoked` class\n", + "\n", + "ERPs/[evoked](https://mne.tools/stable/documentation/glossary.html#term-evoked) data are another staple of many signal analysis projects.\n", + "\n", + "As the name implies, ERPs are changes in voltage associated with a particular event, e.g. the presentation of a stimulus, or the execution of some action.\n", + "\n", + "The first step in generating ERPs is to epoch the data around the event of interest, after which we generally average across the epochs to generate the final ERPs." 
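+ "\n", + "*Aside (not part of the original notebook):* conceptually, the averaging step is just a mean over the epochs axis of an `(epochs, channels, times)` array; a minimal sketch with a hypothetical random array:\n", + "\n", + "```python\n", + "import numpy as np\n", + "\n", + "rng = np.random.default_rng(0)\n", + "epoched = rng.standard_normal((40, 5, 100))  # (epochs, channels, times)\n", + "erp = epoched.mean(axis=0)  # average over epochs -> (channels, times)\n", + "print(erp.shape)\n", + "```"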
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Creating `Evoked` objects from `Epochs`\n", + "\n", + "We start by loading the sample data, and choosing the stimulus channel from which we want to generate event markers.\n", + "\n", + "We additionally create a dictionary of event labels and their corresponding IDs which we want to generate epochs around." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "raw.pick(picks=[\"eeg\", \"stim\"])\n", + "raw.del_proj()\n", + "\n", + "# Generate the events array\n", + "events = mne.find_events(raw, stim_channel=\"STI 014\")\n", + "\n", + "# Choose the events to create epochs around\n", + "event_id = {\n", + " \"auditory/left\": 1,\n", + " \"auditory/right\": 2,\n", + " \"visual/left\": 3,\n", + " \"visual/right\": 4,\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Creating `Evoked` objects from `Epochs`**\n", + "\n", + "**Exercise:** Create an [`Epochs`](https://mne.tools/stable/generated/mne.Epochs.html) object from the data called `epochs`.\n", + "\n", + "Pass the `events` and `event_id` variables to the corresponding parameters to specify the events to epoch around.\n", + "\n", + "Using the `tmin` and `tmax` parameters, create epochs around the events in the window [-0.25, 0.75].\n", + "\n", + "*Hint:* Refer to the documentation for the `Epochs` class and the [Epochs notebook](../Day%201/2%20-%20Epochs.ipynb) for a reminder how to do this." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = mne.Epochs(raw=raw, events=events, event_id=event_id, tmin=-0.25, tmax=0.75)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Below, you can see that the numeric IDs of the events in the `Epochs` object have been assigned to more descriptive names provided in the `event_id` dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# You should call the `Epochs` object created above \"epochs\"\n", + "epochs.load_data()\n", + "epochs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Averaging across the epochs to create ERPs is as simple as calling the [`average()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.average) method of the `Epochs` object.\n", + "\n", + "This returns an [`Evoked`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked) object.\n", + "\n", + "An example is shown below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create the evoked responses for the left auditory stimulus\n", + "evoked_aud_l = epochs[\"auditory/left\"].average()\n", + "evoked_aud_l" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, we selected only those events corresponding to the `\"auditory/left\"` label to average across.\n", + "\n", + "We can visualise the ERPs using the [`plot()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.plot) method." 
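+ "\n", + "*Aside (not part of the original notebook):* besides the basic `plot()` call shown in the next cell, `Evoked` objects offer richer visualisations, e.g.:\n", + "\n", + "```python\n", + "# Butterfly plot with topographies at automatically chosen peak times\n", + "evoked_aud_l.plot_joint()\n", + "\n", + "# Scalp topography at 100 ms post-stimulus\n", + "evoked_aud_l.plot_topomap(times=[0.1]);\n", + "```"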
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Visualise the evoked response\n", + "evoked_aud_l.plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create ERPs for the \"auditory/right\" stimulus and visualise them." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked_aud_r = epochs[\"auditory/right\"].average()\n", + "evoked_aud_r.plot()\n", + "evoked_aud_r" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create ERPs for the \"visual/left\" stimulus and visualise them." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked_vis_l = epochs[\"visual/left\"].average()\n", + "evoked_vis_l.plot()\n", + "evoked_vis_l" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Creating ERPs: mean vs. median\n", + "\n", + "By default, calling the `average()` method on `Epochs` objects will generate `Evoked` objects using the mean of the values across the epochs.\n", + "\n", + "However, we can also create `Evoked` objects using the median, or even custom functions.\n", + "\n", + "Below, we create ERPs for the \"visual/right\" stimulus, explicitly specifying the mean to be used with the `method` parameter." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked responses specifying the mean method explicitly\n", + "epochs.load_data()\n", + "evoked_vis_r_mean = epochs[\"visual/right\"].average(method=\"mean\")\n", + "evoked_vis_r_mean.plot()\n", + "evoked_vis_r_mean" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create ERPs for the \"visual/right\" stimulus using the median method, and visualise them." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked_vis_r_median = epochs[\"visual/right\"].average(method=\"median\")\n", + "evoked_vis_r_median.plot()\n", + "evoked_vis_r_median" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Custom functions can also be used to control how the information is combined across epochs.\n", + "\n", + "For example, the function below uses the data of only every other epoch and takes the mean of this."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def every_other_epoch_mean(epoched_data: np.ndarray) -> np.ndarray:\n", + " \"\"\"Take the mean across every other epoch.\n", + "\n", + " Parameters\n", + " ----------\n", + " epoched_data : numpy.ndarray, shape of (epochs, channels, times)\n", + " - The epochs to create evoked data from.\n", + "\n", + " Returns\n", + " -------\n", + " evoked_data : numpy.ndarray, shape of (channels, times)\n", + " - The evoked data as the mean across every other epoch.\n", + " \"\"\"\n", + " # Select every other epoch\n", + " every_other_epoch = epoched_data[::2, :, :] # [::2] takes every other element\n", + "\n", + " # Average across the remaining epochs\n", + " evoked_data = np.mean(every_other_epoch, axis=0)\n", + "\n", + " return evoked_data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We then simply pass this to the `method` parameter of the `average()` method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked responses using the custom method\n", + "evoked_vis_r_custom = epochs[\"visual/right\"].average(method=every_other_epoch_mean)\n", + "evoked_vis_r_custom.plot()\n", + "evoked_vis_r_custom" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Handling multiple event types simultaneously\n", + "\n", + "Multiple events types can also be selected at once for processing into `Evoked` objects.\n", + "\n", + "By default, averaging is performed across all selected event types, such that a single `Evoked` object is returned." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked responses for the auditory stimuli\n", + "evoked_aud = epochs[[\"auditory/left\", \"auditory/right\"]].average()\n", + "evoked_aud.plot()\n", + "evoked_aud" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, you can also specify to only average across events with the same label.\n", + "\n", + "This behaviour is controlled with the `by_event_type` parameter of the `average()` method.\n", + "\n", + "`by_event_type` is by default `False`, which combines the epochs regardless of type." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Same behaviour as above (i.e. average across events regardless of type)\n", + "evoked_vis = epochs[[\"visual/left\", \"visual/right\"]].average(by_event_type=False)\n", + "evoked_vis.plot()\n", + "evoked_vis" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "On the other hand, setting `by_event_type=True` returns a list of `Evoked` objects, one for each event type." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Average across events of the same type only\n", + "evoked_vis = epochs[[\"visual/left\", \"visual/right\"]].average(by_event_type=True)\n", + "for evoked in evoked_vis:\n", + " print(evoked)\n", + " evoked.plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - Controlling how event counts are handled\n", + "\n", + "You may have noticed above that the event counts for each type of event were not equal:\n", + "- `\"auditory/left\"` stimuli occur 72 times\n", + "- `\"auditory/right\"` stimuli occur 73 times\n", + "- `\"visual/left\"` stimuli occur 73 times\n", + "- `\"visual/right\"` stimuli occur 71 times" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(\n", + " os.path.join(mne.datasets.sample.data_path(), \"MEG\", \"sample\", \"sample_audvis_raw.fif\")\n", + ")\n", + "raw.pick(picks=[\"eeg\", \"stim\"])\n", + "raw.del_proj()\n", + "\n", + "# Generate the events array\n", + "events = mne.find_events(raw, stim_channel=\"STI 014\")\n", + "\n", + "# Choose the events to create epochs around\n", + "event_id = {\n", + " \"auditory/left\": 1,\n", + " \"auditory/right\": 2,\n", + " \"visual/left\": 3,\n", + " \"visual/right\": 4,\n", + "}\n", + "\n", + "# Create the epochs\n", + "epochs = mne.Epochs(raw=raw, events=events, event_id=event_id, tmin=-0.25, tmax=0.75)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Equalising event counts\n", + "\n", + "You may wish to use an equal number of events for each type when creating ERPs, e.g. for statistical purposes.\n", + "\n", + "This is easily done using the [`equalize_event_counts()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.equalize_event_counts) method of the `Epochs` object (i.e. before creating the `Evoked` object).\n", + "\n", + "By default, the number of events will be equalised:\n", + "- for all event types (`event_ids=None`)\n", + "- according to those events which are temporally closest to one another (`method=\"mintime\"`)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Equalise the number of epochs in each condition\n", + "epochs.copy().equalize_event_counts()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Equalising event counts**\n", + "\n", + "**Exercise:** Use the `event_ids` parameter of `equalize_event_counts()` to specify the auditory stimuli to have an equal number of events." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.copy().equalize_event_counts(event_ids=[\"auditory/left\", \"auditory/right\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use the `event_ids` parameter of `equalize_event_counts()` to specify the visual stimuli to have an equal number of events." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.copy().equalize_event_counts(event_ids=[\"visual/left\", \"visual/right\"])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use the `method` parameter of `equalize_event_counts()` to specify that event counts should be equalised by dropping the last events.\n", + "\n", + "How do the IDs of the dropped epochs compare to when the default `\"mintime\"` is used?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.copy().equalize_event_counts(method=\"truncate\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Combining information from multiple `Evoked` objects\n", + "\n", + "If equalised event counts are not a concern when you compute evoked data, it is important to consider what happens when you combine this data across multiple `Evoked` objects.\n", + "\n", + "Data can be combined across multiple `Evoked` objects using the [`combine_evoked()`](https://mne.tools/stable/generated/mne.combine_evoked.html) function.\n", + "\n", + "`combine_evoked()` takes a list of `Evoked` objects, as well as a way to weight the information from these `Evoked` objects (the `weights` parameter).\n", + "\n", + "The weighting approaches are: `\"nave\"`; `\"equal\"`; or a list of floats." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### `\"nave\"` weighting\n", + "\n", + "With `\"nave\"`, the data from each `Evoked` object is weighted proportionally to the number of epochs that `Evoked` object was averaged across.\n", + "\n", + "For example, if you have one `Evoked` object from 5 auditory events and another from 15 visual events:\n", + "- The evoked auditory data will be weighted by $\\frac{1}{4}$.\n", + "- The evoked visual data will be weighted by $\\frac{3}{4}$.\n", + "\n", + "This may help to reduce noise in your evoked data, however it biases the final evoked data towards those events with a greater number, e.g. it could make it appear as if the brain's response to visual stimuli is much stronger than to auditory stimuli!\n", + "\n", + "An example of `\"nave\"` weighting is shown below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create ERPs for two event types\n", + "evoked = epochs[[\"visual/right\", \"auditory/right\"]].average(by_event_type=True)\n", + "\n", + "# Combine ERPs by weighting according to number of events per type\n", + "evoked_nave = mne.combine_evoked(all_evoked=evoked, weights=\"nave\")\n", + "evoked_nave.plot()\n", + "evoked_nave" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how the weightings differ slightly for each condition.\n", + "\n", + "This reflects the fact that there were 73 auditory stimuli and 71 visual stimuli.\n", + "\n", + "Dividing these counts by the total number of events (144) gives the 0.507 and 0.493 weightings for auditory and visual events, respectively." 
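+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As a quick, purely illustrative sanity check, these `\"nave\"` weights can be reproduced from the event counts reported above with plain Python arithmetic." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Reproduce the \"nave\" weights from the event counts reported above\n", + "n_auditory, n_visual = 73, 71\n", + "total_events = n_auditory + n_visual\n", + "print(f\"auditory weight: {n_auditory / total_events:.3f}\")  # ~0.507\n", + "print(f\"visual weight: {n_visual / total_events:.3f}\")  # ~0.493"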
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### `\"equal\"` weighting\n", + "\n", + "The alternative approach is to weight the epochs of each event type equally.\n", + "\n", + "Using the same example of 5 auditory events and 15 visual events:\n", + "- The evoked auditory data will be weighted by $\\frac{1}{2}$.\n", + "- The evoked visual data will be weighted by $\\frac{1}{2}$.\n", + "\n", + "This avoids biases arising from differences in the number of events, but may lead to noisier results." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Combine the visual and auditory ERPs with an `\"equal\"` weighting.\n", + "\n", + "How do the results and reported weightings compare to the approach above?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked_equal = mne.combine_evoked(all_evoked=evoked, weights=\"equal\")\n", + "evoked_equal.plot()\n", + "evoked_equal" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Custom weightings\n", + "\n", + "Finally, custom weightings for the `Evoked` objects can also be supplied.\n", + "\n", + "This is done as a list of floats, with one for each `Evoked` object." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Provide a custom weighting for the stimuli, weighting the visual stimuli by 0.9 and the auditory stimuli by 0.1.\n", + "\n", + "How do the results compare to the weighting approaches above?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked_custom = mne.combine_evoked(all_evoked=evoked, weights=[0.9, 0.1])\n", + "evoked_custom.plot();\n", + "evoked_custom" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "##### Summary of different weighting approaches\n", + "\n", + "Since the numbers of auditory and visual events used here were so similar (73 vs. 71), weighting according to the number of events per type (`\"nave\"`) or providing equal weights (`\"equal\"`) gives very similar results.\n", + "\n", + "Naturally, our custom weighting skewed the evoked responses heavily towards the auditory stimuli.\n", + "\n", + "However, this custom weighting also mimics scenarios where there is a large difference in the number of events per condition (e.g. 90 auditory stimulus trials and 10 visual stimulus trials), where weighting according to the total number of events may have a large effect on your interpretation of the data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 3 - Creating `Evoked` objects from arrays\n", + "\n", + "Like for `Raw` and `Epochs` objects, `Evoked` objects can also be created from data arrays, using the [`EvokedArray`](https://mne.tools/stable/generated/mne.EvokedArray.html) class.\n", + "\n", + "Below, we generate some signals as sine waves, reshape them into continuous epochs, and then average across to create 'evoked' data."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Simulation settings\n", + "duration = 10 # seconds\n", + "sfreq = 200 # sampling rate (Hz)\n", + "epoch_duration = 2 # seconds\n", + "n_epochs = duration // epoch_duration\n", + "np.random.seed(44) # for reproducibility\n", + "\n", + "# Timepoints of the simulated data\n", + "times = np.linspace(start=0, stop=duration, num=sfreq * duration, endpoint=False)\n", + "\n", + "# Generate timeseries signals\n", + "data_raw = np.array(\n", + " [\n", + " np.sin(2 * np.pi * times * 1), # 1 Hz sine wave\n", + " np.sin(2 * np.pi * times * 3), # 3 Hz sine wave\n", + " ]\n", + ")\n", + "n_channels = data_raw.shape[0]\n", + "print(f\"Shape of timeseries data: {data_raw.shape} (channels x times)\")\n", + "\n", + "# Reshape into epochs\n", + "data_epochs = np.reshape(data_raw, (n_channels, n_epochs, epoch_duration * sfreq))\n", + "data_epochs = np.transpose(data_epochs, (1, 0, 2))\n", + "print(f\"Shape of epoched data: {data_epochs.shape} (epochs x channels x times)\")\n", + "\n", + "# Average across epochs\n", + "data_evoked = np.mean(data_epochs, axis=0)\n", + "print(f\"Shape of evoked data: {data_evoked.shape} (channels x times)\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Creating `Evoked` objects from arrays**\n", + "\n", + "**Exercise:** Using the information above, create an `Info` object for the simulated data, specifying them to be EEG channels and using the sampling frequency given above.\n", + "\n", + "*Hint:* use the [`create_info()`](https://mne.tools/stable/generated/mne.create_info.html) function." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "info = mne.create_info(ch_names=n_channels, sfreq=sfreq, ch_types=\"eeg\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Use the `data_evoked` and `Info` object generated above to create an `EvokedArray` object and display its properties." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked, info=info)\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Plot the data to verify that it matches our expectations, i.e.:\n", + "- 2 channels of a 1 and 3 Hz sine wave.\n", + "- Duration of 2 seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked.plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You should see the evoked data span from 0 to 2 seconds.\n", + "\n", + "The times of the data in the `EvokedArray` object can be controlled with the `tmin` parameter.\n", + "\n", + "Below, we explicitly set `tmin=0` (the default behaviour)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked data from the array with explicit tmin\n", + "info = mne.create_info(ch_names=n_channels, sfreq=sfreq, ch_types=\"eeg\")\n", + "evoked = mne.EvokedArray(data=data_evoked, info=info, tmin=0)\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EvokedArray` object where the data spans the period [-1, 1] seconds." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked, info=info, tmin=-1)\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EvokedArray` object where the data spans the period [-0.5, 1.5] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked, info=info, tmin=-0.5)\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Baselining of the evoked data can be controlled when creating an `EvokedArray` object using the `baseline` parameter.\n", + "\n", + "Below, we explicitly set `baseline=None` (the default behaviour, i.e. no baselining)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked data from the array with explicit (lack of) baselining\n", + "info = mne.create_info(ch_names=n_channels, sfreq=sfreq, ch_types=\"eeg\")\n", + "evoked = mne.EvokedArray(data=data_evoked.copy(), info=info, baseline=None)\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EvokedArray` object baselined for the first 100 ms of data.\n", + "\n", + "Make sure to pass in a copy of the `data_evoked` array, like above.\n", + "\n", + "*Hint:* Specifying baselines for evoked data takes the same form as for epoched data, i.e. `baseline=(start, end)`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked.copy(), info=info, baseline=(0, 0.1))\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EvokedArray` object baselined for the first 500 ms of data, where the data spans the period [-0.5, 1.5] seconds." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked.copy(), info=info, tmin=-0.5, baseline=(None, 0))\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Create an `EvokedArray` object baselined for the first 200 ms of data, where the data spans the period [-1, 1] seconds." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "evoked = mne.EvokedArray(data=data_evoked.copy(), info=info, tmin=-1, baseline=(None, -0.8))\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Controlling the times and baselining of evoked data are some of the most useful features when creating `EvokedArray` objects, however additional options exist for:\n", + "- Providing a label for the evoked data - `comment` parameter (default `\"\"`)\n", + "- Specifying the number of epochs which have been averaged across - `nave` parameter (default `1`)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create evoked data from the array with explicit comment and nave\n", + "info = mne.create_info(ch_names=n_channels, sfreq=sfreq, ch_types=\"eeg\")\n", + "evoked = mne.EvokedArray(data=data_evoked, info=info, comment=\"example_data\", nave=n_epochs)\n", + "evoked.plot()\n", + "evoked" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "MNE makes generating ERP data and storing it in `Evoked` objects easy, either from averaging data of `Epochs` objects, or from data arrays.\n", + "\n", + "Similarly to the `Raw` and `Epochs` classes, the `Evoked` class also has useful methods for:\n", + "- picking channels - [`pick()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.pick)\n", + "- cropping activity by time - [`crop()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.crop)\n", + "- plotting topographies of activity - [`plot_topo()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.plot_topo) and [`plot_topomap()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.plot_topomap)\n", + "- computing PSDs - [`compute_psd()`](https://mne.tools/stable/generated/mne.Evoked.html#mne.Evoked.compute_psd)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on `Evoked` objects: https://mne.tools/stable/auto_tutorials/evoked/10_evoked_overview.html\n", + "\n", + "MNE tutorial on visualising `Evoked` objects: https://mne.tools/stable/auto_tutorials/evoked/20_visualize_evoked.html\n", + "\n", + "MNE tutorial on ERP analysis: https://mne.tools/stable/auto_tutorials/evoked/30_eeg_erp.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day 2/_helper_functions.py b/workshops/mne_course/Day 2/_helper_functions.py new file mode 100644 index 0000000..1cb0e23 --- /dev/null +++ b/workshops/mne_course/Day 2/_helper_functions.py @@ -0,0 +1,51 @@ +"""Helper functions for using the notebooks.""" + +import numpy as np +from matplotlib import pyplot as plt + + +def plot_psd(psd: np.ndarray, freqs: np.ndarray) -> None: + """Plot power spectral density. + + Parameters + ---------- + psd : np.ndarray, shape of (channels, freqs) + Power spectral density data. + + freqs : np.ndarray, shape of (freqs,) + Frequencies in `psd`. 
+ + Notes + ----- + PSD values are not scaled according to channel type, unlike in MNE's PSD plotting. + """ + # Input checking + if not isinstance(psd, np.ndarray): + raise TypeError("`psd` must be a numpy array.") + if not isinstance(freqs, np.ndarray): + raise TypeError("`freqs` must be a numpy array.") + + if psd.ndim != 2: + raise ValueError("`psd` must be a 2D array.") + if freqs.ndim != 1: + raise ValueError("`freqs` must be a 1D array.") + + if psd.shape[1] != freqs.shape[0]: + raise ValueError("`psd` and `freqs` must have the same number of frequencies.") + + # Convert to same scale as MNE plotting + psd = psd.copy() + psd = np.log10(np.maximum(psd, np.finfo(float).tiny)) * 10 + + # Plotting + _, ax = plt.subplots(1, 1) + + for psd_chan in psd: + plt.plot(freqs, psd_chan, alpha=0.5) + + ax.set_xlim((freqs.min(), freqs.max())) + + ax.set_xlabel("Frequency (Hz)") + ax.set_ylabel("Power (dB; unscaled)") + + plt.show(block=False) diff --git a/workshops/mne_course/Day 2/figures/filter_types.png b/workshops/mne_course/Day 2/figures/filter_types.png new file mode 100644 index 0000000..e11e57e Binary files /dev/null and b/workshops/mne_course/Day 2/figures/filter_types.png differ diff --git a/workshops/mne_course/Day 2/figures/filter_types_marked.png b/workshops/mne_course/Day 2/figures/filter_types_marked.png new file mode 100644 index 0000000..a08b836 Binary files /dev/null and b/workshops/mne_course/Day 2/figures/filter_types_marked.png differ diff --git a/workshops/mne_course/Day3/1 - Source Space.ipynb b/workshops/mne_course/Day3/1 - Source Space.ipynb new file mode 100644 index 0000000..c1c620f --- /dev/null +++ b/workshops/mne_course/Day3/1 - Source Space.ipynb @@ -0,0 +1,598 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import os\n", + "\n", + "import mne" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Working in source space in MNE\n", + "\n", + "Localising activity in [source space](https://mne.tools/stable/documentation/glossary.html#term-source-space) is a fundamental part of many EEG and MEG analyses.\n", + "\n", + "Source space localisation generally involves:\n", + "1. Computing a subject-specific [forward model](https://mne.tools/stable/documentation/glossary.html#term-forward-solution)\n", + "2. Using the forward model to generate an [inverse model](https://mne.tools/stable/documentation/glossary.html#term-inverse-operator) for translating from sensor space to source space" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Creating forward models\n", + "\n", + "Although MNE provides sample forward models, creating these based on MRI recordings of each individual subject offers greater accuracy when localising sources.\n", + "\n", + "Thankfully, MNE offers a number of tools for creating subject-specific forward models.\n", + "\n", + "Computing forward models requires:\n", + "- [Coregistration information](https://mne.tools/stable/documentation/glossary.html#term-trans) (stored as a `-trans.fif` file) for aligning head and sensor positions\n", + "- The boundary element model ([BEM](https://mne.tools/stable/documentation/glossary.html#term-BEM)) surfaces which influence how source activity propagates to the sensors\n", + "- A source space of locations in the brain at which to estimate source activity\n", + "\n", + "We start by establishing the sample data paths." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "# Path to the sample data\n", + "sample_data_path = mne.datasets.sample.data_path()\n", + "sample_dir = sample_data_path / \"MEG\" / \"sample\"\n", + "\n", + "# Path to the FreeSurfer reconstructions\n", + "subjects_dir = os.path.join(sample_data_path, \"subjects\")\n", + "subject = \"sample\"" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Transformation information from coregistration\n", + "\n", + "Coregistration is the process of aligning head and sensor locations in a common coordinate system.\n", + "\n", + "This can be performed using the [`mne.gui.coregistration()`](https://mne.tools/stable/generated/mne.gui.coregistration.html) function.\n", + "\n", + "Run the cell below to open the coregistration GUI." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Using pyvistaqt 3d backend.\n", + "\n", + " Triangle neighbors and vertex normals...\n", + "Using high resolution head model in C:\\Users\\sangeetha\\mne_data\\MNE-sample-data\\subjects\\sample\\surf\\lh.seghead\n", + " Triangle neighbors and vertex normals...\n", + "Estimating fiducials from fsaverage.\n", + " Triangle neighbors and vertex normals...\n", + "Using high resolution head model in C:\\Users\\sangeetha\\mne_data\\MNE-sample-data\\subjects\\sample\\surf\\lh.seghead\n", + " Triangle neighbors and vertex normals...\n", + "Estimating fiducials from fsaverage.\n", + "Estimating fiducials from fsaverage.\n", + "Placing MRI fiducials - LPA\n", + "Using lh.seghead for head surface.\n", + "Placing MRI fiducials - LPA\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n", + "Using lh.seghead for head surface.\n" + ] + } + ], + "source": [ + "mne.gui.coregistration(subject=subject, subjects_dir=subjects_dir);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Below you can view a transformation file from a previous coregisration for the sample data.\n", + "\n", + "What do the coloured dots represent?\n", + "\n", + "What do the blue panels represent?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the transformation file obtained by coregistration\n", + "transformation = os.path.join(sample_dir, \"sample_audvis_raw-trans.fif\")\n", + "\n", + "# Load the data's information\n", + "raw = mne.io.read_raw_fif(os.path.join(sample_dir, \"sample_audvis_raw.fif\"))\n", + "\n", + "# Visualise coregistration of head and sensors\n", + "mne.viz.plot_alignment(\n", + " info=raw.info,\n", + " trans=transformation,\n", + " subject=subject,\n", + " subjects_dir=subjects_dir,\n", + " surfaces=\"head-dense\",\n", + " meg=[\"helmet\", \"sensors\"],\n", + " dig=True,\n", + ");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Computing the BEM Surfaces\n", + "\n", + "BEM surfaces are triangulations of the interfaces between different tissues which affect the propagation of signals (e.g. 
inner skull surface, outer skull surface, scalp surface).\n", + "\n", + "Computing BEM surfaces makes use of [FreeSurfer](https://surfer.nmr.mgh.harvard.edu/), and can be performed using the [`mne.bem.make_flash_bem()`](https://mne.tools/stable/generated/mne.bem.make_flash_bem.html) or [`mne.bem.make_watershed_bem()`](https://mne.tools/stable/generated/mne.bem.make_watershed_bem.html) functions.\n", + "\n", + "As this takes several minutes to compute per subject, we can take advantage of the pre-existing surfaces.\n", + "\n", + "We plot these surfaces using the [`mne.viz.plot_bem()`](https://mne.tools/stable/generated/mne.viz.plot_bem.html) function.\n", + "\n", + "What do the coloured lines represent?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mne.viz.plot_bem(\n", + " subject=subject,\n", + " subjects_dir=subjects_dir,\n", + " orientation=\"coronal\",\n", + " slices=[50, 100, 150, 200],\n", + " brain_surfaces=\"white\",\n", + ");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Defining the source space\n", + "\n", + "The source space determines the position and orientation of candidate source locations.\n", + "\n", + "Source spaces can be:\n", + "- Surface-based - source candidates are confined to a surface, e.g. the cortical surface ([`mne.setup_source_space()`](https://mne.tools/stable/generated/mne.setup_source_space.html))\n", + "- Volumetric/discrete - source candidates are arbitrary points within an area, e.g. within the inner skull ([`mne.setup_volume_source_space()`](https://mne.tools/stable/generated/mne.setup_volume_source_space.html))\n", + "\n", + "Below, we create source space candidates on the cortical surface and visualise them in 3D." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create source space candidates\n", + "surface_sources = mne.setup_source_space(\n", + " subject, spacing=\"oct6\", subjects_dir=subjects_dir, add_dist=\"patch\"\n", + ")\n", + "\n", + "# Visualise the candidates in 3D\n", + "mne.viz.plot_alignment(\n", + " subject=subject,\n", + " subjects_dir=subjects_dir,\n", + " surfaces=\"white\",\n", + " coord_frame=\"mri\",\n", + " src=surface_sources,\n", + ");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When creating volumetric source space candidates, these can be bound to a limited area.\n", + "\n", + "For example, we can create candidate sources only within the brain using the BEM surface of the inner skull, which we can visualise alongside the BEM surfaces." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the BEM surface for the inner skull\n", + "inner_skull_surface = os.path.join(subjects_dir, subject, \"bem\", \"inner_skull.surf\")\n", + "\n", + "# Create the volumetric source candidates\n", + "volume_sources = mne.setup_volume_source_space(\n", + "    subject=subject, surface=inner_skull_surface, subjects_dir=subjects_dir, add_interpolator=False\n", + ")\n", + "\n", + "# Visualise BEM surfaces and source space candidates (purple dots)\n", + "mne.viz.plot_bem(\n", + "    subject=subject,\n", + "    subjects_dir=subjects_dir,\n", + "    orientation=\"coronal\",\n", + "    slices=[50, 100, 150, 200],\n", + "    brain_surfaces=\"white\",\n", + "    src=volume_sources,\n", + ");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "What would happen if we did not specify a surface to bound the candidates to?" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Computing the forward model\n", + "\n", + "With the coregistration, BEM surfaces, and source space sorted, we can at last compute the forward model.\n", + "\n", + "We first load the BEM surfaces to create a BEM model using the [`mne.make_bem_model()`](https://mne.tools/stable/generated/mne.make_bem_model.html) function.\n", + "\n", + "We then create a BEM solution from this using the [`mne.make_bem_solution()`](https://mne.tools/stable/generated/mne.make_bem_solution.html) function." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Use only a single-layer BEM (just the inner skull) for speed\n", + "conductivity = (0.3,)  # one conductivity value per layer; a single value gives a one-layer model\n", + "\n", + "# Create the BEM model\n", + "bem_model = mne.make_bem_model(\n", + "    subject=subject, ico=4, conductivity=conductivity, subjects_dir=subjects_dir\n", + ")\n", + "\n", + "# Use the BEM model to create the BEM solution\n", + "bem_solution = mne.make_bem_solution(bem_model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we can pass the BEM solution along with our source space candidates and coregistration information to the [`mne.make_forward_solution()`](https://mne.tools/stable/generated/mne.make_forward_solution.html) function to create the forward model.\n", + "\n", + "Here, we only compute the model for the MEG sensors, using the surface-based (i.e. cortical) source candidates." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create the forward model\n", + "forward = mne.make_forward_solution(\n", + "    info=raw.info,  # data information\n", + "    trans=transformation,  # coregistration information\n", + "    src=surface_sources,  # surface-based source candidates\n", + "    bem=bem_solution,  # BEM solution\n", + "    meg=True,\n", + "    eeg=False,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This returns an [`mne.Forward`](https://mne.tools/stable/generated/mne.Forward.html) object, which contains the forward model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "forward" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The leadfield matrix representing the transformation between source and sensor spaces can be extracted from the `Forward` object as below."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "leadfield = forward[\"sol\"][\"data\"]\n", + "print(f\"Leadfield matrix shape: {leadfield.shape} (sensors x dipoles)\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how the number of dipoles (24,579) is 3 times greater than the number of sources (8,193).\n", + "\n", + "This reflects the fact that we have not specified the orientations of the sources, so the leadfield matrix is computed for each orientation in 3D space.\n", + "\n", + "Fixing the orientation of the sources (e.g. to a cortical orientation) can be performed using the [`mne.convert_forward_solution()`](https://mne.tools/stable/generated/mne.convert_forward_solution.html) function." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - Inverse modelling for source localisation\n", + "\n", + "Now that we have a forward model, we can use this to reconstruct source activity from sensor space data.\n", + "\n", + "Various approaches for source reconstruction are offered in MNE:\n", + "- Minimum norm estimation ([MNE](https://mne.tools/stable/documentation/glossary.html#term-MNE))\n", + "- Dynamic statistical parametric mapping ([dSPM](https://mne.tools/stable/documentation/glossary.html#term-dSPM))\n", + "- Standardised low-resolution electromagnetic tomography ([sLORETA](https://mne.tools/stable/documentation/glossary.html#term-sLORETA))\n", + "- Exact low-resolution electromagnetic tomography ([eLORETA](https://mne.tools/stable/documentation/glossary.html#term-eLORETA))\n", + "- Linearly constrained minimum variance ([LCMV](https://mne.tools/stable/documentation/glossary.html#term-LCMV)) [beamformer](https://mne.tools/stable/documentation/glossary.html#term-beamformer)\n", + "- Dynamic imaging of coherent sources ([DICS](https://mne.tools/stable/documentation/glossary.html#term-DICS)) beamformer\n", + "\n", + "
\n", + "\n", + "Creating an inverse model requires:\n", + "- A forward model\n", + "- A covariance matrix\n", + "\n", + "
\n", + "\n", + "Below we will explore the process for creating an inverse model and applying it with the dSPM approach.\n", + "\n", + "We start by loading MNE's sample data and creating epochs around the left auditory stimuli." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the sample data\n", + "raw = mne.io.read_raw_fif(fname=os.path.join(sample_dir, \"sample_audvis_raw.fif\"))\n", + "raw.pick(picks=[\"eeg\", \"stim\"], exclude=\"bads\")\n", + "raw.set_eeg_reference(ref_channels=\"average\", projection=True)\n", + "\n", + "# Create the events array\n", + "events = mne.find_events(raw=raw, stim_channel=\"STI 014\")\n", + "\n", + "# Create the epochs\n", + "epochs = mne.Epochs(\n", + " raw=raw,\n", + " events=events,\n", + " event_id={\"auditory/left\": 1}, # isolate the left auditory stimuli\n", + " tmin=-0.2, # start each epoch 200 ms before the stimulus\n", + " tmax=0.5, # end each epoch 500 ms after the stimulus\n", + " baseline=(None, 0), # baseline epochs in the window [-200, 0] ms\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, we average over the epochs to create ERPs in sensor space.\n", + "\n", + "This is the activity we will reconstruct in source space." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create ERPs from the epochs\n", + "evoked = epochs.average()\n", + "evoked.plot()\n", + "evoked.plot_topomap(times=[-0.1, 0.0, 0.1, 0.2]);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As we are working with MNE's sample data, we can load the pre-computed forward model using [`mne.read_forward_solution()`](https://mne.tools/stable/generated/mne.read_forward_solution.html)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Load pre-computed EEG forward solution\n", + "forward = mne.read_forward_solution(os.path.join(sample_dir, \"sample_audvis-eeg-oct-6-fwd.fif\"))\n", + "forward" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, we need to compute the covariance matrix - specifically the [covariance matrix of the noise](https://mne.tools/stable/documentation/glossary.html#term-noise-covariance) - using the [`mne.compute_covariance()`](https://mne.tools/stable/generated/mne.compute_covariance.html) function.\n", + "\n", + "Here, we define noise to be the baseline period preceding stimulus presentation, and as such specify that covariance should only be computed for the window [-0.2, 0] seconds.\n", + "\n", + "What does the covariance matrix represent, and what role does it play in the inverse model?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute and plot noise covariance\n", + "noise_covariance = mne.compute_covariance(epochs=epochs, tmin=None, tmax=0, method=\"empirical\")\n", + "mne.viz.plot_cov(noise_covariance, raw.info);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the forward model and covariance matrix, we can create an inverse model using the [`mne.minimum_norm.make_inverse_operator()`](https://mne.tools/stable/generated/mne.minimum_norm.make_inverse_operator.html) function." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Create inverse model\n", + "inverse = mne.minimum_norm.make_inverse_operator(\n", + "    info=evoked.info, forward=forward, noise_cov=noise_covariance\n", + ")\n", + "inverse" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The inverse model can be applied to:\n", + "- `Evoked` objects - [`mne.minimum_norm.apply_inverse()`](https://mne.tools/stable/generated/mne.minimum_norm.apply_inverse.html)\n", + "- `Raw` objects - [`mne.minimum_norm.apply_inverse_raw()`](https://mne.tools/stable/generated/mne.minimum_norm.apply_inverse_raw.html)\n", + "- `Epochs` objects - [`mne.minimum_norm.apply_inverse_epochs()`](https://mne.tools/stable/generated/mne.minimum_norm.apply_inverse_epochs.html)\n", + "- `EpochsTFR` objects - [`mne.minimum_norm.apply_inverse_tfr_epochs()`](https://mne.tools/stable/generated/mne.minimum_norm.apply_inverse_tfr_epochs.html)\n", + "- `Covariance` objects - [`mne.minimum_norm.apply_inverse_cov()`](https://mne.tools/stable/generated/mne.minimum_norm.apply_inverse_cov.html)\n", + "\n", + "Here, we apply the inverse model to the ERP data using the dSPM method, which returns an [`mne.SourceEstimate`](https://mne.tools/stable/generated/mne.SourceEstimate.html) object." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Extract source activity of the ERP\n", + "source_activity = mne.minimum_norm.apply_inverse(\n", + "    evoked=evoked, inverse_operator=inverse, method=\"dSPM\"\n", + ")\n", + "source_activity" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can now visualise the ERPs in source space using the [`plot()`](https://mne.tools/stable/generated/mne.SourceEstimate.html#mne.SourceEstimate.plot) method.\n", + "\n", + "Play around with the options in the visualisation GUI to see how the [source activity](https://mne.tools/stable/documentation/glossary.html#term-source-estimate) changes over time in response to the stimuli.\n", + "\n", + "Does the localisation of this source activity make sense given the stimulus being presented?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Visualise source activity\n", + "source_activity.plot(\n", + "    subject=subject, hemi=\"rh\", subjects_dir=subjects_dir, initial_time=0.1, backend=\"pyvistaqt\"\n", + ");" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "Reconstructing activity in source space is a fundamental part of many neuroscience projects involving EEG and MEG data.\n", + "\n", + "As you can see, there is a wide range of tools available in MNE for translating activity from sensor space to source space, which we have only briefly covered here.\n", + "\n", + "For more in-depth discussions of particular approaches, see the MNE tutorials linked below."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE forward modelling tutorials: https://mne.tools/stable/auto_tutorials/forward/index.html\n", + "\n", + "MNE inverse modelling tutorials: https://mne.tools/stable/auto_tutorials/inverse/index.html\n", + "\n", + "MNE forward modelling module: https://mne.tools/stable/api/forward.html\n", + "\n", + "MNE inverse modelling module: https://mne.tools/stable/api/inverse.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day3/2 - Connectivity 1.ipynb b/workshops/mne_course/Day3/2 - Connectivity 1.ipynb new file mode 100644 index 0000000..44823df --- /dev/null +++ b/workshops/mne_course/Day3/2 - Connectivity 1.ipynb @@ -0,0 +1,679 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import mne_connectivity\n", + "\n", + "from _helper_functions import simulate_connectivity, plot_connectivity" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Frequency-resolved connectivity - the `mne-connectivity` package\n", + "\n", + "A common analysis in neuroscience is to compute the connectivity between signals, from which we can make inferences about the communication between different regions of the brain.\n", + "\n", + "In particular, we are often interested in the frequency-specific coupling of signals, often termed effective/spectral/oscillatory connectivity.\n", + "\n", + "Thankfully, MNE has a sister package, [MNE-Connectivity](https://mne.tools/mne-connectivity/stable/index.html), for doing just that!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Simulating connectivity\n", + "\n", + "Above, we import a custom helper function `simulate_connectivity()` which generates some signals with interactions at a given frequency range (feel free to check out this function in the [_helper_functions.py](_helper_functions.py) file).\n", + "\n", + "We will use this as a starting point to explore connectivity computations in MNE." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We start by simulating 2 interacting channels (1 seed and 1 target) in the frequency range 5-10 Hz." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "# Simulate 5-10 Hz connectivity\n", + "epochs_5_10 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(5, 10))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Simulating connectivity**\n", + "\n", + "**Exercise:** Verify that activity is present in the 5-10 Hz frequency range of these channels by computing the power spectra of the data.\n", + "\n", + "*Hint:* Both signals should contain activity in this frequency range." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Using multitaper spectrum estimation with 7 DPSS windows\n", + "Averaging across epochs...\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "C:\\Users\\sangeetha\\AppData\\Local\\Temp\\ipykernel_7056\\3961488573.py:2: RuntimeWarning: Channel locations not available. Disabling spatial colors.\n", + " epochs_5_10.compute_psd().plot();\n", + "c:\\Users\\sangeetha\\anaconda3\\envs\\mne\\Lib\\site-packages\\mne\\viz\\utils.py:165: UserWarning: FigureCanvasAgg is non-interactive, and thus cannot be shown\n", + " (fig or plt).show(**kwargs)\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA/MAAAFpCAYAAADQnnivAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8g+/7EAAAACXBIWXMAAA9hAAAPYQGoP6dpAACrH0lEQVR4nOzdd1gU1/c/8PcWugICooACih0bKhbsxt57qjFqoomJicbELnYxGjWm2KPmk8RoVFQSQVGxIQrSRBCV3pHeYdnZnd8f/tivhCJld2d3OK/n4WHYMnMOcxj2zty5V8CyLAtCCCGEEEIIIYRoDSHXARBCCCGEEEIIIaR+qDFPCCGEEEIIIYRoGWrME0IIIYQQQgghWoYa84QQQgghhBBCiJahxjwhhBBCCCGEEKJlqDFPCCGEEEIIIYRoGWrME0IIIYQQQgghWoYa84QQQgghhBBCiJahxjwhhBBCCCGEEKJlqDFPCCGEEJw6dQoCgaDGr9u3bwMA7O3ta3zNiBEjqqw3LCwMixYtgoODAwwMDGBgYICOHTtiyZIlCAwMVG+ShBBCCI+IuQ6AEEIIIZrj5MmT6NKlS5XHu3XrplgePHgwvv/++yqvMTY2rvTzkSNH8MUXX6Bz58746quv4OjoCIFAgMjISPz1119wdnZGdHQ0HBwclJ8IIYQQwnPUmCeEEEKIQvfu3dGvX79aX2NqaoqBAwfW+pr79+9j6dKlmDRpEs6fPw9dXV3Fc6NGjcLnn3+Oc+fOwcDAQClxE0IIIU0NNeYJIYQQonQ7d+6ESCTCkSNHKjXkXzdnzhw1R0UIIYTwBzXmCSGEEKIgk8nAMEylxwQCAUQikeJnlmWrvAYARCIRBAIBZDIZbt26hX79+sHKykrlMRNCCCFNEQ2ARwghhBCFgQMHQkdHp9KXnp5epdd4enpWeY2Ojg527NgBAMjKykJpaSns7OyqrL/iZEHFF8uyasmLEEII4Ru6Mk8IIYQQhf/973/o2rVrpccEAkGln4cMGYL9+/dXea+Njc0b19+3b188fvxY8fOePXvwzTffNDBaQgghpOmixjwhhBBCFLp27frGAfBMTExqfY2FhQUMDAyQkJBQ5bnTp0+jpKQEaWlpmDp1aqPjJYQQQpoqaswTQgghRKlEIhFGjRoFb29vpKWlVbpvvmKKu/j4eI6iI4QQQviB7pknhBBCiNKtXbsWMpkMn376KaRSKdfhEEIIIbxDV+YJIYQQohAeHl7tSPUODg5o2bIlACAvLw8PHz6s8ho9PT04OTkBAAYPHoxffvkFy5YtQ58+fbB48WI4OjpCKBQiLS0NFy5cAAAYGxurMBtCCCGEvwQsDSNLCCGENHmnTp3CggULanz+2LFj+Pjjj2Fvb1/tvfDAqwHwkpOTKz32+PFjHDhwALdv30ZqaioEAgHatGkDFxcXzJ8/H6NGjVJqHoQQQkhTQY15QgghhBBCCCFEy9A984QQQgghhBBCiJahxjwhhBBCCCGEEKJlqDFPCCGEEEIIIYRoGWrME0IIIYQQQgghWoYa84QQQgghhBBCiJbRqMb83bt3MWXKFFhbW0MgEODSpUuK56RSKVavXo0ePXrAyMgI1tbW+PDDD5Gamqp4TU5ODpYtW4bOnTvD0NAQtra2+PLLL5Gfn89BNoQQQgghhBBCiGqIuQ7gdcXFxejVqxcWLFiAWbNmVXqupKQEwcHB2LhxI3r16oXc3FwsX74cU6dORWBgIAAgNTUVqamp+P7779GtWzckJCTg008/RWpqKs6fP1/nOORyOVJTU9G8eXMIBAKl5kgIIYQQQgghpGliWRaFhYWwtraGUNi4a+saO8+8QCDAxYsXMX369Bpf8+jRI/Tv3x8JCQmwtbWt9jXnzp3DBx98gOLiYojFdTt3kZycjLZt2zYkbEIIIYQQQgghpFZJSUlo06ZNo9ahUVfm6ys/Px8CgQCmpqa1vsbY2LjWhrxEIoFEIlH8XHF+IyYmBhYWFpDJZAAAkUhUaZlhGAgEAsWyUCiEUCiscVkqlUIkEimWxWIxBAKBYhkAGIaptKyjowOWZRXLcrkcMplMsSyXyyEWi2tclslkYFlWsVxdHpRT08qptLQUjx8/Rr9+/RTb1Pac+LifKKeG5ySRSBASEoJ+/fopeldpe0583E+UU8NzksvlePToEfr16wcdHR1e5MTH/UQ5NSyn8vJyPH78GL1794ZIJOJFTnzcT5RTw3PKzMxEhw4d0Lx5czSWRt0zXx9lZWVYs2YN3nvvPRgbG1f7muzsbGzbtg1LliypdV1ubm4wMTFRfFVc5U9JSYGxsTFSUlIUy/Hx8Xj58iWMjY0RExOD7OxsGBsb49mzZ4oTB+Hh4SguLoaxsTFCQ0NRXl4OY2NjBAYGQi6Xw9jYGA8fPoRAIICxsTF8fX2hq6sLQ0ND+Pr6wtDQELq6uvD19YWxsTEEAgEePnwIY2NjyOVyBAYGwtjYGOXl5QgNDYWxsTGKi4sRHh4OY2Nj5Ofn49mzZzA2NkZ2djZiYmJgbGyMly9fIj4+nnJq4jkFBgZi4MCBMDMz401OfNxPlFPDc8rPz0eLFi1gZmbGm5z4uJ8op4bnZGZmpviQypec+LifKKeG5RQdHY0xY8YgPz+fNznxcT9RTg3P6fHjxwCglNu5tbKbvVQqxZw5c5CYmIjbt2/D2LhqY76goABjx45FixYt4OHhAR0dnRq39d8r8wUFBWjbti3u3LmDYcOG0ZklyolXOZWVlSEpKQnt27dXbF/bc+LjfqKcGp5TeXk
54uPj4eDgoDiua3tOfNxPlFPDc2JZFtHR0XBwcIBYLOZFTnzcT5RTw3KSSqVISkqCra0thEIhL3Li436inBqe0507dzBixAhFo78xtK4xL5VKMXfuXMTGxsLHxwfm5uZV3ltYWIhx48bB0NAQ//77L/T19eu17YKCApiYmMDLywvjx49vTBqEaByGYRASEgInJyfFgY8QPqEaJ3xHNU74jOqb8N3Vq1cxYcIEpTTmteovpKIhHxUVhVu3blXbkC8oKMC4ceOgp6cHDw+PejfkX0cHEMJHYrEYzs7OXIdBiMpQjRO+oxonfEb1TfhOmW1MjbpnvqioCKGhoQgNDQUAxMXFITQ0FImJiWAYBrNnz0ZgYCD+/PNPyGQypKenIz09HeXl5QBeXZEfO3YsiouL8euvv6KgoEDxmoquEPXRkPcQoulkMhmePXtG9U14i2qc8B3VOOEzqm/Cd8qsbY269BwYGIiRI0cqfv76668BAPPnz8fmzZvh4eEBAOjdu3el9926dQsjRoxAUFAQ/P39AQAdOnSo9Jq4uDjY29urLnhCtEhpaSnXIRCiUlTjhO+oxgmfUX0TUjcae888lyrumb9+/TpGjx7NdTiEEEIIIYQQQnjgxo0bihkbGnvPvEZ1s9c01L2H8JFMJkN4eDjVN+EtqnHCd1TjhM+ovgnfKbO2qTFPCCGEEEIIIYRoGY26Z17TiEQirkMgTQTLshAIBGrZlkgkQvfu3dWyLUK4QDVO+I5qnPAZ1TfhO2W2MenKfC2oew9RJZZl8eTJE+zduxfLly/HoUOHIJfLVb5dmUyGkJAQqm/CW1TjhO+oxgmfUX0TvuPtaPaaxtramusQCEeSkpIQHByMdu3aoV27dmjevLnS1p2cnIzr168jPDwc3bt3x9tvv402bdrg2rVr2LZtG1avXg19fX2lba86BgYGKl0/IVyjGid8RzVO+Izqm/CZMtuYNJp9NSpGs1fGCINE+9y+fRtXr17FlClTkJiYiNjYWBQWFkIkEqFt27Zo37492rVrBzs7O+jq6tZpnfn5+fDx8YGvry8MDAzQpUsXGBgYIDU1FS9fvkRZWRkGDRqE9u3b4/Tp01izZg3Mzc1VnCkhhBBCCCFEnZTZ1qTGfDUqfsHZ2dkwMzPjOhyiJgzD4OjRowCAxYsXQywWV3k+JSUFsbGxiIuLQ0JCAqRSqeJed5ZlIZPJUFJSgsLCQpSUlKC0tBSFhYWQyWSwt7eHnZ0dbGxsYG1trfhuaWkJkUiECxcuICIiAnPnzsWxY8fw+eefw8HBQSV5hoSEwMnJqUqOhPAB1TjhO6pxwmdU34TvcnJyYG5urpTGPP2F1EJdA5IR7mVnZ2PPnj0YP348RowYUe1rxGIx7OzsYGtri27duiEmJgZxcXFITU1Fbm4uAEBPTw92dnawsrJCq1at0LJlS7Rs2RImJiZvrKfZs2eje/fuOHr0KN555x0cO3YMM2bMwIABA5Saq0AgQIsWLai+CW9RjRO+oxonfEb1TfhOmbVNV+arUXFlPicnBy1atOA6HKJiYWFhOHXqFJYvXw5bW9tKz0kkEsTFxSEmJgaxsbFITU0Fy7Jo1aoVHBwc0K5dO9jY2Cj1n05RURH279+PDh06ID4+Hl27dsX06dOVsm5CCCGEEEIId3Jzc2FmZkbd7FWlojHv5eWF8ePHcx0OURGWZfH3338jJiYGK1asUAy2wrIsPDw84OfnB319fdjb26NDhw5wcHCAlZWVWs4UsyyLS5cuITQ0FFZWVmAYBkuXLoVQ2PgJKBiGQUBAAPr370/d1wgvUY0TvqMaJ3xG9U347urVq5gwYQJ1s1c1ZTSciGYqKSnB3r174ejoiLVr1yoa6Hl5edi/fz969OiBXbt2cdbFSyAQYMaMGXB0dMThw4fRrVs3bNmyBWvWrGn0CK9CoRA2NjZU34S3qMYJ31GNEz6j+iZ8p8zapr+SWtBBhJ8SEhKwYcMGzJw5EzNnzlQ02P39/bF161bMnz8fs2fP1oh7tTp16oRt27bh5cuXMDIywoYNG/Dy5ctGrVMoFMLOzo7qm/AW1TjhO6pxwmdU34TvqDGvJgzDcB0CUbKMjAwcOHAAGzduhKOjI4BX98X/+OOPCAgIgJubG9q3b89xlJUZGRlh3bp16Ny5M3R0dLBr1y6Eh4c3eH0Mw+Du3btU34S3qMYJ31GNEz6j+iZ8p8zapsZ8LeiMIL8UFhZi165d+PbbbxUDG0ZHR2Pt2rUYOHAgli1bBj09PY6jrJ5AIMC0adMwffp06Orq4syZM/Dy8mrQuoRCIRwcHKi+CW9RjRO+oxonfEb1TfhOmbVN98zXgg4i/CGVSrFz504sWbIEVlZWYFkW586dw9OnT+Hq6gpTU1OuQ6yTgQMHwtDQEL///juioqIQHx+PJUuW1KtWK+5FI4SvqMYJ31GNEz6j+iZ8R93s1YS69/ADy7LYs2cPpk+fjs6dOyMrKwsbNmyAgYEBNm3apDUN+Qo9e/bEp59+ivj4eFhYWGDz5s0oLi6u8/sZhoGPjw/VN+EtqnHCd1TjhM+ovgnfUTd7NaEr8/xw9OhR9OzZEwMGDMC9e/ewa9cuLFmyBFOmTNGIQe4awsHBAd988w0ePHiAESNGYOPGjUhJSanTe4VCIbp37071TXiLapzwHdU44TOqb8J3dGVeTeggov3c3d2hq6uL0aNHY9++fYiIiICbmxtsbW25Dq3RrK2tsX79ely5cgVz5szB/v37ERIS8sb3CYVCWFpaUn0T3qIaJ3xHNU74jOqb8B015tWEuvdot3v37iE6OhqDBw/GunXrMHz4cHz66afQ0dHhOjSlMTc3x+bNm3Hp0iXMnj0bV65cwY0bN2p9j1QqxbVr1yCVStUUJSHqRTVO+I5qnPAZ1TfhO2W2MQUsy7JKWxtPFBQUwMTEBImJiWjbti3X4ZAGiIiIwOnTp9G9e3e8ePECy5cvh4mJCddhqYxEIsGOHTswevRo/PPPP9i0aROaNWtW7Wvlcjny8vJgampKZ70JL1GNE76jGid8RvVN+C4pKQm2trbIz8+HsbFxo9ZFjflqVDTmlfELJuqXnJwMNzc3NGvWDEOGDMHkyZO19t74+mAYBnv27IGJiQl0dXXx8ccfcx0SIYQQQggh5DXKbGvS6a5aUPce7ZObm4tvvvkGALB06VKtHuSuvsRiMdasWYO4uDhERkYiJyen2tdJpVJcuXKF6pvwFtU44TuqccJnVN+E75RZ29SYr4VYLOY6BFIP5eXlePfdd9G9e3f88MMPsLOz4zoktRMIBPj4448hEAjwxx9/VPsasViMoUOHUn0T3qIaJ3xHNU74jOqb8J0ya5sa87XIzc3lOgRSRyzLYvXq1ejWrRs2bNjAq0Hu6qtz586wsrLCixcvkJaWVuV5gUAAY2PjJtNjgTQ9VOOE76jGCZ9RfRO+U2YbkxrztXj27BnXIZA6unDhAp49e4bt27dzHYpGWLRoEcrLy/Hbb79VeU4qleLy5cvUfY3wFtU44TuqccJnVN+E75TZxqTGfC1EIhHXIZ
A6CAsLwx9//IENGzbA0NCQ63A0gqmpKcaPH48XL14gLi6u0nNisRhjx46l7muEt6jGCd9RjRM+o/omfKfMNiY15olWy8zMxA8//ICePXti8ODBXIejUaZNmwY9PT0cP368ynP0D5LwHdU44TuqccJnVN+E1A015mshk8m4DoHUory8HN999x309PTw1VdfcR2OxhGJRFiyZAlevHiBp0+fKh5nGAaenp5gGIbD6AhRHapxwndU44TPqL4J3ymzjalRjfm7d+9iypQpsLa2hkAgwKVLlxTPSaVSrF69Gj169ICRkRGsra3x4YcfIjU1tdI6JBIJli1bBgsLCxgZGWHq1KlITk5uUDzUzV5zsSyLH3/8EZaWlpgwYQLMzc25Dkkj9e7dG506dcIvv/wClmUBvDrbPXHiRDrrTXiLapzwHdU44TOqb8J3vO1mX1xcjF69euHnn3+u8lxJSQmCg4OxceNGBAcHw93dHS9evMDUqVMrvW758uW4ePEizpw5A19fXxQVFWHy5Ml0lZ1nLl++DCMjI+Tk5GDKlClch6PRvvrqK8TFxSEwMFDxGJ3tJnxHNU74jmqc8BnVNyF1o1GN+QkTJmD79u2YOXNmledMTExw/fp1zJ07F507d8bAgQPx008/ISgoCImJiQCA/Px8/Prrr9i7dy9Gjx4NJycn/PHHH3jy5Alu3LhR73joBIBmCgsLQ1hYGF6+fInPPvuMpi55A0tLS8ycORN79+4Fy7JgGAbe3t70j5LwFtU44TuqccJnVN+E73jbzb6+8vPzIRAIYGpqCgAICgqCVCrF2LFjFa+xtrZG9+7d4efnV+N6JBIJCgoKKn0BUDQSZTKZ4pf++jLDMJWW5XJ5rctSqbTSckW354pllmWrLAOotCyXyystVxzoalqWyWSVlqvLQ5tyyszMxG+//YaBAwfCzs4Obdq00fqc1LGf3nvvPRQWFuLmzZsAgKlTp0JHR0erc+LjfqKclJOTUCjEpEmToKOjw5uc+LifKKeG56Sjo1OpGzIfcuLjfqKcGpaTQCDAtGnTIBQKeZMTH/cT5dTwnJRJaxvzZWVlWLNmDd577z0YGxsDANLT06Grq4sWLVpUem2rVq2Qnp5e47rc3NxgYmKi+Grbti0AIDY2FgAQGRmJyMhIAK+uCkdFRQEAQkJCFNN+BQQEICkpCQDg5+eHtLQ0AK/GAcjKygIA+Pj4IC8vDwDg7e2NwsJCAICnpyfKysrAMP834EdZWRk8PT0BAIWFhfD29gYA5OXlwcfHBwCQlZWFu3fvAgDS0tIUJyySkpIQEBAAAIiLi0NISAgAICoqCmFhYVqbk7+/P3bv3o3Zs2cjMTER77//vtbnpK79dO/ePaxYsQJ79uyBt7c30tPTwbKsVufEx/1EOSkvp4CAALAsy6uc+LifKKeG5cSyLK5du6a4+MCHnPi4nyinhudUUFDAu5z4uJ8op4blVLFuZRCwFac3NIxAIMDFixcxffr0Ks9JpVLMmTMHiYmJuH37tqIxf/r0aSxYsAASiaTS68eMGQMHBwccPny42m1JJJJK7ykoKEDbtm3h5eWF8ePHK860iESiSssMw0AgECiWhUKh4ixidctSqRQikUixLBaLIRAIFMvAq7M2ry/r6Ogoukbr6OhALpdDJpMpluVyOcRicY3LMpkMLMsqlqvLQ1ty+uGHHzB48GA8ePAAw4cPh5OTk9bnpM79JBQK8d5772Hq1KkwMTHBuHHjAECrc+LjfqKcGp9TWVkZbty4gXHjxkEoFPIiJz7uJ8qp4TnJZDJcu3YNY8eOha6uLi9y4uN+opwalpNEIoGPjw/eeustiMViXuTEx/1EOTU8Jy8vL0ycOBH5+fmKdmxDaV1jXiqVYu7cuYiNjYWPj0+lUcwr/vBzcnIqXZ3v1asXpk+fji1bttRp2wUFBTAxMVHKL5gox7179/D8+XMMHDgQ165dw8qVK7kOSSvFx8fj3Xffxd27d6Gjo8N1OIQQQgghhDQpymxralU3+4qGfFRUFG7cuFFlOrK+fftCR0cH169fVzyWlpaG8PBwuLi41Ht7qrivgdRfQUEBLl68iPfeew8nTpzA4sWLuQ5Ja9nb22PQoEE4ePAg1TfhLblcjpycHKpxwltU44TPqL4J3ymztjWqMV9UVITQ0FCEhoYCeHW/QmhoKBITE8EwDGbPno3AwED8+eefkMlkSE9PR3p6OsrLywG8GvF+0aJFWLlyJW7evImQkBB88MEH6NGjB0aPHl3veCq6TxBuHTx4EIsXL8b58+cxdepUNG/enOuQtNqmTZtgZmamuJ+HEL6RyWR49OgRHcMJb1GNEz6j+iZ8p8za1qhu9rdv38bIkSOrPD5//nxs3rwZ7dq1q/Z9t27dwogRIwC8Ghjv22+/xenTp1FaWoq33noLBw8eVAxqVxfUzV5z+Pn5ITw8HOPHj8evv/6KzZs301R0SnDp0iUcPnwY//77r+IeI0IIIYQQQohqKbOtqVGNeU1R8QsODg6Gk5MT1+E0WYWFhdi0aRPc3NywadMmfP3117C0tOQ6LK0nl8uRlZWFdevWwd7eHhs2bOA6JEKUqqLGLSwsFAPgEcInVOOEz6i+Cd+FhISgT58+Te+eeXV7+fIl1yE0aYcOHcLHH3+Ma9euYcCAAdSQVxK5XI7w8HC4ubnBx8cH9+/f5zokQpSqosbpfkvCV1TjhM+ovgnfKbONSY35WlD3Y+48fPgQzZs3h6WlJe7fv1/tFIWkYcRiMUaNGoWWLVti+fLlcHNzQ0ZGBtdhEaI0FTVOx3DCV1TjhM+ovgnfKbO2qTFfCzojyI3i4mL8/fffWLBgAX755RcsXbqU7pNXIrlcjpSUFMjlckyZMgXt2rXD1q1bwTAM16ERohSv1zghfEQ1TviM6pvwHW9Hs9c0dBDhxuHDh7Fw4UI8fPgQ9vb2sLOz4zokXpHL5YiJiYFcLodAIMC6deuQm5uLEydOcB0aIUrxeo0TwkdU44TPqL4J31FjXk2oe4/6PXr0CPr6+rCzs4OHhwfee+89rkPiHbFYjGHDhinq28rKCtOnT0dgYCD8/f05jo6QxvtvjRPCN1TjhM+ovgnfUTd7NaEzgupVUlKCv/76C4sWLcKRI0ewaNEi6OjocB0W78jlciQkJFSq75kzZ6JZs2b43//+h8zMTA6jI6TxqqtxQviEapzwGdU34Tu6Mq8mdBBRryNHjmD+/Pl4+vQp9PT04OjoyHVIvFTdvWgikQhffPEFDAwMsHfvXshkMg4jJKRx6H5LwndU44TPqL4J31FjXk2oe4/6BAcHQywWo0uXLvjjjz+waNEirkPiLbFYDBcXlyr13b59e3Tt2hWtW7fGyZMnOYqOkMarqcYJ4QuqccJnVN+E76ibvZrQ1Un1KCsrwx9//IGPP/4Yp06dwty5c2FoaMh1WLwlk8kQHR1dbX3PmzcPiYmJyMzMxKNHjziIjpDGq63GCeEDqnHCZ1TfhO+UWdvUmK9FixYtuA6hSfDy8sKUKVOQnJyMnJwcDBw4kOuQeI1lWeTm5oJl2
SrP6erqYsGCBRAKhThz5gzy8vLUHyAhjVRbjRPCB1TjhM+ovgnfKbONSY35WnTp0oXrEHiPZVn4+vpi8ODBOHr0KD777DOuQ+I9sVgMZ2fnGrv49OjRA/r6+hg8eDCOHz+u5ugIabw31Tgh2o5qnPAZ1TfhO2W2MakxXwvq3qN6/v7+cHZ2hru7O8aMGQNTU1OuQ+I9mUyGZ8+e1Vrfn3zyCXx9fSGXyxEREaHG6AhpvLrUOCHajGqc8BnVN+E76mZPeOPff/9F//79ER4ejjFjxnAdTpNRWlpa6/OGhoaYOnUqzMzMcPLkSRpRlmidN9U4IdqOapzwGdU3IXUjYOmGlCoKCgpgYmKC/Px8GBsbcx0Ob8XFxeHy5cvQ0dHBW2+9Rbc1aBiWZbFmzRo4OTnBwMAA06ZN4zokQgghhBBCtJoy25p0Zb4WgYGBXIfAa+7u7hg3bhySk5OpIa9GMpkM4eHhb+ziIxAIMH/+fKSmpsLX15cGwyNao641Toi2ohonfEb1TfhOmW1MaszXQiqVch0Cb+Xl5SEvLw9hYWGYPHky1+GQGnTr1g2lpaUYP348jh07xnU4hBBCCCGEaDVltjGpMV8LkUjEdQi89e+//2LixIkICAiAi4sL1+E0KSKRCN27d69zfS9cuBB3796FWCymwfCIVqhvjROibajGCZ9RfRO+U2ZtU2O+FtS9RzUYhkFISAiKi4sxbNgwCAQCrkNqUmQyGUJCQupc31ZWVrC2tkbfvn1pMDyiFepb44RoG6pxwmdU34TvaDR7otVu376N4cOH49q1axg/fjzX4TRJBgYG9Xr9+++/Dw8PDwwbNgweHh4qiooQ5alvjROibajGCZ9RfRNSN9SYrwV171E+lmXh7e2N1q1bo2vXrtDT0+M6pCZHJBKhS5cu9arvZs2aYfDgwRCLxfDz86PB8IhGa0iNE6JNqMYJn1F9E76jbvZqwjAM1yHwztOnT9GhQwd4enpi+vTpXIfTJDEMg0ePHtW7vqdMmYIbN27ggw8+oMHwiEZraI0Toi2oxgmfUX0TvlNmbVNjvhZ0L7fyXbx4Ef369UOzZs1gZmbGdThNkkAgQIsWLepd32KxGLNnz0ZISAgNhkc0WkNrnBBtQTVO+Izqm/CdMmubGvO1oO49yvXy5UsAr+6Znz17NsfRNF0ikQgdOnRoUH0PGjQIz58/x4wZM2gwPKKxGlPjhGgDqnHCZ1TfhO+om72aUPce5bp06RJGjx6NvLw82Nvbcx1Ok8UwDPz8/BpU3wKBAAsWLMD58+cxYsQIGgyPaKTG1Dgh2oBqnPAZ1TfhO+pmrybt27fnOgTeKC0tRWxsLJ4/f46pU6dyHU6TJhQKYWNjA6GwYX/+HTt2hFwuR5cuXXD//n0aDI9onMbWOCGajmqc8BnVN+E7ZbYxBSzLskpbG08UFBTAxMQE+fn5MDY25jocXrh8+TJ0dXVx8+ZN7Nmzh+6D0nJZWVn44YcfMH78eGRlZdFghoQQQgghhNSBMtuadMqrFtS9RzlYlsXdu3dRXl6OMWPGUEOeYwzD4O7du42qbwsLCzg4OIBhGDx+/FiJ0RHSeMqocUI0GdU44TOqb8J31M1eTah7j3IEBATAyckJd+/exVtvvcV1OE2eUCiEg4NDo+v7nXfewT///IOioiIaCI9oFGXVOCGaimqc8BnVN+E7ZdY2/ZXUoqysjOsQeOGff/6BlZUV+vTpA7FYzHU4TZ6y7kUzMDDAkCFDAADR0dHKCI0QpaD7LQnfUY0TPqP6JnynzDamRv2V3L17F1OmTIG1tTUEAgEuXbpU6Xl3d3eMGzcOFhYWEAgECA0NrbKO9PR0zJs3D61bt4aRkRH69OmD8+fPNyie6tZP6ichIQEtWrSAj48PDXynIRiGgY+Pj1K6+AwdOhQSiQTBwcFKiIwQ5VBmjROiiajGCZ9RfRO+U2YbU6Ma88XFxejVqxd+/vnnGp8fPHgwdu3aVeM65s2bh+fPn8PDwwNPnjzBzJkz8fbbbyMkJKTe8dAZwcb7559/0Lt3b7Rq1QrNmzfnOhyCV3XdvXt3pdS3hYUFxGIxnjx5ooTICFEOZdY4IZqIapzwGdU34Ttl1rZG9XmeMGECJkyYUOPz8+bNAwDEx8fX+JoHDx7g0KFD6N+/PwBgw4YN2L9/P4KDg+Hk5FSveOgg0jgMwyAhIQGFhYWYP38+1+GQ/08oFMLS0lJp6+vSpQtCQkIglUqho6OjtPUS0lDKrnFCNA3VOOEzqm/Cd3TPfC2GDBmCs2fPIicnB3K5HGfOnIFEIsGIESNqfI9EIkFBQUGlr4rHAUAmk0Emk1VZZhim0nLFIGA1LUul0krLFbMCViyzLFtlGUClZblcXmm5ogtSTcsymazScnV5qCqngIAAdOnSBTKZDFZWVrzIiQ/7qaSkBFevXoVUKlVKTv3794dIJMLTp09pP1FOGpFTWVmZosb5khMf9xPl1PCcpFIprl69ivLyct7kxMf9RDk1LKfS0lJcu3YNZWVlvMmJj/uJcmp4ThVtTGXgXWP+7NmzYBgG5ubm0NPTw5IlS3Dx4kU4ODjU+B43NzeYmJgovtq2bQsASExMBABERkYiMjISABAWFoaoqCgAQEhICOLi4gC8GrE9KSkJAODn54e0tDQAr8YByMrKAgD4+PggLy8PAODt7Y3CwkIAgKenp+KA5enpCYZhUFZWBk9PTwBAYWEhvL29AQB5eXnw8fEB8Gqu77t37wIA0tLS4OfnBwBISkpCQEAAACAuLk5xi0FUVBTCwsLUltPdu3dRVFSEHj168CYnPuynW7duwdHRESKRSCk5mZubo2/fvggODqb9RDlpRE5JSUlo3rw5RCIRb3Li436inBqek0gkAsMwKC0t5U1OfNxPlFPDcgoJCYGzs7NimQ858XE/UU4Nz6lifcogYCtOb2gYgUCAixcvYvr06VWei4+PR7t27RASEoLevXtXem7ZsmUICAjAzp07YWFhgUuXLmH//v24d+8eevToUe22JBJJpTMkBQUFaNu2La5evYpx48YpzrSIRKJKywzDQCAQKJaFQiGEQmGNy1KpFCKRSLEsFoshEAgUy8CrszavL+vo6IBlWcWyXC6HTCZTLMvlcojF4hqXZTIZWJZVLFeXhypyys3Nxffff4/y8nLs2LFD0f1am3Pi435SVk7btm1Dfn4+vv/+e97kxMf9RDlRTpQT5UQ5UU6UE+VEOXGb09WrVzFhwgTk5+fD2NgYjcGrxnxMTAw6dOiA8PBwODo6Kh4fPXo0OnTogMOHD9dp2wUFBTAxMYGXlxfGjx/f2FSapH/++QeRkZFwdHTEpEmTuA6HvEYqlcLb2xtjx45V2j3unp6eOHPmDI4cOQIDAwOlrJOQhlJFjROiSajGCZ9RfRO+U2Zjnlfd7EtKSgBUHVRAJBIp7lGoD5FIpJS4mqL79+/j5cuXGD16NNehkP8Qi8UYOnSo4gymMvTv3x9yuVyp3YYIaShV1DghmoRqnPAZ1TfhO2W2MTXqr6SoqAjR0dGKn+Pi4hAaGgozMzPY
2toiJycHiYmJSE1NBQA8f/4cANC6dWu0bt0aXbp0QYcOHbBkyRJ8//33MDc3x6VLl3D9+nX8+++/9Y5HIBAoJ7EmJjk5GSzLwsHBAXp6elyHQ/5DIBA0+izgf1lYWMDQ0BCBgYEYMGCAUtdNSH2posYJ0SRU44TPqL4J3ymzjalRV+YDAwPh5OSkmELu66+/hpOTE1xdXQEAHh4ecHJyUnTbfuedd+Dk5KToPq+jowNPT0+0bNkSU6ZMQc+ePfG///0Pv/32GyZOnFjveBiGUVJmTYu3tzekUimmTZvGdSikGlKpFJcvX1aM6qksffv2RWhoqFLXSUhDqKrGCdEUVOOEz6i+Cd8ps42psffMc6ninvmMjAy0bNmS63C0CsuyWLZsGVq0aIFt27ZxHQ6pBsuyKCsrg76+vlLPDIaFhWHr1q349ddfYWJiorT1ElJfqqpxQjQF1TjhM6pvwneZmZmwtLSke+ZVTVdXl+sQtE5YWBhkMhkmT57MdSikFqq4D61bt24QCASKqTwI4RLda0n4jmqc8BnVN+EzZbYxqTFfC+pmX39eXl6Qy+Xo378/16GQGrw+H6cyicVi2Nvb48GDB0pdLyH1paoaJ0RTUI0TPqP6JnynzNqmxnwt6Kxg/UgkEoSHh+Ott96iblEaTCwWY+LEiSqp75EjR9J984RzqqxxQjQB1TjhM6pvwnfKrG1qzNciLS2N6xC0yr1798AwDCZMmMB1KOQNVHW2u3///igsLERGRoZK1k9IXdEVHcJ3VOOEz6i+CZ8ps41JjflaxMfHcx2CVvHw8EDPnj3RvHlzrkMhtWAYBt7e3ir5R2lhYYHmzZsjKChI6esmpK5UWeOEaAKqccJnVN+E75TZxqTGfC2oe0/dZWVlITY2FnPmzOE6FPIGOjo6mDZtGnR0dFSy/kGDBuHGjRsqWTchdaHqGieEa1TjhM+ovgnfUTd7NaFZ++rO09MT5ubm6NixI9ehkDdgWRYFBQUqq+9Ro0bh2bNn9PdDOKPqGieEa1TjhM+ovgnfKbO2qTFfC5lMxnUIWoFlWbi7u+P999/nOhRSBwzDKMY3UIVu3bqhvLwciYmJKlk/IW+i6honhGtU44TPqL4J3ymzjUmN+VpQN/u6iYqKQlFREUaNGsV1KKQOdHR0MGnSJJV1XxOLxbCzs4Ofn59K1k/Im6i6xgnhGtU44TOqb8J31M1eTeRyOdchaIUTJ05gzJgxdPJDS8jlcuTk5Ki0vidOnIhr166pbP2E1EYdNU4Il6jGCZ9RfRO+U2ZtU2O+FnQQeTOGYeDr64uFCxdyHQqpI5lMhkePHqn0NpLhw4cjPj6e/oYIJ9RR44RwiWqc8BnVN+E7asyrCV1pfjNvb2+0adMGLVu25DoUUkc6OjoYN26cSruvmZubQ19fHy9evFDZNgipiTpqnBAuUY0TPqP6JnxH3ezVhK4qvtmxY8ewdOlSrsMg9SCXy5GRkaHy+nZ2doanp6dKt0FIddRV44RwhWqc8BnVN+E7jbkyL5VKkZSUhOfPnyMnJ0dZMWkMXV1drkPQaLm5uUhPT8fQoUO5DoXUg1wuR3h4uMr/SU6fPh23b99W6TYIqY66apwQrlCNEz6j+iZ8p8w2poCt50R3RUVF+PPPP/HXX38hICAAEolE8VybNm0wduxYLF68GM7OzkoLUt0KCgpgYmKC/Px8GBsbcx2Oxtq2bRsEAgE2bNjAdShEAzEMg2HDhuHOnTvUVY4QQgghhBAot61Zryvz+/fvh729PY4dO4ZRo0bB3d0doaGheP78OR48eIBNmzaBYRiMGTMG48ePR1RUVKOC4xqdEaydt7c3Pv/8c67DIPUkl8uRkpKi8voWi8WwtrbGo0ePVLodQv5LXTVOCFeoxgmfUX0TvlNmbdfr7ns/Pz/cunULPXr0qPb5/v37Y+HChTh8+DB+/fVX3LlzBx07dlRKoFygg0jNysrKIJPJ0KJFC65DIfUkl8sRExODVq1aQShU7bAZY8aMwaVLl+Di4qLS7RDyOnXWOCFcoBonfEb1TfhOmW3Menezbwqom/2bXbx4EV5eXjh69CjXoRANlpGRgblz59K984QQQgghhIDDbvZNTWRkJNchaCxPT0+MGzeO6zBIA8jlciQkJKil54mlpSXKy8tRXFys8m0RUkGdNU4IF6jGCZ9RfRO+U2Ybs8GN+ezsbMVyUlISXF1d8e233+LevXtKCUwT5Obmch2CRmJZFi9evMDIkSO5DoU0gLrvRevZsydNUUfUiu63JHxHNU74jOqb8J0y25j1bsw/efIE9vb2sLS0RJcuXRAaGgpnZ2fs378fR48exciRI3Hp0iWlBcglsbheQwo0GYmJidDV1YWZmRnXoZAGEIvFcHFxUVt9T506FV5eXmrZFiGA+mucEHWjGid8RvVN+E6ZtV3vxvyqVavQo0cP3LlzByNGjMDkyZMxceJE5OfnIzc3F0uWLMGuXbuUFiCXZDIZ1yFopJs3b6JTp05ch0EaSCaTITo6Wm31PXz4cK2f2YJoF3XXOCHqRjVO+Izqm/CdMmu73o35R48eYceOHRgyZAi+//57pKamYunSpRAKhRAKhVi2bBmePXumtAC5RGMDVu/27dsYNWoU12GQBmJZFrm5uWqrbyMjI0ilUrVsixBA/TVOiLpRjRM+o/omfKfM2q53Yz4nJwetW7cGADRr1gxGRkaVulu3aNEChYWFSguQS9S9p6qSkhLk5OTAycmJ61BIA4nFYjg7O6u1vk1MTJCYmKi27ZGmjYsaJ0SdqMYJn1F9E77jtJs9AAgEglp/5gvq3lNVUFAQ9PX1YW9vz3UopIFkMhmePXum1vp2dHTEnTt31LY90rRxUeOEqBPVOOEzqm/Cd8qs7QadFvjoo4+gp6cHACgrK8Onn34KIyMjAIBEIlFacETz3L9/H23atIFQSLMaarPS0lK1bq9///548OAB5s2bp9btkqZL3TVOiLpRjRM+o/ompG7q3ZifP39+pZ8/+OCDKq/58MMPGx6RBhGJRFyHoFEqpqQbPnw416GQRhCJRGq/TWLYsGH49ddf1bpN0nRxUeOEqBPVOOEzqm/Cd8psY9a7MX/y5EmlbVzTUfeeyqKjoyEWi9G9e3euQyGNIJPJEBkZia5du6rthJW1tTVvxtIgmo+LGidEnajGCZ9RfRO+43Q0+6akVatWXIegUfz9/SESidC1a1euQyFaSFdXF/n5+VyHQQghhBBCCGeU2cas15X5r7/+us6v3bdvX72DuXv3Lvbs2YOgoCCkpaXh4sWLmD59uuJ5d3d3HDlyBEFBQcjOzkZISAh69+5dZT0PHjzA+vXr4e/vDx0dHfTu3RteXl4wMDCoVzzt27evdw589uTJEzRv3hyGhoZch0IaQSQScdK7olOnTrh//z4mTpyo9m2TpoWrGidEXajGCZ9RfRO+U2Ybs16N+ZCQkEo/BwUFQSaToXPnzgCAFy9eQCQSoW/fvg0Kpri4GL169cKCBQswa9asap8fPHgw5syZg08++aTadTx48ADjx4/H2rVr8dN
PP0FXVxePHz9u0IBt1M3+/xQUFIBlWVhbW3MdCmkkmUyGsLAw9OzZU63d1/r06YOHDx9SY56oHFc1Toi6UI0TPqP6JnzH2Wj2t27dUizv27cPzZs3x2+//YYWLVoAAHJzc7FgwQIMHTq0QcFMmDABEyZMqPH5ipGw4+Pja3zNihUr8OWXX2LNmjWKxzp27NigeMj/efToEUxNTelMKU/Ut5eKMgwZMgSurq5q3y5pmriocULUiWqc8BnVNyF10+B75vfu3Qs3NzdFQx4AWrRoge3bt2Pv3r1KCa6+MjIy4O/vD0tLS7i4uKBVq1YYPnw4fH19a32fRCJBQUFBpa/XyWQyxRmU15cZhqm0LJfLa12WSqWVllmWrbTMsmyVZQCVluVyeaVlhmFqXZbJZJWWq8ujLjk9evQILMuie/fuvMmJj/upLjnJ5XJ07twZIpFIrTl16dIFGRkZtJ8oJ5XnBAAdOnSASCTiTU583E+UU8NzEolEcHBwUPQ65ENOfNxPlFPDcmJZFl26dFHEwIec+LifKKeG51TxXRka3JgvKCjAy5cvqzyekZHB2ajVsbGxAIDNmzfjk08+wdWrV9GnTx+89dZbiIqKqvF9bm5uMDExUXy1bdsWAHDmzBkAQGRkJCIjIwEAYWFhinWFhIQgLi4OABAQEICkpCQAgJ+fH9LS0gC8GgcgKysLAODj44O8vDwAgLe3t+L35OnpibKyMjAMA09PTzAMg7KyMnh6egIACgsL4e3tDQDIy8uDj48PACArKwt3794FAKSlpcHPzw8AkJSUhICAAABAXFyc4vaIqKgohIWF1TsnuVwOGxsb6OjooFWrVrzIiY/7qT45PXjwQJGHunKKiIiAQCDAkydPaD9RTirNKSYmBjdv3gTDMLzJiY/7iXJqeE4Mw8DLy0uRHx9y4uN+opwalpO/vz8ePXqEmJgY3uTEx/1EOTU8p6NHj0Jp2AaaN28ea2try547d45NSkpik5KS2HPnzrH29vbshx9+2NDVKgBgL168WO1zcXFxLAA2JCSk0uP3799nAbBr166t9HiPHj3YNWvW1LitsrIyNj8/X/GVlJTEAmA9PT1ZlmVZhmFYhmGqLEul0krLMpms1uXy8vJKy3K5vNKyXC6vssyybKVlmUxWaVkqlda6zDBMpeXq8nhTThEREezhw4fZzZs38yYnPu6nuuZUWlrKvnjxgmUYRu05zZkzh/Xz86P9RDmpNCeJRMI+f/5cETcfcuLjfqKcGp4TwzDss2fPFNviQ0583E+UU8NyKisrY6OioliJRMKbnPi4nyinhud05coVFgCbn5/PNpaAZf9/X4V6KikpwTfffIMTJ04oui2IxWIsWrQIe/bsgZGRUaNOMggEgiqj2VeIj49Hu3btqoxmHxcXh/bt2+P333/HBx98oHj87bffhlgsxp9//lmnbRcUFMDExATXr1/H6NGjG5UHH5w8eRLNmjUDAMyZM4fjaIg227FjB/T19bFy5UquQyGEEEIIIUTtbty4gTFjxiA/Px/GxsaNWleDu9kbGhri4MGDiinigoODkZOTg4MHDza6Id9Q9vb2sLa2xvPnzys9/uLFC9jZ2dV7fcxr9182Zc+ePUNhYSENfscTDMPAz8+Pk/oeNGgQQkND1b5d0rRwWeOEqAPVOOEzqm/Cd8qs7Xo15hMTE6s8ZmRkhJ49e6JXr15VGvEpKSn1CqaoqAihoaGKD/txcXEIDQ1VbDcnJwehoaF4+vQpAOD58+cIDQ1Feno6gFdX87/99lv8+OOPOH/+PKKjo7Fx40Y8e/YMixYtqlcsABo0nR3fZGdnw9TUFDExMTQrAE8IhULY2NhwUt/9+vVDQkKC2rdLmhYua5wQdaAaJ3xG9U34Tpm1Xa81OTs745NPPlEMDlCd/Px8HDt2DN27d4e7u3u9ggkMDISTkxOcnJwAAF9//TWcnJwU01l5eHjAyckJkyZNAgC88847cHJywuHDhxXrWL58OdauXYsVK1agV69euHnzJq5fvw4HB4d6xQJQYx54NYiDs7Mz5HI5xOJ6zWRINJRQKISdnR0n9W1sbFxpxE9CVIHLGidEHajGCZ9RfRO+U2Zt16t1FhkZiZ07d2L8+PHQ0dFBv379YG1tDX19feTm5uLp06eIiIhAv379sGfPnlrnjK/OiBEjUNst/B999BE++uijN65nzZo1leaZbyjq3vNqSrpp06ahXbt2XIdClKSi+5qLiwsnJ2hatGiBmJgYdOrUSe3bJk0D1zVOiKpRjRM+o/omfMdZN3szMzN8//33SE1NxaFDh9CpUydkZWUphuV///33ERQUhPv379e7Ia+JmvoZQYZhUFJSgvj4eLpfnkeEQmGl+YnVrWvXrvD19eVk26Rp4LrGCVE1qnHCZ1TfhO84uzJfQV9fHzNnzsTMmTOVFogmauoHkYiICPTo0QNPnz7F0qVLuQ6HKEnFvWhcGTBgAG7duoWFCxdyFgPhN65rnBBVoxonfEb1TfiOs3vmm5qm3s3+4cOHGDBgAPLz82FiYsJ1OERJGIaBj48PZ/U9cOBAREdHc7Jt0jRwXeOEqBrVOOEzqm/Cd5x1s29qunTpwnUInIqJiYGpqSnMzMy4DoUokVAoRPfu3TnreWJjY4PCwsJax8cgpDG4rnFCVI1qnPAZ1TfhO2W2MemvpBZNuRH78uVLWFpa4unTp3B0dOQ6HKJEQqEQlpaWnP2TFAqFaNasGVJTUznZPuE/rmucEFWjGid8RvVN+E6ZbUz6K6mFVCrlOgTO+Pv7Y8CAAQgPD6fB73hGKpXi2rVrnNa3vb19rVNcEtIYmlDjhKgS1TjhM6pvwnfKrG1qzNdCJBJxHQJngoKC0LdvXyQmJsLW1pbrcIgSiUQiODs7c1rfffv2pcY8URlNqHFCVIlqnPAZ1TfhO2XWdoMb86NGjcKWLVuqPJ6bm4tRo0Y1KihNUVBQwHUInJBIJGAYBiKRCLq6uhAIBFyHRJRIKBTCzMyM0+5rAwcOxNOnTznbPuE3TahxQlSJapzwGdU34TtltjEb/Fdy+/Zt/Pzzz5g+fTqKi4sVj5eXl+POnTtKCY5rTbWx8ejRI/Tv3x8vXrxA586duQ6HKJlUKsWVK1c47b7WqVMn5OTk0CB4RCU0ocYJUSWqccJnVN+E75TZxmzUKa8bN24gPT0dAwcORHx8vJJC0hxNtXvP/fv34eLigoiICBr8jofEYjGGDh0KsVjMWQyGhobQ0dFBdnY2ZzEQ/tKEGidElajGCZ9RfRO+04hu9gBgZWWFO3fuoGfPnnB2dsbt27eVFJZmaIrdy2UyGXJyctCyZUtERkaia9euXIdElEwgEMDY2Jjz+ra0tER4eDinMRB+0pQaJ0RVqMYJn1F9E75TZm03uDFfEYSenh7+/PNPfPXVVxg/fjwOHjyotOC4xjAM1yGoXVhYGHr27AmWZSGRSKCnp8d1SETJpFIpLl++zHn3tZ49e+Lhw4ecxkD4SVNqnBBVoRonfEb1TfhOmW3MBjfm/3uv64YNG/Dnn39i7969jQ5KUzTFbv
a+vr4YMmQIUlJSYGNjw3U4RAXEYjHGjh3Lefe1AQMG4PHjx5zGQPhJU2qcEFWhGid8RvVN+E6ZbcwG/5XExcXBwsKi0mOzZs1C586dERQU1OjAiPqxLKuYiu7atWvo0aMH1yERFdGEf5DdunVDZmYm12EQntKEGidElajGCZ9RfRNSN/W+Ml9QUICCggK0aNECRUVFip8rvmxtbTFjxgxVxKp2MpmM6xDUKjo6Gh06dIBAIKDB73iMYRh4enpyfhtJ69atIZVKkZ+fz2kchH80pcYJURWqccJnVN+E75TZxqz3aS9TU9Nab9pnWRYCgYAXDeGm1s2+oos9AGRmZqJly5YcR0RUQSwWY+LEiZyf9RYIBGjRogVevHgBZ2dnTmMh/KIpNU6IqlCNEz6j+iZ8x2k3+1u3bimWWZbFxIkTcfz4cbq/mgeePn2Kjz76CAzDQEdHh+twiAoxDKMR/yQ7duyIoKAgaswTpdOUGidEVajGCZ9RfRNSN/X+Kxk+fHiln0UiEQYOHIj27dsrLShNwYfeBXWVmpoKKysrCAQCJCUloW3btlyHRFSEYRh4e3tj4sSJnJ+06d+/P65du8ZpDIR/NKnGCVEFqnHCZ1TfhO+U2cZs1DzzfOfi4sJ1CGrzehf7+Ph42NvbcxsQURkdHR1MmzZNI/5BOjo6Ij09neswCM9oUo0TogpU44TPqL4J3ymzjUmN+Vr8d/o9PgsODoaTkxOAVzMVtGvXjuOIiKqwLIuCggKNqO/27dujqKgIJSUlXIdCeESTapwQVaAaJ3xG9U34Tpm1rZTGfG0D4mmzpjKKZl5eHpo1a6Y4A1oxPR3hJ4ZhcO/ePY2ob319fTRv3hwxMTFch0J4RJNqnBBVoBonfEb1TfhOmbVd73vmZ86cWennsrIyfPrppzAyMqr0uLu7e+Mi0wBNpXuPn58fBg8erPhZIpFAT0+Pw4iIKuno6GDSpElch6FgZWWF8PBw9OjRg+tQCE9oWo0TomxU44TPqL4J3ymzjVnvK/MmJiaVvj744ANYW1tXeZwP4uLiuA5BLR4+fIgBAwYAeNWQ19XV5TgiokpyuRw5OTmQy+VchwIA6NOnDwICArgOg/CIptU4IcpGNU74jOqb8J0y25j1vjJ/8uRJpW1c06WlpaFXr15ch6FSFfcqGxoaAqAu9k2BTCbDo0ePMGrUKAiF3A+b0bNnT1y/fp3rMAiPaFqNE6JsVOOEz6i+Cd+lpaUpbV31/gtZt25dk7mK1hTmt3z06BH69++v+Dk+Pp4Gv+M5HR0djBs3TmNuI3FwcEBhYSHKy8u5DoXwhKbVOCHKRjVO+Izqm/CdMtuY9W7Mp6WlYfLkybCyssLixYtx5coVSCQSpQWkSZpC9x4/P79K0yPQtHT8J5fLkZGRoTH1bWlpCbFY3GRuayGqp2k1ToiyUY0TPqP6JnynzNqud2P+5MmTePnyJf7++2+Ymppi5cqVsLCwwMyZM3Hq1ClkZWUpLTiu8f0gwjAM8vPzYWZmpngsKSkJbdu25TAqompyuRzh4eEaU98CgQBmZmZ48eIF16EQntC0GidE2ajGCZ9RfRO+47QxD7z68D106FDs3r0bz549Q0BAAAYOHIhjx47BxsYGw4YNw/fff4+UlBSlBcoFvnezDw0NRe/evSs9JpVKqVsTz4nFYowaNUqj6rtbt24IDAzkOgzCE5pY44QoE9U44TOqb8J3nHazr07Xrl2xatUq3L9/H8nJyZg/fz7u3buHv/76Sxmr5wzfzwj6+vpiyJAhip9LS0uhr6/PYUREHeRyOVJSUjSqvnv16oXIyEiaU5YohSbWOCHKRDVO+Izqm/Adp1fmg4ODsXbtWuTk5AAANmzYUOn5li1bYtGiRbh8+TK++eabeq377t27mDJlCqytrSEQCHDp0qVKz7u7u2PcuHGwsLCAQCBAaGhojetiWRYTJkyodj11xeeDCMuySElJQZs2bRSPJSQk0P3yTYBcLkdMTIxG1XeHDh3Qtm1b/PPPP1yHQnhAE2ucEGWiGid8RvVN+I7Txvwnn3yCZs2aYebMmcjLy4OPj4/SgikuLkavXr3w888/1/j84MGDsWvXrjeu64cffoBAIGhUPHzu3vP8+XN07ty50mM0+F3TIBaLMWzYMI2qb3t7exgYGOD+/fsoLS3lOhyi5TSxxglRJqpxwmdU34TvOO1mr6uri/Xr1+P777/Hxx9/DJZllRbMhAkTsH37dsycObPa5+fNmwdXV1eMHj261vU8fvwY+/btw4kTJ+q0XYlEgoKCgkpfABRdfmUyGWQyWZVlhmEqLVecZalpWSqVVlqu+N1VLLMsW2UZQKVluVxeabkixpqWZTJZtXn4+vpi8ODBlR6Pi4uDra2t1ubEx/2kipwkEgni4+MVMWpCTnp6etDX18fw4cNx7tw52k+UU6NykkqliIuLg1wu501OfNxPlFP9ckpNTcXFixfh6uqKVatW4cCBA/Dw8EB+fr7W5sTH/UQ5NT6n8vJyJCQkQCqV8iYnPu4nyqlxOSlLvRvzzZs3BwD069cPEyZMQFBQkNKCUYaSkhK8++67+Pnnn9G6des6vcfNzQ0mJiaKr4rR3DMzMwEAkZGRiIyMBACEhYUhKioKABASEqKYTisgIABJSUkAXk33lpaWBuDVrQMVI/z7+PggLy8PAODt7Y3CwkIAgKenJ8rKysAwDDw9PcEwDMrKyuDp6QkAKCwshLe3NwBU6g2RlZWFu3fvAng1ZaCfnx+AVyPSBwQEAADi4uIQEhICAIiKikJYWBgAoLy8XFGkFTmlpKQgIyNDa3Pi435SRU43btxAYmIi5HK5RuXk4uKCzMxMPH36FH5+fk1+P1FOjcvp2bNnkMvlvMqJj/uJcqo5p6SkJPj7++PixYtwc3PD33//DbFYjM8++ww7d+5Eu3btYGBggB9++AGenp7w8vJCXl6eRufEx/1EOSk/p0ePHiElJYVXOfFxP1FODc+p4n3KIGDreWk9ICAATk5OihHPL126hOnTpystIEVgAgEuXrxY7brj4+PRrl07hISEVBmNfcmSJZDJZDh+/Pgb11NBIpFAIpEofi4oKEDbtm2Rk5ODFi1aKM60iESiSssMw0AgECiWhUIhhEJhjctSqRQikUixLBaLIRAIFMvAq7M2ry/r6OiAZVnFcsWVpopluVwOsVhc47JMJgPLsoplAIqz+59//nmlnDZs2IBt27ZpZU7/3Td82E9NMadt27bhrbfeQkREBJYsWcKLnPi4nygnyolyUk1OAoEA//zzD4KDgwG8Ghy0b9++aNu2LUQiUY05FRQU4OHDh/D19YVQKMSgQYMwZMgQ6Orqcp4TH/cT5UQ5UU6UU2Nyys7OhoWFBfLz82FsbIzGqHdjvrrpzFShIY15Dw8PrFy5EiEhIWjWrNkb11OTgoICmJiYKBrzfPPXX3+hU6dO6Nu3r+Kx4uJi7N+/v8qAhoR/ZLJXt1S0a9cOIpGI63AqiYiIwO3bt5GZmYnPPvsMrVq14jokooU0ucYJqQnLsjhw4ADs7OwwceJE6
Onp1fja2mq8qKgIvr6+uH//PkQiEVxcXDBo0CBFz0pCNB0dwwnf5ebmwszMTCmN+Xp3s+/Tpw/69u2LQ4cOIT8/v1EbVzYfHx/ExMTA1NQUYrFYcYZm1qxZGDFiRL3XV8/zHFrj8ePH6NWrV6XHaPC7poNlWeTm5mpkfTs6OiI1NRXTp0/Hb7/9xnU4REtpco0TUpPjx4/D1tYWM2bMqLUhD9Re482aNcP48eOxbds2rFy5EgCwd+9ebNy4EVeuXFF0VyVEU9ExnPCdMmu73kPp3b9/HydOnMCaNWuwcuVKzJw5E4sWLcLIkSOVFlRDrVmzBh9//HGlx3r06IH9+/djypQp9V4fH0fRjIqKgoWFRZXc4uLiqDHfRIjFYjg7O3MdRo3mzp2LgIAAsCxLJ5lIg2h6jRPyX6dPn4aenl6NAwD/V11rvHnz5hg7dizGjh2LkpISPHz4ED/99BPKysrg7OyMoUOHwtzcvLHhE6JUdAwnfMfpaPaDBg3CsWPHkJ6ejkOHDiE5ORmjR4+Gg4MDduzYgeTk5AYHU1RUhNDQUMX88XFxcQgNDUViYiIAICcnB6GhoXj69CmAV9OrhYaGIj09HQDQunVrdO/evdIXANja2qJdu3b1jqdi4AO+KCkpwaFDh7B48eIqz1XcukD4TyaT4dmzZ4p7fTRNz549ERMTg1mzZtHVedIgml7jhLyuYkT6efPm1fk9DalxQ0NDjBo1Chs3boSrqyvMzc1x9OhRrF69Gj/88AO8vb2RlJREV0MJ5+gYTvhOmW3MBp8WMDAwwPz58zF//nzExMTg5MmTOHLkCDZv3owxY8YoRhKsj8DAwEpX+L/++msAwPz583Hq1Cl4eHhgwYIFiuffeecdAMCmTZuwefPmhqZSo7KyMqWvkyssy2L//v1YuHBhtfdmpKWlwcrKioPICBc0eS53gUCAWbNm4f79+zA3N0dERAQcHR25DotoGU2u8cZ4+PAhvLy88MEHH6Bjx45ch0Ma6ebNm4iKisLXX38NgUBQr/c2psb19PQwdOhQDB06FCzLIj09HZGRkbh8+TKSk5MhFovRoUMHdO3aFd26daP77Yna8fUYTgig3DZmvQfAq0lRURH+/PNPrFu3Dnl5eVp9Nq1iALzr16+/cU57beHh4YGSkhLFCZD/Wrt2Ldzc3NQcFSHVY1kWq1atwooVK/DTTz9h586d9f6gSwifpKSk4OjRo4rB0U6dOgVjY2MsWLAABgYGXIdHGsDPzw8+Pj5Yt24dhMJ6d5RUKalUipiYGDx9+hSRkZEQiURYs2YN12ERQggv3LhxA2PGjFHKAHiN7rB/584dnDhxAhcuXIBIJMLcuXOxaNGixq5WI2jzCYnXRUdHIzAwEFu2bKn2+YKCgkYXEtEeMpkMkZGR6Nq1q8aOEisQCDB9+nTcuXMHXbp0QUBAAAYMGMB1WERLaEON11VZWRn+/PNPJCUlYfHixbCxsQHwaoyYwMBArF+/HrNnz4aLiwvHkZL6CAkJwdWrV+Hq6tqghryqa1xHRwddunRBly5dAAD79u1TjC5OtF9ZWRlevHiB58+f4/nz5ygrK8Po0aMxZMgQjRgvik/HcEKqo8w2ZoP+YpOSknDq1CmcOnUKcXFxcHFxwU8//YS5c+fCyMhIacGRxispKcHBgwexadOmGq9s0iBjRBO5uLjA3d0dGzZswI4dO9C/f3+6Ok+aDJZlcefOHXh4eNR4krxfv37o0aMHfv/9d9y8eROfffYZLCwsOIiW1MezZ89w9uxZbNmyRSMaTnUxZ84c/PXXX1i1ahXXoZB6YhgGMTExePbsGZ4/f46cnBzo6emhU6dO6NKlC8aPHw+hUIhr165h1apVcHJywpQpU2Bqasp16ISQOqh3N/sxY8bg1q1baNmyJT788EMsXLgQnTt3VlV8nOBLN3uWZeHm5oYpU6agR48eNb7u8uXLaNWqFQYOHKjG6Ah5s1u3biEnJwcCgQDGxsZa/fdISF0lJCTg6NGj6NatG+bMmQNdXd03vic+Ph6HDx/GgAEDMG3aNI3rtk1eiYuLw8GDB7F582atu/ixefNmLFu2jEa/1xIlJSU4evQoMjIy0KFDB3Tp0gWdO3eudf+xLIvAwED8+++/0NfXx4wZMxS9Mwghb1ZWVobr16+/cRY1TrvZGxgY4MKFC5g8eXK1XV8yMzNx48YNvPvuu40KTBNoezf7K1euwN7evtaGPPDqwwV1YW46ZDIZwsLC0LNnT43vvjZ8+HB888032L59OzZs2IARI0ZozZUswh1tqvHXlZWV4cSJE8jNzcVXX30FS0vLOr/X3t4ebm5u+Pfff7F69WosWbIE9vb2kMlkYBhG8SWTyRSPyWQytGnThv6mVKikpAQpKSmKr6CgILi6uja6Ic9Fjc+YMQMXL16sMgUw0TxBQUH4888/MW/ePDg5OdX5fQKBAM7OznB2dkZqaiouX76MEydOYPTo0Rg5ciR0dHRUGPX/0dZjOGna4uLi8NNPPyE/Px9jx46Fnp5eja/ltJu9h4cHAGDr1q3VPh8TEwN3d3deNOa1WUxMDAICAmq8T/51GRkZaNWqlRqiIppCWwbMEgqFGDduHG7evInx48fjn3/+wYwZM2p8PcMwKC8vh4GBAXXJb+K0pcYrVHwIeO+999CvX78GrUMgEGDKlCkYPHgwTp06hby8PIhEIojFYsXX6z8LBAI8f/4cH3zwAfr06aPkjPhPJpOhsLAQ+fn5yM/PR2ZmpqLRnp+fD4FAACMjI9jY2MDGxga9e/fGzJkzlVab6q7xnj174o8//kBZWRn09fXVum1SN8XFxTh8+DB0dXWxc+fORu0na2trfPbZZygrK8ONGzfwzTffYMuWLWrrfq9tx3DSdLEsiytXruDhw4fYuHEjLly4gKSkJHTo0EEt22/w6fiLFy9W+lkmkyEpKQkFBQXYtm1bowPTBNp6NrCkpAS//PILXF1d39igYVkWLMtSw6cJEYlEWtVt7q233sKqVauwa9cubNy4EYGBgTW+ViQSQSgUQi6X44svvqjXlU3CH9pU4yzLwsvLC35+fti4cSNatGjR6HWamZkppnZ9k5KSEpw6dQpXr17FkiVLqAt1Na5evYqQkBAUFRVVmoNdKBTC2NgYJiYmMDY2hqWlJfr37w8bGxuVT+XGRY0LBAJMmDABXl5etZ5UJdwICAjAmTNn8NFHH6Fnz55KW6++vj4mT56MLl264MCBA3X6bNlY2nQMJ01bcXExfvjhB7Rr1w7btm2DQCCAra0tEhMTa23MK7ON2eDGfHWT3TMMg+XLl+Pp06eNCkpTMAzDdQj1xrIsDhw4gI8++qhOZ0/z8vKU8uGRaA+GYRASEgInJyet6F4rFosxatQo+Pj44LvvvqvTexITE7Fv3z4MHjwYkydPppNVTYy21HhpaSkOHDiANm3aKD4EqJuhoSGWLl2K2NhY7NmzB3379sXMmTO19mS2sgUGBuLJkyf44osv0KxZM405lnBV48OHD8e3335L4zJokKKiIhw6dAhGRkZwc3OrtWtvY1Tc
d3/lyhVMnjxZJduooC3HcNK0PXv2DEePHsWiRYvg6OioeNzOzg5+fn61vleZbUylHonFYjGWL18Od3d3Za6WM7a2tlyHUG9XrlyBra1tnc/KxsfH01QzTYxAIECLFi005kNpXYwbNw7e3t6Qy+V1er2trS127twJqVSK9evXIyUlRcUR8tPjx48RGBiIeo6TqlJyufyN8WhDjcfGxmL9+vWYMGECPvjgA85jbd++Pdzc3NC8eXOsWrUKT5484TQeTZCZmYm//voLX375JZo3b875PnodVzUuEokwaNAg+Pr6qnW7pHoPHz6Eq6srJkyYgKVLl6qsIV9h7ty5ePDgAZKTk1W6HW04hpOmi2VZnDt3TjEryesNeeDVZ9CkpKRa16HMNma9R7N/k5s3b+LLL79ERESEMlerVhWj2StjhEF1evr0Kc6cOYMtW7bU+QDo7u6Otm3bwtnZWcXREdI47u7uMDU1xahRo+r1vvT0dPz000/o1asXZs+eTVeT6iggIAD//vsv7O3tERUVhffee++Ng2mqUkZGBjw8PBAZGan4wCoUCtGyZUu0bt0aVlZWaN26NVq3bq1xDa/XVdxb5+/vj5UrV2rk9E+FhYX49ddfIZVK8cknn2hkjKrGMAzWr1+PZcuWoU2bNlyHo1FKSkqwZcuWOveUIsoXHR2N06dPw9LSEgsWLFB5I/51mZmZ2L17N9zc3OiqOWly8vPzsW/fPjg5OWHatGk1ftZYu3Yt3NzcalyPMtuaDW7M//jjj1UeS09Px8mTJzFlyhR0795d8fiXX37Z8Ag5UPELzs7OhpmZGdfh1Mm1a9dw//59rF69ul6j5O7duxfz58+nuYmbEIZhEBAQgP79+2vVP2KJRIJVq1Zh5syZaN++PWxsbOrcMGdZFlevXsWtW7ewdOlS2NvbqzZYLRcYGIjLly9j48aN0NXVRV5eHk6fPo309HR88MEH6NSpk1riqJgm6cqVK9DV1VX8b6n45ymTyZCVlYW0tDSkp6cjPT0daWlpKCwshJOTE2bMmKFRNV5SUoIDBw7A1tYW7733nsaecKjw7Nkz/Prrr7CxsYGRkRH09fWhr68PPT29St/19fVhbm7Oq0bvoUOH0K1bNwwfPpzrUKrF9XH8yJEjGDJkSJUrUkS1IiIicPbsWZiZmeHtt9+GlZUVJ3Hcvn0b0dHRKpvZgOv6JqQCy7JIT09HREQEwsPDkZSUhM8+++yNg9utX78e27Ztq/Fzak5ODszNzbltzNe1a7ZAIEBsbGxDNsGZisZ8bm6uxl+RkMlkOHToEPT19bFw4cJ6X3Vcu3Ytdu7cqfEfKonyyOVyJCUloW3btlp3lTo5ORlPnjxBbGwsUlNTIZPJYGpqivbt26N9+/Zo164dzMzMaqzn7Oxs/Pzzz2jXrh3ee+89+pBQjZCQEJw/fx6urq5VrvZkZmbijz/+QFFRET788EPY2dmpJIbCwkJ4eXnh4cOH6Nu3LyZOnFivsT3kcjnOnj0LsViMOXPmqCTG+mBZFmFhYfjtt9+wYMECTns41JdcLsfLly9RVlYGiURS5XtpaSkkEglSU1ORnp6OESNGYOTIkWq9Uqhsvr6+CA0NxRdffMF1KDXi+jiemZmJQ4cOwdXVVe3bbmpYlkVQUBDc3d1ha2uLOXPmaMRAlbt378b48eOVOthehbrUN8uyeP78OWxtbWFoaKj0GEjTJJVK8eLFC0RERCAyMhKlpaVo3bo1unfvDkdHR7Ru3bpObaaffvoJs2bNgrW1dbXPV4xZxmljns8qGvOZmZkafcU6Pz8fu3fvxpgxYzBixIh6v59lWaxbt67WbiCEaLq8vDzExsYqvnJyciAUCmFjY4P27dvDwcEB9vb20NXVBfCq7m/fvo1///0XvXr1wqRJkzTig5EmCAsLw19//YXNmzfX2hhLTU3F//73PwgEAnz44YeKq0NyuRy5ubnIzMys8lVSUgKxWAwjIyMYGhqiWbNmlZYNDQ0hFArh4+ODwsJCTJgwAQMHDmxwQ6ViMNAePXrgrbfeatA6Gksmk+HOnTvw8vJC586dMWfOHJiYmHASizpIJBLcunULt2/fho2NDSZPnqx1Y7KkpaVh//792LlzJ53sewM3NzfMnz+/xg+rpHFYloWvry88PDzg6OiImTNnatStnyUlJdiwYQO2bt2KZs2aqXXbUqkUBw4cgFgsRn5+PkpKSmBnZ4devXqhZ8+eKp9NgvBPYGAgzp8/D5FIhM6dO8PR0RFdu3Zt8Imiy5cvw9LSEoMGDar2+aysLLRs2ZIa86pS0Zj38vLC+PHjuQ6nWjExMfjll1/w+eefw8HBoUHryMrKwm+//YaVK1cqOTqiyRiGgZ+fH1xcXHj7YVUulyM1NRUxMTGIjY1FXFwcysvLYWRkpGjgd+7cGXFxcbhy5QrkcjkmTZqEvn37NtleKhEREfj999+xadOmOs/vGx8fj//9738oKSmBUCiEQCCAmZkZWrZsWeXL0NAQDMOguLgYJSUlKCoqQklJCYqLixVf5eXlGDx4MGxsbBqVS0WNDxw4EG5ubpgyZYpa51EvKSnBlStX8ODBAwwdOhTjx49vcnMmx8TE4J9//kF6ejpGjRqF4cOHa/zV+vLycqxbtw7ffvstWrVqxXU4tdKE43hUVBQ8PT3x1VdfcbJ9vsrOzoa/vz9u3rwJZ2dnTJ06VWOvPEdERMDDwwNr165V6nprq++8vDzs2rULU6dOhYuLC4BXJz4SEhLw+PFjhIWFobCwEG3btlU07jW9ly3hlqenJ4KDg7Fq1SrFhZ/GCgkJwYsXL/D2229X+/zVq1cxYcIEasyrSkVj/tq1axg7dizX4VRx+/ZtXL9+HatWrWrUVZ5Hjx4hKSkJM2fOVGJ0RNPJ5XKkpaXByspK67rZN1ZxcTHi4uIQExOD8PBwFBYWwtHREZ07d0ZkZCQeP36MgQMHYvz48bUeXMvLyxWvf/HiBYRCIZo1a6a42vzf78bGxmjfvr3GniiIjIzEqVOnsGnTJo390Fgfr9e4VCrFpk2bsGjRInTs2FGl283IyIC7uzvi4+MxceJEDBkypMn9jf1XWVkZfHx8cOfOHbRt2xaTJk1Sy9V6qVSK2NhYdOrUqc5/dz/88AMGDhyIgQMHqji6xtOU4/j69euxZs0auhLaCIWFhQgNDUVwcDBSUlJgbm6Ovn37YtiwYUprWKjSqVOn0LZtW6X2gKqpvuPi4vDjjz9i2bJlaN++fY3vZ1kWycnJePz4MUJCQiCTyTBjxgz07NlTY/8PE/VjWRYnTpwAwzBYvHixUmsjJycHx48fx6pVq6p93tvbG+PGjaPGvKpUNOavX7+O0aNHcx2Oglwux4kTJyCRSPDpp582eh7gc+fOoUOHDnByclJShIRoF5lMhoiICDx48AAxMTFo2bIljI2NkZCQAGNjY8ycOROdOnWCRCJBRESEovEuEonQtWtX9O7dG507dwbwaq7f4uJiFBUVVVouLi5GSkoKRCIRvvjiC437IPH8+XMcP34cmzdvrtfgmdqksLAQmzZtwrf
ffquSAaNiY2Nx9uxZMAyDmTNn0qBgNYiOjoaXlxeSk5MxZMgQjBo1SiU1l5+fDzc3N1hbWyMxMRFDhw7F2LFja+0dcePGDcTHx6tsQC/g1QmGuLg4mJqaokWLFtDR0VHZttTF398f0dHReP/997kORWuUlZXhyZMnCA4ORmxsLJo1a4bevXujT58+sLa21rj/EW8ik8mwbt06fP311yrt0fLw4UNcvHgRa9eurfeV9uzsbFy6dAnPnj3D2LFjMXLkSN72TCR1I5VKsXfvXnTt2hXTpk1T+vrfdCvzjRs3MGbMGGrMq4omdrMvKirC7t27MXjwYIwbN04p69y9ezc++eSTeg0sRbQfwzC4e/cuhg0bRv/M/iM9PR0PHz5EcHAwCgsLkZCQgKKiIvTr1w89e/ZE79690bFjxwadSHN3d0d6ejo+++wzjfmwFhUVhaNHj2LTpk1qv+dRlaqr8YyMDOzatQuurq5K63JZUlKCkydPoqCggO4drofy8nL4+vrCx8cHzZo1w/jx49GrVy+l/F0kJCTgwIED+OKLL9C+fXswDIM7d+7g+vXrsLGxwfTp09G2bdsq7zl8+DC2b9/e6JPkNYmOjsahQ4fQtWtXFBUVIScnBwzDKJ7X19eHmZkZWrRoARsbG7i4uNR6VVZTjuMsy+Kbb77Brl27eHFyQhVYlkViYiL8/f3x+PFjiEQi9OjRA3369NHoHlv1kZqaip9++gk7duxQSk8RqVSKu3fvYvjw4RCJRPj777+RkJCAFStWNKrOJBIJvL29cfv2bfTp0wdTpkzRqHEIiHoUFhbCzc0NU6dOVWlPrNoGGadu9iqmad3sJRIJNm7ciIULF6JLly5KWy+NZN80yeVyZGVlwcLCosl3Aa5NWVkZ0tLSkJmZiTNnzmDevHkN7sWSn58PIyMjXLx4ETk5OUrvztUQsbGxOHjwIDZt2sS7LrI11XhcXBwOHTqErVu3Ql9fv8HrZ1kW9+7dw6VLl/DBBx+o9X58vnn58iWuXr2KsLAwODk5Yfz48Q0eeDYwMBB///031q5dW+1J6qioKFy+fBm5ubmYNGkSBg4cqLhPfv369SoZCFMmk+H06dOIj4/HsmXLajyRVFpaitzcXOTm5iI2NhZ37tyBo6Mjpk6dWm1cmnQc9/T0hEAgwIQJEziNQ5OUlZUhODgY/v7+SEtLg62tLQYOHIiePXuqtet8fn4+JBIJLC0tVb4tLy8v5OTkNKqXRkpKCry9vfHkyRNYWlqirKwMmZmZcHFxUep0nizLIiAgAP/88w8sLCwwa9asKif5CD+lp6dj9+7d+PTTT1U+ze7u3buxePHiao/71M1exTSpmz3Lsti1axfGjBmDfv36KXW969evx86dO5W2TkL4qqSkBMePHwfDMFiyZEmdugazLIsnT57Aw8MDDMOgpKQErVu3hlQqhZmZGT7++GPOGvTJycnYt28fNm3axOvR1asTFhammHqvIVc009PTcfDgQTg4OODdd9/VintatQHLsggODoaXlxcEAgHeeeedeg3uevnyZURERGDlypVvHGjv9akPWZbFhx9+qJLbzRISEvDzzz9j3LhxeOutt+r1986yLEJDQ+Hh4QE9PT3FLT+aqLy8HGvXrsX333/P+UlKLhUXF8PT0xOhoaEQiURwcnLCgAEDOOmxw7IsfHx84OXlBQMDA7Rv3x7vvfeeSgehZFkW+/fvR35+Prp27Yru3bujU6dObzxGFhYW4tatW/Dz84OFhQXGjh2LHj16ID8/H5s2bYKlpSXy8/NhZWWFMWPGwNHRUal1FhsbC3d3d+Tk5GDgwIEYNmwYDZjHU8+fP8eRI0ewevVqtQxyeubMGXTt2hW9evWq8hx1s1cxTepm/8cff8DQ0FDpg9Slp6fjzJkzWL58uVLXSzSfVCqFj48PRo0aRd0i6yksLAynTp3C3Llza+yaJZFI4OPjAx8fH3Tu3BlTp05VXBVJTk6Gj48Pzp49CzMzM2zYsKFeA3QpQ1ZWFnbu3Il169Zp9NSbjfGmGr937x4ePnyIb775ps6/e4ZhcO7cOYSHh+Ozzz5DmzZt6vQ+lmWRkpKiuDe2Z8+edHvLG6SlpeHs2bPIzc3F3Llzax2DQCaT4eDBgzA2NsaHH35Y7wZzWlqa0htbLMvi/PnzCAsLw5dffomWLVs2an2pqalwd3dHcnIyJkyYgCFDhkAul2vUcfz3339H586d0b9/f65D4UTF7R2zZs1Cv379OJ25ISsrCz///DM6dOiAd955ByKRCH5+frhw4QJmzZoFFxcXlf7PKS0txbNnzxAeHo4XL15AKpXC3t4e3bt3R/fu3WFqagqGYfDw4UPcvHkTcrkcI0eOxKBBgxS/t8DAQMTGxsLJyUkxcGliYiJu3LiBiIgIODo6YvTo0bC1tVVa3BKJBA8fPsSdO3cglUoxePBgDBkyhFe3oDVlDx48wD///IN169apbZ/6+fkhOzsbU6ZMqfIcdbNXMU3pZn/v3j0EBQXhq6++UvqB9+HDh3j58qVKBn0gmk0ulyMvLw+mpqacd8/URhKJBKdOnUJubi6WLl2qOAhnZGTg0qVLiIqKwujRozFy5Mgar0iwLIs9e/bg2bNnMDMzg6OjI0aNGgU7OzuVxl4xENzXX39d58aoNqpLjV++fBmhoaHo3bs3bG1tYWtrCzMzs2qPtU+ePMHJkycxceLEWq+wlpSUIDY2FjExMYiJiUFmZiYEAgGsra3h4OCAvLw8PH78GCYmJhg8eDCcnZ2b3JR19ZGdnY2///4bSUlJmDVrFvr06VPpd19cXIxdu3Zh5MiRGDVqVKX3ZmZm4smTJ+jXr59a74lNS0vDgQMHMGTIEEyaNEmp/7tLSkrg5eWF+/fvw9nZGR07dkTHjh01ondNQUEBtm7dimXLlsHW1rZJXaG/c+cOvLy8sHr1ak7HIGJZFleuXMG9e/ewdOnSKv9PJBIJTp8+jbi4OCxZsqTRU4DWJ674+HiEh4cjPDwc+fn5kMvlGDhwIEaNGlXpKnh8fDxOnDgBGxsbTJ48udrZGliWxdOnT3H9+nWkpKSgX79+GDlypFJvJSgpKcH9+/dx7949CIVCDBs2DC4uLo26PYtw59atW/D398c333yj1pPpycnJuHTpEr744osqz1E3exXThG72UVFROHnyJLZu3aqSwvvrr7/g6OiInj17Kn3dhDQFz549w7FjxzB48GCEh4dDJBJh+vTp6NatW50+yLIsi5MnT0JfXx89e/bEjRs3UFpaiq+//lolV3XKysrg6uqKTz75ROVTtGmLtLQ0xMfHIykpCYmJicjOzgYANGvWDG3atIFMJsPTp08hEAgUvbQqZih4fbaC0tJSyGQyGBoaon379nBwcICDgwNatmxZbS1kZ2fDz88Pjx49AgAMGDAALi4uNBhpDQoKCnDhwgVERkZi2rRpcHFxQXp6Ovbs2YOPP/4Y3bp1U7y2sLAQp0+fRkpKCvr374/g4GBIJBL07dsXQ4YMUdm9wyzL4t9//8X9+/fx1VdfqWTWhApyuRxBQUF48e
IF4uLiUFRUBKFQCCsrK9jZ2cHe3h729vZqH9jryZMnuHv3LhISEtCiRQv07dsXffv2VclYBJpALpfj+PHjKC8vx6effsppj5vU1FT88ssv6NOnD2bMmFHrifrU1FQcOXIEdnZ2eP/99zntRVAhOzsbv/32GyQSCRYuXFjnLtAMwyAoKAi3bt1CXl4eBgwYgOHDh8PMzExpsRUWFuLevXu4f/8+WJaFqakpWrZsWeWrWbNmTeoklrYoLS3Fhg0bsHv3bpUNcFoTmUwGV1dX7Nixo8pz1M1exbjuZp+Tk4Pt27dj8+bNKvtn7Obmhs8//5xG8WyCpFIpvL29MXbsWI3onqnNGIbB7du34eTk1KAPrCzL4tixY2jRogXmzJmDoKAgnDlzBqtWrWp019z/xrllyxbMmTOnSZzAa2iNsyyLiIgIeHl54enTpzAzM4OVlRVsbGxgZGQEIyMjNGvWrNJ3IyMjGBgYNPhDXElJCQICAnD//n0UFhZi6NChmDBhAvWaqUZJSQk8PDzw6NEjCAQCrFq1StE4Lysrw4ULFxAWFoZ3330XvXv3VrxPIpEgKCgIvr6+yM7ORo8ePTB06NBqryAzDIPMzEykp6fj5cuXiq+SkhJIpVIAr+oEAAQCgWKZZVn06dMHs2fPVssH+v/WuFwuR3p6OuLj45GQkID4+Hjk5ubCwcEB77//vtq7CmdnZyM4OBhBQUHIzs6GnZ2dYlYQQ0NDtcaiCgUFBfjuu+8wYsQIjBkzps7vqxh4rVmzZmjZsiXMzc0b1cCQy+W4cOECQkND8cUXX9TrJNKDBw9w7tw5zJgxA0OGDOGkIVpWVoazZ8/i2bNn+OijjxRTvTbkGF5eXg5/f3/cuXMHpaWlcHFxwbBhw5Q6wCvLsigoKEBGRgYyMzMrfRUWFoJlWRgZGWH+/Pk0oJ6GOHXqFLp06aLSUetrs3bt2mqnp6Nu9ipW0ZhPTU1V6dn16pSXl2PDhg34/PPPVdrltqbiIvzHsiwKCwvRvHlzOousAViWxeHDh9GqVSvMnDlTMTjdxx9/jC5duqC0tFRx9bekpATNmzevVxd5uVwONzc3jBw5Ei4uLirMRHPUp8Yrumzevn0bcXFxcHR0xIgRI2Bvb6/2vw+GYeDp6QlfX1+V96Co6HJbWloKZ2dn9OzZUyOu0NVFeXk5BAIBdHR0wDAMrly5grt372LmzJlvvB9YJpPhyZMnuHfvHhITE2FjY4OioiKUlpaCZVmIxWK0bNkSrVq1UnxZWlqiWbNm0NHR0ZhjZl1rPDg4GKdPn8bQoUMxefJktV+ZAv5varagoCDcv38fs2fPxqBBg9Qeh7JERUXh0KFD+Oyzz+r1N8qyLH744QeYmJhAT08PGRkZyM7OhlwuBwDo6enB0tISlpaWsLCwqLYR+/q+ZhgG//zzD0aMGIEJEyY0qDbLy8tx5swZPH/+HO+++y66d+9e73U0hFwux9WrV3Hjxg1FPbwef2M/p5SWluLBgwe4e/euYpDL+gyo2Rjp6ek4cuQIrKysMG/ePLqVikPZ2dnYu3cvduzYwdmxe8uWLVi1alWVOqgYr4Ua8ypS0ZhXxi+4PliWxXfffYdRo0apdBAZuVyOjRs3VtvtgxCifhUN+sTERAiFQpSXl+Phw4do06YNevToAUNDQxgYGMDQ0BAZGRnIz8/H1KlT0bdv31r/QbEsiwMHDqBbt24aMc0m1yQSCTIyMhRXVWJiYhAXF4du3bphxIgRaNeunUY01nJzc3H06FEYGRlh4cKFSr+S6e/vj7Nnz2Lu3Llo1aoVHj16hMePH0MgEKBXr15wdnaGnZ2dRvwualIxWveVK1cwduxYjB07tt69GViWRXp6OkxMTHhxtbgmcrkc165dw/Xr1/H222+jf//+nO1bqVSKdevW4euvv1b7xRJl8Pb2xt27d7Fq1ap6fT5kWRY//fQT2rVrV+1gWMCrq9QVx6esrCzIZDLFe2syaNAgpdzGkJOTgzNnziAlJQVvv/220ntwsSyL3NxcJCQkIC4uDvfv38ewYcMwadIkld+ekJ2djUOHDqF169aYN2+e2k5aVhxnp06diuHDh2v08ZSv9u/fj4kTJyp6fHDh5MmTGDJkSJUTf8psa1JjvhoVv+CsrCy13uv1559/Qk9PD7Nnz1bpdlJSUnDx4sVqB2Qg/CeVSuHp6YmJEydSN3sNJpfLcezYMcjlcixZsqRSQyUvLw8eHh4ICwvD6NGj8dZbb1W7L0+cOAETExPMmjVLnaFzLj09Hf7+/oiLi0NWVhbkcjlYloW+vj5atmypuPrVtm1bTq7A11VISAh+//13pX0YrPhQa2lpifnz51f5UCuRSBAWFoZHjx4hPj4eFhYW6N+/P/r3768xjV2JRAJfX19cvXoVAwYMwNSpU5vk9IANOY6XlpbizJkziImJwcKFC9G+fXsVR1m99PR0xdUybdl3DMPg0KFD0NfXx6JFi+p14ohlWRw8eBA2NjaYPn266oJUgtzcXJw9exaJiYl4++23q51SqzYsyyIrK0txq0dCQgKysrIgEAhgZmamGNPB0dGx1mOKKj6n+Pn54fz585g3b55KpqKsTnl5Of7++288e/YMixcvVuro+6R2CQkJOH36NNauXctpHDdv3gTLslXGYMvOzoaFhQU15lWlojGfl5entlFifX194e/vj6+//lrlHyx9fX2Rl5eHyZMnq3Q7RDOxLIuysjLo6+trbCOG/J/r16/j3r17WLVqVZX7XsvLy3Hz5k3cvHkTvXr1wtSpUxXHrPPnz6OgoAALFy7kImzOsCyLjRs3YsqUKWjXrl2j70nlGsMwintKP/300waNQC2Xy3H58mX4+/tj8eLFdW7EZWRk4NGjR/Dx8cHGjRs5m3tZLpcjODgYN27cQEFBAQYPHoyRI0dqzAkGLjTmOJ6RkYGTJ09CLBZjwYIFSh0srK78/f3h5+eHFStWqH3b9ZWXl4fvvvsOEyZMwLBhw+r1XpZlceTIEVhYWKj8Qo0y5eXl4ezZs4iPj8fcuXPRu3fvKnUmk8mQkJCA6OhoREVFITk5GSzLwtLSEvb29oqGe02zhNRGVZ9TSkpKcOzYMTAMg8WLFyv1fvravHz5EkePHoWlpSU+/PBD3na9T05OxtmzZ5GdnQ2xWAwTExOYmJjA1NS0yncDAwMwDKP4kslkkEqliuX/Pvffx2UymaL3ypAhQ6r8f9qyZQsWL17MeQ+gqKgo3Lt3r8pnsfz8fJiamlJjXlUqGvMvXrxQy6jPMTExOH78OLZt26aW0VD/+OMPODk51Tp3L+EvlmXBMAzEYjE15rXEs2fPcPTo0RqnlGNZFoGBgbh8+TLMzMxgY2ODtLQ0lUxrqem8vb2RnZ2N2bNn86rG09LScOjQITg4OODdd9+t8xXNqKgoHDt2TNGltSG/j7i4OPzyyy/YuHGjWqdBi4mJgbe3N2JiYuDk5ITRo0fXeZRrvlPGcfz58+c4efIkevTogTlz5qj9Kvmvv/6Ktm3bavQtQNHR0Th48CC++OKLevdkYFkWv/76K5o3b
463335bRRGqVn5+Pv7++2/ExsYqBut68eIFMjMzIRKJYGtrq5gi0cbGRmkDd6r6c0pERAR+/fVXTJs2DcOGDVPb/4lHjx7hr7/+woQJE9C9e3eYm5s36u9OLpdDIpFAIBDU+gVApTnGxcXhzJkzAIB3330X9vb2YBgG+fn5yM/PR15enuJ7xXJpaSnEYrHiSyQSQUdHR7H83+f++3PFYxWDJVpZWeHtt9+GhYUFwsLC4Ovri6VLl6os57oqKyvDd999h02bNlV6PCoqCp06daLGvKqoczT71NRUfP/999i0aZPaPiRt374dK1asgJGRkVq2RzQLdbPXTllZWfjuu+/wzjvvoG/fvjW+Li4uDiEhIZg+fbrGjIju7++PEydOwN7eHqNGjUK/fv1UcrU8Ly8P27dvx7Zt2+Dt7c27GmdZFr6+vvjnn38gFouhq6sLc3NzmJmZVfpubm4OHR0d/PbbbygsLMSSJUsafVU9NjYWBw8exKZNmxp0NaviHvfi4mLo6OhAV1cXYrG4yrJIJEJoaCiCgoLQrl07jBkzBh06dODNSRllUdZxnGVZ3LlzBx4eHpg1a9YbBxBUporxexYtWsRZl//a3L59G9evX8fq1asb9GH75MmT0NXVxfvvv6+C6NSroKAAPj4+sLS0RIcOHWqcdlNZ1PE5RSqV4vTp04iPj8fSpUuVOoPMm7br5eWFhIQEZGdnQyqVgmVZCAQCmJqawsLCAi1btoSFhQUYhkFeXh5yc3MV3yUSiWJdAoEAenp6YFlWMbZCxW1lry9XfDcyMoKtrS1sbW1hZ2eHNm3aNOpkwosXL/DXX3/ByMgI7777boN6jilLZGQkzp49C2NjY0RHR+O7775TW8+LN6lu0HHejmZ/9+5d7NmzB0FBQUhLS8PFixcr3V/k7u6OI0eOKKY6CQkJqTT9TE5ODjZt2gRvb28kJSXBwsIC06dPx7Zt2+rVUK5ozHt7e9drypH6SktLw+7du7Fu3Tq1HUQAGsm+qaMr89pLIpFgx44dmD59Ovr06cN1OHUSERGB06dPY/PmzcjLy4OPjw8CAwNha2uL0aNHo0uXLkqrw++++w7Tpk1D586dm0SNSyQS5OTkKL6ys7ORnZ2NnJwcFBYWYtq0abWe+Kmv6OhoHDlyBK6urvX6kJSdnY19+/ahd+/esLW1RXl5uaI75X+XGYZBt27d0KdPH62+PULVlH0cl0gkOHPmDKKiovDJJ5+odDad1+Xl5WHr1q3Ytm2bxlxgYFkWJ06cQElJCT777LMG9Zj8/fffFaOok/pT5+eUhIQEHDp0CI6OjpgxY4bap3GsUDFIYFZWluJLR0cHLVq0gKmpqeJ7YwbwKykpQWJiIhITE5GQkICkpCRIpVKIRCLY2NjA2toaVlZWsLa2RqtWrWo8kRIREYGzZ8/C3Nwc7777rmKKUE3w559/4vLly+jXrx/effddjZgicP369di6dWul/2nXr1/H2LFjldKYV32f7nooLi5Gr169sGDBgmoHbCouLsbgwYMxZ84cfPLJJ1WeT01NVVzp7tatGxISEvDpp58iNTUV58+fV2qsLMtCJpM1uFt8eno69uzZg7Vr16q1Ic8wDH1AIop/kkS76OnpKf4piMVijZ8zPj4+HqdOncKWLVugo6ODli1b4u2338bbb7+N+Ph43LhxA8ePH0ePHj0wevToek2591+BgYFo3rw5unTpUumDIJ/p6enByspKbfcEdujQAYsXL8bWrVuxadOmOn3o9fPzg7u7O5YtW6a2BmJTocwa19PTw/z585GRkYFjx47BxMQE8+fPV/mVLVNTUyxatAj79u3Dhg0bOD/5VlJSgt27d6N///6YOHFig9bx119/QSqVNrnxSpRNXcdwOzs7uLm5ISAgANu2bUPHjh0xe/ZstY8RUjFIoJmZGTp16qSSbRgaGqJLly7o0qVLpccZhkFqairS0tKQmpqKwMBApKeng2EYAICxsTGsra1hZmaGBw8eoE2bNvjqq6/UOkh4XTAMg6CgIPzxxx9IT0/H6dOnwTAM3n33XU57/1hbWyMtLa1Rn3Fqo1FX5l8nEAiqXJmvEB8fj3bt2lW5Ml+dc+fO4YMPPkBxcXGdDwp16WYfFhaGPXv2YM2aNfW+9zwjIwO7du3C6tWr1X7/38GDBxXTMJGmibrZa7+ysjJs3rwZH374Ibp168Z1ONV6+fIlvvvuO7i6utb6oYhlWTx58gQ3btxARkYGVqxYUe/jYllZGdatW4cdO3bAwMCAalzFXrx4gePHj2PTpk01Xk2VSCQ4dOgQdHV18cknn9B+UDJV1/jjx4/x+++/Y/jw4Zg0aZLKb9k5f/48WJbFnDlzVLqd2qSkpGDv3r1YtGhRg8cUOnfuHHJzc/HJJ59wfmJCm3F1DGdZFqGhoTh//rziHmx1XnDTRCzLorCwEKmpqcjIyECvXr3UOnZKfbi7u6N58+aVelWnpaXh7NmzyMzMrPY9urq6MDAwgL6+Ptq0aYPhw4cr/STFv//+CzMzM7i4uCgeU2Y3e7AaCgB78eLFap+Li4tjAbAhISFvXM+xY8dYCwuLWl9TVlbG5ufnK76SkpJYAOzVq1dZlmVZhmFYhmEqLXt4eLDXrl1jv/vuO/bXX39lS0tLWZlMxrIsy0ql0mqXy8vL2fT0dHb58uVsUlISK5fLFY/L5XJWLpdXWWZZttKyTCartCyVSmtdZhhGsezp6ckePXq0xpwq4n19uS45vb6s7pxqyoNyopz4nlNRURG7Zs0a9sWLFxqXU1ZWFrtixQo2LS2tXjmlpqay33zzjeL4WNecDh06xD548EAj9xMfa08ul7Ph4eHsqlWr2KKioio5xcTEsCtWrGAfPnyoVTnxcT81JieJRMJevnyZ/eabb9i7d++ySUlJbElJiUpyYhiG3bZtGxsaGsrJfgoICGC/+eYbNj09vd45ZWZmspcuXWLXr1/PnjhxQhEX1Z525xQeHs5u2rSJ3bdvH5uSkqLWnPLy8tgzZ86waWlptJ/qmFNeXh77zTffVIr9TTnJ5XK2pKSEzc7OZtPS0th79+6xO3bsYNeuXct6eHiwOTk5Dcrpv23C0NBQ9vTp05Vy8vLyYgGw+fn5bGNpxuhIKpKdnY1t27ZhyZIltb7Ozc1NMX2CiYmJ4v6K2NhYAK8GVYiMjATw6op8xRQcenp6mDlzJtq1a4czZ87g8ePHAF51K0xLSwPwahyArKwsAMCNGzdw6NAhfPvtt3j8+DEKCwsBAJ6enigrKwPDMPD09ATDMCgrK4OnpycAoLCwEN7e3gCguOcUeDUg1t27dwG8OvPk5+cHAEhKSkJAQACA/xsM6+nTp0hISEC/fv1qzAl4Na9xXFwcACAgIABJSUm15uTj44O8vDwAr0aRVmdOwKvRIMPCwiineuaUnp4OlmV5lRMf91NtOQmFQgwcOBBHjx5FZGSkSnIKDg5GaWlpvXIqKSnBtWvX8NFHH6F169b1ysnU1BRDhgzBvn378OzZszrl5OPjg9zcXFhZWVXKKSAgACzLcr6f+Fh7ZWVl
iI6Oxvz58/H999/j2rVrAF7NUX3lyhWcOnUKX3zxBYqLi7UqJ23aTyzL4tq1aygoKFBZToGBgZg6dSo++ugjZGdnw8PDA7/99htOnDiBdevW4cSJEzhz5gzc3d1x/fp1BAYGNjin/Px8fP3114iOjkZqaqra9hPDMHB3d0dERAS2b9+OzMzMOu2nyMhIuLu748yZMzhz5gzMzMwwcuRIxYwRfK49deVUUFDAaU7t27fH+vXr0b59e/z222/Yu3cvPD09ERQUhKCgIFy9ehVRUVGIjIzE9evX8fLlS8TGxuL27dsN3k9hYWH43//+h7Nnz6JZs2b44YcfEBwcrNH76U05qav2bt++jblz50IikdQ5J4FAgJycHISHh6N169Zo27YtRowYAVdXV+jo6MDDwwMbN27E1atXERwcXG1OQUFBuHPnDtzd3XHo0CGsXbsWv/32GxITExU5GRoaIjExsVJOFb8vZeBtN/uCggKMHTsWLVq0gIeHR63ddCQSSaXRIQsKCtC2bVv8+++/mDRpkmIeQ5FIpFjesmUL1q9fr5giIT09HT///DOcnZ0xceJEiEQiCIVCxYfu3Nxc7Nq1C19++SXatm0LqVSqGNijYhmofI8QwzDQ0dFR3P+po6MDuVwOmUymWJbL5RCLxTUuy2QyxSjYW7ZsgaGhYaU8/rvMMAwEAoFiWSgUVsrjv8vS/z9wRsWyunJiWVaxXF0elFPNOZWUlMDHxwfjxo0DAF7kxMf9VNecSkpKsHXrVnzxxRdo165do3IqKytDeHg4AgMDERcXB3Nzc2RmZqJr166YPn06zMzMas1JJpNh69ateOedd+Do6NjgnIqLi7F9+3YsWrQIXbt2rTEnqVSKDRs2YO3atTA2NlY8XlZWhhs3bmDcuHGK7sFc7yc+1p6Ojg7Cw8Px119/4YsvvsChQ4fQu3dvzJgxQzGCsjbmpA37SSaT4dq1axg7dix0dXU5yamkpASZmZnIzMxUfMj/4IMP0Llz5wbvp+joaJw8eRLbtm2rNC6RKnIKCQnB6dOnMXbsWIwePbrWfcYwDLKysuDr64vg4GA0b94cw4YNg5OTE/T19ZtU7akjJ4lEAh8fH7z11luK6ci4zikpKUlxAkj6/wftrPjOMP83gGdBQQHkcjlcXFwwaNAgmJqavnE/lZWV4cKFCwgPD8esWbPQt29fCAQC3Lp1C8nJyZg3b55G7idNqb2MjAwcOnRIMf2bMnOSSCS4d+8e/Pz8IBKJ0K1bNyQnJ+Ply5cQiURo3bo1OnTogPbt28PW1hYGBgbYt28f3n//fbRq1UqR64YNG7Bt2zZFTleuXMHkyZP5N5r96xrTmC8sLMS4ceNgaGiIf//9F/r6+vXadsU989evX8fo0aOrfU11I8KzLIvz58/jyZMnWLFiBVq0aAHg1Sj7FdPBqXtURYlEgg0bNuCrr75S2cALhBBu5ebmYtu2bVi5cmW9p4ZJTU1FQEAAgoODIZPJ0KNHD/Tv3x/t2rWDQCBQ3EPo7u4Oc3NzzJkzp9ptyOVy7Ny5E2PGjMGAAQManVNRURG2bduGefPmoXv37tW+5uzZszAxMVH5FKKkdk+ePMFvv/2GpUuXauQUY0Q9ioqK4Orqiu3bt8PQ0LDB67l9+za8vLygo6ODtm3bws7ODvb29rCzs4OBgUGj43z58iWOHj0KCwsLzJ8/v8ZYWZZFTEwMHjx4gPDwcLRo0QLDhw9Hv379aAwIUqvi4mL4+fnB19cXLMtiyJAhcHFxqTJoaElJCS5cuICwsDDMmjULAwYMqDLWgpubG6ZOndrgcRy0UV5eHq5cuYLQ0FC0bNkSjo6OcHR0hJ2dXbVjUezevRuzZ89W+f+fvLw8REVFwc7OrtbpGc+dO4d27dopekMDr9qNO3fuVLznxo0bGDNmDDXmq2vMFxQUYNy4cdDT04Onp2eD/qFUNOYrznr/l/z/z426Y8eOat8fGxuLX375BbNmzUK3bt2wdetWLF++HLa2tvWOBXj1D+XUqVMwMzPD1KlT6zyoCsuy+O677/DWW2/B2dm5Qdsm/COXy5GXlwdTU1ONmYecNF5WVhZ27txZ68Ca5eXliIuLQ1RUFKKiopCamgobGxv0798fTk5Ob/ygHBMTg7///hsymQxz5sxB586dAbw61hw4cACOjo5Knc6zpKQE27dvx+zZs6tMxZeWloaff/4Z27dvr3JMpBonfKepNR4aGoqbN29i5cqVjV4XwzBITk5GfHw84uPjkZCQgNLSUojFYrRp0wYuLi71mtGjYvq9Fy9eYPHixdXOrlBeXo6QkBA8ePAAKSkpcHBwgIuLCxwdHWkmIDXS1PpuiOLiYty/fx/3798HAAwZMgROTk64evUqHj9+jJkzZ2LgwIE1frYvLi7Gxo0bNWr6RlWJjY3FhQsXUFhYiMmTJ8PZ2Rk5OTmIiIhAREQEEhMTIRQK0bFjRzg6OqJr167IyMjAhQsX8O2333IdvkJAQACSk5Mxc+ZMxWN79uzBwoULFYPreXt7Y9y4cfxrzBcVFSE6OhoA4OTkhH379mHkyJEwMzODra0tcnJykJiYiNTUVEyaNAlnzpxB586d0bp1a7Ru3RqFhYUYM2YMSkpKcPHixUpF37JlyzofiCsa8zdv3sSoUaOqPJ+Wloa///4bX331VY3rKC8vx/HjxxEcHIyNGzc2akqeo0ePwtzcHGKxGPfv38enn35ap7NPf//9NwBg7ty5Dd424R+pVAofHx+MGjWKri7wTMUI8mvXroWhoSGio6MVDfeCggLo6OigXbt26NChAzp27IhWrVo1aMTlly9f4u+//0ZaWhqmT5+OiIgING/eHLNnz1Z6ThKJBDt27MCECRMwaNAgAK9OHmzcuBGfffZZtb0EqMYJ32lyjR85cgTdunXD0KFDVbJ+hmGQkpKC48ePY+7cuejRo0etr2dZFr6+vrh48SJmzJiBIUOGVDru5eXl4cGDBwgICEB5eTl69+6NQYMGUW9GDmlyfTdGUVER7t+/j8DAQIwcORKDBg2q0//g8PBw/Pvvv1izZo0aolQvuVwOPz8/eHp6wtLSEjNnzqz14qdUKkV0dDQiIiIQGRmJzMxMbNy4UaNmHXj58iVOnz6NFStWKB77+++/0bFjRzg5OQGA4jYS3jXmb9++jZEjR1Z5fP78+Th16hROnTqFBQsWVHl+06ZN2Lx5c43vB14NzmBvb1+nOCoa8zX9ggMDA5GQkIBZs2a9cV0syzZqepLTp09DJpNh3rx5AF4N6nf48GG0bNkS8+fPh56eXrXve/ToEW7duoVvv/2WpkchpAlJTU3FDz/8AGNjY3To0EHxpYo5cwsKCnDp0iXo6urinXfeUfr6KzAMAzc3NwwdOhQjRoyAt7c3srOz8e6776psm4SQhpFKpVizZg3WrVun0nmoS0pK4Orqii+//LLGD/8JCQk4duwYOnbsiHfeeafKZ6a4uDj89NNPmDx5Mvr371+lGzQhmuLUqVNo06ZNjbf/apuioiJ4eXnhwYMHGDhwICZOnMibvz+WZbFu3bp
Kt2P7+/sjPT0d06ZNA/DmtmZ9aFRjXlNU/IJzc3Or/QB86dIlWFlZKeW+0NpcvnwZaWlpWLJkSZUGub+/P86ePYu33367ShzJycn48ccfsX37dujq6qo0RqJ95HI5srKyYGFhofXd10jTIZPJsGfPHjg4OMDf3x+7du1SDODzX1TjhO80vcbj4+Nx4sQJbNmyRaUXFLKzs7Fjxw5s2LABZmZmAF6Nm3Tnzh34+fnBxMQE8+fPR+vWrau8t6Ih7+rqqpKTnaThNL2+uSCTybBu3TosX74cVlZWXIfTYC9fvsS5c+eQkpKCiRMnYvDgwbzcx2vWrMGuXbsUP6elpeHcuXP48ssvAbzqEdSiRQulNOb599tTIrlcXu3jycnJtXa/YlkWhYWFSEtLQ2ZmZoO2ff36dURHR1fbkAeAAQMGYOfOnQgNDcWOHTuQnZ0N4NW9Nfv27cPq1aupIU+qJZfLER4eXmN9E6KJRCIRVq9ejZycHCxevLjGhjxANU74T9Nr3N7eHr169cLly5dVuh1zc3N89dVX2Lp1Kzw9PbFp0ybs3bsXRkZGcHV1xerVq6ttyMfGxuLnn3+mhryG0vT65oJIJMLy5ctx4MABxcjw2iQmJgZubm44evQoRo8erehtx8eGPAA0b95cMYUeALRq1QovX75U/KzM2qYr89V4U9eHDRs2YPny5fjtt99QUFAAhmHw+q9RIBDAyMgIzZs3R25uLoyNjfHRRx/V+R+Gn58fbt26hbVr19apyGNiYnDkyBEMGTIEgYGBeOedd9CtW7c650sIIYQQokwsy2LDhg01DjbXWAzDICAgAD4+PkhISEBRUREOHTr0xs9aFYMUb9y4kRryROt4e3sjLS0N8+fP5zqUN6qYDefChQswNzfH3Llz6z3jjrb6448/0Lt370qz8bw+E5oyu9nXfGmD4MWLF5WmFaggl8sRFhaGDh06YOLEiW8cnOPp06dwc3ODk5MTZs2aVevrg4ODcfXqVbi6utb5bJWDgwN27dqFf/75B2PGjKGGPKmVXC5HWloarKyseHtGlDRtVOOE77ShxgUCAVasWIHvvvsObm5utfamqQuJRILnz58jIiICz549g0QigbOzMz777DOYm5vj9u3b+PXXX/H111/X2LU/NjYWBw8ehKurK0xMTBoVD1EdbahvrowZMwa7du1CRESExk5XJ5fLcffuXVy5cgVdu3bFypUrFdN1NxV2dnZITEys1Jg3MDBASUkJDA0N8eLFC6VtixrztcjKyqrymEwmg1AoREJCAgYPHlynUTa7deuGXbt24datW1i9enW1o6kCQGRkJM6dO4fNmzfX+5+eUChUDKpASG3kcjliYmLQqlUr+idJeIlqnPCdttS4hYUFJk+ejFOnTuHjjz+u8/tYlkV6ejqePn2KiIgIpKamQldXF507d4ajoyNmzJgBfX39Su8ZMWIEMjMz8ccffygGDX5dTEwMDh06BFdX10ZfCSOqpS31zQWBQIBly5bB1dUV27dvb9AU3Kp0//59XLhwAYMHD8bWrVvfOOUtX9na2sLT07PKY4mJiejSpUu1bcyGosZ8LaprUKenp6N169ZITEys10jKAoEAo0aNwuDBg3Hu3Dl4enpi4cKF6NixI4BXZ4srBoupaYR6QpRBLBZj2LBhXIdBiMpQjRO+06YaHz58OPz9/REWFlbrvPCFhYUICAjAw4cPUVBQACsrK3Tr1g2zZs2CtbV1nQbSmz17Nn755RdcvXoV48ePVzxODXntok31zYVmzZphwYIF+PHHHxXT1cnlcmRnZ+Ply5dIT09XfM/Ozkbz5s3x1VdfqbThL5fLceLECZSWlmL37t2N7omj7WxsbJCSklLpsYqr9V26dFHq76dp/6bfoLrBCSoGv0tJSalyVrgu9PT08MEHHyA7OxsnTpwAAIwbNw6nTp3Cpk2bNO4MG+EfuVyOpKQktG3bls54E16iGid8p201/sUXX2DDhg3Ytm0bjIyMALy6+p6UlAQ/Pz88fvwY+vr6GDBgAD7//PMG38suEAiwdOlS7NixAxYWFujXrx+io6Nx5MgRashrEW2rby706NEDwcHBWLlyJXR1dSEUCmFubo5WrVqhdevW6N27N1q3bg0zMzM8ffoUGzduxOrVq2Fpaan0WAoLC7F7924MHToUY8eOVfr6tZFYLK4yUKGdnR18fHwAKHcAPGrM16KmxryNjQ1EIlGj1m1ubo5vv/0WUVFRuHjxItatW0f3bxG1kMvlSElJgY2NDf2TJLxENU74Tttq3NDQEB999BF++uknjBs3Dn5+foiPj4etrS0GDRr0xvGE6kMoFGLVqlVwdXVFTk4Orl+/DldXVzRv3lwp6yeqp231zZX58+eDZdk39lpxdHTEypUrsWvXLnz22WeKXsHKEBMTg19++UXp6+UDXV1dSCQSRY9rGxsbJCUlAaDGvNpU1wUiOTkZtra2tU5NVx8dO3bEqlWrlLIuQupCLBbDxcWF6zAIURmqccJ32ljjPXv2REJCAiIjIzFx4kTY29urbA56PT09rF+/HocPH6aGvBbSxvrmSl3/hqytrbFlyxbs3LkTU6ZMUcrv9+bNm/Dx8cGmTZvogmQ12rZti6SkJHTo0AHAq+NSeXk5gOrbmA1Fp7tqUd08jhkZGSguLlbJNCuEqINMJkN0dLRWzlNKSF1QjRO+09YanzJlCt577z20a9dOZQ35CsbGxli1ahU15LWQtta3pmvevDm2bdsGX19fuLu7N3g9DMPg4MGDiI6OxtatW6khXwNbW1skJCRUekwkEoFhGKXWNjXma/H63PGvP5aYmAh7e3v1B0SIErAsi9zc3GrrmxA+oBonfEc1TviM6lt1xGIxvv32WxQVFeGnn36qd3fvvLw8bNy4Ed26dcOSJUsafdsxn1UMePc6GxsbpKamKrW2qTFfi/92gWAYBmKxGAkJCXRlnmgtsVgMZ2fnJj/SKOEvqnHCd1TjhM+ovlVLIBDgww8/RNeuXbF161aUlpbW+nqWZZGXlwd/f39s3boVixcvxogRI9QTrBarmIruv48lJCTQaPbq8t8uEKmpqbCyskJiYiKNOk+0lkwmQ1RUFDp27EhnVAkvUY0TvqMaJ3xG9a0eo0ePhqWlJTZu3Ii1a9eiWbNmSElJQVJSkuKroKAAAGBqagpbW1ts3boVzZo14zhy7WBgYFDlRImdnR0eP34MCwsLpW2HGvO1aNmyZaWfk5OTYWVlheTkZI4i+n/t3XtQ1XX+x/HX4RxA5KZA3BSIxLIVL6RZdPFSipF5mXY3u4x5q5kma3XcLnuZrX47TrZm22452baVWW1ZO6VdRlnYVDAdR1EYpbUSwxQDDVFA5HrO+f3R8h1PIIV+6Xg++3zMMPM93+/he94feHnw/f1+vt8D2OOHjsICgY6Mw3RkHCYj3z+N4cOHa9GiRVq+fLlCQkI0cOBApaSkKCsrS9OmTePjHM+T0+mU2+22Dkqlpqbqo48+0uWXX27ba9DMd6Pj7oMdjhw5osjISCUnJ/upIuD8OZ1OZWVl+bsMoN
eQcZiOjMNk5PunNXDgQC1dutTfZRgpKSlJVVVV1qegRUVFqb6+vlOPeT64Zr4b359mX1lZqfb2dq6XR0Bzu90qKyvjLrEwFhmH6cg4TEa+YYq0tLROd7R3OBxqb2+37TVo5nvg22+/1cmTJ7mTPQAAAADgrLq6o31sbKxOnDhh22vQzHfj+zfd6PhYOs7MI5A5nU5lZmZyUxkYi4zDdGQcJiPfMEVXnzWflpZm6/3XaOa7sWPHDmu5paVFISEhamhoUGRkpB+rAs6P2+1WSUkJ09dgLDIO05FxmIx8wxT9+vVTXV2dz7rU1FRt3rzZttegme/GmW8i33zzjZKTk+VwOPxYEWCPsLAwf5cA9CoyDtORcZiMfMMkXq/XWk5LS1N1dbVt+6aZ78aZ03sqKysVHh6uxMREP1YEnD+n06khQ4YwfQ3GIuMwHRmHycg3TBIXF6fjx49bj+Pj47lm/qdy5p0Gjxw5IklcL4+A197erp07d9p6J03gQkLGYToyDpORb5jk+9fNOxwOW2d608x348wfdGVlpZqamriTPQKew+FQ//79uWQExiLjMB0Zh8nIN0zS1cfT9enTx7b908x348zpPbW1tTp+/Dhn5hHwnE6nMjIymL4GY5FxmI6Mw2TkGybp6uPprrvuOtv2TzPfjTOn93i9XtXV1alfv37+KwiwQXt7u7Zt28b0NRiLjMN0ZBwmI98wSXx8vI4ePeqzbsCAAbbtn2a+G0FB3/14mpqaFBoa6udqAHsEBQVpwIABVr4B05BxmI6Mw2TkGyZxOBw+d7OXZGu2XbbtyUAdP+gjR44oKiqKa3dghKCgIC4XgdHIOExHxmEy8g3TREREqKGhQZGRkZLsbeY55NWNjuk9lZWVcrlcvLHACO3t7SoqKmL6GoxFxmE6Mg6TkW+Y5vvXzduZbZr5blx66aWSvjsz73a7aeZhhKCgIA0aNIjpazAWGYfpyDhMRr5hmu/f0b6jx7QD/0q6ERcXJ+m7M/ONjY18LB2MwLVoMB0Zh+nIOExGvmGa75+Z7+gx7XBB/SspKirS1KlTlZycLIfDoXXr1vlsf//99zV58mTFxcXJ4XCotLS00z5aWlr04IMPKi4uTuHh4Zo2bZoqKyvPqZ6OKRB1dXU6deqUYmJizmk/wIWkvb1dGzduZPoajEXGYToyDpORb5hmwIABOnLkiPXY2Gn2jY2NGjFihFasWHHW7ddee62eeuqps+5j0aJFWrt2rdasWaNPP/1Up06d0i233CK3293jejqOCHq9XjkcDm6AByMEBQUpMzOTI94wFhmH6cg4TEa+YRqXy+XTwBt7N/vc3Fzl5uaedfusWbMkSQcPHuxye11dnV555RW98cYbmjhxoiTpzTffVEpKiv79739r8uTJPaqnsbFRISEhcjgc6t+/f4++F7hQBQUFKT4+3t9lAL2GjMN0ZBwmI98wUUhIiFpaWhQaGqrGxkbb9mvUIa9du3apra1NOTk51rrk5GRlZmZq27ZtZ/2+lpYW1dfX+3xJUmlpqSorKxUaGmpdL+92u62z/O3t7T7LHo+n2+W2tjaf5Y7PHOxY9nq9nZYl+Sx7PB6f5Y6jPGdbdrvdPssd9Z5tmTGZP6bTp08rLy9PbW1txozJxN8TYzr3MTU3N1sZN2VMJv6eGNO5j6mtrU15eXlqbW01Zkwm/p4Y07mNqampSf/617/U3NxszJhM/D0xpp6NKSUlRYcPH5akLi8VP1dGNfPV1dUKCQnpdBY9ISFB1dXVZ/2+pUuXKjo62vpKSUmRJB06dEiHDx9WUlKStc89e/Zo//79kqSSkhJVVFRIknbs2GH9grZt26aqqipJ390HoKamRpK0ceNGnTx5UpKUn5+vhoYGSdL69eutN6z169ervb1dzc3NWr9+vSSpoaFB+fn5kqSTJ09q48aNkqSamhoVFRVJkqqqqqwDFocPH9aOHTskSRUVFSopKZEk7d+/X3v27JEk7du3T/v27WNM/4Nj2rRpk4YOHSqn02nMmEz8PTGmcx/T4cOHFRkZKafTacyYTPw9MaZzH5PT6VR7e7uampqMGZOJvyfGdG5jKikp0ZVXXmktmzAmE39PjKlnY4qPj9dXX31lfa9dHN6OwxsXGIfDobVr12rGjBmdth08eFDp6ekqKSnRyJEjrfVvvfWW5s6dq5aWFp/nT5o0SYMGDdKLL77Y5Wu1tLT4fE99fb1SUlKUl5eno0eP6vPPP9evfvUrJSYmWkddOv6QOhwOazkoKEhBQUFnXW5ra5PT6bSWXS6XHA6HtSx9dwTnzOXg4GB5vV5r2ePxyO12W8sej0cul+usy263W16v11ruqP1sy4yJMTEmxsSYGBNjYkyMiTExJsbEmOwb0759+7R9+3bNnTtXeXl5ys3NVV1dnaKionQ+jGrmN27cqBtvvFG1tbU+Z+dHjBihGTNm6P/+7/9+1GvX19crOjpaGzZs0O7du3X8+HEtX76cG+DBCG1tbcrPz1dOTo6Cg4P9XQ5gOzIO05FxmIx8w0RNTU1atmyZHn/8cVubeaOm2Y8aNUrBwcEqKCiw1lVVVamsrEzXXHNNj/fndDrV0NBg3QQPMIHL5dL1119vHcEETEPGYToyDpORb5goLCxMzc3Nkr7rMe1yQf0rOXXqlMrLy63HFRUVKi0tVUxMjFJTU1VbW6tDhw7pm2++kSR98cUXkqTExEQlJiYqOjpa8+fP169//WvFxsYqJiZGDz30kIYNG2bd3b4nHA6HWlpauKMmjOJwOM77KCBwISPjMB0Zh8nIN0wVFBQkt9tt60niC+rMfHFxsbKyspSVlSVJWrx4sbKysvTYY49Jkj788ENlZWVpypQpkqTbb79dWVlZPtfCP/vss5oxY4Zuu+02XXvtterbt68++uijczoCUl9fL7fbbd3JHjBBW1ubPvjgA+uunoBpyDhMR8ZhMvINUyUlJamqqkrt7e0//OQf6YK9Zt6fOq6Z//vf/66ysjLNnDlT2dnZ/i4LsIXX61Vzc7P69OnD5SMwEhmH6cg4TEa+YaqPPvpIsbGxamxsVE5ODtfM97aamhrOzMNIXIcG05FxmI6Mw2TkGyZKS0vT119/bes+aea7UVNTo7a2NiUkJPi7FMA2Z34eJ2AiMg7TkXGYjHzDVKmpqTp06JD1sXZ2oJnvRnBwsCIiIhQUxI8J5nC5XLr55ps56g1jkXGYjozDZOQbpoqOjtbJkyc1ZswY2/ZJl9qNhoYGn8+rB0zB0W6YjozDdGQcJiPfMFHHPSDsPFFMM9+N06dPc708jNPe3q78/Hz+UMJYZBymI+MwGfmGyWJjY1VTU2Pb/pi/0o2WlhalpaX5uwzAVsHBwZo+fbq/ywB6DRmH6cg4TEa+YbK0tDR98803tu2PM/PdaGxs5Mw8jOP1elVfXy8+lRKmIuMwHRmHycg3TJaamqrS0lLb9kcz343Tp08rKSnJ32UAtmpvb9eWLVuYvgZjkXGYjozDZOQbJktLS9O+ffts2x/T7LsREhIip9Pp7zIAWwUHB2vKlCn+LgPoNWQcpiPjM
Bn5hskSEhJ04sQJ2/bHmflu9OvXz98lALbzeDyqra2Vx+PxdylAryDjMB0Zh8nIN0zWcUd7u9DMd2PAgAH+LgGwndvt1s6dO+V2u/1dCtAryDhMR8ZhMvIN0/Xt29e2fdHMdyM+Pt7fJQC2Cw4O1uTJkxUcHOzvUoBeQcZhOjIOk5FvmG7s2LG27YtmvhsJCQn+LgGwncfj0bFjx5i+BmORcZiOjMNk5BumS05Otm1fNPPd6N+/v79LAGzn8XhUVlbGH0kYi4zDdGQcJiPfMJ2d2eZu9t0IDQ31dwmA7Vwul2644QZ/lwH0GjIO05FxmIx8w3Qul30tOGfmu8ERQZjI4/HoyJEj5BvGIuMwHRmHycg3TGdntmnmuxEWFubvEgDbeTweHThwgD+SMBYZh+nIOExGvmE6O3tMh9fr9dq2N0PU19crOjpadXV1ioqK8nc5AAAAAAAD2Nlrcma+GxwRhIk8Ho++/vpr8g1jkXGYjozDZOQbpmOa/U+ENxGYiGvRYDoyDtORcZiMfMN0dmabafZdYJo9AAAAAMBuTLP/iezZs8ffJQC2c7vdKi8vl9vt9ncpQK8g4zAdGYfJyDdMZ2ePSTPfjVOnTvm7BMB2Xq9XJ06cEJNyYCoyDtORcZiMfMN0dvaY9n1ivYFcLn48MI/L5dKVV17p7zKAXkPGYToyDpORb5jOzh6TM/PdYHoPTOR2u/X555+TbxiLjMN0ZBwmI98wnZ3ZppkH/gc1NTX5uwSgV5FxmI6Mw2TkG/hxmEfeDafT6e8SANs5nU5lZWX5uwyg15BxmI6Mw2TkG6azs8fkzHw3mN4DE7ndbpWVlZFvGIuMw3RkHCYj3zAd0+wBAAAAAPgfxjT7bjDNHiZyOp3KzMz0dxlAryHjMB0Zh8nIN0xnZ49JM9+Fjs+1rK+vV319vZ+rAezVMX0tMzOTA1YwEhmH6cg4TEa+YbqO/rKj5zwfNPNdOH78uCTp5z//uZ8rAQAAAACY5vjx44qOjj6vfdDMdyEmJkaSdOjQofP+AQMXmvr6eqWkpOjw4cOKiorydzmA7cg4TEfGYTLyDdPV1dUpNTXV6jnPB818F4KCvrsvYHR0NG8iMFZUVBT5htHIOExHxmEy8g3TdfSc57UPG+oAAAAAAAA/IZp5AAAAAAACDM18F0JDQ/X4448rNDTU36UAtiPfMB0Zh+nIOExGvmE6OzPu8NpxT3wAAAAAAPCT4cw8AAAAAAABhmYeAAAAAIAAQzMPAAAAAECAoZkHAAAAACDA0Mx/zwsvvKD09HT16dNHo0aN0pYtW/xdEnBOioqKNHXqVCUnJ8vhcGjdunU+271er5544gklJycrLCxM48eP12effeafYoEeWrp0qa688kpFRkYqPj5eM2bM0BdffOHzHDKOQLZy5UoNHz5cUVFRioqKUnZ2tjZs2GBtJ98wydKlS+VwOLRo0SJrHRlHIHviiSfkcDh8vhITE63tduWbZv4M77zzjhYtWqTf//73Kikp0fXXX6/c3FwdOnTI36UBPdbY2KgRI0ZoxYoVXW5ftmyZ/vznP2vFihXauXOnEhMTNWnSJDU0NPzElQI9V1hYqAULFmj79u0qKChQe3u7cnJy1NjYaD2HjCOQDRw4UE899ZSKi4tVXFysG264QdOnT7f+s0e+YYqdO3fqpZde0vDhw33Wk3EEuqFDh6qqqsr62rt3r7XNtnx7YRkzZoz3vvvu81k3ZMgQ729+8xs/VQTYQ5J37dq11mOPx+NNTEz0PvXUU9a65uZmb3R0tPfFF1/0Q4XA+Tl27JhXkrewsNDr9ZJxmKl///7el19+mXzDGA0NDd7Bgwd7CwoKvOPGjfMuXLjQ6/XyHo7A9/jjj3tHjBjR5TY7882Z+f9qbW3Vrl27lJOT47M+JydH27Zt81NVQO+oqKhQdXW1T95DQ0M1btw48o6AVFdXJ0mKiYmRRMZhFrfbrTVr1qixsVHZ2dnkG8ZYsGCBpkyZookTJ/qsJ+Mwwf79+5WcnKz09HTdfvvt+uqrryTZm2+XrRUHsJqaGrndbiUkJPisT0hIUHV1tZ+qAnpHR6a7yvvXX3/tj5KAc+b1erV48WJdd911yszMlETGYYa9e/cqOztbzc3NioiI0Nq1a/Wzn/3M+s8e+UYgW7NmjXbv3q2dO3d22sZ7OALdVVddpddff12XXnqpjh49qiVLluiaa67RZ599Zmu+aea/x+Fw+Dz2er2d1gGmIO8wwQMPPKA9e/bo008/7bSNjCOQXXbZZSotLdXJkyf13nvvafbs2SosLLS2k28EqsOHD2vhwoXKz89Xnz59zvo8Mo5AlZubay0PGzZM2dnZGjRokFavXq2rr75akj35Zpr9f8XFxcnpdHY6C3/s2LFOR02AQNdxN03yjkD34IMP6sMPP9SmTZs0cOBAaz0ZhwlCQkKUkZGh0aNHa+nSpRoxYoT++te/km8EvF27dunYsWMaNWqUXC6XXC6XCgsL9dxzz8nlclk5JuMwRXh4uIYNG6b9+/fb+h5OM/9fISEhGjVqlAoKCnzWFxQU6JprrvFTVUDvSE9PV2Jiok/eW1tbVVhYSN4RELxerx544AG9//772rhxo9LT0322k3GYyOv1qqWlhXwj4N14443au3evSktLra/Ro0frrrvuUmlpqS655BIyDqO0tLRo3759SkpKsvU9nGn2Z1i8eLFmzZql0aNHKzs7Wy+99JIOHTqk++67z9+lAT126tQplZeXW48rKipUWlqqmJgYpaamatGiRXryySc1ePBgDR48WE8++aT69u2rO++8049VAz/OggUL9NZbb+mDDz5QZGSkdXQ7OjpaYWFh1ucVk3EEqt/97nfKzc1VSkqKGhoatGbNGm3evFl5eXnkGwEvMjLSusdJh/DwcMXGxlrryTgC2UMPPaSpU6cqNTVVx44d05IlS1RfX6/Zs2fb+h5OM3+GmTNn6vjx4/rjH/+oqqoqZWZmav369UpLS/N3aUCPFRcXa8KECdbjxYsXS5Jmz56t1157TY888oiampp0//3368SJE7rqqquUn5+vyMhIf5UM/GgrV66UJI0fP95n/apVqzRnzhxJIuMIaEePHtWsWbNUVVWl6OhoDR8+XHl5eZo0aZIk8g3zkXEEssrKSt1xxx2qqanRRRddpKuvvlrbt2+3+kq78u3wer3e3hgAAAAAAADoHVwzDwAAAABAgKGZBwAAAAAgwNDMAwAAAAAQYGjmAQAAAAAIMDTzAAAAAAAEGJp5AAAAAAACDM08AAAAAAABhmYeAAAAAIAAQzMPAAACQmtrqzIyMrR161Zb9/vxxx8rKytLHo/H1v0CANCbaOYBAPCDOXPmyOFwdPoqLy/3d2kXrJdeeklpaWm69tprrXUOh0Pr1q3r9Nw5c+ZoxowZP2q/t9xyixwOh9566y2bKgUAoPfRzAMA4Cc33XSTqqqqfL7S09M7Pa+1tdUP1V14nn/+ed1zzz29su+5c+fq+eef75V9AwDQG2jmAQDwk9DQUCUmJvp8OZ1OjR8/Xg888IAWL16suLg4
TZo0SZL0n//8RzfffLMiIiKUkJCgWbNmqaamxtpfY2Oj7r77bkVERCgpKUnPPPOMxo8fr0WLFlnP6epMdr9+/fTaa69Zj48cOaKZM2eqf//+io2N1fTp03Xw4EFre8dZ7+XLlyspKUmxsbFasGCB2trarOe0tLTokUceUUpKikJDQzV48GC98sor8nq9ysjI0PLly31qKCsrU1BQkA4cONDlz2r37t0qLy/XlClTevhTlg4ePNjlLIjx48dbz5k2bZp27Nihr776qsf7BwDAH2jmAQC4AK1evVoul0tbt27V3/72N1VVVWncuHEaOXKkiouLlZeXp6NHj+q2226zvufhhx/Wpk2btHbtWuXn52vz5s3atWtXj1739OnTmjBhgiIiIlRUVKRPP/1UERERuummm3xmCGzatEkHDhzQpk2btHr1ar322ms+BwTuvvturVmzRs8995z27dunF198UREREXI4HJo3b55WrVrl87qvvvqqrr/+eg0aNKjLuoqKinTppZcqKiqqR+ORpJSUFJ/ZDyUlJYqNjdXYsWOt56SlpSk+Pl5btmzp8f4BAPAHl78LAADgf9XHH3+siIgI63Fubq7++c9/SpIyMjK0bNkya9tjjz2mK664Qk8++aS17tVXX1VKSoq+/PJLJScn65VXXtHrr79unclfvXq1Bg4c2KOa1qxZo6CgIL388styOBySpFWrVqlfv37avHmzcnJyJEn9+/fXihUr5HQ6NWTIEE2ZMkWffPKJ7r33Xn355Zd69913VVBQoIkTJ0qSLrnkEus15s6dq8cee0w7duzQmDFj1NbWpjfffFNPP/30Wes6ePCgkpOTu9x2xx13yOl0+qxraWmxzuI7nU4lJiZKkpqbmzVjxgxlZ2friSee8PmeAQMG+MxAAADgQkYzDwCAn0yYMEErV660HoeHh1vLo0eP9nnurl27tGnTJp/mv8OBAwfU1NSk1tZWZWdnW+tjYmJ02WWX9aimXbt2qby8XJGRkT7rm5ubfabADx061KeBTkpK0t69eyVJpaWlcjqdGjduXJevkZSUpClTpujVV1/VmDFj9PHHH6u5uVm//OUvz1pXU1OT+vTp0+W2Z5991jpo0OHRRx+V2+3u9Nz58+eroaFBBQUFCgrynaAYFham06dPn7UGAAAuJDTzAAD4SXh4uDIyMs667Uwej0dTp07Vn/70p07PTUpK0v79+3/UazocDnm9Xp91Z17r7vF4NGrUKP3jH//o9L0XXXSRtRwcHNxpvx0f7RYWFvaDddxzzz2aNWuWnn32Wa1atUozZ85U3759z/r8uLg462DB9yUmJnb6OUZGRurkyZM+65YsWaK8vDzt2LGj08EKSaqtrfUZIwAAFzKaeQAAAsAVV1yh9957TxdffLFcrs5/vjMyMhQcHKzt27crNTVVknTixAl9+eWXPmfIL7roIlVVVVmP9+/f73M2+oorrtA777yj+Pj4c7o+XZKGDRsmj8ejwsLCTmfMO9x8880KDw/XypUrtWHDBhUVFXW7z6ysLK1cuVJer9ea/t8T7733nv74xz9qw4YNXV6X3zHzICsrq8f7BgDAH7gBHgAAAWDBggWqra3VHXfcYd11PT8/X/PmzZPb7VZERITmz5+vhx9+WJ988onKyso0Z86cTlPJb7jhBq1YsUK7d+9WcXGx7rvvPp+z7HfddZfi4uI0ffp0bdmyRRUVFSosLNTChQtVWVn5o2q9+OKLNXv2bM2bN0/r1q1TRUWFNm/erHfffdd6jtPp1Jw5c/Tb3/5WGRkZPpcHdGXChAlqbGzUZ5991oOf2nfKysp0991369FHH9XQoUNVXV2t6upq1dbWWs/Zvn27QkNDf7AOAAAuFDTzAAAEgOTkZG3dulVut1uTJ09WZmamFi5cqOjoaKthf/rppzV27FhNmzZNEydO1HXXXadRo0b57OeZZ55RSkqKxo4dqzvvvFMPPfSQz/T2vn37qqioSKmpqbr11lt1+eWXa968eWpqaurRmfqVK1fqF7/4he6//34NGTJE9957rxobG32eM3/+fLW2tmrevHk/uL/Y2FjdeuutXU7//yHFxcU6ffq0lixZoqSkJOvr1ltvtZ7z9ttv66677up2qj8AABcSh/f7F84BAABjjB8/XiNHjtRf/vIXf5fSydatWzV+/HhVVlYqISHhB5+/d+9eTZw4scsb9J2Pb7/9VkOGDFFxcbHS09Nt2y8AAL2JM/MAAOAn1dLSovLycv3hD3/Qbbfd9qMaeem7a/GXLVtm+8fHVVRU6IUXXqCRBwAEFG6ABwAAflJvv/225s+fr5EjR+qNN97o0ffOnj3b9nrGjBmjMWPG2L5fAAB6E9PsAQAAAAAIMEyzBwAAAAAgwNDMAwAAAAAQYGjmAQAAAAAIMDTzAAAAAAAEGJp5AAAAAAACDM08AAAAAAABhmYeAAAAAIAAQzMPAAAAAECA+X/WWJfI8d2GlQAAAABJRU5ErkJggg==", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "## CODE GOES HERE\n", + "epochs_5_10.compute_psd().plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Simulate 2 interacting channels in the frequency range 15-20 Hz and verify that activity is present in this frequency range." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Using multitaper spectrum estimation with 7 DPSS windows\n", + "Averaging across epochs...\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "C:\\Users\\sangeetha\\AppData\\Local\\Temp\\ipykernel_7056\\3634693051.py:3: RuntimeWarning: Channel locations not available. 
Disabling spatial colors.\n", + " epochs_15_20.compute_psd().plot();\n", + "c:\\Users\\sangeetha\\anaconda3\\envs\\mne\\Lib\\site-packages\\mne\\viz\\utils.py:165: UserWarning: FigureCanvasAgg is non-interactive, and thus cannot be shown\n", + " (fig or plt).show(**kwargs)\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA/MAAAFpCAYAAADQnnivAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjguMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8g+/7EAAAACXBIWXMAAA9hAAAPYQGoP6dpAACn60lEQVR4nOzdeVxU1f8/8NcsLIqyCSooCgruiuCOOyqK+1JmpqmZlZVlmWZpLmVZadliaZ/SMtPMTJQUFRQVFRVREFFQNlkERJaBYZlh7p37+8PfzFdikeXCDIf38/GYh9dZ7n2/uW/ucO499xyJIAgCCCGEEEIIIYQQ0mhIDR0AIYQQQgghhBBCaoYa84QQQgghhBBCSCNDjXlCCCGEEEIIIaSRocY8IYQQQgghhBDSyFBjnhBCCCGEEEIIaWSoMU8IIYQQQgghhDQy1JgnhBBCCCGEEEIaGWrME0IIIYQQQgghjQw15gkhhBBCCCGEkEaGGvOEEEIIwW+//QaJRFLp49y5cwAAZ2fnSt8zatSocuuNiorC4sWL0blzZzRr1gzNmjWDm5sbXn31VYSHhzdskoQQQghD5IYOgBBCCCHG49dff0W3bt3KPd+jRw/98tChQ7F169Zy77G0tCzz/59++glvvvkmunbtirfffhs9e/aERCJBTEwM/vzzTwwYMADx8fHo3Lmz+IkQQgghjKPGPCGEEEL0evXqhf79+1f5HmtrawwePLjK91y6dAmvv/46Jk2ahEOHDsHU1FT/mre3N9544w38/fffaNasmShxE0IIIU0NNeYJIYQQIrrPPvsMMpkMP/30U5mG/JOeffbZBo6KEEIIYQc15gkhhBCix/M8OI4r85xEIoFMJtP/XxCEcu8BAJlMBolEAp7ncfbsWfTv3x8ODg71HjMhhBDSFNEAeIQQQgjRGzx4MExMTMo8zMzMyrwnICCg3HtMTEzw6aefAgCys7NRUlKCjh07llu/7mSB7iEIQoPkRQghhLCGrswTQgghRO/3339H9+7dyzwnkUjK/H/YsGHYtm1buc+2a9fuqevv168fbt68qf//li1b8N5779UyWkIIIaTposY8IYQQQvS6d+/+1AHwrKysqnyPnZ0dmjVrhuTk5HKv7d+/H8XFxcjIyMDUqVPrHC8hhBDSVFFjnhBCCCGikslk8Pb2RmBgIDIyMsrcN6+b4u7+/fsGio4QQghhA90zTwghhBDRffDBB+B5Hq+99ho0Go2hwyGEEEKYQ1fmCSGEEKIXHR1d4Uj1nTt3hr29PQBAoVDgypUr5d5jZmYGDw8PAMDQoUPxww8/YNmyZfD09MQrr7yCnj17QiqVIiMjA//88w8AwNLSsh6zIYQQQtglEWgYWUIIIaTJ++2337Bo0aJKX//555/x8ssvw9nZucJ74YHHA+ClpaWVee7mzZv49ttvce7cOaSnp0MikaB9+/bw8vLCggUL4O3tLWoehBBCSFNBjXlCCCGEEEIIIaSRoXvmCSGEEEIIIYSQRoYa84QQQgghhBBCSCNDjXlCCCGEEEIIIaSRocY8IYQQQgghhBDSyFBjnhBCCCGEEEIIaWSoMU8IIYQQQgghhDQyckMHYIy0Wi3S09PRsmVLSCQSQ4dDCCGEEEIIIYQBgiBAqVTC0dERUmkdr60LRuT8+fPC5MmTBQcHBwGA4Ofnp3+ttLRUWLVqldCrVy+hefPmgoODgzB//nzhwYMH+vfk5OQIb775ptClSxehWbNmgpOTk7Bs2TJBoVDUKI7U1FQBAD3oQQ960IMe9KAHPehBD3rQgx6iP1JTU+vcfjaqK/NFRUVwd3fHokWLMGvWrDKvFRcX48aNG/joo4/g7u6OvLw8LF++HFOnTkV4eDgAID09Henp6di6dSt69OiB5ORkvPbaa0hPT8ehQ4eqHUfLli0BAElJSbC1tRUvQUKMAMdxuH79Ovr16we53KgOAYSIgmqcsI5qnLCM6puwLjc3Fy4uLvo2Z11IBEEQRIhJdBKJBH5+fpg+fXql77l27RoGDhyI5ORkdOjQocL3/P3335g3bx6KioqqfUAoKCiAlZUV8vPzYWlpWZvwCSGEEEIIIYSQMsRsazbqAfDy8/MhkUhgbW1d5XssLS2rbMir1WoUFBSUeQBAZGQkAIDnefA8X26Z47gyy1qttspljUZTZll3HkW3LAhCuWUAZZa1Wm2ZZY7jqlzmeb7MckV5UE5NKyeVSoW4uDjwPM9MTizuJ8qp9jmVlpbi3r17+rhZyInF/UQ51T4nnudx9+5d/bZYyInF/UQ51S4ntVqN+Ph4lJaWMpMTi/uJcqp9ThERERBLo23Mq1QqrF69GnPnzq30jEZOTg4++eQTvPrqq1Wua/PmzbCystI/nJycAAC3b98GAMTExCAmJgYAEBUVhbi4OACPd0RSUhIAICwsDKmpqQCA0NBQZGRkAABCQkKQnZ0NAAgODoZCoQAABAYGQqlUAgACAgKgUqnAcRwCAgLAcRxUKhUCAgIAAEqlEoGBgQAAhUKB4OBgAEB2djZCQkIAABkZGQgNDQUApKamIiwsDMDjWwV0BRMXF4eoqCjKqYnndObMGTx69AiCIDCTE4v7iXKqfU73799HUlISBEFgJicW9xPlVPucBEHA3bt3mcqJxf1EOdUup/DwcOTl5eH+/fvM5MTifqKcap+TbptiaJTd7DUaDZ599lmkpKTg3LlzFTbmCwoK4OPjAxsbG/j7+8PExKTSbanVaqjV6jKfdXJywsmTJzF+/Hj9mRaZTFZmmeM4SCQS/bJUKoVUKq10WaPRQCaT6ZflcjkkEol+GXh81ubJZRMTEwiCoF/WnZHXLWu1Wsjl8kqXeZ6HIAj65YryoJwoJ8qJcqKcKCfKiXKinCgnyolyopzqP6eTJ0/C19dXlG72ja4xr9FoMHv2bCQmJiI4OBitWrUq91mlUonx48ejefPmOHbsGMzNzWu0bd19DLrGPCEs4XkecXFxcHNzg0wmM3Q4hIiOapywjmqcsIzqm7Du1KlTmDBhgiiN+UY1RKSuIR8XF4ezZ89W2JAvKCjA+PHjYWZmBn9//xo35AlpCkpKSgwdAiH1imqcsI5qnLCM6puQ6jGqxnxhYSHi4+P1/09KSkJkZCRsbW3h6OiIZ555Bjdu3MCxY8fA8zwyMzMBALa2tjA1NYVSqYSPjw+Ki4vxxx9/lBnMzt7evsZn9+hsIGGRTCaDh4eHocMgpN5QjRPWU
Y0TllF9E9aJ2cY0qgHwwsPD4eHhof8Ffvfdd+Hh4YF169YhLS0N/v7+SEtLQ9++feHg4KB/6AYRuH79Oq5evYpbt27B1dW1zHt0AxHUhO5eCEJYwvM8oqOjqb4Js6jGCeuoxgnLqL4J68SsbaO6Mj9q1ChUdQv/027vf9rnCSGEEEIIIYQQFhhVY97YUDd7wiKZTIZevXoZOgzCCN0JVIlEYuBI/g/VOGEd1ThhGdU3YR2z3eyNDXXvISzieR4RERFU36RO0tLSsGfPHrz33nt4//339fO+GgOqccI6qnHCMqpvwjpmu9kbG0dHR0OHQEi9aNasmaFDII3Qo0ePcPbsWYSHh6Nt27YYM2YMXnzxRSQkJGDDhg1Ys2YNbG1tDR0mAKpxwj6qccIyqm/CMjHbmEY7z7wh6eaZF2PuP0IIacwKCgpw/vx5hIaGwtLSEt7e3ujXrx/k8rLnglNSUvDNN99g9erVaN26tYGiJYQQQggxbmK2NakxXwHdDzgnJ8dorjIRIhaO4xAREQEPD49yDTJCgMf3wV+7dg0nTpwAAIwcORJDhgyBmZlZlZ/LyMjAli1b8O6776J9+/YNEWqFqMYJ66jGCcuovgnrcnNz0apVK1Ea8/QbUgVjGtCJELFIJBLY2NhQfZNyFAoFAgICEBoaChMTE/A8j+bNm+PKlStQKBTo1q0bOnXqBFNT0wo/7+DggDVr1uDTTz/Fm2++iU6dOjVwBo9RjRPWUY0TllF9E9aJWdt0Zb4Cuivzubm5sLGxMXQ4hBBSbwRBQFRUFP7991+kpaVBIpHAzs4O48aNg5eXF+RyOfLy8nD37l3ExsYiISEBGo0GrVu3Rrdu3dCtWzd07NixzMis+fn52LRpExYvXoxu3boZMDtCCCGEEOOSl5cHW1tb6mZfX3SN+RMnTmDChAmGDocQUXEch7CwMAwcOJC6rzVhSqUSgYGBOH36NDiOg0wmg5eXFyZMmPDUe94FQcCjR48QGxurb+CvXbsWLVu21L+nqKgIn3zyCebMmYO+ffvWczZlUY0T1lGNE5ZRfRPWnTx5Er6+vtTNvr5JpTRzH2GPVCpFu3btqL6bIEEQEBkZCX9/f8THx0Mmk6FTp06YNGkSPD09q93tSyKRoHXr1mjdujVGjBiBuLg4bNy4ER9++KF+nBELCwts3LgRmzZtglqtxqBBg+oztTKoxgnrqMYJy6i+CevErG26Ml8B3ZX5oKAgjB071tDhEEJInTx69Ah//fUXAgMDAQBubm4YOXIkRo0aJdqMHbrR7N9//320adNG/7xGo8HmzZsxYsQIjBo1SpRtEUIIIYQ0VqdPn8a4ceNEuTJPp7yqwHGcoUMgRHQcxyEkJITqm3EajQZ79+7FlClTMHPmTMTFxWHVqlU4ePAgvvrqK0ydOlXUqTc7dOiA999/H1988QVSU1P1z5uYmGDNmjWIjIzE77//joY4f0w1TlhHNU5YRvVNWCdmbVNjvgrUvYewSCqVonPnzlTfjCosLMRzzz2HoUOH4urVq9iwYQNCQkLw7bffYtiwYTA3N6+3bbdp0wZr1qzBN998g/j4eP3zMpkMy5cvh5WVFT799FOoVKp6iwGgGifsoxonLKP6JqwTs7bpt6QKdBAhLKJ70diVn5+PiRMnYty4cQgLC8P27dvRr1+/Bp3ep1WrVtiwYQN++ukn3L59u8xr06ZNw+TJk7F27Vo8fPiw3mKgGiesoxonLKP6JqyjxnwDoe49hEUcxyE4OJjqmzHp6emYPn06Fi1ahJdfftmgsbRs2RIbN27Evn37cP369TKv9e3bFytWrMCXX36JqKioetk+1ThhHdU4YRnVN2EddbNvIHRGkLBIKpWiV69eVN8MiY2Nxcsvv4w5c+Zg0aJFhg4HANC8eXOsX78e/v7+uHjxYpnXHBwcsGnTJhw5cgTHjh0TfdtU44R1VOOEZVTfhHV0Zb6B0EGEsEgqlaJ169ZU34wIDQ3F+vXrMXbsWLzyyiuGDqcMMzMzfPTRRzh//rx+JH2dZs2a4aOPPkJ2dja+++47cc9SU40TxlGNE5ZRfRPWUWO+gVD3HsIijUaDU6dOQaPRGDoUUkdHjx7FH3/8gV69euGdd95p0Hvjq0sul+ODDz5AdHQ0jhw5UuY1iUSChQsXom/fvli3bh3y8/NF2SbVOGEd1ThhGdU3YZ2YbUyaZ74CunnmU1JS4OTkZOhwCBGVVquFQqGAtbU1nfVupARBwP/+9z88evQIxcXF2LhxI0xMTAwdVpV0MVtYWOCFF14od+IhISEBP/zwAzZt2oTmzZvXaVtU44R1VOOEZVTfhHWpqano0KEDzTNf36ysrAwdAiGik0qlsLW1pS/IRqq0tBSfffYZzMzMkJ+fj7Vr1xp9Qx54fBX+lVdegUQiwU8//VRuvvnOnTtj7ty52Lt3b523RTVOWEc1TlhG9U1YJ2Ybk35LqkDdewiLNBoNjh8/TvXdCKlUKqxbtw4DBw5EVFQUPvzwwzpfxW5IEokEL7zwAhwcHPD1119Dq9WWeb1///54+PAhUlNT67QdqnHCOqpxwjKqb8I6MWubutlXQNfNXqFQ0NV5whxBEKBUKtGyZUujvMeaVG7Xrl1wdnZGQEAA3n33XbRr187QIdXa6dOnceXKFaxevRpyuVz/fGZmJn788Uds3Lix1vVJNU5YRzVOWEb1TViXn58Pa2tr6mZf3/Ly8gwdAiGik0gksLS0pC/IRiYzMxOJiYk4ffo0XnnllUbdkAeAsWPHwtvbGxs3boRardY/37ZtW7i4uODy5cu1XjfVOGEd1ThhGdU3YZ2YbUxqzFchNjbW0CEQIjqNRoOjR49S97VG5pdffoGpqSmmTJmCrl27GjocUXh5eWHmzJlYv349ioqK9M/PnTsXhw4dQmlpaa3WSzVOWEc1TlhG9U1YJ2YbkxrzVZDJZIYOgRDRyeVy+Pj4lOnaTIzbzZs3kZOTA3t7e3h5eRk6HFF5eHhg4cKF2LBhAxQKBYDH89NPnz4dBw8erNU6qcYJ66jGCcuovgnrxGxjUmOekCaIviAbD61Wi507d0IQBLzyyiuGDqdedOvWDW+88QY+/vhj/dyrw4cPR0xMDLKysmq1TqpxwjqqccIyqm9Cqoca81Xged7QIRAiOo7jEBAQoG80EeP277//4uHDh3j//feZ/uPG2dkZ3t7eCA4OBvD4nsklS5bg559/rvG6qMYJ66jGCcuovgnrxGxjGlVjPiQkBFOmTIGjoyMkEgmOHDmif02j0eD9999H7969YWFhAUdHR7z44otIT08vsw61Wo1ly5bBzs4OFhYWmDp1KtLS0moVD3WzJyySy+WYOHEi0w1DVhQWFmL79u1YtmwZHBwcDB1OvRs3bhwCAwP1c9A7OzvD2toakZGRNVoP1ThhHdU4YRnVN2Eds93si4qK4O7uju3bt5d7rbi4GDdu3MBHH32EGzdu4PDhw7h37x6mTp1a5n3Lly+Hn58fDhw4gIsXL6KwsBCTJ0+mq+yEPIHOdjcOGzduRN++fTF69GhDh9IgzMzM
9vi/fee0/PPPOMPvzwwxbvyw/OPBg0aFCbjw0AQDjwADwAACLAzJkzdezYMd199932U9fz8/M1ffp0+f1+xcTEaMaMGXr88cf1ySefaNeuXZo2bVqzqeS33nqrlixZou3bt6u0tFQPPfRQyFX2e++9V0lJSZowYYI2bdqkyspKFRYWatasWTp48OA51XrFFVdo6tSpmj59utasWaPKykpt3LhR7777rr2Px+PRtGnT9Ktf/UqZmZkhtwe0ZOTIkaqrq9Pu3bvb8F37xq5duzRlyhTNnTtX/fr1U3V1taqrq3Xs2DF7n61btyoqKup76wAAoKOgmQcAIAKkpaVp8+bN8vv9uu2229S/f3/NmjVL8fHxdsP+/PPPa/jw4brzzjs1atQo3XzzzRoyZEjIcV544QWlp6dr+PDhuueee/TYY4+FTG/v2rWrioqK1LNnT02aNEk/+tGPNH36dJ05c6ZNV+pzc3P185//XA8//LD69u2rBx54QHV1dSH7zJgxQ42NjZo+ffr3Hi8xMVGTJk1qcfr/9yktLdXp06c1f/58paam2h+TJk2y93nnnXd07733tjrVHwCAjsRlfffGOQAAYIzs7Gxde+21+v3vfx/uUprZvHmzsrOzdfDgQSUnJ3/v/uXl5Ro1alSLD+i7EP/973/Vt29flZaWKiMjw7HjAgDQnrgyDwAALqqGhgZVVFTod7/7ne66665zauSlb+7FX7RokeNvH1dZWalXX32VRh4AEFF4AB4AALio3nnnHc2YMUPXXnut/vSnP7Xpc6dOnep4PcOGDdOwYcMcPy4AAO2JafYAAAAAAEQYptkDAAAAABBhaOYBAAAAAIgwNPMAAAAAAEQYmnkAAAAAACIMzTwAAAAAABGGZh4AAAAAgAhDMw8AAAAAQIShmQcAAAAAIML8D+NtMgBlXdWdAAAAAElFTkSuQmCC", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "## CODE GOES HERE\n", + "epochs_15_20 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(15, 20))\n", + "epochs_15_20.compute_psd().plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Simulate 2 interacting channels in the frequency range 25-30 Hz and verify that activity is present in this frequency range." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs_25_30 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(25, 30))\n", + "epochs_25_30.compute_psd().plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, we combine all three connectivity simulations into a single `Epochs` object using the [`add_channels()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.add_channels) method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Combine epochs into a single object\n", + "epochs = epochs_5_10.copy().add_channels([epochs_15_20, epochs_25_30])\n", + "epochs.compute_psd().plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - The `spectral_connectivity_epochs()` function\n", + "\n", + "We will use the function [`mne_connectivity.spectral_connectivity_epochs()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_epochs.html#mne_connectivity.spectral_connectivity_epochs) to compute connectivity between the simulated signals.\n", + "\n", + "By default, connectivity is computed between all channels using the coherence method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute coherence between all channels\n", + "connectivity = mne_connectivity.spectral_connectivity_epochs(data=epochs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Formatting of connectivity results\n", + "\n", + "As you can see from the logging output, without specifying which channels to compute connectivity between, 15 connections are computed. This corresponds to the number of unique combinations of our channels (`channels * (channels - 1) / 2`), e.g.:
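 for the 6 simulated channels used here, `6 * 5 / 2 = 15`.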
\n", + "\"Connectivity\n", + "\n", + "However, such all-to-all connectivity results can also be represented as a dense array of shape `(channels, channels, frequencies)`, where the lower triangular elements are filled with the connectivity results, and all other elements are zeros, e.g.:
\n", + "\"Connectivity\n", + "\n", + "\n", + "This makes indexing the results easier, and is what MNE returns when no channels are specified. Hence, 36 connections are actually returned ($6^2$, with only the lower triangular elements filled).\n", + "\n", + "For an undirected connectivity measure like coherence, the upper triangular elements are identical to the lower triangular elements, so do not need to be computed.\n", + "\n", + "Additionally, the diagonal elements represent the connectivity of each channel with itself and are always 1, so do not need to be computed.\n", + "\n", + "These elements which are not computed are simply set to 0." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Extracting connectivity results\n", + "\n", + "The results are returned as a [`SpectralConnectivity`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.SpectralConnectivity.html#mne_connectivity.SpectralConnectivity) object, and the array of results can be accessed with the [`get_data()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.SpectralConnectivity.html#mne_connectivity.SpectralConnectivity.get_data) method." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By default, `get_data()` returns the results as a raveled array with shape `(connections, frequencies)`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Get compact results (default)\n", + "print(f\"Results shape (default): {connectivity.get_data().shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This behaviour can also be specified using the `output` parameter of `get_data()`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Get compact results (explicit)\n", + "print(f\"Results shape (`output='compact'`): {connectivity.get_data(output='compact').shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also specify `get_data()` to return the results in the dense array form with shape `(channels, channels, frequencies)` by setting `output=\"dense\"`.\n", + "\n", + "This will be easier to index for the following exercises." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Get dense results\n", + "print(f\"Results shape (`output = 'dense'`): {connectivity.get_data(output='dense').shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The frequencies corresponding to the connectivity results are stored as a list and can be accessed under the `freqs` attribute." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Inspect frequencies of the results\n", + "print(f\"Range of frequencies: {connectivity.freqs[0]} - {connectivity.freqs[-1]} Hz\")\n", + "print(f\"Number of frequencies: {len(connectivity.freqs)}\")\n", + "print(f\"Frequencies: {connectivity.freqs}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Below, we extract the connectivity information for the 5-10 Hz interaction between channel 0 and channel 1.\n", + "\n", + "*Hint:* Because the connectivity results are stored in the lower-triangular elements, the positions of the seed and target channels must be switched. See also:
\n", + "\"Highlighted\n", + "\n", + "Plotting the results shows that these are the connectivity results for the 5-10 Hz interaction.\n", + "\n", + "We use the custom helper function `plot_connectivity()` to visualise the results, passing in the connectivity results for a single connection as an array and the corresponding frequencies of the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Plot connectivity for the 5-10 Hz interaction\n", + "conn_5_10 = connectivity.get_data(\"dense\")[1, 0] # (target index, seed index)\n", + "plot_connectivity(conn_5_10, connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - extracting connectivity results**\n", + "\n", + "**Exercise:** Extract and plot the connectivity information for the 15-20 Hz interaction.\n", + "\n", + "*Hint:* This involved channels 2 and 3." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "conn_15_20 = connectivity.get_data(\"dense\")[3, 2]\n", + "plot_connectivity(conn_15_20, connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Extract and plot the connectivity information for the 25-30 Hz interaction." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "conn_25_30 = connectivity.get_data(\"dense\")[5, 4]\n", + "plot_connectivity(conn_25_30, connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 3 - The `indices` parameter\n", + "\n", + "Of course, it is inefficient to compute all combinations of connections if we are only interested in a few of them. This is where the `indices` parameter is needed.\n", + "\n", + "Connection indices in MNE take the form of a tuple of two array-likes, specifying the seed and target channels, respectively, e.g. `indices = (seeds, targets)`. Connectivity will be computed for only these particular combinations.\n", + "\n", + "Here, we specify that connectivity should only be computed between channels 0 and 1 (the 5-10 Hz interaction) with `indices = ([0], [1])`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute coherence between 5-10 Hz interacting channels\n", + "connectivity_5_10 = mne_connectivity.spectral_connectivity_epochs(data=epochs, indices=([0], [1]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From the logging output, we can see that only one connection is computed.\n", + "\n", + "Accordingly, when we call `get_data()`, there is only one connection to index, and plotting the results show us we have indeed selected the 5-10 Hz interaction." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Plot connectivity for the 5-10 Hz interaction\n", + "print(f\"Connectivity results shape: {connectivity_5_10.get_data().shape} (connections x frequencies)\")\n", + "plot_connectivity(connectivity_5_10.get_data()[0], connectivity_5_10.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Specifying the indices**\n", + "\n", + "**Exercise:** Compute and plot connectivity for only the 15-20 Hz interacting channels." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "connectivity_15_20 = mne_connectivity.spectral_connectivity_epochs(data=epochs, indices=([2], [3]))\n", + "print(connectivity_15_20.get_data().shape)\n", + "plot_connectivity(connectivity_15_20.get_data()[0], connectivity_15_20.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute and plot connectivity for only the 25-30 Hz interacting channels." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "connectivity_25_30 = mne_connectivity.spectral_connectivity_epochs(data=epochs, indices=([4], [5]))\n", + "print(connectivity_25_30.get_data().shape)\n", + "plot_connectivity(connectivity_25_30.get_data()[0], connectivity_25_30.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Alternatively, we can specify multiple connections at once by providing multiple seeds and targets.\n", + "\n", + "**Exercise:** Compute connectivity for the 5-10 and 15-20 Hz interacting channels (i.e. 2 connections) in a single call to `spectral_connectivity_epochs()`, and plot the connectivity values." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "connectivity = mne_connectivity.spectral_connectivity_epochs(data=epochs, indices=([0, 2], [1, 3]))\n", + "print(connectivity.get_data().shape)\n", + "for con_i in range(connectivity.get_data().shape[0]):\n", + " plot_connectivity(connectivity.get_data()[con_i], connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute connectivity for the 5-10, 15-20, and 25-30 Hz interacting channels (i.e. 3 connections) in a single call to `spectral_connectivity_epochs()`, and plot the connectivity values." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "connectivity = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, indices=([0, 2, 4], [1, 3, 5])\n", + ")\n", + "print(connectivity.get_data().shape)\n", + "for con_i in range(connectivity.get_data().shape[0]):\n", + " plot_connectivity(connectivity.get_data()[con_i], connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "By specifying the `indices` parameter, we can be much more efficient by only computing the connections we are interested in." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 4 - The `method` parameter\n", + "\n", + "Until now, we have been using the default connectivity method - `\"coh\"` (coherence). 
However, several such methods are available depending on the sort of connectivity you are interested in, specified using the `method` parameter.\n", + "\n", + "The available connectivity methods are:\n", + "- coherence - `coh`\n", + "- coherency - `cohy`\n", + "- imaginary part of coherency - [`imcoh`](https://doi.org/10.1016/j.clinph.2004.04.029)\n", + "- phase-locking value - [`plv`](https://doi.org/10.1002/(SICI)1097-0193(1999)8:4%3C194::AID-HBM4%3E3.0.CO;2-C)\n", + "- corrected imaginary phase-locking value - [`ciplv`](https://doi.org/10.1088/1741-2552/aacfe4)\n", + "- pairwise phase consistency - [`ppc`](https://doi.org/10.1016/j.neuroimage.2010.01.073)\n", + "- phase lag index - [`pli`](https://doi.org/10.1002/hbm.20346)\n", + "- unbiased squared phase lag index - [`pli2_unbiased`](https://doi.org/10.1016/j.neuroimage.2011.01.055)\n", + "- directed phase lag index - [`dpli`](https://doi.org/10.1016/j.neuroimage.2012.05.050)\n", + "- weighted phase lag index - [`wpli`](https://doi.org/10.1016/j.neuroimage.2011.01.055)\n", + "- debiased squared weighted phase lag index [`wpli2_debiased`](https://doi.org/10.1016/j.neuroimage.2011.01.055)\n", + "\n", + "References and relevant equations are given in the [documentation](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_epochs.html#mne_connectivity.spectral_connectivity_epochs).\n", + "\n", + "The specifics of each method are not relevant here, but as you can see, there are many tools available in MNE-Connectivity suited to various signal analysis problems." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Specifying the connectivity method \n", + "\n", + "`method` can be specified as a string for a single method, or a list of strings for multiple methods.\n", + "\n", + "To start off, we can explicitly pass coherence (`\"coh\"`) as the desired method.\n", + "\n", + "Using the `method` attribute of the `SpectralConnectivity` object, we can verify that coherence has been computed.\n", + "\n", + "Furthermore, we can plot the results for the 15-20 Hz interaction and verify that they match the results above." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Compute coherence (explicit)\n", + "coh = mne_connectivity.spectral_connectivity_epochs(data=epochs, method=\"coh\")\n", + "coh_15_20 = coh.get_data(\"dense\")[3, 2]\n", + "\n", + "plot_connectivity(coh_15_20, connectivity.freqs)\n", + "print(f\"The computed method is: {coh.method}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Specifying the connectivity method**\n", + "\n", + "**Exercise:** Compute connectivity using the imaginary part of coherency (`\"imcoh\"`) and plot the results for the 15-20 Hz interaction.\n", + "\n", + "How do the results compare for coherence and the imaginary part of coherency?" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "imcoh = mne_connectivity.spectral_connectivity_epochs(data=epochs, method=\"imcoh\")\n", + "imcoh_15_20 = imcoh.get_data(\"dense\")[3, 2]\n", + "plot_connectivity(imcoh_15_20, connectivity.freqs)\n", + "print(f\"The computed method is: {imcoh.method}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute connectivity using coherence and the imaginary part of coherency in the same function call.\n", + "\n", + "What do you notice about the output of the function?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "con_methods = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=[\"coh\", \"imcoh\"]\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Try to access the results for each of the connectivity methods computed above.\n", + "\n", + "What do you notice about the order in which the results for each method is returned?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "for con_method in con_methods:\n", + " print(con_method.method)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "As you can see, frequency-resolved connectivity analyses in MNE can easily be performed using the MNE-Connectivity package.\n", + "\n", + "A wide range of connectivity methods are supported, which either come in the form of the more traditional connectivity-over-epochs discussed here (`spectral_connectivity_epochs()`), or an alternative connectivity-over-time ([`spectral_connectivity_time()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_time.html#mne_connectivity.spectral_connectivity_time)).\n", + "\n", + "In addition, MNE-Connectivity has functions for computing vector autoregressive models ([`vector_auto_regression()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.vector_auto_regression.html)), as well as [visualisation tools](https://mne.tools/mne-connectivity/stable/api.html#visualization-functions).\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on computing all-to-all connectivity in sensor space: https://mne.tools/mne-connectivity/stable/auto_examples/sensor_connectivity.html\n", + "\n", + "MNE tutorial on computing connectivity in source space: https://mne.tools/mne-connectivity/stable/auto_examples/mne_inverse_coherence_epochs.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day3/3 - Connectivity 2.ipynb b/workshops/mne_course/Day3/3 - Connectivity 2.ipynb new file mode 100644 index 0000000..2169bb1 --- /dev/null +++ b/workshops/mne_course/Day3/3 - Connectivity 2.ipynb @@ -0,0 +1,970 @@ +{ + "cells": [ + { + "cell_type": "code", + "execution_count": 2, + 
"metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Collecting mne_connectivity\n", + " Downloading mne_connectivity-0.6.0-py3-none-any.whl.metadata (10 kB)\n", + "Requirement already satisfied: mne>=1.6 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne_connectivity) (1.6.1)\n", + "Collecting netCDF4>=1.6.5 (from mne_connectivity)\n", + " Downloading netCDF4-1.6.5-cp311-cp311-win_amd64.whl.metadata (1.8 kB)\n", + "Requirement already satisfied: numpy>=1.21 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne_connectivity) (1.26.4)\n", + "Requirement already satisfied: pandas>=1.3.2 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne_connectivity) (2.2.0)\n", + "Requirement already satisfied: scipy>=1.4.0 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne_connectivity) (1.12.0)\n", + "Requirement already satisfied: tqdm in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne_connectivity) (4.66.1)\n", + "Collecting xarray>=2023.11.0 (from mne_connectivity)\n", + " Downloading xarray-2024.1.1-py3-none-any.whl.metadata (11 kB)\n", + "Requirement already satisfied: matplotlib>=3.5.0 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (3.8.2)\n", + "Requirement already satisfied: pooch>=1.5 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (1.8.0)\n", + "Requirement already satisfied: decorator in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (5.1.1)\n", + "Requirement already satisfied: packaging in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (23.2)\n", + "Requirement already satisfied: jinja2 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (3.1.3)\n", + "Requirement already satisfied: lazy-loader>=0.3 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from mne>=1.6->mne_connectivity) (0.3)\n", + "Collecting cftime (from netCDF4>=1.6.5->mne_connectivity)\n", + " Downloading cftime-1.6.3-cp311-cp311-win_amd64.whl.metadata (8.8 kB)\n", + "Requirement already satisfied: certifi in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from netCDF4>=1.6.5->mne_connectivity) (2024.2.2)\n", + "Requirement already satisfied: python-dateutil>=2.8.2 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from pandas>=1.3.2->mne_connectivity) (2.8.2)\n", + "Requirement already satisfied: pytz>=2020.1 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from pandas>=1.3.2->mne_connectivity) (2024.1)\n", + "Requirement already satisfied: tzdata>=2022.7 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from pandas>=1.3.2->mne_connectivity) (2023.4)\n", + "Requirement already satisfied: colorama in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from tqdm->mne_connectivity) (0.4.6)\n", + "Requirement already satisfied: contourpy>=1.0.1 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (1.2.0)\n", + "Requirement already satisfied: cycler>=0.10 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (0.12.1)\n", + "Requirement already satisfied: fonttools>=4.22.0 in 
c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (4.48.1)\n", + "Requirement already satisfied: kiwisolver>=1.3.1 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (1.4.5)\n", + "Requirement already satisfied: pillow>=8 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (10.2.0)\n", + "Requirement already satisfied: pyparsing>=2.3.1 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from matplotlib>=3.5.0->mne>=1.6->mne_connectivity) (3.1.1)\n", + "Requirement already satisfied: platformdirs>=2.5.0 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from pooch>=1.5->mne>=1.6->mne_connectivity) (4.2.0)\n", + "Requirement already satisfied: requests>=2.19.0 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from pooch>=1.5->mne>=1.6->mne_connectivity) (2.31.0)\n", + "Requirement already satisfied: six>=1.5 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from python-dateutil>=2.8.2->pandas>=1.3.2->mne_connectivity) (1.16.0)\n", + "Requirement already satisfied: MarkupSafe>=2.0 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from jinja2->mne>=1.6->mne_connectivity) (2.1.5)\n", + "Requirement already satisfied: charset-normalizer<4,>=2 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from requests>=2.19.0->pooch>=1.5->mne>=1.6->mne_connectivity) (3.3.2)\n", + "Requirement already satisfied: idna<4,>=2.5 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from requests>=2.19.0->pooch>=1.5->mne>=1.6->mne_connectivity) (3.6)\n", + "Requirement already satisfied: urllib3<3,>=1.21.1 in c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages (from requests>=2.19.0->pooch>=1.5->mne>=1.6->mne_connectivity) (2.2.0)\n", + "Downloading mne_connectivity-0.6.0-py3-none-any.whl (107 kB)\n", + " ---------------------------------------- 0.0/107.2 kB ? eta -:--:--\n", + " -------------------------------------- - 102.4/107.2 kB 3.0 MB/s eta 0:00:01\n", + " ---------------------------------------- 107.2/107.2 kB 2.1 MB/s eta 0:00:00\n", + "Downloading netCDF4-1.6.5-cp311-cp311-win_amd64.whl (6.6 MB)\n", + " ---------------------------------------- 0.0/6.6 MB ? 
eta -:--:--\n", + " - -------------------------------------- 0.3/6.6 MB 9.2 MB/s eta 0:00:01\n", + " ---- ----------------------------------- 0.8/6.6 MB 8.4 MB/s eta 0:00:01\n", + " ------- -------------------------------- 1.2/6.6 MB 8.5 MB/s eta 0:00:01\n", + " --------- ------------------------------ 1.6/6.6 MB 8.5 MB/s eta 0:00:01\n", + " ------------ --------------------------- 2.0/6.6 MB 8.5 MB/s eta 0:00:01\n", + " -------------- ------------------------- 2.4/6.6 MB 8.5 MB/s eta 0:00:01\n", + " ---------------- ----------------------- 2.8/6.6 MB 8.5 MB/s eta 0:00:01\n", + " ------------------- -------------------- 3.2/6.6 MB 8.5 MB/s eta 0:00:01\n", + " ------------------- -------------------- 3.3/6.6 MB 8.7 MB/s eta 0:00:01\n", + " ---------------------- ----------------- 3.7/6.6 MB 7.9 MB/s eta 0:00:01\n", + " ------------------------ --------------- 4.1/6.6 MB 7.8 MB/s eta 0:00:01\n", + " --------------------------- ------------ 4.5/6.6 MB 8.0 MB/s eta 0:00:01\n", + " ----------------------------- ---------- 5.0/6.6 MB 8.1 MB/s eta 0:00:01\n", + " ------------------------------- -------- 5.3/6.6 MB 8.2 MB/s eta 0:00:01\n", + " ---------------------------------- ----- 5.7/6.6 MB 8.1 MB/s eta 0:00:01\n", + " ------------------------------------ --- 6.1/6.6 MB 8.3 MB/s eta 0:00:01\n", + " --------------------------------------- 6.5/6.6 MB 8.4 MB/s eta 0:00:01\n", + " --------------------------------------- 6.6/6.6 MB 8.3 MB/s eta 0:00:01\n", + " ---------------------------------------- 6.6/6.6 MB 7.9 MB/s eta 0:00:00\n", + "Downloading xarray-2024.1.1-py3-none-any.whl (1.1 MB)\n", + " ---------------------------------------- 0.0/1.1 MB ? eta -:--:--\n", + " --------------- ------------------------ 0.4/1.1 MB 8.7 MB/s eta 0:00:01\n", + " ------------------------------- -------- 0.8/1.1 MB 8.8 MB/s eta 0:00:01\n", + " ---------------------------------------- 1.1/1.1 MB 7.6 MB/s eta 0:00:00\n", + "Downloading cftime-1.6.3-cp311-cp311-win_amd64.whl (188 kB)\n", + " ---------------------------------------- 0.0/188.2 kB ? eta -:--:--\n", + " --------------------------------------- 188.2/188.2 kB 11.1 MB/s eta 0:00:00\n", + "Installing collected packages: cftime, netCDF4, xarray, mne_connectivity\n", + "Successfully installed cftime-1.6.3 mne_connectivity-0.6.0 netCDF4-1.6.5 xarray-2024.1.1\n", + "Note: you may need to restart the kernel to use updated packages.\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "WARNING: Skipping c:\\Users\\sangeetha\\anaconda3\\envs\\mne\\Lib\\site-packages\\vtk-9.2.6.egg-info due to invalid metadata entry 'name'\n", + "WARNING: Skipping c:\\Users\\sangeetha\\anaconda3\\envs\\mne\\Lib\\site-packages\\vtk-9.2.6.egg-info due to invalid metadata entry 'name'\n", + "WARNING: Skipping c:\\Users\\sangeetha\\anaconda3\\envs\\mne\\Lib\\site-packages\\vtk-9.2.6.egg-info due to invalid metadata entry 'name'\n" + ] + } + ], + "source": [ + "%pip install mne_connectivity" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "\n", + "import mne_connectivity\n", + "import numpy as np\n", + "\n", + "from _helper_functions import simulate_connectivity, plot_connectivity" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Multivariate frequency-resolved connectivity - the `mne-connectivity` package continued\n", + "\n", + "The connectivity methods covered so far have all been bivariate methods, i.e. 
connectivity from one signal to another signal.\n", + "\n", + "In contrast, multivariate connectivity methods can be used to compute connectivity between whole groups of signals simultaneously, bringing both practical and methodological benefits." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 1 - Simulating connectivity\n", + "\n", + "As before, we will use the custom helper function `simulate_connectivity()` to generate signals which we can explore multivariate connectivity computations on." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Simulating connectivity**\n", + "\n", + "**Exercise:** Simulate 2 interacting channels in the frequency ranges: 5-10 Hz; 15-20 Hz; and 25-30 Hz.\n", + "\n", + "Do this in 3 separate function calls." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs_5_10 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(5, 10))\n", + "epochs_15_20 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(15, 20))\n", + "epochs_25_30 = simulate_connectivity(n_seeds=1, n_targets=1, freq_band=(25, 30))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Combine the 3 sets of `Epochs` into a single `Epochs` object using the [`add_channels()`](https://mne.tools/stable/generated/mne.Epochs.html#mne.Epochs.add_channels) method." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs = epochs_5_10.copy().add_channels([epochs_15_20, epochs_25_30])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Verify that activity is present in the appropriate frequency ranges by computing the power spectra of the data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "epochs.compute_psd().plot();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 2 - A recap of bivariate connectivity\n", + "\n", + "Again, we will use [`mne_connectivity.spectral_connectivity_epochs()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_epochs.html#mne_connectivity.spectral_connectivity_epochs) to compute connectivity between the simulated signals.\n", + "\n", + "We first generate the results from a bivariate connectivity method to use as a comparison for the multivariate methods." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Bivariate connectivity**\n", + "\n", + "**Exercise:** Compute connectivity using the imaginary part of coherency (`imcoh` method).\n", + "\n", + "Specify the indices such that connectivity is only computed between the 3 sets of interacting channels." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[1;31mSignature:\u001b[0m\n", + "\u001b[0mmne_connectivity\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mspectral_connectivity_epochs\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mdata\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mnames\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mmethod\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;34m'coh'\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mindices\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0msfreq\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mmode\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;34m'multitaper'\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mfmin\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mfmax\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0minf\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mfskip\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mfaverage\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mFalse\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mtmin\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mtmax\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mmt_bandwidth\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mmt_adaptive\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mFalse\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mmt_low_bias\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mTrue\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mcwt_freqs\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mcwt_n_cycles\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m7\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mgc_n_lags\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m40\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mrank\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mblock_size\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m1000\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mn_jobs\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;36m1\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m \u001b[0mverbose\u001b[0m\u001b[1;33m=\u001b[0m\u001b[1;32mNone\u001b[0m\u001b[1;33m,\u001b[0m\u001b[1;33m\n", + "\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", + "\u001b[1;31mDocstring:\u001b[0m\n", + "Compute frequency- and time-frequency-domain connectivity measures.\n", + "\n", + "The connectivity method(s) are specified using the \"method\" parameter.\n", + "All methods are based on estimates of the cross- and power spectral\n", + "densities (CSD/PSD) Sxy and Sxx, Syy.\n", + "\n", + "Parameters\n", + "----------\n", + "data : array-like, shape=(n_epochs, n_signals, n_times) | Epochs\n", + " The data 
from which to compute connectivity. Note that it is also\n", + " possible to combine multiple signals by providing a list of tuples,\n", + " e.g., data = [(arr_0, stc_0), (arr_1, stc_1), (arr_2, stc_2)],\n", + " corresponds to 3 epochs, and arr_* could be an array with the same\n", + " number of time points as stc_*. The array-like object can also\n", + " be a list/generator of array, shape =(n_signals, n_times),\n", + " or a list/generator of SourceEstimate or VolSourceEstimate objects.\n", + "\n", + "names : list | np.ndarray | None\n", + " The names of the nodes of the dataset used to compute\n", + " connectivity. If 'None' (default), then names will be\n", + " a list of integers from 0 to ``n_nodes``. If a list\n", + " of names, then it must be equal in length to ``n_nodes``.\n", + "method : str | list of str\n", + " Connectivity measure(s) to compute. These can be ``['coh', 'cohy',\n", + " 'imcoh', 'mic', 'mim', 'plv', 'ciplv', 'ppc', 'pli', 'dpli', 'wpli',\n", + " 'wpli2_debiased', 'gc', 'gc_tr']``. Multivariate methods (``['mic',\n", + " 'mim', 'gc', 'gc_tr]``) cannot be called with the other methods.\n", + "indices : tuple of array | None\n", + " Two arrays with indices of connections for which to compute\n", + " connectivity. If a bivariate method is called, each array for the seeds\n", + " and targets should contain the channel indices for each bivariate\n", + " connection. If a multivariate method is called, each array for the\n", + " seeds and targets should consist of nested arrays containing\n", + " the channel indices for each multivariate connection. If ``None``,\n", + " connections between all channels are computed, unless a Granger\n", + " causality method is called, in which case an error is raised.\n", + "sfreq : float\n", + " The sampling frequency. Required if data is not\n", + " :class:`Epochs `.\n", + "mode : str\n", + " Spectrum estimation mode can be either: 'multitaper', 'fourier', or\n", + " 'cwt_morlet'.\n", + "fmin : float | tuple of float\n", + " The lower frequency of interest. Multiple bands are defined using\n", + " a tuple, e.g., (8., 20.) for two bands with 8Hz and 20Hz lower freq.\n", + "fmax : float | tuple of float\n", + " The upper frequency of interest. Multiple bands are dedined using\n", + " a tuple, e.g. (13., 30.) for two band with 13Hz and 30Hz upper freq.\n", + "fskip : int\n", + " Omit every \"(fskip + 1)-th\" frequency bin to decimate in frequency\n", + " domain.\n", + "faverage : bool\n", + " Average connectivity scores for each frequency band. If True,\n", + " the output freqs will be a list with arrays of the frequencies\n", + " that were averaged.\n", + "tmin : float | None\n", + " Time to start connectivity estimation. Note: when \"data\" is an array,\n", + " the first sample is assumed to be at time 0. For other types\n", + " (Epochs, etc.), the time information contained in the object is used\n", + " to compute the time indices.\n", + "tmax : float | None\n", + " Time to end connectivity estimation. Note: when \"data\" is an array,\n", + " the first sample is assumed to be at time 0. 
For other types\n", + " (Epochs, etc.), the time information contained in the object is used\n", + " to compute the time indices.\n", + "mt_bandwidth : float | None\n", + " The bandwidth of the multitaper windowing function in Hz.\n", + " Only used in 'multitaper' mode.\n", + "mt_adaptive : bool\n", + " Use adaptive weights to combine the tapered spectra into PSD.\n", + " Only used in 'multitaper' mode.\n", + "mt_low_bias : bool\n", + " Only use tapers with more than 90 percent spectral concentration\n", + " within bandwidth. Only used in 'multitaper' mode.\n", + "cwt_freqs : array\n", + " Array of frequencies of interest. Only used in 'cwt_morlet' mode.\n", + "cwt_n_cycles : float | array of float\n", + " Number of cycles. Fixed number or one per frequency. Only used in\n", + " 'cwt_morlet' mode.\n", + "gc_n_lags : int\n", + " Number of lags to use for the vector autoregressive model when\n", + " computing Granger causality. Higher values increase computational cost,\n", + " but reduce the degree of spectral smoothing in the results. Only used\n", + " if ``method`` contains any of ``['gc', 'gc_tr']``.\n", + "rank : tuple of array | None\n", + " Two arrays with the rank to project the seed and target data to,\n", + " respectively, using singular value decomposition. If None, the rank of\n", + " the data is computed and projected to. Only used if ``method`` contains\n", + " any of ``['mic', 'mim', 'gc', 'gc_tr']``.\n", + "block_size : int\n", + " How many connections to compute at once (higher numbers are faster\n", + " but require more memory).\n", + "n_jobs : int\n", + " How many samples to process in parallel.\n", + "\n", + "verbose : bool, str, int, or None\n", + " If not None, override default verbose level (see :func:`mne.verbose`\n", + " for more info). If used, it should be passed as a\n", + " keyword-argument only.\n", + "\n", + "Returns\n", + "-------\n", + "con : array | list of array\n", + " Computed connectivity measure(s). Either an instance of\n", + " ``SpectralConnectivity`` or ``SpectroTemporalConnectivity``.\n", + " The shape of the connectivity result will be:\n", + "\n", + " - ``(n_cons, n_freqs)`` for multitaper or fourier modes\n", + " - ``(n_cons, n_freqs, n_times)`` for cwt_morlet mode\n", + " - ``n_cons = n_signals ** 2`` for bivariate methods with\n", + " ``indices=None``\n", + " - ``n_cons = 1`` for multivariate methods with ``indices=None``\n", + " - ``n_cons = len(indices[0])`` for bivariate and multivariate methods\n", + " when indices is supplied.\n", + "\n", + "See Also\n", + "--------\n", + "mne_connectivity.spectral_connectivity_time\n", + "mne_connectivity.SpectralConnectivity\n", + "mne_connectivity.SpectroTemporalConnectivity\n", + "\n", + "Notes\n", + "-----\n", + "Please note that the interpretation of the measures in this function\n", + "depends on the data and underlying assumptions and does not necessarily\n", + "reflect a causal relationship between brain regions.\n", + "\n", + "These measures are not to be interpreted over time. Each Epoch passed into\n", + "the dataset is interpreted as an independent sample of the same\n", + "connectivity structure. Within each Epoch, it is assumed that the spectral\n", + "measure is stationary. The spectral measures implemented in this function\n", + "are computed across Epochs. 
**Thus, spectral measures computed with only\n", + "one Epoch will result in errorful values and spectral measures computed\n", + "with few Epochs will be unreliable.** Please see\n", + "``spectral_connectivity_time`` for time-resolved connectivity estimation.\n", + "\n", + "The spectral densities can be estimated using a multitaper method with\n", + "digital prolate spheroidal sequence (DPSS) windows, a discrete Fourier\n", + "transform with Hanning windows, or a continuous wavelet transform using\n", + "Morlet wavelets. The spectral estimation mode is specified using the\n", + "\"mode\" parameter.\n", + "\n", + "By default, the connectivity between all signals is computed (only\n", + "connections corresponding to the lower-triangular part of the connectivity\n", + "matrix). If one is only interested in the connectivity between some\n", + "signals, the \"indices\" parameter can be used. For example, to compute the\n", + "connectivity between the signal with index 0 and signals \"2, 3, 4\" (a total\n", + "of 3 connections) one can use the following::\n", + "\n", + " indices = (np.array([0, 0, 0]), # row indices\n", + " np.array([2, 3, 4])) # col indices\n", + "\n", + " con = spectral_connectivity_epochs(data, method='coh',\n", + " indices=indices, ...)\n", + "\n", + "In this case con.get_data().shape = (3, n_freqs). The connectivity scores\n", + "are in the same order as defined indices.\n", + "\n", + "For multivariate methods, this is handled differently. If \"indices\" is\n", + "None, connectivity between all signals will be computed and a single\n", + "connectivity spectrum will be returned (this is not possible if a Granger\n", + "causality method is called). If \"indices\" is specified, seed and target\n", + "indices for each connection should be specified as nested array-likes. For\n", + "example, to compute the connectivity between signals (0, 1) -> (2, 3) and\n", + "(0, 1) -> (4, 5), indices should be specified as::\n", + "\n", + " indices = (np.array([[0, 1], [0, 1]]), # seeds\n", + " np.array([[2, 3], [4, 5]])) # targets\n", + "\n", + "More information on working with multivariate indices and handling\n", + "connections where the number of seeds and targets are not equal can be\n", + "found in the :doc:`../auto_examples/handling_ragged_arrays` example.\n", + "\n", + "**Supported Connectivity Measures**\n", + "\n", + "The connectivity method(s) is specified using the \"method\" parameter. The\n", + "following methods are supported (note: ``E[]`` denotes average over\n", + "epochs). 
Multiple measures can be computed at once by using a list/tuple,\n", + "e.g., ``['coh', 'pli']`` to compute coherence and PLI.\n", + "\n", + " 'coh' : Coherence given by::\n", + "\n", + " | E[Sxy] |\n", + " C = ---------------------\n", + " sqrt(E[Sxx] * E[Syy])\n", + "\n", + " 'cohy' : Coherency given by::\n", + "\n", + " E[Sxy]\n", + " C = ---------------------\n", + " sqrt(E[Sxx] * E[Syy])\n", + "\n", + " 'imcoh' : Imaginary coherence :footcite:`NolteEtAl2004` given by::\n", + "\n", + " Im(E[Sxy])\n", + " C = ----------------------\n", + " sqrt(E[Sxx] * E[Syy])\n", + "\n", + " 'mic' : Maximised Imaginary part of Coherency (MIC)\n", + " :footcite:`EwaldEtAl2012` given by:\n", + "\n", + " :math:`MIC=\\Large{\\frac{\\boldsymbol{\\alpha}^T \\boldsymbol{E \\beta}}\n", + " {\\parallel\\boldsymbol{\\alpha}\\parallel \\parallel\\boldsymbol{\\beta}\n", + " \\parallel}}`\n", + "\n", + " where: :math:`\\boldsymbol{E}` is the imaginary part of the\n", + " transformed cross-spectral density between seeds and targets; and\n", + " :math:`\\boldsymbol{\\alpha}` and :math:`\\boldsymbol{\\beta}` are\n", + " eigenvectors for the seeds and targets, such that\n", + " :math:`\\boldsymbol{\\alpha}^T \\boldsymbol{E \\beta}` maximises\n", + " connectivity between the seeds and targets.\n", + "\n", + " 'mim' : Multivariate Interaction Measure (MIM)\n", + " :footcite:`EwaldEtAl2012` given by:\n", + "\n", + " :math:`MIM=tr(\\boldsymbol{EE}^T)`\n", + "\n", + " 'plv' : Phase-Locking Value (PLV) :footcite:`LachauxEtAl1999` given\n", + " by::\n", + "\n", + " PLV = |E[Sxy/|Sxy|]|\n", + "\n", + " 'ciplv' : corrected imaginary PLV (ciPLV)\n", + " :footcite:`BrunaEtAl2018` given by::\n", + "\n", + " |E[Im(Sxy/|Sxy|)]|\n", + " ciPLV = ------------------------------------\n", + " sqrt(1 - |E[real(Sxy/|Sxy|)]| ** 2)\n", + "\n", + " 'ppc' : Pairwise Phase Consistency (PPC), an unbiased estimator\n", + " of squared PLV :footcite:`VinckEtAl2010`.\n", + "\n", + " 'pli' : Phase Lag Index (PLI) :footcite:`StamEtAl2007` given by::\n", + "\n", + " PLI = |E[sign(Im(Sxy))]|\n", + "\n", + " 'pli2_unbiased' : Unbiased estimator of squared PLI\n", + " :footcite:`VinckEtAl2011`.\n", + "\n", + " 'dpli' : Directed Phase Lag Index (DPLI) :footcite:`StamEtAl2012`\n", + " given by (where H is the Heaviside function)::\n", + "\n", + " DPLI = E[H(Im(Sxy))]\n", + "\n", + " 'wpli' : Weighted Phase Lag Index (WPLI) :footcite:`VinckEtAl2011`\n", + " given by::\n", + "\n", + " |E[Im(Sxy)]|\n", + " WPLI = ------------------\n", + " E[|Im(Sxy)|]\n", + "\n", + " 'wpli2_debiased' : Debiased estimator of squared WPLI\n", + " :footcite:`VinckEtAl2011`.\n", + "\n", + " 'gc' : State-space Granger Causality (GC) :footcite:`BarnettSeth2015`\n", + " given by:\n", + "\n", + " :math:`GC = ln\\Large{(\\frac{\\lvert\\boldsymbol{S}_{tt}\\rvert}{\\lvert\n", + " \\boldsymbol{S}_{tt}-\\boldsymbol{H}_{ts}\\boldsymbol{\\Sigma}_{ss\n", + " \\lvert t}\\boldsymbol{H}_{ts}^*\\rvert}})`,\n", + "\n", + " where: :math:`s` and :math:`t` represent the seeds and targets,\n", + " respectively; :math:`\\boldsymbol{H}` is the spectral transfer\n", + " function; :math:`\\boldsymbol{\\Sigma}` is the residuals matrix of\n", + " the autoregressive model; and :math:`\\boldsymbol{S}` is\n", + " :math:`\\boldsymbol{\\Sigma}` transformed by :math:`\\boldsymbol{H}`.\n", + "\n", + " 'gc_tr' : State-space GC on time-reversed signals\n", + " :footcite:`BarnettSeth2015,WinklerEtAl2016` given by the same equation\n", + " as for 'gc', but where the autocovariance sequence from which the\n", + " 
autoregressive model is produced is transposed to mimic the reversal of\n", + " the original signal in time.\n", + "\n", + "References\n", + "----------\n", + ".. footbibliography::\n", + "\u001b[1;31mFile:\u001b[0m c:\\users\\sangeetha\\anaconda3\\envs\\mne\\lib\\site-packages\\mne_connectivity\\spectral\\epochs.py\n", + "\u001b[1;31mType:\u001b[0m function" + ] + } + ], + "source": [ + " mne_connectivity.spectral_connectivity_epochs?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "bivariate_connectivity = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"imcoh\", indices=([0, 2, 4], [1, 3, 5])\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Plot the connectivity results for each connection to verify the interaction is present.\n", + "\n", + "*Hint:* Results for the imaginary part of coherency can be positive and negative. For our purposes, you should take the absolute values of the results using NumPy's [`abs()`](https://numpy.org/doc/stable/reference/generated/numpy.absolute.html) function." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "for con_idx in range(len(bivariate_connectivity.indices[0])):\n", + " plot_connectivity(\n", + " np.abs(bivariate_connectivity.get_data()[con_idx]), bivariate_connectivity.freqs\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Summarise the bivariate connectivity results by averaging across the 3 connections.\n", + "\n", + "*Hint:* Use NumPy's [`mean()`](https://numpy.org/doc/stable/reference/generated/numpy.mean.html) function, and don't forget to take the absolute values first!" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "average_connectivity = np.mean(np.abs(bivariate_connectivity.get_data()), axis=0)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Plot the average bivariate connectivity results using the custom `plot_connectivity()` helper function as before.\n", + "\n", + "What do you notice about the scale of the values? What is the reason for this?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "plot_connectivity(average_connectivity, bivariate_connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 3 - Multivariate connectivity\n", + "\n", + "We will now examine connectivity for some multivariate methods, but before we do so, we need to consider the `indices` parameter again." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Indices for bivariate connectivity\n", + "\n", + "`indices` has the form `(seeds, targets)`, where the length of `seeds` and `targets` corresponds to the number of connections.\n", + "\n", + "For bivariate connectivity, `seeds` and `targets` are array-likes of integers, e.g.:\n", + "- `seeds=[0, 2, 4]`\n", + "- `targets=[1, 3, 5]`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Indices for multivariate connectivity\n", + "\n", + "For multivariate connectivity on the other hand, since we are computing connectivity between multiple channels, we need a way to distinguish between the channels belonging to each connection.\n", + "\n", + "Accordingly, we nest the entries for each connection as array-likes within `seeds` and `targets`.\n", + "\n", + "E.g. computing a single multivariate connection between channels 0, 2, and 4 to channels 1, 3, and 5 would require:\n", + "- `seeds=[[0, 2, 4]]`\n", + "- `targets=[[1, 3, 5]]`.\n", + "\n", + "Note how the length of `seeds` and `targets` still corresponds to the number of connections (in this case, 1).\n", + "\n", + "
\n", + "\n", + "E.g. we could compute two multivariate connections with `seeds=[[0], [2, 4]]` and `targets=[[1, 3], [5]]`.\n", + "\n", + "Again, the lengths of `seeds` and `targets` correspond to the number of connections (2), but see how we specify the channels for each connections as a separate array-like.\n", + "\n", + "You may also notice that the number of channels can differ for each connection, making these multivariate methods very flexible.\n", + "\n", + "
\n", + "\n", + "More information on the `indices` parameter for multivariate connectivity can be found here: https://mne.tools/mne-connectivity/dev/auto_examples/handling_ragged_arrays.html#sphx-glr-auto-examples-handling-ragged-arrays-py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The image below summarises how indices for bivariate and multivariate methods are handled in MNE-Connectivity:\n", + "\n", + "\"Cheat" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Multivariate connectivity methods\n", + "\n", + "Earlier we saw that MNE-Connectivity supports multiple bivariate connectivity methods.\n", + "\n", + "Several multivariate methods are also available:\n", + "- maximised imaginary part of coherency - [`mic`](https://doi.org/10.1016/j.neuroimage.2011.11.084)\n", + "- multivariate interaction measure - [`mim`](https://doi.org/10.1016/j.neuroimage.2011.11.084)\n", + "- state-space Granger causality -[`gc`](https://doi.org/10.1103/PhysRevE.91.040101)\n", + "- state-space Granger causality on time-reversed signals -[`gc_tr`](https://doi.org/10.1109/TSP.2016.2531628)\n", + "\n", + "Again, references and relevant equations are given in the [documentation](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_epochs.html#mne_connectivity.spectral_connectivity_epochs).\n", + "\n", + "As for the various bivariate methods, the different multivariate methods enable an appropriate analysis of signals in various contexts.\n", + "\n", + "What is relevant to understand for now is that the maximised imaginary part of coherency (`mic` method) is a multivariate form of the imaginary part of coherency." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - multivariate connectivity**\n", + "\n", + "**Exercise:** Compute connectivity using the maximised imaginary part of coherency (`mic` method).\n", + "\n", + "Do this for each interacting pair of channels separately (i.e. 3 connections in total)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "multivariate_connectivity_separate = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"mic\", indices=([[0], [2], [4]], [[1], [3], [5]])\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Plot the results for each connection.\n", + "\n", + "How do the pair-wise results for the multivariate method compare to the pair-wise results for the bivariate method above?\n", + "\n", + "*Hint:* We want to take the absolute values of the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "for con_idx in range(len(multivariate_connectivity_separate.indices[0])):\n", + " plot_connectivity(\n", + " np.abs(multivariate_connectivity_separate.get_data()[con_idx]),\n", + " multivariate_connectivity_separate.freqs,\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute connectivity between the same seed and target channels as before but in a single connection." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "multivariate_connectivity = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"mic\", indices=([[0, 2, 4]], [[1, 3, 5]])\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Plot the results for this single connection.\n", + "\n", + "How do the results for this single connection of the multivariate method compare to the single connection of the bivariate method which we obtained by averaging?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "plot_connectivity(np.abs(multivariate_connectivity.get_data()[0]), multivariate_connectivity.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Part 4 - Directed connectivity\n", + "\n", + "So far, the focus has been on coherency-based measures of connectivity.\n", + "\n", + "Coherency-based measures can be very powerful, but they tell us nothing about the direction of the interaction between signals (i.e. they are undirected measures of connectivity).\n", + "\n", + "In contrast, directed measures of connectivity tell us how information is flowing between seeds and targets. Granger causality is one such directed connectivity method." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Granger causality\n", + "\n", + "When we created the signals, we simulated the information flow from the seeds to the targets.\n", + "\n", + "As such, we expect Granger causality to be high from `seeds -> targets`, but low from `targets -> seeds`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercises - Directed connectivity**\n", + "\n", + "**Exercise:** Compute Granger causality (`gc` method) from the seeds to the targets as a single connection and plot the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "gc_seeds_targets = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"gc\", indices=([[0, 2, 4]], [[1, 3, 5]])\n", + ")\n", + "plot_connectivity(gc_seeds_targets.get_data()[0], gc_seeds_targets.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute Granger causality from the targets to the seeds as a single connection and plot the results.\n", + "\n", + "Are the values of the results lower for `target -> seeds` than `seeds -> targets`?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "gc_targets_seeds = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"gc\", indices=([[1, 3, 5]], [[0, 2, 4]])\n", + ")\n", + "plot_connectivity(gc_targets_seeds.get_data()[0], gc_targets_seeds.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute connectivity for both directions (i.e. `seeds -> targets` and `targets -> seeds`) in the same call to `spectral_connectivity_epochs()`.\n", + "\n", + "Plot the results to verify they match those when computed separately." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "gc = mne_connectivity.spectral_connectivity_epochs(\n", + " data=epochs, method=\"gc\", indices=([[0, 2, 4], [1, 3, 5]], [[1, 3, 5], [0, 2, 4]])\n", + ")\n", + "\n", + "for con_idx in range(2):\n", + " plot_connectivity(gc.get_data()[con_idx], gc.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Investigating bidirectional communication\n", + "\n", + "In neuroscience, we often study systems where information does not only flow in one direction, but reciprocally between brain regions.\n", + "\n", + "Accordingly, examining the **net** directionality of communication can be very useful in identifying the 'drivers' and 'recipients'.\n", + "\n", + "Net Granger causality can be easily computed by subtracting the results of each direction from one another:
\n", + "`seeds -> targets` - `targets -> seeds`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Compute the net Granger scores from the results computed above, and plot the results.\n", + "\n", + "What does this tell us about which set of signals are the 'drivers' and which are the 'recipients'?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "plot_connectivity(gc.get_data()[0] - gc.get_data()[1], gc_seeds_targets.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Check what happens if we flip the seeds and targets when computing the net Granger scores.\n", + "\n", + "Does this tell us the same thing?" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## CODE GOES HERE\n", + "plot_connectivity(gc.get_data()[1] - gc.get_data()[0], gc_seeds_targets.freqs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, MNE-Connectivity also supports multivariate methods for investigating connectivity in directed and undirected forms." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Conclusion\n", + "\n", + "The examples above have been kept simple to demonstrate the basic principles of multivariate connectivity in MNE.\n", + "\n", + "The extensive benefits of multivariate connectivity methods are realised fully in scenarios involving a large number of channels with complex interactions, scenarios where data-driven approaches for extracting the relevant components of connectivity are extremely powerful.\n", + "\n", + "The multivariate methods are also supported by the alternative [`spectral_connectivity_time()`](https://mne.tools/mne-connectivity/stable/generated/mne_connectivity.spectral_connectivity_time.html#mne_connectivity.spectral_connectivity_time) function." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Additional resources\n", + "\n", + "MNE tutorial on multivariate coherency: https://mne.tools/mne-connectivity/dev/auto_examples/mic_mim.html\n", + "\n", + "MNE tutorial on multivariate Granger causality: https://mne.tools/mne-connectivity/dev/auto_examples/granger_causality.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "mne_course", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.7" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/workshops/mne_course/Day3/__pycache__/_helper_functions.cpython-311.pyc b/workshops/mne_course/Day3/__pycache__/_helper_functions.cpython-311.pyc new file mode 100644 index 0000000..27142c1 Binary files /dev/null and b/workshops/mne_course/Day3/__pycache__/_helper_functions.cpython-311.pyc differ diff --git a/workshops/mne_course/Day3/_helper_functions.py b/workshops/mne_course/Day3/_helper_functions.py new file mode 100644 index 0000000..6f449ed --- /dev/null +++ b/workshops/mne_course/Day3/_helper_functions.py @@ -0,0 +1,119 @@ +"""Helper functions for using the notebooks.""" + +import mne +import numpy as np +from matplotlib import pyplot as plt + + +def simulate_connectivity( + n_seeds: int, + n_targets: int, + freq_band: tuple[int, int], + n_epochs: int = 10, + n_times: int = 200, + sfreq: int = 100, + snr: float = 0.7, + connection_delay: int = 10, + rng_seed: int | None = None, +) -> mne.Epochs: + """Simulates signals interacting in a given frequency band. + + Parameters + ---------- + n_seeds : int + Number of seed channels to simulate. + + n_targets : int + Number of target channels to simulate. + + freq_band : tuple of int, int + Frequency band where the connectivity should be simulated, where the first entry corresponds + to the lower frequency, and the second entry to the higher frequency. + + n_epochs : int (default 10) + Number of epochs in the simulated data. + + n_times : int (default 200) + Number of timepoints each epoch of the simulated data. + + sfreq : int (default 100) + Sampling frequency of the simulated data, in Hz. + + snr : float (default 0.7) + Signal-to-noise ratio of the simulated data. + + connection_delay : int (default 10) + Number of timepoints for the delay of connectivity between the seeds and targets. If > 0, + the target data is a delayed form of the seed data by this many timepoints. + + rng_seed : int | None (default None) + Seed to use for the random number generator. If `None`, no seed is specified. + + Returns + ------- + epochs : mne.Epochs + The simulated data stored in an Epochs object. The channels are arranged according to seeds, + then targets. 
+ """ + if rng_seed is not None: + np.random.seed(rng_seed) + + n_channels = n_seeds + n_targets + trans_bandwidth = 1 # Hz + + # simulate signal source at desired frequency band + signal = np.random.randn(1, n_epochs * n_times + connection_delay) + signal = mne.filter.filter_data( + data=signal, + sfreq=sfreq, + l_freq=freq_band[0], + h_freq=freq_band[1], + l_trans_bandwidth=trans_bandwidth, + h_trans_bandwidth=trans_bandwidth, + fir_design="firwin2", + verbose=False, + ) + + # simulate noise for each channel + noise = np.random.randn(n_channels, n_epochs * n_times + connection_delay) + + # create data by projecting signal into noise + data = (signal * snr) + (noise * (1 - snr)) + + # shift target data by desired delay + if connection_delay > 0: + # shift target data + data[n_seeds:, connection_delay:] = data[n_seeds:, : n_epochs * n_times] + # remove extra time + data = data[:, : n_epochs * n_times] + + # reshape data into epochs + data = data.reshape(n_channels, n_epochs, n_times) + data = data.transpose((1, 0, 2)) # (epochs x channels x times) + + # store data in an MNE Epochs object + ch_names = [f"{ch_i}_{freq_band[0]}_{freq_band[1]}" for ch_i in range(n_channels)] + info = mne.create_info(ch_names=ch_names, sfreq=sfreq, ch_types="eeg", verbose=False) + epochs = mne.EpochsArray(data=data, info=info, verbose=False) + + return epochs + + +def plot_connectivity(results: np.ndarray, freqs: list): + """Plots connectivity results for a single connection. + + Parameters + ---------- + results : numpy.ndarray, shape of (frequencies) + Results for a single connection. + + freqs : list + Frequencies in `results`. + """ + if results.shape != (len(freqs),): + raise ValueError("`results` must be a 1D array with the same length as `freqs`.") + _, ax = plt.subplots(1, 1) + ax.plot(freqs, results, linewidth=3) + ax.set_xlabel("Frequency (Hz)") + ax.set_ylabel("Connectivity (A.U.)") + plt.show()