Commit
New v_explored property
SmokinCaterpillar committed Apr 9, 2014
1 parent 467e715 commit d55c2a5
Showing 9 changed files with 202 additions and 29 deletions.
10 changes: 7 additions & 3 deletions CHANGES.txt
@@ -39,13 +39,17 @@ pypet 0.1b5
* If automatic storage is enabled, trajectories are now stored at the end of the experiment,
no longer before the starting of the single runs

* You can use the `$` character to decide where the file should branch out for the
* You can use the `$` character to decide where the HDF5 file tree should branch out for the
individual runs

* `v_creator_name` is now called `v_run_branch` (since single runs can alos create
* `v_creator_name` is now called `v_run_branch` (since single runs can also create
items that are not part of a run branch, so this is no longer misleading).

* Results and parameters now issue a warning when they have been stored
* Results and parameters now issue a warning when they have been stored and you
change their data

* Parameters now have a property `v_explored` which is True for explored parameters
even if the range has been removed


pypet 0.1b.4
85 changes: 74 additions & 11 deletions doc/source/cookbook/tutorial.rst
@@ -68,11 +68,29 @@ Moreover, *pypet* has an :class:`~pypet.environment.Environment` that takes care
and allows easy parallel exploration of the parameter space.

We will see how we can use both in our numerical experiment and the different stages.
In this tutorial we will simulate the `Lorenz Attractor`_ with a simple Euler scheme
similar to :ref:`example-05`.
In this tutorial we will simulate a simple neuron model. We will numerically integrate the
equation:

.. math::

    \frac{dV}{dt} = -\frac{V}{\tau_V} + I

with an additional reset rule :math:`V \leftarrow 0` if :math:`V \geq 1`, plus an
additional refractory period of :math:`\tau_{ref}`. This means that whenever we detect a
so-called action potential, i.e. :math:`V \geq 1`, we will keep the voltage :math:`V`
clamped at 0 for this period of time after the threshold crossing and freeze the
differential equation. We will keep the
neuron's time constant :math:`\tau_V = 10\,ms` fixed and explore the parameter space
by varying the input current :math:`I` and the length of the refractory period
:math:`\tau_{ref}`. During the single runs, we will record the development of the variable
:math:`V` over time and count the number of threshold crossings to estimate the so-called
firing rate of the neuron.
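
Concretely, the simple Euler scheme used in the example below turns this differential
equation into the update rule

.. math::

    V(t + dt) = V(t) + dt \cdot \left( -\frac{V(t)}{\tau_V} + I \right)

applied at every integration time step :math:`dt` outside the refractory period.
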
In the post-processing phase we will collect these firing rates and write them into a numpy
array to compute a 2D heat map of the firing rate as a function of the input current and the
refractory period.


.. _`Lorenz Attractor`: https://en.wikipedia.org/wiki/Lorenz_attractor

^^^^^^^^^^^^^^^^^^^
Naming convention
@@ -92,7 +110,7 @@ Whereas `myresult.mydata` might refer to a data item named `mydata` added by the
#1 Pre-processing
-------------------------

Your experiment usually starts with the creation of an :class:`~pypet.environemnt.Environment`.
Your experiment usually starts with the creation of an :class:`~pypet.environment.Environment`.
Don't worry about the huge number of parameters you can pass to the constructor;
these are mostly for fine-tuning your experiment, and the default settings are usually
suitable.
@@ -105,6 +123,13 @@ Yet, we will shortly discuss the most important ones here.
specifying the name of a new trajectory. The environment will create a trajectory
container for you then.

* `add_time`

If `True` and the environment creates a new trajectory container, it will add the current time
to the name in the format *_XXXX_XX_XX_XXhXXmXXs*.
So for instance, if you set `trajectory='Gigawatts_Experiment'` and `add_time=True`,
your trajectory's name will be `Gigawatts_Experiment_2015_10_21_04h23m00s`.

* `log_folder`

The environment makes use of logging. You can specify a folder where all
@@ -120,8 +145,46 @@ Yet, we will shortly discuss the most important ones here.
but simple `print` statements in your python script, *pypet* can write these statements
into the log files if you enable `log_stdout`.

* `multiproc`

Whether we want to use multiprocessing. We sure do, so we set this to `True`.

* `ncores`

The number of CPU cores we want to utilize, or more precisely, the number of processes we
start at the same time to compute the single runs. By the way, there's usually no benefit
in setting this value higher than the actual number of cores your computer has.

* `filename`

We can specify the name of the resulting HDF5 file where all data will be stored.
We don't have to give a full filename; we can also specify just a folder like `'./results/'`,
and the new file will be named after the trajectory.

* `git_repository`

If your code base is under git_ version control (it's not? Stop reading and get git_ NOW!),
you can specify the path to your root git
folder here. If you do this, *pypet* will a) trigger a new commit if it detects changes
in the working copy of your code and b) write the corresponding commit hash into
your trajectory so you can immediately see which version of your code you used for your experiments.

* `sumatra_project`

If your experiments are recorded with sumatra_ you can specify the path to your sumatra_
root folder here. *pypet* will automatically trigger the recording of your experiments
if you use :func:`~pypet.environment.f_run`, :func:`~pypet.environment.f_continue` or
:func:`~pypet.environment.f_pipeline` to start your single runs or whole experiment.
If you use *pypet* + git_ + sumatra_, you can be confident that your experiments
are repeatable! A minimal sketch combining the options above follows right after this list.
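
For illustration only (the trajectory name, the folder paths, and the core count below are
mere placeholders)::

    from pypet.environment import Environment

    env = Environment(trajectory='Gigawatts_Experiment',  # name of the new trajectory
                      add_time=True,                      # append a timestamp to the name
                      log_folder='./logs/',               # where the log files end up
                      log_stdout=True,                    # also capture print statements
                      multiproc=True,                     # use multiprocessing
                      ncores=4,                           # number of worker processes
                      filename='./results/',              # folder (or file) for the HDF5 file
                      git_repository='./',                # optional: root of your git repository
                      sumatra_project='./')               # optional: root of your sumatra project
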


.. _logging: https://docs.python.org/2/library/logging.html

.. _git: http://git-scm.com/

.. _sumatra: http://neuralensemble.org/sumatra/

-------------------------
The Trajectory container
-------------------------
@@ -131,16 +194,16 @@ It basically instantiates a tree and data can be accessed in several ways. Let's say
we already have a trajectory container called `traj` with some nested data in it.

You can, for instance, access data via *natural naming*:
``traj.parameters.diffeq.sigma`` or square brackets ``traj['parameters']['diffeq']['sigma']``
or ``traj['parameters.diffeq.sigma']``, or use the
``traj.parameters.neuron.tau_ref`` or square brackets ``traj['parameters']['neuron']['tau_ref']``
or ``traj['parameters.neuron.tau_ref']``, or use the
:func:`~pypet.naturalnaming.NNGroupNode.f_get` method.

As long as your tree nodes are unique, you can shortcut through the tree. If there's only
one parameter `sigma`, ``traj.sigma`` is equivalent to ``traj.parameters.diffeq.sigma``.
one parameter `tau_ref`, ``traj.tau_ref`` is equivalent to ``traj.parameters.neuron.tau_ref``.

The tree contains two types of nodes, group nodes
(here for example `parameters`, `diffeq`) and leaf nodes
(here `sigma`). Group nodes can, as you have seen, contain other group or leaf nodes, whereas
(here for example `parameters`, `neuron`) and leaf nodes
(here `tau_ref`). Group nodes can, as you have seen, contain other group or leaf nodes, whereas
leaf nodes are terminal and do not contain more groups or leaves.
The leaf nodes are abstract containers for your actual data. Basically,
there exist two sub-types of these leaves :class:`~pypet.parameter.Parameter`
@@ -154,8 +217,8 @@ different runs.
Moreover, since a :class:`~pypet.parameter.Parameter` only contains a single value (apart
from the range),
*pypet* will assume that you usually don't care about the actual container but just about
the data. Thus, ``traj.parameters.diffeq.sigma`` will immediately return the data value
for `sigma` and not the corresponding :class:`~pypet.parameter.Parameter` container.
the data. Thus, ``traj.parameters.neuron.tau_ref`` will immediately return the data value
for `tau_ref` and not the corresponding :class:`~pypet.parameter.Parameter` container.
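
For instance, the access patterns described above might look roughly like this (a sketch
assuming a parameter `tau_ref` has already been added under `parameters.neuron`)::

    tau_ref = traj.parameters.neuron.tau_ref             # natural naming returns the data value
    tau_ref = traj['parameters.neuron.tau_ref']          # square brackets work as well
    tau_ref = traj.tau_ref                               # shortcut, as long as the name is unique
    param_container = traj.f_get('parameters.neuron.tau_ref')  # the Parameter container itself
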

A :class:`~pypet.parameter.Result` container can manage several results. You can think of it
as a non-nested dictionary. Actual data can also be accessed via natural naming or square
53 changes: 53 additions & 0 deletions examples/example_13_post_processing.py
@@ -0,0 +1,53 @@
__author__ = 'robert'

import numpy as np

def run_neuron(traj):

steps = int(traj.par.simulation.duration / float(traj.par.simulation.dt))

traj.f_add_derived_parameter('simulation.steps', steps, comment='The steps')
# This derived parameter will be sorted into the branch
# `traj.derived_parameters.runs.run_XXXXXXXXX.simulation.steps`
# (where XXXXXXXXX is the index of the current run like run_00000003)
# since we did not use the `$` wildcard character.


# Extract all parameters from `traj`
V_init = traj.par.neuron.V_init
V_array = np.zeros(steps)
V_array[0] = V_init
I = traj.par.neuron.I
tau_V = traj.par.neuron.tau_V
dt = traj.par.simulation.dt
tau_ref = traj.par.neuron.tau_ref

spike_times = []
# Do the Euler integration:
for step in range(1, steps):
if V_array[step-1] >= 1:
# The membrane potential crossed the threshold and we mark this as
# an action potential
V_array[step] = 0
spike_times.append((step-1)*dt)
elif spike_times and step * dt - spike_times[-1] <= tau_ref:
# We are in the refractory period, so we simply clamp the voltage
# to 0
V_array[step] = 0
else:
# Euler Integration step:
dV = -1/tau_V * V_array[step-1] + I
V_array[step] = V_array[step-1] + dV*dt

# Add the voltage trace and spike times
traj.f_add_result('neuron.$', V=V_array, spike_times = spike_times,
comment='Contains the development of the membrane potential over time '
'as well as a list of spike times.')
# In contrast to the derived parameter above this result will be named
# `traj.results.neuron.run_XXXXXXXXX` and not `traj.results.runs.run_XXXXXXXXX.neuron`.

# And finally we return the estimate of the firing rate
return len(spike_times) / float(traj.par.simulation.duration)


def neuron_postproc(traj, result_list):
4 changes: 3 additions & 1 deletion pypet/brian/parameter.py
@@ -205,6 +205,7 @@ def _load(self,load_dict):
explore_list.append(brian_quantity)

self._explored_range=tuple(explore_list)
self._explored = True
elif self._storage_mode == BrianParameter.FLOAT_MODE:

# Recreate the brian units from the value as float and unit as string:
@@ -221,7 +222,8 @@ def _load(self,load_dict):
brian_quantity = value*unit
explore_list.append(brian_quantity)

self._explored_range=tuple(explore_list)
self._explored_range = tuple(explore_list)
self._explored = True


except KeyError:
29 changes: 28 additions & 1 deletion pypet/parameter.py
@@ -150,6 +150,7 @@ def __init__(self, full_name, comment=''):

# Whether to keep the full range array when pickled or not
self._full_copy = False
self._explored = False # If explored or not

def f_supports(self, data):
"""Checks whether the data is supported by the parameter."""
@@ -168,6 +169,16 @@ def f_supports_fast_access(self):
"""
return not self.f_is_empty()

@property
def v_explored(self):
"""Whether parameter is explored.
Does not necessarily have to be similar to
:func:`~pypet.parameter.BaseParameter.f_has_range` since the range can be
deleted on pickling and the parameter remains explored.
"""
return self._explored
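# Illustrative sketch of the intended semantics (hypothetical usage, cf. the
# parameter tests below):
#
#     param._explore([1, 2, 3])
#     param.f_has_range()    # -> True
#     param.v_explored       # -> True
#     # pickle and unpickle with param.v_full_copy = False, then:
#     # copy.f_has_range()   # -> False, the range was stripped
#     # copy.v_explored      # -> True, still marked as explored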

@property
def v_full_copy(self):
@@ -222,7 +233,9 @@ def f_is_array(self):
return self.f_has_range()

def f_has_range(self):
"""Returns true if the parameter is explored and contains a range array.
"""Returns true if the parameter contains a range array.
Not necessarily equal to `v_explored` if the range is removed on
pickling due to `v_full_copy=False`.
ABSTRACT: Needs to be defined in subclass
@@ -750,6 +763,13 @@ def __len__(self):

@copydoc(BaseParameter.f_has_range)
def f_has_range(self):
"""If the parameter has a range.
Does not have to be `True` if the parameter is explored.
The range might be removed during pickling to save memory.
Accordingly, `v_explored` remains `True` whereas `f_has_range` is `False`.
"""
return len(self._explored_range)>0

def __getstate__(self):
@@ -954,6 +974,7 @@ def _explore(self, explore_iterable):
data_tuple = self._data_sanity_checks(explore_iterable)

self._explored_range = data_tuple
self._explored = True
self.f_lock()

def _expand(self,explore_iterable):
@@ -1055,6 +1076,7 @@ def _load(self,load_dict):
if 'explored_data' in load_dict:
self._explored_range = tuple([self._convert_data(x)
for x in load_dict['explored_data']['data'].tolist()])
self._explored = True

@copydoc(BaseParameter.f_get)
def f_get(self):
@@ -1078,6 +1100,7 @@ def _shrink(self):

del self._explored_range
self._explored_range={}
self._explored=False

@copydoc(BaseParameter.f_empty)
def f_empty(self):
@@ -1092,6 +1115,7 @@ def f_empty(self):
del self._default
self._data=None
self._default=None
self._explored=False


class ArrayParameter(Parameter):
@@ -1214,6 +1238,7 @@ def _load(self,load_dict):
explore_list.append(load_dict[arrayname])

self._explored_range=tuple([self._convert_data(x) for x in explore_list])
self._explored = True

except KeyError:
super(ArrayParameter,self)._load(load_dict)
@@ -1535,6 +1560,7 @@ def _load(self,load_dict):
explore_list.append(matrix)

self._explored_range=tuple(explore_list)
self._explored = True


except KeyError:
@@ -1681,6 +1707,7 @@ def _load(self,load_dict):
explore_list.append(loaded)

self._explored_range=tuple(explore_list)
self._explored = True


self._default=self._data
1 change: 1 addition & 0 deletions pypet/tests/briantests/brian_parameter_test.py
@@ -51,6 +51,7 @@ def explore(self):
## Explore the parameter:
for key, vallist in self.explore_dict.items():
self.param[key]._explore(vallist)
self.assertTrue(self.param[key].v_explored and self.param[key].f_has_range())


class BrianParameterStringModeTest(BrianParameterTest):
9 changes: 9 additions & 0 deletions pypet/tests/parameter_test.py
@@ -298,6 +298,7 @@ def test_exploration(self):
self.assertTrue(np.all(str(val) == str(param_val)),'%s != %s' %(str(val),str(param_val)))

param._restore_default()
self.assertTrue(param.v_explored and param.f_has_range(), 'Error for %s' % key)
val = self.data[key]
self.assertTrue(np.all(repr(param.f_get()) == repr(val)), '%s != %s' % (str(param.f_get()), str(val)))

@@ -361,6 +362,9 @@ def test_pickling_without_multiprocessing(self):


def test_pickling_with_mocking_multiprocessing(self):

self.test_exploration()

for key, param in self.param.items():
param.f_unlock()
param.v_full_copy=False
@@ -373,12 +377,17 @@ def test_pickling_with_mocking_multiprocessing(self):

self.param[key] = newParam

if key in self.explore_dict:
self.assertTrue(not newParam.f_has_range() and newParam.v_explored)

#self.test_exploration()

self.test_the_insertion_made_implicetly_in_setUp()

self.test_meta_settings()



def test_resizing_and_deletion(self):

for key, param in self.param.items():
11 changes: 11 additions & 0 deletions pypet/tests/trajectory_test.py
@@ -285,7 +285,18 @@ def test_get_data_dictionaries_directly(self):
self.assertTrue(comp.nested_equal(self.traj.f_get(key,fast_access=True),
explore_dict_directly[self.traj.f_get(key).v_full_name]))

def test_increase_exploration(self):

self.explore_dict = {'IntParam':[2,1,1,3]}

self.traj.f_explore(self.explore_dict)

self.assertTrue(len(self.traj._explored_parameters)==2)

self.traj._stored=True

with self.assertRaises(TypeError):
self.traj.f_explore(self.explore_dict)

def test_f_get(self):
self.traj.v_fast_access=True
