
adding example in pynest named sensitivity_to_perturbation.py #198

Merged (8 commits, Sep 12, 2016)

Conversation

@einfallslos (Contributor) commented Dec 18, 2015

No description provided.

@abigailm (Contributor) commented Dec 20, 2015

Hi Claudia (@einfallslos), for future reference: you can nominate reviewers for your code by naming them here with an @ before their user name. Like this: @jougs, please take a look at this from the point of view of compatibility with the other examples; I will check the example itself.

This script simulates a network in two successive trials which
are identical except for an extra input spike in the second realisation.
(a small perturbation). The network consists of recurrent, randomly
connected excitatory and inhibitory neurons. Its activity is driven

@jougs (Contributor) commented Feb 15, 2016:
Please exchange the first and the second sentence.

@abigailm (Contributor) commented Apr 18, 2016:
Really? Why? I think it makes sense that the first sentence states the main point of the example.

@jougs (Contributor) commented Apr 27, 2016:
On second thought, I agree with @abigailm. @einfallslos, please ignore my comment.

---------------------------
This script simulates a network in two successive trials which
are identical except for an extra input spike in the second realisation.

@jougs (Contributor) commented Feb 15, 2016:
"an" -> "one"

Sensitivity to perturbation
---------------------------
This script simulates a network in two successive trials which

@jougs (Contributor) commented Feb 15, 2016:
comma before "which"

pylab.ylabel('neuron id')
pylab.xlim((0, T))
pylab.ylim((0, N))
pylab.show()

@jougs (Contributor) commented Feb 15, 2016:

Please remove the call to show(). This will make the CI choke, as there's no display attached there.
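A common CI-safe alternative (a sketch under the assumption that matplotlib is available; this is not code from the PR) is to select the non-interactive Agg backend up front and save the figure to a file instead of calling show():

```python
import os
import tempfile

import matplotlib
matplotlib.use('Agg')              # headless backend: works without a display
import matplotlib.pylab as pylab

pylab.plot([0, 1, 2], [0, 1, 0])
pylab.ylabel('neuron id')
outfile = os.path.join(tempfile.mkdtemp(), 'raster.png')
pylab.savefig(outfile)             # replaces pylab.show() on CI
```

The backend must be chosen before pylab is imported, otherwise a display-based backend may already be active.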


NE = 1000 # number of excitatory neurons
NI = 250 # number of inhibitory neurons
N = NE + NI

@jougs (Contributor) commented Feb 15, 2016:

# total number of neurons


suppr = nest.Create("dc_generator",
                    params={'amplitude': -1e16, 'start': T, 'stop': T + fade_out})
nest.Connect(suppr, list(allnodes))

@abigailm (Contributor) commented Apr 18, 2016:

allnodes is already a list; it shouldn't need to be cast.

'''
As mentioned above, we need to reset the network, the random number
generator, and the clock of the Kernel. In addition we make sure
that there is no spike left in the spike generator.

@abigailm (Contributor) commented Apr 18, 2016:

in the spike detector or the spike generator?

leads to rather chaotic activity.
'''

J = .1 # excitatory synaptic weight (mV)

@abigailm (Contributor) commented Apr 18, 2016:

Given the examples end up on the home page, it would be good to select a value for J that gives 'interesting' results - does 0.1 do this? If not, I suggest changing it.

@einfallslos (Author, Contributor) commented May 30, 2016:

I have changed the network weight to 0.5. Now the network dynamics are chaotic, although the weight is still small.
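The sensitivity being discussed can be illustrated without NEST. As a hypothetical aside (not code from this PR), the logistic map in its chaotic regime shows the same qualitative effect the example demonstrates for the network: an arbitrarily small perturbation is amplified until the two trajectories decorrelate.

```python
# Two runs of a chaotic map, identical except for a tiny perturbation.
x, y = 0.2, 0.2 + 1e-12            # perturbation far below plotting precision
diverged = False
for step in range(60):
    x = 4.0 * x * (1.0 - x)        # logistic map, chaotic at r = 4
    y = 4.0 * y * (1.0 - y)
    if abs(x - y) > 1e-3:
        diverged = True            # trajectories no longer track each other
```

With a positive Lyapunov exponent the gap grows roughly exponentially, so even a 1e-12 perturbation becomes macroscopic within a few dozen iterations.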

@abigailm (Contributor) commented Apr 18, 2016

The code is really nicely documented and easy to follow. Once the comments above are addressed, it gets a 👍 from me.

for trial in [0, 1]:
    '''
    As mentioned above, we need to reset the network, the random number
    generator, and the clock of the Kernel. In addition we make sure

@jougs (Contributor) commented Apr 27, 2016:

"Kernel" -> "simulation kernel"

…d intrinsic network weight "J" in order to get "interesting" results
@heplesser (Contributor) commented Jun 10, 2016

@einfallslos Before any further work on this PR, please merge all changes in master to your branch so that Travis can properly check your code again; the old setup won't work any more (#380, #391).

@jougs (Contributor) commented Sep 12, 2016

Very nice. Many thanks for addressing all issues raised. 👍 and merging!

@jougs jougs merged commit df5722f into nest:master Sep 12, 2016
1 check passed: continuous-integration/travis-ci/pr (The Travis CI build passed)