added missing parameters in docstring.
Signed-off-by: bamsumit <bam_sumit@hotmail.com>
bamsumit committed Feb 24, 2022
1 parent b2ead4c commit 72f31d4
Showing 1 changed file with 29 additions and 0 deletions.
29 changes: 29 additions & 0 deletions src/lava/proc/sdn/process.py
@@ -67,6 +67,18 @@ class Delta(AbstractProcess):
Parameters
----------
shape: Tuple
    Shape of the sigma process. Default is (1,).
vth: int or float
    Threshold of the delta encoder.
cum_error: bool
    Flag to enable/disable cumulative error accumulation. Default is False.
wgt_exp: int
    Weight scaling exponent. Note: this has an effect only on fixed-point
    models. Default is 0.
state_exp: int
    State variables scaling exponent. Note: this has an effect only on
    fixed-point models. Default is 0.
"""
def __init__(
self,
@@ -119,6 +131,23 @@ class SigmaDelta(AbstractProcess):
Parameters
----------
shape: Tuple
    Shape of the sigma process. Default is (1,).
vth: int or float
    Threshold of the delta encoder.
bias: int or float
    Bias to the neuron activation.
act_mode: enum
    Activation mode describing the non-linear activation function. Options
    are described by the ``ACTIVATION_MODE`` enum.
cum_error: bool
    Flag to enable/disable cumulative error accumulation. Default is False.
wgt_exp: int
    Weight scaling exponent. Note: this has an effect only on fixed-point
    models. Default is 0.
state_exp: int
    State variables scaling exponent. Note: this has an effect only on
    fixed-point models. Default is 0.
"""
def __init__(
self,
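The `Delta` docstring above describes a threshold-based delta encoder with an optional cumulative-error mode. The behavior can be sketched in plain Python as follows; this is an illustrative reference (the function name `delta_encode` and the exact residue handling are assumptions, not the Lava floating- or fixed-point model):

```python
def delta_encode(signal, vth, cum_error=False):
    """Sketch of delta encoding: emit the change in the input only when it
    reaches the threshold vth. With cum_error=True, sub-threshold changes
    accumulate until they cross the threshold. Hypothetical reference code,
    not the Lava implementation."""
    out = []
    prev = 0      # previous input value
    residue = 0   # accumulated sub-threshold error (cum_error mode)
    for x in signal:
        diff = x - prev
        if cum_error:
            residue += diff
            diff = residue
        if abs(diff) >= vth:
            out.append(diff)  # change is large enough: transmit it
            residue = 0
        else:
            out.append(0)     # suppress sub-threshold change
        prev = x
    return out
```

For example, `delta_encode([0, 1, 1, 3], vth=2)` yields `[0, 0, 0, 2]`: only the final jump of 2 reaches the threshold, so the small intermediate changes are suppressed.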

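The `SigmaDelta` docstring adds a sigma (integration) stage, a bias, and a non-linear activation ahead of the delta encoder. A minimal sketch, assuming a ReLU activation (one option of the `ACTIVATION_MODE` enum) and the hypothetical name `sigma_delta_encode`:

```python
def sigma_delta_encode(signal, vth, bias=0, cum_error=False):
    """Sketch of sigma-delta encoding: integrate the input (sigma), apply a
    biased ReLU activation, then delta-encode the activation against the
    threshold vth. Hypothetical reference code, not the Lava fixed-point
    model."""
    out = []
    sigma = 0     # running sum of inputs
    prev = 0      # previous activation value
    residue = 0   # accumulated sub-threshold error (cum_error mode)
    for x in signal:
        sigma += x                   # sigma stage: integrate the input
        act = max(sigma + bias, 0)   # ReLU activation with bias
        diff = act - prev
        if cum_error:
            residue += diff
            diff = residue
        if abs(diff) >= vth:
            out.append(diff)
            residue = 0
        else:
            out.append(0)
        prev = act
    return out
```

In a fixed-point model, `wgt_exp` and `state_exp` from the docstring would additionally scale the weights and state variables by powers of two; that scaling is omitted here for clarity.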