layered.activation module

class Activation

Bases: object

__call__(incoming)
delta(incoming, outgoing, above)

Compute the derivative of the cost with respect to the input of this activation function. Here incoming is the input from the forward pass, outgoing is the value this function returned in the forward pass, and above is the derivative of the cost with respect to the outgoing activation.
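
The contract amounts to a forward method and a matching backward method. A minimal sketch of the interface, assuming NumPy arrays; the stub bodies are illustrative, not the library's actual source:

    class Activation:
        # Interface for activation functions: a forward pass and a
        # matching backward pass used during backpropagation.

        def __call__(self, incoming):
            # Forward pass: map the layer's weighted inputs to activations.
            raise NotImplementedError

        def delta(self, incoming, outgoing, above):
            # Backward pass: apply the chain rule through this function,
            # turning the cost gradient with respect to the output (above)
            # into the cost gradient with respect to the input.
            raise NotImplementedError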

class Identity

Bases: layered.activation.Activation

__call__(incoming)
delta(incoming, outgoing, above)
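
The identity passes values through unchanged, so its derivative is one everywhere. A sketch under the interface above (an assumption, not the library's verbatim source):

    class Identity(Activation):
        def __call__(self, incoming):
            # f(x) = x: the activation is the input itself.
            return incoming

        def delta(self, incoming, outgoing, above):
            # f'(x) = 1, so the upstream gradient flows through unchanged.
            return above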

class Sigmoid

Bases: layered.activation.Activation

__call__(incoming)
delta(incoming, outgoing, above)
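
The logistic sigmoid squashes inputs into (0, 1), and its derivative can be written in terms of its own output, which is why delta receives the cached forward value. A hedged sketch; the NumPy usage is an assumption:

    import numpy as np

    class Sigmoid(Activation):
        def __call__(self, incoming):
            # f(x) = 1 / (1 + exp(-x))
            return 1 / (1 + np.exp(-incoming))

        def delta(self, incoming, outgoing, above):
            # f'(x) = f(x) * (1 - f(x)); reuse the forward-pass output.
            return above * outgoing * (1 - outgoing)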

class Relu

Bases: layered.activation.Activation

__call__(incoming)
delta(incoming, outgoing, above)
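
The rectified linear unit zeroes out negative inputs and is linear elsewhere. A sketch under the same assumptions:

    import numpy as np

    class Relu(Activation):
        def __call__(self, incoming):
            # f(x) = max(0, x), applied element-wise.
            return np.maximum(0, incoming)

        def delta(self, incoming, outgoing, above):
            # Derivative is 1 for positive inputs and 0 otherwise.
            return above * (incoming > 0)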

class Softmax

Bases: layered.activation.Activation

__call__(incoming)
delta(incoming, outgoing, above)
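
Softmax turns a vector into a probability distribution, so every output depends on every input and the backward pass is a Jacobian-vector product rather than an element-wise scaling. A sketch for 1-D inputs; the max-subtraction for numerical stability is an assumption, not confirmed library behavior:

    import numpy as np

    class Softmax(Activation):
        def __call__(self, incoming):
            # Shift by the max for numerical stability, then normalize.
            exps = np.exp(incoming - incoming.max())
            return exps / exps.sum()

        def delta(self, incoming, outgoing, above):
            # Jacobian-vector product: (J^T a)_j = s_j * (a_j - sum(a * s)),
            # where s is the softmax output and a the upstream gradient.
            return outgoing * (above - (above * outgoing).sum())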

class SparseField(inhibition=0.05, leaking=0.0)

Bases: layered.activation.Activation

__call__(incoming)
delta(incoming, outgoing, above)

class SparseRange(range_=0.3, function=Sigmoid())

Bases: layered.activation.Activation

E%-Max Winner-Take-All.

Binary activation. First, the underlying activation function is applied. Then all neurons within the specified range below the strongest neuron are set to one; all others are set to zero. The gradient is that of the underlying activation function for active neurons and zero otherwise.

See: A Second Function of Gamma Frequency Oscillations: An E%-Max Winner-Take-All Mechanism Selects Which Cells Fire (2009).

__call__(incoming)
delta(incoming, outgoing, above)
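
A sketch of the documented behavior, building on the Activation and Sigmoid sketches above. It assumes 1-D NumPy inputs and treats range_ as an absolute offset below the strongest activation; whether the range is absolute or relative is not specified above, so that choice is an assumption:

    class SparseRange(Activation):
        def __init__(self, range_=0.3, function=Sigmoid()):
            self.range_ = range_
            self.function = function

        def __call__(self, incoming):
            # Apply the underlying activation, then fire every neuron whose
            # activation lies within range_ of the strongest one.
            activation = self.function(incoming)
            threshold = activation.max() - self.range_
            return (activation >= threshold).astype(float)

        def delta(self, incoming, outgoing, above):
            # Active neurons inherit the underlying function's gradient;
            # inactive neurons (outgoing == 0) block the gradient.
            inner = self.function(incoming)
            return outgoing * self.function.delta(incoming, inner, above)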