layered.activation module

class Activation[source]

Bases: object

__call__(incoming)[source]
delta(incoming, outgoing, above)[source]

Compute the derivative of the cost with respect to the input of this activation function. The outgoing argument is what this function returned in the forward pass, and above is the derivative of the cost with respect to that outgoing activation.
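
As a rough illustration of this contract, the base class might look like the following minimal sketch (assuming NumPy arrays; the bodies are placeholders, since the base class only defines the interface that the subclasses below implement):

    import numpy as np

    class Activation:
        """Interface for activation functions applied by network layers."""

        def __call__(self, incoming):
            # Forward pass: map the incoming pre-activations to the
            # outgoing activations. Overridden by subclasses.
            raise NotImplementedError

        def delta(self, incoming, outgoing, above):
            # Backward pass: given the forward input (incoming), the values
            # returned in the forward pass (outgoing), and the derivative of
            # the cost with respect to those values (above), return the
            # derivative of the cost with respect to incoming.
            raise NotImplementedError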

class Identity[source]

Bases: layered.activation.Activation

__call__(incoming)[source]
delta(incoming, outgoing, above)[source]
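
The identity activation passes its input through unchanged and has a derivative of one. Continuing the sketch above (not necessarily the library's exact code):

    class Identity(Activation):

        def __call__(self, incoming):
            # Return the pre-activations unchanged.
            return incoming

        def delta(self, incoming, outgoing, above):
            # The derivative of the identity is one, so the gradient from
            # above flows through unchanged.
            return above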
class Sigmoid[source]

Bases: layered.activation.Activation

__call__(incoming)[source]
delta(incoming, outgoing, above)[source]
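
The logistic sigmoid squashes each value into (0, 1), and its derivative can be written in terms of its own output, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)). A sketch in the same style (assumed, not copied from the source):

    class Sigmoid(Activation):

        def __call__(self, incoming):
            # Logistic sigmoid: squash each value into the range (0, 1).
            return 1 / (1 + np.exp(-incoming))

        def delta(self, incoming, outgoing, above):
            # Use the output from the forward pass to form the derivative:
            # sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
            return outgoing * (1 - outgoing) * above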
class Relu[source]

Bases: layered.activation.Activation

__call__(incoming)[source]
delta(incoming, outgoing, above)[source]
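
The rectified linear unit computes max(0, x); its derivative is one for positive inputs and zero otherwise. Continuing the sketch (again an assumption about the implementation, not a quote of it):

    class Relu(Activation):

        def __call__(self, incoming):
            # Rectified linear unit: zero out negative values.
            return np.maximum(0, incoming)

        def delta(self, incoming, outgoing, above):
            # The derivative is 1 where the input was positive, 0 elsewhere.
            return np.where(incoming > 0, above, 0)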
class Softmax[source]

Bases: layered.activation.Activation

__call__(incoming)[source]
delta(incoming, outgoing, above)[source]
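
The softmax exponentiates and normalizes its input so the outputs sum to one, which means every output depends on every input and the backward pass has to use the full Jacobian. A sketch assuming a one-dimensional array per example (subtracting the maximum before exponentiating is a common numerical-stability choice and an assumption here):

    class Softmax(Activation):

        def __call__(self, incoming):
            # Exponentiate and normalize so the outputs sum to one.
            # Subtracting the maximum improves numerical stability.
            exps = np.exp(incoming - incoming.max())
            return exps / exps.sum()

        def delta(self, incoming, outgoing, above):
            # Chain rule through the softmax Jacobian:
            # d cost / d incoming_j = outgoing_j * (above_j - sum_i above_i * outgoing_i)
            return outgoing * (above - (above * outgoing).sum())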