layered.activation module

class SparseRange(range_=0.3, function=<layered.activation.Sigmoid object>)

Bases: layered.activation.Activation
E%-Max Winner-Take-All.
Binary activation. First, the activation function is applied. Then all neurons whose activation lies within the specified range below that of the strongest neuron are set to one; all others are set to zero. The gradient equals that of the activation function for active neurons and is zero otherwise.
See: A Second Function of Gamma Frequency Oscillations: An E%-Max Winner-Take-All Mechanism Selects Which Cells Fire. (2009)
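The behavior described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the library's implementation: the function names `sparse_range` and `sparse_range_delta` are made up for this example, and the use of the sigmoid and its derivative reflects the default `function=Sigmoid` argument.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sparse_range(incoming, range_=0.3):
    # First, apply the activation function.
    activated = sigmoid(incoming)
    # Neurons within `range_` below the strongest activation fire (1),
    # all others stay silent (0).
    threshold = np.max(activated) - range_
    return (activated >= threshold).astype(float)

def sparse_range_delta(incoming, above, range_=0.3):
    # Gradient: that of the activation function for active neurons,
    # zero for inactive ones. `above` is the gradient from the next layer.
    activated = sigmoid(incoming)
    active = activated >= np.max(activated) - range_
    sigmoid_grad = activated * (1.0 - activated)
    return np.where(active, sigmoid_grad * above, 0.0)
```

For example, with inputs `[2.0, 1.9, -1.0]` and the default range of 0.3, the first two sigmoid activations fall within 0.3 of the maximum and are set to one, while the third is set to zero.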