layered.gradient module
- class Backprop(network, cost)[source]
  Bases: layered.gradient.Gradient
  Use the backpropagation algorithm to efficiently determine the gradient of the cost function with respect to each individual weight.
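The idea can be sketched in a few lines. This is a minimal standalone illustration of backpropagation for a fully connected network with a squared-error cost, not the library's implementation; the function name `backprop` and the NumPy-based signature are assumptions made for the example.

```python
import numpy as np


def backprop(weights, activation, activation_deriv, x, target):
    """Return the gradient of the squared-error cost for each weight matrix.

    weights[l] has shape (inputs_l, outputs_l); x and target are vectors.
    """
    # Forward pass: remember pre-activations and activations per layer.
    activations = [x]
    preactivations = []
    for w in weights:
        z = activations[-1] @ w
        preactivations.append(z)
        activations.append(activation(z))
    # Backward pass: propagate the error signal (delta) layer by layer.
    delta = (activations[-1] - target) * activation_deriv(preactivations[-1])
    gradients = [None] * len(weights)
    gradients[-1] = np.outer(activations[-2], delta)
    for layer in range(len(weights) - 2, -1, -1):
        delta = (weights[layer + 1] @ delta) * activation_deriv(
            preactivations[layer])
        gradients[layer] = np.outer(activations[layer], delta)
    return gradients
```

Each weight's gradient is obtained in a single backward sweep, which is what makes this efficient compared to perturbing every weight individually.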
- class NumericalGradient(network, cost, distance=1e-05)[source]
  Bases: layered.gradient.Gradient
  Approximate the gradient for each weight individually by sampling the error function slightly above and below the current value of the weight.
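Sampling above and below the current value is the central-difference scheme. A minimal sketch of the technique (the function name and NumPy-based signature are assumptions, not the library's API) might look like:

```python
import numpy as np


def numerical_gradient(cost, weights, distance=1e-5):
    """Central-difference approximation of the gradient of cost(weights)."""
    weights = np.array(weights, dtype=float)
    gradient = np.zeros_like(weights)
    for index in np.ndindex(weights.shape):
        original = weights[index]
        # Sample the cost slightly above and below the current value.
        weights[index] = original + distance
        above = cost(weights)
        weights[index] = original - distance
        below = cost(weights)
        weights[index] = original  # Restore before the next weight.
        gradient[index] = (above - below) / (2 * distance)
    return gradient
```

Two cost evaluations per weight makes this far slower than backpropagation, but it requires no knowledge of the cost function's derivative.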
- class CheckedBackprop(network, cost, distance=1e-05, tolerance=1e-08)[source]
  Bases: layered.gradient.Gradient
  Compute the gradient both analytically through backpropagation and numerically, in order to validate the backpropagation implementation and the derivatives of activation and cost functions. This check is inherently slow, so it is recommended to validate derivatives on small networks.
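The comparison step can be sketched as a relative-error check between the two gradients. This is a common way to implement such a check under a tolerance, assumed here for illustration; the function name `check_gradient` is not from the library.

```python
import numpy as np


def check_gradient(analytic, numeric, tolerance=1e-8):
    """Return True when the two gradients agree within a relative tolerance."""
    analytic = np.asarray(analytic, dtype=float).ravel()
    numeric = np.asarray(numeric, dtype=float).ravel()
    difference = np.linalg.norm(analytic - numeric)
    # Normalize by the combined magnitude so the check is scale-invariant.
    scale = np.linalg.norm(analytic) + np.linalg.norm(numeric)
    return difference / max(scale, 1e-12) < tolerance
```

A failed check points at a bug in the backpropagation code or in a hand-written derivative of an activation or cost function.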