layered.gradient module

class Gradient(network, cost)[source]

Bases: object

__call__(weights, example)[source]
class Backprop(network, cost)[source]

Bases: layered.gradient.Gradient

Use the backpropagation algorithm to efficiently determine the gradient of the cost function with respect to each individual weight.

__call__(weights, example)[source]
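
The chain-rule bookkeeping that Backprop performs can be sketched for a single sigmoid layer with a squared-error cost; the function and variable names below are illustrative only, not the layered API.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def backprop_single_layer(weights, inputs, target):
    # Forward pass through one sigmoid layer.
    activation = sigmoid(inputs @ weights)
    # Backward pass: chain rule through the squared-error cost
    # and the sigmoid activation.
    cost_delta = activation - target                   # d(cost)/d(activation)
    activation_delta = activation * (1 - activation)   # sigmoid derivative
    # Gradient with respect to each weight.
    return np.outer(inputs, cost_delta * activation_delta)

weights = np.zeros((3, 2))
gradient = backprop_single_layer(
    weights, np.array([1.0, 2.0, 3.0]), np.array([0.0, 1.0]))
```

The gradient has the same shape as the weight matrix, which is what lets an optimizer apply it entry by entry.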
class NumericalGradient(network, cost, distance=1e-05)[source]

Bases: layered.gradient.Gradient

Approximate the gradient for each weight individually by sampling the error function slightly above and below the current value of the weight.

__call__(weights, example)[source]

Perturb each weight individually in both directions to compute a numerical gradient with respect to the weights.
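
The sampling described above is a centered finite difference: nudge each weight up and down by distance and divide the cost difference by the step. A self-contained sketch, with illustrative names rather than the layered API:

```python
import numpy as np

def numerical_gradient(cost, weights, distance=1e-5):
    gradient = np.zeros_like(weights)
    for index in np.ndindex(weights.shape):
        above, below = weights.copy(), weights.copy()
        above[index] += distance
        below[index] -= distance
        # Centered difference approximates the partial derivative.
        gradient[index] = (cost(above) - cost(below)) / (2 * distance)
    return gradient

# For cost(w) = sum(w ** 2) the exact gradient is 2 * w.
weights = np.array([1.0, -2.0, 3.0])
approx = numerical_gradient(lambda w: np.sum(w ** 2), weights)
```

Because the cost is evaluated twice per weight, this scales linearly with the number of weights and is far slower than backpropagation.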

class CheckedBackprop(network, cost, distance=1e-05, tolerance=1e-08)[source]

Bases: layered.gradient.Gradient

Computes the gradient both analytically through backpropagation and numerically, in order to validate the backpropagation implementation and the derivatives of the activation and cost functions. This is inherently slow, so it is recommended to validate derivatives on small networks only.

__call__(weights, example)[source]
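
The validation idea can be sketched as recomputing each gradient entry numerically and failing loudly when it disagrees with the analytic value beyond tolerance; all names here are illustrative, not the layered API.

```python
import numpy as np

def checked_gradient(cost, gradient, weights, distance=1e-5, tolerance=1e-6):
    for index in np.ndindex(weights.shape):
        above, below = weights.copy(), weights.copy()
        above[index] += distance
        below[index] -= distance
        numeric = (cost(above) - cost(below)) / (2 * distance)
        # Analytic and numeric values must agree within tolerance.
        assert abs(gradient[index] - numeric) < tolerance, index
    return gradient

# cost(w) = sum(w ** 2) has the analytic gradient 2 * w, so the check passes.
weights = np.array([0.5, -1.5])
checked = checked_gradient(lambda w: np.sum(w ** 2), 2 * weights, weights)
```

A check like this catches sign errors and wrong derivatives in activation or cost functions early, before they silently stall training.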
class BatchBackprop(network, cost)[source]

Bases: object

Calculate the average gradient over a batch of examples.

__call__(weights, examples)[source]
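
Averaging over a batch reduces to computing a per-example gradient and taking the mean; a minimal sketch, where single_gradient stands in for any per-example gradient and the names are illustrative:

```python
import numpy as np

def batch_gradient(single_gradient, weights, examples):
    # One gradient per example, then the element-wise mean over the batch.
    gradients = [single_gradient(weights, example) for example in examples]
    return np.mean(gradients, axis=0)

# Toy gradient: for cost (w - target) ** 2 the gradient is 2 * (w - target).
weights = np.array([1.0, 1.0])
examples = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
averaged = batch_gradient(lambda w, x: 2 * (w - x), weights, examples)
```

Here the two per-example gradients point in opposite directions and cancel, so the averaged gradient is zero.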
class ParallelBackprop(network, cost, workers=4)[source]

Bases: object

Alternative to BatchBackprop that yields the same results but uses multiprocessing to distribute the work across multiple processor cores.

__call__(weights, examples)[source]
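
The parallel pattern is to split the batch into chunks, let workers compute chunk gradients concurrently, and combine the chunk means. The sketch below uses a thread pool to stay self-contained, whereas the real class uses worker processes to exploit multiple cores; all names are illustrative.

```python
import numpy as np
from multiprocessing.pool import ThreadPool

def parallel_gradient(single_gradient, weights, examples, workers=4):
    def chunk_gradient(chunk):
        # Each worker averages the gradients of its own chunk.
        return np.mean([single_gradient(weights, x) for x in chunk], axis=0)
    # Strided split; drop empty chunks when there are fewer examples
    # than workers.
    chunks = [examples[i::workers] for i in range(workers)
              if examples[i::workers]]
    with ThreadPool(workers) as pool:
        chunk_means = pool.map(chunk_gradient, chunks)
    # Chunk sizes can differ by one, so weight each mean by its size
    # to reproduce the plain batch average exactly.
    sizes = [len(chunk) for chunk in chunks]
    return np.average(chunk_means, axis=0, weights=sizes)

weights = np.array([1.0, 1.0])
examples = [np.array([0.0, 0.0]), np.array([2.0, 2.0]),
            np.array([4.0, 4.0]), np.array([6.0, 6.0])]
result = parallel_gradient(lambda w, x: 2 * (w - x), weights, examples)
```

Weighting each chunk mean by its size is what guarantees the result matches BatchBackprop exactly, even when the batch does not divide evenly among the workers.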