Adagrad

class mlpractice.gradient_descent.Adagrad(W0: numpy.ndarray, lambda_: float, eps: float = 1e-08, S0: float = 1, P: float = 0.5)

Adaptive gradient algorithm class.

Parameters
W0 : np.ndarray

Initial weights.

lambda_ : float

Learning rate parameter (step scale).

eps : float

Smoothing term added to the denominator for numerical stability.

S0 : float

Learning rate schedule parameter.

P : float

Learning rate schedule parameter.

Attributes
W : np.ndarray

Weights.

Methods

calc_gradient(X, Y)

Compute the MSE gradient.

step(X, Y, iteration)

Perform one descent step.

update_weights(gradient, iteration)

Update the weights with respect to the gradient.
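
Below is a minimal usage sketch. Only the constructor signature, the method names, and the W attribute come from this page; the synthetic data, the array shapes, and the 1-based iteration counter passed to step are assumptions.

    import numpy as np
    from mlpractice.gradient_descent import Adagrad

    # Synthetic linear-regression data (shapes are an assumption).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    true_W = np.array([1.0, -2.0, 0.5])
    Y = X @ true_W + rng.normal(scale=0.1, size=100)

    optimizer = Adagrad(W0=np.zeros(3), lambda_=0.1)
    for iteration in range(1, 101):
        optimizer.step(X, Y, iteration)  # one descent step per call

    print(optimizer.W)  # weights after training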

calc_gradient(X: numpy.ndarray, Y: numpy.ndarray) → numpy.ndarray

Compute the MSE gradient.

Parameters
X : np.ndarray

Features.

Y : np.ndarray

Targets.

Returns
gradient : np.ndarray

The computed MSE gradient.
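
For reference, a sketch of what the MSE gradient of a linear model typically looks like. The exact scaling used by mlpractice (the factor of 2 and the averaging over samples) is an assumption here.

    import numpy as np

    def calc_gradient_sketch(W, X, Y):
        # MSE loss L(W) = (1/n) * ||X W - Y||^2 has gradient
        # grad = (2/n) * X^T (X W - Y); the 2/n scaling is an assumption.
        n = X.shape[0]
        residual = X @ W - Y
        return (2.0 / n) * X.T @ residual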

update_weights(gradient: numpy.ndarray, iteration: int) → numpy.ndarray

Update the weights with respect to the gradient.

Parameters
gradient : np.ndarray

Gradient of the MSE loss.

iteration : int

Iteration number.

Returns
weigh_diff : np.ndarray

Weight difference, i.e. the change applied to the weights at this step.
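
A sketch of an Adagrad-style weight update under stated assumptions: squared gradients are accumulated coordinate-wise, and S0 and P are assumed to enter through the step-size schedule eta_k = lambda_ * (S0 / (S0 + k)) ** P, which this page does not confirm.

    import numpy as np

    class AdagradSketch:
        def __init__(self, W0, lambda_, eps=1e-8, S0=1.0, P=0.5):
            self.W = W0.astype(float)
            self.lambda_, self.eps, self.S0, self.P = lambda_, eps, S0, P
            self.G = np.zeros_like(self.W)  # accumulated squared gradients

        def update_weights(self, gradient, iteration):
            # Accumulate squared gradients coordinate-wise.
            self.G += gradient ** 2
            # Assumed schedule for how S0 and P shape the step size.
            eta = self.lambda_ * (self.S0 / (self.S0 + iteration)) ** self.P
            # Per-coordinate Adagrad scaling, with eps for stability.
            weigh_diff = eta * gradient / (np.sqrt(self.G) + self.eps)
            self.W -= weigh_diff
            return weigh_diff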