AMSGrad

AMSGrad()

AMSGrad optimizer, a variant of Adam that keeps the running maximum of past second-moment estimates, guaranteeing a non-increasing effective step size.

It uses the same objective interface as Adam.

Attributes

| Name | Description |
|------|-------------|
| batchSize | Number of points processed per step (mini-batch size). |
| beta1 | Exponential decay rate for the first-moment (mean gradient) estimate. |
| beta2 | Exponential decay rate for the second-moment (squared gradient) estimate. |
| epsilon | Small constant added to avoid division by zero. |
| exactObjective | Whether to compute the exact objective at the end of optimization, rather than the estimate from the final batches. |
| maxIterations | Maximum number of iterations (0 means no limit). |
| resetPolicy | Whether internal state (moment estimates) is reset before each call to optimize. |
| shuffle | Whether the visitation order of the functions is shuffled each epoch. |
| stepSize | Step size (learning rate) for each iteration. |
| tolerance | Convergence tolerance: terminate when the objective improvement falls below this value. |

Methods

| Name | Description |
|------|-------------|
| optimize | Optimize an objective from the provided starting point. |

optimize

AMSGrad.optimize()

optimize(self: pyensmallen._pyensmallen.AMSGrad, objective: collections.abc.Callable[[typing.Annotated[numpy.typing.ArrayLike, numpy.float64], typing.Annotated[numpy.typing.ArrayLike, numpy.float64]], float], initial_point: typing.Annotated[numpy.typing.ArrayLike, numpy.float64]) -> numpy.typing.NDArray[numpy.float64]

Optimize an objective from the provided starting point.

Parameters

| Name | Type | Description | Default |
|------|------|-------------|---------|
| objective | callable | Callable with signature `objective(params, gradient)`; it should write the gradient in place and return the scalar objective value. | required |
| initial_point | ndarray | Initial parameter vector. | required |

Returns

| Type | Description |
|------|-------------|
| ndarray | Optimized parameter vector. |