- In mathematics, the subderivative (or subgradient) generalizes the derivative to convex functions which are not necessarily differentiable. The set of...
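For reference, the definition behind this snippet can be stated in one line: a vector g is a subgradient of a convex f: ℝⁿ → ℝ at x₀ when the affine function through (x₀, f(x₀)) with slope g underestimates f everywhere.

```latex
g \in \partial f(x_0) \iff f(x) \ge f(x_0) + g^{\top}(x - x_0) \quad \text{for all } x
```

The set ∂f(x₀) of all such g is the subdifferential at x₀.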
- Subgradient methods are convex optimization methods which use subderivatives. Originally developed by Naum Z. Shor and others in the 1960s and 1970s,...
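For context, the basic iteration these methods perform steps along the negative of any subgradient of the objective (a standard statement of the method, not text from the snippet):

```latex
x^{(k+1)} = x^{(k)} - \alpha_k\, g^{(k)}, \qquad g^{(k)} \in \partial f\!\left(x^{(k)}\right)
```

Here α_k is a step size chosen by one of the rules discussed further down; unlike gradient descent, the objective value need not decrease at every step, so the best iterate found so far is usually tracked.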
- Cutting-plane methods; Ellipsoid method; Subgradient method; Dual subgradients and the drift-plus-penalty method. Subgradient methods can be implemented simply...
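To illustrate "implemented simply", here is a minimal sketch in Python; the objective, the subgradient oracle, and the diminishing step size 1/(k+1) are illustrative assumptions rather than anything fixed by the snippet:

```python
import numpy as np

def subgradient_method(f, subgrad, x0, iters=1000):
    """Minimal subgradient method: keep the best point seen, since the
    objective value need not decrease monotonically along the iterates."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)                 # any subgradient of f at x
        x = x - (1.0 / (k + 1)) * g    # diminishing (divergent-series) step size
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Example: minimize the non-differentiable f(x) = |x_1| + |x_2|.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)         # a valid subgradient of the l1-norm
print(subgradient_method(f, subgrad, x0=[3.0, -2.0]))
```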
- Subgradient methods: An iterative method for large locally Lipschitz functions using generalized gradients. Following Boris T. Polyak, subgradient–projection...
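The subgradient–projection variant mentioned here handles a constraint set C by projecting each step back onto C; a standard statement (with P_C the Euclidean projection, an assumption about the norm) is:

```latex
x^{(k+1)} = P_C\!\left(x^{(k)} - \alpha_k\, g^{(k)}\right), \qquad g^{(k)} \in \partial f\!\left(x^{(k)}\right)
```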
- size rules, which were first developed for classical subgradient methods. Classical subgradient methods using divergent-series rules are much slower than...
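The "divergent-series" rules contrasted in this snippet are the classical conditions on the step sizes; one common statement (the exact variant meant here is an assumption) is:

```latex
\alpha_k \ge 0, \qquad \lim_{k \to \infty} \alpha_k = 0, \qquad \sum_{k=1}^{\infty} \alpha_k = \infty
```

for example α_k = a/(b + k); the square-summable variant additionally requires Σ_k α_k² < ∞.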
- have a "subgradient oracle": a routine that can compute a subgradient of f at any given point (if f is differentiable, then the only subgradient is the...
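As a concrete (illustrative) oracle, consider f(x) = max_i (aᵢᵀx + bᵢ), a pointwise maximum of affine functions: any row aⱼ attaining the maximum at x is a valid subgradient there. A small Python sketch, with this particular f chosen purely as an assumption for the example:

```python
import numpy as np

def make_subgradient_oracle(A, b):
    """Oracle for f(x) = max_i (a_i^T x + b_i): return f(x) and one subgradient.
    Any row a_j attaining the maximum is a valid subgradient of f at x."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    def oracle(x):
        vals = A @ x + b
        j = int(np.argmax(vals))
        return vals[j], A[j]
    return oracle

oracle = make_subgradient_oracle(A=[[1.0, 0.0], [-1.0, 2.0]], b=[0.0, 1.0])
print(oracle(np.array([0.5, 0.5])))   # value and one subgradient at (0.5, 0.5)
```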
- space dilation in the direction of the difference of two successive subgradients (the so-called r-algorithm), which was created in collaboration with Nikolay...
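For readers unfamiliar with the term, "space dilation" in Shor-type methods means multiplying the current space-transformation matrix by a dilation operator along a chosen unit direction ξ (here, the normalized difference of two successive subgradients). The operator is commonly written as below, though the full bookkeeping of the r-algorithm involves more than this one formula:

```latex
R_\beta(\xi) = I + (\beta - 1)\,\xi\xi^{\top}, \qquad \|\xi\| = 1,\quad 0 < \beta < 1
```

This contracts the space by the factor β along ξ and leaves the orthogonal complement unchanged.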
- z_t⟩. To generalise the algorithm to any convex loss function, the subgradient ∂v_t(w_t) of v_t...
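The generalisation described here, replacing the gradient of a linear loss with a subgradient of an arbitrary convex per-round loss v_t, can be sketched as online subgradient descent; the hinge loss and the constant step size below are illustrative assumptions:

```python
import numpy as np

def online_subgradient_descent(stream, eta=0.1):
    """Online learning sketch: at round t, incur the convex loss v_t(w_t) and
    update w_{t+1} = w_t - eta * g_t, with g_t a subgradient of v_t at w_t.
    Here v_t(w) = max(0, 1 - y_t * <w, x_t>) is the hinge loss (an assumption)."""
    w = None
    for x_t, y_t in stream:
        x_t = np.asarray(x_t, dtype=float)
        if w is None:
            w = np.zeros_like(x_t)
        margin = y_t * w.dot(x_t)
        # A valid subgradient of the hinge loss at w_t (zero once margin >= 1).
        g_t = -y_t * x_t if margin < 1 else np.zeros_like(w)
        w = w - eta * g_t
    return w

stream = [([1.0, 0.5], +1), ([-0.5, -1.0], -1), ([0.8, -0.2], +1)]
print(online_subgradient_descent(stream))
```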
- Kiwiel, Krzysztof C. (2001). "Convergence and efficiency of subgradient methods for quasiconvex minimization". Mathematical Programming, Series...
- essentially the same update as in the unconstrained case, by choosing a subgradient g_0 that satisfies g_0^T (x^* − x^(k)) + f...
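One standard way to keep "essentially the same update as in the unconstrained case" with inequality constraints f_i(x) ≤ 0 is to step along a subgradient of the objective at feasible points and along a subgradient of a violated constraint otherwise. A minimal sketch under those assumptions (the step size and the toy problem are illustrative):

```python
import numpy as np

def constrained_subgradient(f0, g0, constraints, x0, iters=2000):
    """Subgradient method with inequality constraints f_i(x) <= 0:
    use a subgradient of the objective at feasible points, otherwise a
    subgradient of some violated constraint; track the best feasible point."""
    x = np.asarray(x0, dtype=float)
    best = None
    for k in range(iters):
        violated = [gi for fi, gi in constraints if fi(x) > 0]
        if violated:
            g = violated[0](x)         # subgradient of a violated constraint
        else:
            if best is None or f0(x) < best[1]:
                best = (x.copy(), f0(x))
            g = g0(x)                  # subgradient of the objective
        x = x - (1.0 / (k + 1)) * g    # diminishing step size (assumption)
    return best

# Toy problem: minimize |x - 3| subject to x <= 1 (optimum at x = 1).
f0 = lambda x: abs(x[0] - 3.0)
g0 = lambda x: np.array([np.sign(x[0] - 3.0)])
constraints = [(lambda x: x[0] - 1.0, lambda x: np.array([1.0]))]
print(constrained_subgradient(f0, g0, constraints, x0=[5.0]))
```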