Abstract:
In Convex Optimization, numerical schemes are always developed for specific problem classes. One of the most important characteristics of such a class is the level of smoothness of the objective function. Methods for nonsmooth functions differ from methods for smooth ones: different growth rates of the second derivatives require different approximation techniques.
However, the level of smoothness of the objective is often difficult to estimate in advance. In this talk we present algorithms which adjust their behavior in accordance with the actual level of smoothness observed during the minimization process. Their only input parameter is the required accuracy of the solution.
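To illustrate the kind of adaptivity described above, here is a minimal sketch of a gradient method with a backtracking estimate of the local smoothness constant, in the spirit of universal gradient methods. The function names, the stopping rule, and the initial estimate `L0` are assumptions for this sketch, not part of the talk; the key point is that the target accuracy `eps` is the only accuracy-related input, while the smoothness estimate `L` is adjusted on the fly.

```python
import numpy as np

def adaptive_gradient_method(f, grad, x0, eps, L0=1.0, max_iters=1000):
    """Sketch of an adaptive gradient method: the smoothness estimate L
    is found by backtracking at every step, so no Lipschitz constant is
    supplied by the user. Names and stopping rule are illustrative."""
    x = x0.copy()
    L = L0
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= eps:  # simple stopping rule (assumption)
            break
        while True:
            y = x - g / L
            # Inexact quadratic upper bound with eps/2 slack; the slack
            # lets the test succeed even when the gradient is only
            # Hoelder continuous, not Lipschitz continuous.
            bound = f(x) + np.dot(g, y - x) + 0.5 * L * np.dot(y - x, y - x)
            if f(y) <= bound + 0.5 * eps:
                break
            L *= 2.0  # local model too optimistic: increase L
        x = y
        L = max(L0, 0.5 * L)  # let L decrease again on easier regions
    return x
```

For example, minimizing the smooth quadratic `f(x) = ||x||^2` from `x0 = [3, 4]` with `eps = 1e-6` drives the iterate close to the origin without the user ever specifying a Lipschitz constant; on a nonsmooth objective the same loop would simply settle on larger working values of `L`.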