PreMoLab Seminar
February 9, 2012 16:00, Moscow, A. A. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences (Bol'shoi Karetnyi per., 19), room 615
 


Random gradient-free minimization of convex functions

Yu. E. Nesterov

Université Catholique de Louvain


Abstract: In this talk, we prove complexity bounds for methods of Convex Optimization based only on computation of the function value. The search directions of our schemes are normally distributed random Gaussian vectors. It appears that such methods usually need at most $n$ times more iterations than the standard gradient methods, where $n$ is the dimension of the space of variables. This conclusion is true both for nonsmooth and smooth problems. For the latter class, we also present an accelerated scheme with the expected rate of convergence $O(n^2/k^2)$, where $k$ is the iteration counter. For Stochastic Optimization, we propose a zero-order scheme and justify its expected rate of convergence $O(n/k^{1/2})$. We also give some bounds on the rate of convergence of random gradient-free methods to stationary points of nonconvex functions, both in the smooth and nonsmooth cases. Our theoretical results are supported by preliminary computational experiments.
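The basic step the abstract alludes to can be illustrated in a few lines: draw a Gaussian direction, estimate the directional derivative from two function values, and take a gradient-like step. The following Python sketch is a minimal illustration of such a two-point scheme, not the talk's tuned method; the smoothing parameter mu, step size h, and the quadratic test function are illustrative assumptions.

import numpy as np

def gradient_free_step(f, x, mu=1e-5, h=0.05, rng=None):
    """One step of a two-point Gaussian random gradient-free scheme.

    A random Gaussian direction u and the forward difference
    (f(x + mu*u) - f(x)) / mu give an unbiased-in-the-limit estimate
    of the directional derivative; mu is the smoothing parameter and
    h the step size (illustrative values, not the talk's choices).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)            # random Gaussian direction
    g = (f(x + mu * u) - f(x)) / mu * u         # function-value-only oracle
    return x - h * g

# Usage on a simple convex quadratic f(x) = ||x||^2 / 2 in dimension 10.
if __name__ == "__main__":
    f = lambda x: 0.5 * x @ x
    x = np.ones(10)
    rng = np.random.default_rng(0)
    for k in range(5000):
        x = gradient_free_step(f, x, rng=rng)
    print(f"f(x) after 5000 steps: {f(x):.2e}")

Consistent with the $n$-fold slowdown stated above, each iteration uses two function evaluations and no gradients, and convergence on this $n = 10$ example is correspondingly slower than a plain gradient step with the same step size.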
 