Mathematics of Artificial Intelligence
December 1, 2023, 17:00, Moscow, Skoltech Applied AI Center, room R2-2023


Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance

Nikita Kornilov

Skolkovo Institute of Science and Technology


Abstract: We consider non-smooth stochastic convex optimization with two function evaluations per round under infinite noise variance. In the classical setting, when the noise has finite variance, an optimal algorithm is built upon the batched accelerated gradient method. This optimality is defined in terms of iteration and oracle complexity, as well as the maximal admissible level of adversarial noise. However, the assumption of finite variance is burdensome, and it may not hold in many practical scenarios. To address this, we demonstrate how to adapt a refined clipped version of the accelerated gradient (Stochastic Similar Triangles) method to a two-point zeroth-order oracle. This adaptation entails extending the batching technique to accommodate infinite variance, a non-trivial task that stands as a distinct contribution.
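
The following is a minimal Python sketch of two ingredients named in the abstract: a two-point zeroth-order gradient estimator and batched gradient clipping. It is an illustration under assumptions, not the speaker's algorithm: the objective f_noisy, the smoothing radius tau, the clipping level lam, the batch size, and the plain gradient step (used in place of the accelerated Stochastic Similar Triangles scheme) are all hypothetical choices.

import numpy as np

def two_point_grad_estimate(f_noisy, x, tau, rng):
    """Two function evaluations per round: a finite difference along a
    random direction e drawn uniformly from the unit sphere."""
    d = x.shape[0]
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)
    # Each oracle call may see fresh (possibly heavy-tailed) noise;
    # no common random numbers are assumed.
    return d * (f_noisy(x + tau * e) - f_noisy(x - tau * e)) / (2 * tau) * e

def clipped_batched_grad(f_noisy, x, tau, lam, batch, rng):
    """Average a batch of two-point estimates, then clip the result to
    norm lam; clipping is what keeps heavy-tailed (infinite-variance)
    noise under control."""
    g = np.mean([two_point_grad_estimate(f_noisy, x, tau, rng)
                 for _ in range(batch)], axis=0)
    norm = np.linalg.norm(g)
    return g if norm <= lam else g * (lam / norm)

# Toy usage: minimize ||x||_1 observed under heavy-tailed (Cauchy) noise,
# for which the variance is infinite.
rng = np.random.default_rng(0)
f_noisy = lambda x: np.abs(x).sum() + 0.01 * rng.standard_cauchy()
x = np.ones(5)
for t in range(200):
    g = clipped_batched_grad(f_noisy, x, tau=1e-3, lam=10.0, batch=8, rng=rng)
    x -= 0.05 * g  # plain gradient step for illustration; the talk uses acceleration
print(x)

Note that batching alone does not help here: an average of Cauchy variables is again Cauchy, which is why the clipping step is essential before the estimate enters the accelerated scheme.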

Supplementary materials: presentation.pdf (9.0 Mb)

Website: https://vk.com/wall-220010299_70
 