Colloquium of the Faculty of Computer Science, HSE University
February 1, 2022, 16:20–17:40, Moscow, Pokrovsky Boulevard 11
 


Approximation with neural networks of minimal size: exotic regimes and superexpressive activations

Dmitry Yarotsky


Abstract: I will discuss some "exotic" regimes arising in theoretical studies of function approximation by neural networks of minimal size. The classical theory predicts specific power laws relating the model complexity to the approximation accuracy for functions of given smoothness, under the assumption of continuous parameter selection. It turns out that these power laws can break down if we use very deep narrow networks and do not impose this assumption. This effect is observed for networks with common activation functions, e.g. ReLU. Moreover, there exist "superexpressive" collections of activation functions that theoretically allow one to approximate any continuous function with arbitrary accuracy using a network with a fixed number of neurons, i.e. only by suitably adjusting the weights, without increasing the number of neurons. This result is closely connected to the Kolmogorov(-Arnold) Superposition Theorem. An example of a superexpressive collection is {sin, arcsin}. At the same time, the commonly used activations are not superexpressive.
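The {sin, arcsin} example is easy to poke at numerically. Below is a minimal NumPy/SciPy sketch (an illustration only, not the construction behind the theorem: the width, target function, and optimizer are assumptions made for the demo) that fits a network of fixed size to a continuous target by adjusting its weights alone:

# Toy illustration: a fixed-size network with activations {sin, arcsin}
# is fit to a continuous target by optimizing weights only; the
# architecture never changes. Width, target, and optimizer are
# illustrative choices, not the construction from the talk.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 200)      # sample points in [-1, 1]
target = np.abs(xs) - 0.5             # continuous, non-smooth target

WIDTH = 8                             # fixed number of hidden neurons

def net(params, x):
    """Fixed-size net: sin in the hidden layer, arcsin on top of it."""
    w1, b1, w2, b2 = np.split(params, [WIDTH, 2 * WIDTH, 3 * WIDTH])
    h = np.sin(np.outer(x, w1) + b1)  # shape (n, WIDTH), values in [-1, 1]
    h = np.arcsin(h)                  # arcsin is well-defined on sin's range
    return h @ w2 + b2[0]

def loss(params):
    return np.mean((net(params, xs) - target) ** 2)

p0 = 0.5 * rng.standard_normal(3 * WIDTH + 1)
res = minimize(loss, p0, method="L-BFGS-B")
print(f"MSE after fitting weights only: {loss(res.x):.2e}")

Note that the superexpressiveness result is about what weight settings exist in principle for a fixed architecture, not about what a generic optimizer finds; the sketch only shows the "weights-only" regime the abstract describes.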

Website: https://cs.hse.ru/announcements/556734483.html
 