DSI Learning Club – Ohad Shamir, Weizmann Institute of Science (Faculty)
November 11, 2018 @ 12:00 pm - 1:00 pm IST
Location: Gonda Building (901), Room 101.
Optimization Landscape of Neural Networks: Where Do the Local Minima Hide?
Training neural networks is a highly non-convex optimization problem, yet it is often solved successfully in practice, and the reasons for this are poorly understood. Much recent work has focused on showing that these non-convex problems do not suffer from poor local minima. However, this has been rigorously established only under strong assumptions or in highly restrictive settings. In this talk, I'll describe some recent results on this topic, both positive and negative. On the negative side, I'll show how local minima can be ubiquitous even when optimizing simple, one-hidden-layer networks under favorable data distributions. On the flip side, I'll discuss how looking at other architectures (such as residual units), or modifying the question, can lead to positive results under mild assumptions.
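The non-convexity the abstract refers to is easy to observe directly. The sketch below (an illustrative toy experiment, not from the talk) trains a one-hidden-layer ReLU network on data generated by a planted "teacher" network of the same size, using plain gradient descent from several random initializations. Because the loss landscape is non-convex, different initializations can converge to different final loss values; all network sizes, learning rates, and step counts here are arbitrary choices for illustration.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(W, v, X):
    # One-hidden-layer network: f(x) = v^T relu(W x)
    return relu(X @ W.T) @ v

def loss(W, v, X, y):
    # Mean squared error on the dataset
    return 0.5 * np.mean((forward(W, v, X) - y) ** 2)

def train(seed, X, y, k=4, lr=0.05, steps=3000):
    # Gradient descent from a random initialization given by `seed`
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(size=(k, d))
    v = rng.normal(size=k)
    for _ in range(steps):
        H = X @ W.T                # pre-activations, shape (n, k)
        A = relu(H)                # hidden activations
        r = A @ v - y              # residuals
        grad_v = A.T @ r / n
        # Chain rule through the ReLU: gradient flows only where H > 0
        grad_W = ((np.outer(r, v) * (H > 0)).T @ X) / n
        v -= lr * grad_v
        W -= lr * grad_W
    return loss(W, v, X, y)

# Data from a planted teacher network, so a zero-loss solution exists
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
W_true = rng.normal(size=(4, 2))
v_true = rng.normal(size=4)
y = forward(W_true, v_true, X)

# Final losses across random initializations; a spread of values
# indicates distinct basins in the optimization landscape
final_losses = sorted(train(s, X, y) for s in range(8))
print(final_losses)
```

Since the data is realizable by construction, any run that plateaus at a clearly non-zero loss has stalled away from a global minimum, which is the phenomenon the negative results in the talk make precise.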