On the Expressive Power of ConvNets and RNNs as a Function of their Architecture by Or Sharir (DSI Learning Club)
May 31, 2018 @ 10:00 am - 11:00 am IDT
Speaker: Or Sharir (PhD student, The Hebrew University of Jerusalem)
Location: Gonda Building (901), Room 101.
On the Expressive Power of ConvNets and RNNs as a Function of their Architecture
The driving force behind convolutional and recurrent networks — two of the most successful deep learning architectures to date — is their expressive power. Despite the wide acceptance of this belief and vast empirical evidence for it, formal analyses supporting it are scarce. The primary notions for formally reasoning about expressiveness are efficiency and inductive bias. Efficiency refers to the ability of a network architecture to realize functions that would require an alternative architecture to be much larger. Inductive bias refers to the prioritization of some functions over others, given prior knowledge regarding the task at hand. Through an equivalence to hierarchical tensor decompositions, we study the expressive efficiency and inductive bias of various architectural features in convolutional networks (depth, width, pooling geometry, inter-connectivity, overlapping receptive fields, etc.), as well as the long-term memory capacity of deep recurrent networks. Our results shed light on the demonstrated effectiveness of modern networks and, in addition, provide new tools for network design.