

Poster in Workshop: Optimization for ML Workshop

Neural Networks with Complex-Valued Weights Have No Spurious Local Minima

Xingtu Liu


Abstract:

We study the benefits of complex-valued weights for neural networks. We prove that shallow complex neural networks with quadratic activations have no spurious local minima. In contrast, shallow real neural networks with quadratic activations have infinitely many spurious local minima under the same conditions. In addition, we provide specific examples to demonstrate that complex-valued weights turn poor local minima into saddle points.
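To make the objects under study concrete, here is a minimal sketch of a shallow network with quadratic activations in both real and complex parameterizations. The specific form f(x) = Σ_j (w_j · x)², the data, and all names below are assumptions for illustration, not the paper's exact setup or examples; the sketch only shows how the same squared loss is evaluated with real versus complex weights.

```python
# Minimal sketch (assumed setup, not the paper's construction) of a shallow
# network with quadratic activations, evaluated with real vs. complex weights.

import numpy as np

rng = np.random.default_rng(0)

def forward(W, X):
    """Shallow quadratic network: f(x) = sum_j (w_j . x)^2.

    W: (k, d) weight matrix (real or complex); X: (n, d) real inputs.
    Returns the (n,) vector of network outputs.
    """
    pre = X @ W.T              # (n, k) pre-activations w_j . x
    return np.sum(pre ** 2, axis=1)

def loss(W, X, y):
    """Mean squared loss; with complex W the outputs may be complex,
    so the residual is measured with the squared modulus."""
    r = forward(W, X) - y
    return np.mean(np.abs(r) ** 2)

n, d, k = 64, 5, 3
X = rng.standard_normal((n, d))

# Real-weight teacher generates the (real) targets.
W_star = rng.standard_normal((k, d))
y = forward(W_star, X)

# Same loss, evaluated at a random real and a random complex student.
W_real = rng.standard_normal((k, d))
W_cplx = rng.standard_normal((k, d)) + 1j * rng.standard_normal((k, d))

print("real-weight loss   :", loss(W_real, X, y))
print("complex-weight loss:", loss(W_cplx, X, y))
```

Complexifying the weights doubles the real parameter count, which is one intuition for why critical points that are local minima in the real landscape can gain descent directions (and become saddle points) in the complex one; the paper's results make this precise for the quadratic-activation case.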
