2019-01-23 Paperman #2

Author: 朱小虎XiaohuZhu | Published 2019-01-23 01:03

    PROBABILISTIC SYMMETRY AND INVARIANT NEURAL NETWORKS

    Authors: Benjamin Bloem-Reddy and Yee Whye Teh
    Institute: University of Oxford

    In an effort to improve the performance of deep neural networks in data-scarce, non-i.i.d., or unsupervised settings, much recent research has been devoted to encoding invariance under symmetry transformations into neural network architectures. We treat the neural network input and output as random variables, and consider group invariance from the perspective of probabilistic symmetry. Drawing on tools from probability and statistics, we establish a link between functional and probabilistic symmetry, and obtain generative functional representations of joint and conditional probability distributions that are invariant or equivariant under the action of a compact group.
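    For reference, the definitions the abstract relies on are standard (not specific to this paper). For a compact group $G$ acting on the input space and output space:

        f(g \cdot x) = f(x)                 \quad \forall g \in G   % invariance of f
        f(g \cdot x) = g \cdot f(x)         \quad \forall g \in G   % equivariance of f
        g \cdot X \stackrel{d}{=} X         \quad \forall g \in G   % probabilistic symmetry: invariance in distribution

    The paper's contribution is to connect the first two (functional) notions with the third (probabilistic) one.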

    Those representations completely characterize the structure of neural networks that can be used to model such distributions and yield a general program for constructing invariant stochastic or deterministic neural networks. We develop the details of the general program for exchangeable sequences and arrays, recovering a number of recent examples as special cases.
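    One of the special cases recovered for exchangeable sequences is the sum-pooling ("Deep Sets") architecture of Zaheer et al. A minimal sketch of that architecture, assuming PyTorch; the class name and layer sizes below are illustrative, not taken from the paper:

    import torch
    import torch.nn as nn

    class DeepSets(nn.Module):
        """Permutation-invariant network for exchangeable sequences:
        f(x_1, ..., x_n) = rho(sum_i phi(x_i)). Sum-pooling over the
        sequence axis makes the output invariant to reordering."""

        def __init__(self, in_dim, hidden_dim, out_dim):
            super().__init__()
            self.phi = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
            self.rho = nn.Linear(hidden_dim, out_dim)

        def forward(self, x):                  # x: (batch, n, in_dim)
            pooled = self.phi(x).sum(dim=1)    # invariant pooling over elements
            return self.rho(pooled)

    # Permuting the sequence leaves the output unchanged (up to float error):
    net = DeepSets(3, 16, 1)
    x = torch.randn(2, 5, 3)
    perm = torch.randperm(5)
    assert torch.allclose(net(x), net(x[:, perm]), atol=1e-6)

    Because the sum commutes with any permutation of the inputs, the network is invariant under the symmetric group acting on the sequence, which is exactly the exchangeability structure the abstract refers to.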
