Which of the following statements are true? Check all that apply.
(√)A. The activation values of the hidden units in a neural network, with the sigmoid activation function applied at every layer, are always in the range (0, 1).
(√)B. Any logical function over binary-valued (0 or 1) inputs x1 and x2 can be (approximately) represented using some neural network.
(x)C. Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let $a^{(3)}_1 = (h_\Theta(x))_1$ be the activation of the first output unit, and similarly $a^{(3)}_2 = (h_\Theta(x))_2$ and $a^{(3)}_3 = (h_\Theta(x))_3$. Then for any input x, it must be the case that $a^{(3)}_1 + a^{(3)}_2 + a^{(3)}_3 = 1$.
False: the three output units are independent sigmoid units (nodes in the same layer do not influence one another), so nothing constrains their activations to sum to 1; that constraint would only hold with a softmax output. See the sketch after this list.
(x)D. A two layer (one input layer, one output layer; no hidden layer) neural network can represent the XOR function.
False: XOR is not linearly separable, so a network with only an input and an output layer cannot represent it; a 3-layer network (i.e., one with a hidden layer) is needed, as in the sketch below.
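A minimal sketch (not the course's reference code) illustrating these answers with NumPy. The gate weights (-30, 20, 20, etc.) follow the common "logic gates from sigmoid units" construction and, like the pre-activation values used for option C, are assumptions chosen for illustration.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation; its output is always strictly inside (0, 1) -- option A."""
    return 1.0 / (1.0 + np.exp(-z))

# Logic gates as single sigmoid units over inputs x1, x2 in {0, 1} (option B).
def AND(x1, x2):
    return sigmoid(-30 + 20 * x1 + 20 * x2)   # ~1 only when both inputs are 1

def OR(x1, x2):
    return sigmoid(-10 + 20 * x1 + 20 * x2)   # ~1 when either input is 1

def NOT(x1):
    return sigmoid(10 - 20 * x1)               # ~1 when the input is 0

# Option D: XOR needs a hidden layer, e.g. x1 XOR x2 = (x1 OR x2) AND NOT(x1 AND x2).
def XOR(x1, x2):
    h1 = OR(x1, x2)           # hidden unit 1
    h2 = NOT(AND(x1, x2))     # hidden unit 2 (NAND)
    return AND(h1, h2)        # output unit combines the hidden units

for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"XOR({x1}, {x2}) ~= {XOR(x1, x2):.3f}")

# Option C: three independent sigmoid output units generally do NOT sum to 1.
a3 = sigmoid(np.array([2.0, -1.0, 0.5]))       # hypothetical pre-activations z^(3)
print("outputs:", np.round(a3, 3), "sum:", round(float(a3.sum()), 3))  # sum != 1
```

Running it prints XOR values close to 0, 1, 1, 0 for the four input pairs, and three sigmoid outputs whose sum is clearly not 1, which is why C is false and D would only become true with a hidden layer.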