tf.losses.mean_squared_error

Author: 西方失败9527 | Published 2019-04-08 10:22

    From the TensorFlow source code:

    def mean_squared_error(
        labels, predictions, weights=1.0, scope=None,
        loss_collection=ops.GraphKeys.LOSSES,
        reduction=Reduction.SUM_BY_NONZERO_WEIGHTS):
      """Adds a Sum-of-Squares loss to the training procedure.

      `weights` acts as a coefficient for the loss. If a scalar is provided, then
      the loss is simply scaled by the given value. If `weights` is a tensor of size
      `[batch_size]`, then the total loss for each sample of the batch is rescaled
      by the corresponding element in the `weights` vector. If the shape of
      `weights` matches the shape of `predictions`, then the loss of each
      measurable element of `predictions` is scaled by the corresponding value of
      `weights`.

      Args:
        labels: The ground truth output tensor, same dimensions as 'predictions'.
        predictions: The predicted outputs.
        weights: Optional `Tensor` whose rank is either 0, or the same rank as
          `labels`, and must be broadcastable to `labels` (i.e., all dimensions must
          be either `1`, or the same as the corresponding `losses` dimension).
        scope: The scope for the operations performed in computing the loss.
        loss_collection: collection to which the loss will be added.
        reduction: Type of reduction to apply to loss.

      Returns:
        Weighted loss float `Tensor`. If `reduction` is `NONE`, this has the same
        shape as `labels`; otherwise, it is scalar.
      """
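    The weighting rules quoted in the docstring can be illustrated with a small plain-NumPy sketch. This mimics the default `SUM_BY_NONZERO_WEIGHTS` reduction (weighted sum of squared errors divided by the count of nonzero weights); it is an illustration of the documented behavior, not TensorFlow's internal code:

    ```python
    import numpy as np

    labels = np.array([[0.0, 1.0], [2.0, 3.0]])
    predictions = np.array([[0.5, 1.0], [1.0, 3.0]])
    sq_err = (labels - predictions) ** 2  # element-wise squared error

    # Scalar weight: the loss is simply scaled by the given value;
    # all 4 elements carry a nonzero weight, so the divisor is 4.
    scalar_loss = (2.0 * sq_err).sum() / 4

    # Full-shape weight: each element is scaled individually, and the
    # divisor is the number of nonzero weights (here 2).
    w = np.array([[1.0, 1.0], [0.0, 0.0]])
    masked_loss = (w * sq_err).sum() / np.count_nonzero(w)

    print(scalar_loss, masked_loss)  # 0.625 0.125
    ```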

    The `weights` argument is very handy here: it lets you mask the inputs so that only part of `labels` and `predictions` contributes to the MSE, and the divisor is the number of elements that actually enter the computation.

    import tensorflow as tf

    # Use float tensors; with integer tensors the squared error would be
    # integer-valued and the float weights would not match the dtype.
    a = tf.constant([[1., 2.], [3., 4.]])
    b = tf.constant([[2., 1.], [5., 6.]])
    # Mask: only the first column contributes to the loss.
    mask = tf.constant([[1., 0.], [1., 0.]])

    # MSE over the masked elements only (a = labels, b = predictions):
    recon_loss = tf.losses.mean_squared_error(a, b, mask)

    result = tf.Session().run(recon_loss)
    print(result)

    Output: [(1-2)^2 + (3-5)^2] / 2 = 2.5
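    As a cross-check, the same number can be reproduced without TensorFlow. The sketch below reimplements the `SUM_BY_NONZERO_WEIGHTS` formula in plain NumPy (a hedged illustration, not TensorFlow's actual implementation; it assumes `weights` is broadcastable to the shape of `labels`):

    ```python
    import numpy as np

    def mse_sum_by_nonzero_weights(labels, predictions, weights):
        """Weighted sum of squared errors divided by the nonzero-weight count."""
        labels = np.asarray(labels, dtype=float)
        predictions = np.asarray(predictions, dtype=float)
        w = np.broadcast_to(np.asarray(weights, dtype=float), labels.shape)
        weighted_sq_err = w * (labels - predictions) ** 2
        return weighted_sq_err.sum() / max(np.count_nonzero(w), 1)

    print(mse_sum_by_nonzero_weights([[1, 2], [3, 4]],
                                     [[2, 1], [5, 6]],
                                     [[1, 0], [1, 0]]))  # 2.5
    ```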


    Original post: https://www.haomeiwen.com/subject/yhtuiqtx.html