02-1 Main Functional Modules of tensorflow.nn

Author: 南墙已破 | Published 2016-10-28 14:01
    Activation Functions
    • relu
    • relu6
    • elu
    • softplus
    • softsign
    • dropout
    • bias_add
    • sigmoid
    • tanh
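
    As a quick illustration of the activation ops above, here is a minimal sketch assuming the TensorFlow 1.x tf.nn API; the tensor values and shapes are arbitrary examples.

    import tensorflow as tf

    x = tf.constant([[-1.0, 0.0, 2.0],
                     [ 3.0, -4.0, 5.0]])

    relu_out    = tf.nn.relu(x)        # max(x, 0), element-wise
    sigmoid_out = tf.nn.sigmoid(x)     # 1 / (1 + exp(-x))
    tanh_out    = tf.nn.tanh(x)
    drop_out    = tf.nn.dropout(x, keep_prob=0.5)  # zero elements at random, scale the rest by 1/keep_prob

    with tf.Session() as sess:
        print(sess.run([relu_out, sigmoid_out]))
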
    Convolution
    • conv2d
    • depthwise_conv2d
    • separable_conv2d
    • atrous_conv2d
    • conv2d_transpose
    • conv3d
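
    A minimal conv2d sketch, assuming the TensorFlow 1.x API and NHWC layout; the image and kernel shapes here are illustrative only.

    import tensorflow as tf

    # Input: batch of 1 image, 28x28, 1 channel (NHWC layout).
    images  = tf.random_normal([1, 28, 28, 1])
    # Filter: 3x3 kernel, 1 input channel, 16 output channels.
    kernels = tf.get_variable("kernels", [3, 3, 1, 16])

    features = tf.nn.conv2d(images, kernels,
                            strides=[1, 1, 1, 1],  # stride 1 in every dimension
                            padding="SAME")        # output keeps the 28x28 spatial size
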
    Pooling
    • avg_pool
    • max_pool
    • max_pool_with_argmax
    • avg_pool3d
    • max_pool3d
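
    A short pooling sketch under the same TF 1.x / NHWC assumptions as above.

    import tensorflow as tf

    features = tf.random_normal([1, 28, 28, 16])

    pooled = tf.nn.max_pool(features,
                            ksize=[1, 2, 2, 1],    # 2x2 window over height and width
                            strides=[1, 2, 2, 1],  # non-overlapping windows
                            padding="VALID")       # result shape: [1, 14, 14, 16]

    averaged = tf.nn.avg_pool(features, ksize=[1, 2, 2, 1],
                              strides=[1, 2, 2, 1], padding="VALID")
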
    Morphological filtering

    Morphological operators are non-linear filters used in image processing.

    • dilation2d
    • erosion2d
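
    A sketch of grayscale dilation and erosion, assuming the TensorFlow 1.x signature with strides, rates, and padding arguments; shapes are illustrative.

    import tensorflow as tf

    image  = tf.random_normal([1, 28, 28, 1])   # [batch, height, width, depth]
    struct = tf.zeros([3, 3, 1])                # flat 3x3 structuring element, [height, width, depth]

    dilated = tf.nn.dilation2d(image, struct,
                               strides=[1, 1, 1, 1],
                               rates=[1, 1, 1, 1],   # no dilation of the structuring element itself
                               padding="SAME")
    eroded  = tf.nn.erosion2d(image, struct,
                              strides=[1, 1, 1, 1],
                              rates=[1, 1, 1, 1],
                              padding="SAME")
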
    Normalization
    • l2_normalize
    • local_response_normalization
    • sufficient_statistics
    • normalize_moments
    • moments
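
    A minimal normalization sketch, assuming the TF 1.x API (including the older dim argument of l2_normalize); the offset and scale here are just placeholders for learned parameters.

    import tensorflow as tf

    x = tf.random_normal([32, 100])                 # batch of 32 feature vectors

    # Per-feature mean and variance across the batch dimension.
    mean, variance = tf.nn.moments(x, axes=[0])

    offset = tf.zeros([100])
    scale  = tf.ones([100])
    normed = tf.nn.batch_normalization(x, mean, variance, offset, scale,
                                       variance_epsilon=1e-5)

    # Unit-length rows via L2 normalization.
    unit_rows = tf.nn.l2_normalize(x, dim=1)
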
    Losses
    • l2_loss
    • log_poisson_loss
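
    A brief loss sketch under the same TF 1.x assumptions; the target counts are made up for illustration.

    import tensorflow as tf

    weights = tf.random_normal([100, 10])

    # l2_loss(t) computes sum(t ** 2) / 2, commonly used for weight decay.
    decay = tf.nn.l2_loss(weights)

    # log_poisson_loss compares a predicted log-rate against observed counts.
    log_rate = tf.random_normal([8])
    targets  = tf.constant([0., 1., 2., 0., 3., 1., 0., 2.])
    poisson  = tf.nn.log_poisson_loss(targets, log_rate)
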
    Classification

    TensorFlow provides several operations that help you perform classification.

    • sigmoid_cross_entropy_with_logits
    • softmax
    • log_softmax
    • softmax_cross_entropy_with_logits
    • sparse_softmax_cross_entropy_with_logits
    • weighted_cross_entropy_with_logits
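
    A minimal sketch of the cross-entropy ops, assuming the TF 1.x keyword-argument form; the logits and labels are random placeholders.

    import tensorflow as tf

    logits = tf.random_normal([32, 10])             # unnormalized scores for 10 classes
    labels = tf.one_hot(tf.random_uniform([32], maxval=10, dtype=tf.int32), depth=10)

    probs = tf.nn.softmax(logits)

    # Numerically stable cross entropy computed directly from logits.
    xent = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

    # The sparse variant takes integer class ids instead of one-hot labels.
    class_ids   = tf.random_uniform([32], maxval=10, dtype=tf.int32)
    sparse_xent = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=class_ids,
                                                                 logits=logits)

    loss = tf.reduce_mean(xent)
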
    Embeddings

    TensorFlow provides library support for looking up values in embedding tensors.

    • embedding_lookup
    • embedding_lookup_sparse
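
    A minimal embedding-lookup sketch, assuming TF 1.x; the vocabulary size, embedding width, and word ids are illustrative.

    import tensorflow as tf

    vocab_size, embed_dim = 10000, 128
    embeddings = tf.get_variable("embeddings", [vocab_size, embed_dim])

    # Look up the rows of `embeddings` for a batch of word ids.
    word_ids = tf.constant([[1, 42, 7], [3, 3, 999]])         # shape [batch, sequence]
    vectors  = tf.nn.embedding_lookup(embeddings, word_ids)   # shape [2, 3, 128]
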
    Recurrent Neural Networks

    TensorFlow provides a number of methods for constructing Recurrent Neural Networks. Most accept an RNNCell-subclassed object (see the documentation for tf.nn.rnn_cell).

    • dynamic_rnn
    • rnn
    • state_saving_rnn
    • bidirectional_rnn
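
    A dynamic_rnn sketch with an RNNCell from tf.nn.rnn_cell, assuming a TF 1.x version where BasicLSTMCell lives under that module; batch size, sequence length, and cell size are arbitrary.

    import tensorflow as tf

    inputs  = tf.random_normal([32, 20, 128])       # [batch, time, features]
    lengths = tf.fill([32], 20)                     # actual sequence length per example

    cell = tf.nn.rnn_cell.BasicLSTMCell(num_units=256)

    # dynamic_rnn unrolls the cell over the time dimension at run time.
    outputs, final_state = tf.nn.dynamic_rnn(cell, inputs,
                                             sequence_length=lengths,
                                             dtype=tf.float32)
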
    Connectionist Temporal Classification (CTC)
    • ctc_loss
    • ctc_greedy_decoder
    • ctc_beam_search_decoder
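
    A rough sketch of CTC decoding, assuming the TF 1.x convention that logits are time-major ([max_time, batch, num_classes]); the alphabet size and frame counts are illustrative.

    import tensorflow as tf

    num_classes = 28                                 # e.g. 26 letters + space + CTC blank
    logits  = tf.random_normal([50, 8, num_classes]) # [max_time, batch, num_classes]
    lengths = tf.fill([8], 50)                       # frames per example

    # Greedy (best-path) decoding; ctc_beam_search_decoder is the drop-in alternative.
    # Returns a list of sparse decoded sequences plus per-example score.
    decoded, neg_sum_logits = tf.nn.ctc_greedy_decoder(logits, lengths)
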
    Evaluation

    The evaluation ops are useful for measuring the performance of a network. Since they are nondifferentiable, they are typically used at evaluation time.

    • top_k
    • in_top_k
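
    A small evaluation sketch under the same TF 1.x assumptions, computing a top-5 hit rate from random placeholder logits.

    import tensorflow as tf

    logits  = tf.random_normal([32, 1000])
    targets = tf.random_uniform([32], maxval=1000, dtype=tf.int32)

    # The k largest logits and their indices per example.
    values, indices = tf.nn.top_k(logits, k=5)

    # Boolean per example: is the target class among the top-5 predictions?
    hits      = tf.nn.in_top_k(logits, targets, k=5)
    top5_rate = tf.reduce_mean(tf.cast(hits, tf.float32))
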
    Candidate Sampling

    Do you want to train a multiclass or multilabel model with thousands or millions of output classes (for example, a language model with a large vocabulary)? Training with a full Softmax is slow in this case, since all of the classes are evaluated for every training example. Candidate Sampling training algorithms can speed up your step times by only considering a small randomly-chosen subset of contrastive classes (called candidates) for each batch of training examples.

    Sampled Loss Functions

    TensorFlow provides the following sampled loss functions for faster training.

    • nce_loss
    • sampled_softmax_loss
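
    A minimal NCE-loss sketch, assuming the TF 1.x keyword arguments; the vocabulary size, hidden width, and number of sampled negatives are illustrative.

    import tensorflow as tf

    vocab_size, embed_dim, num_sampled = 10000, 128, 64

    hidden  = tf.random_normal([32, embed_dim])      # activations feeding the output layer
    labels  = tf.random_uniform([32, 1], maxval=vocab_size, dtype=tf.int64)
    weights = tf.get_variable("out_w", [vocab_size, embed_dim])
    biases  = tf.get_variable("out_b", [vocab_size], initializer=tf.zeros_initializer())

    # NCE loss evaluates only `num_sampled` negative classes per batch
    # instead of the full vocabulary-sized softmax.
    loss = tf.reduce_mean(
        tf.nn.nce_loss(weights=weights, biases=biases,
                       labels=labels, inputs=hidden,
                       num_sampled=num_sampled, num_classes=vocab_size))
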
    Candidate Samplers

    TensorFlow provides the following samplers for randomly sampling candidate classes when using one of the sampled loss functions above.

    • uniform_candidate_sampler
    • log_uniform_candidate_sampler
    • learned_unigram_candidate_sampler
    • fixed_unigram_candidate_sampler
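
    A sketch of drawing candidates directly with one of the samplers above, assuming the TF 1.x signature; the class counts are illustrative.

    import tensorflow as tf

    vocab_size, num_sampled = 10000, 64
    true_ids = tf.random_uniform([32, 1], maxval=vocab_size, dtype=tf.int64)

    # Sample 64 candidate classes under a log-uniform (Zipfian) distribution,
    # which roughly matches word frequencies when ids are sorted by frequency.
    sampled, true_expected, sampled_expected = tf.nn.log_uniform_candidate_sampler(
        true_classes=true_ids,
        num_true=1,
        num_sampled=num_sampled,
        unique=True,
        range_max=vocab_size)
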
    Miscellaneous candidate sampling utilities
    • compute_accidental_hits
    Other Functions and Classes
    • batch_normalization
    • depthwise_conv2d_native
