https://zhuanlan.zhihu.com/p/61379965
https://zhuanlan.zhihu.com/p/98785902
https://zhuanlan.zhihu.com/p/258395701
1. L1Loss
2. MSELoss
3. CrossEntropyLoss
4. NLLLoss
5. PoissonNLLLoss
6. KLDivLoss
7. BCELoss
8. BCEWithLogitsLoss
9. MarginRankingLoss
10. HingeEmbeddingLoss
11. MultiLabelMarginLoss
12. SmoothL1Loss
13. SoftMarginLoss
14. MultiLabelSoftMarginLoss
15. CosineEmbeddingLoss
16. MultiMarginLoss
17. TripletMarginLoss
18. CTCLoss
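All of the losses listed above live in `torch.nn` and are called the same way: construct the criterion, then apply it to `(input, target)`. A minimal sketch (values are made up for illustration) comparing the three element-wise regression losses from the list:

```python
import torch
import torch.nn as nn

# Same prediction/target pair fed to three regression criteria.
pred = torch.tensor([0.5, 1.0, 2.0])
target = torch.tensor([1.0, 1.0, 1.0])

l1 = nn.L1Loss()(pred, target)            # mean absolute error
mse = nn.MSELoss()(pred, target)          # mean squared error
smooth = nn.SmoothL1Loss()(pred, target)  # quadratic below beta=1.0, linear above

print(l1.item(), mse.item(), smooth.item())
```

All three default to `reduction='mean'`; pass `reduction='sum'` or `'none'` to change how per-element losses are aggregated.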
PyTorch
Loss | Description
---|---
nn.L1Loss | Creates a criterion that measures the mean absolute error (MAE) between each element in the input x and target y.
nn.MSELoss | Creates a criterion that measures the mean squared error (squared L2 norm) between each element in the input x and target y.
nn.CrossEntropyLoss | This criterion combines LogSoftmax and NLLLoss in one single class.
nn.CTCLoss | The Connectionist Temporal Classification loss.
nn.NLLLoss | The negative log likelihood loss.
nn.PoissonNLLLoss | Negative log likelihood loss with Poisson distribution of target.
nn.GaussianNLLLoss | Gaussian negative log likelihood loss.
nn.KLDivLoss | The Kullback-Leibler divergence loss.
nn.BCELoss | Creates a criterion that measures the Binary Cross Entropy between the target and the output.
nn.BCEWithLogitsLoss | This loss combines a Sigmoid layer and the BCELoss in one single class.
nn.MarginRankingLoss | Creates a criterion that measures the loss given inputs x1, x2 (two 1D mini-batch Tensors) and a 1D mini-batch label tensor y (containing 1 or -1).
nn.HingeEmbeddingLoss | Measures the loss given an input tensor x and a labels tensor y (containing 1 or -1).
nn.MultiLabelMarginLoss | Creates a criterion that optimizes a multi-class multi-classification hinge loss (margin-based loss) between input x (a 2D mini-batch Tensor) and output y (a 2D Tensor of target class indices).
nn.HuberLoss | Creates a criterion that uses a squared term if the absolute element-wise error falls below delta and a delta-scaled L1 term otherwise.
nn.SmoothL1Loss | Creates a criterion that uses a squared term if the absolute element-wise error falls below beta and an L1 term otherwise.
nn.SoftMarginLoss | Creates a criterion that optimizes a two-class classification logistic loss between input tensor x and target tensor y (containing 1 or -1).
nn.MultiLabelSoftMarginLoss | Creates a criterion that optimizes a multi-label one-versus-all loss based on max-entropy, between input x and target y of size (N, C).
nn.CosineEmbeddingLoss | Creates a criterion that measures the loss given input tensors x1, x2 and a Tensor label y with values 1 or -1.
nn.MultiMarginLoss | Creates a criterion that optimizes a multi-class classification hinge loss (margin-based loss) between input x (a 2D mini-batch Tensor) and output y (a 1D tensor of target class indices, 0 ≤ y ≤ x.size(1) − 1).
nn.TripletMarginLoss | Creates a criterion that measures the triplet loss given input tensors x1, x2, x3 and a margin with a value greater than 0.
nn.TripletMarginWithDistanceLoss | Creates a criterion that measures the triplet loss given input tensors a, p, and n (representing anchor, positive, and negative examples, respectively), and a nonnegative, real-valued function ("distance function") used to compute the relationship between the anchor and positive example ("positive distance") and the anchor and negative example ("negative distance").
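The two "combined" criteria in the table can be checked numerically: CrossEntropyLoss applied to raw logits matches NLLLoss applied to LogSoftmax output, and BCEWithLogitsLoss matches BCELoss applied to sigmoid output. A small sketch with made-up numbers:

```python
import torch
import torch.nn as nn

# Multi-class case: CrossEntropyLoss == LogSoftmax + NLLLoss.
logits = torch.tensor([[1.0, 2.0, 0.5],
                       [0.1, -1.0, 3.0]])
target = torch.tensor([1, 2])  # class indices
ce = nn.CrossEntropyLoss()(logits, target)
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), target)
assert torch.allclose(ce, nll)

# Binary case: BCEWithLogitsLoss == Sigmoid + BCELoss.
raw = torch.tensor([0.3, -1.2, 2.0])
labels = torch.tensor([1.0, 0.0, 1.0])
bce_logits = nn.BCEWithLogitsLoss()(raw, labels)
bce = nn.BCELoss()(torch.sigmoid(raw), labels)
assert torch.allclose(bce_logits, bce)

print(ce.item(), bce_logits.item())
```

In practice the fused versions (CrossEntropyLoss, BCEWithLogitsLoss) are preferred: feeding raw logits is numerically more stable than applying softmax/sigmoid yourself and then taking logs.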