
[PyTorch Function Notes (3)] torch.nn.BCELoss()

import torch.nn as nn
nn.BCELoss(weight=None, size_average=None, reduce=None, reduction='mean')

(`size_average` and `reduce` are deprecated; use `reduction` instead.)

1. Introduction to torch.nn.BCELoss()

BCELoss() computes the binary cross-entropy loss between the target values and the predicted values. Its formula is:

$$l_n = -w_n \cdot \left[ y_n \cdot \log x_n + (1 - y_n) \cdot \log(1 - x_n) \right]$$

Here, $w_n$ is the weight matrix, $x_n$ is the predicted-value matrix (the input after it has been passed through the activation function), and $y_n$ is the target-value matrix. (Note that $\log$ is base $e$, i.e. the natural logarithm $\ln$.)
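As a quick check that the $\log$ above really is the natural logarithm, one can compare a hand computation of the formula against `nn.BCELoss` on a single element. The sketch below is not from the original post; the prediction value 0.7 is an arbitrary illustrative number.

```python
import math
import torch
import torch.nn as nn

# One prediction x_n (already sigmoid-activated) and one target y_n, with w_n = 1.
x, y = 0.7, 1.0

# Hand-computed formula: l_n = -[y*ln(x) + (1-y)*ln(1-x)]
manual = -(y * math.log(x) + (1 - y) * math.log(1 - x))

# Default reduction='mean'; with a single element the mean equals l_n itself.
loss = nn.BCELoss()
auto = loss(torch.tensor([x]), torch.tensor([y])).item()

print(manual, auto)  # both ≈ 0.3567 = -ln(0.7), so log is indeed base e
```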

2. Applying torch.nn.BCELoss()

Code:

import torch
import torch.nn as nn

weights = torch.tensor([[1., 1., 0.],
                        [1., 1., 1.],
                        [1., 1., 1.]])
m = nn.Sigmoid()
loss = nn.BCELoss(weight=weights, reduction='none')
input = torch.tensor([[-0.1514,  0.0744, -1.5716],
                      [-0.3198, -1.2424, -1.4921],
                      [ 0.5548,  0.8131,  1.0369]], requires_grad=True)
target = torch.tensor([[0., 1., 0.],
                       [0., 1., 1.],
                       [0., 0., 0.]])
output = loss(m(input), target)
print(m(input))   # input after the sigmoid activation
print(target)     # target matrix
print(weights)    # weight matrix
print(output)     # per-element loss matrix


Output:

tensor([[0.4622, 0.5186, 0.1720],
        [0.4207, 0.2240, 0.1836],
        [0.6352, 0.6928, 0.7383]], grad_fn=<SigmoidBackward>)
tensor([[0., 1., 0.],
        [0., 1., 1.],
        [0., 0., 0.]])
tensor([[1., 1., 0.],
        [1., 1., 1.],
        [1., 1., 1.]])
tensor([[0.6203, 0.6566, 0.0000],
        [0.5460, 1.4960, 1.6950],
        [1.0085, 1.1802, 1.3404]], grad_fn=<BinaryCrossEntropyBackward>)
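Besides `reduction='none'`, BCELoss also accepts `'mean'` and `'sum'`, which collapse the per-element loss matrix to a scalar. The sketch below (reusing the tensors from the example above; not part of the original post) highlights one detail worth knowing: with a weight matrix, `'mean'` divides by the number of elements, not by the sum of the weights.

```python
import torch
import torch.nn as nn

m = nn.Sigmoid()
weights = torch.tensor([[1., 1., 0.],
                        [1., 1., 1.],
                        [1., 1., 1.]])
input = torch.tensor([[-0.1514,  0.0744, -1.5716],
                      [-0.3198, -1.2424, -1.4921],
                      [ 0.5548,  0.8131,  1.0369]])
target = torch.tensor([[0., 1., 0.],
                       [0., 1., 1.],
                       [0., 0., 0.]])

per_elem  = nn.BCELoss(weight=weights, reduction='none')(m(input), target)
mean_loss = nn.BCELoss(weight=weights, reduction='mean')(m(input), target)
sum_loss  = nn.BCELoss(weight=weights, reduction='sum')(m(input), target)

# 'mean' averages the weighted losses over all 9 elements,
# so mean_loss == per_elem.sum() / per_elem.numel()
print(per_elem.sum() / per_elem.numel(), mean_loss, sum_loss)
```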
           