
PyTorch: randint, scatter_, log_softmax, nll_loss, cross_entropy

  • randint

torch.randint(low, high, size) : returns a tensor of the given size filled with random integers drawn uniformly from low (inclusive) to high (exclusive).

torch.randint(3, 5, (3,))

>>>

tensor([4, 3, 4])

torch.randint(3, 10, (2, 2))

>>>

tensor([[4, 5],
        [6, 7]])
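Because the draws are uniform random, the outputs above will differ from run to run. Seeding the generator (a standard PyTorch facility, not part of the original examples) makes them repeatable:

import torch

torch.manual_seed(0)          # fix the RNG state
torch.randint(3, 10, (2, 2))  # now returns the same tensor on every run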

 

  • scatter_

scatter_(dim, index, value) : writes value in place into the tensor at the positions given by index along dimension dim. Here it turns the class labels y into one-hot vectors.

y = torch.tensor([3, 1, 2])  # class labels
y.unsqueeze(1)               # add a dim so each label sits in its own row

>>>

tensor([[3],
        [1],
        [2]])

y_one_hot = torch.zeros(3, 5)             # 3 samples, 5 classes
y_one_hot.scatter_(1, y.unsqueeze(1), 1)  # write 1 at each label's column

>>>

tensor([[0., 0., 0., 1., 0.],   # <-- 3
        [0., 1., 0., 0., 0.],   # <-- 1
        [0., 0., 1., 0., 0.]])  # <-- 2
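For comparison, torch.nn.functional.one_hot builds the same matrix in a single call (this shortcut is an addition here, not part of the original example; unlike scatter_, it returns a new tensor instead of modifying one in place):

import torch.nn.functional as F

F.one_hot(y, num_classes=5).float()  # same 3x5 one-hot matrix as above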

 

  • cost (multi-class classification)

The four formulations below all give the same result.
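The snippets below use model, x, y_train, and y_one_hot without defining them. A minimal setup that makes them runnable (the shapes and the linear model are assumptions chosen to match the 3x5 one-hot example above; with this toy data the shared loss value will differ from the 1.7024 shown below, but all four expressions still agree):

import torch
import torch.nn.functional as F

torch.manual_seed(1)
x = torch.randn(3, 4)                # 3 samples, 4 features (assumed shape)
y_train = torch.tensor([3, 1, 2])    # integer class labels, as in the scatter_ example
y_one_hot = torch.zeros(3, 5).scatter_(1, y_train.unsqueeze(1), 1)
model = torch.nn.Linear(4, 5)        # produces raw logits for 5 classes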

 

- cost function 1 : log

(y_one_hot * -torch.log(F.softmax(model(x), dim=1))).sum(dim=1).mean()

>>>

tensor(1.7024, grad_fn=<MeanBackward0>)

 

- cost function 2 : log_softmax (same as log(softmax(...)), but computed more stably)

(y_one_hot * -F.log_softmax(model(x), dim=1)).sum(dim=1).mean()

>>>

tensor(1.7024, grad_fn=<MeanBackward0>)

 

- cost function 3 : nll_loss

Negative Log Likelihood. F.nll_loss expects log-probabilities and integer class labels, so it handles the one-hot selection and the sum/mean internally.

F.nll_loss(F.log_softmax(model(x), dim=1), y_train)

>>>

tensor(1.7024, grad_fn=<NllLossBackward0>)

 

- cost function 4 : cross_entropy (combines F.log_softmax and F.nll_loss in one call: raw logits in, integer labels as target)

F.cross_entropy(model(x), y_train)

>>>

tensor(1.7024, grad_fn=<NllLossBackward0>)
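Because F.cross_entropy already applies log_softmax internally, model(x) must be raw logits; applying softmax first would be wrong. As a sanity check that the manual formulation and the built-in agree (using the toy setup above; this comparison is my addition, not from the original post):

logits = model(x)
manual = (y_one_hot * -F.log_softmax(logits, dim=1)).sum(dim=1).mean()
torch.allclose(manual, F.cross_entropy(logits, y_train))  # True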