Super Kai (Kazuya Ito)

normal() in PyTorch

normal() can create a 0D or more D tensor of zero or more random floating-point numbers or complex numbers drawn from the normal distribution, as shown below:

*Memos:

  • normal() can be used with torch but not with a tensor.
  • The 1st argument with torch is mean (Required-Type: float, complex, or tensor of float or complex): *Memos:
    • Setting mean without std and size requires a tensor of float or complex.
    • Setting mean and std without size accepts a float or a tensor of float or complex.
    • Setting mean, std and size requires a float. *A 0D tensor of float also works.
  • The 2nd argument with torch is std (Optional-Type: float or tensor of float): *Memos:
    • It is the standard deviation.
    • It must be greater than or equal to 0.
    • Setting std without size accepts a float or a tensor of float.
    • Setting std with size requires a float. *A 0D tensor of float also works.
  • The 3rd argument with torch is size (Optional-Type: tuple of int, list of int or size()): *Memos:
    • It must be used with std.
    • It must not be negative.
  • There is a dtype argument with torch (Optional-Type: dtype). *A sketch of dtype, device, requires_grad and out follows the examples below.
  • There is a device argument with torch (Optional-Type: str, int or device()).
  • There is a requires_grad argument with torch (Optional-Type: bool): *Memos:
    • requires_grad= must be used.
    • My post explains requires_grad argument.
  • There is an out argument with torch (Optional-Type: tensor): *Memos:
    • out= must be used.
    • My post explains out argument.
import torch

torch.normal(mean=torch.tensor([1., 2., 3.]))
# tensor([1.2713, 0.7271, 3.5027])

torch.normal(mean=torch.tensor([1.+0.j, 2.+0.j, 3.+0.j]))
# tensor([1.1918-0.9001j, 2.3555+0.2956j, 2.5479-0.4672j])

torch.normal(mean=torch.tensor([1., 2., 3.]),
             std=torch.tensor([4., 5., 6.]))
# tensor([2.0851, -4.3646, 6.0162])

torch.normal(mean=torch.tensor([1.+0.j, 2.+0.j, 3.+0.j]),
             std=torch.tensor([4., 5., 6.]))
# tensor([1.7673-3.6004j, 3.7773+1.4781j, 0.2872-2.8034j])

torch.normal(mean=torch.tensor([1., 2., 3.]), std=4.)
# tensor([2.0851, -3.0917, 5.0108])

torch.normal(mean=torch.tensor([1.+0.j, 2.+0.j, 3.+0.j]), std=4.)
# tensor([1.7673-3.6004j, 3.4218+1.1825j, 1.1914-1.8689j])

torch.normal(mean=1., std=torch.tensor([4., 5., 6.]))
# tensor([2.0851, -5.3646, 4.0162])

torch.normal(mean=1., std=4., size=())
torch.normal(mean=1., std=4., size=torch.tensor(8).size())
torch.normal(mean=torch.tensor(1.), std=torch.tensor(4.), size=())
# tensor(2.0851)

torch.normal(mean=1., std=4., size=(3,))
torch.normal(mean=1., std=4., size=torch.tensor([8, 3, 6]).size())
torch.normal(mean=torch.tensor(1.), std=torch.tensor(4.), size=(3,))
# tensor([2.0851, -4.0917, 3.0108])

torch.normal(mean=1., std=4., size=(3, 2))
torch.normal(mean=1., std=4.,
             size=torch.tensor([[8, 3], [6, 0], [2, 9]]).size())
torch.normal(mean=torch.tensor(1.), std=torch.tensor(4.), size=(3, 2))
# tensor([[2.0851, -4.0917],
#         [3.0108, 2.6723],
#         [-1.5577, -1.6431]])

torch.normal(mean=1., std=4., size=(3, 2, 4))
torch.normal(mean=torch.tensor(1.), std=torch.tensor(4.), size=(3, 2, 4))
# tensor([[[-3.7568, 6.5729, 9.4236, -0.4183],
#          [2.4840, 5.3827, 9.5657, 1.5267]],
#         [[8.0575, -0.5000, -0.3416, 5.3502],
#          [-4.3835, 1.6974, 2.6226, -1.9671]],
#         [[1.1422, 1.7790, 4.5886, -0.3273],
#          [2.8941, -3.3046, 1.1336, 2.8792]]])
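
The examples above only use mean, std and size. The memos also mention dtype, device, requires_grad and out, so here is a minimal sketch of them, assuming these keyword arguments are used with the size overload as described above; the results are random, so the exact values will differ on each run:

import torch

# dtype sets the floating-point type of the result (assumption: used with size).
torch.normal(mean=1., std=4., size=(3,), dtype=torch.float64)
# A 1D float64 tensor of 3 random values around mean=1.

# device places the result on a device; requires_grad= must be used as a keyword.
torch.normal(mean=1., std=4., size=(3,), device='cpu', requires_grad=True)
# A 1D tensor of 3 random values with requires_grad=True.

# out= must be used as a keyword; the result is written into out_tensor.
out_tensor = torch.empty(3)
torch.normal(mean=1., std=4., size=(3,), out=out_tensor)
# out_tensor now holds 3 random values around mean=1.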