[Code] Studying PyTorch for Deep Learning Models
1. Using @torch.no_grad()
- A decorator in PyTorch that disables automatic differentiation: no computation graph is built and no gradients are tracked for any operation inside the decorated function.
- It is primarily used during the inference phase, where the goal is to generate results rather than update model weights; skipping graph construction also saves memory and speeds up the forward pass.
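A minimal sketch of the behavior described above (the model and input names here are illustrative, not from the original post):

```python
import torch

# A tiny linear model; its parameters require gradients by default.
model = torch.nn.Linear(4, 2)

@torch.no_grad()  # autograd is disabled for every op inside this function
def predict(x):
    return model(x)

x = torch.randn(1, 4)
y = predict(x)
print(y.requires_grad)  # False: no graph was built, so backward() is impossible here
```

Because no graph is recorded, calling `y.backward()` on the output would raise an error, which is exactly what you want during inference.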
2. torch.randn() vs torch.randn_like()
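While this section is still pending, the core difference can be sketched as follows: both sample from a standard normal distribution, but `torch.randn()` takes an explicit shape, whereas `torch.randn_like()` takes an existing tensor and matches its shape, dtype, and device (the variable names below are illustrative):

```python
import torch

a = torch.randn(2, 3)    # shape given explicitly; values drawn from N(0, 1)
b = torch.randn_like(a)  # same shape/dtype/device as `a`, fresh random values

print(a.shape == b.shape)  # True
```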
~ to be continued ~
~ gonna keep updating ~