Machine Learning Notes
These are my notes taken while watching Andrej Karpathy’s tutorials on YouTube. I also have more theoretical (overview) notes about NLP and ML.
The tutorials are these:
(Py)Torch
- Tensor
- `.view(<shape>)` takes the internal linear (memory) representation of a tensor and lays it out according to the wanted shape; very efficient (better than e.g. `torch.cat(torch.unbind(tensor, 1), 1)`)
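A minimal sketch of why `.view()` is cheap: it only re-interprets the existing storage, so it shares memory with the original tensor (variable names here are illustrative):

```python
import torch

t = torch.arange(6)   # tensor([0, 1, 2, 3, 4, 5])
m = t.view(2, 3)      # rows [0, 1, 2] and [3, 4, 5]; no data is copied
flat = m.view(-1)     # back to a 1-D tensor of 6 elements

# a view shares storage with the original tensor:
m[0, 0] = 99
print(t[0])           # tensor(99)
```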
- `arange` is similar to `range` in Python
- `randn(<shape>)` fills a tensor with samples from the normal distribution
- `-1` in a shape tells torch to infer that dimension
- `squeeze` removes dimensions of size 1
- `torch.linspace(from, to, steps)` is like `range` in Python but works for floats
- sum per row: `P.sum(1, keepdim=False)`
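A quick sketch putting the shape helpers above together (tensor names are made up for illustration):

```python
import torch

a = torch.arange(5)                 # like range(): tensor([0, 1, 2, 3, 4])
f = torch.linspace(0.0, 1.0, 5)     # floats: 0.00, 0.25, 0.50, 0.75, 1.00
r = torch.randn(2, 3)               # 2x3 samples from N(0, 1)

inferred = r.view(-1, 2)            # -1: torch infers this dim (6 elements -> 3)
s = torch.zeros(1, 4, 1).squeeze()  # drops size-1 dims -> shape (4,)

row_sums = r.sum(1, keepdim=False)  # sum per row -> shape (2,)
row_sums_k = r.sum(1, keepdim=True) # keepdim=True keeps the dim -> shape (2, 1)
```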
- broadcasting semantics (from numpy): a binary operation is defined for two tensors when, for each pair of dimensions (aligned from the right):
  - both dimensions are equal, or
  - one of them is 1, or
  - one of them doesn’t exist
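A small sketch of those rules, plus the typical row-normalization use they enable:

```python
import torch

# shapes are aligned from the right; each pair of dims must be
# equal, or one of them 1, or one of them missing
a = torch.ones(3, 1)    # shape (3, 1)
b = torch.arange(4)     # shape (4,), treated as (1, 4)
c = a + b               # broadcasts to shape (3, 4)
print(c.shape)          # torch.Size([3, 4])

# typical use: normalize each row of a matrix to sum to 1
P = torch.rand(3, 4)
P = P / P.sum(1, keepdim=True)   # (3, 4) / (3, 1) broadcasts per row
```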
- `torch.nn.functional.one_hot`
- common way of importing functional: `import torch.nn.functional as F`
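For example, one-hot encoding a batch of class labels via the `F` alias:

```python
import torch
import torch.nn.functional as F   # the common alias

y = torch.tensor([0, 2, 1])
oh = F.one_hot(y, num_classes=3)  # shape (3, 3), integer dtype
print(oh)
# tensor([[1, 0, 0],
#         [0, 0, 1],
#         [0, 1, 0]])
```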
- `@` is matrix multiplication
- indexing with a range: `x[torch.arange(10), y]` picks one element per row
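A sketch of both: `@` for matrix multiplication, and range indexing to pick one entry per row (e.g. the value at the label position `y[i]` in row `i`; `x` and `y` here are illustrative):

```python
import torch

# @ is matrix multiplication (torch.matmul)
A = torch.ones(2, 3)
B = torch.ones(3, 4)
print((A @ B).shape)             # torch.Size([2, 4])

# indexing with a range: picked[i] == x[i, y[i]]
x = torch.arange(30.0).view(10, 3)
y = torch.tensor([0, 1, 2, 0, 1, 2, 0, 1, 2, 0])
picked = x[torch.arange(10), y]  # shape (10,)
```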
- `with torch.no_grad(): ...` tells torch not to track what follows for backpropagation
- `torch.zeros_like(tensor)` creates a new tensor with the shape of `tensor`, filled with zeros
- `torch.allclose(t1, t2)` compares tensors with some tolerance
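A sketch tying these together in a manual SGD-style update (the loss and learning rate are made up for illustration):

```python
import torch

w = torch.randn(3, requires_grad=True)
loss = (w ** 2).sum()
loss.backward()            # populates w.grad

with torch.no_grad():      # the update itself must not be tracked
    w -= 0.1 * w.grad      # manual gradient step

z = torch.zeros_like(w)    # same shape (and dtype) as w, all zeros
print(torch.allclose(z, torch.zeros(3)))   # True
```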
published: 2022-11-24
last modified: 2023-11-20
https://vit.baisa.cz/notes/learn/machine-learning/