"This tutorial draws heavily on [https://pytorch.org/tutorials](https://pytorch.org/tutorials); refer to those pages for further detail."
]
},
{
"cell_type": "markdown",
"id": "b1d5ee6c-e12d-43e4-b489-fb00ae26f4c6",
"metadata": {},
"source": [
"# Data types: everything is tensor\n",
"\n",
"Tensors in `torch` are the equivalent of `numpy` arrays. See [https://pytorch.org/docs/stable/torch.html](https://pytorch.org/docs/stable/torch.html) for detail.\n",
"\n",
"They can be created from python or numpy objects. The data type is automatically inferred, unless stated otherwise."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2b4d6f8a-0c2e-4d6f-9a1b-3c5d7e9f1a2b",
"metadata": {},
"outputs": [],
"source": [
"print('singular value decomposition of t_rand: \\n{}'.format(t_rand.svd()))\n",
"print('or equivalently')\n",
"print('{} \\n'.format(torch.svd(t_rand)))"
]
},
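{
"cell_type": "markdown",
"id": "3c5a9e21-7b4d-4f6a-9c2e-8d1b0a3f5e7c",
"metadata": {},
"source": [
"As a quick check (a sketch, assuming `t_rand` from above), the factors returned by `torch.svd` recombine to the original tensor; note that recent versions of pytorch recommend `torch.linalg.svd` instead."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8f2e4d6a-1b3c-4a5e-b7d9-0c1e2f3a4b5d",
"metadata": {},
"outputs": [],
"source": [
"# reconstruct t_rand from its SVD factors: t_rand = U diag(S) V^T (sketch, assumes t_rand exists)\n",
"u, s, v = t_rand.svd()\n",
"print(torch.allclose(u @ torch.diag(s) @ v.t(), t_rand))"
]
},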
{
"cell_type": "markdown",
"id": "d0621d20-40ba-4e2f-97f6-7a5657714101",
"metadata": {},
"source": [
"One has to be careful with the difference between element-wise multiplication of arrays (`torch.mul`) and matrix multiplication (`torch.matmul` or `@`)."
]
},
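{
"cell_type": "markdown",
"id": "5d7f9b1c-3e5a-4c8d-a2b4-6e8f0a1c3d5e",
"metadata": {},
"source": [
"A minimal sketch of the difference on a small square matrix (the tensor `a` here is illustrative):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9a1c3e5b-7d9f-4b2a-8c4e-0d2f4a6b8c0e",
"metadata": {},
"outputs": [],
"source": [
"a = torch.rand((2, 2))\n",
"# element-wise product: each entry is multiplied by itself\n",
"print('torch.mul(a, a):\\n{}'.format(torch.mul(a, a)))\n",
"# matrix product: rows of the first factor times columns of the second\n",
"print('a @ a:\\n{}'.format(a @ a))"
]
},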
{
"cell_type": "code",
"execution_count": null,
"id": "7dbe3f1f-d984-49cc-bef5-15a9461591b7",
"metadata": {},
"outputs": [],
"source": [
"t1 = torch.rand((2,2))\n",
"t2 = torch.rand((2,3))\n",
"t3 = torch.rand((2,3))\n",
"\n",
"print('tensors t1, t2 and t3:\\n{}\\n{}\\n{}'.format(t1, t2, t3))"
]
},
{
"cell_type": "markdown",
"id": "6e8a0c2d-4f6b-4e8a-b0c2-d4f6a8b0c2d4",
"metadata": {},
"source": [
"Like in `numpy`, `torch.einsum` is useful to perform multiplication along specific axes in a flexible manner."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2a40d9d3-1c5a-4782-bf8d-934d75a2c8f8",
"metadata": {},
"outputs": [],
"source": [
"print('contraction along axis 0 of t1 and axis 0 of t2:\\n{}'.format(torch.einsum('ij, ik -> jk', t1, t2)))"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f0addf96-9ebb-4728-9d5f-21cb36c212a8",
"metadata": {},
"outputs": [],
"source": [
"print('contraction along axis 1 of t1 and axis 0 of t2:\\n{}'.format(torch.einsum('ji, ik -> jk', t1, t2)))"
]
},
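{
"cell_type": "markdown",
"id": "1f3b5d7e-9a1c-4e3f-8b0d-2c4e6a8b0d2f",
"metadata": {},
"source": [
"For instance (assuming `t1` and `t2` from above), ordinary matrix multiplication corresponds to contracting axis 1 of the first tensor with axis 0 of the second:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "7c9e1a3b-5d7f-4a9c-b1e3-f5a7c9e1b3d5",
"metadata": {},
"outputs": [],
"source": [
"# 'ij, jk -> ik' is exactly t1 @ t2\n",
"print(torch.allclose(torch.einsum('ij, jk -> ik', t1, t2), t1 @ t2))"
]
},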
{
"cell_type": "markdown",
"id": "19d265a2-0a7e-4bb2-b462-af37f42e3ce3",
"metadata": {},
"source": [
"# Datasets and loaders\n",
"\n",
"Data manipulation is eased in pytorch by functions that can load large datasets and select randomized batches of samples. Many datasets are readily available, e.g. images with `torchvision`."
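]
},
{
"cell_type": "markdown",
"id": "4a6c8e0b-2d4f-4c6a-9e1b-3d5f7a9c1e3b",
"metadata": {},
"source": [
"A minimal sketch of batching with `torch.utils.data.DataLoader`, using an in-memory `TensorDataset` (the toy data here are illustrative):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0b2d4f6a-8c0e-4b2d-a4f6-8a0c2e4b6d8f",
"metadata": {},
"outputs": [],
"source": [
"from torch.utils.data import TensorDataset, DataLoader\n",
"\n",
"# toy dataset: 10 samples of dimension 3 with scalar targets (illustrative)\n",
"x = torch.rand((10, 3))\n",
"y = torch.rand(10)\n",
"dataset = TensorDataset(x, y)\n",
"loader = DataLoader(dataset, batch_size=4, shuffle=True)\n",
"\n",
"# each iteration yields a randomized batch of (inputs, targets)\n",
"for batch_x, batch_y in loader:\n",
"    print(batch_x.shape, batch_y.shape)"
]
}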