- How are tensors related to scalars, vectors and matrices?
- How are tensors defined in Python?

You may already know something about tensors although you may not have used the term tensor. Tensors of rank (or order) zero are just scalars and tensors of rank one are just vectors. In 3-dimensional space a scalar has 3^0 = 1 components, a vector has 3^1 = 3 components and a second-rank tensor has 3^2 = 9 components. In 3D, in general a tensor of rank n has 3^n components.

Let us now extend this to **m-dimensional space**. An *n*th-rank tensor in *m*-dimensional space is a mathematical object with *n* indices and m^n components. One can also say that a tensor is a container which can house data in *m* dimensions.

Mathematically speaking, tensors are more than simply a data container. Aside from holding numeric data, tensors also describe the valid linear transformations between tensors. Examples of such transformations, or relations, include the cross product and the dot product. From a computer science/machine learning perspective, it can therefore be helpful to think of tensors as objects in an object-oriented sense, rather than simply as data structures.

In reality, there are subtle differences between what tensors technically are and what is referred to as a tensor in machine learning/deep learning practice.

**Simply put, in terms of Python, a tensor is a numpy.ndarray.**

Now let us look at how we can define tensors using Python.

Let us first look at a scalar, its rank and its shape as well.
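As a minimal sketch (the specific value here is illustrative), a scalar can be defined as a zero-dimensional NumPy array, and its rank and shape read off with the `ndim` and `shape` attributes:

```python
import numpy as np

# A scalar is a rank-0 tensor: it has no axes at all.
scalar = np.array(42)

print(scalar.ndim)   # rank (number of axes): 0
print(scalar.shape)  # shape: () — an empty tuple
```

Note that the shape of a rank-0 tensor is the empty tuple `()`, since there are no axes to measure.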

Moving on to vectors now.
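A vector is simply a one-dimensional array; the element values below are illustrative:

```python
import numpy as np

# A vector is a rank-1 tensor: a single axis of components.
vector = np.array([1, 2, 3])

print(vector.ndim)   # 1
print(vector.shape)  # (3,) — one axis with 3 components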

Let us have a look at matrices now.
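A matrix is a two-dimensional array; again, the values chosen here are just an example:

```python
import numpy as np

# A matrix is a rank-2 tensor: rows and columns.
matrix = np.array([[1, 2, 3],
                   [4, 5, 6]])

print(matrix.ndim)   # 2
print(matrix.shape)  # (2, 3) — 2 rows, 3 columns
```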

Now moving on to higher dimensional tensors and checking their dimension and shapes:

Example 1:
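One possible example (the values are illustrative) is a rank-3 tensor, which you can picture as a stack of matrices:

```python
import numpy as np

# A rank-3 tensor: two 2x2 matrices stacked along a new axis.
tensor3 = np.array([[[1, 2], [3, 4]],
                    [[5, 6], [7, 8]]])

print(tensor3.ndim)   # 3
print(tensor3.shape)  # (2, 2, 2)
```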

Example 2:
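As a second sketch, a rank-4 tensor built with `np.zeros`; the shape chosen here (a hypothetical batch of 10 RGB images of size 28×28) is purely illustrative:

```python
import numpy as np

# A rank-4 tensor: axes could mean (batch, height, width, channels).
tensor4 = np.zeros((10, 28, 28, 3))

print(tensor4.ndim)   # 4
print(tensor4.shape)  # (10, 28, 28, 3)
```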

Play around with the above code and try changing the tensors to understand how the rank and shape are defined for a tensor.