
Published on April 10, 2023 by Professor Ishwar Sethi

This post was originally published by one of our partners at https://iksinc.tech/understanding-tensors-and-tensor-decompositions-part-1/

This is a three-part blog post about tensors. The first part introduces tensors; the second and third parts discuss tensor decomposition. A natural question you might have is: what is a tensor, and why should I be interested in one? Read on to get your answer.

Simply stated, a *tensor* is a multidimensional or *N*-way array. The array dimensionality, the value of *N*, specifies the tensor *order*, or the number of tensor *modes*. A one-dimensional array **a** is thus a tensor of order one, i.e., a vector; a two-dimensional array **A** is a tensor of order two, i.e., a matrix; and arrays of three or more dimensions are tensors of higher order.
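As a quick sketch in base R (the variable names below are our own), arrays of increasing dimensionality correspond to tensors of increasing order:

```r
v <- c(1, 2, 3)                     # one-dimensional array: an order-1 tensor (vector)
M <- matrix(1:6, nrow = 2)          # two-dimensional array: an order-2 tensor (matrix)
X <- array(1:24, dim = c(2, 3, 4))  # three-dimensional array: an order-3 tensor

length(dim(X))  # the number of modes of X: 3
```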

There are many reasons for us to be interested in tensors. First, high-order tensors are a natural mathematical representation in many instances. For example, RGB images are tensors of order three and videos are tensors of order four. In medical imaging, the multimodality images of patients captured under different conditions produce high-order tensors. Second, reshaping large vectors and matrices into high-order tensors can lead to efficient computation, as done in deep learning libraries. Finally, in many tasks you can stack matrices to form tensors and take advantage of data compression and analysis via tensor decomposition, similar to decomposing matrices via singular value decomposition (SVD) and PCA. This has been used, for example, in face recognition by stacking face images of different subjects captured under different lighting conditions, facial expressions, and view angles to form a tensor representation of the data.

We access the elements of a real-valued tensor **X** of order *K* using *K* indices, as in x(i1, i2, …, iK). You can access a tensor subarray by keeping a subset of indices fixed. In tensors, the term *fiber* refers to a tensor subarray with all but one index fixed. Thus, a fiber is just like a row or column of a matrix. For a three-mode tensor, we can form three different sets of fibers; we refer to them as *column (mode-1)*, *row (mode-2)*, and *tube (mode-3) fibers*. A tensor subarray with all but two indices fixed is termed a *slice*, which is nothing but a matrix. Again, the slices are known by different names based on the indices kept fixed: the *horizontal*, *lateral*, and *frontal* slices of a three-mode tensor fix the first, second, and third index, respectively.
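As a concrete sketch using base R indexing (the array X below is our own example), each fiber type corresponds to fixing all but one index, and each slice to fixing exactly one:

```r
X <- array(1:24, dim = c(2, 3, 4))  # an order-3 tensor

X[, 1, 1]  # a column (mode-1) fiber: 1 2
X[1, , 1]  # a row (mode-2) fiber:    1 3 5
X[1, 1, ]  # a tube (mode-3) fiber:   1 7 13 19

X[1, , ]   # a horizontal slice (first index fixed): a 3 x 4 matrix
```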

You may recall that the rank of a matrix is the number of independent columns or rows. For example, a 2 × 2 matrix whose two columns are independent, i.e., neither column/row is a scaled version of the other, has rank 2. On the other hand, a matrix whose second column/row is the same as the first multiplied by 2 has rank 1.
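To make this concrete, here is a small sketch in R; the two matrices are our own illustrative choices, and `qr()$rank` is one convenient way to compute matrix rank in base R:

```r
# A: columns (1,3) and (2,4) are independent, so rank is 2
A <- matrix(c(1, 2,
              3, 4), nrow = 2, byrow = TRUE)

# B: the second column (and second row) is 2x the first, so rank is 1
B <- matrix(c(1, 2,
              3, 6), nrow = 2, byrow = TRUE)

qr(A)$rank  # 2
qr(B)$rank  # 1
```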

You may also recall that given two vectors, we can multiply them in three different ways. The *inner* or *dot product* yields just a number. The second kind of multiplication, known as the *vector product*, results in a vector. The third kind, the *outer product*, denoted by ∘, results in a matrix. It is the outer product that is of interest to us right now. Given two column vectors **a** and **b**, their outer product **a** ∘ **b** is the matrix whose (*i*, *j*) element is the product of the *i*-th element of **a** and the *j*-th element of **b**.
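In R, the outer product is written `%o%` (shorthand for `outer()`); a minimal sketch with two small vectors of our own choosing:

```r
a <- c(1, 2)
b <- c(3, 4, 5)
a %o% b  # the (i, j) element is a[i] * b[j]
#      [,1] [,2] [,3]
# [1,]    3    4    5
# [2,]    6    8   10
```

Note that the resulting 2 × 3 matrix has rank 1: every row is a multiple of `b`, and every column a multiple of `a`.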

You will notice that the outer product of two vectors is always a matrix of rank 1. If we take the outer product of three vectors, the result is a rank-1 tensor of order three. To put it another way, a tensor **T** of order *K* is of rank 1 if it can be expressed as the outer product of *K* vectors, i.e., **T** = a(1) ∘ a(2) ∘ … ∘ a(K), where the superscripts denote different vectors. This implies that every tensor element can be expressed as the product of the corresponding vector elements. For a tensor of order 3 written as **a** ∘ **b** ∘ **c**, we can thus write its (*i*, *j*, *k*) element as t(i, j, k) = a(i) · b(j) · c(k), the product of the *i*-th element of the first vector, the *j*-th element of the second vector, and the *k*-th element of the third vector.

Since a numerical example always helps, take a look below at an example of a rank-1 tensor of order 3, created by taking the outer product (denoted by %o% in *R*) of three vectors.

[code language="r"]
> a1 = c(1, 2, 3)
> a2 = c(4, 5, 6)
> a3 = c(7, 8, 9)
> T = a1 %o% a2 %o% a3
> print(T)
, , 1

     [,1] [,2] [,3]
[1,]   28   35   42
[2,]   56   70   84
[3,]   84  105  126

, , 2

     [,1] [,2] [,3]
[1,]   32   40   48
[2,]   64   80   96
[3,]   96  120  144

, , 3

     [,1] [,2] [,3]
[1,]   36   45   54
[2,]   72   90  108
[3,]  108  135  162

[/code]

Note that the result above shows the three frontal slices of the tensor. You can view the horizontal and lateral slices by changing the fixed index, as shown below.

[code language="r"]
> T[1,,]
     [,1] [,2] [,3]
[1,]   28   32   36
[2,]   35   40   45
[3,]   42   48   54
> T[2,,]
     [,1] [,2] [,3]
[1,]   56   64   72
[2,]   70   80   90
[3,]   84   96  108
> T[3,,]
     [,1] [,2] [,3]
[1,]   84   96  108
[2,]  105  120  135
[3,]  126  144  162
> T[,1,]
     [,1] [,2] [,3]
[1,]   28   32   36
[2,]   56   64   72
[3,]   84   96  108
> T[,2,]
     [,1] [,2] [,3]
[1,]   35   40   45
[2,]   70   80   90
[3,]  105  120  135
> T[,3,]
     [,1] [,2] [,3]
[1,]   42   48   54
[2,]   84   96  108
[3,]  126  144  162
[/code]

Another term that you are likely to come across while working with tensors is *unfolding* or *flattening*, which means rearranging a tensor as a matrix. When you unfold a tensor along one of its modes, say mode *m*, we say we are doing mode-*m* flattening. The mode-*m* flattened matrix is formed simply by arranging the mode-*m* fibers as its columns. For example, the mode-1 flattening of the order-3 tensor of the above example is:

[code language="r"]
     [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9]
[1,]   28   35   42   32   40   48   36   45   54
[2,]   56   70   84   64   80   96   72   90  108
[3,]   84  105  126   96  120  144  108  135  162
[/code]
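A minimal sketch of how such flattenings can be computed in base R (assuming the tensor T built earlier): because R stores arrays in column-major order, mode-1 flattening is a plain reshape, while the other modes need a permutation via `aperm()` first.

```r
a1 <- c(1, 2, 3); a2 <- c(4, 5, 6); a3 <- c(7, 8, 9)
T <- a1 %o% a2 %o% a3

# Each flattened matrix has the mode-m fibers of T as its columns
unfold1 <- matrix(T, nrow = dim(T)[1])                     # mode-1
unfold2 <- matrix(aperm(T, c(2, 1, 3)), nrow = dim(T)[2])  # mode-2
unfold3 <- matrix(aperm(T, c(3, 1, 2)), nrow = dim(T)[3])  # mode-3

unfold1[, 1]  # first column is the mode-1 fiber T[, 1, 1]: 28 56 84
```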

With this brief intro to tensors, we are ready to look at tensor decomposition and its applications. But you will have to wait for Parts 2 and 3, coming in the next few weeks.

Benson Muite has created a Git repository for the code used in this and the next two blog posts on tensors. Please visit the repository at https://gitlab.com/bkmgit/understanding-tensors-and-tensor-decomposition/. Many thanks to him.
