
Linear Algebra To Know For Machine Learning



Linear Algebra deals with linear equations and linear maps (mappings between two vector spaces that preserve the vector operations of addition and scalar multiplication) and their representations in vector spaces and through matrices. Linear algebra is key to almost all areas of mathematics, and it is widely used in science and many fields of engineering because it helps model different natural phenomena and compute with them efficiently. Linear Algebra is highly similar to the Algebra we talked about in our previous article, except that instead of ordinary single numbers, it deals with vectors. Many of the same algebraic operations we perform on ordinary numbers (i.e., scalars), like addition, subtraction and multiplication, can be generalized to operate on vectors.

Vectors

A vector is a list of numbers. There are (at least) two ways to interpret what this list of numbers means. One way is to think of the vector as a point in space; the list of numbers then identifies that very point, with each number representing the vector's component along one dimension. Another way is to understand a vector as a magnitude and a direction, e.g., a quantity like a velocity ("the car's velocity is 120 mph north-by-northeast").

Vector Addition

Vectors can be added and subtracted. Graphically, we can imagine adding two vectors as placing the two line segments end-to-end, maintaining their lengths and directions. The addition of two vectors produces a third vector. A vector is denoted by its name with an arrow over it. Let

$$\vec{a} = \begin{pmatrix} a_1 \\ a_2 \end{pmatrix}, \qquad \vec{b} = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}$$

Adding the vectors, we get,

$$\vec{a} + \vec{b} = \begin{pmatrix} a_1 + b_1 \\ a_2 + b_2 \end{pmatrix}$$

This resultant vector is the third vector.
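As a quick illustration, here is a minimal NumPy sketch of component-wise vector addition; the values of a and b below are arbitrary examples, not taken from the original figure:

```python
import numpy as np

# Two example vectors (arbitrary illustrative values)
a = np.array([2, 1])
b = np.array([1, 3])

# Vector addition is performed component-wise
c = a + b
print(c)  # [3 4]
```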
Vector Multiplication

There are two distinct ways of multiplying vectors, called the dot product (i.e., the scalar product) and the cross product. The dot product generates a scalar value from the product of two vectors and will be discussed in greater detail below. Do not confuse the dot product with the cross product, which is entirely different. Here,


$$\vec{a} \cdot \vec{b} = a_1 b_1 + a_2 b_2$$

And the calculations work the same way in higher dimensions.
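A short NumPy sketch of the dot product, reusing the same arbitrary example vectors; note how the result is a single scalar rather than a vector:

```python
import numpy as np

a = np.array([2, 1])
b = np.array([1, 3])

# The dot product multiplies matching components and sums them:
# 2*1 + 1*3 = 5
print(np.dot(a, b))  # 5
print(a @ b)         # equivalent operator syntax
```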
Solving a Set of Equations

One way to solve two equations is to draw them graphically and see where they intersect. In linear algebra, we instead put them into matrices and solve. Let us take two equations:

X = Y + 5
3X + 2Y = 5

Moving both X and Y to the same side, we get:

X – Y = 5
3X + 2Y = 5

Writing equations I and II in matrix form, we get:


$$\begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix} \begin{pmatrix} X \\ Y \end{pmatrix} = \begin{pmatrix} 5 \\ 5 \end{pmatrix}$$

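The same system can be solved programmatically; here is a minimal sketch using NumPy's np.linalg.solve, which solves Ax = b for a square, invertible A:

```python
import numpy as np

# Coefficient matrix and right-hand side for:
#   X - Y  = 5
#   3X + 2Y = 5
A = np.array([[1, -1],
              [3,  2]])
b = np.array([5, 5])

solution = np.linalg.solve(A, b)
print(solution)  # [ 3. -2.]  i.e. X = 3, Y = -2
```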
If you want to learn more about Linear Algebra, check out this amazing video by AI 42.

Matrix

A matrix, just like a vector, can also be understood as a collection of numbers. The major difference between a vector and a matrix is that a matrix is a table of numbers rather than a list.

Matrix Rows & Columns

Entries, or elements, are the individual numbers, symbols, and expressions in a matrix. Rows and columns are the horizontal and vertical lines in a matrix, respectively.

Matrix Multiplication

Matrix multiplication is more involved, as multiple elements in the first matrix interact with multiple elements in the second to produce each element of the product matrix. This is why matrix multiplication is a tedious task to carry out by hand, and time-consuming on a computer for very large matrices. Taking matrix multiplication for the two equations

X – Y = 5
3X + 2Y = 5

they can be written as,


$$\begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix} \begin{pmatrix} X \\ Y \end{pmatrix} = \begin{pmatrix} 5 \\ 5 \end{pmatrix}$$

Here, the vertical values 1, 3 and –1, 2 are the columns, and the horizontal values are taken as the rows.
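To make the row-times-column mechanics concrete, here is a small sketch that multiplies the coefficient matrix by the solution vector found above, both by hand (in the comments) and with NumPy's @ operator:

```python
import numpy as np

A = np.array([[1, -1],
              [3,  2]])
x = np.array([3, -2])  # the solution found above

# Each entry of the product is a row of A dotted with x:
# row 0: 1*3 + (-1)*(-2) = 5
# row 1: 3*3 +   2 *(-2) = 5
print(A @ x)  # [5 5]
```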

Linear Transformation

A linear transformation can be defined as a function from one vector space to another that preserves the linear structure of each vector space. It is also called a linear operator or a linear map. For example, the matrix

$$A = \begin{pmatrix} 1 & -1 \\ 3 & 2 \end{pmatrix}$$

describes a linear transformation $\vec{x} \mapsto A\vec{x}$.
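A small sketch that treats this matrix as a linear transformation and checks the two defining linearity properties on a pair of arbitrary example vectors:

```python
import numpy as np

A = np.array([[1, -1],
              [3,  2]])
u = np.array([1, 0])
v = np.array([0, 2])

# A linear transformation preserves addition and scalar multiplication
print(np.allclose(A @ (u + v), A @ u + A @ v))  # True
print(np.allclose(A @ (3 * u), 3 * (A @ u)))    # True
```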

Determinant

The determinant can be defined as a scalar value that encodes certain properties of the linear transformation described by a matrix. For a 2 × 2 matrix it is:

$$\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$$

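NumPy computes this directly; a minimal sketch using the same example matrix as above:

```python
import numpy as np

A = np.array([[1, -1],
              [3,  2]])

# For a 2x2 matrix: det = a*d - b*c = 1*2 - (-1)*3 = 5
print(np.linalg.det(A))  # approximately 5.0 (floating-point result)
```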
Identity Matrix

An identity matrix is a square matrix of any order whose main diagonal elements are all equal to one, while the rest of its elements are equal to zero. The product of any square matrix and its identity matrix always results in the original matrix, no matter in which order the multiplication is performed.


$$I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad AI = IA = A$$

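A short NumPy sketch verifying that multiplying by the identity matrix leaves a matrix unchanged, in either order:

```python
import numpy as np

A = np.array([[1, -1],
              [3,  2]])
I = np.eye(2, dtype=int)  # 2x2 identity matrix

print(np.array_equal(A @ I, A))  # True
print(np.array_equal(I @ A, A))  # True
```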
Matrix Transpose

The transpose of any matrix is a new matrix in which the rows of the old matrix become its columns. The superscript 'T' denotes the transpose of a matrix.


$$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}, \qquad A^T = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}$$

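In NumPy the transpose is simply the .T attribute; a minimal sketch:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# Rows of A become the columns of A.T
print(A.T)
# [[1 3]
#  [2 4]]
```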
Least Squares Solutions

The method of least squares is a standard approach in regression analysis for approximating the solution of sets of equations that contain more equations than unknowns (also known as overdetermined systems) by minimizing the sum of the squares of the residuals in the result of each equation.

  • What if Ax = b does not have a solution? What is the best approximate solution?

  • The best approximate solution is what we call the least squares solution.

Each of the points has its own coordinates, e.g., (x1, y1), (x2, y2), …, (xn, yn), scattered at random. The straight line that can be drawn to pass as close as possible to the largest number of these points is the one that signifies the least squares solution.
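Here is a hedged sketch of fitting such a line with np.linalg.lstsq; the data points below are made-up illustrative values, and the fitted slope and intercept will vary with the data:

```python
import numpy as np

# Made-up scattered points (illustrative only)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 4.8, 7.2, 9.1])

# Fit y = m*x + c: build the design matrix [x, 1]
A = np.vstack([x, np.ones(len(x))]).T
(m, c), residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)

print(f"slope = {m:.2f}, intercept = {c:.2f}")
```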

Where is least square used in machine learning?

Image Recognition

Image recognition is a subset of Computer Vision and Artificial Intelligence that represents a set of processes for detecting and analyzing images in order to enable the automation of a particular, precise task. It is a technology capable of recognizing different locations, people, objects and numerous other kinds of elements within an image, and drawing conclusions from them through analysis.

Let us take:

a. a 10 × 10-pixel image
b. the matrix of the image's values, with 1 for each marked pixel

If we have to draw a circle, the matrix would have the value 0 in every place that does not form part of the circle and 1 for the pixels where the drawing of the circle lies. This transformation process is used for object detection and feature analysis in image recognition.
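A minimal sketch of this idea: building a 10 × 10 matrix of zeros and setting to 1 the pixels that lie approximately on a circle. The center and radius below are arbitrary choices for illustration:

```python
import numpy as np

size = 10
image = np.zeros((size, size), dtype=int)

# Arbitrary example: circle centered in the image with radius 3.5
center, radius = (size - 1) / 2, 3.5
ys, xs = np.indices((size, size))
dist = np.sqrt((xs - center) ** 2 + (ys - center) ** 2)

# Mark pixels whose distance from the center is close to the radius
image[np.abs(dist - radius) < 0.5] = 1
print(image)
```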

How does ML perform linear transformations, and how do they connect with matrices?

When we are trying to tune a model, we are trying to find the transformation matrix that gives the best solution. This is done at multiple levels. First, a transformation is applied so that the edges become visible. After that, models are defined to differentiate between multiple objects, e.g., a cat and a dog. Then many layers of transformations are applied in a row to do the job.
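As a hedged sketch of this "layers of transformations" idea (the weight matrices W1 and W2 below are random placeholders, not a trained model), each layer is simply a matrix multiplication followed by a simple non-linearity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer model: the weights are random placeholders
W1 = rng.normal(size=(4, 3))  # first transformation: 3 features -> 4
W2 = rng.normal(size=(2, 4))  # second transformation: 4 -> 2 outputs

x = np.array([0.5, -1.2, 2.0])  # made-up input feature vector

h = np.maximum(0, W1 @ x)  # linear transformation + ReLU non-linearity
scores = W2 @ h            # final linear transformation
print(scores)
```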

Conclusion

In this article, we learned about Linear Algebra, one of the most important branches of mathematics, without which solving complex problems through Artificial Intelligence would be impossible. We learned about vectors (addition and multiplication), solved linear equations, understood matrices and used them to solve linear equations, went deeper into linear transformations and the identity matrix, and saw how determinant values are calculated. We also learned to transpose matrices and grasped the concept of the least squares solution. All of these different aspects of Linear Algebra will help you become a better Machine Learning Engineer.




