Mastering Linear Algebra - Key concepts like matrices, determinants and linear transformations

In this course, we'll provide a comprehensive overview of linear algebra, including key concepts like vector spaces, linear transformations, and matrix operations.

    What is included in a linear algebra course?

    The following 28 topics are typically included in a linear algebra course:

    1. Vectors
    A vector in physics represents a force with a given direction, magnitude and point of origin. In linear algebra, a vector has the first two properties but lacks an origin, so it can be freely translated.

    2. Lines and planes in space
    The familiar equation for a straight line is only defined in two dimensions. That does not, however, stop lines from existing in higher dimensions, where we define them in parametric or vector form.

    3. Linear systems of equations
    Bunching together several equations creates a system of equations, and a solution for the system must solve each of the system's equations. The system is said to be linear if each equation is of the form: $$ a_1x_1 + a_2x_2 + \ldots + a_nx_n = b $$

    4. Gauss-Jordan
    Gauss-Jordan elimination is a method for solving a linear system of equations. Each system falls into one of three cases: a unique solution, no solutions, or infinitely many solutions.
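
    As a concrete illustration, here is a minimal NumPy sketch of Gauss-Jordan elimination on an augmented matrix $[A \mid \vec{b}]$ (the function name and the example system are our own):

    ```python
    import numpy as np

    def gauss_jordan(aug):
        """Reduce an augmented matrix [A | b] to reduced row echelon form."""
        A = aug.astype(float).copy()
        rows, cols = A.shape
        pivot_row = 0
        for col in range(cols - 1):              # last column holds b
            # Pick the largest entry in this column as the pivot
            pivot = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
            if np.isclose(A[pivot, col], 0):
                continue                          # no pivot in this column
            A[[pivot_row, pivot]] = A[[pivot, pivot_row]]  # swap rows
            A[pivot_row] /= A[pivot_row, col]              # scale pivot to 1
            for r in range(rows):                          # eliminate elsewhere
                if r != pivot_row:
                    A[r] -= A[r, col] * A[pivot_row]
            pivot_row += 1
            if pivot_row == rows:
                break
        return A

    # The system x + 2y = 5, 3x + 4y = 11 has the unique solution x = 1, y = 2
    aug = np.array([[1, 2, 5],
                    [3, 4, 11]])
    print(gauss_jordan(aug))   # [[1. 0. 1.], [0. 1. 2.]]
    ```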

    5. Matrix arithmetics
    Matrix arithmetic is defined for addition, subtraction and multiplication. For the former two, the matrices must have equal dimensions, and the operations are commutative. Multiplication, however, is not commutative, and is only defined when the number of columns of the left matrix equals the number of rows of the right matrix.
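
    A short NumPy sketch of these rules, using two arbitrary example matrices:

    ```python
    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[0, 1],
                  [1, 0]])

    print(A + B)    # addition is commutative: A + B equals B + A
    print(A @ B)    # [[2 1], [4 3]]  (columns of A swapped)
    print(B @ A)    # [[3 4], [1 2]]  (rows of A swapped) -- not the same!
    ```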

    6. Inverse matrices
    The inverse of a matrix $A$ is another matrix, denoted $A^{-1}$, such that the product of the matrix and its inverse is the identity matrix: $AA^{-1} = A^{-1}A = I$.
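
    A quick NumPy check of this property (the matrix is an arbitrary invertible example):

    ```python
    import numpy as np

    A = np.array([[2., 1.],
                  [1., 1.]])
    A_inv = np.linalg.inv(A)       # raises LinAlgError if A is singular

    print(A @ A_inv)                             # the 2x2 identity matrix
    print(np.allclose(A @ A_inv, np.eye(2)))     # True
    ```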

    7. Linear dependence
    If a vector can be expressed as a linear combination of the other vectors, the set is said to be linearly dependent. If not, the vectors are said to be linearly independent.
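
    One practical test, sketched below in NumPy, stacks the vectors as columns of a matrix and compares the rank to the number of vectors (the example vectors are our own):

    ```python
    import numpy as np

    # The vectors are linearly independent exactly when the rank of the
    # matrix holding them as columns equals the number of vectors.
    v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 2]   # v3 = v1 + v2

    M = np.column_stack([v1, v2, v3])
    print(np.linalg.matrix_rank(M))   # 2 < 3, so the set is linearly dependent
    ```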

    8. Solution space
    A solution space is the vector space containing all solutions to a given homogeneous system. It follows the properties of vector spaces, so if $\vec{x}_1$ and $\vec{x}_2$ are solutions to $$A\vec{x} = \vec{0}$$ then we have that $$k_1\vec{x}_1 + k_2\vec{x}_2$$ is also a solution for any values of $k_1, k_2$.

    9. Matrices of special form
    Common matrices of special form are diagonal matrices ($D$), triangular matrices ($T$), symmetric matrices ($S$) and skew-symmetric matrices ($S_k$), whose diagonal entries are zero. $$ \begin{aligned} D &= \left[\begin{array}{cc} d_1 & 0 \\ 0 & d_2 \end{array}\right] ,\quad T &&= \left[\begin{array}{cr} t_1 & \phantom{-}0 \\ t_2 & t_3 \end{array}\right] \\ S &= \left[\begin{array}{cc} s_1 & s_2 \\ s_2 & s_3 \end{array}\right] ,\quad S_k &&= \left[\begin{array}{cr} 0 & -s_2 \\ s_2 & 0 \end{array}\right] \end{aligned} $$

    10. Determinant
    The determinant is a scalar associated with a square matrix, defined by a specific calculation. The geometric interpretation is that it is the scale factor of the linear transformation the matrix represents. It also tells us whether the system of linear equations that the matrix represents has a unique solution or not.
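
    A small NumPy illustration of both interpretations (the matrices are arbitrary examples):

    ```python
    import numpy as np

    A = np.array([[3., 1.],
                  [1., 2.]])
    print(np.linalg.det(A))   # 5.0: the unit square is mapped to a
                              # parallelogram of area 5

    B = np.array([[1., 2.],
                  [2., 4.]])  # second row is twice the first
    print(np.linalg.det(B))   # ~0.0: B is singular, so Bx = b has no
                              # unique solution
    ```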

    11. Cross product, area and volume
    The cross product is an operation on two vectors in three dimensions, and the result is a third vector that is orthogonal to the first two. The length of the resulting vector equals the area of the parallelogram that the two vectors span. The cross product can also be used to calculate the volume of a parallelepiped, as part of a calculation called the scalar triple product: given the three vectors spanning the volume, one takes the cross product of two of them and then the dot product of the resulting vector with the third.
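
    A minimal NumPy sketch of both calculations (the example vectors are chosen for easy mental arithmetic):

    ```python
    import numpy as np

    u = np.array([1., 0., 0.])
    v = np.array([0., 2., 0.])
    w = np.array([0., 0., 3.])

    n = np.cross(u, v)
    print(n)                      # [0. 0. 2.], orthogonal to both u and v
    print(np.linalg.norm(n))      # 2.0 = area of the parallelogram

    volume = abs(np.dot(np.cross(u, v), w))   # scalar triple product
    print(volume)                 # 6.0 = volume of the parallelepiped
    ```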

    12. Eigenvalues and eigenvectors
    Eigenvalues and eigenvectors are associated with a given square matrix $A$. An eigenvector is a vector that does not change its direction when multiplied by $A$; it can, however, change its length. The factor by which the length changes is the corresponding eigenvalue of that eigenvector.
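
    A short NumPy sketch that computes the eigenpairs and verifies the defining property $A\vec{v} = \lambda\vec{v}$ (the matrix is an arbitrary example):

    ```python
    import numpy as np

    A = np.array([[2., 1.],
                  [1., 2.]])

    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)        # 3 and 1 (the order may vary)

    # Verify A v = lambda v for the first eigenpair
    v = eigenvectors[:, 0]
    print(A @ v, eigenvalues[0] * v)   # the same vector: direction unchanged
    ```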

    13. Linear transformations
    All matrix multiplications are linear transformations. In general, a transformation can be seen as a function, or a black box, that produces an output for any given input. For the transformation to be linear, the operation must respect linear combinations of inputs, in our case vectors. Let $L$ be a transformation, $\vec{x}$ and $\vec{y}$ be vectors, and $c$ and $k$ be scalars. Then $L$ is a linear transformation if, and only if, $$L(c\vec{x} + k\vec{y}) = cL(\vec{x}) + kL(\vec{y})$$
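
    A quick numerical check of this definition, sketched in NumPy with a rotation matrix as the example transformation:

    ```python
    import numpy as np

    A = np.array([[0., -1.],   # rotation by 90 degrees:
                  [1.,  0.]])  # a linear transformation

    L = lambda x: A @ x

    x = np.array([1., 2.])
    y = np.array([3., -1.])
    c, k = 2.0, -3.0

    # L(cx + ky) == c L(x) + k L(y) must hold for a linear transformation
    print(np.allclose(L(c*x + k*y), c*L(x) + k*L(y)))   # True
    ```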

    14. Kernel and image
    Kernel and image are subspaces related to a linear transformation $L$ represented by the standard matrix $A$. The kernel is the solution space of the homogeneous system of linear equations that the matrix $A$ represents, meaning the solutions to $$A\vec{x} = \vec{0}$$ The image is the subspace of all vectors $\vec{y}$ obtained by multiplying the matrix $A$ with every possible vector $\vec{x}$: $$A\vec{x} = \vec{y}$$

    15. Compositions of linear transformations
    Compositions of linear transformations are about applying multiple linear transformations in sequence. For instance, let $T$, $R$ and $S$ be linear transformations. A composition is then, for example, the output $\vec{y}$ for the vector $\vec{x}$ in $$T \circ R \circ S(\vec{x}) = \vec{y}$$ where $S$ is applied first, then $R$, then $T$.

    16. Basis and dimension
    A basis is a set of linearly independent vectors (for instance $\vec{v}_1, \ldots, \vec{v}_n$) that span a vector space or subspace. This means that any vector $\vec{x}$ belonging to that space can be expressed as a linear combination of the basis vectors for a unique set of constants $k_1, \ldots, k_n$: $$ \vec{x} = k_1\vec{v}_1 + \ldots + k_n\vec{v}_n $$ The dimension of the vector space is the number of vectors required to form a basis (the basis itself is not unique); in this example, $n$.

    17. Null space and column space
    The null space (commonly referred to as the kernel) and the column space (commonly referred to as the image) are spaces related to a given matrix $A$. The null space is simply the name of the solution space of the homogeneous equation $A\vec{x} = \vec{0}$. The column space is the range of the linear transformation with standard matrix $A$, meaning all the possible vectors $\vec{y}$ that can be mapped to via multiplication with $A$, such that $A\vec{x} = \vec{y}$.

    18. Dimension theorem
    The dimension theorem relates a matrix to its rank (the dimension of the column space) and nullity (the dimension of the null space). Let $A$ be an $m \times n$ matrix; then, according to the dimension theorem: $$ \operatorname{rank}(A) + \operatorname{nullity}(A) = n $$
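
    A small NumPy sketch verifying the theorem on an example matrix (the tolerance used to detect zero singular values is our own choice):

    ```python
    import numpy as np

    A = np.array([[1., 2., 3.],
                  [2., 4., 6.]])   # 2 x 3 matrix whose rows are parallel

    m, n = A.shape
    rank = np.linalg.matrix_rank(A)            # 1

    # A basis for the null space: rows of Vt whose singular values vanish
    _, s, Vt = np.linalg.svd(A)
    null_basis = Vt[np.count_nonzero(s > 1e-10):]
    nullity = null_basis.shape[0]              # 2

    print(rank + nullity == n)                 # True: rank + nullity = n
    print(np.allclose(A @ null_basis.T, 0))    # the basis really solves Ax = 0
    ```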

    19. Projection theorem
    By projection one typically means the orthogonal projection of one vector onto another. The result is the contribution of the first vector along the vector projected onto: $$ \operatorname{proj}_{\vec{v}}\vec{u} = \frac{\vec{u} \cdot \vec{v}}{\vec{v} \cdot \vec{v}}\,\vec{v} $$ Imagine the sun at zenith, casting a shadow of the first vector straight down (orthogonally) onto the second vector. That shadow is the orthogonal projection of the first vector onto the second.
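
    A minimal NumPy sketch of this formula (the helper name project and the example vectors are our own):

    ```python
    import numpy as np

    def project(u, v):
        """Orthogonal projection of u onto v."""
        return (np.dot(u, v) / np.dot(v, v)) * v

    u = np.array([3., 4.])
    v = np.array([1., 0.])

    p = project(u, v)
    print(p)                   # [3. 0.]: the "shadow" of u on v
    print(np.dot(u - p, v))    # 0.0: the residual is orthogonal to v
    ```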

    20. Least squares
    Least squares is a method used in data analysis and statistics. It describes the relationship between a number of observations and their explanatory variables. Say you have a number of observations $(x_n, y_n)$ whose relationship you would like to model by a linear equation $$c_1x + c_2 = y$$ where $c_1$ and $c_2$ are constants to determine so that the line fits the data optimally. This is done by minimizing the sum of the squared errors, meaning the squared distances between the line and each of the observed data points. This is easily calculated by first rewriting the system of linear equations as the matrix equation $$A\vec{c} = \vec{y}$$ and then multiplying both sides from the left by the transpose of $A$: $$A^TA\vec{c} = A^T\vec{y}$$ which results in a square system with a unique solution. The solution is the vector $\vec{c}$, consisting of the sought constants $c_1$ and $c_2$.
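
    A minimal NumPy sketch of this procedure on made-up data roughly following $y = 2x + 1$:

    ```python
    import numpy as np

    # Noisy observations roughly following y = 2x + 1
    x = np.array([0., 1., 2., 3., 4.])
    y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

    # Build A so that A @ [c1, c2] approximates y
    A = np.column_stack([x, np.ones_like(x)])

    # Solve the normal equations A^T A c = A^T y
    c = np.linalg.solve(A.T @ A, A.T @ y)
    print(c)    # roughly [2.0, 1.0]

    # np.linalg.lstsq solves the same problem (more robustly)
    print(np.linalg.lstsq(A, y, rcond=None)[0])
    ```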

    21. Gram-Schmidt
    Gram-Schmidt is an algorithm for finding an orthonormal (ON) basis for a given subspace. The input to the algorithm is a known, non-orthonormal basis, and by applying the projection theorem in sequence it finds the orthonormal basis vectors one by one.
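
    A minimal Python sketch of the algorithm, assuming the input vectors are linearly independent (the function name is our own):

    ```python
    import numpy as np

    def gram_schmidt(vectors):
        """Turn a list of linearly independent vectors into an ON basis."""
        basis = []
        for v in vectors:
            w = v.astype(float)
            for q in basis:             # subtract the projections onto the
                w = w - np.dot(w, q) * q  # ON vectors found so far
            basis.append(w / np.linalg.norm(w))
        return basis

    q1, q2 = gram_schmidt([np.array([1., 1.]), np.array([1., 0.])])
    print(q1, q2)             # orthonormal: unit length, dot product 0
    print(np.dot(q1, q2))     # ~0.0
    ```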

    22. Change of basis
    A basis is a set of vectors that are linearly independent and span a subspace. A vector is an element of a subspace, and its coordinates are the scalars of the linear combination of the basis vectors that expresses it. Since a basis is not unique for a subspace, every vector in that subspace can be expressed with coordinates relative to each of its bases.
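
    A small NumPy sketch of finding a vector's coordinates in a new basis by solving $P\vec{c} = \vec{x}$, where the columns of $P$ hold the new basis vectors (the example basis is our own):

    ```python
    import numpy as np

    # Columns of P are the new basis vectors expressed in the standard basis
    P = np.array([[1., 1.],
                  [0., 1.]])

    x = np.array([3., 2.])    # coordinates in the standard basis

    # Solve P c = x to get the coordinates c of x in the new basis
    c = np.linalg.solve(P, x)
    print(c)                  # [1. 2.]: x = 1*(1,0) + 2*(1,1)
    ```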

    23. Linear transformations and bases
    The matrix of a linear transformation is always expressed with respect to a given basis. A common challenge is to determine the standard matrix $A$ of a linear transformation that is given in another basis.

    24. Diagonalization
    Diagonalization is a process for decomposing a square $n \times n$ matrix $A$ into the product of three matrices $P$, $D$ and $P^{-1}$, such that $$A = PDP^{-1}$$ where $D$ is a diagonal matrix containing the eigenvalues of $A$, and $P$ is a square matrix whose columns are the eigenvectors of $A$. Note that not all square matrices can be diagonalized, only those whose eigenvectors span the space $\mathbb{R}^n$.
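
    A short NumPy sketch of the decomposition, including the classic payoff that matrix powers become cheap (the example matrix is our own):

    ```python
    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])

    eigenvalues, P = np.linalg.eig(A)    # columns of P are eigenvectors
    D = np.diag(eigenvalues)

    # Reassemble A from its eigendecomposition
    print(np.allclose(P @ D @ np.linalg.inv(P), A))   # True

    # Diagonalization makes matrix powers cheap: A^5 = P D^5 P^{-1}
    print(np.allclose(P @ np.diag(eigenvalues**5) @ np.linalg.inv(P),
                      np.linalg.matrix_power(A, 5)))  # True
    ```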

    25. Orthogonal diagonalization
    Orthogonal diagonalization is the same as regular diagonalization, with the additional requirement that the eigenvectors form an ON basis for $\mathbb{R}^n$. Only symmetric matrices are orthogonally diagonalizable. The vectors of the matrix $P$ are determined by applying Gram-Schmidt. Then, since $P$ is orthogonal so that $P^{-1} = P^T$, you have $$A = PDP^{-1} = PDP^T$$
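
    A minimal NumPy sketch using np.linalg.eigh, which is designed for symmetric matrices and returns orthonormal eigenvectors directly:

    ```python
    import numpy as np

    S = np.array([[2., 1.],
                  [1., 2.]])   # symmetric, so orthogonally diagonalizable

    eigenvalues, P = np.linalg.eigh(S)   # eigh: for symmetric matrices
    D = np.diag(eigenvalues)

    print(np.allclose(P @ P.T, np.eye(2)))   # P is orthogonal: P^-1 = P^T
    print(np.allclose(P @ D @ P.T, S))       # S = P D P^T
    ```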

    26. Quadratic form
    Equations of the form $$a_1x_1^2 + a_2x_2^2 + \ldots + a_nx_n^2$$ plus all possible cross terms $a_{ij}x_ix_j$ with $i \neq j$ are called quadratic forms. They can be expressed with a unique symmetric matrix $A$ as $$\vec{x}^TA\vec{x}$$ and their geometric shape can be determined by studying the eigenvalues of the matrix $A$.
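
    A small NumPy sketch of evaluating a quadratic form and inspecting its eigenvalues (the example form is our own):

    ```python
    import numpy as np

    # q(x1, x2) = 2 x1^2 + 3 x2^2 + 4 x1 x2, written as x^T A x with a
    # symmetric A (the cross-term coefficient is split over A[0,1], A[1,0])
    A = np.array([[2., 2.],
                  [2., 3.]])

    x = np.array([1., -1.])
    print(x @ A @ x)              # 2*1 + 3*1 + 4*(-1) = 1.0

    # Both eigenvalues are positive, so the level curves are ellipses
    print(np.linalg.eigvalsh(A))
    ```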

    27. General vector spaces
    Vector spaces do not need to be built from numbers; the concept can be applied far more generally. A popular application in a linear algebra course is polynomial spaces, where each element of the space is a polynomial. Confused? It can be confusing at first, but it's not as hard as it seems at first glance. The crucial thing is to know the axioms defining a vector space $V$, and to stick to them. Those axioms are:

    • The space $V$ is closed under addition and scalar multiplication
    • Addition is both commutative and associative
    • Scalar multiplication is associative and distributes over vector and scalar addition
    • Existence of an identity and an inverse for addition
    • Existence of an identity for scalar multiplication: $1\vec{v} = \vec{v}$

    28. General linear transformations
    General linear transformations are linear transformations between general vector spaces. A classic example is differentiation on a space of polynomials, which is linear since $(cp + kq)' = cp' + kq'$ for polynomials $p, q$ and scalars $c, k$.

    What is linear algebra?

    Linear algebra is a branch of mathematics that, among other things, studies equations of the form: $$a_1x_1 + a_2x_2 + \ldots + a_nx_n = b$$ You usually encounter linear algebra for the first time at university, but some get an easier introduction to the subject already in high school. Many people associate a course in linear algebra with vectors, matrices, linear systems of equations, lines and planes. All of these associations are correct and can be deduced from the equation above.

    FAQ

    What is a line in linear algebra?

    If the equation is two-dimensional, it is a line: $$a_1x_1 + a_2x_2 = b$$

    What is a plane in linear algebra?

    If the equation is three-dimensional, it is a plane: $$a_1x_1 + a_2x_2 + a_3x_3 = b$$

    What is a system of equations?

    If we have several equations of the same form, they can be considered a system of equations. By definition, a solution to the system must solve all equations in the system. A system of equations can be rewritten as: $$A\vec{x} = \vec{b}$$ using the matrix $A$ and the two vectors $\vec{x}, \vec{b}$.

    What is a vector?

    A vector in linear algebra is usually treated as a group of numbers and is a coordinate representation. Some examples from different dimensions, i.e. numbers of coordinates, are: $$(1,2), (1,-4,2), (99, 104, 3, -7)$$ A vector in physics, on the other hand, is usually drawn as an arrow and represents a force with a direction and magnitude. The intuition of a vector as a force often carries over into mathematics, but there vectors are studied in a more abstract sense. In practice, a vector can be much more than a force: a list of information in programming, an image in image analysis, a stock price's change over time in finance, and much more.

    What is a cross product?

    The cross product is an operation on two vectors in three dimensions, and the result is a third vector that is orthogonal to the first two. The length of the resulting vector equals the area of the parallelogram that the two vectors span.

    What is linear algebra used for? - 7 practical use cases

    Linear algebra is the cornerstone of much of what we see in our everyday life. It's thanks to linear algebra that Boeings fly, Teslas drive and Spotify plays. Linear algebra is also the basis of machine learning, which has numerous applications, such as your iPhone recognizing your face, Alexa recognizing your voice, and H&M maximizing its online sales. But how can linear algebra be critical for such vastly different applications?

    Students usually say that the course feels abstract, which is to a certain extent correct. The advantage of an abstract tool is that only the imagination limits its areas of use.

    1. Encryption

    A smart way to protect the private information we send to each other is through encryption and decryption processes involving inverse matrices. To ensure that an eavesdropper cannot read a message sent electronically, we encrypt it, distorting and rearranging the symbols to make them look like nonsense. For the right person to read it, the message must then be decrypted upon receipt.

    Using a technique called the Hill cipher, the sender and receiver agree on a matrix to use for encrypting messages. Since the recipient knows the matrix, they can use its inverse to decrypt the message.
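
    As an illustration, here is a toy Hill cipher in Python over a 26-letter alphabet; the key matrix is an example chosen to be invertible mod 26, not a secure choice:

    ```python
    import numpy as np

    # Toy 2x2 Hill cipher over the alphabet A=0, ..., Z=25.
    # KEY must be invertible mod 26, i.e. gcd(det(KEY), 26) = 1.
    KEY = np.array([[3, 3],
                    [2, 5]])
    KEY_INV = np.array([[15, 17],    # inverse of KEY mod 26, precomputed
                        [20,  9]])   # by hand: KEY @ KEY_INV % 26 == I

    def transform(matrix, text):
        """Apply the key (or its inverse) to text, two letters at a time."""
        nums = [ord(ch) - ord('A') for ch in text]   # assumes even length
        out = []
        for i in range(0, len(nums), 2):
            block = (matrix @ np.array(nums[i:i+2])) % 26
            out.extend(block)
        return ''.join(chr(int(n) + ord('A')) for n in out)

    cipher = transform(KEY, 'HELP')
    print(cipher)                      # 'HIAT' -- looks like nonsense
    print(transform(KEY_INV, cipher))  # 'HELP' -- decrypted again
    ```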

    2. Computed tomography (CT scan)

    A CT scan is a medical imaging system that shoots X-rays through a body from many different angles. Based on these X-ray images, we can construct a picture of what the body looks like on the inside. But have you ever wondered how the image is actually constructed?

    The answer is, of course, mathematics, specifically the so-called Radon transform. The Radon transform is a type of integral transform, which is a linear transformation in a more general sense.

    3. Digital photo filters

    A pixel refers to a small region on your screen represented by three numbers between 0 and 255 that indicate the intensity of the red, green, and blue components, respectively. We use pixels to create digital images, and to change the appearance of an image, we adjust the values of its pixels.

    Setting the three components to similar values creates a grayscale image, and increasing or decreasing them results in a lighter or darker appearance, respectively. By changing the intensity of the three components of the constituent pixels, there are endless other ways in which images can be manipulated to enhance certain features. Matrix arithmetic enables us to do this efficiently. Consequently, it's part of the math to thank for the filters that make your Instagram posts look amazing.

    4. Pricing within e.g. insurance

    Red cars are overrepresented in traffic accident statistics, so why aren't the insurance costs for red cars higher than for other colors of the same model? When we dig a little deeper, we find that it is not the color red itself that is a risk factor on the roads; rather, the color is linked to other characteristics that are. Red is a common color for sports cars, which tend to have powerful engines and male drivers. They also tend to have a high price tag.

    The cost of the insurance is linearly dependent on the value of the car and the risk of collision, i.e. if these go up, the insurance premium increases by a proportional amount. The probability that a car is red in color and its insurance cost depend on the same parameters, but do not directly affect each other.

    5. Search engines

    Computers often perform calculations on matrices, and it turns out that special types of matrices where many of the elements are zero make calculations both faster and more accurate.

    Larry Page and Sergey Brin, the founders of Google, knew everything about how computer arithmetic works and how to optimize it. This enabled the revolution in the search engine market that Google brought about, delivering more frequent and more relevant hits on the web than their competitors.

    6. Face recognition

    Face recognition systems are used to distinguish between people's faces so that only the right person is given access. Such programs rely heavily on the concept of eigenvectors.

    It turns out that human faces can be described as linear combinations of certain distinguishing features, such as hair color, nose size, distance between the eyes, and so on. To properly construct these features, the system needs many images of people to learn from, but after it has captured the important aspects of faces, a much smaller number of special images can be used to reconstruct and compare any of the people.

    These special images are called eigenfaces. The name comes from the fact that they are essentially eigenvectors of a matrix containing information about the facial features found in the set of given images. The corresponding eigenvalues give a measure of how important each eigenface is for distinguishing between different people.

    7. Self-driving cars

    Just like human drivers, self-driving cars must constantly scan the roads for obstacles and road signs to navigate our streets safely. To be able to do that, the car is equipped with cameras that take snapshots of the surroundings at very short intervals. But how does the car know if the Volvo in front is driving nonchalantly on the road, or has stopped suddenly as a result of an accident?

    The answer is linear transformations. An image of a car far away has a completely different pixel representation compared to a close-up of the same car. However, there is a linear relationship between the images, as the car itself does not change its appearance. Through linear transformations that zoom and rotate the image sequence, the self-driving algorithm can determine the behavior of the car in front and act accordingly.

    Is linear algebra hard?

    Linear algebra is usually considered a difficult threshold for students to cross. In addition to presenting a new world of mathematics, it comes with a new language and a lot of inconsistent usage. On top of that, English is often an extra hurdle for students whose mother tongue is not English. The most difficult parts of linear algebra are usually linear transformations, change of basis, and linear transformations with respect to different bases.

    Why is it called linear algebra?

    Linear algebra is called "linear" because it deals with linear equations and linear transformations. A linear equation is an equation in which the highest power of the variable is 1. For example, $2x + 3 = 0$ is a linear equation, whereas $x^2 + 4x + 3 = 0$ is not. Linear equations can be represented graphically as straight lines.

    A linear transformation is a transformation of a vector space that preserves the operations of vector addition and scalar multiplication. In other words, it is a function that maps one vector to another in such a way that the properties of vectors and scalars are preserved.

    Linear algebra also deals with concepts such as matrices, determinants, eigenvalues, and eigenvectors, which are used to represent and manipulate linear equations and transformations. All of these concepts have special properties when it comes to linearity, which is why the subject is called linear algebra.

    Good outline for linear algebra and short to-do list

    Contrary to many textbooks, we work hard to provide you with short, concise and educational material.

    Get exam problems from old linear algebra exams, divided into chapters

    The trick is to both learn the theory and practice on exam problems. We have categorized them to make it extra easy.
