Data Science – Key Algebra Topics to Master

This article covers key topics in algebra that are worth brushing up on, or mastering, in order to understand how machine learning algorithms work. If you are gearing up to become a data scientist, the topics below deserve your attention; I had to revisit each of them while learning different machine learning algorithms. The concepts listed below, especially those related to linear algebra, touch almost all machine learning algorithms. Please feel free to comment or suggest additions if I have missed any important points.

The following high-level topics are elaborated later in this article:

  • Basic algebra concepts
  • Linear Algebra

Topics in Basic Algebra

Following are some of the key topics in basic algebra that one may need to brush up on. A solid grasp of the concepts below forms the backbone of understanding any machine learning algorithm. If the topics listed below do not come easily to you, linear algebra will be considerably harder.

  • Linear equations
  • Quadratic equations
  • Polynomials
  • Functions
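
As a quick refresher on the first two items, here is a minimal Python/NumPy sketch (the equations themselves are made-up examples) that solves one linear and one quadratic equation:

```python
import numpy as np

# Linear equation: 3x + 6 = 0  ->  x = -b/a = -2
a, b = 3.0, 6.0
x_linear = -b / a

# Quadratic equation: x^2 - 5x + 6 = 0  ->  roots 2 and 3
# np.roots takes the polynomial coefficients in descending powers.
roots = np.roots([1.0, -5.0, 6.0])

print(x_linear)        # -2.0
print(np.sort(roots))
```

The same `np.roots` call handles higher-degree polynomials as well, which is handy when checking work by hand.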

Topics in Linear Algebra

As per the Wikipedia page on Linear Algebra, here is the definition: "Linear algebra is the branch of mathematics concerning vector spaces and linear mappings between such spaces. It includes the study of lines, planes, and subspaces, but is also concerned with properties common to all vector spaces." Linear algebra, in general, is very useful for modeling, simulation, and related tasks, which is much of what working with machine learning algorithms involves. Following are some of the key topics in linear algebra that one may need to brush up on:

  • Vectors & Spaces: This topic primarily covers what a vector is and the operations supported on vectors (such as addition, subtraction, and multiplication). I felt a pressing need for a good understanding of vectors when I was working on Support Vector Machines (SVMs). One needs to understand basic vector concepts and operations such as the dot product to get a good grasp of SVMs. Some of the following concepts should be studied:
    • Vectors
    • Operations with Vectors (Addition, subtraction, multiplication)
    • Inner/dot and cross products (related to vector multiplication)
    • Null space (the set of vectors x satisfying Ax = 0 for a given matrix A) & column space (the set of all linear combinations of the columns of a matrix)
  • Matrix Transformations: This topic helps us understand the mapping of one set of vectors to another. An understanding of matrices is a must for understanding almost any machine learning algorithm, from linear regression through artificial neural networks. The topic covers some of the following related concepts:
    • Functions & linear transformations
    • Matrix operations such as addition, subtraction, multiplication
    • Inverse functions & transformations
    • Transpose of a matrix
  • Orthogonal projections: An understanding of orthogonal projections makes SVM concepts, such as the distance from a point to the separating hyperplane, much easier to follow.
  • Eigenvectors, eigenvalues, eigenspaces: Understanding eigenvectors, eigenvalues, and eigenspaces is key to understanding dimensionality reduction algorithms such as Principal Component Analysis (PCA).
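
As a concrete illustration of the vector operations listed above, here is a minimal NumPy sketch (the vectors and the matrix are made-up examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

added = u + v         # element-wise addition -> [5, 7, 9]
diff = u - v          # element-wise subtraction
dot = np.dot(u, v)    # inner/dot product: 1*4 + 2*5 + 3*6 = 32
cross = np.cross(u, v)  # cross product, perpendicular to both u and v

# Null space: the vectors x with A @ x = 0. For this rank-1 matrix,
# x = [2, -1] is one such vector (the second row is twice the first).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
x = np.array([2.0, -1.0])
print(A @ x)          # [0. 0.]
```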

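The matrix-transformation ideas above can also be sketched in a few lines of NumPy (the 90-degree rotation matrix is a made-up example):

```python
import numpy as np

# A 2x2 matrix viewed as a linear transformation:
# a 90-degree counter-clockwise rotation of the plane.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

p = np.array([1.0, 0.0])
rotated = R @ p            # matrix-vector product: maps (1, 0) to (0, 1)

R_inv = np.linalg.inv(R)   # the inverse transformation (rotate back)
R_T = R.T                  # the transpose

# For rotation matrices the inverse happens to equal the transpose:
print(np.allclose(R_inv, R_T))   # True
print(R_inv @ rotated)           # back to [1. 0.]
```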

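Finally, the last two topics above, orthogonal projections and eigendecompositions, can be illustrated together (the vectors and the symmetric matrix are made-up examples):

```python
import numpy as np

# Orthogonal projection of b onto the line spanned by a:
# proj = (a . b / a . a) * a
a = np.array([1.0, 1.0])
b = np.array([3.0, 1.0])
proj = (np.dot(a, b) / np.dot(a, a)) * a  # component of b along a
residual = b - proj                       # perpendicular to a
print(np.dot(residual, a))                # ~0.0: residual is orthogonal to a

# Eigenvalues and eigenvectors of a symmetric 2x2 matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
# Each column v of `eigenvectors` satisfies A @ v = lambda * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))    # True for every pair
```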
Ajitesh Kumar
