# Axler revisited

I was looking for something in my old stack of books when I stumbled onto Sheldon Axler's fantastic book, "**Linear Algebra Done Right**". I have fond memories of the book; I think the last time I referred to it was more than 3.5 years ago. I took a few hours to go over the book again. Like wine that tastes better when aged, some books give the same kind of effect, at least to me. A legitimate understanding of ANY discipline needs one to have a good grip on Linear Algebra. The more randomness in the field you choose to work in, the more you will use linear algebra, as you are forever looking for an approximate solution.

A very basic example that any stats person learns is least squares. When you build a model Y = Xβ + e, all you are doing is finding the best 2-norm approximation of Y in the column space of X. Take any machine learning algorithm, from the most basic to the most sophisticated: linear algebra pervades everywhere. Any electrical engineer breathes the FFT algorithm, which is nothing but a change of basis: you take a discretized function in one basis and represent it in another basis. A structural engineer's job will certainly involve solving differential equations relating to structures, and to solve them numerically he/she will need to study the stability of first-order and second-order difference matrices. An econometrician needs to understand invariant subspaces, i.e. the concepts of eigenvalues and eigenvectors, to study even the most elementary time series, the AR(1). Optimization pervades many fields, and all the techniques involved in optimization are formulated, solved and reported in terms of linear algebra terminology. Minimizing a multivariate function entails computing gradients (vectors) and Hessians (matrices). Without the language of Linear Algebra, optimization as a field would not have developed into what it is today. I can go on and on… But the point I want to make is this: for some reason, I had never learnt the importance of linear algebra as a **"thinking tool"** until many years into my working life. When I did start learning the subject, I realized that it gives one a set of fresh eyes to look at a ton of stuff. Many things start to make sense when viewed from a matrix algebra perspective.
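A minimal numerical sketch of the least squares claim above (the data here is entirely made up for illustration): the fitted values X·β̂ are the closest point, in 2-norm, to Y inside the column space of X, which is exactly why the residual comes out orthogonal to every column of X.

```python
# Least squares as projection: X @ beta_hat is the 2-norm-closest point
# to y in the column space of X. Data is synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])  # intercept + one regressor
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.1, size=50)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Hallmark of an orthogonal projection: the residual y - X @ beta_hat
# is orthogonal to every column of X.
residual = y - X @ beta_hat
print(np.allclose(X.T @ residual, 0.0))
```

The orthogonality check is the whole geometric content of least squares: regression software is, under the hood, projecting Y onto a subspace.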

For example, if you are asked to approximate **sin(x)** over the interval **[-π, π]** using a fifth-order polynomial, a₀ + a₁x + a₂x² + a₃x³ + a₄x⁴ + a₅x⁵, how would you go about it? What are your first thoughts? Pause for a while…

Well, you can use a Taylor series approximation, but it only works well near the origin, not across the entire interval **[-π, π]**. A person who breathes stats would generate a few data points of **sin(x)** and run a polynomial regression. An electrical engineer would probably treat the six functions (1, x, x², x³, x⁴, x⁵) as input signals and then come up with a linear filter to approximate the **sin(x)** function. Basically, there are many ways to do it. However, an elegant way is to think of **sin(x)** as an infinite-order polynomial and express the function in terms of the basis of a subspace, i.e. the space of fifth-order polynomials. You can do this with pen and paper; no fancy software is needed to solve the problem. I like this book because it was the first book that really taught me what a matrix actually means. I have listed a few questions in a document. These questions drive home the importance of actually "**understanding matrices**" as opposed to merely "**using matrices**". The questions in the document are in no particular order of difficulty. One thing that is common across all of them is that Axler's book provides a beautiful explanation for every one of them, and in the process will make anyone appreciate the beauty of Linear Algebra.
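The "elegant way" above can be sketched numerically (Axler's derivation is pen-and-paper; this is not his computation, just an approximation of the same projection with the integrals replaced by sums on a fine grid): project sin onto the subspace of polynomials of degree ≤ 5 inside L²([-π, π]), using the inner product ⟨f, g⟩ = ∫ f(x)g(x) dx.

```python
# Best L2([-pi, pi]) approximation of sin(x) by a degree-5 polynomial:
# solve the normal equations of the orthogonal projection onto
# span{1, x, ..., x^5}, with integrals approximated by the trapezoid rule.
import numpy as np

n = 20001
x = np.linspace(-np.pi, np.pi, n)
dx = x[1] - x[0]
w = np.full(n, dx)
w[0] = w[-1] = dx / 2            # trapezoid-rule weights

def inner(f, g):
    """Approximate <f, g> = integral of f*g over [-pi, pi]."""
    return np.sum(f * g * w)

basis = np.stack([x**k for k in range(6)])   # 1, x, x^2, ..., x^5

# Normal equations G a = b, where G[i, j] = <x^i, x^j>, b[i] = <x^i, sin>.
G = np.array([[inner(basis[i], basis[j]) for j in range(6)] for i in range(6)])
b = np.array([inner(basis[k], np.sin(x)) for k in range(6)])
a = np.linalg.solve(G, b)                    # coefficients a0..a5

err = np.max(np.abs(a @ basis - np.sin(x)))  # worst-case error on [-pi, pi]
print(a.round(6), err)
```

Since sin is odd, the even coefficients a₀, a₂, a₄ come out numerically zero, and the worst-case error over the whole interval is a couple of hundredths, far better than the degree-5 Taylor polynomial, whose error at the endpoints is about 0.52.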