
Is there any book which actually explains where matrices and their rules come from, instead of throwing matrix multiplication rules at you dogmatically so that you blindly follow them like a mindless robot?

I know there are lectures on YouTube from 3Blue1Brown:

https://www.youtube.com/watch?v=kjBOesZCoqc&list=PLZHQObOWTQ...

So I want a book which covers linear algebra in a similar manner.



Matrices are not fundamental or interesting by themselves as mere Excel-like grids of numbers. The reason we care about them is that they are a convenient notation for a certain class of functions.

I have to catch a flight so I don't have time to explain this fully, but the key points are:

1/ A "linear function" is a function where each variable in the output is "linear" in all the variables of the input (i.e., a sum of constant multiples of the input variables). e.g. f(x,y) = (x + 3y, y - 2x) is a linear function, but g(x, y) = (x^2, sin(y)) is not.

2/ All linear functions can be represented by a matrix. The `f` I mentioned above corresponds to the matrix:

  [ 1 3
   -2 1 ]
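
One concrete way to see where those entries come from (a sketch, assuming NumPy is installed): the columns of the matrix are just f applied to the standard basis vectors (1, 0) and (0, 1).

  import numpy as np

  def f(x, y):
      return (x + 3*y, y - 2*x)

  # each column of the matrix is f applied to a standard basis vector
  M = np.column_stack([f(1, 0), f(0, 1)])
  print(M)
  # [[ 1  3]
  #  [-2  1]]
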
3/ The rules of matrix multiplication are defined so that multiplying by the matrix of a linear function corresponds to applying that function.

For example, again using the definitions above:

  f(7, 8) = (31, -6)
And notice that we get the same thing when we do matrix multiplication:

  [ 1 3    * [ 7    = [ 31
   -2 1 ]      8 ]     -6 ]
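
The same check in NumPy (a sketch; the `@` operator does matrix multiplication):

  import numpy as np

  M = np.array([[1, 3],
                [-2, 1]])
  print(M @ np.array([7, 8]))  # [31 -6] -- same as f(7, 8)
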
4/ Matrix multiplication also corresponds to function composition. If `f` is as defined above, and `h` is defined by h(x, y) = (-3y, 4x + y), then the matrix for h is

  [ 0 -3
    4  1 ]
and the function `f ∘ h` you get by applying `h` and then `f` is (you can check this...)

  f ∘ h(x, y) = (12x, 4x + 7y)
The matrix for this function happens to be

  [ 12 0
     4 7 ]
But, lo and behold, this matches matrix multiplication:

  [ 1 3    * [ 0 -3    =  [ 12 0
   -2 1 ]      4  1 ]        4 7 ]
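
This is easy to verify numerically too (a sketch, assuming NumPy; the sample point (7, 8) is arbitrary):

  import numpy as np

  F = np.array([[1, 3],
                [-2, 1]])   # matrix of f
  H = np.array([[0, -3],
                [4, 1]])    # matrix of h

  def f(x, y): return (x + 3*y, y - 2*x)
  def h(x, y): return (-3*y, 4*x + y)

  print(F @ H)                       # [[12  0]
                                     #  [ 4  7]]  -- the matrix of f ∘ h
  print((F @ H) @ np.array([7, 8]))  # [84 84]
  print(f(*h(7, 8)))                 # (84, 84) -- applying h, then f, agrees
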
5/ Why do we care about linear functions? Well, linear functions are interesting for a lot of reasons, but one in particular is that (differential) calculus is all about approximating arbitrary differentiable functions by linear ones. So you might have some weird function, but if it's differentiable, you know that "locally" it is approximated by some (constant plus a) linear function.
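
A quick numerical illustration of that last point (a sketch, assuming NumPy; the function g(x, y) = (x^2, sin y) from point 1 and the base point are just example choices): near a point, a differentiable function is approximated by its value there plus the linear map given by its Jacobian matrix.

  import numpy as np

  def g(x, y):
      return np.array([x**2, np.sin(y)])

  def jacobian_g(x, y):
      # matrix of partial derivatives of g at (x, y)
      return np.array([[2*x, 0.0],
                       [0.0, np.cos(y)]])

  p = np.array([1.0, 2.0])     # base point
  d = np.array([0.01, -0.02])  # a small displacement

  exact = g(*(p + d))
  approx = g(*p) + jacobian_g(*p) @ d  # constant plus a linear function of d
  print(np.abs(exact - approx).max())  # tiny: the Jacobian captures g locally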



"Linear Algebra Done Right" is a fine book but its enduring popularity leads people to recommend it as a universal default answer.

The parent asked if there was a LA book that covered the material in the same style as 3Blue1Brown's videos. If that's the criterion, Sheldon Axler's book isn't the best choice. One can compare a sample chapter to the YouTube videos and see that they use different pedagogy:

http://linear.axler.net/Eigenvalues.pdf


I second this. Linear Algebra Done Right is an awesome book. It also comes with a helpful selection of exercises after each chapter with detailed answers available on the website, which is great for self-learners. If you are a student and your Uni has a Springer subscription, you might be able to get the PDF for free.


For intuition about linearity, check out this intro jupyter notebook: https://github.com/minireference/noBSLAnotebooks/blob/master... and the associated video https://www.youtube.com/watch?v=WfrwVMTgrfc


The No Bullshit Guide to Linear Algebra https://gumroad.com/l/noBSLA#


I like the look of the presentation and the active teaching method adopted. I also like the way Savov gives out the definitions and the facts as a pdf but keeps the exercises, investigations and examples for the paid version - the exercises are the value added in Maths in my limited experience.

I just bought the paper version off Lulu (I like being able to read and scribble and then go on the computer for the computational exercises). And now to set up SymPy on Debian...
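
In case it helps anyone else following along, a minimal setup sketch (package names as in Debian/PyPI; the matrices are the ones from the thread above):

  # sudo apt install python3-sympy   (or: pip install sympy)
  import sympy as sp

  F = sp.Matrix([[1, 3], [-2, 1]])
  H = sp.Matrix([[0, -3], [4, 1]])
  print(F * H)  # Matrix([[12, 0], [4, 7]])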


There was a web book I found a while ago that built up some sort of motivation for linear algebra. Unfortunately I don't remember the title.

Edit: found it, https://graphicallinearalgebra.net

Ymmv. Matrix multiplication is defined the way it is imo because it has interesting properties that way. Not very satisfying though.


You might find my Clojure Linear Algebra Refresher helpful.

http://dragan.rocks/articles/17/Clojure-Linear-Algebra-Refre...

This is the link for the first part. You'll find further articles there.

It walks you through the code, explains things briefly, and points you to the exact places in a good Linear Algebra with Applications textbook where this is explained in detail.


My linear algebra teacher taught it in a visual and proof-focused way, and it was amazing. He also tied it upward into abstract algebra with respect to vector spaces. He later taught my abstract algebra course, where he tied things back into linear algebra. That was an amazing set of classes...



