گيس گلابتون

Thursday, 7 Esfand 1382
In Praise of Matrices!
Wednesday, February 25, 2004
Hallooo! Bita dear, sorry if you're having trouble with the English! Writing in English just comes more naturally; I'll switch back some time. Soroush, I'm sad to hear that you aren't going to write anymore! Come on, man! What's the problem? As you can see, I got over my "no more laundry" problem (and got stuck with les autres!), and so should you!

About matrix analysis: from what I know, being a science major, matrices have MANY applications. Also, I never used "Gauss-Jordan reduction" (whatever that is); I know of Gaussian reduction and the Jordan canonical form, and neither has anything to do with the inverse of matrices! Also, you broke my heart saying such mumbo jumbo about matrices (don't take it too seriously, I'm being dramatic)! Matrices are to linear algebra what functions are to calculus! God, I love this analogy! No, but really, you can't divorce one from the other! The reason we're even interested in knowing what a matrix inverse, minor, etc. is, is for isomorphisms, isomorphic spaces, and the transformation of a vector space from one geometric space with a certain basis to another (this is very easy using a matrix as opposed to numerous equations; I assume this has uses in differential geometry and topology), and from there equivalence relations, equivalence classes, etc.

Augmented matrices are just a simple device for solving systems of linear equations using row echelon form, and I find them much easier and faster to work with than writing out all the variables. It's very easy to see the relationship between the vectors, especially if the vector space is something like P_n(R), the space of all polynomials of degree at most n over the field R. They also make it easier to do any problem involving linear combinations, span, bases, and linear dependence/independence, because you can just write out the coefficients as an augmented matrix instead of the whole equations, and you can easily see the relationships.
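Since I keep going on about augmented matrices and row echelon form, here's a minimal sketch of the idea in plain Python. The function name `row_reduce` and the example system are my own made-up illustration, not anything from a textbook:

```python
# A minimal sketch: solve a linear system by row-reducing the augmented
# matrix [A | b] to reduced row echelon form. Pure Python, no libraries.

def row_reduce(aug):
    """Bring an augmented matrix [A | b] to reduced row echelon form in place."""
    rows, cols = len(aug), len(aug[0])
    pivot_row = 0
    for col in range(cols - 1):  # last column is b
        # find a row at or below pivot_row with a nonzero entry in this column
        pivot = next((r for r in range(pivot_row, rows)
                      if abs(aug[r][col]) > 1e-12), None)
        if pivot is None:
            continue
        aug[pivot_row], aug[pivot] = aug[pivot], aug[pivot_row]
        # scale the pivot row so the pivot entry becomes 1
        p = aug[pivot_row][col]
        aug[pivot_row] = [x / p for x in aug[pivot_row]]
        # eliminate this column from every other row
        for r in range(rows):
            if r != pivot_row and abs(aug[r][col]) > 1e-12:
                factor = aug[r][col]
                aug[r] = [x - factor * y for x, y in zip(aug[r], aug[pivot_row])]
        pivot_row += 1
    return aug

# Solve  x + 2y = 5,  3x + 4y = 11  (solution: x = 1, y = 2)
solution = row_reduce([[1.0, 2.0, 5.0],
                       [3.0, 4.0, 11.0]])
print(solution[0][2], solution[1][2])  # the b-column now holds x and y
```

The nice part, as I said, is that the bookkeeping is all in the coefficients: you never have to drag the variable names through every step.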
And as for their uses in the sciences, I can hardly think of any science that matrices don't apply to at all! Have you ever heard of the Leslie growth model? It's used in biostatistics to analyze the change in a population using a simple eigenvalue problem. There you go for applications in biology, environmental science, and statistics. If you know computer programming, you've heard of arrays, which are essentially matrices!

I don't even need to talk about matrix applications in physics! Aside from the simple vector manipulation they teach in high school, take string theory, for example (my favorite topic): how do you think they analyze an 11-dimensional universe? Using matrices over 11-dimensional bases is one way. This way they can reconstruct the geometric properties of, say, R^11, or more generally F^n, without actually being able to see that space, and manipulate it using matrices, because we can't see 11 dimensions and can't perceive them otherwise! There are also many problems in quantum mechanics and circuit analysis that use matrices; some even have famous names that I can't think of off the top of my head.

In numerical analysis, which is useful for engineering specifically, they use the same machinery. They figure out the space where the solutions to the linear equations lie (as you know, if the rank of the matrix, i.e. the dimension of the image, is less than the number of variables, we end up with infinitely many solutions, assuming the system is consistent; depending on the number of variables, the solution set is a line, a plane, a piece of 3-dimensional space, or more), and then they take the equation of the solution set and find numerical answers to the problem (I don't know exactly how). And in economics we have Markov chains, linear differential equations, and many more topics that are also related to matrices. Needless to say, linear differential equations are in a class of their own!
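The Leslie model I mentioned fits in a few lines of plain Python. The matrix entries below are made-up illustration values, not real data: the first row holds births per individual in each age class, the subdiagonal holds survival rates, and repeatedly applying the matrix (the power method, essentially) makes the year-over-year growth ratio approach the dominant eigenvalue:

```python
# A toy Leslie growth model with three age classes, in pure Python.
# All numbers are invented for illustration.

def mat_vec(matrix, vec):
    """Multiply a matrix by a column vector."""
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

# Leslie matrix: row 0 = fecundity of each age class,
# subdiagonal = fraction surviving into the next age class.
L = [[0.0, 1.5, 1.0],
     [0.8, 0.0, 0.0],
     [0.0, 0.5, 0.0]]

population = [100.0, 0.0, 0.0]  # start with 100 newborns
for year in range(20):
    population = mat_vec(L, population)

total = sum(population)
# after many steps, next year's total / this year's total approximates
# the dominant eigenvalue, i.e. the long-run growth rate
growth = sum(mat_vec(L, population)) / total
print(round(growth, 3))
```

So the whole "will this population grow or die out?" question collapses into one eigenvalue: greater than 1 means growth, less than 1 means decline.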
All the other sciences get to enjoy the cool properties of matrices, and the part that's not very interesting is left for us mathematicians: the proofs (hehe, just kidding)! Like this stupid problem I should have known how to solve two months ago and have gotten stuck on: if S = {u1, u2, ..., un} is a finite set of vectors, I have to prove that S is linearly dependent iff (if and only if) u1 = 0 or u(k+1) is an element of the span of {u1, u2, ..., uk} for some k with 1 <= k < n.
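For the record, here is the statement written out properly, with a hint for the harder direction; this is my own transcription of the standard result, and the hint is a sketch, not a full proof:

```latex
Let $S = \{u_1, u_2, \dots, u_n\}$ be a finite set of vectors. Then $S$ is
linearly dependent if and only if $u_1 = 0$ or there exists $k$ with
$1 \le k < n$ such that
\[
  u_{k+1} \in \operatorname{span}\{u_1, u_2, \dots, u_k\}.
\]
% Hint for ($\Rightarrow$): take a nontrivial relation
% $a_1 u_1 + \dots + a_m u_m = 0$ with $a_m \neq 0$ and $m$ minimal.
% If $m = 1$, then $a_1 u_1 = 0$ forces $u_1 = 0$; otherwise
% $u_m = -\tfrac{1}{a_m}(a_1 u_1 + \dots + a_{m-1} u_{m-1})$,
% so $u_m$ lies in the span of the earlier vectors (take $k = m - 1$).
% The ($\Leftarrow$) direction is immediate from the definition.
```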