People apparently weren't looking at this much, so I let it slide for a while. Since the last entry, we have talked about changing bases and rewriting matrices of linear transformations in terms of new bases. Then we spent a week or so on eigenstuff - eigenvectors/values/spaces, algebraic and geometric multiplicity, similarity, diagonalizability, etc. We also talked a bit about discrete dynamical systems and, in particular, Google's PageRank algorithm (involving the power method for finding eigenvectors of large matrices). Then, Wednesday (11/12), we had Exam II.
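The power method mentioned above can be sketched in a few lines: repeatedly apply the matrix and renormalize, and the iterate settles onto the dominant eigenvector. This is just a toy 2x2 example I made up, not Google's actual link matrix.

```python
# Power method sketch: repeatedly apply A and renormalize; the iterate
# converges to an eigenvector for the largest-magnitude eigenvalue.
# (Toy 2x2 example, not PageRank's actual Google matrix.)

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def power_method(A, x, steps=50):
    for _ in range(steps):
        x = mat_vec(A, x)
        m = max(abs(c) for c in x)   # renormalize so entries stay bounded
        x = [c / m for c in x]
    return x

A = [[2.0, 1.0],
     [1.0, 2.0]]                     # dominant eigenvalue 3, eigenvector (1, 1)
v = power_method(A, [1.0, 0.0])
```

After 50 steps the iterate is essentially a multiple of (1, 1), and the ratio (Av)_1 / v_1 estimates the dominant eigenvalue 3.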
Today, we covered inner products and a bit about norms, including why the unit circle isn't necessarily a circle. Since it seems that people aren't looking at this much (if at all), I may not write much more.... For the rest of the semester (7 more lectures), we'll dig into orthogonality, orthonormality, the Gram-Schmidt procedure, and possibly some matrix factorizations and/or applications....
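Here is a quick numerical illustration of why the unit circle isn't necessarily round: the set {x : ||x|| = 1} depends on which norm you use. Under the 1-norm it is a diamond and under the max-norm a square; only the usual 2-norm gives the familiar circle. The point below is hand-picked.

```python
# Three norms on R^2; the point (0.5, 0.5) lies ON the 1-norm unit
# "circle" (a diamond) but strictly INSIDE the 2-norm unit circle.
import math

def norm_1(v):
    return sum(abs(c) for c in v)

def norm_2(v):
    return math.sqrt(sum(c * c for c in v))

def norm_inf(v):
    return max(abs(c) for c in v)

p = (0.5, 0.5)
# norm_1(p) == 1.0  (on the diamond), norm_2(p) == sqrt(1/2) < 1
```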
We covered a couple of examples of linear transformations today, in full detail. We also proved the rank-nullity theorem (rank+nullity=dim(domain)), which is a fairly tricky proof. Next time, we'll launch into a discussion of changing bases. Soon, we will get into eigenvectors and eigenvalues, which will lead us into Exam 2.
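The rank-nullity theorem can be checked by hand on a small example. For the (made-up) matrix below, both rows are multiples of (1, 2, 3), so the rank is 1; the domain is 3-dimensional, so the theorem predicts nullity 3 - 1 = 2, and indeed two independent null vectors are easy to exhibit.

```python
# Rank-nullity check on a concrete map x -> Ax:
# rank(A) + nullity(A) should equal dim(domain) = 3.

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2, 3],
     [2, 4, 6]]              # rank 1: row 2 = 2 * row 1
rank = 1
null_basis = [[-2, 1, 0],    # two independent solutions of Ax = 0
              [-3, 0, 1]]
```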
We defined image, nullspace, injectivity, and surjectivity today. These are all very important concepts (outside of linear algebra, too). Your comfort with these concepts will aid you greatly as you move on through your math training!
We talked about linear maps today and went over a bunch of examples.
We talked about the fact that any linearly independent set can be extended to a basis today. Next time, we'll jump into linear maps (a.k.a. linear transformations) - the other key to our understanding of linear algebra (along with the definition of vector spaces and such).
Today, we revisited some old friends (such as linear (in)dependence) and met some new ones - bases, spans, dimensions, etc. Well, we at least started down this road today. We'll finish it Friday. We ended today with the fact that you can remove certain vectors from a span (those that are involved in a linear dependence) without changing the span.
Arsen Elkin talked about subspaces, among other things, today.
We started talking about abstract vector spaces. In particular, we defined rings, fields, and vector spaces and discussed a bunch of examples of fields and rings. Next time, Arsen Elkin (my substitute) will talk about subspaces and some other things.
Vector spaces and subspaces will be our fundamental building blocks from now on, so try to get comfortable with them. For example, try to understand why the examples we discussed today are indeed vector spaces.
Today was review day, and we spent the entire time going over the practice test solutions. The test is Wednesday, so come ready!
As promised, we finished the simplex method today and even had five minutes to talk about two methods (the branch and bound technique and cutting planes) for integer programming. This material is not on the exam, but it is good to see.
Today, we started talking about the simplex method - a neat way to solve some linear optimization problems. We'll finish that up next time.
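The geometric fact underlying the simplex method is that an optimum of a linear program is attained at a vertex (corner) of the feasible region, so the method can walk from vertex to vertex instead of searching the whole region. For a tiny made-up LP we can just list the corners by hand and compare objective values:

```python
# Made-up LP:  maximize 3x + 2y  subject to  x + y <= 4,  x <= 3,  x, y >= 0.
# The feasible region is a polygon; the optimum sits at one of its corners,
# which is what the simplex method exploits.

def objective(p):
    x, y = p
    return 3 * x + 2 * y

vertices = [(0, 0), (3, 0), (3, 1), (0, 4)]   # corners of the feasible polygon
best = max(vertices, key=objective)
# simplex would move along edges from vertex to vertex until reaching `best`
```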
We proved a few last things about determinants today, as well as one handy fact about linearly independent sets. After that, we talked about interpolation - how to find a polynomial (with degree at most a prespecified bound) whose graph goes through ("interpolates") a given set of points. Next time, we will also talk about splines, after which we will dig into the simplex method....
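One concrete way to build the interpolating polynomial is the Lagrange formula: for n points with distinct x-values there is a unique polynomial of degree at most n - 1 through them. The data below is hand-picked (it happens to be interpolated by p(x) = x^2 + 1).

```python
# Lagrange interpolation sketch with exact arithmetic.
from fractions import Fraction

def lagrange_eval(points, x):
    """Evaluate the interpolating polynomial through `points` at x."""
    x = Fraction(x)
    total = Fraction(0)
    for i, (xi, yi) in enumerate(points):
        term = Fraction(yi)
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / Fraction(xi - xj)
        total += term
    return total

pts = [(0, 1), (1, 2), (2, 5)]   # interpolated by p(x) = x^2 + 1
```

The polynomial really does pass through each data point, and evaluating off the data (say at x = 3) gives 3^2 + 1 = 10.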
Later this week, I'll post a practice exam, which will count as your homework for next Wednesday. Next Monday (one week from today) will be review day, and next Wednesday is exam day.
As mentioned last time, today was filled with quick and (mostly) easy consequences of the big theorem from last time. We proved about a dozen things about determinants, many of which are actually useful in practice. Time ran out before I could finish, so we'll do a few more next time....
Today was the day of the horrendous proof. It took all class. We went through a proof (a thorough sketch, no less!) of the fact that the determinant of a matrix changes sign if you swap row 1 with any other row. Fortunately, this opens the door for lots of quick and easy results next time, so the next class will be much nicer, with lots of cute little arguments. Also, HW #3 was due today. HW #4 should be posted by the end of the day today.
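The fact from today's horrendous proof is easy to check numerically: compute a determinant by cofactor expansion, swap row 1 with another row, and watch the sign flip. The 3x3 matrix below is hand-picked.

```python
# Determinant by cofactor expansion along row 1; swapping row 1 with
# another row negates the determinant (the fact proved in class).

def det(M):
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j, entry in enumerate(M[0]):                   # expand along row 1
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * entry * det(minor)
    return total

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
A_swapped = [A[2], A[1], A[0]]                         # swap rows 1 and 3
```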
We finished talking about linear (in)dependence today (at least for the moment) and jumped into determinants. We talked about finding determinants of 1x1 and 2x2 matrices, as well as a general definition that works for square matrices of any size. We stopped just before a horrendous proof.... More fun for next time.
Also, I handed back HW #2. Please let me know if you have any concerns.
Today we talked more about spans. We discussed how to decide whether a given vector is in the span of a bunch of other vectors. The column space of a matrix was also defined (just the span of the columns of the matrix, as you might expect!). At the end, we started talking about what it means for a set of vectors to be linearly (in)dependent.
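Deciding whether a vector is in a span comes down to solving a linear system: b is in span{v1, v2} exactly when a*v1 + c*v2 = b has a solution. For two independent vectors in R^2 the resulting 2x2 system can be solved directly; I'm using Cramer's rule here just as a shortcut, with hand-picked vectors.

```python
# Is b in span{v1, v2}?  Solve a*v1 + c*v2 = b for the weights a, c.

def cramer_2x2(v1, v2, b):
    """Solve a*v1 + c*v2 = b; assumes the matrix [v1 v2] is invertible."""
    d = v1[0] * v2[1] - v2[0] * v1[1]
    a = (b[0] * v2[1] - v2[0] * b[1]) / d
    c = (v1[0] * b[1] - b[0] * v1[1]) / d
    return a, c

v1, v2, b = (1, 1), (1, 2), (3, 5)
a, c = cramer_2x2(v1, v2, b)
# a = 1, c = 2: indeed 1*(1,1) + 2*(1,2) = (3,5), so b is in the span
```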
Today was pretty cut and dried. We covered the last bit of matrix inversion, including showing why our algorithm for computing inverses actually produces what it is supposed to produce (i.e., the inverse). After that, I started talking about linear combinations and spans. I spent so much time on examples of spans that we didn't get very deeply into that topic. We'll pick that up next time (assuming that we aren't all sucked up by a black hole in the meantime...).
FYI, I collected homework #2 today and will post the new homework tonight or tomorrow. Also, I will try to get solutions posted soon (at least before the first exam!). Please bug me for them if you want them sooner.
I handed back the first homework set today. In general, things looked pretty good. Here are a few quick comments:
We went over #3 in the homework in some detail since only a few people nailed the proof.
Once homework issues were taken care of, we went through most of MISLE. We talked about what a matrix inverse is, how to find one, and what good they can do. They can be handy when solving systems of linear equations, but not every matrix has an inverse.
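The inversion algorithm from class can be sketched directly: row-reduce the augmented matrix [A | I] until the left half is the identity, and the right half is then A^-1. Exact fractions keep the arithmetic clean; the 2x2 matrix is hand-picked.

```python
# Compute A^-1 by Gauss-Jordan reduction of [A | I] -> [I | A^-1].
from fractions import Fraction

def inverse(A):
    n = len(A)
    # build the augmented matrix [A | I]
    M = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for col in range(n):
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]         # swap a pivot row up
        M[col] = [x / M[col][col] for x in M[col]]  # scale the pivot to 1
        for r in range(n):
            if r != col and M[r][col] != 0:         # clear the rest of the column
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 1],
     [5, 3]]
A_inv = inverse(A)   # determinant is 1, so A^-1 = [[3, -1], [-5, 2]]
```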
Next time, we'll talk for about two minutes more about inverses, and then we will launch into linear combinations and spanning sets....
We went over the proof that A(x+y)=Ax+Ay (matrix A, vectors x,y) today, just to continue building your intuition about proofs. After that, we went over matrix multiplication and a few properties (proving that A(B+C)=AB+AC - kind of similar!). After that, we talked about solving SLEs again. This time, though, we used Ax=b format, which is a bit different and may take a little time to get used to. In any case, we discussed how you can add solutions of Ax=0 (the homogeneous system) to solutions of Ax=b and end up with new solutions of Ax=b. This (we'll see later) is a big deal. We will spend a fair bit of time thinking about systems of the form Ax=0....
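Today's fact about combining solutions is worth seeing in action: if A x_p = b and A x_h = 0, then A(x_p + x_h) = A x_p + A x_h = b + 0 = b, so x_p + x_h solves Ax = b too. The system below is hand-picked.

```python
# Adding a homogeneous solution to a particular solution of Ax = b
# yields another solution of Ax = b.

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2, 3],
     [2, 4, 6]]
b = [6, 12]
x_p = [1, 1, 1]      # particular solution:  A x_p = b
x_h = [-2, 1, 0]     # homogeneous solution: A x_h = 0
x_new = [p + h for p, h in zip(x_p, x_h)]
```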
The first homework was due today. Please get your homework to me
Today was much more relaxed than last time. We covered basic matrix and vector operations and talked a bit about the matrix/vector form of linear systems. Next time, we will talk about a different way of interpreting the solutions of linear systems (one which will serve us well down the road). The pace will pick up before long....
We booked today. We finished the example from last time (equation operations to get to a nice format, showing infinitely many solutions). After that, we talked about how to turn linear systems into matrices and vectors (including the important augmented matrix).
From there, we talked about row operations (identical to equation operations!) and row-equivalence. Row operations can be used to make matrices simpler. In fact, we discussed the simplest of forms, reduced row-echelon form (RREF). We defined that form and looked at the three formats that it could take (one for a unique solution, one for no solutions, and one for infinitely many solutions). Finally, I hurriedly jotted down the Gaussian elimination algorithm for putting matrices into RREF, and that was it. Check the homework tips page sometime this weekend - I will post a nice example of how to use Gaussian elimination.
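The Gaussian elimination algorithm I jotted down can be sketched as a short routine: sweep left to right looking for pivot columns, scale each pivot to 1, and clear everything else in the pivot's column. Exact fractions avoid round-off; the augmented matrix encodes a made-up system (x + 2y = 5, 3x + 4y = 11) with the unique solution x = 1, y = 2.

```python
# Row-reduce a matrix to reduced row-echelon form (RREF).
from fractions import Fraction

def rref(M):
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    lead = 0
    for r in range(rows):
        if lead >= cols:
            break
        pivot = next((i for i in range(r, rows) if M[i][lead] != 0), None)
        while pivot is None:            # no pivot in this column: move right
            lead += 1
            if lead >= cols:
                return M
            pivot = next((i for i in range(r, rows) if M[i][lead] != 0), None)
        M[r], M[pivot] = M[pivot], M[r]             # swap the pivot row up
        M[r] = [x / M[r][lead] for x in M[r]]       # scale the pivot to 1
        for i in range(rows):
            if i != r and M[i][lead] != 0:          # clear the column
                M[i] = [x - M[i][lead] * y for x, y in zip(M[i], M[r])]
        lead += 1
    return M

aug = [[1, 2, 5],
       [3, 4, 11]]
R = rref(aug)        # [[1, 0, 1], [0, 1, 2]]: read off x = 1, y = 2
```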
Have a nice weekend!
Dick Painter stopped by - he will be holding office hours 2:30-4 M and 10-12 T in Weber 130. He is very knowledgeable and has no control over your grades.
Today, we covered SLE.SSLE.PSS and most of SLE.SSLE.ESEO. We'll finish off ESEO next time and will aim to get through RREF, too.
In PSS, we saw a few examples of linear systems with different numbers of solutions, and I told you (without proof) that every SLE has 0, 1, or infinitely many solutions. That is obviously not true for nonlinear equations, so this is another way in which linear systems are special.
In ESEO, we talked about the three basic "equation operations." After defining them, we saw Theorem EOPSS, saying that the three equation operations preserve solution sets, i.e., applying an equation operation (from our list of three) will not change the solutions. We proved part of that theorem. Then we solved our first system by "eliminating" variables so that we ended up with a nice, staggered set of equations from which you could read off the solution of the system (there was a unique solution). We started on a second system (with infinitely many solutions) but will need to pick that up next time. The question from class about how to write down infinitely many solutions will be answered next time....
We spent about 20 minutes talking about the syllabus today. Here are the highlights:
For next time, please check out sections WILA, SSLE, and RREF of chapter SLE in Part C of the book. Also, please do your surveys (if you want to - they are optional).