Math 231
(Last updated Monday, December 11 @ 9:50 pm)

Course Information
Instructor: Professor Nicholas Vlamis
Instructor Office: 507 Kiely
Instructor Email: nicholas.vlamis@qc.cuny.edu
Class Meeting: Monday/Wednesday 3:45–5:35pm in Kiely 320
Office Hour: Monday 12–1pm and Wednesday 11am–12pm (or by appointment)
Textbook: Lay, David. Linear Algebra and Its Applications, Sixth Edition. Pearson, 2021.
Optional textbook: Interactive Linear Algebra by Dan Margalit and Joseph Rabinoff.
- Week 16 (Week of December 11)
Course Evaluations. Please complete before final exam.
Office Hours: Wednesday 4–5pm and Friday 12–1pm
Final Exam will be 4–6pm on Monday, December 18 in Kiely 320.
The final exam is cumulative, so it will cover the content of every homework assignment from this semester. In the book, this includes the sections from Exam 1 and Exam 2 (see the earlier exam announcements for details) together with Sections 5.9, 10.1, 10.2, 6.1–6.5, and 4.1–4.5. You may bring two sheets of notes to the exam, but they may only contain definitions and theorem statements.
Monday's class: Defined symmetric matrices and stated the spectral theorem. Used the spectral theorem to explain how facial recognition works. This involves training a face embedding model, embedding faces as vectors in the model, and using cosine similarity to compare faces.
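Cosine similarity itself is a one-line computation. A minimal sketch with made-up low-dimensional "embeddings" (a real face-embedding model would produce much higher-dimensional vectors):

```python
import numpy as np

def cosine_similarity(u, v):
    # cos(theta) = (u . v) / (||u|| ||v||)
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Illustrative 3-dimensional vectors -- not the output of a real model.
face_a = np.array([0.9, 0.1, 0.4])
face_b = np.array([0.8, 0.2, 0.5])
print(cosine_similarity(face_a, face_b))  # close to 1: likely a match
```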
- Week 15 (Week of December 4)
Suggested Reading: Sections 4.1–4.5. I will also touch on some ideas from Sections 6.7 and 6.8 on Wednesday.
Course Evaluations. Please complete before final exam.
Homework Assignment #13. This assignment will not be collected, but the material is on the final exam.
Note: Office hours for next week will be announced at a later time.
Monday's class: Defined an abstract (real) vector space and gave many examples. Overviewed how the various notions we discussed previously remain the same in the setting of abstract vector spaces, including subspaces, linear combinations, spanning, linear independence, bases, dimension, and coordinates.
Wednesday's class: Discussed inner product spaces. Discussed the inner product on polynomial space and performed the Gram–Schmidt process to compute an orthogonal basis for P_2[x]. Also informally discussed Fourier approximations in the context of inner products. We ended with a quick discussion of the field of 2 elements and quantum bits.
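A sketch of Wednesday's computation, assuming the inner product <p, q> = ∫_{-1}^{1} p(x)q(x) dx (one common choice; the announcement does not specify which one was used in class). Sympy can carry out the Gram–Schmidt process symbolically on the standard basis {1, x, x^2} of P_2[x]:

```python
import sympy as sp

x = sp.symbols('x')

def inner(p, q):
    # Assumed inner product: <p, q> = integral of p*q over [-1, 1].
    return sp.integrate(p * q, (x, -1, 1))

def gram_schmidt(basis):
    ortho = []
    for p in basis:
        for q in ortho:
            p = p - inner(p, q) / inner(q, q) * q  # subtract projection onto q
        ortho.append(sp.expand(p))
    return ortho

print(gram_schmidt([1, x, x**2]))  # → [1, x, x**2 - 1/3]
```

The output is (up to scaling) the first three Legendre polynomials.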
- Week 14 (Week of November 27)
Suggested Reading: Sections 6.4 and 6.5.
Homework Assignment #12. Due Wednesday, December 6.
Monday's class: Established the Gram–Schmidt process for finding an orthonormal basis for a subspace of R^n. Proved the QR-Decomposition theorem, and as an application, explained how to use QR-decomposition to approximate eigenvalues.
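The Gram–Schmidt process from Monday's class can be written in a few lines of numpy. This sketch uses the "modified" variant (subtracting each projection from the running vector), which behaves better numerically:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given vectors in R^n."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - (w @ u) * u        # subtract the projection onto u
        norm = np.linalg.norm(w)
        if norm > 1e-12:               # skip vectors already in the span
            basis.append(w / norm)
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(vs)
print(u1 @ u2)  # ≈ 0: the output vectors are orthogonal
```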
Wednesday's class: Defined least-squares solutions to linear systems. Proved that least-squares solutions always exist and characterized when they are unique. Gave a formula for the least-squares solution using the QR-decomposition theorem. Explained how linear regression is a least-squares problem.
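A small illustration of the QR-based least-squares formula from class, fitting a line y = c0 + c1*x to made-up data points: with A = QR, the least-squares solution solves R x̂ = Qᵀb.

```python
import numpy as np

# Illustrative data lying exactly on y = 1 + 2x.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([1.0, 3.0, 5.0, 7.0])

A = np.column_stack([np.ones_like(xs), xs])  # design matrix [1 | x]

Q, R = np.linalg.qr(A)                       # A = QR
xhat = np.linalg.solve(R, Q.T @ ys)          # solve R xhat = Q^T b
print(xhat)  # ≈ [1. 2.], i.e. intercept 1 and slope 2
```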
- Week 13 (Week of November 20)
Suggested Reading: Sections 6.1, 6.2, and 6.3.
Homework Assignment #11. Due Wednesday, November 30.
Monday's class: Introduced the notion of orthogonal/orthonormal bases. Gave a formula for computing coordinates with respect to an orthogonal basis. Went over the orthogonal projection theorem.
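The coordinate formula for an orthogonal basis, c_i = (x · u_i)/(u_i · u_i), and the orthogonal projection theorem are easy to check numerically (illustrative vectors):

```python
import numpy as np

# {u1, u2} is an orthogonal basis for the xy-plane inside R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
x = np.array([2.0, 4.0, 5.0])

c1 = (x @ u1) / (u1 @ u1)          # coordinate formula
c2 = (x @ u2) / (u2 @ u2)
proj = c1 * u1 + c2 * u2           # orthogonal projection of x onto span{u1, u2}
print(proj)       # → [2. 4. 0.]
print(x - proj)   # residual [0. 0. 5.] is orthogonal to both u1 and u2
```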
- Week 12 (Week of November 13)
Suggested Reading: Sections 5.9, 10.1, and 10.2 (Chapter 10 can be found at bit.ly/2nj1HhO).
Relevant Links to Monday's discussion: OpenAI's embeddings, Cosine similarity (Wikipedia), Markov Chain (Wikipedia)
Homework Assignment #11. Due Wednesday, November 30.
Wednesday Office Hours are cancelled.
Monday's class: Introduced Markov chains and stochastic matrices. Went over some examples and proved some essential facts about stochastic matrices.
Wednesday's class: Finished the discussion of Markov chains and defined Google PageRank as the stable vector of a Markov chain. Introduced the notion of orthogonality and the orthogonal complement of a subspace.
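A sketch of computing the stable vector of a Markov chain by power iteration, using a small made-up column-stochastic matrix (PageRank does the same thing on the web's link matrix):

```python
import numpy as np

# A 2-state Markov chain; each column of P sums to 1.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

x = np.array([1.0, 0.0])   # any initial probability vector
for _ in range(100):
    x = P @ x              # repeatedly apply the transition matrix
print(x)  # ≈ [2/3, 1/3], the stable vector satisfying Px = x
```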
- Week 11 (Week of November 6)
Suggested Reading: Section 6.1
Exam week
No homework. No quiz on Wednesday.
Monday's class: Recalled the definition of the dot product and introduced the notions of norm, distance, and angle in R^n. Gave an application to search using the idea of word space. Proved the Cauchy-Schwarz inequality and the triangle inequality.
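The two inequalities from Monday's class are easy to spot-check numerically on random vectors (this is a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal(5), rng.standard_normal(5)

# Cauchy-Schwarz: |u . v| <= ||u|| ||v||
cs_lhs, cs_rhs = abs(u @ v), np.linalg.norm(u) * np.linalg.norm(v)
# Triangle inequality: ||u + v|| <= ||u|| + ||v||
tri_lhs, tri_rhs = np.linalg.norm(u + v), np.linalg.norm(u) + np.linalg.norm(v)

print(cs_lhs <= cs_rhs, tri_lhs <= tri_rhs)  # → True True
```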
- Week 10 (Week of October 30)
Suggested Reading: Sections 5.1, 5.2, and 5.3
Exam 2 will be held in class on Wednesday, November 8. The exam will cover the material from Week 6 through Week 10. This covers the content in Homework Assignments #6 through #10. In the text, this corresponds to Sections 2.1–2.3, 2.8, 2.9, 3.1, 3.2, and 5.1–5.3. You may bring one sheet of notes containing statements of theorems and definitions.
Homework Assignment #10. This assignment will not be collected, and there will be no quiz; however, the material will appear on Exam 2, so you should complete it by November 8.
Monday's class: Defined eigenvalues, eigenvectors, and eigenspaces. Went over procedures to find these values/objects for a given matrix.
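Once the procedure is understood by hand, numpy can find eigenvalues and eigenvectors directly (illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eig(A)   # columns of vecs are eigenvectors
print(vals)                     # eigenvalues 3 and 1 (in some order)

# Check the defining equation A v = lambda v for each pair:
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```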
Wednesday's class: Introduced the notion of similar matrices. Proved that similar matrices have the same eigenvalues. Proved that a set of eigenvectors with distinct eigenvalues is linearly independent. Proved that an n x n matrix is diagonalizable if and only if it has n linearly independent eigenvectors.
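The diagonalization theorem from Wednesday can be verified numerically: stack n independent eigenvectors into P, put the matching eigenvalues into a diagonal D, and check A = P D P^{-1} (illustrative matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # eigenvalues 5 and 2
vals, P = np.linalg.eig(A)        # columns of P are eigenvectors
D = np.diag(vals)

print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # → True
```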
Examples of eigenfaces.
- Week 9 (Week of October 23)
Change in office hours: Monday office hours will be held 11am–12pm.
Suggested Reading: Sections 3.1 and 3.2.
Homework Assignment #9. Due Wednesday, November 1.
Monday's class: Defined the determinant. Computed the determinant of a triangular matrix and showed that a triangular matrix is invertible if and only if its determinant is not 0. Stated a theorem showing how the determinant changes under elementary row operations.
Wednesday's class: Proved that a matrix is invertible if and only if its determinant is nonzero. Discussed various properties of the determinant; in particular, we saw that the determinant of a product of matrices is equal to the product of their determinants. Discussed a consequence of Cramer's Rule.
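Both facts from Wednesday's class are easy to check on small examples (illustrative matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # → True

# det(A) = -2 is nonzero, so A is invertible.
print(np.linalg.det(A))  # ≈ -2
```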
- Week 8 (Week of October 16)
There will be no class on Wednesday. However, you are responsible for all the content in Section 2.9 of the textbook. I will talk about much of it in Monday's class, but not all of it.
Reading: Section 2.9
Homework Assignment #8. Due Wednesday, October 25.
There will be a take-home quiz handed out on Wednesday. You will turn it in the following Monday.
Take-home quiz. Due Monday, October 23
Monday's Class: Proved a theorem yielding a basis for the column space of a matrix. Defined dimension and proved a theorem showing the definition makes sense. Defined the rank and nullity of a matrix, and proved the Rank-Nullity theorem.
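The Rank-Nullity theorem says rank(A) + nullity(A) = n for an m x n matrix A; a quick numerical check on an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second row is twice the first

rank = np.linalg.matrix_rank(A)   # dimension of the column space
nullity = A.shape[1] - rank       # dimension of the null space
print(rank, nullity)              # → 1 2, and 1 + 2 = 3 = number of columns
```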
- Week 7 (Week of October 9)
No class on Monday (CUNY closed). But we will have class on Tuesday at the regular time (CUNY follows a Monday schedule).
Office hours on Tuesday cancelled. (You can find me at the major/minor fair if you want to say hello.)
Homework Assignment #6 (Due Wednesday, October 11)
Suggested reading: Sections 2.2, 2.3, and 2.8
Homework Assignment #7. Due Wednesday, October 18. (This assignment will not be collected. There will be a take-home quiz on this assignment, as we will not meet in person next Wednesday.)
Tuesday's class: Introduced the notion of an invertible matrix. Introduced the notion of determinant for 2x2 matrices and a formula for the inverse of a 2x2 matrix. Established various properties of invertible matrices. Introduced the notion of an elementary matrix, and used elementary matrices to show that the Gauss-Jordan elimination algorithm can be used to detect if a matrix is invertible and to find the inverse of an invertible matrix.
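The Gauss-Jordan method for inverses from Tuesday's class can be carried out symbolically: row-reduce the augmented matrix [A | I] to [I | A^{-1}] (illustrative 2x2 matrix):

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [5, 3]])           # det(A) = 1, so A is invertible

augmented = A.row_join(sp.eye(2)) # form [A | I]
rref, _ = augmented.rref()        # Gauss-Jordan elimination
A_inv = rref[:, 2:]               # the right half is A^{-1}
print(A_inv)                      # Matrix([[3, -1], [-5, 2]])
```

This agrees with the 2x2 inverse formula (1/det) [[d, -b], [-c, a]].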
Wednesday's class: Finished our initial discussion of invertible matrices. Introduced the notion of a subspace, discussed examples, and basic properties. Defined a basis of a subspace and computed bases of null spaces and column spaces.
A proof-by-picture of the fact that the absolute value of the determinant of a 2x2 matrix is the area of the parallelogram determined by its row vectors.
- Week 6 (Week of October 2)
Suggested reading: Section 2.1
Exam 1 will be held in class on Wednesday, October 4. The exam will cover the material from Week 1 through Week 5. This corresponds to the material represented in Homework Assignments #1 through #5. In the text, this is Sections 1.1–1.5 and 1.7–1.9. You may bring one sheet of notes containing statements of theorems and definitions.
Monday's class: Defined matrix multiplication and the transpose.
- Week 5 (Week of September 25)
Suggested reading: Sections 1.9, 2.1
CUNY was closed on Monday, September 25.
Homework Assignment #5. This assignment will not be collected; however, the content will appear on Exam 1.
Exam 1 will be held in class on Wednesday, October 4. The exam will cover the material from Week 1 through Week 5. This corresponds to the material represented in Homework Assignments #1 through #5. In the text, this is Sections 1.1–1.5 and 1.7–1.9. You may bring one sheet of notes containing statements of theorems and definitions.
Wednesday's class: Proved theorems discussing when a linear transformation is one-to-one or onto. Proved that there is no one-to-one linear transformation from R^n to R^m whenever n > m.
- Week 4 (Week of September 18)
Suggested reading: Sections 1.7, 1.8, 1.9, 2.1
Homework Assignment #4. Due Wednesday, September 20.
Monday's class: Further discussed linear independence, and established several theorems; in particular, we showed that any collection of more than n vectors in R^n is linearly dependent. We defined matrix transformations and saw several examples.
Wednesday's class: Introduced the notion of a linear transformation. Proved that every linear transformation is a matrix transformation. Discussed how to find the standard matrix of a linear transformation. Proved the angle sum identities for cosine and sine.
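The angle-sum proof rests on the fact that composing rotations adds their angles, i.e. R(a)R(b) = R(a+b); comparing matrix entries yields the identities for sine and cosine. A numerical spot-check (illustrative angles):

```python
import numpy as np

def rotation(theta):
    """Standard matrix of rotation by theta in R^2."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

a, b = 0.3, 0.5
print(np.allclose(rotation(a) @ rotation(b), rotation(a + b)))  # → True
```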
- Week 3 (Week of September 11)
Suggested reading: Sections 1.3, 1.4, 1.5, and 1.7.
Homework Assignment #3. Due Wednesday, September 20.
The solutions to Quiz 1 are posted above. The quiz itself is graded out of 10 points, but the total score is out of 12 points: if you turned in your homework you receive an additional two points.
Monday's class: Introduced the span of a collection of vectors. Defined the dot product and went over basic properties. Defined the product of an (m x n )-matrix and a vector in R^n in terms of linear combinations. Explained how to compute this product using dot products. Introduced the coefficient matrix and stated a theorem establishing the equivalence of solution sets between the representation of a linear system as an augmented matrix, a vector equation, and a matrix equation. Defined the column space of a matrix and went over a theorem establishing the equivalence of several conditions, namely the consistency of a matrix equation, the column space of a matrix being as large as possible, and the existence of pivot positions in every row of a matrix.
Wednesday's class: Introduced homogeneous linear systems. Showed, via a worked example, how to express the general solution of a homogeneous linear system as a span of vectors. Proved a theorem (Theorem 6) relating the general solution of Ax=b to the general solution of Ax=0. Defined linear independence for a set of vectors. Made basic observations of when a set with one or two vectors is linearly independent.
- Week 2 (Week of September 4)
CUNY was closed on Monday.
Suggested reading: Section 1.3.
Homework Assignment #2. Due Wednesday, September 13.
Quiz 2 will be given during the beginning of class on Wednesday, September 13. It will be based on the exercises in Homework Assignment #2.
Wednesday's class: Went over another example of finding the general solution to a linear system. Discussed a theorem detecting when a linear system is consistent based on the augmented matrix; the same theorem included a statement saying that a consistent linear system either has a unique solution (no free variables) or infinitely many solutions (there exists a free variable). Defined the size of a matrix and defined a (column) vector. Introduced notation for R^m, the set of (m x 1)-matrices. Discussed basic properties of vector addition and scalar multiplication. Defined linear combination and discussed how to decide if a vector is a linear combination of a set of vectors. Discussed correspondence between vector equations and linear systems.
See videos to the right for how to use TI calculator to find the reduced row echelon form of a matrix and for an additional example of working through the Gauss-Jordan elimination algorithm.
- Week 1 (Week of August 28)
Welcome to Math 231! Information for the week and a summary of classes will be added here.
Suggested reading: Section 1.1 and 1.2.
Homework Assignment #1. Due Wednesday, September 6.
Quiz 1 will be given during the beginning of class on Wednesday, September 6. It will be based on the exercises in Homework Assignment #1.
Monday's class: Discussed Google PageRank as a motivating example. Introduced linear equations, linear systems, solutions of linear systems, and augmented matrices.
Wednesday's class: Defined elementary row operations, reduced row echelon form, pivot positions, and pivot columns. Established the Gauss–Jordan elimination algorithm for finding the reduced row echelon form of a matrix. Discussed how to use this algorithm to find the general solution to a linear system. Defined basic variables, free variables, and parameterized solutions.
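Alongside the TI calculator videos, sympy can also compute a reduced row echelon form (and reports the pivot columns), which is handy for checking hand computations. An illustrative augmented matrix:

```python
import sympy as sp

# Augmented matrix of the system x + 2y = 3, 4x + 5y = 6.
M = sp.Matrix([[1, 2, 3],
               [4, 5, 6]])

rref, pivots = M.rref()   # Gauss-Jordan elimination
print(rref)               # Matrix([[1, 0, -1], [0, 1, 2]])
print(pivots)             # (0, 1): pivot positions in columns 1 and 2
```

Reading off the result: x = -1 and y = 2, with no free variables, so the system has a unique solution.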