# Jason Grout


# Linear Algebra, Spring 2014

• Image compression example - this page explores the discrete cosine transform (DCT), which is how a cell phone stores images (it is the transform behind JPEG compression). The basic underlying concept is constructing a more intelligent basis that can separate what your brain thinks is important in an image from what is not important, and then throwing away the unimportant part. We went through this example in class.
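If you want to experiment with the idea, here is a minimal 1-D sketch in Python (purely illustrative — the function names and the 0.1 threshold are my own choices, not part of the class example). It transforms a signal into the DCT basis, zeroes the small coefficients, and reconstructs:

```python
import math

def dct(x):
    """Orthonormal DCT-II of a list of samples."""
    N = len(x)
    out = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
        scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(scale * s)
    return out

def idct(X):
    """Inverse (DCT-III) of the orthonormal DCT-II above."""
    N = len(X)
    out = []
    for n in range(N):
        s = X[0] / math.sqrt(N)
        s += sum(math.sqrt(2 / N) * X[k] * math.cos(math.pi * (n + 0.5) * k / N)
                 for k in range(1, N))
        out.append(s)
    return out

# A smooth signal: almost all of its energy lands in one DCT coefficient.
signal = [math.cos(math.pi * (n + 0.5) * 3 / 16) for n in range(16)]
coeffs = dct(signal)
# "Compress" by throwing away coefficients with small magnitude.
kept = [c if abs(c) > 0.1 else 0.0 for c in coeffs]
approx = idct(kept)
err = max(abs(a - b) for a, b in zip(signal, approx))
```

For this smooth signal, only one coefficient survives the threshold, yet the reconstruction error is tiny — the same principle, applied to 8x8 blocks of pixels, is what makes the compression in the page above work.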

# Some notes from class

## Wed, 16 Apr

We canceled class today because I was sick. I've pushed back the HW07 deadline until Monday, which leaves a bit of time on Monday to answer homework questions. A few people emailed about the homework problems that ask you to show a set is a vector space (i.e., that it is closed under linear combinations). This is what we spent nearly all of our class time on the other week, and it is what the handout from 02 Apr (below) was all about. Section 4.1 also discusses this at length.

As another hint, here is the work to check a case from my online homework problem 2. In this problem, the vectors are polynomials from $P_2$ (i.e., polynomials of degree 2 or less). Let's check whether the set $V=\{p(t) \mid p'(t) \text{ is constant}\}$ is a vector space. In words, $V$ is the set of all polynomials $p(t)$ in $P_2$ whose derivative is constant. We need to check two things: whether $V$ is closed under vector addition, and whether it is closed under scalar multiplication. Before we start, though, we list a few polynomials that are in our set so we get an idea of what they look like: $1+2t$, $5+3t$, $0$, $10t$. All of these polynomials have the property that $p'(t)$ is constant.

• closed under vector addition: we take two polynomials in $V$, add them together, and check that the result is in $V$ (i.e., that its derivative is constant). Let $p_1(t)$ and $p_2(t)$ have constant derivatives. Then the derivative of the sum, $(p_1(t)+p_2(t))' = p_1'(t)+p_2'(t)$, is the sum of two constants, and thus is constant. So we've just shown that the sum of two polynomials from $V$ is also in $V$.

• closed under scalar multiplication: we take an arbitrary constant $k\in\mathbb{R}$ and a polynomial $p(t)$ from $V$ (so $p'(t)$ is constant), and check that $kp(t)$ is also in $V$ (i.e., that it has a constant derivative). The derivative of $kp(t)$ is $(kp(t))'=kp'(t)$, which is just $k$ times a constant, so it is constant as well. So $kp(t)$ is in $V$.

We just showed that adding any two polynomials from $V$ gives us a result in $V$, and that multiplying any polynomial in $V$ by any constant also gives us a result in $V$. Thus the set $V$ is closed under vector addition and scalar multiplication, so it is closed under linear combinations, so it is a vector space (it inherits the remaining vector space axioms from $P_2$).
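The argument above can be spot-checked on concrete examples in Python. This is an illustrative representation of my own, not something from the homework system: write $p(t) = a_0 + a_1 t + a_2 t^2$ as the coefficient triple `(a0, a1, a2)`, so "$p'(t)$ is constant" just means `a2 == 0`:

```python
def in_V(p):
    """p is in V iff its derivative is constant, i.e. the t^2 coefficient is 0."""
    a0, a1, a2 = p
    return a2 == 0

def add(p, q):
    """Vector addition in P_2: add coefficients term by term."""
    return tuple(a + b for a, b in zip(p, q))

def scale(k, p):
    """Scalar multiplication in P_2: multiply every coefficient by k."""
    return tuple(k * a for a in p)

p1 = (1, 2, 0)   # 1 + 2t  (one of the examples listed above)
p2 = (5, 3, 0)   # 5 + 3t
assert in_V(p1) and in_V(p2)
assert in_V(add(p1, p2))     # closed under vector addition
assert in_V(scale(7, p1))    # closed under scalar multiplication
```

Of course, checking a couple of examples is not a proof — the proof is the algebra above, which works for *every* choice of polynomials and scalar — but this kind of check is a good way to catch a wrong guess quickly.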

## Wed, 02 Apr

We discussed what happens when we let the word “vector” mean things other than lists of numbers. For example, what does it mean to have a vector space of polynomials, or of matrices, or of functions?

For Monday, please complete the handout we started in class. Here are some answers for 3 of the problems in the handout: Answers

## Wed, 26 Mar

We finished talking about determinants today, including reviewing a number of properties of determinants. We also talked about efficiently calculating the determinant using elementary matrices and row operations. We've now talked about the material in sections 3.1 and 3.2 of the text.
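The efficient row-operation method can be sketched in Python (an illustrative implementation of my own, not code we used in class): row-replacement operations leave the determinant unchanged, each row swap flips its sign, and the determinant of the resulting triangular matrix is the product of the pivots.

```python
def det(M):
    """Determinant via row reduction to triangular form."""
    A = [row[:] for row in M]   # work on a copy
    n = len(A)
    d = 1.0
    for col in range(n):
        # Find a nonzero pivot in this column; no pivot means det = 0.
        pivot = next((r for r in range(col, n) if A[r][col] != 0), None)
        if pivot is None:
            return 0.0
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            d = -d              # a row swap flips the sign
        d *= A[col][col]        # accumulate the pivot
        # Row-replacement operations below the pivot don't change det.
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= factor * A[col][c]
    return d
```

For example, `det([[1, 2], [3, 4]])` reduces with one row replacement to a triangular matrix with pivots 1 and -2, giving -2; a matrix with a dependent row, like `[[1, 2], [2, 4]]`, produces a zero row and hence determinant 0.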

I posted some online homework about determinants due next Thursday. I'd also recommend doing problems 10, 27, 28, 30, 33, 34, 36, and 40 in section 3.2 (check your answers in the back of the book, and come talk with me if you have questions).

## Mon, 24 Mar

We started looking at determinants today. We talked about how the determinant tells us how a transformation stretches area, and showed how to calculate determinants, and investigated some of the properties of determinants.
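Here is a quick numeric illustration of the area-stretching idea (the example matrix and function names are my own): a $2\times 2$ matrix sends the unit square to the parallelogram spanned by its columns, and that parallelogram's area equals $|\det A| = |ad - bc|$. We can confirm this by computing the parallelogram's area directly with the shoelace formula:

```python
def shoelace(pts):
    """Area of a polygon from its vertices, via the shoelace formula."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

a, b, c, d = 2.0, 1.0, 0.0, 3.0      # A = [[a, b], [c, d]]
# The unit square's corners map to these four points (columns span the shape).
square_image = [(0.0, 0.0), (a, c), (a + b, c + d), (b, d)]
area = shoelace(square_image)        # geometric area of the image
det_A = a * d - b * c                # determinant of A
```

Both computations give 6: the unit square (area 1) is stretched to area 6, exactly the determinant.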

## Mon, 03 Mar

We talked more about the 4 categories of linear transformations (combinations of one-to-one/not one-to-one and onto/not onto). We then talked about finding the matrix for a linear transformation given its description (find where the vectors $(1,0,\ldots,0)$, $(0,1,0,\ldots,0)$, …, $(0,0,\ldots,0,1)$ go to). We also investigated how to compose functions using matrix multiplication and went over how to think about matrix multiplication as linear combinations of the first matrix's columns.
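Both ideas from class can be sketched in Python (an illustrative sketch of my own; the names `matrix_of`, `rot90`, and `shear` are made up for this example). To find the matrix of a transformation, record where the standard basis vectors go — those images become the columns — and composition of transformations corresponds to matrix multiplication, with the transformation applied *first* sitting rightmost in the product:

```python
def matrix_of(T):
    """Matrix of T: R^2 -> R^2; columns are T(e1) and T(e2)."""
    c1 = T((1.0, 0.0))
    c2 = T((0.0, 1.0))
    return [[c1[0], c2[0]], [c1[1], c2[1]]]

def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(A, v):
    """Multiply a 2x2 matrix by a vector."""
    return tuple(sum(A[i][k] * v[k] for k in range(2)) for i in range(2))

rot90 = lambda v: (-v[1], v[0])        # rotate 90 degrees counterclockwise
shear = lambda v: (v[0] + v[1], v[1])  # horizontal shear

R, S = matrix_of(rot90), matrix_of(shear)
# "First shear, then rotate" is the product R * S (rightmost acts first).
RS = matmul(R, S)
```

As a sanity check, `apply(RS, v)` agrees with `rot90(shear(v))` for any vector `v` — composing the functions and multiplying the matrices are the same operation.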

## Wed, 26 Feb

For next time:

• Find 4 linear transformations, one for each of the following cases:
• T is one-to-one and onto
• T is one-to-one, but not onto
• T is not one-to-one, but is onto
• T is not one-to-one and not onto
• Find the matrix for the transformation from $\mathbb{R}^2$ to $\mathbb{R}^2$ that does these operations, in order, to any input vector:
• Flips the vector over the y-axis
• then rotates the result 90 degrees clockwise
• then stretches the result vertically by a factor of 2.
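Once you have worked the second problem out by hand, here is one way to check your answer with a few lines of Python (an illustrative sketch; the key point is that operations applied first in time appear *rightmost* in the matrix product):

```python
def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

flip_y   = [[-1, 0], [0, 1]]   # flip over the y-axis
rot_cw   = [[0, 1], [-1, 0]]   # rotate 90 degrees clockwise
stretch2 = [[1, 0], [0, 2]]    # stretch vertically by a factor of 2

# Flip happens first, so it sits rightmost; the stretch happens last.
composite = matmul(stretch2, matmul(rot_cw, flip_y))
```

You can then compare `composite` against the matrix you found by tracking where $(1,0)$ and $(0,1)$ end up after all three operations.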

## Wed, 12 Feb

Here are the linear transformation visualizations: Vector Transformations, Transformation F, Lady Liberty, and movie poster.

## Mon, 03 Feb

The Magic Carpet Ride problem sequence comes from this PRIMUS article.

## RREF

Here is an RREF tool. Edit the list of rows in the matrix, then press Evaluate to compute the Reduced Row Echelon Form (RREF).
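If you are curious what the tool is doing behind the scenes, here is a rough sketch of Gauss-Jordan elimination in Python (an illustrative float-based version of my own, not the actual code behind the tool):

```python
def rref(M):
    """Reduced row echelon form by Gauss-Jordan elimination."""
    A = [[float(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    r = 0                                    # next pivot row
    for c in range(cols):
        # Find a row at or below r with a nonzero entry in this column.
        pivot = next((i for i in range(r, rows) if abs(A[i][c]) > 1e-12), None)
        if pivot is None:
            continue                         # no pivot in this column
        A[r], A[pivot] = A[pivot], A[r]      # swap it into place
        p = A[r][c]
        A[r] = [x / p for x in A[r]]         # scale so the pivot is 1
        for i in range(rows):                # clear the rest of the column
            if i != r:
                f = A[i][c]
                A[i] = [x - f * y for x, y in zip(A[i], A[r])]
        r += 1
        if r == rows:
            break
    return A
```

For example, `rref([[1, 2, 3], [4, 5, 6]])` returns `[[1, 0, -1], [0, 1, 2]]`, matching what the tool produces for that matrix.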
