-  First, what happens when we multiply the two vectors (one of which is
	\left(\begin{array}{c}{-1}\cr{1}\end{array}\right)) by H?
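	As a small worked case (the particular matrix and the second vector
			below are assumptions made purely for this illustration,
			not necessarily the H and vectors used elsewhere in these
			notes), suppose
	
	H=\left(\begin{array}{cc}{3}&{1}\cr{1}&{3}\end{array}\right).
	
	Then
	
	H\left(\begin{array}{c}{-1}\cr{1}\end{array}\right)=\left(\begin{array}{c}{-2}\cr{2}\end{array}\right)=2\left(\begin{array}{c}{-1}\cr{1}\end{array}\right)
	
	and
	
	H\left(\begin{array}{c}{1}\cr{1}\end{array}\right)=\left(\begin{array}{c}{4}\cr{4}\end{array}\right)=4\left(\begin{array}{c}{1}\cr{1}\end{array}\right),
	
	so each image is simply a scalar multiple of the vector we started
			with.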
	 
-  
	What's a better basis for both the domain and codomain in which to
	represent this transformation?
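	(A preview of where this is heading, offered as a sketch: if we can
			find a basis \langle\overline{v}_1,\ldots,\overline{v}_n\rangle of the domain such that
	
	h(\overline{v}_i)=\lambda_i\overline{v}_i
	
	for each i, then representing h with respect to that basis, used for
			both the domain and the codomain, gives a diagonal matrix
			with \lambda_1,\ldots,\lambda_n down the diagonal.)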
	
	 
-  Okay, so we've found some vectors that have the property that
			their images under the transformation h are
			actually just scalar multiples of themselves.
	
	These vectors have a special name: eigenvectors. And the scalars
			by which they're scaled are called
			eigenvalues. 
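	In symbols: a nonzero vector \overline{v} is an eigenvector of h, with
			eigenvalue \lambda, exactly when
	
	h(\overline{v})=\lambda\overline{v}.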
	 
	 
-  Now, an important question is this: under what conditions can we
			accomplish this "diagonalization"? Under what
			conditions can we find eigenvectors and eigenvalues
			that allow us to construct this very simple matrix
			representation of the homomorphism?
		
		Are all matrices "diagonalizable"? (Can you think of a
			homomorphism from \mathbb{R}^2 to \mathbb{R}^2 that we've
			studied geometrically where no vector will have
			as an image a multiple of itself?)
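	(One natural candidate, offered here only as an illustration and
			writing R for its matrix: rotation of the plane through
			90 degrees,
	
	R=\left(\begin{array}{cc}{0}&{-1}\cr{1}&{0}\end{array}\right),
	
	sends every nonzero vector to a vector perpendicular to it, so no
			image is a real scalar multiple of the original; such a
			map has no real eigenvalues and is not diagonalizable
			over the real numbers.)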
 
	 
-  The vectors I proposed for multiplication by the matrix H
			came "out of the blue". How would we go about finding
			them? Let's think about what we're looking for in terms
			of an equation: 
	
	H\overline{v}=\lambda\overline{v}
	
	We can rewrite that as 
	 
	H\overline{v}=\lambda I\,\overline{v}
	
	obviously (since I is the identity matrix); now we take
			everything over to the left-hand side, 
	 
	(H-\lambda I)\overline{v}=\overline{0}
	
	i.e.
	 
	(H-\lambda I)\left(\begin{array}{c}{x}\cr{y}\end{array}\right)=\left(\begin{array}{c}{0}\cr{0}\end{array}\right)
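	Looking ahead, a standard next step (sketched here, beyond what the
			derivation above states): this homogeneous system has a
			nonzero solution \overline{v} exactly when the matrix
			H-\lambda I is singular, that is, when
	
	\det(H-\lambda I)=0.
	
	Solving this "characteristic equation" for \lambda gives the
			eigenvalues, and substituting each \lambda back into
			(H-\lambda I)\overline{v}=\overline{0} gives the
			corresponding eigenvectors.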