Last update: 28 August 2013
Recall our setting from last time. Suppose $A$ and $B$ are algebras and $M$ is a completely decomposable $A$-module with decomposition $M \cong \bigoplus_{\lambda} (A^\lambda)^{\oplus m_\lambda}$. Suppose that $M$ is also a module for $B$ and that $B = \operatorname{End}_A(M)$.
We showed that $B$ has certain irreducible modules (called $B^\lambda$ last time), each of dimension $m_\lambda$. The module $M$ is a completely decomposable $(A,B)$-bimodule, with decomposition into irreducible modules
$$M \cong \bigoplus_{\lambda} A^\lambda \otimes B^\lambda.$$
Corollary 3.22 Let $\chi^\lambda_A$ be the character of $A^\lambda$ and let $\chi^\lambda_B$ be the character of $B^\lambda$. Then for $a \in A$ and $b \in B$,
$$\operatorname{tr}_M(ab) = \sum_{\lambda} \chi^\lambda_A(a)\,\chi^\lambda_B(b).$$
Proof.
Let $\{v^\lambda_1, \dots, v^\lambda_{d_\lambda}\}$ be a basis for each $A^\lambda$, and similarly choose bases $\{w^\lambda_1, \dots, w^\lambda_{m_\lambda}\}$ for the $B^\lambda$. Then by the above decomposition, the set $\{v^\lambda_i \otimes w^\lambda_j\}$ forms a basis for $M$. Thus, since $(ab)(v^\lambda_i \otimes w^\lambda_j) = a v^\lambda_i \otimes b w^\lambda_j$,
$$\operatorname{tr}_M(ab) = \sum_{\lambda} \Big(\sum_i a_{ii}\Big)\Big(\sum_j b_{jj}\Big) = \sum_{\lambda} \chi^\lambda_A(a)\,\chi^\lambda_B(b). \qquad \square$$
Consider the regular representation of an algebra $A$, defined by the transformations $L_a \colon A \to A$ given by left multiplication by $a \in A$. Recall that we often denoted this action by $a\vec{b} = \overrightarrow{ab}$, using the vector notation $\vec{b}$ to distinguish elements of the vector space $A$ from transformations in $\operatorname{End}(A)$.
We showed that $\operatorname{End}_A(A) = \{R_b : b \in A\}$; that is, those transformations which commute with all left multiplications consist precisely of the right multiplications $R_b$ by elements of $A$. Let $A^{op}$ denote the opposite algebra of $A$. The algebra $A^{op}$ is defined to have $A$ as the underlying vector space, but with multiplication defined by $a \circ b = ba$, where the multiplication on the right hand side is the usual multiplication in $A$.
The algebra $A^{op}$ acts on $A$ by right multiplication: $b \cdot \vec{v} = \overrightarrow{vb}$ for $b \in A^{op}$ and $\vec{v} \in A$.
Problem 3.23 Verify that this defines a module action of $A^{op}$ on $A$.
Then $A^{op}$ acts as $\operatorname{End}_A(A)$, as in the statement of the theorem. If $A$ is completely decomposable as an $A$-module (i.e. if $A$ is a semisimple algebra), then
$$A \cong \bigoplus_{\lambda} A^\lambda \otimes (A^{op})^\lambda$$
as an $(A, A^{op})$-bimodule, where $A^\lambda$ and $(A^{op})^\lambda$ are the irreducible $A$- and $A^{op}$-modules respectively. Note that in this case, each irreducible $A$-module $A^\lambda$ appears in the decomposition of $A$.
We knew this already, however. If $A$ is semisimple, then $A \cong \bigoplus_{\lambda} M_{d_\lambda}(\mathbb{C})$, and the irreducible (left) $A$-modules are the column spaces, of dimension $d_\lambda$, for each simple component. Then, by taking transposes, we see that the irreducible $A^{op}$-modules are the spaces of row vectors for each simple component (note that they have the same dimension as the corresponding $A^\lambda$). Then $M_{d_\lambda}(\mathbb{C}) \cong A^\lambda \otimes (A^{op})^\lambda$, and we have the above decomposition.
We do get one useful piece of information from this situation, however. Continuing with $A$ semisimple, suppose $A^\lambda$ has character $\chi^\lambda$ and denote the representation by $\rho^\lambda$. Then the irreducible modules for the opposite algebra are defined by $\rho^\lambda_{op}(a) = \rho^\lambda(a)^t$. Thus,
$$\chi^\lambda_{A^{op}}(a) = \operatorname{tr}\big(\rho^\lambda(a)^t\big) = \chi^\lambda(a).$$
Applying the corollary,
$$\operatorname{tr}(L_a R_b) = \sum_{\lambda \in \hat{A}} \chi^\lambda(a)\,\chi^\lambda(b).$$
We next evaluate this trace in another way. Choose a basis $\{b_1, \dots, b_d\}$ of $A$ and a nondegenerate trace $t$ on $A$. Recall that $\langle a, b \rangle = t(ab)$ is then a non-degenerate symmetric bilinear form on $A$, so we may choose a dual basis $\{b_1^*, \dots, b_d^*\}$ with respect to this form.
We defined $\operatorname{tr}(T) = \sum_i \langle T\vec{b}_i, \vec{b}_i^{\,*}\rangle$ for a transformation $T$ of $A$. Evaluating the trace above,
$$\operatorname{tr}(L_a R_b) = \sum_i \langle \overrightarrow{a b_i b}, \vec{b}_i^{\,*}\rangle.$$
We have shown previously that the mapping $a \mapsto L_a$ has adjoint $a \mapsto R_a$ with respect to this form, hence $\operatorname{tr}(L_a R_b) = \sum_i t(a\, b_i\, b\, b_i^{*})$. Combining these two calculations,
$$\sum_{\lambda \in \hat{A}} \chi^\lambda(a)\,\chi^\lambda(b) = \sum_i t(a\, b_i\, b\, b_i^{*}).$$
Proposition 3.24 Let $A$ be a semisimple algebra with irreducible representations indexed by $\lambda \in \hat{A}$, and denote the character of $A^\lambda$ by $\chi^\lambda$. Let $t$ be a nondegenerate trace on $A$, with basis $\{b_1, \dots, b_d\}$ and dual basis $\{b_1^*, \dots, b_d^*\}$ as above. Then for all $a, b \in A$,
$$\sum_{\lambda \in \hat{A}} \chi^\lambda(a)\,\chi^\lambda(b) = \sum_{i=1}^{d} t(a\, b_i\, b\, b_i^{*}).$$
Example 3.25 Let $G$ be a finite group, and let $A = \mathbb{C}G$ be the group algebra over $\mathbb{C}$. By Maschke’s theorem, $\mathbb{C}G$ is semisimple. By definition, the elements of the group form a basis for $\mathbb{C}G$. We have shown that the linear function $t$ defined by $t(g) = \delta_{g,1}$ is a nondegenerate trace on $\mathbb{C}G$, and that the dual basis with respect to the associated bilinear form is $\{g^{-1} : g \in G\}$. Recall also that if $g \in G$, then $\sum_{k \in G} k g k^{-1} = |Z_g| \sum_{h \in C_g} h$, where $|Z_g|$ is the order of the stabilizer of $g$ under conjugation, and $C_g$ is the conjugacy class of $g$ (so $|G| = |Z_g|\,|C_g|$). Applying the proposition in this context yields:
Proposition 3.26 (Second Orthogonality Condition) Let $G$ be a finite group. If $g, h \in G$ then
$$\sum_{\lambda \in \hat{G}} \chi^\lambda(g)\,\chi^\lambda(h) = \begin{cases} |Z_g| & \text{if } h^{-1} \in C_g, \\ 0 & \text{otherwise.} \end{cases}$$
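As a quick sanity check (not part of the notes), the second orthogonality condition can be verified numerically for $G = S_3$, whose character table and centralizer orders are hardcoded below; in $S_3$ every element is conjugate to its inverse, so the condition reads $\sum_\lambda \chi^\lambda(g)\chi^\lambda(h) = |Z_g|$ when $g, h$ are conjugate and $0$ otherwise.

```python
# Character table of S_3 (rows: irreducibles), columns indexed by the
# conjugacy classes: identity, transpositions, 3-cycles.
classes = ["e", "transposition", "3-cycle"]
char_table = {
    "trivial":  {"e": 1, "transposition": 1,  "3-cycle": 1},
    "sign":     {"e": 1, "transposition": -1, "3-cycle": 1},
    "standard": {"e": 2, "transposition": 0,  "3-cycle": -1},
}
# |Z_g| = order of the stabilizer of g under conjugation: 6, 2, 3.
centralizer = {"e": 6, "transposition": 2, "3-cycle": 3}

for cg in classes:
    for ch in classes:
        s = sum(row[cg] * row[ch] for row in char_table.values())
        expected = centralizer[cg] if cg == ch else 0
        assert s == expected, (cg, ch, s)
print("second orthogonality verified for S_3")
```

Note that for a class of transpositions, for instance, the sum is $1 + 1 + 0 = 2 = |Z_g|$, while any two distinct classes give $0$.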
Let $x_1, \dots, x_n$ be non-commuting variables and let $V$ be the vector space with basis $\{x_1, \dots, x_n\}$. Let $V^{\otimes k}$ be the vector space with basis the words $\{x_{i_1} x_{i_2} \cdots x_{i_k} : 1 \le i_j \le n\}$. Then the symmetric group $S_k$ acts on $V^{\otimes k}$ by permuting the positions of the letters in each word:
$$\sigma(x_{i_1} \cdots x_{i_k}) = x_{i_{\sigma^{-1}(1)}} \cdots x_{i_{\sigma^{-1}(k)}}.$$
What is $\operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$? We know that $\mathbb{C}S_k$ is semisimple by Maschke’s theorem, so $V^{\otimes k}$ is completely decomposable. To use the theorems of the last few classes, we’d like to find an algebra $A$ acting on $V^{\otimes k}$ such that $A = \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$. Then
$$V^{\otimes k} \cong \bigoplus_{\lambda} A^\lambda \otimes S^\lambda,$$
where $A^\lambda$ are the irreducible $A$-modules and $S^\lambda$ are the irreducible $\mathbb{C}S_k$-modules.
Of course, there is a natural action of the general linear group $GL_n(\mathbb{C})$, the set of all invertible $n \times n$ complex matrices, on $V$, given by $g x_i = \sum_j g_{ji} x_j$, where $g = (g_{ij})$. We may extend this action to $V^{\otimes k}$ as the linear extension of
$$g(x_{i_1} \cdots x_{i_k}) = (g x_{i_1})(g x_{i_2}) \cdots (g x_{i_k}).$$
This is a group representation, so we may consider the representation $\rho$ of the group algebra $\mathbb{C}GL_n(\mathbb{C})$. A word of caution is necessary: since this is a group algebra, the invertible matrices form a basis, and the addition operation is formal addition, which differs from the usual addition of matrices.
It is easy to see that $\rho(\mathbb{C}GL_n(\mathbb{C})) \subseteq \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$; that is, the action of $GL_n(\mathbb{C})$ commutes with the action of $S_k$ on $V^{\otimes k}$ (we shall prove it). That the reverse inclusion holds is as surprising as it is beautiful.
Theorem 3.27 (Schur-Weyl Duality) With the above notation,
$$\rho(\mathbb{C}GL_n(\mathbb{C})) = \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k}).$$
This is equivalent to the classical Fundamental Theorem of Invariant Theory.
Lemma 3.28 For all $g \in GL_n(\mathbb{C})$ and $\sigma \in S_k$, $g\sigma = \sigma g$ as transformations of $V^{\otimes k}$.
Proof.
Let $g \in GL_n(\mathbb{C})$ and suppose $x_{i_1} \cdots x_{i_k}$ is a basis word in $V^{\otimes k}$. Then
$$g(x_{i_1} \cdots x_{i_k}) = \sum_{j_1, \dots, j_k} g_{j_1 i_1} \cdots g_{j_k i_k}\, x_{j_1} \cdots x_{j_k}.$$
Acting by $\sigma$, we have
$$\sigma g(x_{i_1} \cdots x_{i_k}) = \sum_{j_1, \dots, j_k} g_{j_1 i_1} \cdots g_{j_k i_k}\, x_{j_{\sigma^{-1}(1)}} \cdots x_{j_{\sigma^{-1}(k)}} = \sum_{j_1, \dots, j_k} g_{j_1 i_{\sigma^{-1}(1)}} \cdots g_{j_k i_{\sigma^{-1}(k)}}\, x_{j_1} \cdots x_{j_k},$$
after replacing $j_t$ with $j_{\sigma(t)}$. Acting by $\sigma$ first instead, we obtain
$$g\sigma(x_{i_1} \cdots x_{i_k}) = g(x_{i_{\sigma^{-1}(1)}} \cdots x_{i_{\sigma^{-1}(k)}}) = \sum_{j_1, \dots, j_k} g_{j_1 i_{\sigma^{-1}(1)}} \cdots g_{j_k i_{\sigma^{-1}(k)}}\, x_{j_1} \cdots x_{j_k}.$$
Hence $g\sigma = \sigma g$. $\square$
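The commutation in the lemma is easy to test in code for small $n$ and $k$. The sketch below (my own implementation choices: tensors stored as dictionaries from index tuples to coefficients, a permutation given in one-line "position" form) applies $g$ and $\sigma$ in both orders to a basis word and compares the results:

```python
from itertools import product

n, k = 2, 3
g = [[2.0, 1.0], [1.0, 1.0]]  # an invertible 2x2 matrix (det = 1)

def act_g(g, v):
    """Apply g to each tensor factor: x_i -> sum_j g[j][i] x_j."""
    out = {}
    for idx, c in v.items():
        for jdx in product(range(n), repeat=k):
            w = c
            for j, i in zip(jdx, idx):
                w *= g[j][i]
            out[jdx] = out.get(jdx, 0.0) + w
    return out

def act_perm(sigma, v):
    """Move the letter in position t to position sigma[t]."""
    out = {}
    for idx, c in v.items():
        jdx = [0] * k
        for t in range(k):
            jdx[sigma[t]] = idx[t]
        out[tuple(jdx)] = out.get(tuple(jdx), 0.0) + c
    return out

sigma = (1, 2, 0)        # a 3-cycle on positions
v = {(0, 1, 1): 1.0}     # the basis word x_1 x_2 x_2

lhs = act_perm(sigma, act_g(g, v))   # g first, then sigma
rhs = act_g(g, act_perm(sigma, v))   # sigma first, then g
assert lhs == rhs
print("g and sigma commute on this word")
```

The same check passes for any word and any $\sigma$, since $g$ acts identically on every tensor position.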
The remaining direction of the theorem will take some work. Although we have not yet proven that $\rho(\mathbb{C}GL_n(\mathbb{C}))$ generates the full centralizer of $\mathbb{C}S_k$ on $V^{\otimes k}$, we do know that the actions of $GL_n(\mathbb{C})$ and $S_k$ commute; hence, we may view $V^{\otimes k}$ as a $(\mathbb{C}GL_n(\mathbb{C}), \mathbb{C}S_k)$-bimodule. Here the pair $(g, \sigma)$ acts by $v \mapsto g\sigma v$.
Let us compute the trace of $g\sigma$ on $V^{\otimes k}$. For $h \in GL_n(\mathbb{C})$ and $\tau \in S_k$, we have
$$\operatorname{tr}\big((hgh^{-1})(\tau\sigma\tau^{-1})\big) = \operatorname{tr}\big((h\tau)\, g\sigma\, (h\tau)^{-1}\big) = \operatorname{tr}(g\sigma).$$
Thus we may choose to calculate $\operatorname{tr}(g\sigma)$ using convenient conjugates of $g$ and $\sigma$. For $\sigma \in S_k$, there exists $\tau \in S_k$ such that $\tau\sigma\tau^{-1} = \gamma_\mu$, where $\mu$ is the cycle type of $\sigma$ and $\gamma_\mu$ is the permutation with cycles of lengths $\mu_1, \mu_2, \dots$ arranged on consecutive letters. Moreover, there exists $h \in GL_n(\mathbb{C})$ such that $hgh^{-1}$ is in Jordan Canonical Form. In particular, the eigenvalues $x_1, \dots, x_n$ lie along the diagonal. These are nonzero, since $g$ is invertible.
We have
$$\operatorname{tr}(g\sigma) = \operatorname{tr}\big((hgh^{-1})\,\gamma_\mu\big), \quad \text{with } hgh^{-1} \text{ in Jordan Canonical Form.}$$
Homework Problem 3.29 Let $d = \operatorname{diag}(x_1, \dots, x_n)$ be the diagonal part of the Jordan Canonical Form of $g$. Then $\operatorname{tr}\big((hgh^{-1})\,\gamma_\mu\big) = \operatorname{tr}(d\,\gamma_\mu)$.
So we need only compute $\operatorname{tr}(d\,\gamma_\mu)$ on $V^{\otimes k}$ with $d = \operatorname{diag}(x_1, \dots, x_n)$ the diagonal of the Jordan Canonical Form. Write $\mu = (\mu_1, \dots, \mu_\ell)$, so $\mu_1 + \cdots + \mu_\ell = k$ and $\mu_j > 0$ for each $j$. Then
$$\operatorname{tr}(d\,\gamma_\mu) = \sum_{i_1, \dots, i_k} x_{i_1} \cdots x_{i_k}\, \delta_{i_1 i_{\gamma_\mu(1)}} \cdots \delta_{i_k i_{\gamma_\mu(k)}}.$$
Note that the terms in the last summation are zero unless the word $x_{i_1} \cdots x_{i_k}$ is fixed by $\gamma_\mu$, i.e. unless $(i_1, \dots, i_k)$ is constant on each cycle of $\gamma_\mu$. Hence
$$\operatorname{tr}(d\,\gamma_\mu) = \prod_{j=1}^{\ell} \big(x_1^{\mu_j} + \cdots + x_n^{\mu_j}\big) = p_\mu(x_1, \dots, x_n)$$
is a symmetric function in the eigenvalues (called the power symmetric function). To summarize,
Proposition 3.30 If $\sigma \in S_k$ has cycle type $\mu$ and $g \in GL_n(\mathbb{C})$ has eigenvalues $x_1, \dots, x_n$, then
$$\operatorname{tr}_{V^{\otimes k}}(g\sigma) = p_\mu(x_1, \dots, x_n).$$
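Proposition 3.30 is easy to test numerically for a diagonal $g$ (a sketch, not from the notes): the brute-force trace below sums the diagonal entries of $d\,\gamma_\mu$ over all words and compares with $p_\mu$ evaluated at the eigenvalues.

```python
from itertools import product

n, k = 3, 5
x = [2.0, 3.0, 5.0]   # eigenvalues of the diagonal matrix d
mu = (3, 2)           # cycle type; gamma_mu = (1 2 3)(4 5) on positions

# Build gamma_mu as a map t -> gamma_mu(t) on positions 0..k-1,
# with cycles of lengths mu_1, mu_2, ... on consecutive letters.
gamma = [0] * k
start = 0
for part in mu:
    for t in range(part):
        gamma[start + t] = start + (t + 1) % part
    start += part

# tr(d gamma_mu) on V^{tensor k}: a word contributes its diagonal entry
# x_{i_1} ... x_{i_k} iff it is fixed by gamma_mu (constant on cycles).
trace = 0.0
for idx in product(range(n), repeat=k):
    if all(idx[gamma[t]] == idx[t] for t in range(k)):
        c = 1.0
        for i in idx:
            c *= x[i]
        trace += c

# Power symmetric function p_mu = prod_j (x_1^{mu_j} + ... + x_n^{mu_j}).
p_mu = 1.0
for part in mu:
    p_mu *= sum(xi ** part for xi in x)

assert trace == p_mu
print(trace)
```

Here $\operatorname{tr}(d\gamma_\mu) = (2^3 + 3^3 + 5^3)(2^2 + 3^2 + 5^2) = 160 \cdot 38 = 6080$, matching the brute-force sum.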
Aside. Note that not all representations of $GL_n(\mathbb{C})$ are given by matrices whose entries are rational functions of the entries of $g$. For example, let $\rho$ be the representation defined by
$$\rho(g) = \begin{pmatrix} 1 & \log|\det g| \\ 0 & 1 \end{pmatrix},$$
whose character has constant value 2.
Representations $\rho$ of $GL_n(\mathbb{C})$ in which the entries of the matrix $\rho(g)$ are rational functions of the entries of $g$ are called rational representations. If the entries of $\rho(g)$ are, in fact, polynomial functions in the entries of $g$, the representation is said to be a polynomial representation. Since the trace of $\rho(g)$ is a polynomial in the entries of $\rho(g)$, one can equally refer to rational or polynomial characters. Of course, the polynomial representations are a subset of the rational representations.
The proof of the fundamental theorem via symmetric functions, as suggested last time, can be found in paper 59 of Schur’s collected works. The proof we offer here is due to Curtis and Reiner [CR62].
Theorem 3.31 (Fundamental Theorem of Invariant Theory)
$$\rho(\mathbb{C}GL_n(\mathbb{C})) = \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k}).$$
Proof.
We know that $\rho(\mathbb{C}GL_n(\mathbb{C})) \subseteq \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$, so we show the reverse inclusion. Let $B \in \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$. Then the action of $B$ on a word is
$$B\, x_{i_1} \cdots x_{i_k} = \sum_{(j)} B_{(j),(i)}\, x_{j_1} \cdots x_{j_k},$$
where $(i)$ denotes the sequence $(i_1, \dots, i_k)$, $(j)$ denotes the sequence $(j_1, \dots, j_k)$, and $B_{(j),(i)} \in \mathbb{C}$. If $\sigma \in S_k$, then for any word we have $B\sigma(x_{i_1} \cdots x_{i_k}) = \sigma B(x_{i_1} \cdots x_{i_k})$. Therefore, comparing coefficients on both sides, we conclude that
$$B_{\sigma(j),\sigma(i)} = B_{(j),(i)}$$
for all sequences $(i)$ and $(j)$ and all $\sigma \in S_k$, where $\sigma(i) = (i_{\sigma^{-1}(1)}, \dots, i_{\sigma^{-1}(k)})$. Let $P$ be the set of pairs of sequences $((j),(i))$, which we view as a two line array
$$\begin{pmatrix} j_1 & j_2 & \cdots & j_k \\ i_1 & i_2 & \cdots & i_k \end{pmatrix},$$
satisfying $(j_1, i_1) \le (j_2, i_2) \le \cdots \le (j_k, i_k)$ in lexicographic order, so that $P$ contains exactly one representative of each $S_k$-orbit of pairs.

We define a basis of $\operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$ as follows: for each pair $p = ((j),(i)) \in P$, let $E_p$ be defined by $(E_p)_{(l),(m)} = 1$ if $((l),(m)) = \sigma((j),(i))$ for some $\sigma \in S_k$, and by $(E_p)_{(l),(m)} = 0$ if $((l),(m)) \ne \sigma((j),(i))$ for any $\sigma \in S_k$. Then $E_p \in \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$, and every element $B \in \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$ can be written as $B = \sum_{p \in P} c_p E_p$ for $c_p \in \mathbb{C}$.

Homework Problem 3.32 Show that $\{E_p : p \in P\}$ is a basis of $\operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$.

In particular, if $g \in GL_n(\mathbb{C})$ and acts on $V^{\otimes k}$ as the transformation $\rho(g)$, then $\rho(g)_{(j),(i)} = g_{j_1 i_1} \cdots g_{j_k i_k}$, and so $\rho(g)_{\sigma(j),\sigma(i)} = \rho(g)_{(j),(i)}$. Therefore,
$$\rho(g) = \sum_{p \in P} g_{j_1 i_1} \cdots g_{j_k i_k}\, E_p.$$
Now define an inner product on $\operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$ by $\langle E_p, E_q \rangle = \delta_{pq}$ for all $p, q \in P$. This makes the basis $\{E_p\}$ of $\operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$ orthonormal with respect to $\langle \cdot, \cdot \rangle$. Now consider $\langle B, \rho(g) \rangle$, where $B \in \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k})$. Then we write $B$ and $\rho(g)$ in terms of the basis $\{E_p\}$ as $B = \sum_p c_p E_p$ and $\rho(g) = \sum_p g_{j_1 i_1} \cdots g_{j_k i_k} E_p$, and we have
$$\langle B, \rho(g) \rangle = \sum_{p \in P} c_p\, g_{j_1 i_1} \cdots g_{j_k i_k}.$$
Thus if $\langle B, \rho(g) \rangle = 0$ for all $g \in GL_n(\mathbb{C})$, we must have $\sum_p c_p\, g_{j_1 i_1} \cdots g_{j_k i_k} = 0$ for all choices of the entries $g_{st}$ for which $\det(g_{st}) \ne 0$. Let $y_{st}$, $1 \le s, t \le n$, be commuting variables, and define polynomials $f = \sum_{p \in P} c_p\, y_{j_1 i_1} \cdots y_{j_k i_k}$ and $d = \det(y_{st})$. Then the product $fd$ is zero when evaluated at all $(y_{st}) \in \mathbb{C}^{n^2}$, so, by the fundamental theorem of algebra, $fd = 0$. But $d \ne 0$ in $\mathbb{C}[y_{st}]$, and $\mathbb{C}[y_{st}]$ is an integral domain, so $f = 0$. Moreover, for $p \ne q$ in $P$, the monomials $y_{j_1 i_1} \cdots y_{j_k i_k}$ are distinct, since we ordered them. Thus $c_p = 0$ for all $p$, implying that any $B$ orthogonal to $\rho(\mathbb{C}GL_n(\mathbb{C}))$ is zero, and therefore,
$$\rho(\mathbb{C}GL_n(\mathbb{C})) = \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k}). \qquad \square$$
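As an independent check on the orbit basis $\{E_p\}$ (not in the notes), one can count the $S_k$-orbits of pairs of sequences for small $n, k$ and compare with $\dim \operatorname{End}_{\mathbb{C}S_k}(V^{\otimes k}) = \sum_\lambda m_\lambda^2$, where the multiplicities $m_\lambda$ are computed from the hardcoded character table of $S_3$:

```python
from itertools import permutations, product

n, k = 2, 3   # V = C^2; S_3 acts on V tensor 3
perms = list(permutations(range(k)))

def cycle_count(sigma):
    """Number of cycles of a permutation given in one-line notation."""
    seen, c = set(), 0
    for s in range(k):
        if s not in seen:
            c += 1
            t = s
            while t not in seen:
                seen.add(t)
                t = sigma[t]
    return c

# Irreducible characters of S_3, keyed by cycle count of the argument
# (3 = identity, 2 = transposition, 1 = 3-cycle).
chi = {
    "trivial":  {3: 1, 2: 1,  1: 1},
    "sign":     {3: 1, 2: -1, 1: 1},
    "standard": {3: 2, 2: 0,  1: -1},
}

# Multiplicity of S^lambda in V^{tensor k}: the trace of sigma on the word
# basis is n^{cycles(sigma)}, so m = (1/k!) sum_sigma chi(sigma) n^{c(sigma)}.
mults = {}
for name, ch in chi.items():
    total = sum(ch[cycle_count(s)] * n ** cycle_count(s) for s in perms)
    mults[name] = total // len(perms)

dim_end = sum(m * m for m in mults.values())

# One E_p per S_k-orbit on pairs of sequences ((j),(i)).
words = list(product(range(n), repeat=k))
orbits = set()
for j in words:
    for i in words:
        orbit = frozenset(
            (tuple(j[s[t]] for t in range(k)), tuple(i[s[t]] for t in range(k)))
            for s in perms)
        orbits.add(orbit)

assert len(orbits) == dim_end
print("number of E_p:", len(orbits), "= sum of m_lambda^2:", dim_end)
```

For $n = 2$, $k = 3$ both counts come out to $4^2 + 0^2 + 2^2 = 20$, consistent with $\{E_p\}$ being a basis of the centralizer.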
Now let $A = \rho(\mathbb{C}GL_n(\mathbb{C}))$ and $B = \mathbb{C}S_k$. Then $V^{\otimes k}$ is a module for both $A$ and $B$, with the property that $A = \operatorname{End}_B(V^{\otimes k})$. Moreover, $V^{\otimes k}$ is completely decomposable as a $B$-module (by Maschke’s theorem), so the theory above applies, and we have the decomposition
$$V^{\otimes k} \cong \bigoplus_{\lambda} G^\lambda \otimes S^\lambda,$$
where $G^\lambda$ is an irreducible $GL_n(\mathbb{C})$-module and $S^\lambda$ is an irreducible $S_k$-module. We showed that if $\sigma$ has cycle type $\mu$ and $g$ has eigenvalues $x_1, \dots, x_n$, then
$$\operatorname{tr}(g\sigma) = p_\mu(x_1, \dots, x_n) = \sum_{\lambda \vdash k} \chi^\lambda(\mu)\, s_\lambda(x_1, \dots, x_n),$$
where $\chi^\lambda(\mu)$ is the irreducible character of $S_k$ evaluated on the conjugacy class labeled by $\mu$, and $s_\lambda$ is the Schur function corresponding to $\lambda$. It follows that:
Corollary 3.33
(1) $GL_n(\mathbb{C})$ has some irreducible representations $G^\lambda$ that can be indexed by the partitions $\lambda$ of $k$ having at most $n$ rows.
(2) The character of the irreducible $GL_n(\mathbb{C})$-module indexed by $\lambda$, evaluated at an element with eigenvalues $x_1, \dots, x_n$, is given by the Schur function $s_\lambda(x_1, \dots, x_n)$.
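The expansion $p_\mu = \sum_{\lambda \vdash k} \chi^\lambda(\mu)\, s_\lambda$ behind the corollary can be spot-checked for $k = 3$ in $n = 2$ variables at a sample point (a sketch with the character table of $S_3$ and the two-variable Schur polynomials written out by hand; $s_{(1,1,1)} = 0$ since $(1,1,1)$ has more than two rows):

```python
x, y = 2.0, 3.0   # a sample point (eigenvalues of a 2x2 matrix)

# Schur polynomials in two variables for partitions of 3.
s = {
    (3,):      x**3 + x**2 * y + x * y**2 + y**3,
    (2, 1):    x**2 * y + x * y**2,
    (1, 1, 1): 0.0,   # more than n = 2 rows
}

# Character table of S_3, rows indexed by lambda, columns by cycle type mu.
chi = {
    (3,):      {(1, 1, 1): 1, (2, 1): 1,  (3,): 1},   # trivial
    (2, 1):    {(1, 1, 1): 2, (2, 1): 0,  (3,): -1},  # standard
    (1, 1, 1): {(1, 1, 1): 1, (2, 1): -1, (3,): 1},   # sign
}

def p(mu):
    """Power symmetric function p_mu(x, y) = prod_j (x^mu_j + y^mu_j)."""
    out = 1.0
    for r in mu:
        out *= x**r + y**r
    return out

for mu in [(1, 1, 1), (2, 1), (3,)]:
    rhs = sum(chi[lam][mu] * s[lam] for lam in s)
    assert abs(p(mu) - rhs) < 1e-9, mu
print("p_mu = sum_lambda chi^lambda(mu) s_lambda checked for k = 3, n = 2")
```

For instance $p_{(3)} = x^3 + y^3 = s_{(3)} - s_{(2,1)}$, matching the column of the character table at the 3-cycles.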
This is a copy of lectures in Representation Theory given by Arun Ram, compiled by Tom Halverson, Rob Leduc and Mark McKinzie.