Notes on Schubert Polynomials
Chapter 5
Arun Ram
Department of Mathematics and Statistics
University of Melbourne
Parkville, VIC 3010 Australia
aram@unimelb.edu.au
Last update: 2 July 2013
Orthogonality
Recall that
where
are independent indeterminates.
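For orientation, the notation used throughout this chapter is the standard one from the earlier chapters; in its usual form it reads

$P_n = \mathbb{Z}[x_1,\ldots,x_n]$, $\qquad \Lambda_n = \mathbb{Z}[x_1,\ldots,x_n]^{S_n}$,

$\partial_i f = \dfrac{f - s_i f}{x_i - x_{i+1}}$, $\qquad \partial_w = \partial_{a_1}\cdots\partial_{a_p}$ for any reduced word $w = s_{a_1}\cdots s_{a_p}$,

$\delta = (n-1, n-2, \ldots, 1, 0)$, $\qquad \mathfrak{S}_w = \partial_{w^{-1}w_0}\,x^\delta$ for $w \in S_n$,

where $s_i$ is the transposition exchanging $x_i$ and $x_{i+1}$ and $w_0$ is the longest element of $S_n$.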
(5.1)
is a free
of rank with basis
Proof.
by induction on The result is trivially true when
so assume that and that is a free
with basis
Since
it follows that is a free
with basis Now
because the identities
show that
and on the other hand it is clear that
Hence is a free
with basis
To complete the proof it remains to show that
is a free with basis
Since
we have
from which it follows that the
generate
as a On the other hand, if we have a relation
of linear dependence
with coefficients
then we have also
for
and since
it follows that
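As a small worked example of this freeness statement (taking (5.1) in its usual form: $P_n$ is a free $\Lambda_n$-module of rank $n!$ with basis the monomials $x^a$ such that $0 \le a_i \le n-i$ for each $i$), consider $n = 2$, where $\Lambda_2 = \mathbb{Z}[e_1, e_2]$ with $e_1 = x_1 + x_2$ and $e_2 = x_1x_2$:

$P_2 = \Lambda_2 \cdot 1 \oplus \Lambda_2 \cdot x_1$, with for instance $x_2 = e_1 \cdot 1 - x_1$, $\quad x_1x_2 = e_2 \cdot 1$, $\quad x_1^2 = e_1\,x_1 - e_2 \cdot 1$.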
As before, let
By reversing the order of
in (5.1) it follows that
(5.1')
The monomials
(i.e., for
form a
of
We define a scalar product on with values in
by the rule
where is the longest element of Since
is
so is the scalar product.
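For instance, taking the rule in its usual form $\langle f, g\rangle = \partial_{w_0}(fg)$, the case $n = 2$ (where $w_0 = s_1$) gives

$\langle 1, 1\rangle = \partial_1(1) = 0$, $\qquad \langle 1, x_1\rangle = \partial_1(x_1) = 1$, $\qquad \langle x_1, x_1\rangle = \partial_1(x_1^2) = x_1 + x_2$,

and these values together with $\Lambda_2$-linearity determine $\langle f, g\rangle$ for all $f, g \in P_2$, by (5.1).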
(5.3)
Let and
Then
(i)
(ii)
where
is the sign of
Proof.
(i) It is enough to show that
for We have
because is symmetrical in and
The last expression is symmetrical in
and hence
as required.
(ii) Again it is enough to show that
We have
and since
this is equal to
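Both parts of (5.3) are easy to check by machine for small $n$. The following sketch does this for $n = 3$ with SymPy, assuming the usual definitions $\partial_i f = (f - s_i f)/(x_i - x_{i+1})$ and $\langle f, g\rangle = \partial_{w_0}(fg)$ (the function names s, d, dw0 and pair are ad hoc):

import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
x = [x1, x2, x3]

def s(i, f):
    # the transposition s_i, exchanging x_i and x_{i+1} (i = 1 or 2)
    return f.subs({x[i-1]: x[i], x[i]: x[i-1]}, simultaneous=True)

def d(i, f):
    # divided difference: d_i f = (f - s_i f) / (x_i - x_{i+1})
    return sp.cancel((f - s(i, f)) / (x[i-1] - x[i]))

def dw0(f):
    # d_{w_0} for S_3, computed from the reduced word w_0 = s_1 s_2 s_1
    return d(1, d(2, d(1, f)))

def pair(f, g):
    # scalar product <f, g> = d_{w_0}(f g), with values in Lambda_3
    return sp.expand(dw0(sp.expand(f * g)))

f = x1**2*x3 + x2
g = x1*x2 + x3**2

# (5.3)(i) with w = s_1:  <d_1 f, g> = <f, d_1 g>
assert sp.simplify(pair(d(1, f), g) - pair(f, d(1, g))) == 0

# (5.3)(ii) with w = s_1:  <s_1 f, s_1 g> = -<f, g>
assert sp.simplify(pair(s(1, f), s(1, g)) + pair(f, g)) == 0

print(dw0(x1**2 * x2))      # 1
print(pair(x1**2, x1*x2))   # x1 + x2 + x3, a symmetric polynomial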
(5.4)
Let be such that
Then
Proof.
We have
by (5.3). Also
hence
It follows that
(5.5)
Let Then
Proof.
We have
by (5.3) and (2.12). By (4.2) the scalar product is therefore zero unless
and then it is equal to
Now is a linear combination of monomials
such that and
Hence
is a sum of monomials where
Now
unless all the components of are distinct; since
for each
it follows that
unless for some
and in that case
must have all its components So the only possibility that gives a nonzero scalar product is
and in that case
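For orientation, here is the complete table of scalar products of Schubert polynomials in the smallest case $n = 2$, where $\mathfrak{S}_e = 1$ and $\mathfrak{S}_{s_1} = x_1$, computed directly from $\langle f, g\rangle = \partial_{w_0}(fg)$ as above:

$\langle \mathfrak{S}_e, \mathfrak{S}_e\rangle = 0$, $\qquad \langle \mathfrak{S}_e, \mathfrak{S}_{s_1}\rangle = \langle \mathfrak{S}_{s_1}, \mathfrak{S}_e\rangle = 1$, $\qquad \langle \mathfrak{S}_{s_1}, \mathfrak{S}_{s_1}\rangle = x_1 + x_2$.

In particular, in complementary degrees ($\ell(u) + \ell(v) = \ell(w_0) = 1$) the scalar product $\langle \mathfrak{S}_u, \mathfrak{S}_v\rangle$ equals $1$.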
(5.6)
The Schubert polynomials
form a of
Proof.
Let and let
with coefficients
Let
Then from (5.5) we have
or in matrix terms
where
and
are square matrices of size with coefficients in
From (3) it follows that each of
has determinant
hence the equations (2) can be solved for
as combinations of the Schubert polynomials
Since by (5.1') the form a
of so also do the
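A small illustration of this basis property for $n = 2$, where the Schubert polynomials are $\mathfrak{S}_e = 1$ and $\mathfrak{S}_{s_1} = x_1$ and $\Lambda_2 = \mathbb{Z}[e_1, e_2]$:

$x_2 = e_1\,\mathfrak{S}_e - \mathfrak{S}_{s_1}$, $\qquad x_1x_2 = e_2\,\mathfrak{S}_e$, $\qquad x_2^2 = (e_1^2 - e_2)\,\mathfrak{S}_e - e_1\,\mathfrak{S}_{s_1}$.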
We have
for all
Proof.
Let denote the right-hand side of
(5.7). We claim first that
For this it is enough to show that for
Let
then is the disjoint union of and
and
Hence
Since for all we have
it follows that
for all as required.
Next, since each operator is
it follows that is
in each argument. By (5.6) it is therefore enough
to verify (5.7) when and
where
We have then
which by (4.2) is equal to
summed over such that
and
Hence the polynomial (2) is (i) symmetric in
(by (1) above), (ii) independent of (iii) homogeneous of degree
Hence it vanishes unless
and in which case it is equal to
Hence
by (5.5). This completes the proof of (5.7).
Now let
and
be two sequences of independent variables, and let
(the "semiresultant"). We have
For
is non-zero if and only if whenever
that is to say if and only if
and
The polynomial
is a linear combination of the monomials
with coefficients in
hence by (4.11) can be written uniquely in the form
with
By (5.5) we have
where the suffix means that the scalar product is taken in the variables. Hence
by (2.10), where acts by permuting the
Now this expression (1) must be independent of
Hence we may set
But then (5.9) shows that the
only non-zero term in the sum (1) is that corresponding to and we obtain
Hence we have proved
(5.10) ("Cauchy formula")
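In one common normalization (writing the semiresultant as $\prod_{i+j\le n}(x_i + y_j)$), the formula reads

$\displaystyle\prod_{i+j\le n}(x_i + y_j) = \sum_{w\in S_n} \mathfrak{S}_w(x)\,\mathfrak{S}_{ww_0}(y)$.

For $n = 2$ this is just $x_1 + y_1 = \mathfrak{S}_e(x)\,\mathfrak{S}_{w_0}(y) + \mathfrak{S}_{s_1}(x)\,\mathfrak{S}_e(y) = y_1 + x_1$, and for $n = 3$ it can be checked by expanding $(x_1+y_1)(x_1+y_2)(x_2+y_1)$ directly.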
Remark.
Let where
and regard as a subgroup of
with permuting
and
permuting
Let
be the longest elements of respectively, and let
If we have
if
that is to say if is Grassmannian (with its only descent at and
otherwise. Hence by applying
to the in (5.10) we obtain
where is the set of Grassmannian
permutations with descent at (i.e.
if On the other hand, it is easily verified that
and that is the permutation
hence is also Grassmannian, with descent at
The shape of is
and the shape of is say
The relation between these two partitions is
that is to say is the complement, say of
in the rectangle with rows and
columns. Hence, replacing each by
we obtain from (5.10) by operating with
on both sides and using (4.8)
summed over all where
is the complement of in
This is one version of the usual Cauchy identity
[Mac1979, Chapter I, (4.3)'].
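In its usual finite form, the identity referred to here can be written as

$\displaystyle\prod_{i=1}^{p}\prod_{j=1}^{q}(x_i + y_j) = \sum_{\lambda \subseteq (q^p)} s_\lambda(x_1,\ldots,x_p)\, s_{\hat\lambda'}(y_1,\ldots,y_q)$,

where $\hat\lambda$ is the complement of $\lambda$ in the rectangle with $p$ rows and $q$ columns; for example, with $p = 1$ and $q = 2$,

$(x_1+y_1)(x_1+y_2) = s_{(2)}(x_1) + s_{(1)}(x_1)\,s_{(1)}(y_1,y_2) + s_{(1,1)}(y_1,y_2) = x_1^2 + x_1(y_1+y_2) + y_1y_2$.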
Let
be the of
dual to the basis relative to the
scalar product (5.2). By (5.3) and (5.5) we have
or equivalently
which shows that
for all From (5.10) it follows that
or equivalently
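To make the duality concrete in the smallest case $n = 2$: from the scalar products $\langle 1, 1\rangle = 0$, $\langle 1, x_1\rangle = 1$, $\langle x_1, x_1\rangle = e_1$ recorded above, the basis of $P_2$ dual to the Schubert basis $\{\mathfrak{S}_e, \mathfrak{S}_{s_1}\} = \{1, x_1\}$ is $\{-x_2,\ 1\}$, since

$\langle -x_2, 1\rangle = \partial_1(-x_2) = 1$, $\quad \langle -x_2, x_1\rangle = \partial_1(-x_1x_2) = 0$, $\quad \langle 1, 1\rangle = 0$, $\quad \langle 1, x_1\rangle = 1$.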
Let
be the basis dual to If
then by taking scalar products we have
and therefore also
so that
From (5.13) it follows that is the coefficient of in
and hence we find
where
Let
If (4.11), let
denote the polynomial in
obtained by replacing each by Then we have
where as before the suffix means that the scalar product is taken in the variables. In other words,
is a "reproducing kernel" for the scalar product.
Proof.
From (5.13) we have
Hence by (5.5)
Hence (5.15) is true for all Schubert polynomials
Since the scalar product is it follows from (5.6)
that (5.15) is true for all
Let be the homomorphism that replaces each
by Then (5.15) can be restated in the form
for all
Now let
be a third set of variables and consider
for where
and act on the
variables. By (5.3) this is equal to
and by (5.15') we have
Since and
commute, it follows from (1)-(4) that
Hence we have
for all where
and
Let denote the algebra of operators of the form
with coefficients
For such a we have
for all where and
act on the variables in
For
and by (5.8)
is zero if and is equal to
if
Let and let
be a reduced word for so that
Since
for each it follows that we may write
where means that is of the form
where
is a subword of
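As an illustration of the subword condition, take the reduced word $s_1 s_2 s_1$ of the longest element of $S_3$: its subwords, read as products, give exactly the six elements $e,\ s_1,\ s_2,\ s_1s_2,\ s_2s_1,\ s_1s_2s_1$ of $S_3$ (the subword $s_1\cdot s_1$ obtained by omitting the middle letter reduces to $e$).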
The coefficients in (5.18) are polynomials, for it follows from (5.16) and (5.17) that
(5.20)
For all we have
Proof.
From (5.18) we have
By (5.9) this is zero if and if
then by (2.10)
by (5.9) again.
The matrix of coefficients in (5.18) is triangular with
respect to the ordering and one sees easily that the diagonal entries
are non-zero (they are products in which each factor is of the form
Hence we may invert the equations (5.18), say
and thus we can express any as a linear combination of the operators
Explicitly, we have
Proof.
By linearity we may assume that with
Then
Now by (4.2)
is either zero or equal to
and by (5.20)
is zero if and is equal to if
Hence the right-hand side of (5.22) is equal to
as required.
In particular, it follows from (5.22) and (5.21) that
hence is a polynomial.
The coefficients
in (5.18) and (5.23) satisfy the following relations:
(5.24)
(i)
(ii)
(iii)
for all
where
Proof.
(i) By (5.23) and (2.12) we have
(ii) From (5.18) we have
and likewise
again by (5.9). Hence (ii) follows from (5.16).
(iii) Since
(2.12) we have
and hence
(5.25)
Let be the subalgebra of operators
such that
Then a free
with basis
Proof.
If
then by (5.22)
On the other hand, the are a of
and hence are linearly independent over
Notes and References
This is a typed excerpt of the book Notes on Schubert Polynomials by I. G. Macdonald.