#18453 closed defect (fixed)
Infinite affine crystals should use extended weight lattice
Reported by:  bump  Owned by:  

Priority:  major  Milestone:  sage-6.8 
Component:  combinatorics  Keywords:  crystals, days65 
Cc:  sage-combinat, bsalisbury1, tscrim, aschilling, nthiery, ptingley  Merged in:  
Authors:  Ben Salisbury, Anne Schilling, Travis Scrimshaw  Reviewers:  Dan Bump 
Report Upstream:  N/A  Work issues:  
Branch:  ee398ef (Commits, GitHub, GitLab)  Commit:  
Dependencies:  Stopgaps: 
Description
Infinite affine crystals should use the extended weight lattice. This would include both B(infinity) crystals and crystals of integrable representations.
Here is an example.
sage: C = crystals.infinity.NakajimaMonomials(['A',1,1])
sage: v = C.highest_weight_vector()
sage: v.f(1).weight() + v.f(0).weight()
0
The answer should be -delta. The problem is that the extended weight lattice is not being used.
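For context, here is why a multiple of delta is expected (my summary, using the standard conventions wt of the highest weight vector of B(\infty) is 0, wt(f_i b) = wt(b) - \alpha_i, and \delta = \alpha_0 + \alpha_1 in type ['A',1,1]); in the non-extended weight lattice \delta is identified with 0, which is why the computation collapses:

```latex
\mathrm{wt}(f_1 v) + \mathrm{wt}(f_0 v)
  = (-\alpha_1) + (-\alpha_0)
  = -(\alpha_0 + \alpha_1)
  = -\delta,
```

which is a nonzero multiple of \delta only in the extended lattice.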
Change History (62)
comment:1 Changed 6 years ago by
 Type changed from PLEASE CHANGE to defect
comment:2 Changed 6 years ago by
 Cc bsalisbury1 tscrim aschilling added
comment:3 Changed 6 years ago by
 Cc nthiery added
 Component changed from PLEASE CHANGE to combinatorics
 Keywords crystals added
comment:4 Changed 6 years ago by
There may be cases where one does not want the extended weight lattice. For example, for Kirillov-Reshetikhin crystals I think it would complicate things if you used the extended weight lattice.
This is addressed by Travis' proposal, since he would create a new category for finite crystals that use the non-extended weight lattice.
The alternative way of fixing things would be to find the places where the wrong weight lattice is created (for example in the __classcall_private__ method of InfinityCrystalOfNakajimaMonomials) and add extended=True to the parameters of weight_lattice. Both ways would fix the problem.
One question is whether we more often than not want affine weight lattices to be extended. I myself do. So I would support Travis' proposal but I am pointing out that there is an alternative.
comment:5 Changed 6 years ago by
 Branch set to public/combinat/fix_crystal_weight_lattice-18453
 Commit set to b4a764743046481af4a6bb88ca353d607535d339
Okay, here's a rough draft of my proposal that mostly works. I would like some feedback, in particular from Nicolas, on the change of behavior and the reorganization of the code. There also seems to be a subtle issue with subclassing and @cached_method (with my branch):
sage: P = RootSystem(['A',3,1]).weight_lattice()
sage: P.fundamental_weight('delta')
delta
sage: P.fundamental_weights()
Finite family {0: Lambda[0], 1: Lambda[1], 2: Lambda[2], 3: Lambda[3]}
Restart Sage and do
sage: P = RootSystem(['A',3,1]).weight_lattice()
sage: P.fundamental_weights()
Finite family {0: Lambda[0], 1: Lambda[1], 2: Lambda[2], 3: Lambda[3]}
sage: P.fundamental_weight('delta')  # This is now the method of the base class!!!
...
ValueError: delta is not in the index set
I have no idea about why this occurs, much less how to fix this.
However, it does not actually fix the problem noted on this ticket, as we compute the weight by Phi - Epsilon. I've fixed this directly for LS paths (for which we need to be much more careful with the category) and rigged configurations. For Nakajima monomials, I'm not quite sure how to fix it; I really don't want to special-case the affine types. In Kang et al., they construct B(\infty) by the path realization, and so I think they consider it as a U'_{q}(g)-crystal (and so it is in the non-extended weight lattice).
(Additionally, the weight function isn't correct for the Kyoto path model either, but that is a separate issue.)
New commits:
7663e78  Implement category of KR crystals and reorganizing (extended) weight spaces.

6e0ba04  Fixing my errors from the conversion.

b4a7647  Fixing weight for LS paths and rigged configurations.

comment:6 Changed 6 years ago by
 Commit changed from b4a764743046481af4a6bb88ca353d607535d339 to cbd6e19da260c9c8fea9ae4762886fd1477d330c
Branch pushed to git repo; I updated commit sha1. New commits:
cbd6e19  Merge branch 'develop' into public/combinat/fix_crystal_weight_lattice-18453

comment:7 followup: ↓ 8 Changed 6 years ago by
 Branch changed from public/combinat/fix_crystal_weight_lattice-18453 to public/crystal/18453
 Commit changed from cbd6e19da260c9c8fea9ae4762886fd1477d330c to 901a865ac6ca5405018dc8ca5634448aebc203c4
comment:8 in reply to: ↑ 7 ; followup: ↓ 17 Changed 6 years ago by
Ben and I are going to fix this in a different way. I had to fix the Weyl dimension formula (which I guess was written by Nicolas and only worked in the ambient space), so please check. Ben will add fixes to the other crystals, but the Littelmann paths should be fixed now.
comment:9 Changed 6 years ago by
 Commit changed from 901a865ac6ca5405018dc8ca5634448aebc203c4 to 45ca9eefc39552f1e7e7bc9ae5dd61a66c97bec1
comment:10 Changed 6 years ago by
 Commit changed from 45ca9eefc39552f1e7e7bc9ae5dd61a66c97bec1 to 9225af71b4dbcfa6c748816c452e9824dadc3651
Branch pushed to git repo; I updated commit sha1. New commits:
9225af7  18543: small typo fixed in generalized_young_walls

comment:11 followup: ↓ 12 Changed 6 years ago by
comment:12 in reply to: ↑ 11 Changed 6 years ago by
Hi Travis,
littelmann_paths and generalized_young_walls are fixed. Ben is working on monomials to get the hyperbolic cases working. Could you fix the rigged configuration crystals by implementing the correct weight and weight_lattice_realization? In this ticket we do not want to revamp all the root_system stuff that you added in your previous patch, nor the category changes for KR crystals.
Thanks!
Anne
comment:13 followup: ↓ 16 Changed 6 years ago by
I'm pretty sure this will not end up fixing most of the problems, and it feels like we're hacking around the problem. At the very least, as a better patch, I think we could solve a host of issues by just checking if the type is affine and the crystal is not in the category of finite crystals, and then passing extended=True. However, I will fix the RC code and test a bunch of the different models in affine type as soon as I'm able to today.
comment:14 followup: ↓ 20 Changed 6 years ago by
I've fixed the rigged configurations.
- CrystalOfAlcovePaths fails outright:

sage: C = crystals.AlcovePaths(La[0])
sage: C
Highest weight crystal of alcove paths of type ['A', 1, 1] and weight Lambda[0]
sage: C.module_generators[0].f_string([0,1])
((alpha[0], 0), (2*alpha[0] + alpha[1], 1))
sage: _.weight()
TypeError
- We should also explicitly implement a weight_lattice_realization for DirectSumOfCrystals similar to what we did for tensor products.
- We have to decide what we want to do with KyotoPathModel and if we want to consider it as a U_{q}'-crystal or a U_{q}-crystal. I think the former is what we should do, considering it is a tensor product of U_{q}'-crystals. In any case, we will probably have to do something special for this.
- We have a problem with not distinguishing between elements of the extended and non-extended affine weight lattices. In particular, this causes an issue when creating highest weight crystals with the same weight if you accidentally first create it in the non-extended affine weight lattice:
sage: LaE = RootSystem(['A',2,1]).weight_space(extended=True).fundamental_weights()
sage: La = RootSystem(['A',2,1]).weight_space().fundamental_weights()
sage: B = crystals.LSPaths(La[0])
sage: B2 = crystals.LSPaths(LaE[0])
sage: B is B2
True
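The collision in the session above can be illustrated outside of Sage. The following is a minimal, hypothetical sketch (toy Weight and Crystal classes, not Sage's actual UniqueRepresentation machinery) of how a construction cache keyed on a weight whose equality and hash ignore its parent conflates the two crystals:

```python
class Weight:
    """Toy stand-in for a Sage weight: equality/hash look only at the
    coefficients and ignore the parent-level `extended` flag."""
    def __init__(self, coeffs, extended=False):
        self.coeffs = tuple(coeffs)
        self.extended = extended
    def __eq__(self, other):
        return isinstance(other, Weight) and self.coeffs == other.coeffs
    def __hash__(self):
        return hash(self.coeffs)

_cache = {}

class Crystal:
    """Toy UniqueRepresentation-style construction cache keyed on the weight."""
    def __new__(cls, weight):
        if weight not in _cache:   # this lookup ignores weight.extended
            _cache[weight] = super().__new__(cls)
        return _cache[weight]

La = Weight([1, 0], extended=False)
LaE = Weight([1, 0], extended=True)
B, B2 = Crystal(La), Crystal(LaE)
print(B is B2)  # True: the second construction hits the first cache entry
```

This is why passing the weight lattice realization itself (or some other distinguishing datum) into the cache key, as proposed below, would separate the two constructions.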
I had changed crystals.RiggedConfigurations, which inadvertently triggered this. I don't think we want to enforce that the user must consider weights in the extended affine weight lattice, as it is a valid restriction from U_{q} to U_{q}' (in particular, the Kyoto path model). IMO the solution would be for __classcall__/__classcall_private__ to pass the weight lattice realization, with the default being the parent of the given weight. I feel this gives the user added flexibility for where they want the weights to be printed. Although part of me also thinks there should be some kind of option, changeable in the method input/classes/globally, which specifies the weight lattice realization.
I will be on skype all day today if you want to talk.
comment:15 Changed 6 years ago by
 Commit changed from 9225af71b4dbcfa6c748816c452e9824dadc3651 to 7963ea6a64dc4bd1d623288fbb6702e11beae0dc
Branch pushed to git repo; I updated commit sha1. New commits:
7963ea6  Fixing WLR for rigged configurations.

comment:16 in reply to: ↑ 13 ; followup: ↓ 18 Changed 6 years ago by
Replying to tscrim:
However I will fix the RC code and test a bunch of the different models in affine type
Just random thoughts: will the failure be caught by TestSuite? If not, could there be new _test methods catching the issue?
comment:17 in reply to: ↑ 8 ; followup: ↓ 19 Changed 6 years ago by
Replying to aschilling:
I had to fix the Weyl dimension formula (which I guess was written by Nicolas and only worked in the ambient space).
By Dan, if I recall correctly.
comment:18 in reply to: ↑ 16 Changed 6 years ago by
Replying to nthiery:
Replying to tscrim:
However I will fix the RC code and test a bunch of the different models in affine type
Just random thoughts: will the failure be caught by TestSuite? If not, could there be new _test methods catching the issue?
I don't think we could write a TestSuite check for this (at least for the highest weight crystals), as the branching rule I mentioned above is valid (and so having the weights in the non-extended weight lattice still makes sense, unless I'm misunderstanding something about U_{q}'(g) inside of U_{q}(g)).
comment:19 in reply to: ↑ 17 ; followups: ↓ 26 ↓ 61 Changed 6 years ago by
Replying to nthiery:
Replying to aschilling:
I had to fix the Weyl dimension formula (which I guess was written by Nicolas and only worked in the ambient space).
By Dan, if I recall correctly.
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 859)     def weyl_dimension(self, highest_weight):
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 860)         """
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 861)         EXAMPLES::
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 862)
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 863)             sage: RootSystem(['A',3]).ambient_lattice().weyl_dimension([2,1,0,0])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 864)             20
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 865)
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 866)             sage: type(RootSystem(['A',3]).ambient_lattice().weyl_dimension([2,1,0,0]))
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 867)             <type 'sage.rings.integer.Integer'>
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 868)         """
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 869)         highest_weight = self(highest_weight)
f562ca2b (Travis Scrimshaw 2014-10-01 17:16:31 -0700 870)         if not highest_weight.is_dominant():
f562ca2b (Travis Scrimshaw 2014-10-01 17:16:31 -0700 871)             raise ValueError("the highest weight must be dominant")
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 872)         rho = self.rho()
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 873)         n = prod([(rho+highest_weight).dot_product(x) for x in self.positive_roots()])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 874)         d = prod([ rho.dot_product(x) for x in self.positive_roots()])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 875)         from sage.rings.integer import Integer
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 876)         return Integer(n/d)
But in any case, someone should check!
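As an independent sanity check on the formula itself (outside Sage), the Weyl dimension formula for type A can be evaluated in plain Python in ambient coordinates. This is a sketch under the usual conventions (positive roots e_i - e_j for i < j, rho taken as (n-1, ..., 1, 0)); weyl_dimension_typeA is a hypothetical helper name, not a Sage function:

```python
from fractions import Fraction
from itertools import combinations

def weyl_dimension_typeA(la):
    """Weyl dimension formula prod <la+rho, a> / prod <rho, a> over the
    positive roots a = e_i - e_j of type A, in ambient coordinates, where
    the pairing with e_i - e_j is just a coordinate difference."""
    n = len(la)
    rho = list(range(n - 1, -1, -1))  # any vector with rho_i - rho_{i+1} = 1 works
    num = den = 1
    for i, j in combinations(range(n), 2):
        num *= (la[i] + rho[i]) - (la[j] + rho[j])
        den *= rho[i] - rho[j]
    return Fraction(num, den)

print(weyl_dimension_typeA([2, 1, 0, 0]))  # 20, matching the doctest above
```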
comment:20 in reply to: ↑ 14 ; followup: ↓ 23 Changed 6 years ago by
CrystalOfAlcovePaths fails outright:

sage: C = crystals.AlcovePaths(La[0])
sage: C
Highest weight crystal of alcove paths of type ['A', 1, 1] and weight Lambda[0]
sage: C.module_generators[0].f_string([0,1])
((alpha[0], 0), (2*alpha[0] + alpha[1], 1))
sage: _.weight()
TypeError
No, it does not. You did not show how you made the weight La. I assume you used the weight_lattice and not the weight_space (which you have to use for this model!!)
sage: R = RootSystem(['A',1,1])
sage: P = R.weight_space()
sage: La = P.basis()
sage: C = crystals.AlcovePaths(La[0])
sage: b = C.module_generators[0].f(0)
sage: b
((alpha[0], 0),)
sage: b.weight()
-Lambda[0] + 2*Lambda[1]
- We should also explicitly implement a weight_lattice_realization for DirectSumOfCrystals similar to what we did for tensor products.
- We have to decide what we want to do with KyotoPathModel and if we want to consider it as a U_{q}'-crystal or a U_{q}-crystal. I think the former is what we should do, considering it is a tensor product of U_{q}'-crystals. In any case, we will probably have to do something special for this.

Mathematically speaking, the Kyoto path model is a model in the category of highest weight affine crystals. It is a U_q(g)-crystal, not a U_q'(g)-crystal. Really what one does is take tensor products B^{r,s} \otimes u_\Lambda with \Lambda of level s.
comment:21 Changed 6 years ago by
 Commit changed from 7963ea6a64dc4bd1d623288fbb6702e11beae0dc to 3b3c0b2d15289a8c4a5c5c25ad2984114a856916
Branch pushed to git repo; I updated commit sha1. New commits:
3b3c0b2  added weight_lattice_realization to monomial crystals

comment:22 Changed 6 years ago by
 Cc ptingley added
 Keywords days65 added
It seems the original error in the monomial crystals code is due to an error in the literature about how the weight function is defined for monomial crystals. I've checked
- Kang, Kim, and Shin, Modified Nakajima monomials and the crystal B(infinity), J. Alg. 308 (2007), 524-535.
- Kashiwara, Realizations of crystals, Contemporary Math. 325 (2003), 133-139.
- Kim and Shin, Nakajima monomials, Young walls, and the Kashiwara embedding, J. Alg. 330 (2011), 234-250.
- Kim, Monomial realization of crystal graphs for U_q(A_{n-1}), Math. Ann. 332 (2005), 17-35.
- ...
In none of these references is there a way to get delta to appear as a weight. Perhaps I've missed a reference that does include this information, but I'm thinking about the problem now and planning to discuss it with Peter Tingley, so hopefully we will have a resolution soon.
comment:23 in reply to: ↑ 20 ; followup: ↓ 24 Changed 6 years ago by
Replying to aschilling:
CrystalOfAlcovePaths fails outright:
No, it does not. You did not show how you made the weight La. I assume you used the weight_lattice and not the weight_space (which you have to use for this model!!)
Ah, we do need the weight_space, so we should probably make it error out, similar to the LS paths, if we aren't in the weight space. Also, it does work using the extended weight space:
sage: La = RootSystem(['A',2,1]).weight_space(extended=True).fundamental_weights()
sage: C = crystals.AlcovePaths(La[0])
sage: C.module_generators[0].f_string([0,1])
((alpha[0], 0), (alpha[0] + alpha[1], 0))
sage: _.weight()
-Lambda[1] + 2*Lambda[2] - delta
- We should also explicitly implement a weight_lattice_realization for DirectSumOfCrystals similar to what we did for tensor products.
- We have to decide what we want to do with KyotoPathModel and if we want to consider it as a U_{q}'-crystal or a U_{q}-crystal. I think the former is what we should do, considering it is a tensor product of U_{q}'-crystals. In any case, we will probably have to do something special for this.

Mathematically speaking, the Kyoto path model is a model in the category of highest weight affine crystals. It is a U_q(g)-crystal, not a U_q'(g)-crystal. Really what one does is take tensor products B^{r,s} \otimes u_\Lambda with \Lambda of level s.
I agree that it is in highest weight affine crystals, but we don't have a way to get \delta from the elements. The embedding B(\lambda) \to B^{r,s} \otimes B(\mu) has to be as U_q'(g)-crystals, as B^{r,s} must be a U_q'(g)-crystal, since otherwise it breaks the condition for the weight function. Also, if \Lambda is of level s, then \Lambda + k\delta is of level s for all k since <c, \delta> = 0. I still don't quite see how we could figure out what the coefficient of \delta is for an arbitrary element in the Kyoto path model.
Replying to bsalisbury1:
It seems the original error in the monomial crystals code is due to an error in the literature about how the weight function is defined for monomial crystals. ... In none of these references is there a way to get delta to appear as a weight. Perhaps I've missed a reference that does include this information, but I'm thinking about the problem now and planning to discuss it with Peter Tingley, so hopefully we will have a resolution soon.
I had poked around on this too when I first looked into this ticket and couldn't find anything (I also looked in "Level 0 monomial crystals" by Hernandez and Kashiwara). Expressing the monomial crystals in the A's isn't a problem since we can just pull out the simple roots, and \alpha_0 is what contributes to \delta. We can always brute-force this computation for highest weight crystals by taking a path to the highest weight and computing the weight from that, but that is going to be really slow (but it will be correct). I think the best solution is to follow Hernandez and Kashiwara and add a weight attribute to each element of the crystal, which is easy enough to compute on each application of e and f.
Also, in a similar vein, I don't think we can compute \delta from doing Epsilon - Phi, as Epsilon = \sum_i \epsilon_i \Lambda_i and similarly for \Phi (which comes down to the fact that <h_i, \delta> = 0, where h_i is a simple coroot). I think we need to change this, or at least put a stopgap in affine type when the WLR is the extended weight lattice.
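To spell out the obstruction (standard crystal axioms, in my notation): for any element b,

```latex
\varphi_i(b) - \varepsilon_i(b) = \langle h_i, \mathrm{wt}(b) \rangle
\quad \text{for all } i,
\qquad \text{while} \qquad
\langle h_i, \delta \rangle = 0
\quad \text{for all } i,
```

so any weight reconstructed from the string data (\epsilon_i, \varphi_i) alone is determined only modulo \mathbb{Z}\delta, and the \delta-coefficient must come from somewhere else.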
comment:24 in reply to: ↑ 23 ; followup: ↓ 25 Changed 6 years ago by
Replying to tscrim:
Replying to bsalisbury1:
It seems the original error in the monomial crystals code is due to an error in the literature about how the weight function is defined for monomial crystals. ... In none of these references is there a way to get delta to appear as a weight. Perhaps I've missed a reference that does include this information, but I'm thinking about the problem now and planning to discuss it with Peter Tingley, so hopefully we will have a resolution soon.
I had poked around on this too when I first looked into this ticket and couldn't find anything (I also looked in "Level 0 monomial crystals" by Hernandez and Kashiwara). Expressing the monomial crystals in the A's isn't a problem since we can just pull out the simple roots, and \alpha_0 is what contributes to \delta. We can always brute-force this computation for highest weight crystals by taking a path to the highest weight and computing the weight from that, but that is going to be really slow (but it will be correct). I think the best solution is to follow Hernandez and Kashiwara and add a weight attribute to each element of the crystal, which is easy enough to compute on each application of e and f.
I tried doing this using the path to the highest weight vector, but it causes a loop in Sage because of the way the crystal operators are defined in the monomial crystals model. In particular, we need phi to compute the action of the Kashiwara operators, and phi depends on the weight in the B(infinity) model.
comment:25 in reply to: ↑ 24 ; followup: ↓ 27 Changed 6 years ago by
Replying to bsalisbury1:
Replying to tscrim:
I had poked around on this too when I first looked into this ticket and couldn't find anything (I also looked in "Level 0 monomial crystals" by Hernandez and Kashiwara). Expressing the monomial crystals in the A's isn't a problem since we can just pull out the simple roots, and \alpha_0 is what contributes to \delta. We can always brute-force this computation for highest weight crystals by taking a path to the highest weight and computing the weight from that, but that is going to be really slow (but it will be correct). I think the best solution is to follow Hernandez and Kashiwara and add a weight attribute to each element of the crystal, which is easy enough to compute on each application of e and f.
I tried doing this using the path to the highest weight vector, but it causes a loop in Sage because of the way the crystal operators are defined in the monomial crystals model. In particular, we need phi to compute the action of the Kashiwara operators, and phi depends on the weight in the B(infinity) model.
Right... I'm now fairly convinced it will be best to simply store the weight as an attribute of the elements. This might also result in a speedup, as we wouldn't need to (re)compute the weight every time we call phi().
comment:26 in reply to: ↑ 19 ; followup: ↓ 30 Changed 6 years ago by
comment:27 in reply to: ↑ 25 Changed 6 years ago by
Right... I'm now fairly convinced it will be best to simply store the weight as an attribute of the elements. This might also result in a speedup, as we wouldn't need to (re)compute the weight every time we call phi().
Ben and I found a different way which fixes the problem. A new commit will follow soon!
comment:28 followup: ↓ 29 Changed 6 years ago by
 Commit changed from 3b3c0b2d15289a8c4a5c5c25ad2984114a856916 to e0fd5597cb665cc8cf7d11e6d24e9e7dee691c57
Branch pushed to git repo; I updated commit sha1. New commits:
e0fd559  monomial crystals weight is fixed!!!

comment:29 in reply to: ↑ 28 Changed 6 years ago by
We have the following behavior in root systems (as Travis mentioned):
sage: R = RootSystem(['B',2,1])
sage: La = R.weight_space().basis()
sage: LaE = R.weight_space(extended=True).basis()
sage: La[0] == LaE[0]
True
sage: La[0].parent() == LaE[0].parent()
False
This leads to the behavior that LS paths labeled by extended fundamental weights and non-extended ones are considered to be the same crystal. Should we fix the weight issue or pass another (fake) argument in the __init__ method to distinguish the crystals?
Nicolas, do you have an opinion on the above?
comment:30 in reply to: ↑ 26 ; followup: ↓ 31 Changed 6 years ago by
Replying to bump:
Replying to aschilling:
But in any case, someone should check!
I will check it.
I would if I could … I pulled the branch, built it, and can't run it because of some problem with ascii_art. I'll make distclean and try again.
I'm a little worried that if the changes required replacing the inner product in the weyl_dimension method, there could also be failures in weyl_characters.py, which also depends on the same inner product.
comment:31 in reply to: ↑ 30 Changed 6 years ago by
Replying to bump:
Replying to bump:
Replying to aschilling:
But in any case, someone should check!
I will check it.
I would if I could … I pulled the branch, built it, and can't run it because of some problem with ascii_art. I'll make distclean and try again.
Strange. It works for both myself and Ben.
I'm a little worried that if the changes required replacing the inner product in weyl_dimension method there could also be failures in weyl_characters.py too which also depend on the same inner product.
The only problem with it was that it only worked for the ambient space, and we want it for all weight lattices. If that is not an issue for weyl_characters.py, then it hopefully does not change anything there!
comment:32 Changed 6 years ago by
 Commit changed from e0fd5597cb665cc8cf7d11e6d24e9e7dee691c57 to c8fd06023efa5185e302778858a99f588b4da919
Branch pushed to git repo; I updated commit sha1. New commits:
c8fd060  Alcove paths weights are up to snuff.

comment:33 Changed 6 years ago by
 Commit changed from c8fd06023efa5185e302778858a99f588b4da919 to 495fa98ec8ecd009128b2a86c3e8a6ab0ed8d5ca
Branch pushed to git repo; I updated commit sha1. New commits:
495fa98  documentation updates

comment:34 Changed 6 years ago by
In the last commit, weight_lattice_realization() for DirectSumOfCrystals was added, too.
comment:35 Changed 6 years ago by
Also, the original example is now fixed!
sage: C = crystals.infinity.NakajimaMonomials(['A',1,1])
sage: v = C.highest_weight_vector()
sage: v.f(1).weight() + v.f(0).weight()
-delta
comment:36 Changed 6 years ago by
 Commit changed from 495fa98ec8ecd009128b2a86c3e8a6ab0ed8d5ca to 42ed2f2cdb58a8f5383159716e9be570c779a7f3
comment:37 followup: ↓ 39 Changed 6 years ago by
There are doctest failures in non_symmetric_macdonald_polynomials.py. There are quite a few but for example:
Failed example:
    for d in range(1,2): # long time (41s)
        for a,b,c,d in IntegerVectors(d,4):
            weight = a*La[1]+b*La[2]+c*La[3]+d*La[4]
            weight0 = -a*omega[1]-b*omega[2]-c*omega[3]-d*omega[4]
            LS = crystals.ProjectedLevelZeroLSPaths(weight)
            assert E[weight0].map_coefficients(lambda x: x.subs(t=0)) == LS.one_dimensional_configuration_sum(q)
Exception raised:
    Traceback (most recent call last):
      File "/home/bump/sagemath/sage/local/lib/python2.7/site-packages/sage/doctest/forker.py", line 496, in _run
        self.compile_and_execute(example, compiler, test.globs)
      File "/home/bump/sagemath/sage/local/lib/python2.7/site-packages/sage/doctest/forker.py", line 858, in compile_and_execute
        exec(compiled, globs)
      File "<doctest sage.combinat.root_system.non_symmetric_macdonald_polynomials.NonSymmetricMacdonaldPolynomials[201]>", line 6, in <module>
        assert E[weight0].map_coefficients(lambda x: x.subs(t=Integer(0))) == LS.one_dimensional_configuration_sum(q)
    AssertionError
comment:38 Changed 6 years ago by
 Commit changed from 42ed2f2cdb58a8f5383159716e9be570c779a7f3 to f4955c5f50d4f34510176acff8956f49d3b8702a
Branch pushed to git repo; I updated commit sha1. New commits:
f4955c5  Fixing RC crystals for extended and nonextended weight input.

comment:39 in reply to: ↑ 37 Changed 6 years ago by
Dan,
I think you are not using the latest branch! Please pull again and then try again!
Anne
New commits:
f4955c5  Fixing RC crystals for extended and nonextended weight input.

comment:40 Changed 6 years ago by
 Status changed from new to needs_review
All tests passed on my machine and Anne's. Travis' changes look good, too. We think it's ready for review.
Dan, would you mind reviewing the ticket?
Best, Ben
comment:41 Changed 6 years ago by
 Reviewers set to bump
comment:42 Changed 6 years ago by
 Reviewers changed from bump to Dan Bump
comment:43 Changed 6 years ago by
All tests pass with `sage -testall`.
I will look at it some more.
comment:44 followup: ↓ 48 Changed 6 years ago by
From the thematic tutorial at http://doc.sagemath.org/html/en/thematic_tutorials/lie/affine_hw_crystals.html we have:
sage: La = RootSystem(['C',3,1]).weight_space().fundamental_weights()
sage: LS = crystals.LSPaths(2*La[1]+La[2])
sage: SL = LS.subcrystal(max_depth=3)
sage: GL = LS.digraph(subset=SL)
sage: La = RootSystem(['C',3,1]).weight_lattice().fundamental_weights()
sage: M = crystals.NakajimaMonomials(['C',3,1], 2*La[1]+La[2])
sage: SM = M.subcrystal(max_depth=3)
sage: GM = M.digraph(subset=SM)
sage: GL.is_isomorphic(GM, edge_labels=True)
True
Why do we use weight_space for one and weight_lattice for the other? (Maybe the thematic tutorial should explain.)
After this:
sage: LS.weight_lattice_realization().is_extended()
False
sage: M.weight_lattice_realization().is_extended()
True
I think the point of the example in the tutorial is that the two crystals are isomorphic, but one uses the extended lattice and the other does not.
I think that the user should be encouraged to use the extended lattice for crystals of integrable representations of affine Lie algebras, and therefore the example in the thematic tutorial should be changed. The tutorial goes on and shows another affine crystal in the Littelmann path model that does use the extended lattice. I think the patch is good, but it would be good if some revisions to the above page in the thematic tutorial were made. Here are some suggestions for changes to the tutorial that could be part of this patch.
(1) Change the first example to use the extended lattice unless there is a reason not to.
(2) Tell the user just a bit more about the issue. The following is my understanding: roughly, the extended lattice is the weight lattice of the full Kac-Moody Lie algebra and the unextended lattice is the weight lattice of the derived algebra. If the highest weight is a nonzero dominant weight then the crystal is infinite and it is preferable to use the extended lattice; these are the crystals of integrable representations. A contrasting situation is with Kirillov-Reshetikhin crystals, where there is an advantage to working with the unextended lattice, namely that the derived algebra has a finite-dimensional crystal where the full Kac-Moody Lie algebra has an infinite one. So there is an advantage in this case to using the derived Lie algebra, and its weight lattice is the unextended one.
(3) Explain why we use weight_space in the LS path model but weight_lattice in the monomial model.
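For what it's worth, the lattice distinction described in (2) can be written out explicitly (standard affine Kac-Moody notation, my summary rather than anything from the patch; I is the affine index set and the a_i are the marks):

```latex
P' = \bigoplus_{i \in I} \mathbb{Z}\Lambda_i
\qquad \text{versus} \qquad
P = \mathbb{Z}\delta \oplus \bigoplus_{i \in I} \mathbb{Z}\Lambda_i,
\qquad
\delta = \sum_{i \in I} a_i \alpha_i.
```

In the non-extended lattice P' the simple roots are written in terms of the \Lambda_i via the Cartan matrix, and since the Cartan matrix annihilates the vector of marks, one gets \delta = \sum_i a_i \alpha_i = 0 there, which is exactly why \delta-coefficients of weights are invisible in P'.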
comment:45 Changed 6 years ago by
 Commit changed from f4955c5f50d4f34510176acff8956f49d3b8702a to 9b5b9d5e83a40a6fe55ea23d8cdf33d0f0a131c5
comment:46 Changed 6 years ago by
 Commit changed from 9b5b9d5e83a40a6fe55ea23d8cdf33d0f0a131c5 to d120397df0e3861d7f8a876da1a62836d9cdd00b
Branch pushed to git repo; I updated commit sha1. New commits:
d120397  18453: updated thematic tutorial on affine crystals

comment:47 Changed 6 years ago by
 Commit changed from d120397df0e3861d7f8a876da1a62836d9cdd00b to 8e5c4cd07daadb6479b3a6bc18b12d42254a3d74
Branch pushed to git repo; I updated commit sha1. New commits:
8e5c4cd  18453: some more additions to the thematic tutorial

comment:48 in reply to: ↑ 44 Changed 6 years ago by
Hi Dan,
Good points! I fixed the examples in the thematic tutorial (and also the picture since it was missing the deltas as well) and put some extra explanations.
Anne
PS: I also rebased on top of sage-6.5.beta4.
comment:49 Changed 6 years ago by
 Status changed from needs_review to positive_review
comment:50 Changed 6 years ago by
I gave this a positive review. Here is a (temporary) link to the revised tutorial.
http://match.stanford.edu/thematic_tutorials/lie.html
I am currently unable to run tests for some reason, but I ran `sage -testall` before the rebase.
comment:51 Changed 6 years ago by
 Status changed from positive_review to needs_work
sage -t --long src/sage/combinat/crystals/littelmann_path.py  # 3 doctests failed
sage -t --long src/sage/combinat/crystals/tensor_product.py  # 1 doctest failed
sage -t --long src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py  # 10 doctests failed
comment:52 Changed 6 years ago by
 Dependencies set to #18700
Here's the problem:
sage: La = RootSystem(['A',2,1]).weight_space().basis()
sage: LS = crystals.ProjectedLevelZeroLSPaths(2*La[1])
sage: LS.weight_lattice_realization()
Weight space over the Rational Field of the Root system of type ['A', 2, 1]
sage: CS = LS.one_dimensional_configuration_sum()
sage: CS
B[-2*Lambda[1] + 2*Lambda[2]] + (q+1)*B[-Lambda[1]] + (q+1)*B[Lambda[1] - Lambda[2]] + B[2*Lambda[1]] + B[-2*Lambda[2]] + (q+1)*B[Lambda[2]]
sage: K = crystals.KirillovReshetikhin(['A',2,1], 1,1)
sage: T = K.tensor(K)
sage: CSK = T.one_dimensional_configuration_sum()
sage: CSK
B[-2*Lambda[1] + 2*Lambda[2]] + (q+1)*B[-Lambda[1]] + (q+1)*B[Lambda[1] - Lambda[2]] + B[2*Lambda[1]] + B[-2*Lambda[2]] + (q+1)*B[Lambda[2]]
sage: CS == CSK
False
sage: CS.parent()
Group algebra of the Weight space over the Rational Field of the Root system of type ['A', 2] over Univariate Polynomial Ring in q over Rational Field
sage: CSK.parent()
Group algebra of the Weight lattice of the Root system of type ['A', 2] over Univariate Polynomial Ring in q over Rational Field
sage: CS.parent().has_coerce_map_from(CSK.parent())
False
The group algebra, by its functoriality, should have a coercion when the underlying groups have a coercion:
sage: CS.parent()._indices.has_coerce_map_from(CSK.parent()._indices)
True
This is now #18700.
comment:53 followup: ↓ 54 Changed 6 years ago by
 Commit changed from 8e5c4cd07daadb6479b3a6bc18b12d42254a3d74 to ee398efc22755947ea98f0a8250333d21352f2db
Branch pushed to git repo; I updated commit sha1. New commits:
ee398ef  18453: fixed one_dimensional_configuration sum issue

comment:54 in reply to: ↑ 53 Changed 6 years ago by
The problem is that the weights are now in the weight space rather than the weight lattice. I fixed this in the one-dimensional configuration sums directly, so we do not need #18700.
comment:55 Changed 6 years ago by
 Dependencies #18700 deleted
 Status changed from needs_work to needs_review
comment:56 Changed 6 years ago by
I ran doctests again in /combinat and /categories and did not have any failures after Anne's changes. In particular:
sage -t --warn-long 69.0 src/sage/combinat/crystals/littelmann_path.py
    [242 tests, 5.61 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 5.7 seconds
    cpu time: 5.6 seconds
    cumulative wall time: 5.6 seconds
BenBook:sage-git Ben$ sage -tp src/sage/combinat/crystals/littelmann_path.py
Running doctests with ID 2015-06-14-12-03-41-3453f2af.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --warn-long 68.9 src/sage/combinat/crystals/littelmann_path.py
    [242 tests, 5.26 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 5.3 seconds
    cpu time: 5.3 seconds
    cumulative wall time: 5.3 seconds
BenBook:sage-git Ben$ sage -tp src/sage/combinat/crystals/tensor_product.py
Running doctests with ID 2015-06-14-12-08-45-ab879d56.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --warn-long 69.0 src/sage/combinat/crystals/tensor_product.py
    [439 tests, 2.39 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 2.5 seconds
    cpu time: 2.4 seconds
    cumulative wall time: 2.4 seconds
BenBook:sage-git Ben$ sage -tp src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py
Running doctests with ID 2015-06-14-12-04-16-dbb764b5.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --warn-long 68.9 src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py
    [533 tests, 8.01 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 8.1 seconds
    cpu time: 7.9 seconds
    cumulative wall time: 8.0 seconds
comment:57 Changed 6 years ago by
These tests are marked # long time, so you need to run them with the --long option (--warn-long just notifies you if a test took over a certain time, the default being 1 second).
comment:58 Changed 6 years ago by
Thanks, Travis. Here is the updated proof of passed tests:
BenBook:sage-git Ben$ sage -tp --long src/sage/combinat/crystals/littelmann_path.py
Running doctests with ID 2015-06-14-13-18-12-ce4dd0f9.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --long --warn-long 69.0 src/sage/combinat/crystals/littelmann_path.py
    [254 tests, 31.08 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 31.2 seconds
    cpu time: 31.0 seconds
    cumulative wall time: 31.1 seconds
BenBook:sage-git Ben$ sage -tp --long src/sage/combinat/crystals/tensor_product.py
Running doctests with ID 2015-06-14-13-19-01-4e6d8c8e.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --long --warn-long 69.1 src/sage/combinat/crystals/tensor_product.py
    [441 tests, 3.40 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 3.5 seconds
    cpu time: 3.4 seconds
    cumulative wall time: 3.4 seconds
BenBook:sage-git Ben$ sage -tp --long src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py
Running doctests with ID 2015-06-14-13-19-21-7bd8f158.
Git branch: t/18453/public/crystal/18453
Using --optional=gcc,mpir,python2,sage,scons
Doctesting 1 file using 8 threads.
sage -t --long --warn-long 69.1 src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py
**********************************************************************
File "src/sage/combinat/root_system/non_symmetric_macdonald_polynomials.py", line 695, in sage.combinat.root_system.non_symmetric_macdonald_polynomials.NonSymmetricMacdonaldPolynomials
Warning, slow doctest:
    E[2*omega[1]-omega[2]].map_coefficients(lambda x:x.subs(t=0)) == LS.one_dimensional_configuration_sum(q)  # long time
Test ran for 84.13 s
    [562 tests, 149.57 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 149.7 seconds
    cpu time: 149.4 seconds
    cumulative wall time: 149.6 seconds
comment:59 Changed 6 years ago by
 Status changed from needs_review to positive_review
This change is okay.
comment:60 Changed 6 years ago by
 Branch changed from public/crystal/18453 to ee398efc22755947ea98f0a8250333d21352f2db
 Resolution set to fixed
 Status changed from positive_review to closed
comment:61 in reply to: ↑ 19 Changed 6 years ago by
 Commit ee398efc22755947ea98f0a8250333d21352f2db deleted
Replying to aschilling:
Replying to nthiery:
Replying to aschilling:
I had to fix the Weyl dimension formula (which I guess was written by Nicolas and only worked in the ambient space).
By Dan, if I recall correctly.
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 859)     def weyl_dimension(self, highest_weight):
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 860)         """
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 861)         EXAMPLES::
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 862)
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 863)             sage: RootSystem(['A',3]).ambient_lattice().weyl_dimension([2,1,0,0])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 864)             20
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 865)
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 866)             sage: type(RootSystem(['A',3]).ambient_lattice().weyl_dimension([2,1,0,0]))
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 867)             <type 'sage.rings.integer.Integer'>
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 868)         """
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 869)         highest_weight = self(highest_weight)
f562ca2b (Travis Scrimshaw 2014-10-01 17:16:31 -0700 870)         if not highest_weight.is_dominant():
f562ca2b (Travis Scrimshaw 2014-10-01 17:16:31 -0700 871)             raise ValueError("the highest weight must be dominant")
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 872)         rho = self.rho()
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 873)         n = prod([(rho+highest_weight).dot_product(x) for x in self.positive_roots()])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 874)         d = prod([ rho.dot_product(x) for x in self.positive_roots()])
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 875)         from sage.rings.integer import Integer
7d693534 (Nicolas M. Thiery 2012-03-19 21:38:26 +0100 876)         return Integer(n/d)
This got me curious about my memory failing. Digging back in history ended up in a mercurial patch from early 2008 from which one can't extract the author. As far as I remember, I refactored this code on different occasions (in the above, to move it from weight_lattice_realization to weight_lattice_realizations when creating the categories), but the original code was by Dan in the WeylDim function. Anyway, just ranting; this has no relevance.
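For reference, the formula in the blamed code is the Weyl dimension formula, dim V(λ) = ∏_{α>0} ⟨λ+ρ, α⟩ / ⟨ρ, α⟩. It can be sketched in plain Python for type A_{n-1} in ambient coordinates, independently of Sage; the function name below is made up for illustration.

```python
from fractions import Fraction
from math import prod

def weyl_dimension_type_A(highest_weight):
    """Weyl dimension formula in type A, ambient coordinates.

    A positive root e_i - e_j (i < j) pairs with a vector v as
    v[i] - v[j], so the formula reduces to products of coordinate
    differences.  Only differences of rho enter, so any strictly
    decreasing integer vector serves as rho.
    """
    n = len(highest_weight)
    rho = list(range(n - 1, -1, -1))
    lam_rho = [a + b for a, b in zip(highest_weight, rho)]
    num = prod(lam_rho[i] - lam_rho[j] for i in range(n) for j in range(i + 1, n))
    den = prod(rho[i] - rho[j] for i in range(n) for j in range(i + 1, n))
    return int(Fraction(num, den))

# Matches the doctest in the blamed code:
#   RootSystem(['A',3]).ambient_lattice().weyl_dimension([2,1,0,0])
print(weyl_dimension_type_A([2, 1, 0, 0]))  # 20
```

Note that a fix of the kind discussed in this ticket (pairings via dot_product only make sense in the ambient space) is exactly why the generic version needed rewriting.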
comment:62 Changed 6 years ago by
This agrees with my recollection.
As far as I remember, I refactored this code on different occasions (in the above, to move it from weight_lattice_realization to weight_lattice_realizations when creating the categories), but the original code was by Dan in the WeylDim function.
My proposal is to change the default weight lattice to the extended version, since it has the proper dimension and is the more natural place to do weight computations. As a result, we should create a new category for U_q'(g)-crystals which uses the non-extended weight lattice. This would give us:
At least to fix this ticket, I don't think there would be much work. However, it would be a backwards-incompatible change, although I think it would be mild.
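For context, the ticket's original A_1^(1) example shows why the extended lattice matters; a short check with the standard conventions (wt(f_i b) = wt(b) - α_i, and the highest weight element of B(∞) has weight 0):

```latex
% Type A_1^{(1)}: the null root is \delta = \alpha_0 + \alpha_1, and
% u_\infty denotes the highest weight element of B(\infty).
\operatorname{wt}(f_1\, u_\infty) + \operatorname{wt}(f_0\, u_\infty)
  = (-\alpha_1) + (-\alpha_0) = -\delta .
```

In the non-extended weight lattice, δ is sent to 0, which is why Sage returned 0 in the ticket description; in the extended weight lattice the sum is a nonzero multiple of δ, as expected.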