#11935 closed enhancement (fixed)
Make parent/element classes independent of base rings
Reported by: | Simon King | Owned by: | |
---|---|---|---|
Priority: | major | Milestone: | sage-5.11 |
Component: | categories | Keywords: | parent class, element class, category |
Cc: | Jeroen Demeyer, Sage Combinat CC user | Merged in: | sage-5.11.beta1 |
Authors: | Simon King | Reviewers: | Nicolas M. Thiéry, Travis Scrimshaw |
Report Upstream: | N/A | Work issues: | |
Branch: | Commit: | ||
Dependencies: | #9138, #11900, #11943, #12875, #12876, #12877 | Stopgaps: |
Description (last modified by )
At #11900 and sage-combinat-devel, as well as in some comments in sage/categories/category.py, the idea was discussed to make, for example, Algebras(GF(3)).parent_class==Algebras(GF(5)).parent_class
- hence, make the parent/element classes as independent from the base of a category as possible.
This is implemented in this patch by introducing an abstract class CategoryWithParameters, which uses pickling by "weak construction" for its element and parent classes. Now:
- For a join category, the parent/element classes depend only on the parent/element classes of its super categories.
- For a Category_over_base (e.g. Modules, Algebras, Schemes, ...), the parent/element classes depend only on the category of the base.
- For a bimodule, the parent/element classes depend only on the categories of the left and right bases.
In the process, this patch also:
- Adds a method Category._make_named_class providing a unified way to create parent and element classes (and later on morphism classes)
- Extends the interface of dynamic_class to customize caching and pickling
- Renames the experimental class IdempotentSemigroups.ElementMethods, removes its super class, and discards unused code there.
Apply
Attachments (7)
Change History (133)
comment:1 Changed 11 years ago by
comment:2 follow-ups: 3 4 Changed 11 years ago by
A problem may be seen in pickling. Before explaining the problem, let me remark that I don't see a big concern for "pickling a parent class": What we actually want to pickle is a parent, not just a naked class. The serialisation data of a polynomial ring, for example, will comprise the base ring, the generator names and the term order, but certainly the class of the polynomial ring will not be part of the pickle.
However, if we do want to serialise a naked parent or element class, we have the following problems:
Currently, C.parent_class is pickled by (getattr, (C, "parent_class")). The pickling data (hence, C) is part of the cache key of a dynamic class. With that, the parent classes of different categories C1 and C2 can't be the same.
I see three approaches to get rid of it:
- Remove the pickling data from the cache key of dynamic classes.
- Make pickling of C.parent_class just rely on the default way of pickling a dynamic class.
- Work around the cache of dynamic classes, but still use (getattr, (C, "parent_class")) for pickling.
I think 1. is not gonna happen. It would break a lot of code, I suppose.
I had tested 2. and 3. while working on #11900. Both would work, but there are different concerns regarding long-term stability.
Approach 2 means: The pickle of a parent class will only depend on the category graph as it was at the time of pickling. If the category graph changes between pickling and unpickling the parent class, you would get a different class.
Approach 3 would be a bit more stable. The idea is:
(i) In the lazy attribute parent_class, the dynamic class is first created without providing the reduction data (as in approach 2).
(ii) Before returning that dynamic class, it is tested whether the reduction data is still None. If it is, the (getattr, (C, "parent_class")) thingy is inserted.
Consequence: Algebras(QQ).parent_class could, for example, be unpickled as Algebras(GF(2)).parent_class - which is not a big problem, since we want them to be the same. However, if in a distant future we want them to be different again, we'd be in trouble...
I suggest that I create patches for both 2. and 3., and then people can tell what they think about it. The method resolution will then be taken care of by another patch.
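To make approach 3 concrete, here is a plain-Python sketch of the "weak pickling by construction" idea (all names here - Cat, DynMeta, the caches - are made up for illustration; Sage's real machinery lives in sage.structure.dynamic_class). The dynamic class is cached independently of the concrete base, and the getattr(C, "parent_class") reduction data is only attached when the cache lookup fails:

```python
import copyreg
import pickle

_cat_cache = {}    # base -> category instance
_cls_cache = {}    # cache key -> dynamically created parent class

class DynMeta(type):
    """Metaclass marking dynamically created classes (hypothetical)."""

# classes with this metaclass pickle "by construction":
# unpickling evaluates getattr(C, "parent_class") again
copyreg.pickle(DynMeta, lambda cls: (getattr, cls._construction))

class Cat:
    """Toy category over a base; instances are cached and pickle by construction."""
    def __new__(cls, base):
        if base not in _cat_cache:
            self = super().__new__(cls)
            self.base = base
            _cat_cache[base] = self
        return _cat_cache[base]

    def __reduce__(self):
        return (Cat, (self.base,))

    @property
    def parent_class(self):
        # the cache key deliberately ignores the concrete base, only its
        # type matters, so Cat(3) and Cat(5) share one parent class
        key = type(self.base)
        if key not in _cls_cache:
            cls = DynMeta("Cat.parent_class", (), {})
            # "weak pickling by construction": the reduction data is attached
            # only now, after the cache lookup failed
            cls._construction = (self, "parent_class")
            _cls_cache[key] = cls
        return _cls_cache[key]

P3 = Cat(3).parent_class
P5 = Cat(5).parent_class
assert P3 is P5                              # independent of the base
assert pickle.loads(pickle.dumps(P3)) is P3  # round-trips to the cached class
```

As in the ticket's approach 3, the class that ends up in the pickle names whichever category first created it, so Cat(5).parent_class unpickles via Cat(3) - harmless while the classes coincide.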
comment:3 follow-up: 5 Changed 11 years ago by
Replying to SimonKing:
A problem may be seen in pickling. Before explaining the problem, let me remark that I don't see a big concern for "pickling a parent class":
True: all parents ought to be pickled "by construction" rather than by "class + internal data", in order to encapsulate as much as possible of the data structure. This probably ought to be true as well for elements. I don't know how far we are from this.
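The two pickling styles contrasted here can be sketched in plain Python (toy classes, not Sage code): the default pickle records "class + internal data" (the class plus the instance __dict__), while pickling by construction records a call that rebuilds the object from its defining data:

```python
import pickle

class ByData:
    """Default pickling: the pickle stores the class plus self.__dict__."""
    def __init__(self, base):
        self.base = base

class ByConstruction:
    """Pickling by construction: the pickle stores how to rebuild the object."""
    def __init__(self, base):
        self.base = base

    def __reduce__(self):
        # unpickling re-runs the constructor instead of restoring raw state
        return (ByConstruction, (self.base,))

for cls in (ByData, ByConstruction):
    obj = pickle.loads(pickle.dumps(cls(17)))
    assert type(obj) is cls and obj.base == 17
```

The difference matters for encapsulation: a by-construction pickle stays valid when the class's internal layout changes, which is why it is preferred for parents.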
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there. And why. I don't know how complicated it is to do this search though.
Among the three propositions, I like 3 best. I have trouble evaluating how big the risk is of getting stuck in the future; it does not seem too big.
Thanks Simon for investigating this!
comment:4 Changed 11 years ago by
Description: | modified (diff) |
---|---|
Summary: | Make parent/element classes independent of base rings and the category graph consistent with method resolution → Make parent/element classes independent of base rings |
Replying to SimonKing:
I suggest that I create patches for both 2. and 3., and then people can tell what they think about it. The method resolution will then be taken care of by another patch.
I just argued myself into splitting the ticket: This here will be for the base ring independent parent/element classes, and another one will be for method resolution order.
comment:5 follow-up: 8 Changed 11 years ago by
Replying to nthiery:
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there.
Old pickles will not break, I believe. Let P3 = Algebras(GF(3)).parent_class and PQ = Algebras(QQ).parent_class. Here are a few scenarios:
1. P3 and PQ have been created and pickled with an old version of Sage
In the old version of Sage, P3 != PQ. They are pickled by construction. Hence, after applying my yet-to-be-submitted patches, they are unpickled as Algebras(GF(3)).parent_class and Algebras(QQ).parent_class - which is the same class after applying the patch.
Conclusion: An old pickle will not break, with either approach 2. or 3. The worst that could happen is P3 != PQ before pickling and P3 == PQ after unpickling. But the defining properties P3 == Algebras(GF(3)).parent_class and PQ == Algebras(QQ).parent_class are preserved.
2. P3 and PQ are created and pickled according to approach 2. from above
Of course, P3 == PQ at the time of pickling. The pickle will only depend on the parent classes of the super categories of Algebras(GF(3)) and Algebras(QQ). If there was a change in the super categories between pickling and unpickling, we would have P3 != Algebras(GF(3)).parent_class after unpickling.
Conclusion: A new pickle of P3 and PQ can be unpickled after a change in the category graph, but a change in the category graph will destroy the defining property P3 == Algebras(GF(3)).parent_class.
3. P3 and PQ are created and pickled according to approach 3. from above
Of course, P3 == PQ at the time of pickling. PQ and P3 will both be pickled by construction. In particular, a change in the category graph would not matter, as long as the super categories of Algebras(QQ) and Algebras(GF(3)) still coincide (up to the base ring) after the change in the category graph.
A problem would arise if, in a distant future, the super categories of Algebras(QQ) and Algebras(GF(3)) would become essentially different. Say, some algebra person finds that vector spaces over fraction fields should have their own category, different from the usual category of vector spaces. Then, Algebras(QQ).parent_class != Algebras(GF(3)).parent_class. In particular, after such a change, unpickling P3 or PQ would result in either PQ != Algebras(QQ).parent_class or P3 != Algebras(GF(3)).parent_class.
Conclusion: A new pickle of P3 and PQ can be unpickled after a change in the category graph. Most changes in the category graph will preserve the defining properties P3 == Algebras(GF(3)).parent_class and PQ == Algebras(QQ).parent_class after unpickling. However, if the super categories come to depend on the base ring in a different way than they do now, then either P3 or PQ will lose its defining property after unpickling, while the other will keep it - and we don't know which of the two will break.
It seems to me that approach 3. is less fragile than 2. But I believe that in applications (hence, for pickling parents) both should be fine. So, I'll prepare patches for both approaches.
Changed 11 years ago by
Attachment: | trac11935_use_default_pickling.patch added |
---|
Use default pickling for parent/element classes, making them base ring independent.
comment:6 Changed 11 years ago by
Authors: | → Simon King |
---|---|
Status: | new → needs_review |
The patch that I just attached implements approach 2., hence, it uses the default pickling of dynamic classes. By consequence, the parent class of a category C will only depend on C.ParentMethods
and on the parent classes of the super categories of C, but it will only depend on the base ring of C if the base ring changes the super categories (which holds for algebras, e.g.).
Note the effect on the computation time for elliptic curves. With sage-4.7.2.alpha3 plus #11900, we have
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.97 s, sys: 0.07 s, total: 4.04 s
Wall time: 4.18 s
but with the new patch on top of it, we have
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.11 s, sys: 0.06 s, total: 3.17 s
Wall time: 3.31 s
Changed 11 years ago by
Attachment: | trac11935_weak_pickling_by_construction.patch added |
---|
Use a weak form of "pickling by construction" for parent and element classes of categories
comment:7 Changed 11 years ago by
The second patch is posted as well. It implements approach 3. Hence, the parent_class lazy attribute works around the cache of dynamic classes (by not providing unpickling information when the class is created), inserting the unpickling information only when the class has not been found in cache.
By consequence, when first creating Algebras(QQ).parent_class
, then its cache key as a dynamic class only comprises the parent classes of the super categories. Before returning it, the unpickling data (by construction) is added. When Algebras(GF(3)).parent_class
is created later, it is found in the cache of dynamic classes and immediately returned. The class returned will thus be unpickled as Algebras(QQ).parent_class
.
Similar to the first patch, it considerably speeds up the elliptic curve computations:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.05 s, sys: 0.07 s, total: 3.12 s
Wall time: 3.27 s
Apply only one of the two patches, at your choice!
comment:8 follow-up: 9 Changed 11 years ago by
Replying to SimonKing:
Replying to nthiery:
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there.
Old pickles will not break, I believe.
This is my belief too! Sorry if I have been unclear, but that was not the point of my suggestion. What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data". If they already are pickled by construction, then how C.parent_class and C.element_class are pickled is mostly irrelevant, now and in the future, since it is seldom used.
Thanks in any case for your detailed analysis!
It seems to me that approach 3. is less fragile than 2.
+1
Let's see if someone else has some preference between the two implementations.
Cheers,
Nicolas
comment:9 follow-up: 10 Changed 11 years ago by
Replying to nthiery:
What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data".
I see. If I am not mistaken, if it is pickled by "class+data", then copy_reg._reconstructor is called at unpickling. Perhaps it is possible to modify _reconstructor, so that it writes data to some log file. In that way, we could find out how often it is actually used.
comment:10 follow-up: 11 Changed 11 years ago by
Replying to SimonKing:
Replying to nthiery:
What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data".
I see. If I am not mistaken, if it is pickled by "class+data", then copy_reg._reconstructor is called at unpickling. Perhaps it is possible to modify _reconstructor, so that it writes data to some log file. In that way, we could find out how often it is actually used.
Yup. Or run explain_pickle on all pickles, and grep for element_class / parent_class.
comment:11 Changed 11 years ago by
Replying to nthiery:
Yup. Or run explain_pickle on all pickles, and grep for element_class / parent_class.
I don't know about explain_pickle. Where can I find it?
I am now running sage -testall -long with the _reconstructor log. So far, only ZODB.fsIndex.fsIndex and matplotlib.font_manager.FontEntry? use it, but we will see if there's more.
comment:12 Changed 11 years ago by
sage -testall -long succeeded (with the second patch applied, but it would probably work with the first one as well), and copy_reg._reconstructor was only used on <class 'matplotlib.font_manager.FontEntry'>, <class 'ZODB.fsIndex.fsIndex'> and <class 'sage.misc.explain_pickle.EmptyNewstyleClass'>.
Hence, it indicates that "pickling by class and data" does not occur for parents. But, to be on the safe side, one should inspect the pickle jar using explain_pickle.
comment:13 Changed 11 years ago by
Cc: | Jeroen Demeyer added |
---|---|
Dependencies: | #11900 → #9138, #11900 |
comment:14 Changed 11 years ago by
Description: | modified (diff) |
---|
sage -testall -long passes for either patch. So, unless we find bad surprises in the pickle jar, both approaches should be fine. I am slightly in favour of approach 3.
comment:15 Changed 11 years ago by
Dependencies: | #9138, #11900 → #9138 #11900 #11943 |
---|---|
Description: | modified (diff) |
I decided to make this ticket depend on #11943, for two reasons: Firstly, it is rather clear that #11943 is a good idea, while I am less sure here (it is good for speed, but may under very particular circumstances break new pickles). Secondly, #11943 seems less invasive than the patch here.
I think that the "potentially breaking new pickles in a distant future" aspect is less urgent with the "weak pickling by construction" approach. Hence, I have only updated the second patch, the first can now be disregarded.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:16 follow-up: 17 Changed 11 years ago by
Status: | needs_review → needs_work |
---|---|
Work issues: | → Fix doctest in covariant functorial construction |
comment:17 follow-up: 18 Changed 11 years ago by
Replying to SimonKing:
For some reason, #11935 together with #11943 results in one doctest error, namely in the "FooBars" test of sage/categories/covariant_functorial_construction.py. So, it needs work.
Here is the problem:
sage: from sage.categories.covariant_functorial_construction import CovariantConstructionCategory
sage: class FooBars(CovariantConstructionCategory):
...       _functor_category = 'FooBars_function'
...
sage: def FooBars_function(X): return FooBars(X)
...
sage: C = FooBars(ModulesWithBasis(QQ))
sage: import __main__; __main__.FooBars_function = FooBars_function
sage: __main__.FooBars = FooBars
sage: Category.FooBars_function = FooBars_function
Now, requesting C.parent_class results in a complaint regarding "duplicate base class FooBars.parent_class". Indeed, with the patch from here, we have
sage: CA = FooBars(AdditiveMagmas())
sage: CM = FooBars(Magmas())
sage: CA.parent_class == CM.parent_class
True
even though we have
sage: Magmas().parent_class is AdditiveMagmas().parent_class
False
So, I was overdoing it: With my patch, the parent class not only becomes independent of a base ring, but also it becomes independent of the base category of a covariant functorial construction - and this is not what we want.
The point is that CA and CM above have the same super categories, namely FooBars(Sets()). But with my patch, parent classes are the same if both the super categories and the ParentMethods are the same. The ParentMethods are different for Magmas() and AdditiveMagmas(), but they are the same for FooBars(Magmas()) and FooBars(AdditiveMagmas()).
I wonder how this should be solved.
One possibility is that sage.categories.covariant_functorial_constructions overrides the parent_class and element_class lazy attributes that are defined in sage.categories.category.
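The "duplicate base class" failure itself is easy to reproduce in plain Python: once two distinct super categories hand back the identical parent class, the bases tuple of the dynamically created class contains a duplicate, which type() rejects. (The class names below are stand-ins for the FooBars example.)

```python
# one shared class standing in for FooBars(Magmas()).parent_class
# and FooBars(AdditiveMagmas()).parent_class after the patch
shared = type("FooBars.parent_class", (object,), {})

try:
    # building a parent class over both "super categories"
    # duplicates the base class in the bases tuple
    type("C.parent_class", (shared, shared), {})
    raised = False
    message = ""
except TypeError as exc:
    raised = True
    message = str(exc)

assert raised
assert "duplicate base class" in message
```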
comment:18 follow-up: 19 Changed 11 years ago by
Hi Simon!
Replying to SimonKing:
So, I was overdoing it: With my patch, the parent class not only becomes independent of a base ring, but also it becomes independent of the base category of a covariant functorial construction - and this is not what we want.
The point is that CA and CM above have the same super categories, namely FooBars(Sets()). But with my patch, parent classes are the same if both the super categories and the ParentMethods are the same. The ParentMethods are different for Magmas() and AdditiveMagmas(), but they are the same for FooBars(Magmas()) and FooBars(AdditiveMagmas()).
I wonder how this should be solved.
One possibility is that sage.categories.covariant_functorial_constructions overrides the parent_class and element_class lazy attributes that are defined in sage.categories.category.
Ah ah, interesting challenge!
So, the question is whether we want to put in the specifications:
Constraint (C1): above any category in the category hierarchy, there should be a one-to-one correspondence between categories and their parent class. In particular, C.all_super_categories() and C.parent_class.mro() should exactly match.
For C(...) a parametrized category, let me call (O) the optimization of having C(...).parent_class depend only on C.ParentMethods and the parent classes of the super categories of C(...).
If we want (C1), then we indeed have to be careful with parametrized categories. In particular, it seems to me (I haven't written a proof though :-)) that optimization (O) can only be used for a parametrized category C(...) if it is further guaranteed that:
Constraint (C2): no parent is simultaneously in two instances C(A) and C(B) of C.
(C2) seems reasonable for a Category_over_base_ring (a parent should have a unique distinguished base ring). Maybe it would make sense as well for a Category_over_base. Your example above shows that we don't want it in general, and in particular not for covariant constructions.
This calls for Category to not implement (O) by default, and Category_over_base_ring to override parent_class and element_class to implement (O).
Another option would be to drop (C1) altogether: it seems like a nice optimization to reduce the number of classes in the mro whenever possible (e.g. when categories do not provide ParentMethods). But then we face an interesting poset/lattice problem, namely whether the C3 algorithm is (can be made) compatible with taking certain subposets.
This is quite related to the kind of optimizations I do in #10963, so I'd love to have a brainstorm about that; but we'd better have it face to face when you come over. In any cases, I vote for imposing (C1) in this ticket, and think about removing this constraint, if at all possible, in a later ticket. Both for not making this ticket even more complicated and reducing conflicts with #10963.
Cheers,
Nicolas
comment:19 Changed 11 years ago by
Hi Nicolas,
Replying to nthiery:
If we want (C1), then we indeed have to be careful with parametrized categories. In particular, it seems to me (I haven't written a proof though :-)) that optimization (O) can only be used for a parametrized category C(...) if it is further guaranteed that:
Constraint (C2): no parent is simultaneously in two instances C(A) and C(B) of C.
(C2) seems reasonable for a Category_over_base_ring (a parent should have a unique distinguished base ring). Maybe it would make sense as well for a Category_over_base.
That would indeed be a possibility. My first reaction was "The problem arose in covariant functorial construction, thus, use (O) by default, but let covariant functorial constructions override it with the non-optimised approach".
But after all, the subject of this ticket is "Make parent/element classes independent of base rings". So, your suggestion fits 100% into the scope of this ticket.
However, it is not all that easy. For example, by #9138, the category of a univariate polynomial ring is a JOIN of a category with base (namely commutative algebras) and a base-free category (Euclidean domains or so). The join category has a base() method, due to one of the patches at #11900, but it does not inherit from sage.categories.category_types.Category_over_base.
In other words: if we use optimisation (O) only on sage.categories.category_types.Category_over_base, then we would not see any speedup in elliptic curve computations.
Here are a few scenarios:
- Use (O) by default, but do not use it on covariant functorial constructions. The question is whether this is consistent.
- Do not use (O) by default, but use it on Category_over_base. Problem: We would not see a speed-up.
- Do not use (O) by default, but use it on every category that has a base method. That includes Category_over_base, but also (by #11900) any category that is a sub-category of a category with base.
The problem with the third approach is: Testing whether the category is a sub-category of a category with base is expensive. Thus, it is possible that the regression (due to this test) is bigger than the speed-up obtained from optimisation (O).
comment:21 follow-ups: 22 23 Changed 11 years ago by
Milestone: | → sage-4.8 |
---|
I think the following could be a solution:
- Do not use the optimization for Category.parent_class. Hence, the default is the good old "pickle by construction" approach.
- Add a specialised JoinCategory.parent_class that uses default pickling of a dynamic class (which means: the class is uniquely determined by the base classes). Rationale: A join category is uniquely determined by its super categories, and thus it is consistent if the parent class of a join category is uniquely determined by the parent classes of its super categories.
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach. Rationale: It's the purpose of this ticket to make the parent class independent of the base ring, and "weak pickling by construction" seems the most stable option.
Apparently, the problem with functorial constructions would vanish - they use the non-optimized old parent_class. But we would get a speed-up where we need it: Polynomial rings belong to a join category, and one super category of that join category is a Category_over_base.
Of course, the same should be done with the element_class.
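The join-category half of this plan amounts to a class cache keyed by the bases alone. A toy version of this "default pickling" behaviour of dynamic classes (names hypothetical; the real cache is in sage.structure.dynamic_class):

```python
_dynamic_class_cache = {}

def dynamic_class(name, bases):
    """Return a class determined solely by its name and bases, cached so
    that two join categories with the same super categories share it."""
    key = (name, bases)
    if key not in _dynamic_class_cache:
        _dynamic_class_cache[key] = type(name, bases, {})
    return _dynamic_class_cache[key]

class ParentOfModules:       # stand-ins for the parent classes of the
    pass                     # super categories of a join category

class ParentOfEuclidean:
    pass

J1 = dynamic_class("JoinCategory.parent_class", (ParentOfModules, ParentOfEuclidean))
J2 = dynamic_class("JoinCategory.parent_class", (ParentOfModules, ParentOfEuclidean))
assert J1 is J2   # the parent class depends only on the bases
```

No reduction data needs to be constructed at creation time, which is where the extra speed for join categories comes from.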
comment:22 Changed 11 years ago by
Replying to SimonKing:
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach.
... and to Bimodules as well. They have two bases, but they do not inherit from Category_over_base.
comment:23 follow-up: 24 Changed 11 years ago by
Replying to SimonKing:
I think the following could be a solution:
- Do not use the optimization for Category.parent_class. Hence, the default is the good old "pickle by construction" approach.
- Add a specialised JoinCategory.parent_class that uses default pickling of a dynamic class (which means: the class is uniquely determined by the base classes). Rationale: A join category is uniquely determined by its super categories, and thus it is consistent if the parent class of a join category is uniquely determined by the parent classes of its super categories.
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach. Rationale: It's the purpose of this ticket to make the parent class independent of the base ring, and "weak pickling by construction" seems the most stable option.
Apparently, the problem with functorial constructions would vanish - they use the non-optimized old parent_class. But we would get a speed-up where we need it: Polynomial rings belong to a join category, and one super category of that join category is a Category_over_base.
Of course, the same should be done with the element_class.
This sounds very good.
Just a suggestion: in Category, you may want to put a lazy attribute _parent_class_from_base_classes, so that the categories that want the optimization (like Modules or Bimodules) can just do parent_class = _parent_class_from_base_classes (same for element_class, of course).
comment:24 Changed 11 years ago by
Replying to nthiery:
Just a suggestion: in Category, you may want to put a lazy attribute _parent_class_from_base_classes, so that the categories that want the optimization (like Modules or Bimodules) can just do parent_class = _parent_class_from_base_classes (same for element_class of course)
Excellent idea!
comment:25 Changed 11 years ago by
The patch is not yet ready for publication, but in L = EllipticCurve('960d1').prove_BSD() I see a speedup of nearly 25% compared with #11900+#11943!
So, according to your suggestion, I add a _parent_class_from_bases and _element_class_from_bases to Category, and use it for categories over base and for bimodules.
However, there is a slight problem: you cannot simply define parent_class = _parent_class_from_bases if you want to have a real lazy attribute. Namely, parent_class would believe that its name is _parent_class_from_bases:
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar
....:
sage: f = Foo
sage: f.bar.__name__
'_bar'
In particular, it is not as fast as it should be. With the not-yet-submitted patch:
sage: C = Modules(GF(3), dispatch=False)
sage: %timeit p = C.parent_class
625 loops, best of 3: 69.4 µs per loop
sage: %timeit p = C._parent_class_from_bases
625 loops, best of 3: 440 ns per loop
But it might be possible to work around.
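The naming pitfall can be demonstrated with a minimal stand-in for sage.misc.lazy_attribute (greatly simplified; the real descriptor does more). The descriptor caches its computed value under its original __name__, so the alias never gets a cache entry under its own name and every access re-enters the descriptor - which is why C.parent_class above stays slow:

```python
class lazy_attribute:
    """Minimal sketch: compute once, then store under self.__name__."""
    def __init__(self, f):
        self.f = f
        self.__name__ = f.__name__

    def __get__(self, obj, cls):
        if obj is None:
            return self                     # accessed on the class itself
        value = self.f(obj)
        setattr(obj, self.__name__, value)  # caches under the *original* name
        return value

class Foo:
    @lazy_attribute
    def _bar(self):
        return 5

    bar = _bar   # the alias still believes its name is '_bar'

f = Foo()
assert f.bar == 5
# the value landed under '_bar', never under 'bar':
assert "_bar" in f.__dict__ and "bar" not in f.__dict__
# so every access to f.bar goes through the descriptor again
```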
comment:26 follow-up: 29 Changed 11 years ago by
Concerning lazy attributes: I wonder whether one could add a method rename(name) to a lazy attribute. That method would return a copy of the original lazy attribute, but with a new name.
Then, the example above would become
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar.rename("bar")
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
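A rename method could simply be a shallow copy with the new name, sketched here against the same minimal descriptor (hypothetical; the real lazy_attribute lives in sage.misc.lazy_attribute and was extended at #11999):

```python
import copy

class lazy_attribute:
    def __init__(self, f):
        self.f = f
        self.__name__ = f.__name__

    def rename(self, name):
        # a copy of the lazy attribute that caches under the new name
        clone = copy.copy(self)
        clone.__name__ = name
        return clone

    def __get__(self, obj, cls):
        if obj is None:
            return self                     # accessed on the class itself
        value = self.f(obj)
        setattr(obj, self.__name__, value)  # caches under self.__name__
        return value

class Foo:
    @lazy_attribute
    def _bar(self):
        return 5

    bar = _bar.rename("bar")

f = Foo()
assert f.bar == 5
assert f.__dict__["bar"] == 5      # cached under the alias's own name
assert Foo.bar.__name__ == "bar"
```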
comment:27 Changed 11 years ago by
Dependencies: | #9138 #11900 #11943 → #9138 #11900 #11943 #11999 |
---|
I created #11999 for the possibility to rename lazy attributes, and made it a dependency.
comment:28 follow-up: 30 Changed 11 years ago by
Status: | needs_work → needs_review |
---|---|
Work issues: | Fix doctest in covariant functorial construction |
Done!
The current patch preserves the default parent_class and element_class for categories. In particular, there is no problem with the covariant functorial constructions.
According to Nicolas' idea, I added a _parent_class_from_bases and _element_class_from_bases. They use the "weak pickling-by-construction" approach discussed above, because that seems to minimize the probability of breaking new pickles in a distant future. Just to emphasize: Old pickles will still work.
These two new lazy attributes override parent_class and element_class for Category_over_base (which was made possible by #11999). By consequence, the parent classes of all vector spaces (over different base fields) coincide. The parent classes of algebras over fields coincide, but differ from the parent class of algebras over a non-field.
Moreover, the parent class of a join category only depends on the parent classes of its super categories. Here, I use the default pickling of dynamic classes.
Rationale for using the default pickling in the case of join categories:
- The creation of a dynamic class becomes slightly faster when you don't need to worry about pickling.
- A join category is uniquely determined by its super categories. So, it is safe to make the parent class uniquely determined by the bases (which is precisely the default pickling of dynamic classes).
With the new patch, all long tests pass. Moreover, I have the following timing:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.95 s, sys: 0.08 s, total: 3.03 s
Wall time: 3.20 s
That is a speed-up of about 20% compared with unpatched sage-4.7.2.alpha2!
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:29 follow-up: 31 Changed 11 years ago by
Replying to SimonKing:
Concerning lazy attributes: I wonder whether one could add a method rename(name) to a lazy attribute. That method would return a copy of the original lazy attribute, but with a new name.
Then, the example above would become
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar.rename("bar")
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
For what it's worth, a potential variant:
sage: class Foo(object):
....:     @lazy_attribute(name="bar")
....:     def _bar(self):
....:         return 5
....:     bar = _bar
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
In our use case, we want the lazy attribute parent_class_from_bases to be called parent_class whenever it's used by subclasses of Category.
Cheers,
Nicolas
comment:30 Changed 11 years ago by
Replying to SimonKing:
With the new patch, all long tests pass. Moreover, I have the following timing:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.95 s, sys: 0.08 s, total: 3.03 s
Wall time: 3.20 s
That is a speed-up of about 20% compared with unpatched sage-4.7.2.alpha2!
Congrats!
comment:31 follow-up: 32 Changed 11 years ago by
Replying to nthiery:
For what's its worth, a potential variant:
sage: class Foo(object):
....:     @lazy_attribute(name="bar")
You mean: Add an optional argument "name" to the lazy attribute? I was thinking about that, too.
Using it means that (in the example above) f._bar would result in assigning f.__dict__["bar"] = 5 (note: "bar", not "_bar", even though the lazy attribute is requested as "_bar").
So, when we do
@lazy_attribute(name="parent_class")
def _parent_class_from_bases(self):
    return bla
then the following would happen, where C is a category:
- C._parent_class_from_bases would write its result into C.parent_class, thus potentially overriding an already existing parent class. Do we want that such a dangerous thing can happen?
- On the bright side, we could easily do
  class Foo(Category): parent_class = _parent_class_from_bases
  without renaming, because the name is already right.
If you think this should be added, I can easily do so on #11999.
comment:32 follow-up: 33 Changed 11 years ago by
Replying to SimonKing:
You mean: Add an optional argument "name" to the lazy attribute? I was thinking about that, too.
Yup.
Using it means that (in the example above) f._bar would result in assigning f.__dict__["bar"] = 5 (note: "bar", not "_bar", even though the lazy attribute is requested as "_bar").
Hmm, this smells indeed. I am not sure. At this point, I am wondering if we don't want instead to introduce a new subclass:
class CategoryWithClassesFromBases(Category): # TODO: find a better name
with the two optimized parent_class / element_class (and possibly in the future morphism_class / category_class), and have:
class Category_over_base_ring(CategoryWithClassesFromBases):
    ...
class JoinCategory(CategoryWithClassesFromBases):
    ...
Sorry, I should have thought about this option earlier ...
Cheers,
Nicolas
comment:33 follow-up: 34 Changed 11 years ago by
Hi Nicolas,
Replying to nthiery:
Replying to SimonKing:
```
class CategoryWithClassesFromBases(Category): # TODO: find a better name
```
Perhaps `CategoryEnsemble`? The name is short and refers to the fact that the members of an ensemble (such as all categories of algebras over fields) are sufficiently similar that they share important features (such as their parent class).

Or (slightly longer) `CategoryWithParameters`? Again, if you have parameters (such as one or two base rings), certain important features may still not depend on the parameters. Or `ParametrizedCategory`, but I think you prefer names starting with the word "Category", don't you?
```
class JoinCategory(CategoryWithClassesFromBases):
    ...
```
I think join categories should have their own parent class. Reason:
The parent/element classes of a category ensemble are pickled by a weak form of "pickling by construction": If C1,C2,... belong to an ensemble of categories that share their parent class, then that class will be pickled as getattr(C,'parent_class'), where C is any member of the ensemble.
But I think that `JoinCategory` should not use that "pickling by weak construction". Reason: For `JoinCategory`, the "pickling by weak construction" is equivalent to the default pickling of dynamic classes (the class is determined by name, bases, and potentially a new class such as `ParentMethods`, which is empty in the case of a `JoinCategory`). Hence, it would be a waste of time to construct the pickle data for the `JoinCategory` while it is created.
I will certainly test both approaches. But if I remember correctly what I did yesterday, the difference between "pickling by weak construction" or "default pickling" for join categories was 6% in the infamous elliptic curve benchmark.
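The mechanism under discussion can be modelled in plain Python (a toy with made-up names, not the actual `sage.structure.dynamic_class` code): the dynamic class records a reduction `(getattr, (C, 'parent_class'))`, and a reducer registered via `copyreg` makes pickle use that reduction instead of the default save-by-name path for classes:

```python
import copyreg
import pickle

class DynamicMeta(type):
    """Metaclass marking dynamically created classes (toy version)."""

def _reduce_dynamic_class(cls):
    # "weak pickling by construction": the pickle records *some* category
    # whose parent_class currently is this class, not the class itself
    return cls._reduction

# copyreg's dispatch table is consulted before pickle's default
# save-by-name handling of classes, so classes with this metaclass
# are pickled through the reduction above
copyreg.pickle(DynamicMeta, _reduce_dynamic_class)

class Category:
    """Toy category; its parent_class depends only on a key."""
    _cache = {}  # key -> shared parent class

    def __init__(self, key):
        self.key = key  # stands in for, e.g., the category of the base

    @property
    def parent_class(self):
        cls = Category._cache.get(self.key)
        if cls is None:
            cls = DynamicMeta('parent_class_%s' % self.key, (object,), {})
            cls._reduction = (getattr, (self, 'parent_class'))
            Category._cache[self.key] = cls
        return cls

C1, C2 = Category('fields'), Category('fields')
assert C1.parent_class is C2.parent_class    # members of the ensemble share
roundtrip = pickle.loads(pickle.dumps(C1.parent_class))
assert roundtrip is C1.parent_class          # unpickles to the same class
```

Unpickling calls `getattr` on a reconstructed category, which looks the class up in the cache again, so the round-trip lands on the very same class object.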
comment:34 Changed 11 years ago by
Dependencies: | #9138 #11900 #11943 #11999 → #9138 #11900 #11943 |
---|
I have updated my patch according to what we have discussed.
- I added a subclass `CategoryWithParameters` of the `Category` class. Of course, if you have a shorter yet more descriptive name, I can change that. I believe `CategoryWithClassesFromBases` is too long and not clearer than `CategoryWithParameters`. And I think `CategoryEnsemble` is not clear either.
- `Category_over_base`, `JoinCategory` and `Bimodules` all inherit from the new class.
- Pickling is by weak construction: A parent class P is pickled via `(getattr, (C, 'parent_class'))`, where C is any category such that `C.parent_class` is P at the time of pickling. We had discussed advantages and disadvantages of this and other approaches.
Using the new class, the patch becomes independent of #11999. Of course, I still think that #11999 is a nice addition, but your suggestion to use a sub-class is better.
Replying to SimonKing:
I will certainly test both approaches. But if I remember correctly what I did yesterday, the difference between "pickling by weak construction" or "default pickling" for join categories was 6% in the infamous elliptic curve benchmark.
Here I was mistaken: With the new patch, the benchmark becomes
```
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.87 s, sys: 0.05 s, total: 2.92 s
Wall time: 3.10 s
```
and this is as fast as by using default pickling for parent classes of join categories.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:35 Changed 11 years ago by
Sorry, it seems that I had forgotten to hg qrefresh
. Now, the patch is really updated...
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:36 Changed 11 years ago by
Gosh, where is my brain? The previous patch still contained references to `_parent_class_from_bases`. They are now removed. But I think I need a break.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:37 Changed 11 years ago by
Status: | needs_review → needs_work |
---|---|
Work issues: | → introduce _make_named_class |
A suggestion of Nicolas: The creation of the parent_class and of the element_class follows the same logic: We take the corresponding classes of the super categories as bases, and add the methods that are available in the attribute `ParentMethods` or `ElementMethods`.
So, it seems reasonable to write a new method that implements that logic. Then, parent_class and element_class would both simply call that method. Originally, we suggested the name `_make_member_class` for that method, because it creates a class for some member (object, element of an object, or in future morphism) of the category.
But meanwhile I prefer the name `_make_named_class`, because the parameter is indeed a name. So, roughly like this:
```
class Category:
    ...
    def _make_named_class(self, name, methods_holder):
        # the default logic

    @lazy_attribute
    def parent_class(self):
        return self._make_named_class("parent_class", "ParentMethods")

    @lazy_attribute
    def element_class(self):
        return self._make_named_class("element_class", "ElementMethods")
```
Then, `CategoryWithParameters` only needs to override one thing, namely `_make_named_class`.
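In plain Python, the proposed template-method shape looks roughly like this (toy code: `lazy_attribute` is replaced by `property`, and super categories and caching are omitted):

```python
class Category:
    """Toy category illustrating the _make_named_class refactoring."""

    class ParentMethods:
        def some_parent_method(self):
            return "shared parent behaviour"

    # an ElementMethods container is optional; categories may omit it

    def _make_named_class(self, name, method_provider):
        # default logic: the bases would really be the corresponding
        # classes of the super categories; here we just mix in the
        # XXXMethods container if the category provides one
        provider = getattr(self, method_provider, None)
        bases = (provider,) if provider is not None else (object,)
        return type("%s.%s" % (type(self).__name__, name), bases, {})

    @property
    def parent_class(self):
        return self._make_named_class("parent_class", "ParentMethods")

    @property
    def element_class(self):
        return self._make_named_class("element_class", "ElementMethods")

C = Category()
assert C.parent_class().some_parent_method() == "shared parent behaviour"
assert C.element_class.__bases__ == (object,)  # no ElementMethods here
```

Both class factories now share one code path, so a subclass that wants different caching or pickling only overrides `_make_named_class`.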
Changed 11 years ago by
Attachment: | trac11935_named_class.patch added |
---|
Refactor parent/element class creation: Use _make_named_class
comment:38 Changed 11 years ago by
Description: | modified (diff) |
---|---|
Status: | needs_work → needs_review |
Work issues: | introduce _make_named_class |
I have implemented the new method `_make_named_class` (with "strong pickling by construction" for `Category` and "weak pickling by construction" for `CategoryWithParameters`). For me, all tests pass.
And, for the record:
```
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.88 s, sys: 0.04 s, total: 2.92 s
Wall time: 3.06 s
```
Apply trac11935_weak_pickling_by_construction_rel11943.patch trac11935_named_class.patch
comment:39 Changed 11 years ago by
Status: | needs_review → needs_work |
---|---|
Work issues: | → Rebase wrt the new version of #11900; ideas behind Category_singleton need to be modified if parent classes are shared |
comment:40 Changed 11 years ago by
Status: | needs_work → needs_review |
---|---|
Work issues: | Rebase wrt the new version of #11900; ideas behind Category_singleton need to be modified if parent classes are shared |
I had to rebase the first patch because of changes in #11943. It is now updated. The second patch did not need to change.
The timing is still good. I am now running the tests, but I think it can be needs review now.
Apply trac11935_weak_pickling_by_construction_rel11943.patch trac11935_named_class.patch
comment:42 Changed 11 years ago by
For the record: all tests pass on 5.0 beta10 with the version of the patch on the Sage-Combinat queue (basically the two patches folded together).
comment:43 Changed 11 years ago by
I finished my review. From my point of view it's good to go!
Now I just realized that I had apparently already folded in some of my reviewer's changes; sorry. Since the patch is not so long, I guess it's not so bad; I also folded in my latest little changes.
comment:44 Changed 11 years ago by
Description: | modified (diff) |
---|
comment:45 Changed 11 years ago by
I guess we can delete the other intermediate patches once the review is finished.
Happy easter!
comment:46 follow-up: 47 Changed 11 years ago by
All tests seem to pass, except maybe for some fairly trivially failing tests in:
```
sage -t -force_lib "devel/sage/sage/categories/algebras.py"
sage -t -force_lib "devel/sage/sage/categories/modules_with_basis.py"
sage -t -force_lib "devel/sage/sage/categories/category.py"
sage -t -force_lib "devel/sage/sage/misc/preparser.py"
```
But that might be due to a couple other patches above in my queue (including #11943) when I ran the tests. I will investigate on Tuesday unless you beat me to it.
comment:47 Changed 11 years ago by
Replying to nthiery:
All tests seem to pass, except maybe for some fairly trivially failing tests in ... But that might be due to a couple other patches above in my queue (including #11943) when I ran the tests. I will investigate on Tuesday unless you beat me to it.
Ok, only the failure in algebras.py was due to this patch (I had forgotten to update one of the subcategory hooks for the change NotImplemented -> Troolean). This is fixed with the updated patch I just posted.
Cheers,
Nicolas
comment:48 Changed 11 years ago by
Status: | needs_review → needs_work |
---|---|
Work issues: | → Rebase rel 5.0.beta13 |
The patch won't apply.
```
Wende trac11935_weak_pickling_by_construction_rel11943.patch an
patching file sage/categories/bimodules.py
Hunk #1 FAILED at 10
Hunk #3 succeeded at 118 with fuzz 1 (offset 0 lines).
1 out of 3 hunks FAILED -- saving rejects to file sage/categories/bimodules.py.rej
patching file sage/categories/category.py
Hunk #4 FAILED at 1083
Hunk #5 FAILED at 1114
Hunk #7 succeeded at 1632 with fuzz 2 (offset -157 lines).
Hunk #8 succeeded at 1686 with fuzz 2 (offset -179 lines).
2 out of 8 hunks FAILED -- saving rejects to file sage/categories/category.py.rej
patching file sage/categories/category_types.py
Hunk #1 FAILED at 14
1 out of 3 hunks FAILED -- saving rejects to file sage/categories/category_types.py.rej
Patch schlug fehl und Fortsetzung unmöglich (versuche -v)
Patch schlug fehl, Fehlerabschnitte noch im Arbeitsverzeichnis
Fehler beim Anwenden. Bitte beheben und auf trac11935_weak_pickling_by_construction_rel11943.patch aktualisieren
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage$ hg qapplied
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12313-mono_dict-combined.patch
trac11935_weak_pickling_by_construction_rel11943.patch
```
comment:49 Changed 11 years ago by
Status: | needs_work → needs_review |
---|
Arrgh, me stupid! I had #11943 in my queue, but after #11935. With
```
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage$ hg qapplied
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12313-mono_dict-combined.patch
trac11943_mro_for_all_super_categories_lazy_hook.patch
trac11943_mro_for_all_super_categories_lazy_hook-review-nt.patch
trac11935_weak_pickling_by_construction_rel11943.patch
```
everything is fine.
comment:50 Changed 11 years ago by
Work issues: | Rebase rel 5.0.beta13 |
---|
comment:51 follow-up: 52 Changed 11 years ago by
Cc: | Sage Combinat CC user added |
---|---|
Dependencies: | #9138 #11900 #11943 → #9138, #11900, #11943, #12875, #12876, #12877 |
Description: | modified (diff) |
Hi Simon,
I reworked the patch by adding features to dynamic_class in order to avoid logic duplication and encapsulation breaking in make_named_class.
The downside is that this makes this ticket depend on #12876 (ensuring that parent/element classes are purely abstract).
All tests should pass on 5.0.beta13, except for the two issues I mentioned in #12876. Oh, and one trivial failure I had forgotten in semigroup_cython.pyx. I'll update the patch later (tonight?) but you can start the review.
I folded the two patches to get a better overview. You can access the differential patch by looking up http://combinat.sagemath.org/patches/file/3121811e2ebe/trac11935_weak_pickling_by_construction_rel11943-review-nt.patch.
Cheers,
Nicolas
Changed 11 years ago by
Attachment: | trac11935_weak_pickling_by_construction_rel11943-nt.patch added |
---|
comment:52 Changed 11 years ago by
Replying to nthiery:
All tests should pass on 5.0.beta13, except for the two issues I mentioned in #12876. Oh, and one trivial failure I had forgotten in semigroup_cython.pyx. I'll update the patch later (tonight?) but you can start the review.
The updated patch fixes the failure in semigroup_cython.pyx. If I am not mistaken, all tests pass on 5.0.beta14 (that is all failures I have seen should be due to the fact that I did not activate #715 in the Sage-Combinat queue, because it forces recompiling too much stuff).
Cheers,
Nicolas
Changed 11 years ago by
Attachment: | trac11935_share_on_base_category.patch added |
---|
comment:53 follow-up: 55 Changed 11 years ago by
Hi Simon,
While working on #12895, I got a non trivial time regression, due to the large number of constructed categories for which creating yet another *_class was non negligible. Investigating this with runsnake made me turn back to this ticket: too many categories are created for nothing.
It indeed sounds a bit like a waste that, in order to construct the parent class of Algebras(GF(5)), one has to reconstruct the whole hierarchy of super categories above Algebras(GF(5)) (e.g. Modules(GF(5)), ...). With the updated patch, Algebras(K).parent_class directly reuses Algebras(L).parent_class if it already exists and if K and L have the same category. The super_categories method of Algebras(K) is not even called.
To achieve this, each subcategory of `CategoryWithParameters` should provide a method `_make_named_class_key` specifying what the parent_class (and friends) depends on. For example, Category_over_base specifies that parent_class depends only on the category of the base. Then, `_make_named_class` uses that to do a lookup in a cache.
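A toy model of this keyed lookup (minimal made-up classes; in the real patch, `_make_named_class_key` of a category over a base ring would return the category of the base, here crudely approximated by `type()`):

```python
class CategoryWithParameters:
    """Toy base class: *_classes depend only on a key, not on self."""
    _named_class_cache = {}  # (type, name, key) -> class

    def _make_named_class_key(self, name):
        raise NotImplementedError  # provided by each subclass

    def _make_named_class(self, name, method_provider):
        # look the class up by a key independent of the parameters;
        # method_provider is ignored in this toy sketch
        key = (type(self), name, self._make_named_class_key(name))
        try:
            return self._named_class_cache[key]
        except KeyError:
            cls = type("%s.%s" % (type(self).__name__, name), (object,), {})
            return self._named_class_cache.setdefault(key, cls)

    @property
    def parent_class(self):
        return self._make_named_class("parent_class", "ParentMethods")

class Algebras(CategoryWithParameters):
    def __init__(self, base_ring):
        self.base_ring = base_ring

    def _make_named_class_key(self, name):
        # stand-in for "the category of the base ring"
        return type(self.base_ring)

# categories over bases "of the same category" share the parent class,
# without the hierarchy of super categories ever being rebuilt
assert Algebras(3).parent_class is Algebras(5).parent_class
assert Algebras(3).parent_class is not Algebras(3.0).parent_class
```

This mirrors the goal stated in the ticket description: Algebras(GF(3)).parent_class and Algebras(GF(5)).parent_class coincide because GF(3) and GF(5) lie in the same category.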
For our typical benchmark:
%time L = EllipticCurve('960d1').prove_BSD()
the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the time goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
One fine point is that e.g. Algebras(ZZ) and Algebras(ZZ[x]) don't share the same parent class anymore, since ZZ and ZZ[x] don't have the same category.
What do you think? Could you have a brief look at the experimental trac11935_share_on_base_category.patch I just attached? If it sounds reasonable to you, I'll finalize it (doctests, ...), and fold it in my reviewer's patch.
comment:54 follow-up: 57 Changed 11 years ago by
Hi Nicolas!
What patches are supposed to be applied, currently? Only trac11935_weak_pickling_by_construction_rel11943-nt.patch (which I am now testing), or trac11935_share_on_base_category.patch as well?
comment:55 follow-up: 59 Changed 11 years ago by
Replying to nthiery:
It indeed sounds a bit like a waste that, in order to construct the parent class of Algebras(GF(5)), one has to reconstruct the whole hierarchy of super categories above Algebras(GF(5)) (e.g. Modules(GF(5)), ...). With the updated patch, Algebras(K).parent_class directly reuses Algebras(L).parent_class if it already exists and if K and L have the same category. The super_categories method of Algebras(K) is not even called.
I was thinking of that, too. But it would only work if we rely on the assumption that the list of super_categories of a category with base only depends on the category of the base. Can we? Then, at the very least, that assumption must be clearly stated somewhere.
To achieve this, each subcategory of `CategoryWithParameters` should provide a method `_make_named_class_key` specifying what the parent_class (and friends) depends on. For example, Category_over_base specifies that parent_class depends only on the category of the base. Then, `_make_named_class` uses that to do a lookup in a cache.
For our typical benchmark:
```
%time L = EllipticCurve('960d1').prove_BSD()
```
the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the time goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
I am a bit confused. What is the "subcategory patch"? Is it "share_on_base_category"? And what patches are applied for the four different timings?
One fine point is that e.g. Algebras(ZZ) and Algebras(ZZ[x]) don't share the same parent class anymore, since ZZ and ZZ[x] don't have the same category.
Sure, but I don't think that is necessarily bad.
What do you think? Could you have a brief look at the experimental trac11935_share_on_base_category.patch I just attached? If it sounds reasonable to you, I'll finalize it (doctests, ...), and fold it in my reviewer's patch.
I am currently running tests without it. But I am now reading it.
comment:56 follow-up: 60 Changed 11 years ago by
Owner: | Nicolas M. Thiéry deleted |
---|
Looking at the docs of sage.structure.dynamic_class, I see that the docs of `DynamicClasscallMetaclass` are broken. Shall I fix it here (in yet another reviewer patch) or leave it to a different ticket?
comment:57 Changed 11 years ago by
Replying to SimonKing:
Hi Nicolas!
What patches are supposed to be applied, currently? Only trac11935_weak_pickling_by_construction_rel11943-nt.patch (which I am now testing), or trac11935_share_on_base_category.patch as well?
Both patches for the experimental feature of having the parent class depend only on the base ring.
comment:58 Changed 11 years ago by
Apparently, the broken documentation of `DynamicClasscallMetaclass` comes from the `__init__` method of `NestedClassMetaclass`. So, it would be better to provide proper documentation. Just a short note.
While the tests are running, I noticed
```
sage -t --long -force_lib devel/sage/sage/combinat/crystals/tensor_product.py [16.3 s]
sage -t --long -force_lib devel/sage/sage/plot/complex_plot.pyx [21.5 s]
*** glibc detected *** python: double free or corruption (fasttop): 0x0000000003e99500 ***
sage -t --long -force_lib devel/sage/sage/combinat/backtrack.py [15.1 s]
```
Where does that come from? Here, I work without the share_on_base_category.
comment:59 Changed 11 years ago by
Replying to SimonKing:
I was thinking of that, too. But it would only work if we rely on the assumption that the list of super_categories of a category with base only depends on the category of the base. Can we?
I am indeed not sure about making that assumption for any Category_over_base (it is not clearly defined what a base is!). On the other hand, this seems quite reasonable to me for Category_over_base_ring. This makes e.g. Algebras(...) consistent with the other functorial constructions categories which depend only on the base category.
This also goes in the direction of what we had discussed: that we could actually make Algebras(...) a functorial construction, so that we could define C=Algebras(Fields()) and have Algebras(R) be basically an alias for C for every field. And similarly for PolynomialRings(Fields()), ...
Note that we could possibly change this ticket to leave Category_over_base alone, and have only Category_over_base_ring derive from `CategoryWithParameters`. Do you foresee examples of a plain Category_over_base where sharing parent classes would be important performance-wise?
Then, at the very least, that assumption must be clearly stated somewhere.
YES
For our typical benchmark:
```
%time L = EllipticCurve('960d1').prove_BSD()
```
the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the time goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
I am a bit confused. What is the "subcategory patch"?
I meant #12895.
And what patches are applied for the four different timings?
With and without share_on_base_category and with and without #12895.
Cheers,
Nicolas
comment:60 Changed 11 years ago by
Replying to SimonKing:
Looking at the docs of sage.structure.dynamic_class, I see that the docs of `DynamicClasscallMetaclass` are broken. Shall I fix it here (in yet another reviewer patch) or leave it to a different ticket?
If it is a simple fix, go ahead.
comment:61 Changed 11 years ago by
Apart from the strange glibc problem (which is not reported at the end of the test suite), I get one timeout, namely `sage -t --long -force_lib devel/sage/sage/crypto/mq/mpolynomialsystem.py`.
Aha! And it turns out that the glibc comes from that test!! Here is what I get in detail:
```
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13$ ./sage -t --verbose -force_lib devel/sage/sage/crypto/mq/mpolynomialsystem.py
sage -t --verbose -force_lib "devel/sage/sage/crypto/mq/mpolynomialsystem.py"
Trying:
    set_random_seed(0L)
Expecting nothing
ok
Trying:
    change_warning_output(sys.stdout)
Expecting nothing
ok
Trying:
    sr = mq.SR(Integer(2),Integer(1),Integer(2),Integer(4),gf2=True,polybori=True)###line 26:_sage_    >>> sr = mq.SR(2,1,2,4,gf2=True,polybori=True)
Expecting nothing
ok
Trying:
    sr###line 27:_sage_    >>> sr
Expecting:
    SR(2,1,2,4)
ok
Trying:
    set_random_seed(Integer(1))###line 33:_sage_    >>> set_random_seed(1)
Expecting nothing
ok
Trying:
    F,s = sr.polynomial_system()###line 34:_sage_    >>> F,s = sr.polynomial_system()
Expecting nothing
*** glibc detected *** python: double free or corruption (fasttop): 0x0000000003b79b40 ***
^CAborting further tests.
KeyboardInterrupt -- interrupted after 2.3 seconds!
```
So, these are very few commands. That should be reproducible (trying it a bit later).
For reference: The problem does not occur with
```
trac_12808-classcall_speedup-fh.patch
trac_12808_nested_class_cython.patch
trac_12808-classcall_cdef.patch
trac12215_weak_cached_function.patch
trac12215_segfault_fixes.patch
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12875-category-fix_abvar_homspace-nt.patch
trac_12877-category-for_more_rings_and_schemes-nt.patch
trac_12876_category-fix_abstract_class-nt-rel11521.patch
trac_12876-reviewer.patch
trac_12876_category-fix_abstract_class-nt-rel11521-review-nt.patch
trac9107_nesting_nested_classes.patch
trac11768_source_of_dynamic_class.patch
trac11768_docfix.patch
trac11817_question_mark_using_sage_getdoc.patch
trac11791_dynamic_metaclass_introspection.patch
trac11943_mro_for_all_super_categories_combined.patch
```
but does occur if one adds trac11935_weak_pickling_by_construction_rel11943-nt.patch.
comment:62 Changed 11 years ago by
I can not reproduce the problem on the command line. Too bad.
```
sage: set_random_seed(0L)
sage: change_warning_output(sys.stdout)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage-main/<ipython console> in <module>()
NameError: name 'change_warning_output' is not defined
sage: sr = mq.SR(2,1,2,4,gf2=True,polybori=True)
sage: sr
SR(2,1,2,4)
sage: set_random_seed(1)
sage: F,s = sr.polynomial_system()
```
comment:63 follow-up: 72 Changed 11 years ago by
I still can't reproduce the problem. Too bad. But I have another comment. With the patch, one has in sage/categories/category.py:
```
def _make_named_class(...):
    ...
    else: # Otherwise, check XXXMethods
        import inspect
        assert inspect.isclass(method_provider_cls),\
            "%s.%s should be a class"%(type(self).__name__, method_provider)
```
That seems suboptimal to me:
```
sage: import inspect
sage: def test1(self, cls):
....:     import inspect
....:     assert inspect.isclass(cls),\
....:         "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: def test2(self, cls):
....:     assert inspect.isclass(cls),\
....:         "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: def test3(self, cls):
....:     if not inspect.isclass(cls):
....:         raise AssertionError, "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: test2(ZZ,ZZ.__class__)
sage: %timeit test1(ZZ,ZZ.__class__)
625 loops, best of 3: 4.45 µs per loop
sage: %timeit test2(ZZ,ZZ.__class__)
625 loops, best of 3: 2.67 µs per loop
sage: %timeit test3(ZZ,ZZ.__class__)
625 loops, best of 3: 2.63 µs per loop
```
test1 is as in your code: inspect is imported, and (at least it seems to me) the error message is created even if the error does not occur. test2 does not import inspect again, while test3 is an attempt to not create the error message.
To my surprise, creating the error message seems to be essentially for free. But one should really import inspect top-level, I think.
comment:64 Changed 11 years ago by
I wonder: Why is it tested in `_make_named_class` whether method_provider_cls is a class? Shouldn't that be the job of dynamic_class, which is called in the following lines?
comment:65 follow-up: 73 Changed 11 years ago by
Since method_provider_cls is not being checked in dynamic_class, I guess it is fine to test it in _make_named_class.
I notice
```
self._super_categories
return dynamic_class(class_name,
                     tuple(getattr(cat,name) for cat in self._super_categories),
                     method_provider_cls,
                     prepend_cls_bases = False,
                     doccls = doccls,
                     reduction = (getattr, (self, name)),
                     cache = cache)
```
i.e., first an empty call to self._super_categories. I guess that can be erased?
comment:66 Changed 11 years ago by
Some other little speed-up:
```
sage: def test1(cls,name):
....:     s = "%s.%s"%(cls.__name__, name)
....:
sage: def test2(cls,name):
....:     s = cls.__name__+'.'+name
....:
sage: timeit("test1(ZZ.__class__,'element_class')", number=10000)
10000 loops, best of 3: 2.15 µs per loop
sage: timeit("test2(ZZ.__class__,'element_class')", number=10000)
10000 loops, best of 3: 1.85 µs per loop
```
So, adding strings seems to be faster than inserting into a format string.
comment:67 follow-up: 76 Changed 11 years ago by
I just tested whether the changes I mentioned have a noticeable effect on the computation time for an example that does nothing but creating the parent classes for 2000 different categories (I made it so that the parent classes are distinct). I am afraid, it was not noticeable. So, I guess I can drop my suggestion.
comment:68 Changed 11 years ago by
Concerning the glibc problem: I made no progress in tracking it down. I just know: When I trace all Python function calls then the problem vanishes, and when I temporarily disable garbage collection then it vanishes as well.
Hence, probably there is some object that is deallocated too early, which makes me wonder how it is related to the memleak fixes from #715, #11521 and #12215.
Nicolas, can you confirm the problem? You didn't mention it yet.
Also, I think I should try to test only the patches that are really needed: In addition to the dependencies (and the dependencies of the dependencies) of this ticket, I have #9107, #11768, #11817 and #11791 applied. I should test whether removing any of it would make the problem disappear.
comment:70 Changed 11 years ago by
Hooray, #12215 is to blame (which doesn't have a positive review, yet). In #12215, dynamic classes become weakly cached.
The aim of #12215 is to avoid a memory leak that was partially caused by creating many different parent classes. Here, the same problem is solved in a different way, namely by avoiding that many different parent classes are created in the first place.
Hence, I am now testing whether it helps to apply #12215, but with the cache of dynamic classes reverted to a strong cache.
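The difference between the two caching strategies can be seen in a small plain-Python model (this is not Sage's dynamic_class code, just the general mechanism: a strong cache keeps every class alive, a weak cache lets unused classes be garbage collected):

```python
import gc
import weakref

def dynamic_class(name, bases, cache):
    """Return a class from the cache, creating it on first request."""
    key = (name, bases)
    try:
        return cache[key]
    except KeyError:
        cls = type(name, bases, {})
        cache[key] = cls
        return cls

strong_cache = {}                           # keeps classes alive forever
weak_cache = weakref.WeakValueDictionary()  # entries die with the class

# identical requests share the class, whichever cache is used
assert dynamic_class('C', (object,), weak_cache) is \
       dynamic_class('C', (object,), weak_cache)

c = dynamic_class('C', (object,), weak_cache)
del c
gc.collect()
assert ('C', (object,)) not in weak_cache   # class was garbage collected

c = dynamic_class('C', (object,), strong_cache)
del c
gc.collect()
assert ('C', (object,)) in strong_cache     # strong cache keeps it alive
```

With the weak cache, a parent class that is no longer referenced anywhere can disappear and later be rebuilt, which is exactly the behaviour that can interact badly with objects that silently rely on their class staying alive.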
comment:71 follow-up: 74 Changed 11 years ago by
Yippie! Returning to a strong cache for dynamic classes does indeed help! I am now running the test suite, because it is conceivable that the change makes part of the memleak re-appear.
comment:72 Changed 11 years ago by
Replying to SimonKing:
That seems suboptimal to me: ... test1 is as in your code: inspect is imported, and (at least it seems to me) the error message is created even if the error does not occur. test2 does not import inspect again, while test3 is an attempt to not create the error message.
To my surprise, creating the error message seems to be essentially for free. But one should really import inspect top-level, I think.
According to: http://docs.python.org/reference/simple_stmts.html
```
assert expression1, expression2
```
is equivalent to:
```
if __debug__:
    if not expression1:
        raise AssertionError(expression2)
```
This means in particular that expression2 is *not* evaluated when expression1 holds. So we don't need to worry about efficiency here.
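A quick check of this behaviour in plain Python, using a side effect to observe whether the message expression was built:

```python
evaluated = []

def expensive_message():
    # the side effect lets us see whether the message was constructed
    evaluated.append(True)
    return "assertion failed"

# passing assertion: the message expression is never evaluated
assert 1 == 1, expensive_message()
assert evaluated == []

# failing assertion: the message expression is evaluated exactly once
try:
    assert 1 == 2, expensive_message()
except AssertionError as e:
    assert str(e) == "assertion failed"
assert evaluated == [True]
```

(And if Python runs with `-O`, `__debug__` is False and the whole assert statement, message included, is skipped.)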
That being said, yes, the `import inspect` should be at the toplevel! Thanks for catching this.
Are you in the process of writing a reviewer's patch, or do you want me to do the change? Note: I am flying to Montreal tomorrow, and my wife arrives in one hour from a two weeks mission; so you probably won't hear much from me until Sunday.
Cheers,
Nicolas
comment:73 Changed 11 years ago by
Replying to SimonKing:
Since method_provider_cls is not being checked in dynamic_class, I guess it is fine to test it in _make_named_class.
I notice
```
self._super_categories
return dynamic_class(class_name,
                     tuple(getattr(cat,name) for cat in self._super_categories),
                     method_provider_cls,
                     prepend_cls_bases = False,
                     doccls = doccls,
                     reduction = (getattr, (self, name)),
                     cache = cache)
```
i.e., first an empty call to self._super_categories. I guess that can be erased?
Oh, yes. That is just a leftover from a debugging session where I wanted to force the evaluation of the super categories. Thanks.
comment:74 Changed 11 years ago by
Replying to SimonKing:
Yippie! Returning to a strong cache for dynamic classes does indeed help! I am now running the test suite, because it is conceivable that the change makes part of the memleak re-appear.
Thanks for tracking this down! I don't have a preference for weak or strong caching dynamic classes.
comment:75 Changed 11 years ago by
Speaking of weak cache: currently trac11935_share_on_base_category.patch uses strong caching. Weak caching might be preferable.
Another thing is that this patch could be quite a bit more concise if we had another feature in cached functions: namely that we could specify that certain arguments should be ignored in the cache lookup; or more generally that we could specify a function that produces the key to be used in the cache lookup. Something like:
```
def key(x, l, option=1):
    return x, tuple(l)

@cached_function(key=key)
def foo(x, l, option=1):
    return (x, l, option)

sage: foo(1, [1,2], 3) is foo(1, (1,2), 2)
True
```
Better syntax welcome!
Cheers,
Nicolas
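The `key=` interface proposed above can be sketched in plain Python (hypothetical toy; not Sage's actual `cached_function` at the time of this discussion):

```python
from functools import wraps

def cached_function(key=None):
    """Toy caching decorator with the proposed ``key=`` option."""
    def decorator(f):
        cache = {}
        @wraps(f)
        def wrapper(*args, **kwds):
            if key is not None:
                k = key(*args, **kwds)
            else:
                # default: cache on the raw arguments (must be hashable)
                k = (args, tuple(sorted(kwds.items())))
            try:
                return cache[k]
            except KeyError:
                result = cache[k] = f(*args, **kwds)
                return result
        return wrapper
    return decorator

def key(x, l, option=1):
    # normalize the list and drop `option` from the cache key
    return x, tuple(l)

@cached_function(key=key)
def foo(x, l, option=1):
    return (x, l, option)

# the list/tuple distinction and the value of `option` are ignored
assert foo(1, [1, 2], 3) is foo(1, (1, 2), 2)
```

The key function both normalizes unhashable arguments (list to tuple) and excludes arguments that should not influence the cache lookup.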
comment:76 Changed 11 years ago by
Replying to SimonKing:
I just tested whether the changes I mentioned have a noticeable effect on the computation time for an example that does nothing but creating the parent classes for 2000 different categories (I made it so that the parent classes are distinct). I am afraid, it was not noticeable. So, I guess I can drop my suggestion.
Feel free to implement it anyway in a reviewer's patch. Addition is as readable, if not more, than using a format string.
comment:77 Changed 11 years ago by
Hi Nicolas,
I was about to create a reviewer patch. But I am afraid the same problem crept back in sage/rings/polynomial/multi_polynomial_sequence.py. In other words: When I have a weak cache on dynamic classes, then the double free occurs in crypto/mq/mpolynomialsystem.py, but when I have a strong cache, then the double free occurs in rings/polynomial/multi_polynomial_sequence.py (but again in a test involving polynomial systems!).
I have a preference for a weak cache, and since apparently the weak cache as such is not to blame for the error, I need to look somewhere else. Perhaps there is some `UniqueRepresentation` in mpolynomialsystem.py that needs a strong cache. If that turns out to be correct, then I will do the necessary change in #12215.
Apart from that: I will be away until Monday as well.
comment:78 Changed 10 years ago by
Too bad. Apparently the patch doesn't apply anymore (according to the patch bot). Apart from a possible rebasing: What needs to be done to get this behind us?
comment:79 Changed 10 years ago by
Apparently a lot of things need to be done. Nicolas, could you remind me (and the patchbot) of the patches to be applied? I have just updated the patches at #12876, so that they apply on top of sage-5.3.beta2 plus dependencies.
comment:80 Changed 10 years ago by
Dependencies: | #9138, #11900, #11943, #12875, #12876, #12877 → #9138, #11900, #11943, #12875, #12877 |
---|
Just a try to have the patchbot try this ticket without #12876!
comment:81 Changed 10 years ago by
Description: | modified (diff) |
---|
Changed 10 years ago by
Attachment: | trac11935_weak_pickling_by_construction-nt.patch added |
---|
comment:82 Changed 10 years ago by
Hi Simon,
I just went through the history of this ticket. At this point, I feel it's a reasonable choice to have the parent/element classes of a category over base depend only on the base.
The patch I just uploaded includes:
- The latest version of trac11935_weak_pickling_by_construction_rel11943-nt.patch in the Sage-Combinat queue (which was that from trac + some rebase + little improvements)
- trac11935_share_on_base_category.patch
- Doctests for the _make_named_class_key methods that were missing it
- Some doc minor improvements
With this patch applied on top of #12876, all long tests passed on my machine, and I still get the desired speedup for our standard elliptic curve benchmark:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.98 s, sys: 0.08 s, total: 3.06 s
Wall time: 3.17 s

sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.06 s, sys: 0.04 s, total: 2.10 s
Wall time: 2.20 s
From my point of view, it's good to go. To completely finish the review I just need someone to proofread the doctests for the _make_named_class_key methods. And you may want to glance through the patch.
Note: I have decided to fold the patches together because anyway trac11935_weak_pickling_by_construction_rel11943-nt.patch was changed in the meantime. Also, since this has been a long time, I guess it's easier to glance through and see the big picture. And it's not so long anyway.
Cheers,
Nicolas
comment:83 Changed 10 years ago by
Description: | modified (diff) |
---|---|
Reviewers: | → Nicolas Thiery, Travis Scrimshaw |
Hey Nicolas and Simon,
I've looked over the documentation. I've uploaded a small review patch (also to the combinat queue) which makes some minor tweaks to the doc and adds a few more tests to CategoryWithParameters._make_named_class(). If you're happy with my changes, then you can set this to positive review.
Best,
Travis
comment:84 follow-up: 98 Changed 10 years ago by
Hey Travis,
your changes look good to me. Probably I want to have a closer look at the big patch today. But if you think the big patch is fine, then it is OK.
Apply trac11935_weak_pickling_by_construction-nt.patch trac_11935-weak_pickling_by_construction-review-ts.patch
comment:85 Changed 10 years ago by
When applying the patches, sage -br gives me
----------------------------------------------------------------------
| Sage Version 5.9.rc0, Release Date: 2013-04-25                     |
| Type "notebook()" for the browser-based notebook interface.        |
| Type "help()" for help.                                            |
----------------------------------------------------------------------
**********************************************************************
*                                                                    *
* Warning: this is a prerelease version, and it may be unstable.     *
*                                                                    *
**********************************************************************
/home/simon/SAGE/prerelease/sage-5.9.rc0/local/lib/python2.7/site-packages/sage/categories/category.py:1140: UserWarning: Objects.HomCategory.ParentMethods should not have a super class
  warn("%s.%s should not have a super class"%(cls.__name__, method_provider))
**********************************************************************
Oops, Sage crashed. We do our best to make it stable, but...
A crash report was automatically generated with the following information:
- A verbatim copy of the crash traceback.
- A copy of your input history during this session.
- Data on your current Sage configuration.
It was left in the file named:
'/home/simon/.sage/ipython-0.12/Sage_crash_report.txt'
If you can email this file to the developers, the information in it will help them in understanding and correcting the problem.
You can mail it to: sage-support@googlegroups.com with the subject 'Sage Crash Report'.
If you want to do it now, the following command will work (under Unix):
mail -s 'Sage Crash Report' sage-support@googlegroups.com < /home/simon/.sage/ipython-0.12/Sage_crash_report.txt
To ensure accurate tracking of this issue, please file a report about it at:
http://trac.sagemath.org/sage_trac
comment:87 Changed 10 years ago by
Status: | needs_review → needs_work |
---|
I think it is "needs work". Without the two patches, sage starts fine. But with the first patch, it crashes.
comment:89 follow-up: 90 Changed 10 years ago by
Hey Simon,
Hmmm... it works for me on 5.10.beta3 with both patches applied. Although I'm using the version of Nicolas' patch in the combinat queue, there are no real differences between the two patches:

diff trac11935_weak_pickling_by_construction-nt.patch ~/Downloads/trac11935_weak_pickling_by_construction-nt.patch
4,5c4,5
< # Node ID 3045fe70aaf8fb0c0d9d3551944317ee0764c7da
< # Parent f5ad6657cff1a065b4192e84a96fca0bc597c2c7
---
> # Node ID 0e23d0ca3d0bb47caa234d9d1890f7c603f3ddbd
> # Parent f5ad6657cff1a065b4192e84a96fca0bc597c2c7
I don't quite understand the new patchbot logs yet, but have you tried with both patches applied, Simon? (Although I'd be somewhat surprised if my review patch unintentionally fixed something.)
Also I didn't look at the patch functionality, just the docstrings/tests since this is what Nicolas stated as needing to be reviewed.
comment:90 follow-up: 97 Changed 10 years ago by
Replying to tscrim:
Hmmm... it works for me on 5.10.beta3 with both patches applied. Although I'm using the version of Nicolas' patch in the combinat queue, there are no real differences between the two patches:

I suspect that there is a dependency that I did not apply. But in any case: the message "Objects.HomCategory.ParentMethods should not have a super class" just seems plain wrong to me. I believe that of course one should be able to allow super classes for parent methods.
comment:91 Changed 10 years ago by
Nicolas, can you elaborate why parent classes should have no super class? And do you have an idea what dependencies make this patch work? Certainly these are new dependencies, because the patchbot reports the failure with sage-5.10.beta3.
comment:92 Changed 10 years ago by
PS: Yes, Travis, I get the failure with both patches and with the first patch only.
comment:93 Changed 10 years ago by
Aha, the user warning about super classes has been introduced in the first patch. So, Nicolas, what is the rationale? I would actually expect that some parent classes will have a super class, namely Ring or Homset and so on.
comment:94 follow-up: 95 Changed 10 years ago by
PS: I think I have even posted a patch on some trac ticket in which I make Homsets use Sage's coercion model, so that their custom __call__ can be removed. But this will only work when named classes can be constructed with super classes.
comment:95 Changed 10 years ago by
comment:96 Changed 10 years ago by
Dependencies: | #9138, #11900, #11943, #12875, #12877 → #9138, #11900, #11943, #12875, #12876, #12877 |
---|
comment:97 Changed 10 years ago by
Replying to SimonKing:
Replying to tscrim:
Hmmm... it works for me on 5.10.beta3 with both patches applied. Although I'm using the version of Nicolas' patch in the combinat queue, there are no real differences between the two patches:

I suspect that there is a dependency that I did not apply.
Oops, sorry; I had tried last week to remove #12876 as dependency, and forgot to put it back into place. Fixed!
comment:98 follow-up: 99 Changed 10 years ago by
Hi Travis, Simon,
Replying to SimonKing:
your changes look good to me.
+1. Thanks Travis for tidying the documentation!
Probably I want to have a closer look at the big patch today. But if you think the big patch is fine, then it is OK.
Please! I think it's a good time for both of us to look over our shoulders and see what we think of the change with a good step back.
comment:99 follow-up: 101 Changed 10 years ago by
Replying to nthiery:
Please! I think it's a good time for both of us to look over our shoulders and see what we think of the change with a good step back.
Please don't forget to tell why you insist on not allowing parent or element classes to inherit from anything useful ;-P
comment:100 follow-up: 104 Changed 10 years ago by
Hi Simon!
Thanks for putting the rationale of this patch on trial!
Let me call "category class" the element/parent/... classes of categories. Here is my rationale for having them *not* inherit from anything (the fact that they did in a couple of exceptional cases has always been in my mind a temporary hack to ease the transition to categories).
- From an OO point of view, we want to keep category classes as close as possible to the model of a hierarchy of abstract classes providing generic code. Homset is a concrete class (in the sense that it provides a data structure, ...); so having a category class inherit from it deviates from this model.
- This makes the model simpler to explain: C.element_class contains code provided by C and its upper categories. Nothing else. The hierarchy of classes for parents / elements is perfectly parallel to that of categories.
- This allows us to give a clean description of the mro of a parent (or an element). First comes a sequence of concrete classes, related by single inheritance, that roughly speaking describe the data structure and basic methods of the object:
E.g. CombinatorialFreeModule -> Parent -> CategoryObject -> SageObject
Then come all the category classes that provide the generic algorithms. All multiple inheritance is concentrated in this part of the mro.
This leaves full freedom for organizing it to avoid conflicts (like in the upcoming C3 patch) or to do further optimizations. If a category class inherits from, say, Homset, then Homset will appear in the middle of the category classes in the mro, and this is more likely to create mro conflicts.
My theory is that we should always be able to avoid inheritance of category classes from anything else; and in fact that this should actually point us to cleaner design. Now I'd be more than happy to test this against concrete situations like #14279.
The logic in Hom is such that every homset is anyway constructed as an instance of the concrete class Homset or some of its subclasses (RingHomset, ...). Could you point me to the exact problem that arose when you worked on it?
Cheers,
Nicolas
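The mro shape Nicolas describes can be illustrated with plain-Python stand-ins (the class names below are hypothetical simplifications of the Sage classes mentioned above, not the real ones): a single-inheritance chain of concrete classes comes first, then the abstract category classes, with all multiple inheritance concentrated in the tail.

```python
# Concrete chain: data structure and basic methods, single inheritance only.
class SageObject: pass
class CategoryObject(SageObject): pass
class Parent(CategoryObject): pass
class CombinatorialFreeModule(Parent): pass

# Abstract category classes: generic algorithms only, no data structure.
class Modules_parent_class: pass
class Algebras_parent_class(Modules_parent_class): pass

# A concrete parent class: the concrete chain precedes all
# category classes in the mro.
class MyAlgebra(CombinatorialFreeModule, Algebras_parent_class): pass

mro_names = [c.__name__ for c in MyAlgebra.__mro__]
```

If Algebras_parent_class inherited from a concrete class such as Homset, that class would land in the middle of the category part of the mro, which is exactly the kind of conflict this rationale wants to rule out.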
comment:101 follow-up: 102 Changed 10 years ago by
Replying to SimonKing:
Please don't forget to tell why you insist on not allowing parent or element classes to inherit from anything useful ;-P
I was working on it :-)
By the way: this patch is likely to conflict with the cythonization of homsets, right? How bad do you expect it would be to rebase #14214 on top of this one?
Cheers,
Nicolas
comment:102 follow-up: 103 Changed 10 years ago by
comment:103 Changed 10 years ago by
comment:104 follow-ups: 105 106 Changed 10 years ago by
Earlier today, I wrote a long reply, but my laptop's battery seems to be damaged, and without a warning the computer shut down and the whole text was lost. Bad.
Replying to nthiery:
- This makes the model simpler to explain: C.element_class contains code provided by C and its upper categories. Nothing else. The hierarchy of classes for parents / elements is perfectly parallel to that of categories.
OK. But at some point, we want to mix the abstract class with a concrete class. For example, if P is a parent, we have P.element_class, which mixes an abstract class P.category().element_class with a concrete class P.Element.
The logic in Hom is such that every homset is anyway constructed as an instance of the concrete class Homset or some of its subclasses (RingHomset?, ...).
Exactly.
So, in your approach, where would this mix happen? Of course, if a homset H is initialised as an object of, say, Rings().hom_category(), and if the base class of H is a python class, then of course the class of H is changed into a sub-class of Rings().hom_category().parent_class.
But the first problem arises if the base class of H is an extension class. Then, H.__class__ can't be overridden, and thus inheritance from the abstract class is by the custom __getattr__ of Parent, which is slow compared with proper inheritance.
The second problem is in Hom(X,Y,category). First X._Hom_(Y,category) is attempted, but if this doesn't return anything useful, the abstract class C.hom_category().parent_class is instantiated:

H = category.hom_category().parent_class(X, Y, category = category)

But an abstract class should not be instantiated! This is exactly why I thought it would be better to mix the abstract and concrete classes by providing C.hom_category().ParentMethods with bases.
And even worse things happen in sage.categories.rings.Rings.HomCategory.ParentMethods.__new__, which is a very ugly hack that should be avoided IMHO.
If you don't like bases, what do you think of the following alternative model? I try to be as close to P.element_class (P being a parent) as possible.

Let C be a category. On the one hand, it shall be provided with a concrete class C.Homset defaulting to sage.categories.homset.Homset, which could be overridden with a more concrete sub-class such as sage.rings.homset.RingHomset_generic. This corresponds to P.Element.
As we have discussed already, if S is a subcategory of C, then we would not want that S.Homset is obtained from C.Homset, unless S is a full subcategory. But fortunately, when we set Rings().Homset = RingHomset_generic, then Algebras().Homset would still default to Homset, not RingHomset_generic.
On the other hand, we get an abstract class C.hom_category().parent_class. This corresponds to P.category().element_class.

I suggest that the concrete and abstract classes be mixed in a lazy attribute C.homset_class, in the same way as the concrete and abstract classes are mixed in P.element_class.
If I am not mistaken, we discussed this in one of my visits to Orsay. If I recall correctly, the only concern has been improper inheritance: When we have a non-full subcategory S of C, then S.homset_class should not inherit from C.homset_class.

But this, I think, is something that can be solved by making S.hom_category().super_categories() return the right thing. Namely, the concrete class S.Homset is not inherited from C.Homset anyway, and the abstract class S.hom_category().parent_class will be fine if and only if S.hom_category().super_categories() returns the right thing.
Final remark: In Hom(X,Y,category), it would still be possible for X._Hom_(Y,category) to do something special. But I believe that with the model I described it will be possible in most cases to rely on the generic (purely categorical) construction of the homset.
So, question: Do you think that this model is the way to go? It keeps abstract and concrete classes neatly apart, it will reduce the need to have custom X._Hom_(...), and we could get rid of the hack with __new__.

If you agree, then I'd say we try to finalise the patch here in the way you suggested, namely without allowing base classes. And in a second step (I think I have opened an according ticket 2 or 3 years ago) we could implement the model I sketched.
Could you point me to the exact problem that arose when you worked on it?

No. It has been a while, and I lost track, I am afraid.
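Simon's proposed homset_class can be sketched in plain Python (hypothetical stand-in classes below; in Sage the mixing would presumably go through dynamic_class and a lazy attribute): the concrete C.Homset and the abstract C.hom_category().parent_class are combined into one class, just as P.Element and P.category().element_class are combined into P.element_class.

```python
def mix_classes(concrete, abstract):
    # Combine a concrete base class with an abstract category class
    # into a single instantiable class.
    return type(concrete.__name__ + "_with_category", (concrete, abstract), {})

# Stand-in for the concrete class C.Homset:
class Homset:
    def __init__(self, domain, codomain):
        self.domain = domain
        self.codomain = codomain

# Stand-in for the abstract class C.hom_category().parent_class:
class HomCategory_parent_class:
    def is_homset(self):
        return True   # generic method provided by the hom category

homset_class = mix_classes(Homset, HomCategory_parent_class)
H = homset_class("X", "Y")   # instantiate the mix, never the abstract class
```

The point of the model is that the mix, not the abstract parent_class, is what gets instantiated.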
comment:105 Changed 10 years ago by
Replying to SimonKing:
So, question: Do you think that this model is the way to go? It keeps abstract and concrete classes neatly apart, it will reduce the need to have custom X._Hom_(...), and we could get rid of the hack with __new__.
I think I should first try to review #12876 (which is a dependency anyway). There, you state:
- Unified the logic for selecting the class when building a Homset (e.g. Homset, RingHomset, HeckeModuleHomspace, ...). This is now systematically done through the _Hom_ hook. The logic still has a fundamental flaw, but that's for the later #10668.
So, you want it to go through the _Hom_ hook---let me guess (I haven't read it yet): _Hom_ will play a similar role to Hom as _element_constructor_ plays to __call__...
comment:106 follow-up: 107 Changed 10 years ago by
Hi Simon!
Thanks for your work on this patch!
Replying to SimonKing:
Earlier today, I wrote a long reply, but my laptop's battery seems to be damaged, and without a warning the computer shut down and the whole text was lost. Bad.
Ouch. I got struck by this as well, and this is very annoying. I am now protecting myself from such loss by using the cool "It's all text" extension to firefox to edit the trac fields in my favorite editor, and making sure that the temporary file used for that actually lives in my home directory and not /tmp.
Replying to nthiery:

OK. But at some point, we want to mix the abstract class with a concrete class. For example, if P is a parent, we have P.element_class, which mixes an abstract class P.category().element_class with a concrete class P.Element.
Indeed, P.element_class and C.element_class have different status. The former is a concrete class, while the latter is an abstract class. I guess I should have been more specific: the hierarchy of *category* classes for parents (i.e. all the C.parent_class's) is perfectly parallel to that of categories (all the C's).
But the first problem arises if the base class of H is an extension class. Then, H.__class__ can't be overridden, and thus inheritance from the abstract class is by the custom __getattr__ of Parent, which is slow compared with proper inheritance.
Right, but that's no different than for other parents and elements. We have lived with it so far, and as long as Cython won't allow multiple inheritance from purely abstract classes we will have to. But there is hope: Robert mentioned several times that there is no theoretical barrier and that it just needs to be implemented :-)
The second problem is in Hom(X,Y,category). First X._Hom_(Y,category) is attempted, but if this doesn't return anything useful, the abstract class C.hom_category().parent_class is instantiated:

H = category.hom_category().parent_class(X, Y, category = category)
Not anymore! Line 306 of homset.py, with the patch applied, if no _Hom_ method is provided by the parent or the parent class, then Hom resorts to:
H = Homset(X, Y, category = category)
which creates H as an instance of the concrete base class Homset with the appropriate category.
And even worse things happen in sage.categories.rings.Rings.HomCategory.ParentMethods.__new__, which is a very ugly hack that should be avoided IMHO.
You mean in sage.categories.schemes, right?
Yikes. I had forgotten about this old hacky workaround I had introduced at the beginning of categories. I am pretty sure we can now get rid of it using an appropriate _Hom_ method in Schemes.ParentMethods. And indeed, this ticket would be a good time to get rid of it. I'll have a look!
If you don't like bases, what do you think of the following alternative model? ...
I agree that all _Hom_ methods are basically doing the same thing: specifying which concrete homset class to use. So your approach could remove a bit of duplication. On the other hand, the _Hom_ approach is flexible and has already been used for a while in Sage. And it's already implemented :-)
As for the homset class: we had indeed discussed this in Orsay, and the plan is described in #10668. Basically, as you point out, super_categories for Hom categories should be fixed so that C.HomCategory() is a super category of S.HomCategory() only if S is a full subcategory of C. Then nothing special needs to be done for abstract homset classes in categories; they are just regular abstract parent classes. On the other hand something needs to be done for morphisms, so that a morphism in S gets the code for morphisms of C. To implement this, we want to have C.morphism_class and associated MorphismMethods, with inheritance done properly.
That's actually partially implemented in the (disabled):
http://combinat.sagemath.org/patches/file/tip/category-hom_methods-nt.patch
and looked like it was going to work smoothly.
So, question: Do you think that this model is the way to go? It keeps abstract and concrete classes neatly apart, it will reduce the need to have custom X._Hom_(...), and we could get rid of the hack with __new__.

If you agree, then I'd say we try to finalise the patch here in the way you suggested, namely without allowing base classes. And in a second step (I think I have opened an according ticket 2 or 3 years ago) we could implement the model I sketched.
I'd say: as you suggest, let's finish the patch as is (hopefully with the __new__ removed). Then let's move on to the other functorial construction patch. Then cythonization of homsets / #10668 / ... Then, with the experience, if we find ourselves implementing _Hom_ methods too often, we can introduce a nicer protocol like the one you propose to only provide a class.
I guess one point is that I don't have a good view on whether we will have that many concrete classes for homsets (for morphisms, that's a different story!).
Could you point me to the exact problem that arose when you worked on it?

No. It has been a while, and I lost track, I am afraid.
Ok. Let me know if you stumble again on it!
Cheers,
Nicolas
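The #10668 plan mentioned above (hom category super categories only along full subcategories) can be modelled in a few lines of plain Python. The names and the full_in data below are illustrative, not Sage's API; the mathematical example is standard: every monoid morphism between groups automatically preserves inverses and so is a group morphism, making Groups a full subcategory of Monoids, whereas a plain map between monoids need not be a monoid morphism.

```python
class Category:
    def __init__(self, name, supers, full_in=()):
        self.name = name
        self.supers = supers            # direct super categories
        self.full_in = set(full_in)     # the supers in which self is full

    def hom_category_supers(self):
        # Homset code is inherited only along full subcategory relations.
        return [c for c in self.supers if c in self.full_in]

sets = Category("Sets", [])
monoids = Category("Monoids", [sets])                      # not full in Sets
groups = Category("Groups", [monoids], full_in=[monoids])  # full in Monoids
```

Under this rule, homsets of groups would inherit homset code from monoids, while homsets of monoids would inherit nothing from sets.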
comment:107 follow-up: 108 Changed 10 years ago by
Hi Nicolas,
Replying to nthiery:
But the first problem arises if the base class of H is an extension class. Then, H.__class__ can't be overridden, and thus inheritance from the abstract class is by the custom __getattr__ of Parent, which is slow compared with proper inheritance.

Right, but that's no different than for other parents and elements. We have lived with it so far, and as long as Cython won't allow multiple inheritance from purely abstract classes we will have to. But there is hope: Robert mentioned several times that there is no theoretical barrier and that it just needs to be implemented :-)
I don't see how multiple inheritance comes into play. What we want is to override the attribute __class__ of an instance. That's possible in Python, but impossible in Cython, and I don't think this can ever change.

So, it is not an option to use _initialise_category_ to enrich the class of a homset H, if H.__class__ is a cython class.
But what we can do, if I am not mistaken: Create a dynamic class that is constructed from one concrete base class (which can be a cythoned Homset, for example) and an abstract class (which can be C.hom_category().parent_class). And then we could create H as an instance of this dynamic class.
Not anymore! At line 306 of homset.py, with the patch applied, if no _Hom_ method is provided by the parent or the parent class, then Hom resorts to:

H = Homset(X, Y, category = category)

which creates H as an instance of the concrete base class Homset with the appropriate category.
Really? So, how does the abstract class come into play, then? I guess I should look at the patch...
And even worse things happen in sage.categories.rings.Rings.HomCategory.ParentMethods.__new__, which is a very ugly hack that should be avoided IMHO.

You mean in sage.categories.schemes, right?
No, in the location I stated. It is:
sage: sage.categories.rings.Rings.HomCategory.ParentMethods.__new__??
Type:           instancemethod
String Form:    <unbound method HomCategory.ParentMethods.__new__>
File:           /home/simon/SAGE/prerelease/sage-5.9.rc0/local/lib/python2.7/site-packages/sage/categories/rings.py
Definition:     sage.categories.rings.Rings.HomCategory.ParentMethods.__new__(cls, X, Y, category)
Source:
    def __new__(cls, X, Y, category):
        """
        sage: Hom(QQ, QQ, category = Rings()).__class__   # indirect doctest
        <class 'sage.rings.homset.RingHomset_generic_with_category'>
        sage: Hom(CyclotomicField(3), QQ, category = Rings()).__class__   # indirect doctest
        <class 'sage.rings.number_field.morphism.CyclotomicFieldHomset_with_category'>
        """
        from sage.rings.homset import RingHomset
        return RingHomset(X, Y, category = category)
If you agree, then I'd say we try to finalise the patch here in the way you suggested, namely without allowing base classes. And in a second step (I think I have opened an according ticket 2 or 3 years ago) we could implement the model I sketched.
I'd say: as you suggest, let's finish the patch as is (hopefully with the new removed). Then let's move on to the other functorial construction patch. Then cythonization of homsets / #10668 / ... Then, with the experience, if we find ourselves implementing _Hom_ methods too often, we can introduce a nicer protocol like the one you propose to only provide a class.
OK.
Best regards,
Simon
comment:108 follow-up: 109 Changed 10 years ago by
Replying to SimonKing:
I don't see how multiple inheritance comes into play. What we want is to override the attribute __class__ of an instance. That's possible in Python, but impossible in Cython, and I don't think this can ever change. So, it is not an option to use _initialise_category_ to enrich the class of a homset H, if H.__class__ is a cython class.
That's to be discussed with Robert, but I am pretty sure there will be a way out if we don't think too much in the box. Anyway, let's not worry about that for now.
But what we can do, if I am not mistaken: Create a dynamic class that is constructed from one concrete base class (which can be a cythoned Homset, for example) and an abstract class (which can be C.hom_category().parent_class). And then we could create H as an instance of this dynamic class.
A quick way is to define Homset as a Python subclass of the cythonized Homset. And then _initialize_category_ will do that for you.
Really? So, how does the abstract class comes into play, then?
Homset is a parent like all the others, and inherits code from its categories (in this case a HomCategory).
I guess I should look at the patch...
Please!
No, in the location I stated. It is:
sage: sage.categories.rings.Rings.HomCategory.ParentMethods.__new__??
Oh, right, I had forgotten about this one. This is gone with the patch; which is why I did not find it :-)
And if I was able to get rid of it, I don't see why this would not be the case for schemes as well.
Cheers,
Nicolas
comment:109 Changed 10 years ago by
Replying to nthiery:
And if I was able to get rid of it, I don't see why this would not be the case for schemes as well.
I just looked, and it seems like the __new__ in Schemes.HomCategory.ParentMethods is not actually used at all anymore. Since Schemes.HomCategory contains nothing else interesting, I am currently running all long tests with Schemes.HomCategory completely stripped away. If this works, I will post a little supplementary patch for #12876, where it belongs naturally since the same thing is done there for Rings.HomCategory.
Cheers,
Nicolas
comment:111 Changed 10 years ago by
Status: | needs_work → needs_review |
---|
If I am not mistaken, the work issue that made me change the status to "needs_work" was the fact that sage did not start at some point of the work. But I just tested: This is no problem any more.
Apply trac11935_weak_pickling_by_construction-nt.patch trac_11935-weak_pickling_by_construction-review-ts.patch
comment:112 Changed 10 years ago by
But who can review it? I guess as the author, I am only entitled to tell whether I am fine with the changes made by Nicolas and Travis.
So, what else can I do? #12895, or is there something more urgent?
comment:114 Changed 10 years ago by
I already reviewed your things. So if you are happy with my changes, then it's good to go!
Yes, #12895 is next on the list!
Thanks!
comment:115 Changed 10 years ago by
I think Nicolas' changes look fine, too. But let's wait for the tests to finish.
Changed 10 years ago by
Attachment: | trac_11935-weak_pickling_by_construction-review-ts.patch added |
---|
comment:116 Changed 10 years ago by
All tests had passed. However I've uploaded a tweaked review patch which changes the doctest line continuations to the new format.
Apply: trac11935_weak_pickling_by_construction-nt.patch trac_11935-weak_pickling_by_construction-review-ts.patch
comment:118 Changed 10 years ago by
Travis' changes look good. So, a final make ptestlong, and then we'll hopefully get it over with.
comment:119 Changed 10 years ago by
Status: | needs_review → positive_review |
---|
The patchbot finds that all tests pass. On my laptop, tests aren't finished yet, but nothing has failed so far.
The startup.modules script complains that now the inspect module is loaded. This is a bit surprising to me, because I thought that the inspect module is loaded during startup anyway (it is frequently used). But I think there is not much need to worry---and the startup-time plugin does not complain.
The doctest continuation plugin complains about the first patch, but I just verified that the second patch fixes it.
And given all the comments above, I hope I am now entitled to change the status to "positive_review"---complain if you disagree.
comment:121 follow-up: 122 Changed 10 years ago by
Milestone: | sage-5.10 → sage-5.11 |
---|
comment:122 follow-up: 123 Changed 10 years ago by
Hi Jeroen,
Replying to jdemeyer:
Milestone changed from sage-5.10 to sage-5.11
Ah shoot, it would have been helpful to have a clean base including those patches for the work on the subsequent category patches.
I assume you are in the process of cutting a rc0? Do you have some approximate schedule for 5.11?
Thanks!
Nicolas
comment:123 follow-up: 124 Changed 10 years ago by
Replying to nthiery:
Ah shoot, it would have been helpful to have a clean base including those patches for the work on the subsequent category patches.
Well, given that these kind of patches are notorious for breaking things, I don't want to merge this in an rc. This patch is scheduled for sage-5.11.beta1 (with #12876 in beta0).
Do you have some approximate schedule for 5.11?
What do you mean?
comment:124 follow-up: 125 Changed 10 years ago by
Replying to jdemeyer:
Replying to nthiery:
Ah shoot, it would have been helpful to have a clean base including those patches for the work on the subsequent category patches.
Well, given that these kind of patches are notorious for breaking things, I don't want to merge this in an rc.
Fair enough :-)
This patch is scheduled for sage-5.11.beta1 (with #12876 in beta0).
Good point; we only need to have them in a beta, not necessarily for a final release.
Do you have some approximate schedule for 5.11?
What do you mean?
Roughly speaking when do you foresee 5.11.beta1 to be out?
Thanks!
comment:125 Changed 9 years ago by
Merged in: | → sage-5.11.beta1 |
---|---|
Resolution: | → fixed |
Status: | positive_review → closed |
comment:126 Changed 9 years ago by
Reviewers: | Nicolas Thiery, Travis Scrimshaw → Nicolas M. Thiéry, Travis Scrimshaw |
---|
Concerning uniqueness of the parent class: In at least one case (namely Algebras(R)), the super categories depend on whether the base ring is a field or not. We would like to have

The idea is that the parent and element classes should only depend on the super categories, but otherwise should be independent from the base ring. Working at #11900, I found that this would drastically improve the performance of some elliptic curve computation.
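To close the loop with the ticket description, the sharing achieved here can be modelled in plain Python. This is a simplified sketch with hypothetical names; Sage's actual _make_named_class_key methods return the category of the base, which the toy model below approximates by the base's type.

```python
_named_class_cache = {}

class Category_over_base:
    def __init__(self, base):
        self.base = base

    def _make_named_class_key(self):
        # Only the *category* of the base matters, modelled here by its type.
        return type(self.base)

    @property
    def parent_class(self):
        key = (type(self), self._make_named_class_key())
        if key not in _named_class_cache:
            name = type(self).__name__ + ".parent_class"
            _named_class_cache[key] = type(name, (object,), {})
        return _named_class_cache[key]

class Algebras(Category_over_base):
    pass

# Stand-ins for GF(3), GF(5) and QQ:
class FiniteField:
    def __init__(self, p):
        self.p = p

class RationalField:
    pass

GF3, GF5, QQ = FiniteField(3), FiniteField(5), RationalField()
```

In this model Algebras(GF3).parent_class and Algebras(GF5).parent_class are the same class, while Algebras(QQ).parent_class is a different one, since the QQ stand-in lies in a different "category".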