Opened 9 years ago
Last modified 7 years ago
#11935 closed enhancement
Make parent/element classes independent of base rings — at Version 81
Reported by: | SimonKing | Owned by: | |
---|---|---|---|
Priority: | major | Milestone: | sage-5.11 |
Component: | categories | Keywords: | parent class, element class, category |
Cc: | jdemeyer, sage-combinat | Merged in: | |
Authors: | Simon King | Reviewers: | |
Report Upstream: | N/A | Work issues: | |
Branch: | Commit: | ||
Dependencies: | #9138, #11900, #11943, #12875, #12877 | Stopgaps: |
Description (last modified by )
At #11900 and sage-combinat-devel, as well as in some comments in sage/categories/category.py, the idea was discussed to make, for example, Algebras(GF(3)).parent_class == Algebras(GF(5)).parent_class - hence, to make the parent/element classes as independent of the base of a category as possible.
This is implemented in this patch by introducing an abstract class CategoryWithParameters, which uses pickling by "weak construction" for its element and parent classes. Now:
- For a join category, the parent/element classes depend only on the parent/element classes of its super categories.
- For a Category_over_base (e.g. Modules, Algebras, Schemes, ...), the parent/element classes depend only on the category of the base.
- For a bimodule, the parent/element classes depend only on the categories of the left and right bases.
In the process, this patch also:
- Adds a method Category._make_named_class providing a unified way to create parent and element classes (and later on morphism classes)
- Extends the interface of dynamic_class to customize caching and pickling
- Renames the experimental class IdempotentSemigroups.ElementMethods, removes its super class, and discards unused code there.
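For illustration, the shared-class idea described above can be modelled in a few lines of plain Python. This is a hypothetical toy, not Sage's actual implementation: the cache, the class names, and the use of the Python type of the base as a stand-in for "the category of the base" are all assumptions.

```python
# Hypothetical toy model of CategoryWithParameters: parent classes are
# cached under a key that ignores the concrete base and keeps only its
# "category" (modelled here by the Python type of the base).
_named_class_cache = {}

class CategoryWithParameters:
    def _make_named_class(self, name):
        # unified construction of parent/element (and later morphism) classes
        key = (type(self), name, self._make_named_class_key(name))
        if key not in _named_class_cache:
            cls_name = "%s.%s" % (type(self).__name__, name)
            _named_class_cache[key] = type(cls_name, (object,), {})
        return _named_class_cache[key]

    @property
    def parent_class(self):
        return self._make_named_class("parent_class")

class Algebras(CategoryWithParameters):
    def __init__(self, base):
        self.base = base

    def _make_named_class_key(self, name):
        return type(self.base)  # stand-in for "the category of the base"

# bases of the same kind yield one shared parent class:
assert Algebras(3).parent_class is Algebras(5).parent_class
assert Algebras(3).parent_class is not Algebras(3.0).parent_class
```

In Sage itself the key is computed from the category of the base (or the categories of both bases, for bimodules), but the caching pattern is the same.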
Apply
Change History (87)
comment:1 Changed 9 years ago by
comment:2 follow-ups: ↓ 3 ↓ 4 Changed 9 years ago by
A problem may be seen in pickling. Before explaining the problem, let me remark that I don't see a big concern for "pickling a parent class": What we actually want to pickle is a parent, not just a naked class. The serialisation data of a polynomial ring, for example, will comprise the base ring, the generator names and the term order, but certainly the class of the polynomial ring will not be part of the pickle.
However, if we do want to serialise a naked parent or element class, we have the following problems:
Currently, C.parent_class is pickled via (getattr, (C, "parent_class")). The pickling data (hence C) is part of the cache key of a dynamic class. With that, the parent classes of two different categories C1 and C2 can't be the same.
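The pickling-by-getattr scheme can be modelled in plain Python. This is a hypothetical sketch (Sage's dynamic_class machinery is far more involved); the names Cat and ParentClass are invented stand-ins.

```python
import pickle

class Cat:
    """Stand-in for a category; pickles by construction."""
    def __init__(self, name):
        self.name = name
    def __reduce__(self):
        return (Cat, (self.name,))
    @property
    def parent_class(self):
        # recreated on demand after unpickling
        if "_pc" not in self.__dict__:
            self._pc = ParentClass(self)
        return self._pc

class ParentClass:
    """Stand-in for C.parent_class; pickled as getattr(C, 'parent_class')."""
    def __init__(self, category):
        self.category = category
    def __reduce__(self):
        return (getattr, (self.category, "parent_class"))

P = Cat("Algebras(QQ)").parent_class
P2 = pickle.loads(pickle.dumps(P))
assert P2.category.name == "Algebras(QQ)"   # reconstructed via its category
```

Because the category C is baked into the reduction data, two distinct categories necessarily produce distinct picklable classes; this is exactly the obstruction discussed above.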
I see three approaches to get rid of this:
1. Remove the pickling data from the cache key of dynamic classes.
2. Make pickling of C.parent_class just rely on the default way of pickling a dynamic class.
3. Work around the cache of dynamic classes, but still use (getattr, (C, "parent_class")) for pickling.
I think 1. is not gonna happen. It would break a lot of code, I suppose.
I had tested 2. and 3. while working on #11900. Both would work, but there are different concerns regarding long-term stability.
Approach 2. means: The pickle of a parent class will only depend on the category graph as it was at the time of pickling. If the category graph changes between pickling and unpickling the parent class, you would get a different class.
Approach 3. would be a bit more stable.
The idea is:
(i) In the lazy attribute parent_class(), the dynamic class is first created without providing the reduction data (as in approach 2.).
(ii) Before returning that dynamic class, it is tested whether the reduction data is still None. If it is, the (getattr, (C, "parent_class")) thingy is inserted.
Consequence: Algebras(QQ).parent_class could, for example, be unpickled as Algebras(GF(2)).parent_class - which is not a big problem, since we want them to be the same. However, if in a distant future we want them to be different again, we'd be in trouble...
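Approach 3. can be sketched in miniature. The names below (weak_dynamic_class, _reduction, Category) are invented for illustration; the point is only the two-step mechanism: look up the class by bases alone, and attach the "by construction" pickle data only on a cache miss.

```python
# Miniature of approach 3 (hypothetical names, not Sage code): the class
# cache is keyed on (name, bases) only; the pickle-by-construction data is
# attached only when the class is freshly created, so later categories
# silently reuse the first owner's reduction data.
_dynamic_cache = {}

def weak_dynamic_class(name, bases, owner):
    key = (name, bases)
    if key in _dynamic_cache:
        return _dynamic_cache[key]             # cached: keep old reduction
    cls = type(name, bases, {})
    cls._reduction = (getattr, (owner, name))  # step (ii): set only once
    _dynamic_cache[key] = cls
    return cls

class Category:
    def __init__(self, label):
        self.label = label
    @property
    def parent_class(self):
        return weak_dynamic_class("parent_class", (object,), self)

C1 = Category("Algebras(QQ)")
C2 = Category("Algebras(GF(2))")
assert C1.parent_class is C2.parent_class          # one shared class ...
assert C1.parent_class._reduction[1][0] is C1      # ... unpickling via C1
```

This reproduces the consequence described above: the shared class would always unpickle through whichever category created it first.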
I suggest that I create patches for both 2. and 3., and then people can tell what they think about it. The method resolution will then be taken care of by another patch.
comment:3 in reply to: ↑ 2 ; follow-up: ↓ 5 Changed 9 years ago by
Replying to SimonKing:
A problem may be seen in pickling. Before explaining the problem, let me remark that I don't see a big concern for "pickling a parent class":
True: all parents ought to be pickled "by construction" rather than by "class + internal data", in order to encapsulate as much as possible of the data structure. This probably ought to be true as well for elements. I don't know how far we are from this.
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there. And why. I don't know how complicated it is to do this search though.
Among the three propositions, I like 3 best. I have trouble evaluating how big the risks are to get stuck in the future. It does not seem too big.
Thanks Simon for investigating this!
comment:4 in reply to: ↑ 2 Changed 9 years ago by
- Description modified (diff)
- Summary changed from Make parent/element classes independent of base rings and the category graph consistent with method resolution to Make parent/element classes independent of base rings
Replying to SimonKing:
I suggest that I create patches for both 2. and 3., and then people can tell what they think about it. The method resolution will then be taken care of by another patch.
I just argued myself into splitting the ticket: This here will be for the base ring independent parent/element classes, and another one will be for method resolution order.
comment:5 in reply to: ↑ 3 ; follow-up: ↓ 8 Changed 9 years ago by
Replying to nthiery:
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there.
Old pickles will not break, I believe. Let P3 = Algebras(GF(3)).parent_class and PQ = Algebras(QQ).parent_class. Here are a few scenarios:
1. P3 and PQ have been created and pickled with an old version of Sage
In the old version of Sage, P3 != PQ. They are pickled by construction. Hence, after applying my yet-to-be-submitted patches, they are unpickled as Algebras(GF(3)).parent_class and Algebras(QQ).parent_class - which is the same class after applying the patch.
Conclusion: An old pickle will not break, with either approach 2. or 3. The worst that could happen is P3 != PQ before pickling and P3 == PQ after unpickling. But the defining property P3 == Algebras(GF(3)).parent_class and PQ == Algebras(QQ).parent_class is preserved.
2. P3 and PQ are created and pickled according to approach 2. from above
Of course, P3 == PQ at the time of pickling. The pickle will only depend on the parent classes of the super categories of Algebras(GF(3)) and Algebras(QQ). If there was a change in the super categories between pickling and unpickling, we would have P3 != Algebras(GF(3)).parent_class after unpickling.
Conclusion: A new pickle of P3 and PQ can be unpickled after a change in the category graph, but a change in the category graph will destroy the defining property P3 == Algebras(GF(3)).parent_class.
3. P3 and PQ are created and pickled according to approach 3. from above
Of course, P3 == PQ at the time of pickling. PQ and P3 will both be pickled by construction. In particular, a change in the category graph would not matter, as long as the super categories of Algebras(QQ) and Algebras(GF(3)) still coincide (up to the base ring) after the change in the category graph.
A problem would arise if, in a distant future, the super categories of Algebras(QQ) and Algebras(GF(3)) became essentially different. Say, some algebra person finds that vector spaces over fraction fields should have their own category, different from the usual category of vector spaces. Then, Algebras(QQ).parent_class != Algebras(GF(3)).parent_class. In particular, after such a change, unpickling P3 or PQ would result in either PQ != Algebras(QQ).parent_class or P3 != Algebras(GF(3)).parent_class.
Conclusion: A new pickle of P3 and PQ can be unpickled after a change in the category graph. Most changes in the category graph will preserve the defining properties P3 == Algebras(GF(3)).parent_class and PQ == Algebras(QQ).parent_class after unpickling. However, if the super categories come to depend on the base ring in a different way than they do now, then either P3 or PQ will lose its defining property after unpickling, while the other will keep it - and we don't know which of the two will break.
It seems to me that approach 3. is less fragile than 2. But I believe that in applications (hence, for pickling parents) both should be fine. So, I'll prepare patches for both approaches.
Changed 9 years ago by
Use default pickling for parent/element classes, making them base ring independent.
comment:6 Changed 9 years ago by
- Status changed from new to needs_review
The patch that I just attached implements approach 2., hence, it uses the default pickling of dynamic classes. By consequence, the parent class of a category C will only depend on C.ParentMethods and on the parent classes of the super categories of C, but it will only depend on the base ring of C if the base ring changes the super categories (which holds for algebras, e.g.).
Note the effect on the computation time for elliptic curves. With sage-4.7.2.alpha3 plus #11900, we have
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.97 s, sys: 0.07 s, total: 4.04 s
Wall time: 4.18 s
but with the new patch on top of it, we have
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.11 s, sys: 0.06 s, total: 3.17 s
Wall time: 3.31 s
Changed 9 years ago by
Use a weak form of "pickling by construction" for parent and element classes of categories
comment:7 Changed 9 years ago by
The second patch is posted as well. It implements approach 3. Hence, the parent_class lazy attribute works around the cache of dynamic classes (by not providing unpickling information when the class is created), inserting the unpickling information only when the class has not been found in cache.
By consequence, when first creating Algebras(QQ).parent_class, its cache key as a dynamic class only comprises the parent classes of the super categories. Before returning it, the unpickling data (by construction) is added. When Algebras(GF(3)).parent_class is created later, it is found in the cache of dynamic classes and immediately returned. The class returned will thus be unpickled as Algebras(QQ).parent_class.
Similar to the first patch, it considerably speeds up the elliptic curve computations:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 3.05 s, sys: 0.07 s, total: 3.12 s
Wall time: 3.27 s
Apply only one of the two patches, at your choice!
comment:8 in reply to: ↑ 5 ; follow-up: ↓ 9 Changed 9 years ago by
Replying to SimonKing:
Replying to nthiery:
A good thing to do at this point would be to search through the sage pickle jar for how many parent_class's and element_class's are pickled there.
Old pickles will not break, I believe.
This is my belief too! Sorry if I have been unclear, but that was not the point of my suggestion. What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data". If they already are pickled by construction, then how C.parent_class and C.element_class are pickled is mostly irrelevant, now and in the future, since it is seldom used.
Thanks in any case for your detailed analysis!
It seems to me that approach 3. is less fragile than 2.
+1
Let's see if someone else has some preference between the two implementations.
Cheers,
Nicolas
comment:9 in reply to: ↑ 8 ; follow-up: ↓ 10 Changed 9 years ago by
Replying to nthiery:
What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data".
I see. If I am not mistaken, if it is pickled by "class+data", then copy_reg._reconstructor is called at unpickling. Perhaps it is possible to modify _reconstructor, so that it writes data to some log file. In that way, we could find out how often it is actually used.
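The logging idea can be sketched in modern Python, where copy_reg is named copyreg. This is a diagnostic sketch under stated assumptions, not what was actually run on the ticket: a class without a custom __reduce__, pickled at protocol <= 1, unpickles through copyreg._reconstructor, which is looked up by name at load time, so a temporarily installed wrapper sees it.

```python
import copyreg
import pickle

class Plain:
    """No custom __reduce__: protocol <= 1 unpickles via copyreg._reconstructor."""
    def __init__(self):
        self.x = 1

data = pickle.dumps(Plain(), protocol=1)

seen = []
_original = copyreg._reconstructor

def logging_reconstructor(cls, base, state):
    seen.append(cls.__name__)   # record every "class + data" unpickle
    return _original(cls, base, state)

copyreg._reconstructor = logging_reconstructor
try:
    obj = pickle.loads(data)    # _reconstructor is resolved by name here
finally:
    copyreg._reconstructor = _original

assert seen == ["Plain"]
assert obj.x == 1
```

Writing to a log file instead of a list would give exactly the census of "class + data" pickles proposed above.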
comment:10 in reply to: ↑ 9 ; follow-up: ↓ 11 Changed 9 years ago by
Replying to SimonKing:
Replying to nthiery:
What I wanted to know is whether currently most parents and elements are pickled by construction or by "class + data".
I see. If I am not mistaken, if it is pickled by "class+data", then copy_reg._reconstructor is called at unpickling. Perhaps it is possible to modify _reconstructor, so that it writes data to some log file. In that way, we could find out how often it is actually used.
Yup. Or run explain_pickle on all pickles, and grep for element_class / parent_class.
comment:11 in reply to: ↑ 10 Changed 9 years ago by
Replying to nthiery:
Yup. Or run explain_pickle on all pickles, and grep for element_class / parent_class.
I don't know about explain_pickle. Where can I find it?
I am now running sage -testall -long with the _reconstructor log. So far, only ZODB.fsIndex.fsIndex and matplotlib.font_manager.FontEntry use it, but we will see if there's more.
comment:12 Changed 9 years ago by
sage -testall -long succeeded (with the second patch applied, but it would probably work with the first one as well), and copy_reg._reconstructor was only used on <class 'matplotlib.font_manager.FontEntry'>, <class 'ZODB.fsIndex.fsIndex'> and <class 'sage.misc.explain_pickle.EmptyNewstyleClass'>.
Hence, it indicates that "pickling by class and data" does not occur for parents. But, to be on the safe side, one should inspect the pickle jar using explain_pickle.
comment:13 Changed 9 years ago by
- Cc jdemeyer added
- Dependencies changed from #11900 to #9138, #11900
comment:14 Changed 9 years ago by
- Description modified (diff)
sage -testall -long passes for either patch. So, unless we find bad surprises in the pickle jar, both approaches should be fine. I am slightly in favour of approach 3.
comment:15 Changed 9 years ago by
- Dependencies changed from #9138, #11900 to #9138 #11900 #11943
- Description modified (diff)
I decided to make this ticket depend on #11943, for two reasons: Firstly, it is rather clear that #11943 is a good idea, while I am less sure here (it is good for speed, but may under very particular circumstances break new pickles). Secondly, #11943 seems less invasive than the patch here.
I think that the "potentially breaking new pickles in a distant future" aspect is less urgent with the "weak pickling by construction" approach. Hence, I have only updated the second patch, the first can now be disregarded.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:16 follow-up: ↓ 17 Changed 9 years ago by
- Status changed from needs_review to needs_work
- Work issues set to Fix doctest in covariant functorial construction
comment:17 in reply to: ↑ 16 ; follow-up: ↓ 18 Changed 9 years ago by
Replying to SimonKing:
For some reason, #11935 together with #11943 results in one doctest error, namely in the "FooBars" test of sage/categories/covariant_functorial_construction.py. So, it needs work.
Here is the problem:
sage: from sage.categories.covariant_functorial_construction import CovariantConstructionCategory
sage: class FooBars(CovariantConstructionCategory):
...     _functor_category = 'FooBars_function'
...
sage: def FooBars_function(X): return FooBars(X)
...
sage: C = FooBars(ModulesWithBasis(QQ))
sage: import __main__; __main__.FooBars_function = FooBars_function
sage: __main__.FooBars = FooBars
sage: Category.FooBars_function = FooBars_function
Now, requesting C.parent_class results in a complaint regarding "duplicate base class FooBars.parent_class". Indeed, with the patch from here, we have
sage: CA = FooBars(AdditiveMagmas())
sage: CM = FooBars(Magmas())
sage: CA.parent_class == CM.parent_class
True
even though we have
sage: Magmas().parent_class is AdditiveMagmas().parent_class
False
So, I was overdoing it: With my patch, the parent class not only becomes independent of a base ring, but also it becomes independent of the base category of a covariant functorial construction - and this is not what we want.
The point is that CA and CM above have the same super categories, namely FooBars(Sets()). But with my patch, parent classes are the same if both the super categories and the ParentMethods are the same. The ParentMethods are different for Magmas() and AdditiveMagmas(), but they are the same for FooBars(Magmas()) and FooBars(AdditiveMagmas()).
I wonder how this should be solved.
One possibility is that sage.categories.covariant_functorial_construction overrides the parent_class and element_class lazy attributes that are defined in sage.categories.category.
comment:18 in reply to: ↑ 17 ; follow-up: ↓ 19 Changed 9 years ago by
Hi Simon!
Replying to SimonKing:
So, I was overdoing it: With my patch, the parent class not only becomes independent of a base ring, but also it becomes independent of the base category of a covariant functorial construction - and this is not what we want.
The point is that CA and CM above have the same super categories, namely FooBars(Sets()). But with my patch, parent classes are the same if both the super categories and the ParentMethods are the same. The ParentMethods are different for Magmas() and AdditiveMagmas(), but they are the same for FooBars(Magmas()) and FooBars(AdditiveMagmas()).
I wonder how this should be solved.
One possibility is that sage.categories.covariant_functorial_construction overrides the parent_class and element_class lazy attributes that are defined in sage.categories.category.
Ah ah, interesting challenge!
So, the question is whether we want to put in the specifications:
Constraint (C1): above any category in the category hierarchy, there should be a one-to-one correspondence between categories and their parent class. In particular, C.all_super_categories() and C.parent_class.mro() should exactly match.
For C(...) a parametrized category, let me call (O) the optimization of having C(...).parent_class depend only on C.ParentMethods and the parent classes of the super categories of C(...).
If we want (C1), then we indeed have to be careful with parametrized categories. In particular, it seems to me (I haven't written a proof though :-)) that optimization (O) can only be used for a parametrized category C(...) if it is further guaranteed that:
Constraint (C2): no parent is simultaneously in two instances C(A) and C(B) of C.
(C2) seems reasonable for a Category_over_base_ring (a parent should have a unique distinguished base ring). Maybe it would make sense as well for a Category_over_base. Your example above shows that we don't want it in general, and in particular not for covariant constructions.
This calls for Category to not implement (O) by default, and Category_over_base_ring to override parent_class and element_class to implement (O).
Another option would be to drop (C1) altogether: it seems like a nice optimization to reduce the number of classes in the mro whenever possible (e.g. when categories do not provide ParentMethods). But then we face an interesting poset/lattice problem, namely whether the C3 algorithm is (or can be made) compatible with taking certain subposets.
This is quite related to the kind of optimizations I do in #10963, so I'd love to have a brainstorm about that; but we'd better have it face to face when you come over. In any case, I vote for imposing (C1) in this ticket, and thinking about removing this constraint, if at all possible, in a later ticket. Both for not making this ticket even more complicated and for reducing conflicts with #10963.
Cheers,
Nicolas
comment:19 in reply to: ↑ 18 Changed 9 years ago by
Hi Nicolas,
Replying to nthiery:
If we want (C1), then we indeed have to be careful with parametrized categories. In particular, it seems to me (I haven't written a proof though :-)) that optimization (O) can only be used for a parametrized category C(...) if it is further guaranteed that:
Constraint (C2): no parent is simultaneously in two instances C(A) and C(B) of C.
(C2) seems reasonable for a Category_over_base_ring (a parent should have a unique distinguished base ring). Maybe it would make sense as well for a Category_over_base.
That would indeed be a possibility. My first reaction was "The problem arose in covariant functorial construction, thus, use (O) by default, but let covariant functorial constructions override it with the non-optimised approach".
But after all, the subject of this ticket is "Make parent/element classes independent of base rings". So, your suggestion fits 100% into the scope of this ticket.
However, it is not all that easy. For example, by #9138, the category of a univariate polynomial ring is a JOIN of a category with base (namely commutative algebras) and a base-free category (Euclidean domains or so). The join category has a base() method, due to one of the patches at #11900, but it does not inherit from sage.categories.category_types.Category_over_base.
In other words: if we use optimisation (O) only on sage.categories.category_types.Category_over_base, then we would not see any speedup in elliptic curve computations.
Here are a few scenarios:
- Use (O) by default, but do not use it on covariant functorial constructions. The question is whether this is consistent.
- Do not use (O) by default, but use it on Category_over_base. Problem: We would not see a speed-up.
- Do not use (O) by default, but use it on every category that has a base method. That includes Category_over_base, but also (by #11900) any category that is a sub-category of a category with base.
The problem with the third approach is: Testing whether a category is a sub-category of a category with base is expensive. Thus, it is possible that the regression (due to this test) is bigger than the speed-up obtained from optimisation (O).
comment:21 follow-ups: ↓ 22 ↓ 23 Changed 9 years ago by
- Milestone set to sage-4.8
I think the following could be a solution:
- Do not use optimization for Category.parent_class. Hence, the default is the good old "pickle by construction" approach.
- Add a specialised JoinCategory.parent_class that uses the default pickling of a dynamic class (which means: the class is uniquely determined by the base classes). Rationale: A join category is uniquely determined by its super categories, and thus it is consistent if the parent class of a join category is uniquely determined by the parent classes of its super categories.
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach. Rationale: It's the purpose of this ticket to make the parent class independent of the base ring, and "weak pickling by construction" seems the most stable option.
Apparently, the problem with functorial constructions would vanish - they use the non-optimized old parent_class. But we would get a speed-up where we need it: Polynomial rings belong to a join category, and one super category of that join category is a Category_over_base.
Of course, the same should be done with the element_class.
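The JoinCategory rule proposed above can be sketched as a toy: the join's parent class is a plain dynamic class cached by its bases alone, so two joins of the same super categories share it automatically. All names below are hypothetical stand-ins, not Sage code.

```python
# Toy version of the JoinCategory rule (hypothetical names): the parent
# class is determined purely by the parent classes of the super categories,
# which is exactly the default caching of a dynamic class.
_join_cache = {}

def join_parent_class(super_parent_classes):
    key = tuple(super_parent_classes)
    if key not in _join_cache:
        # no pickle data attached: name + bases determine the class
        _join_cache[key] = type("JoinCategory.parent_class", key, {})
    return _join_cache[key]

class RingsParent: pass    # stand-ins for the super categories'
class ModulesParent: pass  # parent classes

J1 = join_parent_class([RingsParent, ModulesParent])
J2 = join_parent_class([RingsParent, ModulesParent])
assert J1 is J2                     # same bases, same class
assert issubclass(J1, RingsParent) and issubclass(J1, ModulesParent)
```

Since a join category is uniquely determined by its super categories, keying on the bases alone loses nothing, and no "by construction" pickle data needs to be built.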
comment:22 in reply to: ↑ 21 Changed 9 years ago by
Replying to SimonKing:
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach.
... and to Bimodules as well. They have two bases, but they do not inherit from Category_over_base.
comment:23 in reply to: ↑ 21 ; follow-up: ↓ 24 Changed 9 years ago by
Replying to SimonKing:
I think the following could be a solution:
- Do not use optimization for Category.parent_class. Hence, the default is the good old "pickle by construction" approach.
- Add a specialised JoinCategory.parent_class that uses the default pickling of a dynamic class (which means: the class is uniquely determined by the base classes). Rationale: A join category is uniquely determined by its super categories, and thus it is consistent if the parent class of a join category is uniquely determined by the parent classes of its super categories.
- Add a specialised Category_over_base.parent_class using the optimization (O) discussed above, in the "weak pickling by construction" approach. Rationale: It's the purpose of this ticket to make the parent class independent of the base ring, and "weak pickling by construction" seems the most stable option.
Apparently, the problem with functorial constructions would vanish - they use the non-optimized old parent_class. But we would get a speed-up where we need it: Polynomial rings belong to a join category, and one super category of that join category is a Category_over_base.
.Of course, the same should be done with the element_class.
This sounds very good.
Just a suggestion: in Category, you may want to put a lazy attribute _parent_class_from_base_classes, so that the categories that want the optimization (like Modules or Bimodules) can just do parent_class = _parent_class_from_base_classes (same for element_class, of course).
comment:24 in reply to: ↑ 23 Changed 9 years ago by
Replying to nthiery:
Just a suggestion: in Category, you may want to put a lazy attribute _parent_class_from_base_classes, so that the categories that want the optimization (like Modules or Bimodules) can just do parent_class = _parent_class_from_base_classes (same for element_class of course)
Excellent idea!
comment:25 Changed 9 years ago by
The patch is not yet ready for publication, but in L = EllipticCurve('960d1').prove_BSD() I see a speedup of nearly 25% compared with #11900+#11943!
So, according to your suggestion, I added a _parent_class_from_bases and _element_class_from_bases to Category, and use them for categories over base and for bimodules.
However, there is a slight problem: You can not simply define parent_class = _parent_class_from_bases if you want to have a real lazy attribute. Namely, parent_class would believe that its name is _parent_class_from_bases:
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar
....:
sage: f = Foo
sage: f.bar.__name__
'_bar'
In particular, it is not as fast as it should be. With the not-submitted patch:
sage: C = Modules(GF(3), dispatch=False)
sage: %timeit p = C.parent_class
625 loops, best of 3: 69.4 µs per loop
sage: %timeit p = C._parent_class_from_bases
625 loops, best of 3: 440 ns per loop
But it might be possible to work around.
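Why the alias stays slow can be seen with a minimal pure-Python model of a lazy attribute (an assumption for illustration; Sage's lazy_attribute is more elaborate): the computed value is cached in the instance __dict__ under the wrapped function's name, so the alias never gets its own fast path.

```python
class lazy_attribute:
    """Minimal model: cache the value in the instance __dict__ under the
    wrapped function's name, then let ordinary attribute lookup find it."""
    def __init__(self, f):
        self.f = f
        self.__name__ = f.__name__
    def __get__(self, obj, cls):
        if obj is None:
            return self
        value = self.f(obj)
        obj.__dict__[self.__name__] = value  # always stored under "_bar"
        return value

class Foo:
    @lazy_attribute
    def _bar(self):
        return 5
    bar = _bar   # alias keeps the original name

assert Foo.bar.__name__ == "_bar"
f = Foo()
assert f.bar == 5
# the cache landed under "_bar", so f.bar keeps going through the descriptor:
assert "_bar" in f.__dict__ and "bar" not in f.__dict__
```

Every access to f.bar therefore recomputes (or at best re-dispatches) instead of hitting the cached instance attribute, matching the timings above.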
comment:26 follow-up: ↓ 29 Changed 9 years ago by
Concerning lazy attributes: I wonder whether one could add a method rename(name) to a lazy attribute. That method would return a copy of the original lazy attribute, but with a new name.
Then, the example above would become
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar.rename("bar")
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
comment:27 Changed 9 years ago by
- Dependencies changed from #9138 #11900 #11943 to #9138 #11900 #11943 #11999
I created #11999 for the possibility to rename lazy attributes, and make it a dependency.
comment:28 follow-up: ↓ 30 Changed 9 years ago by
- Status changed from needs_work to needs_review
- Work issues Fix doctest in covariant functorial construction deleted
Done!
The current patch preserves the default parent_class and element_class for categories. In particular, there is no problem with the covariant functorial constructions.
According to Nicolas' idea, I added a _parent_class_from_bases and _element_class_from_bases. They use the "weak pickling-by-construction" approach discussed above, because that seems to minimize the probability of breaking new pickles in a distant future. Just to emphasize: Old pickles will still work.
These two new lazy attributes override parent_class and element_class for Category_over_base (which was made possible by #11999). By consequence, the parent classes of all vector spaces (over different base fields) coincide. The parent classes of algebras over fields coincide, but differ from the parent class of algebras over a non-field.
Moreover, the parent class of a join category only depends on the parent classes of its super categories. Here, I use the default pickling of dynamic classes.
Rationale for using the default pickling in the case of join categories:
- The creation of a dynamic class becomes slightly faster when you don't need to worry about pickling.
- A join category is uniquely determined by its super categories. So, it is safe to make the parent class uniquely determined by the bases (which is precisely the default pickling of dynamic classes).
With the new patch, all long tests pass. Moreover, I have the following timing:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.95 s, sys: 0.08 s, total: 3.03 s
Wall time: 3.20 s
That is a speed-up of about 20% compared with unpatched sage-4.7.2.alpha2!
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:29 in reply to: ↑ 26 ; follow-up: ↓ 31 Changed 9 years ago by
Replying to SimonKing:
Concerning lazy attributes: I wonder whether one could add a method rename(name) to a lazy attribute. That method would return a copy of the original lazy attribute, but with a new name.
Then, the example above would become
sage: class Foo(object):
....:     @lazy_attribute
....:     def _bar(self):
....:         return 5
....:     bar = _bar.rename("bar")
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
For what's its worth, a potential variant:
sage: class Foo(object):
....:     @lazy_attribute(name="bar")
....:     def _bar(self):
....:         return 5
....:     bar = _bar
....:
sage: f = Foo
sage: f.bar.__name__
'bar'
In our use case, we want the lazy attribute parent_class_from_bases to be called parent_class whenever it's used by subclasses of Category.
Cheers,
Nicolas
comment:30 in reply to: ↑ 28 Changed 9 years ago by
Replying to SimonKing:
With the new patch, all long tests pass. Moreover, I have the following timing:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.95 s, sys: 0.08 s, total: 3.03 s
Wall time: 3.20 s
That is a speed-up of about 20% compared with unpatched sage-4.7.2.alpha2!
Congrats!
comment:31 in reply to: ↑ 29 ; follow-up: ↓ 32 Changed 9 years ago by
Replying to nthiery:
For what's its worth, a potential variant:
sage: class Foo(object):
....:     @lazy_attribute(name="bar")
You mean: Add an optional argument "name" to the lazy attribute? I was thinking about that, too.
Using it means that (in the example above) f._bar would result in assigning f.__dict__["bar"] = 5 (note: "bar", not "_bar", even though the lazy attribute was requested as "_bar").
So, when we do
@lazy_attribute(name="parent_class")
def _parent_class_from_bases(self):
    return bla
then the following would happen, where C is a category:
- C._parent_class_from_bases would write its result into C.parent_class, thus potentially overriding an already existing parent class. Do we want that such a dangerous thing can happen?
- On the bright side, we could easily do
class Foo(Category):
    parent_class = _parent_class_from_bases
without renaming, because the name is already right.
If you think this should be added, I can easily do so on #11999.
comment:32 in reply to: ↑ 31 ; follow-up: ↓ 33 Changed 9 years ago by
Replying to SimonKing:
You mean: Add an optional argument "name" to the lazy attribute? I was thinking about that, too.
Yup.
Using it means that (in the example above) f._bar would result in assigning f.__dict__["bar"] = 5 (note: "bar", not "_bar", even though the lazy attribute was requested as "_bar").
Hmm, this smells indeed. I am not sure. At this point, I am wondering if we don't want instead to introduce a new subclass:
class CategoryWithClassesFromBases(Category):  # TODO: find a better name
with the two optimized parent_class / element_class (and possibly in the future morphism_class / category_class), and have:
class Category_over_base_ring(CategoryWithClassesFromBases):
    ...
class JoinCategory(CategoryWithClassesFromBases):
    ...
Sorry, I should have though about this option earlier ...
Cheers,
Nicolas
comment:33 in reply to: ↑ 32 ; follow-up: ↓ 34 Changed 9 years ago by
Hi Nicolas,
Replying to nthiery:
Replying to SimonKing:
class CategoryWithClassesFromBases(Category): # TODO: find a better name
Perhaps CategoryEnsemble? The name is short and refers to the fact that the members of an ensemble (such as all categories of algebras over fields) are sufficiently similar that they share important features (such as their parent class).

Or (slightly longer) CategoryWithParameters? Again, if you have parameters (such as one or two base rings), certain important features may not depend on the parameters. Or ParametrizedCategory, but I think you prefer the name to start with the word "Category", don't you?
class JoinCategory(CategoryWithClassesFromBases): ...
I think join categories should have their own parent class. Reason:
The parent/element classes of a category ensemble are pickled by a weak form of "pickling by construction": If C1,C2,... belong to an ensemble of categories that share their parent class, then that class will be pickled as getattr(C,'parent_class'), where C is any member of the ensemble.
But I think that JoinCategory should not use that "pickling by weak construction". Reason: for JoinCategory, "pickling by weak construction" is equivalent to the default pickling of dynamic classes (which is: the class is determined by name, bases, and potentially a new class such as ParentMethods, which is empty in the case of a JoinCategory). Hence, it would be a waste of time to construct the pickle data for a JoinCategory while it is created.
I will certainly test both approaches. But if I remember correctly what I did yesterday, the difference between "pickling by weak construction" or "default pickling" for join categories was 6% in the infamous elliptic curve benchmark.
comment:34 in reply to: ↑ 33 Changed 9 years ago by
- Dependencies changed from #9138 #11900 #11943 #11999 to #9138 #11900 #11943
I have updated my patch according to what we have discussed.
- I added a subclass CategoryWithParameters of the Category class. Of course, if you have a shorter yet more descriptive name, I can change that. I believe CategoryWithClassesFromBases is too long and not clearer than CategoryWithParameters. And I think CategoryEnsemble is not clear either.
- Category_over_base, JoinCategory and Bimodules all inherit from the new class.
- Pickling is by weak construction: a parent class P is pickled via the reduction (getattr, (C, 'parent_class')), where C is any category such that C.parent_class is P at the time of pickling. We had discussed advantages and disadvantages of this and other approaches.
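The "weak construction" recipe can be sketched in plain Python (illustrative names only; this is not Sage's actual implementation). The point is that the dynamically created class is never serialized itself; instead the pickle records "take the attribute 'parent_class' of category C":

```python
import pickle

# Toy category whose parent_class stands in for a shared dynamic class.
class Category(object):
    def __init__(self, name):
        self.name = name

    def __reduce__(self):
        # the category itself pickles by construction
        return (Category, (self.name,))

    @property
    def parent_class(self):
        # in Sage this would be a dynamically created class;
        # a plain marker object is enough for the sketch
        return ("parent_class of", self.name)

C = Category("Algebras")
# the reduction tuple used for "pickling by weak construction"
recipe = (getattr, (C, "parent_class"))
func, args = pickle.loads(pickle.dumps(recipe))
assert func(*args) == C.parent_class
```

Unpickling re-evaluates `getattr(C, 'parent_class')`, so whatever class the category exposes at unpickling time is reused, which is exactly the "weak" part of the scheme.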
Using the new class, the patch becomes independent of #11999. Of course, I still think that #11999 is a nice addition, but your suggestion to use a sub-class is better.
Replying to SimonKing:
I will certainly test both approaches. But if I remember correctly what I did yesterday, the difference between "pickling by weak construction" or "default pickling" for join categories was 6% in the infamous elliptic curve benchmark.
Here I was mistaken: With the new patch, the benchmark becomes
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.87 s, sys: 0.05 s, total: 2.92 s
Wall time: 3.10 s
and this is as fast as by using default pickling for parent classes of join categories.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:35 Changed 9 years ago by
Sorry, it seems that I had forgotten to hg qrefresh. Now, the patch is really updated...
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:36 Changed 9 years ago by
Gosh, where is my brain? The previous patch still contained references to _parent_class_from_bases. It is now removed. But I think I need a break.
Apply trac11935_weak_pickling_by_construction_rel11943.patch
comment:37 Changed 9 years ago by
- Status changed from needs_review to needs_work
- Work issues set to introduce _make_named_class
A suggestion from Nicolas: the creation of the parent_class and of the element_class follows the same logic: we take the corresponding classes of the super categories as bases, and add the methods that are available in the attribute ParentMethods or ElementMethods.

So, it seems reasonable to write a new method that implements that logic; parent_class and element_class would then both simply call that method. Originally, we suggested the name _make_member_class for that method, because it creates a class for some member (object, element of an object, or in the future morphism) of the category. But meanwhile I prefer the name _make_named_class, because the parameter is indeed a name. So, roughly like this:
class Category:
    ...
    def _make_named_class(self, name, methods_holder):
        # the default logic

    @lazy_attribute
    def parent_class(self):
        return self._make_named_class("parent_class", "ParentMethods")

    @lazy_attribute
    def element_class(self):
        return self._make_named_class("element_class", "ElementMethods")
Then, CategoryWithParameters only needs to override one thing, namely _make_named_class.
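As an executable toy model of this factoring (not Sage's actual code; class and method names are simplified), one helper builds both classes by combining the super categories' classes with the methods found in a "methods holder" such as ParentMethods:

```python
class Category(object):
    def __init__(self, supers=()):
        self.supers = supers

    def _make_named_class(self, name, method_provider):
        # bases: the corresponding classes of the super categories
        bases = tuple(getattr(cat, name) for cat in self.supers) or (object,)
        # extra methods: whatever the holder class provides
        holder = getattr(self, method_provider, None)
        attrs = ({k: v for k, v in vars(holder).items()
                  if not k.startswith('__')} if holder else {})
        return type(type(self).__name__ + '.' + name, bases, attrs)

    @property
    def parent_class(self):
        return self._make_named_class('parent_class', 'ParentMethods')

    @property
    def element_class(self):
        return self._make_named_class('element_class', 'ElementMethods')

class Sets(Category):
    class ParentMethods:
        def is_set(self):
            return True

P = Sets().parent_class
assert P().is_set() is True
```

Both `parent_class` and `element_class` reduce to one call of the shared helper, which is the whole point of `_make_named_class`.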
comment:38 Changed 9 years ago by
- Description modified (diff)
- Status changed from needs_work to needs_review
- Work issues introduce _make_named_class deleted
I have implemented the new method _make_named_class (with "strong pickling by construction" for Category and "weak pickling by construction" for CategoryWithParameters). For me, all tests pass.
And, for the record:
sage: %time L = EllipticCurve('960d1').prove_BSD()
CPU times: user 2.88 s, sys: 0.04 s, total: 2.92 s
Wall time: 3.06 s
Apply trac11935_weak_pickling_by_construction_rel11943.patch trac11935_named_class.patch
comment:39 Changed 9 years ago by
- Status changed from needs_review to needs_work
- Work issues set to Rebase wrt the new version of #11900; ideas behind Category_singleton need to be modified if parent classes are shared
comment:40 Changed 9 years ago by
- Status changed from needs_work to needs_review
- Work issues Rebase wrt the new version of #11900; ideas behind Category_singleton need to be modified if parent classes are shared deleted
I had to rebase the first patch because of changes in #11943. It is now updated. The second patch did not need to change.
The timing is still good. I am now running the tests, but I think it can be needs review now.
Apply trac11935_weak_pickling_by_construction_rel11943.patch trac11935_named_class.patch
comment:41 Changed 9 years ago by
FWIW: Tests pass.
comment:42 Changed 8 years ago by
For the record: all tests pass on 5.0 beta10 with the version of the patch on the Sage-Combinat queue (basically the two patches folded together).
comment:43 Changed 8 years ago by
I finished my review. From my point of view it's good to go!
Now I just realized that I had apparently already folded in some of my reviewer's changes; sorry. Since the patch is not so long, I guess it's not so bad; I also folded in my latest little changes.
comment:44 Changed 8 years ago by
- Description modified (diff)
comment:45 Changed 8 years ago by
I guess we can delete the other intermediate patches once the review is finished.
Happy easter!
comment:46 follow-up: ↓ 47 Changed 8 years ago by
All tests seem to pass, except maybe for some fairly trivially failing tests in:
sage -t -force_lib "devel/sage/sage/categories/algebras.py"
sage -t -force_lib "devel/sage/sage/categories/modules_with_basis.py"
sage -t -force_lib "devel/sage/sage/categories/category.py"
sage -t -force_lib "devel/sage/sage/misc/preparser.py"
But that might be due to a couple other patches above in my queue (including #11943) when I ran the tests. I will investigate on Tuesday unless you beat me to it.
comment:47 in reply to: ↑ 46 Changed 8 years ago by
Replying to nthiery:
All tests seem to pass, except maybe for some fairly trivially failing tests in ... But that might be due to a couple other patches above in my queue (including #11943) when I ran the tests. I will investigate on Tuesday unless you beat me to it.
Ok, only the failure in algebras.py was due to this patch (I had forgotten to update one of the subcategory hooks for the change NotImplemented -> Troolean). This is fixed with the updated patch I just posted.
Cheers,
Nicolas
comment:48 Changed 8 years ago by
- Status changed from needs_review to needs_work
- Work issues set to Rebase rel 5.0.beta13
The patch won't apply.
applying trac11935_weak_pickling_by_construction_rel11943.patch
patching file sage/categories/bimodules.py
Hunk #1 FAILED at 10
Hunk #3 succeeded at 118 with fuzz 1 (offset 0 lines).
1 out of 3 hunks FAILED -- saving rejects to file sage/categories/bimodules.py.rej
patching file sage/categories/category.py
Hunk #4 FAILED at 1083
Hunk #5 FAILED at 1114
Hunk #7 succeeded at 1632 with fuzz 2 (offset -157 lines).
Hunk #8 succeeded at 1686 with fuzz 2 (offset -179 lines).
2 out of 8 hunks FAILED -- saving rejects to file sage/categories/category.py.rej
patching file sage/categories/category_types.py
Hunk #1 FAILED at 14
1 out of 3 hunks FAILED -- saving rejects to file sage/categories/category_types.py.rej
patch failed, unable to continue (try -v)
patch failed, rejects left in working directory
errors during apply, please fix and refresh trac11935_weak_pickling_by_construction_rel11943.patch
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage$ hg qapplied
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12313-mono_dict-combined.patch
trac11935_weak_pickling_by_construction_rel11943.patch
comment:49 Changed 8 years ago by
- Status changed from needs_work to needs_review
Arrgh, me stupid! I had #11943 in my queue, but after #11935. With
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage$ hg qapplied
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12313-mono_dict-combined.patch
trac11943_mro_for_all_super_categories_lazy_hook.patch
trac11943_mro_for_all_super_categories_lazy_hook-review-nt.patch
trac11935_weak_pickling_by_construction_rel11943.patch
everything is fine.
comment:50 Changed 8 years ago by
- Work issues Rebase rel 5.0.beta13 deleted
comment:51 follow-up: ↓ 52 Changed 8 years ago by
- Cc sage-combinat added
- Dependencies changed from #9138 #11900 #11943 to #9138, #11900, #11943, #12875, #12876, #12877
- Description modified (diff)
Hi Simon,
I reworked the patch by adding features to dynamic_class in order to avoid logic duplication and encapsulation breaking in make_named_class.
The downside is that this makes this ticket depend on #12876 (ensuring that parent/element classes are purely abstract).
All tests should pass on 5.0.beta13, except for the two issues I mentioned in #12876. Oh, and one trivial failure I had forgotten in semigroup_cython.pyx. I'll update the patch later (tonight?) but you can start the review.
I folded the two patches to get a better overview. You can access the differential patch by looking up http://combinat.sagemath.org/patches/file/3121811e2ebe/trac11935_weak_pickling_by_construction_rel11943-review-nt.patch.
Cheers,
Nicolas
Changed 8 years ago by
comment:52 in reply to: ↑ 51 Changed 8 years ago by
Replying to nthiery:
All tests should pass on 5.0.beta13, except for the two issues I mentioned in #12876. Oh, and one trivial failure I had forgotten in semigroup_cython.pyx. I'll update the patch later (tonight?) but you can start the review.
The updated patch fixes the failure in semigroup_cython.pyx. If I am not mistaken, all tests pass on 5.0.beta14 (that is, any failures I have seen should be due to the fact that I did not activate #715 in the Sage-Combinat queue, because it forces recompiling too much stuff).
Cheers,
Nicolas
Changed 8 years ago by
comment:53 follow-up: ↓ 55 Changed 8 years ago by
Hi Simon,
While working on #12895, I got a non-trivial time regression, due to the large number of constructed categories for which creating yet another *_class was non-negligible. Investigating this with runsnake made me turn back to this ticket: too many categories are created for nothing.

It indeed seems a bit wasteful that constructing the parent class of Algebras(GF(5)) requires reconstructing the whole hierarchy of super categories above Algebras(GF(5)) (e.g. Modules(GF(5)), ...). With the updated patch, Algebras(K).parent_class directly reuses Algebras(L).parent_class if it already exists and if K and L have the same category. The super_categories method of Algebras(K) is not even called.
To achieve this, each subcategory of CategoryWithParameters should provide a method _make_named_class_key specifying what the parent_class (and friends) depends on. For example, Category_over_base specifies that the parent_class depends only on the category of the base. Then, _make_named_class uses that to do a lookup in a cache.
For our typical benchmark:
%time L = EllipticCurve('960d1').prove_BSD()
the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the time goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
One fine point is that e.g. Algebras(ZZ) and Algebras(ZZ['x']) don't share the same parent class anymore, since ZZ and ZZ['x'] don't have the same category.
What do you think? Could you have a brief look at the experimental trac11935_share_on_base_category.patch I just attached? If it sounds reasonable to you, I'll finalize it (doctests, ...), and fold it in my reviewer's patch.
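The proposed lookup can be sketched in plain Python (hypothetical names, not the actual patch code): `_make_named_class` consults a cache keyed by whatever `_make_named_class_key` returns, so two categories with equivalent parameters share one class without the super categories ever being computed:

```python
_named_class_cache = {}

class CategoryWithParameters(object):
    def _make_named_class_key(self, name):
        # subclasses say what the named class may depend on
        raise NotImplementedError

    def _make_named_class(self, name):
        key = (type(self), name, self._make_named_class_key(name))
        try:
            return _named_class_cache[key]
        except KeyError:
            cls = type('%s.%s' % (type(self).__name__, name), (object,), {})
            _named_class_cache[key] = cls
            return cls

class Algebras(CategoryWithParameters):
    def __init__(self, base):
        self.base = base

    def _make_named_class_key(self, name):
        # the parent class should depend only on the *category* of the
        # base; the Python type of the base stands in for that here
        return type(self.base)

assert Algebras(3)._make_named_class('parent_class') is \
       Algebras(5)._make_named_class('parent_class')
```

With a key of `type(self.base)`, `Algebras(3)` and `Algebras(5)` share one parent class, while a base of a different type would get its own, mirroring the `ZZ` vs. `ZZ['x']` distinction above.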
comment:54 follow-up: ↓ 57 Changed 8 years ago by
Hi Nicolas!
What patches are supposed to be applied, currently? Only trac11935_weak_pickling_by_construction_rel11943-nt.patch (which I am now testing), or trac11935_share_on_base_category.patch as well?
comment:55 in reply to: ↑ 53 ; follow-up: ↓ 59 Changed 8 years ago by
Replying to nthiery:
It indeed sounds a bit like a waste, to construct the parent class of Algebras(GF(5)) to have to reconstruct all the hierarchy of super categories above Algebras(GF(5)) (e.g. Modules(GF(5)), ...). With the updated patch, Algebras(K).parent_class directly reuses Algebras(L).parent_class if it already exists and if K and L have the same category. The super_categories method of Algebras(K) is not even called.
I was thinking of that, too. But it would only work if we rely on the assumption that the list of super_categories of a category with base only depends on the category of the base. Can we? Then, to the very least, that assumption must be clearly stated somewhere.
To achieve this, each subcategory of CategoryWithParameters? should provide a method _make_named_class_key specifying on what the parent_class (and friends) depend on. For example, Category_over_base specifies that parent_class depends only on the category of the base. Then, _make_named_class uses that to do a lookup in a cache.
For our typical benchmark:
%time L = EllipticCurve('960d1').prove_BSD()the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the times goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
I am a bit confused. What is the "subcategory patch"? Is it "share_on_base_category"? And what patches are applied for the four different timings?
One fine point is that e.g. Algebras(ZZ) and Algebras(ZZx?) don't share the same parent class anymore, since ZZ and ZZx? don't have the same category.
Sure, but I don't think that is necessarily bad.
What do you think? Could you have a brief look at the experimental trac11935_share_on_base_category.patch I just attached? If it sounds reasonable to you, I'll finalize it (doctests, ...), and fold it in my reviewer's patch.
I am currently running tests without it. But I am now reading it.
comment:56 follow-up: ↓ 60 Changed 8 years ago by
- Owner changed from nthiery to (none)
Looking at the docs of sage.structure.dynamic_class, I see that the docs of DynamicClasscallMetaclass are broken. Shall I fix it here (in yet another reviewer patch) or leave it to a different ticket?
comment:57 in reply to: ↑ 54 Changed 8 years ago by
Replying to SimonKing:
Hi Nicolas!
What patches are supposed to be applied, currently? Only trac11935_weak_pickling_by_construction_rel11943-nt.patch (which I am now testing), or trac11935_share_on_base_category.patch as well?
Both patches for the experimental feature of having the parent class depend only on the base ring.
comment:58 Changed 8 years ago by
Apparently, the broken documentation of DynamicClasscallMetaclass comes from the __init__ method of NestedClassMetaclass. So, it would be better to provide proper documentation. Just a short note.
While the tests are running, I noticed
sage -t --long -force_lib devel/sage/sage/combinat/crystals/tensor_product.py [16.3 s]
sage -t --long -force_lib devel/sage/sage/plot/complex_plot.pyx [21.5 s]
*** glibc detected *** python: double free or corruption (fasttop): 0x0000000003e99500 ***
sage -t --long -force_lib devel/sage/sage/combinat/backtrack.py [15.1 s]
Where does that come from? Here, I work without the share_on_base_category.
comment:59 in reply to: ↑ 55 Changed 8 years ago by
Replying to SimonKing:
I was thinking of that, too. But it would only work if we rely on the assumption that the list of super_categories of a category with base only depends on the category of the base. Can we?
I am indeed not sure about making that assumption for any Category_over_base (it is not clearly defined what a base is!). On the other hand, this seems quite reasonable to me for Category_over_base_ring. This makes e.g. Algebras(...) consistent with the other functorial constructions categories which depend only on the base category.
This also goes in the direction of what we had discussed: we could actually make Algebras(...) be a functorial construction, so that we could define C=Algebras(Fields()) and have Algebras(R) be basically an alias for C for every field. And similarly for PolynomialRings(Fields()), ...

Note that we could possibly change this ticket to leave Category_over_base alone, and have only Category_over_base_ring derive from CategoryWithParameters. Do you foresee examples of a plain Category_over_base where sharing parent classes would be important performance-wise?
Then, to the very least, that assumption must be clearly stated somewhere.
YES
For our typical benchmark:
%time L = EllipticCurve('960d1').prove_BSD()the time on my machine goes from 4s down to 3.5s. With the subcategory patch, the times goes down from 7s to 3.75s. This makes the subcategory patch acceptable.
I am a bit confused. What is the "subcategory patch"?
I meant #12895.
And what patches are applied for the four different timings?
With and without share_on_base_category and with and without #12895.
Cheers,
Nicolas
comment:60 in reply to: ↑ 56 Changed 8 years ago by
Replying to SimonKing:
Looking at the docs of sage.structure.dynamic_class, I see that the docs of
DynamicClasscallMetaclass
is broken. Shall I fix it here (in yet another reviewer patch) or leave it to a different ticket?
If it is a simple fix, go ahead.
comment:61 Changed 8 years ago by
Apart from the strange glibc problem (that is not reported at the end of the test suite), I get one timeout, namely sage -t --long -force_lib devel/sage/sage/crypto/mq/mpolynomialsystem.py.
Aha! And it turns out that the glibc comes from that test!! Here is what I get in detail:
king@mpc622:/mnt/local/king/SAGE/stable/sage-5.0.beta13$ ./sage -t --verbose -force_lib devel/sage/sage/crypto/mq/mpolynomialsystem.py
sage -t --verbose -force_lib "devel/sage/sage/crypto/mq/mpolynomialsystem.py"
Trying: set_random_seed(0L)
Expecting nothing
ok
Trying: change_warning_output(sys.stdout)
Expecting nothing
ok
Trying: sr = mq.SR(Integer(2),Integer(1),Integer(2),Integer(4),gf2=True,polybori=True)###line 26:_sage_ >>> sr = mq.SR(2,1,2,4,gf2=True,polybori=True)
Expecting nothing
ok
Trying: sr###line 27:_sage_ >>> sr
Expecting:
SR(2,1,2,4)
ok
Trying: set_random_seed(Integer(1))###line 33:_sage_ >>> set_random_seed(1)
Expecting nothing
ok
Trying: F,s = sr.polynomial_system()###line 34:_sage_ >>> F,s = sr.polynomial_system()
Expecting nothing
*** glibc detected *** python: double free or corruption (fasttop): 0x0000000003b79b40 ***
^CAborting further tests.
KeyboardInterrupt -- interrupted after 2.3 seconds!
So, these are very few commands. That should be reproducible (trying it a bit later).
For reference: The problem does not occur with
trac_12808-classcall_speedup-fh.patch
trac_12808_nested_class_cython.patch
trac_12808-classcall_cdef.patch
trac12215_weak_cached_function.patch
trac12215_segfault_fixes.patch
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12875-category-fix_abvar_homspace-nt.patch
trac_12877-category-for_more_rings_and_schemes-nt.patch
trac_12876_category-fix_abstract_class-nt-rel11521.patch
trac_12876-reviewer.patch
trac_12876_category-fix_abstract_class-nt-rel11521-review-nt.patch
trac9107_nesting_nested_classes.patch
trac11768_source_of_dynamic_class.patch
trac11768_docfix.patch
trac11817_question_mark_using_sage_getdoc.patch
trac11791_dynamic_metaclass_introspection.patch
trac11943_mro_for_all_super_categories_combined.patch
but does occur if one adds trac11935_weak_pickling_by_construction_rel11943-nt.patch.
comment:62 Changed 8 years ago by
I can not reproduce the problem on the command line. Too bad.
sage: set_random_seed(0L)
sage: change_warning_output(sys.stdout)
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
/mnt/local/king/SAGE/stable/sage-5.0.beta13/devel/sage-main/<ipython console> in <module>()
NameError: name 'change_warning_output' is not defined
sage: sr = mq.SR(2,1,2,4,gf2=True,polybori=True)
sage: sr
SR(2,1,2,4)
sage: set_random_seed(1)
sage: F,s = sr.polynomial_system()
comment:63 follow-up: ↓ 72 Changed 8 years ago by
I still can't reproduce the problem. Too bad. But I have another comment. With the patch, one has in sage/categories/category.py:
def _make_named_class(...):
    ...
    else:
        # Otherwise, check XXXMethods
        import inspect
        assert inspect.isclass(method_provider_cls),\
            "%s.%s should be a class"%(type(self).__name__, method_provider)
That seems suboptimal to me:
sage: import inspect
sage: def test1(self, cls):
....:     import inspect
....:     assert inspect.isclass(cls),\
....:         "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: def test2(self, cls):
....:     assert inspect.isclass(cls),\
....:         "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: def test3(self, cls):
....:     if not inspect.isclass(cls):
....:         raise AssertionError, "%s.%s should be a class"%(type(self).__name__, repr(cls))
....:
sage: test2(ZZ,ZZ.__class__)
sage: %timeit test1(ZZ,ZZ.__class__)
625 loops, best of 3: 4.45 µs per loop
sage: %timeit test2(ZZ,ZZ.__class__)
625 loops, best of 3: 2.67 µs per loop
sage: %timeit test3(ZZ,ZZ.__class__)
625 loops, best of 3: 2.63 µs per loop
test1 is as in your code: inspect is imported, and (at least it seems to me) the error message is created even if the error does not occur. test2 does not import inspect again, while test3 is an attempt to not create the error message.
To my surprise, creating the error message seems to be essentially for free. But one should really import inspect top-level, I think.
comment:64 Changed 8 years ago by
I wonder: why is it tested in _make_named_class whether method_provider_cls is a class? Shouldn't that be the job of dynamic_class, which is called in the following lines?
comment:65 follow-up: ↓ 73 Changed 8 years ago by
Since method_provider_cls is not being checked in dynamic_class, I guess it is fine to test it in _make_named_class.
I notice
self._super_categories
return dynamic_class(class_name,
                     tuple(getattr(cat,name) for cat in self._super_categories),
                     method_provider_cls,
                     prepend_cls_bases = False,
                     doccls = doccls,
                     reduction = (getattr, (self, name)),
                     cache = cache)
i.e., first an empty call to self._super_categories. I guess that can be erased?
comment:66 Changed 8 years ago by
Some other little speed-up:
sage: def test1(cls,name):
....:     s = "%s.%s"%(cls.__name__, name)
....:
sage: def test2(cls,name):
....:     s = cls.__name__+'.'+name
....:
sage: timeit("test1(ZZ.__class__,'element_class')", number=10000)
10000 loops, best of 3: 2.15 µs per loop
sage: timeit("test2(ZZ.__class__,'element_class')", number=10000)
10000 loops, best of 3: 1.85 µs per loop
So, adding strings seems to be faster than inserting into a format string.
comment:67 follow-up: ↓ 76 Changed 8 years ago by
I just tested whether the changes I mentioned have a noticeable effect on the computation time for an example that does nothing but creating the parent classes for 2000 different categories (I made it so that the parent classes are distinct). I am afraid, it was not noticeable. So, I guess I can drop my suggestion.
comment:68 Changed 8 years ago by
Concerning the glibc problem: I made no progress in tracking it down. I just know: When I trace all Python function calls then the problem vanishes, and when I temporarily disable garbage collection then it vanishes as well.
Hence, probably there is some object that is deallocated too early, which makes me wonder whether it is related to the memleak fixes from #715, #11521 and #12215.
Nicolas, can you confirm the problem? You didn't mention it yet.
Also, I think I should try to test only the patches that are really needed: In addition to the dependencies (and the dependencies of the dependencies) of this ticket, I have #9107, #11768, #11817 and #11791 applied. I should test whether removing any of it would make the problem disappear.
comment:69 Changed 8 years ago by
OK, removing the additional patches did not help.
comment:70 Changed 8 years ago by
Hooray, #12215 is to blame (which doesn't have a positive review, yet). In #12215, dynamic classes become weakly cached.
The aim of #12215 is to avoid a memory leak that was partially caused by creating many different parent classes. Here, the same problem is solved in a different way, namely by avoiding that many different parent classes are created in the first place.
Hence, I am now trying whether it helps to apply #12215, but with the cache of dynamic classes reverted to a strong cache.
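The difference between the two caching strategies is easy to see in plain Python (illustrative sketch, not Sage's dynamic_class cache): a weak-value cache lets a dynamically created class be garbage-collected once nothing else references it, while a strong cache keeps it alive forever.

```python
import gc
import weakref

weak_cache = weakref.WeakValueDictionary()
strong_cache = {}

def make_dynamic_class(name, cache):
    # return the cached class, creating it on a cache miss
    try:
        return cache[name]
    except KeyError:
        cls = type(name, (object,), {})
        cache[name] = cls
        return cls

make_dynamic_class("Foo.parent_class", weak_cache)
make_dynamic_class("Foo.parent_class", strong_cache)
# classes live in reference cycles, so collect explicitly
gc.collect()
assert "Foo.parent_class" not in weak_cache   # was collected
assert "Foo.parent_class" in strong_cache     # still alive
```

With the weak cache the class can be deallocated while something (here, the doctest run) still expects it to exist, which matches the "deallocated too early" symptom above.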
comment:71 follow-up: ↓ 74 Changed 8 years ago by
Yippie! Returning to a strong cache for dynamic classes does indeed help! I am now running the test suite, because it is conceivable that the change makes part of the memleak re-appear.
comment:72 in reply to: ↑ 63 Changed 8 years ago by
Replying to SimonKing:
That seems suboptimal to me: ... test1 is as in your code: inspect is imported, and (at least it seems to me) the error message is created even if the error does not occur. test2 does not import inspect again, while test3 is an attempt to not create the error message.
To my surprise, creating the error message seems to be essentially for free. But one should really import inspect top-level, I think.
According to: http://docs.python.org/reference/simple_stmts.html
assert expression1, expression2
is equivalent to:
if __debug__:
    if not expression1:
        raise AssertionError(expression2)
This means in particular that expression2 is *not* evaluated if expression1 holds. So we don't need to worry about efficiency there.
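A quick plain-Python check of this semantics (the msg helper is hypothetical, just there to detect evaluation):

```python
evaluated = []

def msg():
    # records that the message expression was evaluated
    evaluated.append(True)
    return "boom"

assert 1 == 1, msg()          # assertion holds: msg() is NOT called
assert evaluated == []

try:
    assert 1 == 2, msg()      # assertion fails: msg() IS called
except AssertionError as e:
    assert str(e) == "boom"
assert evaluated == [True]
```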
That being said, yes, the import inspect should be at the toplevel! Thanks for catching this.
Are you in the process of writing a reviewer's patch, or do you want me to do the change? Note: I am flying to Montreal tomorrow, and my wife arrives in one hour from a two weeks mission; so you probably won't hear much from me until Sunday.
Cheers,
Nicolas
comment:73 in reply to: ↑ 65 Changed 8 years ago by
Replying to SimonKing:
Since method_provider_cls is not being checked in dynamic_class, I guess it is fine to test it in _make_named_class.
I notice
self._super_categories
return dynamic_class(class_name,
                     tuple(getattr(cat,name) for cat in self._super_categories),
                     method_provider_cls,
                     prepend_cls_bases = False,
                     doccls = doccls,
                     reduction = (getattr, (self, name)),
                     cache = cache)

i.e., first an empty call to self._super_categories. I guess that can be erased?
Oh, yes. That is just a leftover from a debugging session where I wanted to force the evaluation of the super categories. Thanks.
comment:74 in reply to: ↑ 71 Changed 8 years ago by
Replying to SimonKing:
Yippie! Returning to a strong cache for dynamic classes does indeed help! I am now running the test suite, because it is conceivable that the change makes part of the memleak re-appear.
Thanks for tracking this down! I don't have a preference for weak or strong caching dynamic classes.
comment:75 Changed 8 years ago by
Speaking of weak cache: currently trac11935_share_on_base_category.patch uses strong caching. Weak caching might be preferable.
Another thing is that this patch could be quite a bit more concise if we had another feature in cached functions: namely, that we could specify that certain arguments should be ignored in the cache lookup; or more generally, that we could specify a function that produces the key to be used in the cache lookup. Something like:
def key(x, l, option=1):
    return x, tuple(l)

@cached_function(key=key)
def foo(x, l, option=1):
    return (x, l, option)

sage: foo(1, [1,2], 3) is foo(1, (1,2), 2)
True
Better syntax welcome!
Cheers,
Nicolas
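For what it's worth, the proposed feature can be mocked up in a few lines of plain Python (this cached_function is NOT Sage's; it only illustrates the suggested `key` parameter):

```python
import functools

def cached_function(key):
    # decorator factory: `key` maps the call arguments to the cache key
    def decorator(f):
        cache = {}
        @functools.wraps(f)
        def wrapper(*args, **kwds):
            k = key(*args, **kwds)
            if k not in cache:
                cache[k] = f(*args, **kwds)
            return cache[k]
        return wrapper
    return decorator

def key(x, l, option=1):
    # ignore `option`, normalize `l` to a hashable tuple
    return x, tuple(l)

@cached_function(key=key)
def foo(x, l, option=1):
    return (x, l, option)

assert foo(1, [1, 2], 3) is foo(1, (1, 2), 2)
```

Both calls map to the key `(1, (1, 2))`, so the second one returns the cached result of the first, which is exactly the behaviour requested above.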
comment:76 in reply to: ↑ 67 Changed 8 years ago by
Replying to SimonKing:
I just tested whether the changes I mentioned have a noticeable effect on the computation time for an example that does nothing but creating the parent classes for 2000 different categories (I made it so that the parent classes are distinct). I am afraid, it was not noticeable. So, I guess I can drop my suggestion.
Feel free to implement it anyway in a reviewer's patch. Addition is as readable, if not more, than using a format string.
comment:77 Changed 8 years ago by
Hi Nicolas,
I was about to create a reviewer patch. But I am afraid the same problem crept back in sage/rings/polynomial/multi_polynomial_sequence.py. In other words: when I have a weak cache on dynamic classes, then the double free occurs in crypto/mq/mpolynomialsystem.py, but when I have a strong cache, then the double free occurs in rings/polynomial/multi_polynomial_sequence.py (but again in a test involving polynomial systems!).

I have a preference for a weak cache, and since apparently the weak cache as such is not to blame for the error, I need to look somewhere else. Perhaps there is some UniqueRepresentation in mpolynomialsystem.py that needs a strong cache. If that turns out to be correct, then I will do the necessary change in #12215.
Apart from that: I will be away until Monday as well.
comment:78 Changed 8 years ago by
Too bad. Apparently the patch doesn't apply anymore (according to the patch bot). Apart from a possible rebasing: What needs to be done to get this behind us?
comment:79 Changed 8 years ago by
Apparently a lot of things need to be done. Nicolas, could you remind me (and the patchbot) of the patches to be applied? I have just updated the patches at #12876, so that they apply on top of sage-5.3.beta2 plus dependencies.
comment:80 Changed 7 years ago by
- Dependencies changed from #9138, #11900, #11943, #12875, #12876, #12877 to #9138, #11900, #11943, #12875, #12877
Just a try to have the patchbot try this ticket without #12876!
comment:81 Changed 7 years ago by
- Description modified (diff)
Concerning uniqueness of the parent class: in at least one case (namely Algebras(R)), the super categories depend on whether the base ring is a field or not. We would like to have, e.g., Algebras(GF(3)).parent_class == Algebras(GF(5)).parent_class. The idea is that the parent and element classes should only depend on the super categories, but otherwise be independent of the base ring. Working at #11900, I found that this would drastically improve the performance of some elliptic curve computation.