Opened 10 years ago

# Use weak references to cache homsets — at Version 168

• Reported by: jpflori
• Owned by: robertwb
• Priority: major
• Milestone: sage-5.5
• Component: coercion
• Keywords: sd35
• Cc: jpflori, nthiery
• Authors: Simon King
• Reviewers: Jean-Pierre Flori, Nils Bruin
• Report Upstream: N/A
• Dependencies: #12969; to be merged with #715

Originally, this ticket was about the following memory leak when computing with elliptic curves:

```
sage: K = GF(1<<55,'t')
sage: a = K.random_element()
sage: while 1:
....:     E = EllipticCurve(j=a); P = E.random_point(); 2*P;
```

This example is in fact solved by #715. However, while working on that ticket, another leak has been found, namely

```
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:
sage: import gc
sage: gc.collect()
0
```

So, I suggest starting with #715 and solving the second memory leak on top of it. It seems that a strong cache for homsets is to blame. I suggest using the weak `TripleDict` instead, which was introduced in #715.
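
To illustrate the idea in plain Python (a minimal sketch, not Sage's actual `TripleDict`, which is a Cython class; `FakeParent`, `FakeHomset` and this `Hom` are hypothetical stand-ins), a weak-value cache lets an unused homset be collected:

```python
import gc
import weakref

class FakeParent(object):
    """Hypothetical stand-in for a parent such as GF(p)."""
    def __init__(self, name):
        self.name = name

class FakeHomset(object):
    """Hypothetical stand-in for Hom(X, Y)."""
    def __init__(self, domain, codomain):
        self.domain = domain
        self.codomain = codomain

# A weak-value cache: an entry disappears as soon as nothing else
# holds a strong reference to the cached homset.
_cache = weakref.WeakValueDictionary()

def Hom(X, Y):
    key = (id(X), id(Y))   # sketch only; the real cache keys on (X, Y, category)
    H = _cache.get(key)
    if H is None:
        H = FakeHomset(X, Y)
        _cache[key] = H
    return H

K, L = FakeParent('K'), FakeParent('L')
H = Hom(K, L)
hit = Hom(K, L) is H        # cache hit while a strong reference exists
del H
gc.collect()
entries_left = len(_cache)  # the entry died together with the homset
```

The point of the sketch is only the lifetime behaviour: repeated calls reuse the cached object, yet dropping the last outside reference lets the cache entry go away instead of leaking.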

To be merged with #715 and #13447. Apply

### comment:1 Changed 10 years ago by jpflori

After looking at #10548, I might have a better idea of the culprit:

```
sage: import gc
sage: from sage.schemes.elliptic_curves.ell_finite_field import EllipticCurve_finite_field
sage: K = GF(1<<60,'t')
sage: j = K.random_element()
sage: for i in xrange(100):
....:     E = EllipticCurve(j=j); P = E.random_point(); 2*P;
....:
sage: gc.collect()
440
sage: len([x for x in gc.get_objects() if  isinstance(x,EllipticCurve_finite_field)])
100
```

### comment:2 Changed 10 years ago by jpflori

• Component changed from memleak to coercion
• Owner changed from rlm to robertwb

So this could be just #715.

### comment:3 Changed 10 years ago by jpflori

This is definitely #715.

Resetting the coercion cache and calling gc.collect() deletes the cached elements.

I guess weak refs should be used in the different `TripleDict` objects of `CoercionModel_cache_maps`.

So this should be closed as duplicate/won't fix.

### comment:4 Changed 10 years ago by jpflori

• Status changed from new to needs_review

### comment:5 Changed 10 years ago by zimmerma

Jean-Pierre, why did you change the status to "needs review"? There is no patch to review.

Also, how do you reset the coercion cache? I would be interested if you have a workaround for the memory leak in:

```
for p in prime_range(10^8):
    k = GF(p)
```

Paul

### comment:6 follow-up: ↓ 17 Changed 10 years ago by jpflori

As far as I'm concerned, this is nothing but a concrete example of the vague #715. So I guess I put it to "needs review" so that it could be closed as "duplicate/won't fix". Not sure it was the right way to do it.

I seem to remember that I had some workarounds to delete some parts of the cache (so that I could perform my computations), but not all of them. In fact there are several dictionaries hidden in different places. It was quite a while ago, but I'll have another look at it at some point. Anyway, using weak references for all these dictionaries seems to be a quite nontrivial task. Moreover, it should also not slow things down too much, to remain viable...

### comment:7 Changed 10 years ago by zimmerma

For the computations I need to perform, which create about 200000 prime fields, this memory leak makes it impossible to use Sage: it eats all the available memory.

I would be satisfied if I had a magic command to type to explicitly free the memory used by every `k=GF(p)`.

Paul

### comment:8 follow-up: ↓ 22 Changed 10 years ago by jpflori

I'm having a look at your problem right now. Here are some hints to study the problem, mostly stolen from #10548.

I put it here for the record, and so that I can go faster next time.

First, if I only create the finite fields and do nothing with them, I do not seem to get a memleak. Some fields are not garbage collected immediately, but calling gc.collect() does the trick.

```
sage: import gc
sage: for p in prime_range(10**5):
....:    k = GF(p)
....:
sage: del p, k
sage: gc.collect()
1265
sage: from sage.rings.finite_rings.finite_field_prime_modn import FiniteField_prime_modn as FF
sage: L = [x for x in gc.get_objects() if isinstance(x, FF)]
sage: len(L)
1
sage: L
[Finite Field of size 2]

```

Of course I guess you actually do something with your finite fields.

So here is a small example causing the fields to stay cached.

```
sage: import gc
sage: for p in prime_range(10**5):
....:    k = GF(p)
....:
sage: del p, k
sage: gc.collect()
0

```

The zero here is bad and indeed

```
sage: from sage.rings.finite_rings.finite_field_prime_modn import FiniteField_prime_modn as FF
sage: L = [x for x in gc.get_objects() if isinstance(x, FF)]
sage: len(L)
9592

```

If you want to know where it comes from, you can use the objgraph Python module (on my Debian system I just installed the python-objgraph package and updated sys.path in Sage to include '/usr/lib/python2.6/dist-packages').

```
sage: sys.path.append('/usr/lib/python2.6/dist-packages')
sage: import inspect, objgraph
sage: objgraph.show_backrefs(L[-1],filename='/home/jp/br.png',extra_ignore=[id(L)])

```

And look at the png or use

```
sage: brc = objgraph.find_backref_chain(L[-1],inspect.ismodule,max_depth=15,extra_ignore=[id(L)])
sage: map(type,brc)
[<type 'module'>, <type 'dict'>, <type 'dict'>,...
sage: brc[0]
<module 'sage.categories.homset'...

```

So the culprit is "_cache" in sage.categories.homset which has a direct reference to the finite fields in its keys.

The clean solution would be to use weak references in the keys (say, something like `WeakKeyDictionary` in Python).

However, resetting the cache should be a (potentially partial) workaround (though it could kill your Sage session?).

```
sage: sage.categories.homset.cache = {}
sage: gc.collect()
376595

```

It also seems to be enough if I do "a = k.random_element(); a = a+a" in the for loop, but not if I add "a = 47*a+3".

I'm currently investigating that last case.

### comment:9 Changed 10 years ago by jpflori

For the record, the leak is solved by using "k(47)*a+k(3)", so the remaining problem really comes from coercion and the action of integers.

```
sage: cm = sage.structure.get_coercion_model()
sage: cm.reset_cache()

```

does not help.

### comment:10 Changed 10 years ago by jpflori

First, the second example above is missing the line "k(1);" in the for loop, otherwise it does nothing more than the first example.

Second, I guess the remaining references to the finite fields are in the different lists and dictionaries of the integer ring named _coerce_from_list, _convert_from_list, etc.

You cannot directly access them from the Python level, but there is a function _introspect_coerce() (defined in parent.pyx) which returns them.

### comment:11 Changed 10 years ago by jpflori

In fact, everything is in _*_hash.

And to conclude, I'd say that right now you cannot directly delete entries in these dictionaries from the Python level.

So for minimum changes, you should either avoid coercion, or make the dictionaries "cdef public" in parent.pxd to be able to explicitly delete every created entry (be aware that entries could be stored in different places, for example in ZZ but also in QQ and CC, and I don't know where else...).

And I could also have missed other dictionaries used by Sage.

### comment:12 Changed 10 years ago by zimmerma

Jean-Pierre, I cannot reproduce that:

```
sage: sys.path.append('/usr/lib/python2.6/dist-packages')
sage: import inspect, objgraph
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)

ImportError: No module named objgraph
```

Paul

### comment:13 Changed 10 years ago by jpflori

Did you "apt-get install python-objgraph" on your system?

If yes, is it a version for Python 2.6?

### comment:14 Changed 10 years ago by jpflori

The path I gave above might also be different on your system, as might the package manager.

### comment:15 follow-up: ↓ 16 Changed 10 years ago by jpflori

If you find any other dictionaries leading to caching problems, it would be great to mention them here for the record.

That way, when someone finally decides to tackle ticket #715, it will speed up the search for the culprit.

### comment:16 in reply to: ↑ 15 Changed 10 years ago by zimmerma

No time so far. I will look at this during the SageFlint days in December, unless someone else has some time before.

Paul

### comment:17 in reply to: ↑ 6 Changed 10 years ago by johanbosman

> As far as I'm concerned, this is nothing but a concrete example of the vague #715. So I guess I put it to "needs review" so that it could be closed as "duplicate/won't fix". Not sure it was the right way to do it.

If you think it should be closed, then I think you should set the milestone to duplicate/wontfix. Otherwise, it is probably better to change the status back to 'new'.

### comment:18 Changed 10 years ago by zimmerma

with Sage 4.7.2 I get the following:

```
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:
sage: import gc
sage: gc.collect()
0
sage: from sage.rings.finite_rings.finite_field_prime_modn import \
....: FiniteField_prime_modn as FF
sage: L = [x for x in gc.get_objects() if isinstance(x, FF)]
sage: len(L), L[0], L[len(L)-1]
(9592, Finite Field of size 2, Finite Field of size 99767)
```

thus whenever we use the finite field we get a memleak. (If I remove the `a=K(0)` line, I get only two elements in L, for p=2 and p=99991.)

Paul

### comment:19 Changed 10 years ago by zimmerma

I confirm (cf comment 8) that if I comment out the line

```
    _cache[(X, Y, category)] = weakref.ref(H)
```

in categories/homset.py, then the memory leak from comment 18 disappears.

Paul

### comment:20 Changed 10 years ago by SimonKing

I think we have a different problem here.

The finite fields themselves should be cached. The cache (see `GF._cache`) uses weak references, which should be fine.

Also, there are special methods `zero_element` and `one_element` which should do caching, because zero and one are frequently used and should not be created over and over again.

However, it seems that all elements of a finite field are cached - and that's bad!

```
sage: K = GF(5)
sage: K(2) is K(2)
True
sage: K.<a> = GF(17^2)
sage: K(5) is K(5)
True
```

I see absolutely no reason why all `17^2` elements should be cached.

Fortunately, we have no caching for larger finite fields:

```
sage: K.<a> = GF(17^10)
sage: K(5) is K(5)
False
```

### comment:21 Changed 10 years ago by SimonKing

• Status changed from needs_review to needs_work

In the following example, there is no memory leak that would be caused by the `sage.categories.homset` cache:

```
sage: len(sage.categories.homset._cache.keys())
100
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:
sage: len(sage.categories.homset._cache.keys())
100
```

However, when you do a conversion `K(...)` then a convert map is created, and apparently is cached:

```
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:
sage: len(sage.categories.homset._cache.keys())
9692
```

The homset cache does use weak references. Hence, it is surprising that the unused stuff can not be garbage collected. Apparently there is some direct reference involved at some point.
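
One plausible mechanism, sketched here with hypothetical stand-in classes rather than Sage's real code: weak references to the *values* do not help if the *keys* are tuples that contain the parents themselves:

```python
import gc
import weakref

class Parent(object):
    """Hypothetical stand-in for a parent such as GF(p)."""
    pass

# The cache holds weak references to its values (the homsets), but its
# keys are ordinary tuples that contain the parents directly.
cache = {}

K = Parent()
H = Parent()                 # stands in for a homset involving K
cache[(K, 'category')] = weakref.ref(H)

watch = weakref.ref(K)
del K, H
gc.collect()

homset_collected = list(cache.values())[0]() is None  # weak value died
parent_leaked = watch() is not None                   # the key tuple keeps K alive
```

So the homset itself can be collected, but the dead cache entry still pins the parent in memory, which matches the observed growth.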

I am very strongly against removing the cache of `sage.categories.homset`. Namely, elliptic curve code uses finite fields, and maps involving finite fields, a lot, and killing the cache is likely to cause a massive regression.

However, since we actually have coercion maps (not just any odd map), I expect that the direct reference comes from the coercion model. I suggest to look into the coercion code and use weak references there.

By the way, I don't know why the status is "needs review". I think it clearly is "needs work".

### comment:22 in reply to: ↑ 8 Changed 10 years ago by SimonKing

> So the culprit is "_cache" in sage.categories.homset which has a direct reference to the finite fields in its keys.

Oops, that could indeed be the problem! Namely, the homset cache uses weak references to its values, but uses direct references to its keys! Perhaps using weak references as keys would work?
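
For illustration, a minimal plain-Python sketch (the `Parent` class is a hypothetical stand-in) of how entries in a `weakref.WeakKeyDictionary` vanish together with their keys:

```python
import gc
import weakref

class Parent(object):
    """Hypothetical stand-in for a weak-referenceable parent."""
    pass

cache = weakref.WeakKeyDictionary()  # keys are held weakly

K = Parent()
cache[K] = "cached homset data"
size_before = len(cache)   # one entry while K is alive

del K                      # drop the last strong reference to the key
gc.collect()
size_after = len(cache)    # the entry vanished together with the key
```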

### comment:23 Changed 10 years ago by zimmerma

it seems the complete caching of field elements only occurs for p < 500:

```
sage: K=GF(499)
sage: K(5) is K(5)
True
sage: K=GF(503)
sage: K(5) is K(5)
False
```

A workaround to this memory leak is to free the cache from time to time (thanks Simon):

```
sage: sage.categories.homset._cache.clear()
```

Paul

### comment:24 Changed 10 years ago by SimonKing

On the other hand, it could be that using weak keys in the homset cache will not work: The keys are triples: domain, codomain and category.

What we want is: If either the domain or the codomain or the category have no strong reference, then the homset can be garbage collected.

Hence: Why don't we use a dictionary of dictionaries of dictionaries?

What I mean is:

• The keys of sage.categories.homset._cache should be weak references to the domain.
• The values of sage.categories.homset._cache should be dictionaries whose keys are weak references to the codomain.
• The values of those inner dictionaries should in turn be dictionaries whose keys are weak references to the category and whose values are weak references to the homset.

Hence, if there is no strong reference to either domain or codomain or category, then the homset can be deallocated.
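
The scheme above could be sketched with nested weak-key dictionaries (a hypothetical illustration in plain Python, not the actual patch; `Obj`, `set_hom` and `get_hom` are made-up names):

```python
import gc
import weakref

class Obj(object):
    """Hypothetical stand-in for a domain, codomain or category."""
    pass

_cache = weakref.WeakKeyDictionary()        # domain -> codomain -> category -> homset

def set_hom(domain, codomain, category, homset):
    level2 = _cache.setdefault(domain, weakref.WeakKeyDictionary())
    level3 = level2.setdefault(codomain, weakref.WeakKeyDictionary())
    level3[category] = weakref.ref(homset)  # the homset itself is also held weakly

def get_hom(domain, codomain, category):
    try:
        r = _cache[domain][codomain][category]
    except KeyError:
        return None
    return r()  # None if the homset has been collected

X, Y, C, H = Obj(), Obj(), Obj(), Obj()
set_hom(X, Y, C, H)
found = get_hom(X, Y, C) is H

del X                            # drop the last strong reference to the domain ...
gc.collect()
branch_gone = len(_cache) == 0   # ... and the whole branch disappears
```

Each level holds its key weakly, so losing any one of domain, codomain or category frees everything hanging below it.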

### comment:25 Changed 10 years ago by SimonKing

The idea sketched in the previous comment seems to work!!!

With it, after running

```
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:     print get_memory_usage()
```

ends with printing 825.45703125

Without it, it ends with printing 902.8125

I don't know whether we should ban caching of field elements.

How could fixing that memory leak be demonstrated by a doc test?

Anyway, I will post a preliminary patch in a few minutes (so that you can see if it fixes at least part of the problem for you), while I am running `sage -tp 2 devel/sage`.

Patch's up!

### comment:27 Changed 10 years ago by SimonKing

Hm. There seems to be a problem.

```
sage -t  devel/sage/doc/en/constructions/linear_codes.rst
The doctested process was killed by signal 11
```

What is signal 11?

### comment:28 Changed 10 years ago by zimmerma

signal 11 is Segmentation Fault

Paul

### comment:29 Changed 10 years ago by SimonKing

Indeed it is a segfault. And it is triggered by

```
sage: w = vector([1,1,-4])
```

### comment:30 Changed 10 years ago by SimonKing

Monique van Beek just pointed me to the fact that the error occurs even earlier:

```
sage: is_Ring(None)
<BOOOOOM>
```

### comment:31 Changed 10 years ago by SimonKing

Moreover,

```
sage: None in Rings()
<BOOOOOOM>
```

### comment:32 Changed 10 years ago by SimonKing

That said: I am not working on top of vanilla Sage, but I use some patches. In particular, I use #11900, which introduces the so-called `Category_singleton`, which has a specialised and very fast containment test.

I would not like to change #11900, but prefer to change my patch from here so that it works on top of #11900.

### comment:33 Changed 10 years ago by SimonKing

It turns out that indeed the bug is in #11900. So, I have to fix it there.

### comment:34 follow-up: ↓ 41 Changed 10 years ago by SimonKing

Cc to Nicolas, since it concerns categories:

Do we want `Hom(1,1)` to still be supported?

I think it does not make sense at all to talk about homomorphisms from the number 1 to the number 1. The problem (for my patch as it is posted here) is the fact that one can't create a weak reference to the number 1.

### comment:35 Changed 10 years ago by SimonKing

Sorry, I forgot to update the Cc field.

### comment:36 Changed 10 years ago by SimonKing

With my patch, applied on top of #11900, I get

```
sage -t  devel/sage-main/sage/structure/parent.pyx # 2 doctests failed
sage -t  devel/sage-main/sage/structure/category_object.pyx # 2 doctests failed
sage -t  devel/sage-main/sage/rings/polynomial/polynomial_singular_interface.py # 1 doctests failed
sage -t  devel/sage-main/sage/rings/polynomial/multi_polynomial_ring.py # 36 doctests failed
sage -t  devel/sage-main/sage/structure/parent_base.pyx # 2 doctests failed
```

At least some of the errors are like

```
sage: n = 5; Hom(n,7)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_3[4]>", line 1, in <module>
n = Integer(5); Hom(n,Integer(7))###line 108:
sage: n = 5; Hom(n,7)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.integer.Integer' object
```

and I really don't see why this should be considered a bug.

### comment:37 Changed 10 years ago by SimonKing

• Status changed from needs_work to needs_info

At least one of the errors in polynomial rings is due to a wrong order of initialising things: some information is missing, and consequently a weak reference can't be created.

I fixed this problem in the new patch version.

I put it as "needs info", because of my question on whether or not we want to consider an integer as an object in a category.

### comment:38 Changed 10 years ago by SimonKing

I was told by Mike Hansen why weak references to integers and rationals do not work.

I see three options:

1. Drop the support for `Hom(1,1)` (which I'd prefer).
2. Add a cdef'd attribute `__weakref__` to `sage.structure.element.Element`, which would create an overhead for garbage collection for elements, and also a memory overhead.
3. Use two category.homset caches in parallel: one (the default) that uses weak references, and another one that uses "normal" references if weak references fail.
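
Option 3 could be sketched like this in plain Python (hypothetical helper names; Python ints stand in for the elements that do not support weak references):

```python
import weakref

# A weak-key cache with a strong-keyed fallback for keys (such as ints)
# that cannot be weakly referenced.
_weak_cache = weakref.WeakKeyDictionary()
_strong_cache = {}

def cache_set(key, value):
    try:
        _weak_cache[key] = value
    except TypeError:              # e.g. int: no weak references possible
        _strong_cache[key] = value

def cache_get(key, default=None):
    try:
        return _weak_cache.get(key, _strong_cache.get(key, default))
    except TypeError:              # un-weak-referenceable key: strong cache only
        return _strong_cache.get(key, default)

class Parent(object):
    """Hypothetical weak-referenceable key."""
    pass

K = Parent()
cache_set(K, "weakly cached")
cache_set(1, "strongly cached")    # falls back: ints have no __weakref__ slot
```

The downside, as noted above, is that entries in the fallback dictionary are never collected, so this only contains the leak rather than eliminating it.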

### comment:39 Changed 10 years ago by SimonKing

FWIW:

With the latest patch, the tests in polynomial_singular_interface and in multi_polynomial_ring pass.

There remain the following problems:

```
sage -t  "devel/sage-main/sage/structure/parent.pyx"
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/parent.pyx", line 1410:
sage: n = 5; Hom(n,7)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_33[4]>", line 1, in <module>
n = Integer(5); Hom(n,Integer(7))###line 1410:
sage: n = 5; Hom(n,7)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.integer.Integer' object
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/parent.pyx", line 1412:
sage: z=(2/3); Hom(z,8/1)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_33[5]>", line 1, in <module>
z=(Integer(2)/Integer(3)); Hom(z,Integer(8)/Integer(1))###line 1412:
sage: z=(2/3); Hom(z,8/1)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.rational.Rational' object
**********************************************************************
2 of   8 in __main__.example_33
***Test Failed*** 2 failures.
For whitespace errors, see the file /home/simon/.sage//tmp/parent_2986.py
[11.6 s]
```

and

```
sage -t  "devel/sage-main/sage/structure/category_object.pyx"
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/category_object.pyx", line 590:
sage: n = 5; Hom(n,7)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_17[4]>", line 1, in <module>
n = Integer(5); Hom(n,Integer(7))###line 590:
sage: n = 5; Hom(n,7)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.integer.Integer' object
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/category_object.pyx", line 592:
sage: z=(2/3); Hom(z,8/1)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_17[5]>", line 1, in <module>
z=(Integer(2)/Integer(3)); Hom(z,Integer(8)/Integer(1))###line 592:
sage: z=(2/3); Hom(z,8/1)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.rational.Rational' object
**********************************************************************
2 of   8 in __main__.example_17
***Test Failed*** 2 failures.
For whitespace errors, see the file /home/simon/.sage//tmp/category_object_3050.py
[2.7 s]
```

and

```
sage -t  "devel/sage-main/sage/structure/parent_base.pyx"
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/parent_base.pyx", line 108:
sage: n = 5; Hom(n,7)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_3[4]>", line 1, in <module>
n = Integer(5); Hom(n,Integer(7))###line 108:
sage: n = 5; Hom(n,7)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.integer.Integer' object
**********************************************************************
File "/home/simon/SAGE/sage-4.8.alpha3/devel/sage-main/sage/structure/parent_base.pyx", line 110:
sage: z=(2/3); Hom(z,8/1)
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-4.8.alpha3/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_3[5]>", line 1, in <module>
z=(Integer(2)/Integer(3)); Hom(z,Integer(8)/Integer(1))###line 110:
sage: z=(2/3); Hom(z,8/1)
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python/site-packages/sage/categories/homset.py", line 159, in Hom
cache2 = _cache[X]
File "/home/simon/SAGE/sage-4.8.alpha3/local/lib/python2.6/weakref.py", line 243, in __getitem__
return self.data[ref(key)]
TypeError: cannot create weak reference to 'sage.rings.rational.Rational' object
**********************************************************************
2 of   8 in __main__.example_3
***Test Failed*** 2 failures.
For whitespace errors, see the file /home/simon/.sage//tmp/parent_base_3078.py
[2.6 s]
```

So, essentially this is just a single test that comes in two versions and is repeated three times, and I would actually say that not raising an error was a bug.

It seems that `Hom(1/2,2/3)` and similar nonsense is not used in Sage. Hence, I think these tests should be removed. I'll ask sage-devel.

### comment:40 Changed 10 years ago by SimonKing

Without the patch:

```
sage: def test():
....:     for p in prime_range(10^5):
....:         K = GF(p)
....:         a = K(0)
....:
sage: m0 = get_memory_usage()
sage: %time test()
CPU times: user 7.75 s, sys: 0.08 s, total: 7.83 s
Wall time: 7.84 s
sage: get_memory_usage() - m0
80.234375
```

With the patch:

```
sage: def test():
....:     for p in prime_range(10^5):
....:         K = GF(p)
....:         a = K(0)
....:
sage: m0 = get_memory_usage()
sage: %time test()
CPU times: user 7.59 s, sys: 0.01 s, total: 7.60 s
Wall time: 7.61 s
sage: get_memory_usage() - m0
8.53515625
```

So, memory usage does still mildly increase, but it seems that most of the leak is fixed.

I think that a test of the kind

```
sage: get_memory_usage() - m0 < 10
True
```

might be used as a doc test.

### comment:41 in reply to: ↑ 34 Changed 10 years ago by nthiery

> Cc to Nicolas, since it concerns categories:
>
> Do we want that `Hom(1,1)` is still supported?
>
> I think it does not make sense at all to talk about the homomorphisms of the number 1 to the number 1. The problem (for my patch as it is posted here) is the fact that one can't create a weak reference to the number 1.

I don't see much point either. We had a similar discussion a while ago about whether elements should be objects in a category, and as far as I remember, the answer was no by default (Element does not inherit from `CategoryObject`). So +1 on my side to kill this dubious feature. You might want to double-check on sage-algebra just to make sure.

### comment:42 follow-up: ↓ 43 Changed 10 years ago by zimmerma

Simon, you can also use the test suggested by Jean-Pierre Flori (see comment 18 for an example).

Paul

### comment:43 in reply to: ↑ 42 Changed 10 years ago by SimonKing

Hi Paul,

> Simon, you can also use the test suggested by Jean-Pierre Flori (see comment 18 for an example).

Yes, that looks good. With my patch, the test would be like

```
sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:
sage: import gc
sage: gc.collect()
1881
sage: from sage.rings.finite_rings.finite_field_prime_modn import FiniteField_prime_modn as FF
sage: L = [x for x in gc.get_objects() if isinstance(x, FF)]
sage: len(L), L[0], L[len(L)-1]
(2, Finite Field of size 2, Finite Field of size 99991)
```

The people at sage-devel somehow seem to agree that objects of a category should be instances of `CategoryObject` (which elements aren't!), and that we should thus drop the `Hom(2/3,8/1)` test.

In addition to that, I suggest providing a better error message, something like

```
sage: Hom(2/3, 8/1)
Traceback (most recent call last):
...
TypeError: Objects of categories must be instances of <type 'sage.structure.category_object.CategoryObject'>, but 2/3 isn't.
```

Cheers,

Simon

### comment:44 Changed 10 years ago by SimonKing

One bad detail: I'd like to add the test to the documentation of sage.categories.homset. However, if I insert it in the appropriate place, there will be a conflict with both #9138 and #11900.

I could try to insert the test in a less logical place, in order to avoid having a dependency.

### comment:45 Changed 10 years ago by SimonKing

• Dependencies set to #11900
• Status changed from needs_info to needs_review

I am very much afraid that I have not been able to make my patch independent of #11900. This is not just because of the documentation, but also because of some details in the choice of the homset's category.

Anyway, it needs review (and so does the most recent version of #11900, by the way)!

### comment:46 Changed 10 years ago by SimonKing

• Status changed from needs_review to needs_work

I did try the new doctests in sage/categories/homset.py. However, with other patches applied, the number returned by `gc.collect()` changes.

So, for stability, I suggest to simplify the test, so that only the number of finite fields remaining in the cache is tested.

### comment:47 Changed 10 years ago by SimonKing

• Authors set to Simon King
• Status changed from needs_work to needs_review

I updated the patch.

Difference to the previous patch: the number of objects collected by gc is marked as random (indeed, it will change with #11115 applied). What we are really interested in is the number of finite fields that remain in the cache after garbage collection. This number is two, and it is not random. Thus, that test is preserved.

### comment:48 Changed 10 years ago by SimonKing

• Status changed from needs_review to needs_work

I think I need help with debugging.

When I have sage-4.8.alpha3 with #9138, #11900, #715 and #11115, then all doctests pass.

When I also have #11521, then the test sage/rings/number_field/number_field_rel.py segfaults. When I run the tests in verbose mode, then all tests seem to pass, but in the very end it says

```
4 items had no tests:
__main__
__main__.change_warning_output
__main__.check_with_tolerance
__main__.warning_function
69 items passed all tests:
...
660 tests in 73 items.
660 passed and 0 failed.
Test passed.
The doctested process was killed by signal 11
[15.0 s]
```

So, could it be that not one of the tests was killed, but the test process itself?

What is even more confusing: When I run the tests with the option -randorder, then most of the time the tests pass without a problem.

Can you give me any pointer on how those things could possibly be debugged?

### comment:49 Changed 10 years ago by fbissey

It sometimes happens that the sage session itself crashes on exit. This is probably one of those. Last time I got one it was related to Singular, I think. It is quite difficult to corner these with gdb. The best you can hope for is to start a sage session under gdb, run the last doctest sequence and quit sage; it may lead to the crash, in which case you may have some luck with gdb. But this is one of those cases where gdb itself may be interfering. I don't think I have time to look into this right now, but I'll put it on my "to do" list in case it isn't solved when I have time to spare.

### Changed 10 years ago by SimonKing

Use weak references for the keys of the homset cache. If weak references are not supported, then raise an error, pointing out that category objects should be `CategoryObject`s.

### comment:50 Changed 10 years ago by SimonKing

• Work issues set to Understand why a weak key dictionary is not enough

I have attached a new patch version. It fixes the segfault I mentioned. However, it also does not fix the memory leak.

The difference between the two versions is: The new patch still uses weak references to the key of the cache, but a strong reference to the value (i.e., the homset).

The homset has a reference to domain and codomain, which constitute the cache key. Thus, I expected that it does not make any difference whether one has a strong or a weak reference to the homset. But I stand corrected. That needs to be investigated more deeply.
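The effect can be reproduced in plain Python (a sketch with hypothetical stand-ins, not Sage's actual classes): a cache with weak keys but strong values leaks as soon as the value holds a strong reference back to its key, which is exactly what a homset does with its domain and codomain.

```python
import gc
import weakref

class Ring:
    """Hypothetical stand-in for a parent used as cache key."""

class Homset:
    def __init__(self, domain, codomain):
        # a homset keeps strong references to domain and codomain
        self.domain = domain
        self.codomain = codomain

cache = weakref.WeakKeyDictionary()      # weak keys, strong values
D, C = Ring(), Ring()
cache[D] = Homset(D, C)                  # the value points back at the key
watcher = weakref.ref(D)

del D, C
gc.collect()
assert watcher() is not None   # leak: the cached homset keeps D alive
```

So a weak reference to the key alone does make a difference after all: the strong reference to the value resurrects the key through the back-pointer.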

### comment:51 follow-up: ↓ 52 Changed 10 years ago by jpflori

Dear Simon,

Thanks a lot for taking care of all of this!

I'm just back from vacation and will have a look at all your patches in the following days.

I must point out that even if the memory leak was small, it did still matter, because I used a LOT of them and after several hours of computations it ate all the available memory (the piece of code in the ticket description is just a minimal example; in my actual code I used different curves and similar simple computations on them)...

And to make things clear, I must say I put this ticket as needs_review in order to get it closed as wontfix/duplicate, because I thought it could be seen as a concrete example of #715 and all the work could be done there.

Of course you're the one currently doing all the work, so do as you want :)

Cheers,

JP

### comment:52 in reply to: ↑ 51 Changed 10 years ago by SimonKing

Hi Jean-Pierre,

I must point out that even if the memory leak was small,

It isn't small.

And to make things clear, I must say I put that ticket as need review in order to get it closed as wont fix/duplicate because I thought it could be seen as a concrete example of ticket 715 and all the work could be done there.

I am not sure whether it would be good to do everything on one ticket, as the topics are related but clearly distinct: #715 is about weak "`TripleDict`"s for coercion, #12215 is about a weak version of cached_function, and the ticket here is about the cache of homsets.

On the other hand: I am about to post a new patch here, with #715 as a dependency. It will use the new version of `TripleDict` from #715. So, one could argue that there is a common tool for both tickets, and they belong together.

Anyway. The new patch will fix the leak, but it will not suffer from the segfaults.

Cheers,

Simon

### comment:53 Changed 10 years ago by SimonKing

• Dependencies changed from #11900 to #11900 #715
• Description modified (diff)
• Status changed from needs_work to needs_review
• Work issues Understand why a weak key dictionary is not enough deleted

I have attached another patch under a new name, using a new approach: The weak `TripleDict`, that I introduce at #715, is an appropriate tool for the cache of homsets. The key is the triple `(domain, codomain, category)`, and the value is a weak reference to the corresponding homset.

There is a new test (the same as in the other patch), showing that the leak is fixed. And all tests in sage/schemes, sage/rings, sage/categories and sage/structure pass.

Hence: Needs review!

Apply trac11521_triple_homset.patch

### comment:54 Changed 10 years ago by SimonKing

In fact all tests pass for me, with #9138, #11900, #11115, #715 and trac11521_triple_homset.patch applied on top of sage-4.8.alpha3.

### comment:55 Changed 10 years ago by jpflori

Applied all patches mentioned in the previous comment without problems on sage-4.8.alpha5, and "make ptestlong" passed all tests. I'll try to check today that the memory leaks actually disappeared :) and have a look at your patches to give them positive reviews as well.

### comment:56 Changed 10 years ago by SimonKing

It depends on what you call "the" leak.

At least, the patch fixes one leak, namely the one that is doctested against. I am not sure whether it is enough to fix the leak exposed in the ticket description:

```sage: K = GF(1<<55,'t')
sage: a = K.random_element()
sage: get_memory_usage()
872.26171875
sage: while 1:
....:     E = EllipticCurve(j=a)
....:     P = E.random_point()
....:     Q = 2*P;
....:     print get_memory_usage()
```

The memory usage does climb, but it climbs a lot slower than, for example, in sage-4.6.2.

### comment:57 Changed 10 years ago by SimonKing

PS: I just tested that even #12215 (which is a lot more aggressive in using weak references and fixes a gaping leak) is not enough to totally fix the example from the ticket description. Hence, I think that, for now, we should be happy with a partial fix and investigate the remaining problems on different tickets.

### comment:58 Changed 10 years ago by SimonKing

After running the example for a couple of minutes, interrupting and doing garbage collection, I find:

```sage: from sage.schemes.elliptic_curves.ell_finite_field import EllipticCurve_finite_field
sage: LE = [x for x in gc.get_objects() if  isinstance(x,EllipticCurve_finite_field)]
sage: len(LE)
632
sage: import objgraph
sage: from sage.rings.finite_rings.finite_field_prime_modn import FiniteField_prime_modn as FF
sage: LF = [x for x in gc.get_objects() if isinstance(x, FF)]
sage: len(LF)
1
```

LF shows that one leak is fixed. However, the curves in LE, which are all defined over the same finite field, cannot be collected.

### comment:59 Changed 10 years ago by SimonKing

Using objgraph, I found that the remaining leak seems to be related to `sage.schemes.generic.homset.SchemeHomsetModule_abelian_variety_coordinates_field_with_category`. Since this is another homset, it would make sense to try to fix it here.

### comment:60 Changed 10 years ago by SimonKing

Aha!!

I found that we have many equal instances of these scheme homsets:

```sage: LE = [x for x in gc.get_objects() if isinstance(x,SchemeHomsetModule_abelian_variety_coordinates_field)]
sage: LE[100] == LE[150]
True
```

So, I guess we should fix the memory leak by using `UniqueRepresentation` or `UniqueFactory`!

### comment:61 Changed 10 years ago by jpflori

My 2 cents: isn't it somehow related to the fact that elliptic curves are not unique parents? In some of my original tests, I used different curves, and depending on the cache where they were stored, "equality" testing was done either with "is" or with "==". In the former case, the "same" elliptic curve would be stored several times, making the "leak" even bigger.

### comment:62 Changed 10 years ago by SimonKing

OK. Then why not reduce the non-uniqueness? `EllipticCurve` could easily be turned into a cached function, which would mean that two elliptic curves became identical if they are defined by equal data. That is not enough to be a unique parent (there could be equal elliptic curves defined by different data), but it could be a step forward. And with #12215, it could then actually be turned into a weak cached function.
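A minimal sketch of the idea in plain Python (`make_curve` and `Curve` are hypothetical stand-ins, not Sage's API): caching the constructor on the defining data makes equal data yield the identical object, and a `WeakValueDictionary` keeps the cache itself from becoming a leak, in the spirit of the weak cached function of #12215.

```python
import weakref

class Curve:
    """Hypothetical stand-in for an elliptic curve."""
    def __init__(self, ring, coeffs):
        self.ring = ring
        self.coeffs = coeffs

_cache = weakref.WeakValueDictionary()

def make_curve(ring, coeffs):
    # the ring is part of the key: equal coefficient tuples over
    # different rings must not yield the same curve
    key = (ring, coeffs)
    try:
        return _cache[key]
    except KeyError:
        C = _cache[key] = Curve(ring, coeffs)
        return C

a = make_curve('GF(101)', (1, 0, 0, 0, 5))
b = make_curve('GF(101)', (1, 0, 0, 0, 5))
assert a is b   # equal defining data -> identical object
```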

### comment:63 Changed 10 years ago by SimonKing

• Status changed from needs_review to needs_work

Yes, using a cached function would indeed fix the leak that is caused by the useless creation of an elliptic curve with the same basic data over and over again. In particular, it would fix the leak from the ticket description (#12215 is not needed for that).

I am preparing a new patch (it requires doctests and stuff), so it is "needs work" for now.

### comment:64 Changed 10 years ago by SimonKing

It is definitely not easy. It seems that the elliptic curve code relies on the non-uniqueness of elliptic curves: I get loads of errors.

### comment:65 Changed 10 years ago by SimonKing

Ah! Tuples of elements of different rings can be equal, but the corresponding elliptic curves would be different. So, the ring needs to be part of the description.

### comment:66 Changed 10 years ago by SimonKing

I am undecided what we should do:

We could argue that my patch does fix some memory leak, and leave it like that (modulo comments of the reviewer, of course). In order to fix the memory leak exposed by the example from the ticket description, we have no chance but to have some kind of uniqueness for elliptic curves. But that is a different topic and should thus be dealt with on a different ticket (perhaps such ticket already exists?).

Or we could argue that this ticket is about fixing the memory leak that is exposed in the description. Hence, we should do all necessary steps here.

And then, there is still the question whether the number theorists really want the elliptic curves be "weakly unique" (i.e., identical if the given data are equal). In addition, we might want that the elliptic curve cache is weak - which might imply that we have to wait for #12215.

What do you think?

I guess I'll also ask on sage-nt.

### comment:67 Changed 10 years ago by jpflori

I think we can go your way and stop working on this ticket now (or rather once I've taken the time to go through your patches to properly review them). Thus we can already get some non-negligible improvements merged.

Anyway, the problem of uniqueness of elliptic curves is highly non-trivial and deserves its own ticket. I guess #11474 would be the right place to treat that problem. An alternative would be to make this ticket depend on #11474.

### comment:68 Changed 10 years ago by SimonKing

Hi Jean-Pierre,

indeed, you found the correct ticket.

And there is no need to ask on sage-nt, since I already did before opening #11474: Sage-nt first seemed to agree that uniqueness is good, so I opened the ticket. But then, they became convinced that much of the elliptic curve stuff depends on choices (generators), so that we should consider elliptic curves to be mutable objects, for which we wouldn't like to have uniqueness.

Considering the discussion on sage-nt, #11474 probably is "wontfix".

### comment:69 Changed 10 years ago by SimonKing

I am not sure whether we should really stop work right there. After all, it is still not 100% clear to me why the elliptic curve E, which is created in the loop in an increasing number of copies, cannot be garbage collected.

First, E is created, and some coercion into it is created. The coercion is cached. By #715, some key of the cache provides a weak reference to E. In addition, the coerce map refers to a homset, and the homset refers to its codomain, which is E. I wonder whether there is a chain of strong references from E to the homset (one could try to find out using objgraph, I guess). If there is, then we would have a reference cycle. If that is the case, then we need to look for a `__del__` method that prevents the cycle from being garbage collected.
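The suspicion can be checked in plain Python (a sketch with hypothetical stand-ins, not the Sage classes): a pure reference cycle between a curve and its homset is collectable by the cycle detector; in the Python 2 used at the time, a `__del__` method anywhere in the cycle would instead have parked the whole cycle in `gc.garbage`.

```python
import gc
import weakref

class Curve:
    """Hypothetical stand-in; defines no __del__."""

class Homset:
    """Hypothetical stand-in; defines no __del__."""

E, H = Curve(), Homset()
H.codomain = E      # homset -> codomain
E.homset = H        # curve -> homset: a reference cycle
watcher = weakref.ref(E)

del E, H
gc.collect()        # the cycle detector can break the cycle
assert watcher() is None
```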

### comment:70 Changed 10 years ago by jpflori

Objgraph only shows a reference left as _codomain in a dict in a SchemeHomsetModule_a_v_c_f (defined in sage.schemes.generic.homset).

### comment:71 Changed 10 years ago by jpflori

The homset of points of the ab. group of the curve is itself referenced by an IntegerMulAction, the point at infinity on the curve (no idea when it gets created) and a dict with 11 elements. I guess the problem might be that, in addition to the _cache in sage.categories.homset, the homset of points is directly linked within the Action object? That could also be nonsense.

### comment:72 Changed 10 years ago by SimonKing

The `IntegerMulAction` should be fine, as it is stored in a `TripleDict` (hence, weakly, by #715). Where does the dict occur?

### comment:73 Changed 10 years ago by SimonKing

I am a bit puzzled.

I see that there are a couple of cycles, involving an elliptic curve, a `SchemeHomsetModule...`, an `IntegerMulAction`, and an `EllipticPoint_finite_field`. However, none of them has a `__del__` method; thus, the garbage collector should be able to collect the cycles.

But the backref graph also shows something that I don't understand: Apparently there is a top level dict with three items that points to the elliptic curve. Where is it?

### comment:74 follow-up: ↓ 87 Changed 10 years ago by SimonKing

I got it!!

By #715, both the keys and the values of `sage.structure.coerce_dict.TripleDict` are held by weak reference -- if possible. If the value does not allow for a weak reference, then (to simplify the code) a constant function is used instead.

Actions are stored in such `TripleDict`s. However, an `IntegerMulAction` cannot be weakly referenced. Hence, there is a strong reference, and no garbage collection can occur (the value points back to the keys, hence they can't be collected either).

Solution: Make `IntegerMulAction` weakly referenceable! That should suffice.
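In plain Python the analogous situation looks as follows (a sketch: extension types like `IntegerMulAction` behave like classes with `__slots__` in this respect — without a `__weakref__` slot they cannot be the target of a weak reference, and adding the slot is the fix proposed here):

```python
import weakref

class PlainAction:
    __slots__ = ('codomain',)                 # no __weakref__ slot

class WeakrefableAction:
    __slots__ = ('codomain', '__weakref__')   # sketch of the proposed fix

try:
    weakref.ref(PlainAction())
    supported = True
except TypeError:
    supported = False
assert not supported                # cannot be weakly referenced

r = weakref.ref(WeakrefableAction())   # now allowed
```

For a Cython cdef class the equivalent is declaring a `__weakref__` attribute in the cdef class body.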

### comment:75 Changed 10 years ago by SimonKing

It does not suffice. But I think it is a step to the right direction.

### comment:76 Changed 10 years ago by SimonKing

Apparently the weak reference to the value is not used in the `TripleDict`! So, I have to look at #715, perhaps I forgot to implement something there.

### comment:77 Changed 10 years ago by SimonKing

Aha, I was mistaken: At #715 I only took care for weak references to the keys of `TripleDict`. The big question is whether we additionally want weak references to the values. I am somehow against doing this at #715.

### comment:78 Changed 10 years ago by jpflori

My point was that it seemed to me that there was some (not weak!) reference in the dictionary ZZ._action_hash pointing to the IntegerMulAction, itself pointing to the curve, and I thought it could prevent garbage collection.

So I added a cpdef function to be able to clear that dictionary and see if garbage collection occurs once it is emptied. However, it is at the C level, so kind of the whole Sage library had to be rebuilt, and I had to go back home before rebuilding was finished...

Anyway, I'll have a look at it tomorrow :)

By the way, do you have any idea whether dictionaries declared at the C level, such as ZZ._action_hash, are detected by objgraph?

### comment:79 Changed 10 years ago by SimonKing

Hi Jean-Pierre,

The thing with the _action_hash is a good finding. I thought it would be a `TripleDict`, but it is just a usual dict. And this could indeed be a problem. I don't know if this is visible to objgraph.

But also I think that in addition we have the problem of strong references to the values of `TripleDict`. In principle, one could use weak references for not only the keys but also to the values -- perhaps this could be done in #715.

Or one could leave `TripleDict` as it is currently in #715, but explicitly use weak references to functors for coercion (which needs to be enabled first). _action_hash then has to use weak references as well.

There might be a speed problem, though.

### comment:80 Changed 10 years ago by jpflori

For info, resetting ZZ._action_hash to {} is not sufficient to let the IntegerMulAction be garbage collected.

Objgraph shows 3 objects pointing to the abelian group of points of the curve (itself pointing to the curve):

• the IntegerMulAction
• the point at infinity (?)
• a dict of size 11 which is in fact an attribute (_Schemering_point_homset) of the curve (as a scheme) itself.

I'd be happy to test a weakref-for-values version of your patch regardless of the speed impact.

### comment:81 follow-up: ↓ 82 Changed 10 years ago by jpflori

I guess your solution should be the right one, because resetting the coercion cache solves the problem, so dealing with it should be enough for the example in the ticket description.

Only one copy gets cached in _action_hash because the curves are equal (==) but not identical (is). However, if one uses (really) different curves, one of each also gets cached in _action_hash.

Here is a small piece of code to test the effect of clearing the different caches (the code to clear the ZZ cache is left as an exercise; anyway, we should use weakrefs there):

```sage: import gc, inspect
sage: load /usr/share/shared/objgraph.py # or whatever you should type to use objgraph
sage: from sage.schemes.elliptic_curves.ell_finite_field import EllipticCurve_finite_field
sage: K = GF(1<<60, 't')
sage: j = K.random_element()
sage: for i in xrange(100):
....:     E = EllipticCurve(j=j); P = E.random_point(); 2*P; del P; del E;
....:
sage: gc.collect()
68
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L)
100
sage: del L
sage: get_coercion_model().reset_cache()
sage: gc.collect()
6172
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L)
1
sage: del L
sage: ZZ.del_hash()
sage: gc.collect()
56
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L)
0
sage: del L
sage: for i in xrange(100):
....:     E = EllipticCurve(j=K.random_element()); P = E.random_point(); 2*P; del P; del E;
....:
sage: gc.collect()
174
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L)
100
sage: del L
sage: get_coercion_model().reset_cache()
sage: gc.collect()
738
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L) # _action_hash
100
sage: del L
sage: ZZ.del_hash()
sage: gc.collect()
5742
sage: L = [x for x in gc.get_objects() if isinstance(x, EllipticCurve_finite_field)]
sage: len(L) # mmm got one left!!! not sure where it comes from yet...
1
```

### comment:82 in reply to: ↑ 81 ; follow-up: ↓ 83 Changed 10 years ago by SimonKing

Hi Jean-Pierre,

I guess you're solution should be the right one because resetting the coercion cache solves the problem, so dealing with it should be enough for the example in the ticket description.

But it is not so easy because...

Only one copy gets cached in _action_hash because the curves are equal (==) but not identical (is).

... elliptic curves are not unique parents.

I think it would be a mistake to use weak references to the values of `TripleDict`. I tried and got many errors - and I think this was because some important data (homsets, actions, ...) were garbage collected even though there was still a strong reference to domain and codomain. In that situation, the value must not be garbage collected.

sage: len(L) # mmm got one left!!! not sure where it comes from yet...

Don't forget the last copy of E that was defined in the loop!

Since I think that a weak reference to the values of `TripleDict` won't work: What else could one do? Or perhaps I should try again: It could be that the errors came from the wrong choice of a callback function.

### comment:83 in reply to: ↑ 82 Changed 10 years ago by jpflori

I guess you're solution should be the right one because resetting the coercion cache solves the problem, so dealing with it should be enough for the example in the ticket description.

But it is not so easy because...

Only one copy gets cached in _action_hash because the curves are equal (==) but not identical (is).

... elliptic curves are not unique parents.

Yep :)

I think it would be a mistake to use a weak reference to the value of `TripleDict`. I tried and got many errors - and I think this was because some important data (homsets, actions, ...) was garbage collected even though there was still a strong reference to domain and codomain. In that situation, the value must not be garbage collected.

I see...

sage: len(L) # mmm got one left!!! not sure where it comes from yet...

Don't forget the last copy of E that was defined in the loop!

In the new code I posted I explicitly added "del P; del E;" to the loop, so no copy should be left. What's even stranger is that this remaining copy appears after the second loop (with random j-invariants, so different curves) but not after the first one (with a constant j-invariant, so only one curve)!

Since I think that a weak reference to the values of `TripleDict` won't work: What else could one do? Or perhaps I should try again: It could be that the errors came from the wrong choice of a callback function.

Mmm, have to think more about all of that...

### comment:84 Changed 10 years ago by SimonKing

I think I should describe the problem in more detail.

Currently,

• There is a strong reference from `__main__` to the action and coercion caches that are stored in the coercion model. That's fine.
• Parents store actions as well. Some of these parents (the integer ring, for example) will always stay in memory.
• There is a strong reference from any action and homset back to domain and codomain, which are used as keys in the cache. I think that's fine: If a homset is alive then its domain and codomain must remain alive as well.
• The action and coercion caches have a strong reference to the values.
• There is no direct reference from domain and codomain to the homset.

Hence, if an action is stored in the action cache, then there will always be a chain of strong references from `__main__` via the cache to the action, and further to domain and codomain, so that it can not be collected.

On the other hand, if weak references to the values of the action and coercion caches are used, then an action or a coercion could die even when domain and codomain were still alive. That's probably not good. To the very least, it would imply that actions would need to be re-created over and over again.

How could that be solved?
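One candidate answer, sketched in plain Python with hypothetical stand-ins (`Parent`, `Action`) rather than the Sage classes: referencing the cached value weakly breaks the chain from the cache to domain and codomain, at exactly the price described above — the action dies as soon as nobody else holds it, even while its domain is still alive.

```python
import gc
import weakref

class Parent:
    """Hypothetical stand-in for a domain/codomain."""

class Action:
    def __init__(self, domain):
        self.domain = domain     # strong reference back to the cache key

cache = weakref.WeakValueDictionary()
D = Parent()
A = Action(D)
cache[D] = A
watcher = weakref.ref(D)

del D
gc.collect()
assert watcher() is not None   # fine: the action A is still in use

del A
gc.collect()
assert watcher() is None       # value gone -> entry dropped -> domain freed
```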

### comment:85 Changed 10 years ago by jpflori

If I understand correctly, the keys to all these dictionaries are triples, and what you've done is that if the elements in that triple are weakrefs to objects that are no longer strongly referenced, the key-value pair gets deleted, so that garbage collection occurs for these only weakly referenced objects?

Therefore, if ZZ._action_hash gets deleted, only weakrefs to the abelian groups of points of the curve should be left in the coercion model (in the second triple dict _action_maps) and it should not prevent garbage collection...

### comment:86 Changed 10 years ago by jpflori

Maybe the problem is that the value in that last dict corresponding to the triple with an (abelian group of points of a) curve is the IntegerMulAction (a weakref to it, unless it is not weakly referenceable), which itself has a strong ref to the curve, which would prevent the curve from getting collected?

### comment:87 in reply to: ↑ 74 Changed 10 years ago by jpflori

This is basically what you concluded in comment 74, so a solution could be to allow Functors to be weakly referenceable, or to make them use weak refs for their [co]domains? You posted that making IntegerMulAction weakly referenceable is not enough; could you post a patch applying these ideas so that I can play with it?

### Changed 10 years ago by SimonKing

Experimental patch using weak references for actions

### comment:88 Changed 10 years ago by SimonKing

I was mistaken: Using weak references to the actions (on top of the other patch) does fix the leak - see the experimental patch that I have just posted.

I did not run any doctests on it, yet. But I tested that only one `SchemeHomsetModule...` survives the garbage collection (and I did not delete E).

### comment:89 Changed 10 years ago by jpflori

I confirm the leak is fixed with your last patch, just launched a ptestlong.

### comment:90 Changed 10 years ago by SimonKing

My first impression from some tests is that the additional patch causes a massive slow-down.

### comment:91 follow-up: ↓ 92 Changed 10 years ago by jpflori

I ran the sage test suite on the same pc, on a 4.8.alpha5 with patches and a 4.7.2 without patches, just on the schemes directory, and got 1512 vs 1060 seconds... Some files required between 2 and 3 times more time (e.g. hyperelliptic_padic_field, heegner and ell_number_field, which are already quite long; I'd say mostly the files that are already long), others did not change at all.

### comment:92 in reply to: ↑ 91 Changed 10 years ago by SimonKing

I ran the sage test suite on the same pc, on a 4.8.alpha5 with patches and a 4.7.2 without patches, just on the schemes directory and got 1512 vs 1060 sec...

That is not half as bad as I thought! However, it is far from good.

Do you also have the time with only the first patch, resp. with #715 only? After all, #715 uses weak references and may thus be responsible for some slow-down.

### comment:93 Changed 10 years ago by SimonKing

I have slightly modified trac11521_triple_homset.patch: in the old version, I had created a `TripleDict(50)`, but meanwhile I learnt that the parameter of the `TripleDict` should not be even and should best be a prime number. In the new version, it is prime...
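A sketch of why the bucket count matters (the sizes 50 and 53 are just examples, and the 16-byte alignment is an assumption about typical object addresses): `TripleDict` buckets by `id()`-based values, and object addresses tend to be multiples of a power of two, so an even table size leaves many buckets unreachable while a prime size spreads the entries over all of them.

```python
# Simulated id()-like values: addresses aligned to 16 bytes.
ids = [16 * k for k in range(1000)]

buckets_even = len({i % 50 for i in ids})    # even table size
buckets_prime = len({i % 53 for i in ids})   # prime table size

assert buckets_even == 25    # only half of the 50 buckets are ever hit
assert buckets_prime == 53   # a prime size uses every bucket
```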

Concerning the weak references: Why exactly is the experimental patch so slow? Is it because the access to the weakly referenced actions is so slow? Or is it because the actions are garbage collected even if their domain/codomain is still alive, so that the actions are uselessly created over and over again? I suspect it is the latter.
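The re-creation hypothesis is easy to illustrate in plain Python (a sketch with a hypothetical `get_action`, not the coercion model's code): with a weak-value cache and no strong reference kept by the caller, the cached action dies between lookups and is rebuilt on every use.

```python
import weakref

class Action:
    created = 0
    def __init__(self):
        Action.created += 1   # count how often the "action" is built

_cache = weakref.WeakValueDictionary()

def get_action(key):
    A = _cache.get(key)
    if A is None:             # already collected -> rebuild from scratch
        A = _cache[key] = Action()
    return A

get_action('ZZ on E')         # caller drops the result immediately...
get_action('ZZ on E')         # ...so the action is created again
assert Action.created == 2
```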

### comment:94 follow-up: ↓ 95 Changed 10 years ago by jpflori

I guess you're right.

Here is a piece of code emphasizing the problem.

```sage: K = GF(1<<60, 't')
sage: a = K.random_element()
sage: E = EllipticCurve([1, 0, 0, 0, a])
sage: P = E.random_point()
sage: 2*P
...
sage: ZZ._introspect_coerce()['_action_hash']
{(<type 'list'>, ...} # No Abelian group of points of the curve!!!
sage: get_coercion_model().get_cache()[1]
{(Integer Ring, Abelian group of points on Elliptic Curve defined by ..., <built-in function mul>): <weakref at ...; dead>, ...} # the "dead" are bad

```

Indeed, as all the dicts for coercion caches hold weakrefs to their values, the actions get garbage collected. That becomes kind of tricky...

As pointed out before, the problem is that if we keep strong references to the actions in the values of the dict, these actions themselves have strong refs to the domain and codomain and so prevent the whole garbage collection from occurring. Is it sensible to use weakrefs for [co]domains in Functors? Then garbage collection can occur in the cache dicts, but anyone actually using functors directly must be sure to have some strong references to the domain and codomain somewhere to avoid garbage collection...

Another question: why not use a TripleDict in the parent class for the _action_hash dict rather than a WeakValueDict with three keys? That could somehow unify the approach taken here!

### comment:95 in reply to: ↑ 94 ; follow-up: ↓ 96 Changed 10 years ago by SimonKing

Indeed, has all the dicts for coercion caches have weakrefs to their values, the actions get garbage collected. That becomes kind of tricky...

Yes, and note that we have two locations for storing the actions:

• in the coercion model
• in the attribute `_action_hash` of parents.

I found that having weak references in the coercion model is enough for fixing the leak - even if one has strong references in `_action_hash`. That is something I don't fully understand. In the example of the ticket description, we have an action of `ZZ`. `ZZ` is not subject to garbage collection, hence, having a strong reference in `ZZ._action_hash` should keep the action alive, and thus the elliptic curve (namely the different copies of E that are created in the loop).

Anyway, even in that case we would see the drastic slow-down.

As pointed before, the problem is that if we let strong reference to the actions in the values of the dict, these action themselves have strong refs to the domain and codomain and so prevent the whole garbage collection to occur. Is it sensible to use weakrefs for [co]domains in Functors? Hence garbage collection can occur in the cache dicts, but if someone actually use functors directly, he must be sure to have some strong references to its domain and codomain somewhere to avoid garbage collection...

Indeed that would be the consequence. I think that would not be acceptable: If the user keeps an action, then s/he can rightfully expect that domain and codomain are automatically kept alive.

Another question: why not use a TripleDict? in the parent class for the _action_hash dict rather that a WeakValueDict? with three keys? That could somehow unify the approach taken here!

I was thinking of that, too. However, in addition to `_action_hash`, the parent also has an `_action_list`. And that might be a bigger problem.

### comment:96 in reply to: ↑ 95 ; follow-up: ↓ 97 Changed 10 years ago by jpflori

Yes, and note that we have two locations for storing the actions: * in the coercion model * in the attribute `_action_hash` of parents. I found that having weak references in the coercion model is enough for fixing the leak - even if one has strong references in `_action_hash`. That is something I don't fully understand.

As I posted above, in ZZ._action_hash equality is tested with "==" (rather than identity with "is"), so in the code of the ticket description, where only "one" curve is used, only one curve gets stored in _action_hash. If you try the code posted some comments above, where (really) different curves are used, you'll see that the leak is not fixed if you don't use a similar approach for _action_hash as for the coercion model.

In the example of the ticket description, we have an action of `ZZ`. `ZZ` is not subject to garbage collection, hence, having a strong reference in `ZZ._action_hash` should keep the action alive, and thus the elliptic curve (namely the different copies of E that are created in the loop). Anyway, even in that case we would see the drastic slow-down.

Indeed that would be the consequence. I think that would not be acceptable: If the user keeps an action, then s/he can rightfully expect that domain and codomain are automatically kept alive.

The weakref domain and codomain in functors problem could be tackled by adding an optional parameter for the use of weakref.

By default, the behavior of functors would be as before with strong refs (and the option to false).

For the coercion models we would set the option to True and use weakref whenever possible.

### comment:97 in reply to: ↑ 96 ; follow-up: ↓ 98 Changed 10 years ago by SimonKing

As I posted above in ZZ._action_hash equality is tested with "==" (rather than identity with "is") so in the code of the ticket description where only "one" curve is used, only one curve gets stored in _action_hash.

Yes, but then I don't understand why there is no error: In some places, the coercion model tests for identity, and raises a big fat "coercion BUG" otherwise.

The weakref domain and codomain in functors problem could be tackled by adding an optional parameter for the use of weakref.

By default, the behavior of functors would be as before with strong refs (and the option to false).

For the coercion models we would set the option to True and use weakref whenever possible.

Again, I believe that it must not be up to the user: "Evidently" (for a human), the different copies of E in the while loop can be garbage collected. It is our job to hammer the "evidence" into Sage. We shouldn't simply say "let the user decide" whether he wants a leak or a potential disaster: I would call it a disaster if one has an action and suddenly its codomain is gone.

### comment:98 in reply to: ↑ 97 Changed 10 years ago by jpflori

Yes, but then I don't understand why there is no error: In some places, the coercion model tests for identity, and raises a big fat "coercion BUG" otherwise.

I'll have a look at it; I'm not completely up to speed on what the coercion model actually does :) I guess the doc could also be extended on that :)

The weakref domain and codomain in functors problem could be tackled by adding an optional parameter for the use of weakref.

By default, the behavior of functors would be as before with strong refs (and the option to false).

For the coercion models we would set the option to True and use weakref whenever possible.

Again, I believe that it must not be up to the user: "Evidently" (for a human), the different copies of E in the while loop can be garbage collected. It is our job to hammer the "evidence" into Sage. We shouldn't simply say "let the user decide" whether he wants a leak or a potential disaster: I would call it a disaster if one has an action and suddenly its codomain is gone.

That would not really be up to the user if by default the functor does not use weakref, but if we change this behavior for the coercion model by switching the option. It would be fatly documented not to set the option to True for general use, so normal use would not lead to problems. Of course it all depends on what you mean by "up to the user". One could still intentionally set the option to true, and then cry that its codomain disappeared...

### comment:99 Changed 10 years ago by jpflori

In `verify_action` of the coercion model there is a `PY_TYPE_CHECK(action, IntegerMulAction)` that might explain the absence of the fat BUG error.

### comment:100 follow-up: ↓ 101 Changed 10 years ago by jpflori

Anyway, it does not make much sense to me that the `_action_hash` dict in the `Parent` class uses "==" rather than "is", especially since the `TripleDict` objects of the coercion model do. What do you think? Do you have any good reason to justify such a choice that I might have missed?

### comment:101 in reply to: ↑ 100 Changed 10 years ago by SimonKing

Anyway, it does not make much sense to me that the `_action_hash` dict in the `Parent` class uses "==" rather than "is", especially since the `TripleDict` objects of the coercion model do.

The old `TripleDict` didn't - I changed that only in #715. Perhaps it is debatable whether that change is fine. But:

What do yo think? Have you any good reason to justify such a choice that I might have missed?

See sage/structure/coerce.pyx:

```            if y_mor is not None:
all.append("Coercion on right operand via")
all.append(y_mor)
if res is not None and res is not y_mor.codomain():
raise RuntimeError, ("BUG in coercion model: codomains not equal!", x_mor, y_mor)
```

That is why I think one should test for identity, not equality.

On the other hand, we also have

```        # Make sure the domains are correct
if R_map.domain() is not R:
if fix:
connecting = R_map.domain().coerce_map_from(R)
if connecting is not None:
R_map = R_map * connecting
if R_map.domain() is not R:
raise RuntimeError, ("BUG in coercion model, left domain must be original parent", R, R_map)
if S_map is not None and S_map.domain() is not S:
if fix:
connecting = S_map.domain().coerce_map_from(S)
if connecting is not None:
S_map = S_map * connecting
if S_map.domain() is not S:
raise RuntimeError, ("BUG in coercion model, right domain must be original parent", S, S_map)
```

in the same file. These lines apparently cope with the fact that there are non-unique parents, and they suggest that `==` is the right thing to do in the coerce caches.

But I think this is a discussion that belongs to #715.
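The identity-versus-equality distinction under discussion can be illustrated with plain dicts (a toy sketch; Sage's `TripleDict`/`TripleDictById` additionally deal with weak references and callbacks):

```python
# Toy illustration (plain dicts, not Sage's TripleDict): keying by "=="
# merges equal-but-distinct objects, keying by identity keeps them apart.
a = tuple(range(2))      # equal ...
b = tuple(range(2))      # ... but distinct objects
assert a == b and a is not b

by_eq = {a: "first"}
by_eq[b] = "second"      # "==" keying: b overwrites a's entry
assert len(by_eq) == 1

by_id = {id(a): "first"}
by_id[id(b)] = "second"  # identity-style keying: two separate entries
assert len(by_id) == 2
# (raw id() keys are unsafe on their own: once an object dies its address
# may be reused; an identity-keyed cache must guard against that)
```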

### comment:102 Changed 10 years ago by SimonKing

My suggestion at #715 is to use different ways of comparison for actions and coerce maps, respectively. This complies with existing defaults in sage/structure/coerce.pyx.

### comment:103 Changed 10 years ago by jpflori

I agree to move the discussion for "==" and "is" to #715 (and also agree with your solution).

So, what about the original problem here and your thoughts about moving the weakref from the action itself to its domains and codomains?

If you really dislike the idea of having an option switched off by default (and a priori only on in the coercion model, not outside it, unless one knows what one is doing), we could mimic what you want to do with the `TripleDict` and have the usual `Functor` and a `Functor_weakref` version (or you could go the other way around and add an extra argument to the `TripleDict` constructor taking the equality operator as argument :) rather than having two classes)?

### comment:104 Changed 10 years ago by SimonKing

Interestingly, the experimental patch for #715 that I have not submitted yet (need more tests) seems to be almost enough to fix the memory leak from the ticket description!

Namely:

```sage: <the usual loop>
Traceback
...
KeyboardInterrupt:
sage: import gc
sage: gc.collect()
585
sage: from sage.schemes.generic.homset import SchemeHomsetModule_abelian_variety_coordinates_field
sage: LE = [x for x in gc.get_objects() if  isinstance(x,SchemeHomsetModule_abelian_variety_coordinates_field)]
sage: len(LE)
2
```

And the experimental patch is not using weak references to the values of the `TripleDict` - only to the keys! Perhaps this ticket will eventually be a duplicate of #715?

Anyway, I need more tests and will probably submit the patch to #715 later today.

### comment:105 Changed 10 years ago by jpflori

If you used "==" for equality testing for actions in the coercion model (as in the parent caches) and let "j=j" in the "usual piece of code", this is not surprising because for "==" all the curves are equal, so there will be no duplication nor memory leak.

To perform a better test, you should replace j=j by j=K.random_element() in the constructor of the elliptic curve (as I did a few comments up) to get really different curves (i.e. for "is" and for "=="). I fear the memory leak is still there with this second example...

### Changed 10 years ago by SimonKing

Experimental: Have two versions of `TripleDict`, using "==" or "is" for comparison

### comment:106 Changed 10 years ago by SimonKing

Sorry, I have just posted a patch to the wrong ticket. The new patch belongs to #715, not to here. Just ignore it.

### comment:107 Changed 10 years ago by SimonKing

• Milestone changed from sage-4.8 to sage-duplicate/invalid/wontfix
• Status changed from needs_work to needs_info

Concerning "wrong ticket": Shouldn't we consider this ticket a duplicate of #715? After all, the examples from the two ticket descriptions are almost identical.

### comment:108 follow-up: ↓ 109 Changed 10 years ago by jpflori

That's what I originally suggested :) See the first comments on this ticket.

When I found out about ticket #715 I copied the description from here to there (I must have introduced some typographical differences along the way) because I thought the problem belonged there and the original description of ticket #715 was non-existent.

### comment:109 in reply to: ↑ 108 Changed 10 years ago by SimonKing

That's what I originally suggested :) See the first comments on this ticket.

When I found out about ticket #715 I copied the description from here to there (I must have introduced some typographical differences along the way) because I thought the problem belonged there and the original description of ticket #715 was non-existent.

I see. And I also found that I was originally responsible for not making it a duplicate.

The reason was that I found another leak: It is the strong cache for the homsets, and that is not addressed in #715.

So (question to the release manager), what shall we do? Mark this as a duplicate and open a different ticket for the cache of homsets? Or change the ticket description such that it is only about homsets, not about elliptic curves?

### comment:110 Changed 10 years ago by zimmerma

before tagging that ticket as a duplicate of #715, I'd like to check the leak is indeed fixed with the (upcoming) patch of #715.

Paul

### comment:111 Changed 10 years ago by jpflori

I fear the work on #715 is far from being finished...

### comment:112 Changed 10 years ago by SimonKing

• Description modified (diff)
• Milestone changed from sage-duplicate/invalid/wontfix to sage-5.0
• Status changed from needs_info to needs_review

### comment:113 Changed 10 years ago by SimonKing

The original example of this ticket is in fact fixed with #715, but I think the other problem should be dealt with here, namely by using a weak cache for homsets.

### comment:114 Changed 10 years ago by SimonKing

Note that with the patch applied on top of #715, the memleak is fixed:

```sage: for p in prime_range(10^5):
....:     K = GF(p)
....:     a = K(0)
....:
sage: import gc
sage: gc.collect()
8320
sage: LE = [x for x in gc.get_objects() if  isinstance(x,type(K))]
sage: len(LE)
2
sage: LE
[Finite Field of size 2, Finite Field of size 99991]
```

Later today, I'll run doctests.

### comment:115 Changed 10 years ago by SimonKing

Note that the patch was written before the latest version of the patch from #715 was created. In particular, it uses `TripleDict`, but should better use `TripleDictById` (which hasn't been there when I wrote the patch).

### comment:116 Changed 10 years ago by SimonKing

Not bad: `make ptest` results in only one error with the patch and its dependencies applied on top of sage-5.0.prealpha0:

```sage -t -force_lib "devel/sage/sage/rings/polynomial/multi_polynomial_libsingular.pyx"
Exception AttributeError: 'PolynomialRing_field_with_category' object has no attribute '_modulus' in  ignored
Exception AttributeError: 'PolynomialRing_field_with_category' object has no attribute '_modulus' in  ignored
**********************************************************************
File "/home/simon/SAGE/sage-5.0.prealpha0/devel/sage/sage/rings/polynomial/multi_polynomial_libsingular.pyx", line 418:
sage: len(ring_refcount_dict) == n
Expected:
True
Got:
False
**********************************************************************
1 of  18 in __main__.example_4
***Test Failed*** 1 failures.
For whitespace errors, see the file /home/simon/.sage//tmp/multi_polynomial_libsingular_6755.py
[3.6 s]

----------------------------------------------------------------------
The following tests failed:

sage -t -force_lib "devel/sage/sage/rings/polynomial/multi_polynomial_libsingular.pyx"
Total time for all tests: 3.6 seconds
```

### comment:117 Changed 10 years ago by SimonKing

It turns out that the failing test had in fact a wrong design. It was:

```            sage: import gc
sage: from sage.rings.polynomial.multi_polynomial_libsingular import MPolynomialRing_libsingular
sage: from sage.libs.singular.ring import ring_refcount_dict
sage: n = len(ring_refcount_dict)
sage: R = MPolynomialRing_libsingular(GF(547), 2, ('x', 'y'), TermOrder('degrevlex', 2))
sage: len(ring_refcount_dict) == n + 1
True

sage: Q = copy(R)   # indirect doctest
sage: p = R.gen(0) ^2+R.gen(1)^2
sage: q = copy(p)
sage: del R
sage: del Q
sage: del p
sage: del q
sage: gc.collect() # random output
sage: len(ring_refcount_dict) == n
True
```

Hence, before n is defined, no garbage collection takes place. This is, of course, not correct: The ring_refcount_dict may contain references to a ring created in another doctest, that is only garbage collected in the line before `len(ring_refcount_dict)==n`.

In other words: The test did not fail because there was a memory leak.

When I insert a garbage collection right before the definition of n, the test works, and in addition the warning about `AttributeError` being ignored vanishes.

The patch fixes the memory leak caused by a strong homset cache (which was not addressed by #715):

```sage: for p in prime_range(10^3):
....:     K = GF(p)
....:     a = K(0)
....:
sage: import gc
sage: gc.collect()
3128
sage: LE = [x for x in gc.get_objects() if  isinstance(x,type(K))]
sage: LE
[Finite Field of size 2, Finite Field of size 997]
```

The patch adds a new doctest demonstrating that it is fixed.

Needs review!

### comment:118 Changed 10 years ago by jpflori

I'll review this patch as soon as #715 is finally closed.

I think the ticket title also needs to be changed to reflect the issue addressed, namely that when homsets are created, a strong ref is kept forever in sage.???.homset.

### comment:119 Changed 10 years ago by SimonKing

• Status changed from needs_review to needs_work
• Summary changed from Memleak when resolving the action of Integers on an Elliptic Curve to Use weak references to cache homsets

When applying the new patch from #715, all tests pass. But when also applying the patch from here, `make ptest` results in

```sage -t  -force_lib devel/sage/sage/libs/singular/ring.pyx # 6 doctests failed
```

The failing tests concern `ring_refcount_dict` - and it is perhaps not surprising that a patch enabling garbage collection of rings has an influence on refcount, such as:

```   sage: ring_ptr = set(ring_refcount_dict.keys()).difference(prev_rings).pop()
Exception raised:
Traceback (most recent call last):
File "/home/simon/SAGE/sage-5.0.prealpha0/local/bin/ncadoctest.py", line 1231, in run_one_test
self.run_one_example(test, example, filename, compileflags)
File "/home/simon/SAGE/sage-5.0.prealpha0/local/bin/sagedoctest.py", line 38, in run_one_example
OrigDocTestRunner.run_one_example(self, test, example, filename, compileflags)
File "/home/simon/SAGE/sage-5.0.prealpha0/local/bin/ncadoctest.py", line 1172, in run_one_example
compileflags, 1) in test.globs
File "<doctest __main__.example_8[9]>", line 1, in <module>
ring_ptr = set(ring_refcount_dict.keys()).difference(prev_rings).pop()###line 452:
sage: ring_ptr = set(ring_refcount_dict.keys()).difference(prev_rings).pop()
KeyError: 'pop from an empty set'
```

Anyway, I need to analyse what happens, and provide a new patch after catching some sleep.

Jean-Pierre, I hope you like the new ticket title!

### comment:120 Changed 10 years ago by jpflori

Yes, better title!

I've not had any time to look at these cache tickets in the last few days, but rest assured I'll have a look soon.

### Changed 10 years ago by SimonKing

Use the weak `TripleDict` from #715 for the cache of homsets

### comment:121 Changed 10 years ago by SimonKing

• Status changed from needs_work to needs_review

The doctest problem was easy enough to solve: when starting the example with a garbage collection, things work. Actually I am surprised that the doctest framework does not do a garbage collection at the beginning of each example, in order to bring the system into a defined state - I'll ask on sage-devel.

Anyway, since the problem is solved and the patch updated, stuff is now ready for review!

Apply trac11521_triple_homset.patch

### comment:122 follow-up: ↓ 123 Changed 10 years ago by jpflori

I've looked at the patch and have nothing to say, except that weakref is imported twice; I'll provide a reviewer patch for that.

Ha, maybe we should mention the caching system in the doc as well, I'm ready to do that in a reviewer patch as well.

However, I'll wait for #715 to be closed before positive reviewing this one, just in case the changes there have consequences here (which should not be the case).

### comment:123 in reply to: ↑ 122 Changed 10 years ago by SimonKing

Ha, maybe we should mention the caching system in the doc as well, I'm ready to do that in a reviewer patch as well.

Oops, you are right! I thought I had added to the docs, but I just added a doctest, no elaborate explanation.

### comment:124 Changed 10 years ago by jpflori

• Status changed from needs_review to needs_info

While finally looking at this ticket seriously and adding some doc, I noticed that if you run

```sage: V = VectorSpace(QQ,3)
sage: H = Hom(V,V)
sage: H is Hom(V,V)
False

```

This happens with this ticket's patches but also WITHOUT them.

Is this a consequence of #9138 ?

What's strange is that V does not break the unique parent assumption:

```sage: V is VectorSpace(QQ, 3)
True

```

Is that intended because there is a ? It does not look very right to me.

The same "problem" occurs playing for example with finite fields...

With more classical stuff, the cache seems to work as intended

```sage: H = Hom(ZZ, QQ)
sage: H is Hom(ZZ, QQ)
True

```

This last point lets me think that we can discuss and fix the `VectorSpace` situation elsewhere, if this is indeed a problem.

My concern here is that it's hard to provide a "smart" example where the hom set is cached and where it can be garbage collected when its domain and codomain are.

I don't think deleting ZZ and QQ is a good idea :) (and they would be referenced elsewhere anyway)

### comment:125 Changed 10 years ago by jpflori

My bad, in fact the "issue" seems to be that X._Hom_(Y, category) succeeds but returns non identical results.

### comment:126 Changed 10 years ago by jpflori

• Description modified (diff)
• Status changed from needs_info to needs_review

Here is a reviewer patch adding a little doc.

I hope you will find it satisfactory.

If so, feel free to set the ticket to positive review; I personally feel happy with your code.

My above rant, if it needs to be addressed, should be addressed elsewhere.

### comment:127 Changed 10 years ago by davidloeffler

To reiterate what I just posted at #715, I've just successfully run doctests against 5.0.beta10 with the following qseries:

```trac715_one_triple_dict.patch
trac_715-reviewer.patch
trac_715-rebase_11599.patch
trac11521_triple_homset.patch
trac_11521-reviewer.patch
```

The reviewer patch from jpflori looks fine to me. Is this ready to go now?

### comment:128 Changed 10 years ago by SimonKing

• Reviewers set to Jean-Pierre Flori
• Status changed from needs_review to positive_review

The reviewer patch looks fine to me. Sorry that I did not react a few days ago. Since Jean-Pierre said I should set it to positive review, I am now doing so.

### comment:129 Changed 10 years ago by davidloeffler

• Description modified (diff)

### comment:130 Changed 10 years ago by jdemeyer

• Milestone changed from sage-5.0 to sage-pending

### Changed 10 years ago by SimonKing

Use the weak TripleDict? from #715 for the cache of homsets. Includes the reviewer patch

### comment:131 Changed 10 years ago by SimonKing

• Description modified (diff)

I have created a combined patch, by folding the two patches (i.e., mine and Jean-Pierre's reviewer patch).

With #715 and this patch, all doctests pass on my machine. So, from my perspective, it is a positive review, but we should wait for Jean-Pierre's results on 32 bit.

For the patchbot:

Apply trac_11521_homset_weakcache_combined.patch

### comment:132 Changed 10 years ago by cremona

I'm just confirming that applying this patch & that at #715 to 5.0-beta13 on a 32-bit linux machine, all tests pass.

### comment:133 Changed 10 years ago by jdemeyer

• Milestone changed from sage-pending to sage-5.1

### comment:134 Changed 10 years ago by jdemeyer

• Merged in set to sage-5.1.beta0
• Resolution set to fixed
• Status changed from positive_review to closed

### comment:135 Changed 9 years ago by jdemeyer

• Merged in sage-5.1.beta0 deleted
• Resolution fixed deleted
• Status changed from closed to new

Unmerged because unmerging #12313 while keeping #11521 gives on OS X 10.6:

```\$ ./sage -t --verbose devel/sage/sage/misc/cachefunc.pyx
[...]
Trying:
P = QQ['a, b, c, d']; (a, b, c, d,) = P._first_ngens(4)###line 1038:_sage_    >>> P.<a,b,c,d> = QQ[]
Expecting nothing

------------------------------------------------------------------------
Unhandled SIGSEGV: A segmentation fault occurred in Sage.
This probably occurred because a *compiled* component of Sage has a bug
in it and is not properly wrapped with sig_on(), sig_off(). You might
want to run Sage under gdb with 'sage -gdb' to debug this.
Sage will now terminate.
------------------------------------------------------------------------
```

### comment:136 Changed 9 years ago by jdemeyer

• Milestone changed from sage-5.1 to sage-5.2

### comment:137 Changed 9 years ago by jdemeyer

• Dependencies changed from #11900 #715 to #11900, #715, to be merged with #12313

### comment:138 Changed 9 years ago by jdemeyer

• Status changed from new to needs_review

### comment:139 Changed 9 years ago by jdemeyer

• Status changed from needs_review to needs_work

### comment:140 Changed 9 years ago by jdemeyer

• Dependencies changed from #11900, #715, to be merged with #12313 to to be merged with #715

### comment:141 Changed 9 years ago by SimonKing

• Status changed from needs_work to needs_review

I found that all tests pass on bsd.math, with

```trac12969_fix_coercion_cache.patch
trac12215_weak_cached_function-sk.patch
trac12215_segfault_fixes.patch
trac_715_combined.patch
trac_11521_homset_weakcache_combined.patch
trac_12313-mono_dict-combined-random-sk.patch
```

applied on top of sage-5.2.rc0. I think this justifies returning to needs_review. Or what is the work issue?

### comment:142 follow-up: ↓ 143 Changed 9 years ago by jpflori

I guess it mainly needs a lot of testing. The code looked fine to me. But there were these random bugs, potentially caused by a bad interaction with another part of the code. Hopefully you fixed this with your latest contributions.

### comment:143 in reply to: ↑ 142 Changed 9 years ago by SimonKing

• Dependencies changed from to be merged with #715 to #12969; to be merged with #715

I guess it mainly needs a lot of testing. The code looked fine to me. But there were these random bugs, potentially caused by a bad interaction with another part of the code. Hopefully you fixed this with your latest contributions.

I think at some point #12969 must be used (which has fixed an error that depends on the order in which tests are executed), so, I made it a dependency. See the list of patches in my previous comment.

What sporadic errors did actually occur, and on what system? I did one test run on bsd.math. Since the errors were sporadic, it might be good to test different variants (serially, parallely, with make test or with sage -pt) and multiple times.

### comment:144 Changed 9 years ago by SimonKing

I have run the tests on bsd.math in different ways. There has been no error.

According to old logs of the patch bot,

```sage -t  -force_lib "devel/sage/sage/schemes/elliptic_curves/ell_number_field.py"
```

used to crash sometimes. I repeated that test about 20 times both on my `OpenSuse` laptop and on bsd.math. Everything was fine.

Hence, I am confident that #12969 is indeed enough to solve the problems that were revealed by the other bunch of patches (#12215, #715, #11521, #12313).

### comment:145 follow-up: ↓ 146 Changed 9 years ago by nbruin

sage/categories/homset.py:

```_cache = TripleDict(53)
```

I guess this 53 is a tuned parameter. Can you comment on the choice?

line 214

```    try:
H = _cache[key]()
except KeyError:
H = None
if H:
```

Could it happen that bool(H) is false if H is not None? Perhaps safer to test `H is not None`? I'm not saying that you should. Just comment please.

line 247 (same thing)

line 263

```    _cache[key] = weakref.ref(H)
```

Can you clarify the following: The way I read this is that H will not be protected against GC (good!), but that `_cache` will be storing a strong(?) reference to `key`, meaning that the entry in `_cache` under `key` will linger with a dead weakref afterwards. Shouldn't there be a callback registered here with the weakref that contacts `_cache` to remove the `key` entry when `H` gets collected? It may be that your #715 changes already make `_cache` into essentially a `WeakKeyDict`, but it seems to me that `_cache` should still have a callback registered on the `weakref` to `H` as well, to remove the entry if `H` gets GCd.

I haven't done extensive testing myself, but you have and the bot happy, so I don't think me running tests would change much.

### comment:146 in reply to: ↑ 145 Changed 9 years ago by SimonKing

• Status changed from needs_review to needs_work

sage/categories/homset.py:

```_cache = TripleDict(53)
```

I guess this 53 is a tuned parameter. Can you comment on the choice?

My patch in #715 is, of course, based on old code. In the old code, there were comments on the choice of the parameter, that I summarise here. The parameter gives the initial number of buckets in a `TripleDict`. Thus, in order to ensure a good distribution of the items into the buckets, the parameter should be prime and should certainly not be even; there actually is a doc test that shows what happens if the parameter is even. It should not be too big, because it determines the number of empty buckets that are created during initialisation of the `TripleDict`. And it should not be too small, because otherwise the `TripleDict` would soon be resized (which is a costly operation).

I thought that 53 is a good choice, given these comments. However, it is not tuned by experiments.
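A rough illustration of why an odd (ideally prime) bucket count matters — a hypothetical experiment, not Sage code, relying on CPython's power-of-two-aligned object addresses:

```python
# CPython object addresses are aligned to a power of two, so with an even
# bucket count many buckets can never be hit, while a prime such as 53
# spreads the same ids over nearly all buckets.
objs = [object() for _ in range(1000)]   # keep alive so ids stay distinct

def buckets_used(nbuckets):
    return len({id(o) % nbuckets for o in objs})

few = buckets_used(64)    # typically a handful (addresses are 16-byte aligned)
many = buckets_used(53)   # typically close to 53
assert few < many
```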

line 214 ... Could it happen that bool(H) is false if H is not None? Perhaps safer to test `H is not None`? I'm not saying that you should. Just comment please.

That line is old code that has not been introduced by my patch, right? I think one should test `H is not None` - mainly for speed, but also to guard against bool(H) being false even though H is not None.

line 247 (same thing)

line 263

```    _cache[key] = weakref.ref(H)
```

Can you clarify the following: The way I read this is that H will not be protected against GC (good!), but that `_cache` will be be storing a strong(?) reference to `key`, meaning that the entry in `_cache` under `key` will linger with a dead weakref afterwards.

Correct, at least theoretically.

The reason for the code was that at some point one needed to break a chain of strong references. In the memory leak fixed here, there was a reference from H back to the keys under which H is stored in the `TripleDict`. Hence, practically, when H gets collected, the item in the `TripleDict` becomes collectable as well. Hence, the dead weakref will soon vanish.

At least, that's what should happen. But:

Shouldn't there be a callback registered here with the weakref that contacts `_cache` to remove the `key` entry when `H` gets collected? It may be that your #715 changes already make `_cache` into essentially a `WeakKeyDict`,

Yes. A `WeakKeyDict`, but not a `WeakValueDict`. That's why I am using a weak reference to H.

but it seems to me that `_cache` should still have a callback registered on the `weakref` to `H` as well, to remove the entry if `H` gets GCd.

Recall that each `TripleDict` has an eraser. I think, at least with the new patch at #715 that I have posted today, such a callback would be easily possible.

I haven't done extensive testing myself, but you have and the bot happy, so I don't think me running tests would change much.

Well, in an ideal world, a different brain will be able to invent different stress tests and thus find different bugs...

Anyway. I think I should turn `if H:` into `if H is not None:` and should introduce a callback to the weak reference.

### comment:147 Changed 9 years ago by SimonKing

Another comment: I use a weak reference to H (the homset) on the one hand, and a weak reference to the category C as the third argument to the `TripleDict`, on the other hand (see line 212). Since the keys of a `TripleDict` are compared by identity and not by equality, I can not use a callback function in the weak reference to C. Namely, weakref.ref(C) is identical with weakref.ref(C), whereas two weak references with callback are equal but not identical.

In conclusion, I will use a callback in the weak reference to H, so that the dictionary item gets collected as soon as H gets collected. But I think I can not use a callback in the weak reference to C. Anyway, a callback for C is not needed: If H is alive, it has a reference to C and keeps C alive. If H dies, then the callback for the weak reference to H will remove the item, and thus C can be collected as well.
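The identity behaviour described here can be checked directly in CPython (the reuse of the plain weakref is a CPython implementation detail, not a language guarantee):

```python
import weakref

class C:
    pass

obj = C()
# CPython reuses the plain (callback-free) weakref for a given object:
assert weakref.ref(obj) is weakref.ref(obj)

# With a callback, every call creates a fresh weakref object; while obj
# is alive the two compare equal, but they are not identical, so an
# identity-keyed TripleDict would treat them as different keys.
cb = lambda wr: None
r1 = weakref.ref(obj, cb)
r2 = weakref.ref(obj, cb)
assert r1 == r2
assert r1 is not r2
```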

### comment:148 Changed 9 years ago by SimonKing

Now I wonder: Why did I explicitly put a weak reference to the category as a key for the `TripleDict`? Would the `TripleDict` not use weak references to all three parts of the key?

I think the reason for the "weakref.ref(category)" was that in older versions of `TripleDict` I did indeed not use a weak reference to the third part of the key.

### Changed 9 years ago by SimonKing

Use a callback, to make sure that items in the homset cache are deleted if the homset is garbage collected

### comment:149 Changed 9 years ago by SimonKing

• Description modified (diff)
• Status changed from needs_work to needs_review

OK, the additional patch has been posted. It removes the explicit weak reference to the category (`TripleDict` uses a weak reference anyway), adds a callback to the weak reference to the homset, so that an item of the homset cache gets deleted if the homset is garbage collected, and it replaces "if H:" by "if H is not None:".

I doctested sage/schemes/ (heuristics: Most bugs I ever authored resulted in a segfault in sage/schemes :), sage/structure/ and sage/categories/homset.py (hence, the memory leak remains fixed).

Apply #715 trac_11521_homset_weakcache_combined.patch trac_11521_callback.patch

Last edited 9 years ago by SimonKing (previous) (diff)

### comment:150 Changed 9 years ago by nbruin

• Reviewers changed from Jean-Pierre Flori to Jean-Pierre Flori, Nils Bruin
• Status changed from needs_review to positive_review

Good to go! (Note that the patchbot being happy here shows that #715 is fine too. Tickets apparently cannot have post-dependencies)

### comment:151 Changed 9 years ago by jdemeyer

• Milestone changed from sage-5.3 to sage-5.4

### comment:152 follow-up: ↓ 159 Changed 9 years ago by nbruin

• Status changed from positive_review to needs_work

Aw, this weakref caching stuff is very headache-inducing. My sincere apologies. I'm afraid I have to withdraw my positive review due to a previous failure in my logic. As I see it now, the current patch reduces the caching of coercion maps to near uselessness. The fact that it DOES seem to work to some extent might give a lead to why other things are going wrong. Let me try:

Ignoring the weakref on H for now, the coercion cache here consists of a `TripleDict`, stored on Y, indexing

```(X,Y,C) : H
```

or

```(X,Y,C): None
```

Here `H` is a map from `X` to `Y` wrt to the category `C`, so it stores strong references to `X`,`Y`,`C`. That means that as long as `H` is alive, no component of the key will die, so the fact that this is stored in a `TripleDict` is irrelevant: The value `H` would keep the key alive anyway. The only exception is the value `None`. There the `TripleDict` does exactly what it should.

This cycle problem is fixed by weakreffing `H`. But the whole point of the cache is to keep `H` alive as long as all of `X`, `Y` and `C` exist, so that we can look it up. Since the normal use for coercion would be to discover the coercion map and forget about it as soon as it has been applied, one would expect the cache to be empty all the time!
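A minimal sketch of why the weakly referenced value vanishes immediately once nothing else holds the map (hypothetical `Parent`/`CoercionMap` classes, for illustration only):

```python
import gc
import weakref

class Parent:
    pass

class CoercionMap:
    def __init__(self, domain, codomain):
        self.domain = domain      # strong reference
        self.codomain = codomain  # strong reference

X, Y = Parent(), Parent()
cache = {}
# Store only a weak reference to the map, as the patch under discussion does:
cache['X->Y'] = weakref.ref(CoercionMap(X, Y))
gc.collect()
# Nothing else references the map, so the cached weakref is already dead,
# and the next coercion lookup would have to rediscover the map:
assert cache['X->Y']() is None
```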

I don't immediately see a canonical way to solve this problem. Here are some ideas that may mitigate it:

We're caching coercions on the codomain, which I think is the right place for the following reason. Long-lived parents such as `ZZ` tend to have a lot of coercions to other parents, but very few from. As observed above, we are caching the absence of coercions, but those don't involve an `H` that might keep the codomain alive. So, we could cache the absence of coercions in a `TripleDict`. In fact, we could make that a `TripleSet` if we really want.

The PRESENCE of a coercion should probably be cached with a strong reference to the coercion. That's the point of caching the thing! This should be keyed on `(X,C)` (the `Y` is implied by the cache we're looking in). Storing the key weakly has no effect with the present design of coercion.

That means that discovering those imposes a memory cost on Y. However, usually there tend not to be too many of those (is that true in the p-adics, with all those different precisions floating around? Still, that should be manageable relative to creating all finite fields up to size 10^5 and storing strong references on `ZZ` to them to record the absence of a coercion from them, which I think was the problem in the example that started this ticket)

For permanent objects with coercions from lots of objects (like the symbolic ring, real/complex fields, [number fields with declared embeddings], `QQbar` [number fields again]) we have a big problem.

One solution would be to turn off caching for those (which needs infrastructure ... I guess a flag `Y.no_coercion_caching_on_me_please=True`, which needs checking any time you're about to cache something). Or, if that is too drastic, an optional parent method `Y.is_this_a_coercion_cachable_domain(X)` or `Y.is_this_a_coercion_cachable_domain_type(type(X))`.

Another would be to warn people that mixing parents makes the "larger" parent remember the "smaller" one, so if you let lots of "smaller" parents interact with a "larger" one, you'll be using memory for the lifetime of that "larger" one.

The "proper" solutions below would force coercion maps to be different from the normal homomorphisms handed to users. That's because if a user keeps a reference `h` to an element of `Hom(X,Y)`, he/she rightfully expects that to keep `X` and `Y` alive.

We'd have to weakly key the cache, but still strongly store the coercion map.

The "proper" (but possibly too expensive) option is to not store `X` on coercion maps. If we have `H=Y.coercion_map_from(X)` and `H` is used correctly, then `H(x)` can get access to X via `x.parent()` anyway, so `H` doesn't strictly need a reference to `X`.

So the object we store in the cache could be a map-like object `cH` that doesn't store its domain. Upon calling `Y.coercion_map_from(X)` we could wrap `cH` into a proper `Hom(X,Y)` member that does have a strong reference to X. This might be too expensive for something so fundamental as coercion.
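A rough sketch of that wrapping idea in pure Python; all class and attribute names here (`DomainlessMap`, `FullMap`) are hypothetical, not Sage API:

```python
class DomainlessMap:
    """Sketch of the cached object cH: it knows its codomain and how to map
    elements, but holds no reference to its domain.  Hypothetical names,
    not Sage API."""
    def __init__(self, codomain, func):
        self.codomain = codomain
        self._func = func
    def __call__(self, x):
        # the domain is recoverable from the argument (x.parent() in Sage)
        return self._func(x)

class FullMap:
    """User-facing wrapper, like an element of Hom(X, Y): it also pins
    the domain with a strong reference."""
    def __init__(self, domain, inner):
        self.domain = domain
        self.codomain = inner.codomain
        self._inner = inner
    def __call__(self, x):
        return self._inner(x)

cH = DomainlessMap("Y", lambda x: ("image in Y", x))  # what the cache stores
h = FullMap("X", cH)      # what Y.coercion_map_from(X) would hand to the user
result = h(3)
```

The cache only ever holds `cH`, so it never pins the domain; the wrapping into `FullMap` happens on retrieval, which is the cost mentioned above.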

TL;DR: Do not store a `KeyedRef` to `H` but store `H` itself in the cache. We won't solve the leak completely, but at least we're not leaking on the absence of coercions as we were before.

### comment:153 Changed 9 years ago by jpflori

I'm not sure I completely follow you. Did you consider the fact that maps are also stored in the parents themselves, not only in the coercion model?

### comment:154 Changed 9 years ago by jpflori

In particular, please have a look at #12313.

### comment:155 follow-up: ↓ 157 Changed 9 years ago by nbruin

• Status changed from needs_work to positive_review

OK, I retract my concern for this patch. No coercions are stored here! In particular, any `H` occurring here is a homset, not a map. This cache simply ensures that `Hom(X,Y)` is unique. So indeed, this cache should not prevent `H` from being collected. The `KeyedRef` is entirely appropriate.

I `was` looking at #12313 when this struck me and I think the discussion above has relevance to that case. Once a coercion `h: X -> Y` is discovered, a strong reference to `h` is stored in `self._coerce_from_hash`. Since `h` stores a strong reference to `X`, we now have ensured that the lifetime of `X` is bounded below by the lifetime of `Y`. So the weak caching there only helps for `X` that do not coerce into `Y`.

How eagerly does the system use the `self._convert_from_hash`? It is quite likely that even with #12313 in place,

```for p in prime_range(1,1000):
    k = GF(p)
    a = k(1)
    b = ZZ(a)
```

would still leak, due to the conversion `GF(p) -> ZZ` being cached on `ZZ`. That would make me believe we should probably not cache conversions. There are too many of them, so it's too easy to let Sage discover enough of them that all objects in memory end up in one connected component.
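The suspected leak pattern can be mimicked in pure Python (a sketch: `StrongConversionCache` and the other names below are made up, standing in for a strongly keyed conversion cache on a permanent parent such as `ZZ`):

```python
import gc, weakref

class Parent: pass

class StrongConversionCache:
    """Codomain-side cache keyed *strongly* by the domain (sketching the
    problematic situation; hypothetical names, not Sage's code)."""
    def __init__(self):
        self._cache = {}
    def remember(self, domain, conversion):
        self._cache[domain] = conversion

ZZ_like = Parent()                    # stands in for the permanent parent ZZ
ZZ_like.conversions = StrongConversionCache()

refs = []
for p in range(5):                    # stands in for `for p in prime_range(...)`
    k = Parent()                      # a freshly created GF(p)
    ZZ_like.conversions.remember(k, "conversion GF(p) -> ZZ")
    refs.append(weakref.ref(k))
del k
gc.collect()
# every throw-away parent is pinned by the permanent codomain's cache
alive = sum(1 for r in refs if r() is not None)
```

As long as the `ZZ`-like object lives, every domain that was ever converted from stays alive with it.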

For coercion things are a little brighter because objects with lots of inbound arrows (near-universal codomains?) are rarer.

### comment:156 Changed 9 years ago by jpflori

Trying to summarize the action of the different coercion patches, here is what I remember:

• But then #11521 comes in, because we also have a global cache for homsets which prevented parents from dying. So only there the triple dicts store weakrefs to the Actions to enable gc again. (just saw you just posted...)
Last edited 9 years ago by jpflori (previous) (diff)

### comment:157 in reply to: ↑ 155 Changed 9 years ago by jpflori

Replying to nbruin: relevance to that case. Once a coercion `h: X -> Y` is discovered, a strong reference to `h` is stored in `self._coerce_from_hash`. Since `h` stores a strong reference to `X`, we now have ensured that the lifetime of `X` is bounded below by the lifetime of `Y`. So the weak caching there only helps for `X` that do not coerce into `Y`.

How eagerly does the system use the `self._convert_from_hash`? It is quite likely that even with #12313 in place,

```for p in prime_range(1,1000):
    k = GF(p)
    a = k(1)
    b = ZZ(a)
```

I see the problem. Another solution may be to use `TripleDict`s there as well rather than `MonoDict`s. Or `DuoDict`s!

### comment:158 Changed 9 years ago by jpflori

Disregard my above comment. The point of #12313 is exactly to let `k` be garbage collected, just like the elliptic curves in the #12313 description.

### comment:159 in reply to: ↑ 152 Changed 9 years ago by SimonKing

In this post, I clarify some notions and explain why the weak reference to a homset introduced here does not imply that the homset will immediately be garbage collected. Sorry, while I wrote this post, you clarified things for yourself, making my post redundant. Anyway, here it goes...

Here `H` is a map from `X` to `Y` with respect to the category `C`, so it stores strong references to `X`, `Y`, `C`.

No, it is a homset. But yes, it has strong references to X,Y,C.

The PRESENCE of a coercion should probably be cached with a strong reference to the coercion. That's the point of caching the thing!

Yes. But that should happen in a way that domain and codomain remain collectable. And that is a problem; see below.

But I think there is a lot of confusion now between coercion, homset, `TripleDict` post and prior to #715, and the way how coercion maps are cached. Let me try to straighten things a bit:

The original purpose of `TripleDict` was to store ACTIONS (not coercions). When you have an action of X on Y, then you would store the action in an attribute of X, addressed by the key triple that is formed by Y, the information what operation is considered (+ or *), and the information whether the action is on the left or on the right.

The original `TripleDict` did so by strong references to both keys and values. By #715, `TripleDict` got changed so that one has weak references to the keys.
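The mechanism can be sketched in pure Python: a toy triple dict that keys on memory addresses and holds weak references with callbacks (an illustration only, far simpler than Sage's actual `TripleDict`):

```python
import gc
import weakref

class MiniTripleDict:
    """Toy weakly keyed triple dict (illustration only, not Sage's TripleDict)."""
    def __init__(self):
        # (id(a), id(b), id(c)) -> (weakref_a, weakref_b, weakref_c, value)
        self._data = {}

    def set(self, a, b, c, value):
        key = (id(a), id(b), id(c))
        def erase(_ref, key=key):
            self._data.pop(key, None)   # callback: drop the entry when a key dies
        self._data[key] = (weakref.ref(a, erase), weakref.ref(b, erase),
                           weakref.ref(c, erase), value)

    def get(self, a, b, c):
        entry = self._data.get((id(a), id(b), id(c)))
        if entry is None:
            raise KeyError((a, b, c))
        wa, wb, wc, value = entry
        # guard against address reuse: the weakrefs must still be alive
        # and point at exactly a, b and c
        if wa() is not a or wb() is not b or wc() is not c:
            raise KeyError((a, b, c))
        return value

class Obj: pass

a, b, c = Obj(), Obj(), Obj()
d = MiniTripleDict()
d.set(a, b, c, "an action")
found = d.get(a, b, c)

del a                     # the only strong reference to `a` goes away
gc.collect()
remaining = len(d._data)  # the erase callback removed the entry
```

Note that the keys are held only weakly (via the address and the weak references), while the value is held strongly, which is exactly the asymmetry discussed in the preceding comments.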

Before #11521, that was the only application of `TripleDict`. But here, I suggest a second application, namely: Use it to store homsets. A "homset" from X to Y in the category C is the parent that contains all morphisms from X to Y in the category C (regardless whether these morphisms are coercion maps or conversion maps or anything else).

Let me emphasize again that #715, #12215 and #11521 do not change the way coercions are stored.

How are coercions stored? Before #12313, the coercions from X to Y were stored in a strong dictionary in Y, strongly keyed with X. Hence, as long as Y survives, there would be a strong reference to X. But with #12313, a weak dictionary is introduced for that purpose. Hence, even if Y persists and a coercion has been cached from X to Y, X remains collectable.

Now, back to caching `H = Hom(X,Y,category=C)` (H is a set of maps). You are right, having just a weak reference to H sounds like it would be immediately collected. If I recall correctly, my reasoning for introducing the weak reference to H has been as follows:

• Let T be the cache for the Hom function. T is a `TripleDict`, and T is in fact available under `sage.categories.homset._cache`. Hence, we have a strong permanent reference to T.
• T provides some buckets, and of course T has strong references to its buckets.
• With #715, the buckets contain some memory addresses representing the keys, and then strong references to the values.

By consequence, defining T[X,Y,C]=H means that we have a chain of strong references, starting with T (T is permanent), from T to its buckets, from the buckets to H, and from H to X, Y and C. Hence, X,Y and C will never be garbage collected, even though T itself only has weak references to X,Y and C.

Now, only define T[X,Y,C]=weakref.ref(H) (or any other kind of weak reference to H). Why is H not immediately garbage collected, in usual applications?

Usual application means: You will not just create H, but you will also create an element of H, say, phi. We have a strong reference from phi to H, because H is the parent of phi. If you do not store phi, then H remains collectable. But in a usual application, you would store phi, say, because it is a coerce map from X to Y.

Assume now that phi is a coercion, which is cached in Y. To be precise, we have Y._coerce_from_hash[X] = phi.

Prior to #12313, Y._coerce_from_hash is a usual dict. Hence, the existence of Y keeps X, phi and thus H alive.

With #12313, Y._coerce_from_hash is a `MonoDict`, which only has a weak reference to X. However, Y._coerce_from_hash has a strong reference to its buckets, the buckets have a strong reference to phi, phi has a strong reference to its parent H, and H has a strong reference to X. Hence, we have the reference cycle (with -> weak and => strong references)

```* => T -> H

Y -> X
Y => phi => H => X
H => Y
H => C
```

In conclusion, an external reference to Y will keep X alive (which is bad). But if there is no external reference to Y nor to X, then Y,X,H and C remain collectable (assuming that no `__del__` method makes the cyclic garbage collection impossible), which is good enough to fix a memleak.
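This reference structure can be reproduced in pure Python, with a `WeakKeyDictionary` standing in for the `MonoDict` of #12313 (a sketch; the class and attribute names are made up):

```python
import gc, weakref

class Parent: pass

class Homset:
    def __init__(self, domain, codomain):
        self.domain = domain        # H => X (strong)
        self.codomain = codomain    # H => Y (strong)

class Map:
    def __init__(self, parent):
        self.parent = parent        # phi => H (strong)

X, Y = Parent(), Parent()
H = Homset(X, Y)
phi = Map(H)
Y.coerce_from = weakref.WeakKeyDictionary()  # stands in for Y._coerce_from_hash
Y.coerce_from[X] = phi                       # weak key X, strong value phi

wr_X, wr_Y = weakref.ref(X), weakref.ref(Y)
del H, phi, X
gc.collect()
# (*) an external reference to Y keeps X alive via Y => phi => H => X
keeps_X = wr_X() is not None

del Y
gc.collect()
# without external references, the whole cycle is collectable
collected = wr_X() is None and wr_Y() is None
```

The first collection shows the bad part (Y pins X through the cached map), the second shows the good part (no external references means everything goes).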

Summary

• If H=Hom(X,Y,C) and the homset cache keeps a strong reference to H, then H, X, Y and C will never be collectable. That's why I introduce the weak reference, strange as it may look.
• If H=Hom(X,Y,C) is created as the parent of a coercion or conversion map phi, which is stored in Y._coerce_from_hash, then:
• An external strong reference to Y keeps phi, thus H, thus X and C alive (*).
• An external strong reference to X does not prevent Y, H or C from being collected.

In an ideal world, (*) would be improved: An external strong reference to Y would not be enough to keep X alive. Do you have any idea how this can be implemented?

### comment:160 Changed 9 years ago by SimonKing

• Status changed from positive_review to needs_work
• Work issues set to Test activity of weak references if addresses coincide

See my comment at #13370: To be on the safe side, i.e., in order to avoid the situation that a key is deallocated but its callback function isn't called, one could/should first look at the memory address, and then (if the addresses coincide) test whether the stored weak reference is still active. Namely, if it isn't, then the new key is really new, even though it is using an old address.
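The proposed check can be sketched in pure Python: an address-keyed store whose lookup validates the stored weak reference before trusting an address match (hypothetical names, not Sage's implementation):

```python
import weakref

class Obj: pass

store = {}   # id(key) -> (weakref to key, value)

def put(key_obj, value):
    store[id(key_obj)] = (weakref.ref(key_obj), value)

def get(key_obj):
    entry = store.get(id(key_obj))
    if entry is None:
        raise KeyError(key_obj)
    ref, value = entry
    # the address matched, but only trust it if the stored weak reference
    # is still alive and points at this very object; otherwise the address
    # was merely recycled and this must be treated as a miss
    if ref() is not key_obj:
        raise KeyError(key_obj)
    return value

a = Obj()
put(a, 1)
ok = get(a)   # genuine hit: the weakref is alive and is `a`
del a
# a new object allocated later may reuse the same address; the
# `ref() is not key_obj` guard turns such a stale entry into a miss
```

Without the guard, a lookup keyed only on the raw address could return a value cached for a long-dead object that happened to live at the same address.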

Hence, I'd like to put this and #715 to "needs work".

### comment:161 Changed 9 years ago by SimonKing

• Status changed from needs_work to needs_review
• Work issues Test activity of weak references if addresses coincide deleted

Ooops, the "needs work" should have been #715 and #12313...

### comment:162 Changed 9 years ago by nbruin

• Status changed from needs_review to positive_review

### comment:163 Changed 9 years ago by jdemeyer

• Milestone changed from sage-5.4 to sage-pending

### comment:164 Changed 9 years ago by SimonKing

• Description modified (diff)

### comment:165 Changed 9 years ago by jdemeyer

• Milestone changed from sage-pending to sage-5.4

### comment:166 Changed 9 years ago by jdemeyer

• Milestone changed from sage-5.4 to sage-5.5

Since these tickets have caused some trouble in the past, I prefer to merge them only in a .beta0 (to maximize the testing), hence the milestone bump.

### comment:167 Changed 9 years ago by jdemeyer

Is it still true that this needs to be merged with #13447? If yes, then #715 still depends on #13447.

### comment:168 Changed 9 years ago by nbruin

• Description modified (diff)

No, we do not need that merge. We should absolutely not add any more dependencies to these tickets. They are good to go as is (modulo nasty surprises). We can improve things afterwards. The bump to 5.5 is already unfortunate, because that means we've already unnecessarily missed one sailing.

EDIT: With unfortunate I do not mean to imply unwise. I fully respect the judgement of the release manager on this issue.

Last edited 9 years ago by nbruin (previous) (diff)