Sage: Ticket #19197: LatticePoset: add breadth()
https://trac.sagemath.org/ticket/19197
<p>
Add a function to compute the breadth of a lattice.
</p>
jmantysalo, Sun, 13 Sep 2015 07:38:54 GMT: branch set
https://trac.sagemath.org/ticket/19197#comment:1
<ul>
<li><strong>branch</strong>
set to <em>u/jmantysalo/latticeposet__add_breadth__</em>
</li>
</ul>
Ticket
ncohen, Sun, 13 Sep 2015 10:40:53 GMT: commit set
https://trac.sagemath.org/ticket/19197#comment:2
<ul>
<li><strong>commit</strong>
set to <em>cd99bd6b82aaece6bd535d6fdfd01f7bfb28a8c5</em>
</li>
</ul>
<p>
Hello Jori,
</p>
<p>
Here is a way to improve the algorithm a bit:
</p>
<p>
Let us say that a set <code>S</code> is "locally minimal" if the join of the elements of <code>S</code> is different from the join of the elements of every proper subset of <code>S</code>. What you are looking for is the locally minimal set of maximum cardinality.
</p>
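<p>
The "locally minimal" test can be sketched in a few lines of plain Python; here the join is modelled as set union, which is only a stand-in for a real lattice join, not Sage's API:
</p>

```python
def join(sets):
    # union plays the role of the lattice join in this toy model
    out = frozenset()
    for s in sets:
        out |= s
    return out

def is_locally_minimal(S):
    # S is locally minimal if every proper subset has a different join;
    # since the join is monotone, it suffices to check the subsets
    # obtained by removing a single element
    j = join(S)
    return all(join(S[:i] + S[i + 1:]) != j for i in range(len(S)))

S1 = [frozenset({0}), frozenset({1}), frozenset({2})]
S2 = [frozenset({0}), frozenset({0, 1})]
# S1 is locally minimal; S2 is not, because dropping frozenset({0})
# leaves the join {0, 1} unchanged
```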
<p>
Why do you iterate on antichains? Because in *any* locally minimal set <code>S</code>, every 2-subset of <code>S</code> must itself be locally minimal, and the locally minimal sets of cardinality 2 are precisely the 2-element antichains.
</p>
<p>
Why wouldn't you go further? Indeed, the same works for anything greater than 2: in any locally minimal set, *any* proper subset is also locally minimal.
</p>
<p>
Your implementation, however, does not use that. You test all antichains, but when testing those of size 5 you do not use the information obtained from those of size 4.
</p>
<p>
So here is a way out: enumerate all locally minimal sets directly, in increasing order of size. This can be done with <code>subsets_with_hereditary_property</code> [1], to which you can feed a function that detects whether a given set of points is locally minimal (it is enough to check the subsets obtained by removing a single element).
</p>
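<p>
The level-by-level idea can be sketched in plain Python (join modelled as set union; this only illustrates the pruning principle behind <code>subsets_with_hereditary_property</code>, not its actual implementation):
</p>

```python
from itertools import chain, combinations

def join(sets):
    # union stands in for the lattice join in this toy model
    out = frozenset()
    for s in sets:
        out |= s
    return out

def locally_minimal(S):
    j = join(S)
    return all(join(S[:i] + S[i + 1:]) != j for i in range(len(S)))

def breadth(elems):
    # grow locally minimal index-tuples level by level: a set of size k+1
    # is only tried as an extension of a surviving set of size k
    n = len(elems)
    level = [(i,) for i in range(n) if locally_minimal([elems[i]])]
    best = 1
    while level:
        level = [S + (j,) for S in level for j in range(S[-1] + 1, n)
                 if locally_minimal([elems[k] for k in S + (j,)])]
        if level:
            best = len(level[0])
    return best

# the Boolean lattice B_3, as all subsets of {0, 1, 2}, has breadth 3
B3 = [frozenset(c) for c in chain.from_iterable(
    combinations(range(3), k) for k in range(4))]
```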
<p>
This should do the trick, and *use* the information that a given set of size 3 is not locally minimal when trying to rule out one of size 4.
</p>
<p>
Note: it may be slower for small cases, but better for larger ones.
</p>
<p>
Additionally, your code may spend a lot of time hashing your elements: it may be better to work directly with the integer ID of the poset's elements (and to access the matrix directly).
</p>
<p>
Also, could you be a bit more formal in the definition of 'breadth'? I did not follow it at first (English is not a well-parenthesized language). It would also be cool if you could point toward a textbook/paper that defines it: I found it in "Semimodular Lattices: Theory and Applications", where it is said to originate from Birkhoff's "Lattice Theory".
</p>
<p>
Nathann
</p>
<p>
[1] <a class="ext-link" href="http://doc.sagemath.org/html/en/reference/combinat/sage/combinat/subsets_hereditary.html"><span class="icon"></span>http://doc.sagemath.org/html/en/reference/combinat/sage/combinat/subsets_hereditary.html</a>
</p>
<hr />
<p>
New commits:
</p>
<table class="wiki">
<tr><td><a class="ext-link" href="http://git.sagemath.org/sage.git/commit/?id=cd99bd6b82aaece6bd535d6fdfd01f7bfb28a8c5"><span class="icon"></span>cd99bd6</a></td><td><code>Added function breadth() to lattices.</code>
</td></tr></table>
Ticket
git, Sun, 13 Sep 2015 11:20:20 GMT: commit changed
https://trac.sagemath.org/ticket/19197#comment:3
<ul>
<li><strong>commit</strong>
changed from <em>cd99bd6b82aaece6bd535d6fdfd01f7bfb28a8c5</em> to <em>8d81130c3ea697cbfe3ce178dd5a7fac052e78d3</em>
</li>
</ul>
<p>
Branch pushed to git repo; I updated commit sha1. New commits:
</p>
<table class="wiki">
<tr><td><a class="ext-link" href="http://git.sagemath.org/sage.git/commit/?id=8d81130c3ea697cbfe3ce178dd5a7fac052e78d3"><span class="icon"></span>8d81130</a></td><td><code>Sphinx does not like me.</code>
</td></tr></table>
Ticket
jmantysalo, Sun, 13 Sep 2015 11:23:43 GMT
https://trac.sagemath.org/ticket/19197#comment:4
<p>
Technical correction committed. Now the code should work and Sphinx should not complain.
</p>
<p>
I will try Nathann's suggestion later. I would have marked this as <em>needs_work</em>, but the system does not allow it.
</p>
Ticket
jmantysalo, Sun, 13 Sep 2015 15:57:03 GMT
https://trac.sagemath.org/ticket/19197#comment:5
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:2" title="Comment 2">ncohen</a>:
</p>
<blockquote class="citation">
<p>
So here is a way out: enumerate all locally minimal sets directly, in increasing order of size. This can be done with <code>subsets_with_hereditary_property</code> [1], to which you can feed a function that detects whether a given set of points is locally minimal (it is enough to check the subsets obtained by removing a single element).
</p>
<p>
This should do the trick, and *use* the information that a given set of size 3 is not locally minimal when trying to rule out one of size 4.
</p>
</blockquote>
<p>
OK... So you think something like
</p>
<pre class="wiki">f = lambda ac: (self.is_antichain_of_poset(ac) and
    self.join(ac) not in [self.join(set(ac).difference([e])) for e in ac])
l = list(subsets_with_hereditary_property(f, self))
len(l[-1])
</pre><p>
? Seems faster, but still gets stuck on the 132-element <code>Posets.TamariLattice(6).canonical_label()</code>. By the way, what is a faster way to iterate over "list <code>L</code> minus an element, for every element"?
</p>
Ticket
ncohen, Sun, 13 Sep 2015 16:03:10 GMT
https://trac.sagemath.org/ticket/19197#comment:6
<blockquote class="citation">
<p>
OK... So you think something like
</p>
</blockquote>
<p>
There is no need to check that it is an antichain. Also, if you do not write it as a lambda function you have more freedom and do not need to create sets at each test.
</p>
<blockquote class="citation">
<p>
btw, what is faster way to iterate over "list <code>L</code> minus an element for every element"?
</p>
</blockquote>
<p>
What's wrong with <code>L[:i] + L[i+1:]</code> ?
</p>
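<p>
As a tiny illustration of that idiom (plain Python):
</p>

```python
L = [10, 20, 30]
# each slice pair yields a copy of L with the i-th element removed
minus_one = [L[:i] + L[i + 1:] for i in range(len(L))]
```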
<p>
Nathann
</p>
Ticket
jmantysalo, Sun, 13 Sep 2015 17:18:58 GMT
https://trac.sagemath.org/ticket/19197#comment:7
<p>
Just a random thought: can the thinking be reversed? I mean that for a certificate <code>A</code> there is the element <code>j=join(A)</code>. What kind of element can that be? Can we somehow go backwards from it, i.e. have an algorithm <code>j -> A</code> instead of <code>A -> j</code>?
</p>
<p>
If this is not possible, is this problem NP-hard?
</p>
Ticket
jmantysalo, Mon, 14 Sep 2015 07:41:37 GMT
https://trac.sagemath.org/ticket/19197#comment:8
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:2" title="Comment 2">ncohen</a>:
</p>
<blockquote class="citation">
<p>
Also, could you be a bit more formal in the definition of 'breadth'? I did not follow it at first (english is not a well-paenthesized language). It would also be cool if you could redirect toward a textbook/paper that defines it: I found it in "Semimodular Lattices Theory and Applications" where it is said to originate from "Lattice Theory" by Birkhoff.
</p>
</blockquote>
<p>
True. "... of the <em>breadth</em> of a lattice <em>L</em> ... as the least positive integer <em>b = b[L]</em> such that any meet <em>x_1 <code>^</code> ... <code>^</code> x_n</em> [<em>n > b</em>] is always a meet of a subset of <em>b</em> of the <em>x_i</em>." This is on p. 99. Not a very intuitive definition... So let's try from scratch:
</p>
<p>
Let <code>E</code> be an <code>n</code>-element subset of the lattice, let <code>j</code> be the join of <code>E</code>, and let the join of every proper subset of <code>E</code> be different from <code>j</code>. The <em>breadth</em> of the lattice is the number of elements in a largest such subset <code>E</code>.
</p>
<p>
Hmm... not the best possible.
</p>
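<p>
A brute-force sketch of that definition in plain Python, with the join modelled as set union (the Boolean lattice <code>B_3</code>, whose breadth is 3, serves as the test case; this is an illustration, not Sage code):
</p>

```python
from itertools import chain, combinations

def join(sets):
    # union stands in for the lattice join in this toy model
    out = frozenset()
    for s in sets:
        out |= s
    return out

def breadth_bruteforce(elems):
    # largest E whose join differs from the join of every proper subset;
    # by monotonicity it is enough to check subsets missing one element
    best = 1
    for r in range(2, len(elems) + 1):
        for E in combinations(elems, r):
            j = join(E)
            if all(join(E[:i] + E[i + 1:]) != j for i in range(r)):
                best = r
    return best

# all subsets of {0, 1, 2} form the Boolean lattice B_3
B3 = [frozenset(c) for c in chain.from_iterable(
    combinations(range(3), k) for k in range(4))]
```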
Ticket
ncohen, Mon, 14 Sep 2015 08:46:57 GMT
https://trac.sagemath.org/ticket/19197#comment:9
<p>
To me the first definition you gave is good. The one from the branch, however, was harder to parse. If you find this first definition not very intuitive, it may be because it defines the breadth as "the smallest integer" instead of as "a largest integer". If that is the problem you can probably adapt it.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Mon, 14 Sep 2015 18:37:33 GMT
https://trac.sagemath.org/ticket/19197#comment:10
<p>
Here is, if I thought this through correctly, the smallest lattice with breadth 4:
</p>
<pre class="wiki">n = 4
l = [[0,i] for i in range(1,n+1)]
l += [[i, i+n] for i in range(1,n+1)]
l += [[i, i+n+1] for i in range(1,n)]
l += [[n, n+1]]
l += [[i, i+n] for i in range(n+1, 2*n+1)]
l += [[i, i+n+1] for i in range(n+1,n*2)]
l += [[n*2, n*2+1]]
l += [[i, n*3+1] for i in range(2*n+1, 3*n+1)]
L = Poset(( [], l)).completion_by_cuts().canonical_label()
</pre><p>
This is not easily generalized. <code>n=5</code> will not make a lattice of breadth 5.
</p>
<p>
However, can this be used as part of a test for a lattice to have breadth 4? If <code>br(L)=4</code>, there must be an antichain <code>A</code> of 4 elements with join <code>j</code>. The element <code>j</code> must cover at least four elements. Those covers cannot be the set <code>A</code> itself. Neither can they directly cover <code>A</code>, because it is just impossible to make the lattice so that three elements from <code>A</code> would not have join <code>j</code>.
</p>
<p>
So to check, take one element at a time and try to place it as <code>j</code> in this structure. If the element covers fewer than four elements, it is not the right one. If it does, take all four-element subsets of the lower covers of the <code>j</code>-candidate. For those... somehow continue a kind of backtracking.
</p>
<p>
We can remove all doubly irreducible elements, as removing them won't change the breadth, except from 2 to 1 in a diamond-like structure.
</p>
Ticket
ncohen, Mon, 14 Sep 2015 20:38:24 GMT
https://trac.sagemath.org/ticket/19197#comment:11
<p>
I do not know what you are looking for, but if I wanted to build posets of arbitrarily high breadth I'd try Boolean lattices.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Tue, 15 Sep 2015 03:44:31 GMT
https://trac.sagemath.org/ticket/19197#comment:12
<p>
I am thinking about a better algorithm to compute the breadth.
</p>
<p>
As an example: for a lattice to have breadth <code>k</code> there must be an element that covers <code>k</code> elements. It does not suffice that the lattice has an antichain of <code>k</code> elements.
</p>
<p>
But I'll continue thinking and try to make some example code.
</p>
Ticket
git, Wed, 16 Sep 2015 14:22:36 GMT: commit changed
https://trac.sagemath.org/ticket/19197#comment:13
<ul>
<li><strong>commit</strong>
changed from <em>8d81130c3ea697cbfe3ce178dd5a7fac052e78d3</em> to <em>33610ad2c4327679b8dfc1d3af445fe4d9080e4a</em>
</li>
</ul>
<p>
Branch pushed to git repo; I updated commit sha1. This was a forced push. New commits:
</p>
<table class="wiki">
<tr><td><a class="ext-link" href="http://git.sagemath.org/sage.git/commit/?id=33610ad2c4327679b8dfc1d3af445fe4d9080e4a"><span class="icon"></span>33610ad</a></td><td><code>Added function for lattice breadth.</code>
</td></tr></table>
Ticket
jmantysalo, Wed, 16 Sep 2015 14:30:58 GMT: status changed
https://trac.sagemath.org/ticket/19197#comment:14
<ul>
<li><strong>status</strong>
changed from <em>new</em> to <em>needs_review</em>
</li>
</ul>
<p>
Ready for review. I did some timings; for example, for <code>Posets.TamariLattice(8)</code>, which has 1430 elements, it takes 4.5 seconds to compute the breadth. I know that there are places for optimization, but at least this should be working code, fast enough to be usable.
</p>
Ticket
ncohen, Wed, 16 Sep 2015 18:46:49 GMT
https://trac.sagemath.org/ticket/19197#comment:15
<p>
Hello Jori,
</p>
<p>
Here is a first-pass review while on the bus.
</p>
<ul><li>I was very surprised to see a 'distance' argument in the depth-first-search
function. You do not compute the 'distance' when running a
depth-first-search. At least not the usual notion of distance in graph
theory. We should probably remove it, as it is *highly* dangerous to advertise
it this way <code>O_o</code>
</li></ul><ul><li>In your code, please use breadth-first-search instead.
</li></ul><ul><li>Definition of 'elems': this line often tests for containment in 'too_close'. Do
not run containment tests on a list. Use a 'set' for that: containment is
faster.
</li></ul><ul><li>Anyway, you probably should do this differently, i.e. with
<code>breadth_first_search(report_distance=True)</code> while filtering 'too close'
elements according to their distance.
</li></ul><ul><li>I still believe that it would be faster to use
'subsets_with_hereditary_property' inside of your code. It would also make the
code slightly clearer, as the test you perform on each antichain would be an
independent function.
</li></ul><ul><li>Why do you deal with breadth 2 differently from the rest? That could mean a
couple more operations, but compared to everything else that should not mean
much.
</li></ul><p>
Besides that, congratulations on this algorithm. It works, and it works well.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Wed, 16 Sep 2015 19:05:50 GMT
https://trac.sagemath.org/ticket/19197#comment:16
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:15" title="Comment 15">ncohen</a>:
</p>
<blockquote class="citation">
<ul><li>I was very surprised to see a 'distance' argument in the depth-first-search
function. You do not compute the 'distance' when running a
depth-first-search. At least not the usual notion of distance in graph
theory. We should probably remove it, as it is *highly* dangerous to advertise
it this way <code>O_o</code>
</li></ul></blockquote>
<p>
For me the parameter name <code>distance</code> feels very clear, but I could also like <code>max_depth</code>. I think it is useful to have a function that tells, for example, which cities you can reach using at most three buses.
</p>
<blockquote class="citation">
<ul><li>In your code, please use breadth-first-search instead.
</li></ul></blockquote>
<p>
Why? Doesn't it use more memory?
</p>
<blockquote class="citation">
<ul><li>Definition of 'elems': this line often test for containment in 'too_close'. Do
not run containment tests in a list. use a 'set' for that: containment is
faster.
</li></ul></blockquote>
<p>
OK.
</p>
<blockquote class="citation">
<ul><li>Anyway, you probably should do this differently, i.e. with
<code>breadth_first_search(report_distance=True)</code> while filtering 'too close'
elements according to their distance.
</li></ul></blockquote>
<p>
I can test and see what happens. But I must make some test lattices, maybe some ordinal sum of ordinal products.
</p>
<blockquote class="citation">
<ul><li>I still believe that it would be faster to use
'subsets_with_hereditary_property' inside of your code.
</li></ul></blockquote>
<p>
I am not sure about that. If I am right, it will generate all 2-element antichains, then all 3-element antichains and so on. There will be many of them, but only one is needed. (And it must be the right one, of course...)
</p>
<blockquote class="citation">
<ul><li>Why do you deal with breadth 2 differently from the rest?
</li></ul></blockquote>
<p>
There was some complication with breadth 2. Maybe it is not needed now in the near-final version.
</p>
<p>
And then, this depends on what one is doing. I guess it makes no difference for a big lattice, but for <code>10^4</code> small ones it might be a different thing.
</p>
<blockquote class="citation">
<p>
Besides that, congratulations for this algorithm. It works, and it works well.
</p>
</blockquote>
<p>
Thanks! And thanks for the comments.
</p>
Ticket
ncohen, Wed, 16 Sep 2015 19:15:05 GMT
https://trac.sagemath.org/ticket/19197#comment:17
<p>
Hello,
</p>
<blockquote class="citation">
<p>
For me the parameter name <code>distance</code> feels very clear.
</p>
</blockquote>
<p>
And it is not the graph-theoretic distance in the graph, right? Plus there may be vertices at distance 2 from your start vertex that *never* get explored because you set distance=3. Do we also agree there? This is due to the fact that a vertex is at 'distance' k from the source vertex in a DFS if "there exists a path of length k from the source vertex to it" (possibly a *very long* path) and that if this vertex is once discovered at 'distance' 10 000 it may not be 'rediscovered' again at distance 2.
</p>
<blockquote class="citation">
<p>
Why? Doesn't it use more memory?
</p>
</blockquote>
<p>
No reason to. It may trigger additional copies of the 'queue' list (unless we implement a deque), but that's all. Additionally, if this cost worries you then you probably shouldn't use list comprehensions in your code <code>:-P</code>
</p>
<blockquote class="citation">
<p>
I am not sure about that. If I am right, it will make all 2-element antichains, then all 3-element antichains and so on. There will be many of them, only one is needed. (But the one must be the right one, of course...)
</p>
</blockquote>
<p>
With your algorithm: if the antichain {a,b,c} is not "locally minimal", then your code will test all antichains which contain the antichain {a,b,c} too. With <code>subsets_with_hereditary_property</code> they would be filtered out.
</p>
<p>
Nathann
</p>
Ticket
ncohen, Wed, 16 Sep 2015 19:17:20 GMT
https://trac.sagemath.org/ticket/19197#comment:18
<blockquote class="citation">
<blockquote class="citation">
<p>
For me the parameter name <code>distance</code> feels very clear.
</p>
</blockquote>
</blockquote>
<p>
Said differently: if I make no mistake, the set of vertices returned by <code>depth_first_search(v,distance=k)</code> is not defined in theory, and in practice is architecture-dependent.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Wed, 16 Sep 2015 20:00:32 GMT
https://trac.sagemath.org/ticket/19197#comment:19
<p>
More comments later (getting late here), but for this one:
</p>
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:17" title="Comment 17">ncohen</a>:
</p>
<blockquote class="citation">
<p>
Plus there may be vertices at distance 2 from your start vertex that *never* get explored because you set distance=3. Do we also agree there? This is due to the fact that a vertex is at 'distance' k from the source vertex in a DFS if "there exists a path of length k from the source vertex to it" (possibly a *very long* path) and that if this vertex is once discovered at 'distance' 10 000 it may not be 'rediscovered' again at distance 2.
</p>
</blockquote>
<p>
No. The vertex was not discovered at distance 10 000, because the search along that path was stopped at distance 3.
</p>
<p>
I can see nothing complicated here, except maybe in the terms to use. Just "Go back when there are no more unseen vertices as neighbours." is changed to "Go back when there are no more unseen vertices as neighbours, or if the maximum distance has already been reached."
</p>
Ticket
ncohen, Wed, 16 Sep 2015 20:06:47 GMT
https://trac.sagemath.org/ticket/19197#comment:20
<blockquote class="citation">
<p>
I can see nothing complicated here, except maybe on terms to use.
</p>
</blockquote>
<p>
Let us say that you have a path on the vertices {0,1,...,9}, and that you add a path of length 2 from 0 to 1 through a new vertex a, i.e. {0,a,1}.
</p>
<p>
If you want all vertices at distance at most 2 through a DFS starting from 0, you may discover 'a' first (at distance 1) then 1 (at distance 2). You will not explore 2 (because it would be at distance 3).
</p>
<p>
When you next test edge (0,1) and re-discover 1 (this time at distance 1), you will stop there because '1' has already been discovered.
</p>
<p>
Thus you will never discover vertex '2'.
</p>
<p>
If, on the other hand, the algorithm discovered 1 first, then you would proceed to 2 without any problem.
</p>
<p>
This is what is wrong.
</p>
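<p>
This can be reproduced with a small hand-rolled depth-limited DFS (plain Python; the neighbour order is fixed so that 'a' is explored before 1, and this is an illustration of the pitfall, not Sage's implementation):
</p>

```python
# a path 0-1-2-...-9 plus a detour 0-a-1
adj = {i: [] for i in range(10)}
adj['a'] = []
for u, v in [(0, 'a'), ('a', 1)] + [(i, i + 1) for i in range(9)]:
    adj[u].append(v)
    adj[v].append(u)

def dfs_within(start, limit):
    # naive depth-limited DFS: stops expanding at depth == limit and never
    # revisits a vertex, even when a shorter route to it appears later
    seen = {start}
    def explore(v, depth):
        if depth == limit:
            return
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                explore(w, depth + 1)
    explore(start, 0)
    return seen

# vertex 2 is at true distance 2 from 0, yet it is never discovered:
# 1 is first reached through 'a' at depth 2, so its neighbours are skipped
```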
<p>
Nathann
</p>
Ticket
jmantysalo, Wed, 16 Sep 2015 20:16:45 GMT
https://trac.sagemath.org/ticket/19197#comment:21
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:20" title="Comment 20">ncohen</a>:
</p>
<blockquote class="citation">
<p>
When next you will test edge (0,1) and re-discover 1 (this time at distance 1), you will stop there because '1' has already been discovered.
</p>
</blockquote>
<p>
True. Now I understand.
</p>
<p>
So vertices at exactly the limit distance should be handled differently. They must be on a different list, from where they will be moved to the "normal" <code>already_seen</code> list if they are re-discovered at a smaller distance.
</p>
Ticket
ncohen, Wed, 16 Sep 2015 20:20:47 GMT
https://trac.sagemath.org/ticket/19197#comment:22
<blockquote class="citation">
<p>
So vertices at exactly the limit distance should be handled differently. They must be on different list, from where they will be moved to "normal" <code>already_seen</code> list if they are re-discovered by smaller distance.
</p>
</blockquote>
<p>
That's going too far. I don't think that we need to 'fix' that, it probably shouldn't even exist. What you want to use in this case is a BFS. Or, actually, <code>Graph.shortest_path_lengths</code> (whose name is awful).
</p>
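<p>
For comparison, a plain-Python BFS sketch on the same kind of graph (a path 0-1-...-9 with a detour 0-a-1), where a distance cut-off is safe because BFS discovers every vertex at its true distance (an illustration, not Sage's <code>breadth_first_search</code>):
</p>

```python
from collections import deque

def bfs_within(adj, start, limit):
    # BFS reaches each vertex at its true distance, so pruning at the
    # distance limit cannot hide closer vertices
    dist = {start: 0}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        if dist[v] == limit:
            continue
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

adj = {i: [] for i in range(10)}
adj['a'] = []
for u, v in [(0, 'a'), ('a', 1)] + [(i, i + 1) for i in range(9)]:
    adj[u].append(v)
    adj[v].append(u)
```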
<p>
Nathann
</p>
Ticket
jmantysalo, Thu, 17 Sep 2015 04:23:03 GMT
https://trac.sagemath.org/ticket/19197#comment:23
<p>
I opened a new ticket for the DFS <code>distance</code>-bug: <a class="closed ticket" href="https://trac.sagemath.org/ticket/19227" title="defect: Graphs: DFS and broken distance-parameter (closed: fixed)">#19227</a>.
</p>
Ticket
jmantysalo, Thu, 17 Sep 2015 06:08:40 GMT
https://trac.sagemath.org/ticket/19197#comment:24
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:15" title="Comment 15">ncohen</a>:
</p>
<blockquote class="citation">
<ul><li>Definition of 'elems': this line often test for containment in 'too_close'. Do
not run containment tests in a list. use a 'set' for that: containment is
faster.
</li></ul></blockquote>
<p>
True. This gave a slight benefit to speed.
</p>
<blockquote class="citation">
<ul><li>Anyway, you probably should do this differently, i.e. with
<code>breadth_first_search(report_distance=True)</code> while filtering 'too close'
elements according to their distance.
</li></ul></blockquote>
<p>
False. After
</p>
<pre class="wiki">elems = [e[0] for e in H.breadth_first_search(j, neighbors=H.neighbors_in, report_distance=True) if e[1] > B-2]
</pre><p>
it took practically forever to compute the breadth of <code>TamariLattice(6)</code>, whereas
</p>
<pre class="wiki">too_close = set(H.breadth_first_search(j, neighbors=H.neighbors_in, distance=B-2))
elems = [e for e in H.order_ideal([j]) if e not in too_close]
</pre><p>
did it in 40 milliseconds.
</p>
<blockquote class="citation">
<ul><li>I still believe that it would be faster to use
'subsets_with_hereditary_property' inside of your code.
</li></ul></blockquote>
<p>
Don't know. Please note that this code tries the maximal breadth first and then goes down one by one; <code>subsets_with_hereditary_property</code> would work in the other direction. I guess that it would get stuck with too many pairs, triples and so on.
</p>
<blockquote class="citation">
<ul><li>Why do you deal with breadth 2 differently from the rest? That could mean a
couple more operations, but compared to everything else that should not mean
much.
</li></ul></blockquote>
<p>
True. This only slowed down code like <code>sum(L.breadth() for L in L10)</code>, where <code>L10</code> is a list of 10-element lattices. And even that was a very slight loss.
</p>
<p>
Anyway, I did some tests with product lattices and so on. This code seems to be "fast enough". For example, it takes 4.4 seconds to create a Tamari lattice, and 4.6 seconds to compute its breadth. So this should not be a bottleneck.
</p>
Ticket
git, Thu, 17 Sep 2015 06:14:51 GMT: commit changed
https://trac.sagemath.org/ticket/19197#comment:25
<ul>
<li><strong>commit</strong>
changed from <em>33610ad2c4327679b8dfc1d3af445fe4d9080e4a</em> to <em>b3ae7c4d82d67adb484d53fdecec4256b196e6a1</em>
</li>
</ul>
<p>
Branch pushed to git repo; I updated commit sha1. New commits:
</p>
<table class="wiki">
<tr><td><a class="ext-link" href="http://git.sagemath.org/sage.git/commit/?id=b3ae7c4d82d67adb484d53fdecec4256b196e6a1"><span class="icon"></span>b3ae7c4</a></td><td><code>Some simplifications.</code>
</td></tr></table>
Ticket
ncohen, Fri, 18 Sep 2015 13:29:22 GMT: reviewer set
https://trac.sagemath.org/ticket/19197#comment:26
<ul>
<li><strong>reviewer</strong>
set to <em>Nathann Cohen</em>
</li>
</ul>
<p>
Helloooooooooooo,
</p>
<p>
I've got only one thing to add to this code: <code>self[e]</code> looks dangerous to me, just because I do not know what exactly it does. The <code>__getitem__</code> function defined on posets seems to be inherited and (very) generic. In particular, I suspect that <code>self[e]</code> may be as "efficient" as <code>list(self)[e]</code>. I think that you should prefer <code>_element_to_vertex(x)</code> in this situation.
</p>
<p>
Nathann
</p>
Ticket
git, Fri, 18 Sep 2015 14:26:33 GMT: commit changed
https://trac.sagemath.org/ticket/19197#comment:27
<ul>
<li><strong>commit</strong>
changed from <em>b3ae7c4d82d67adb484d53fdecec4256b196e6a1</em> to <em>9c05f49e95504acc1a37ea942764886c53f57198</em>
</li>
</ul>
<p>
Branch pushed to git repo; I updated commit sha1. New commits:
</p>
<table class="wiki">
<tr><td><a class="ext-link" href="http://git.sagemath.org/sage.git/commit/?id=9c05f49e95504acc1a37ea942764886c53f57198"><span class="icon"></span>9c05f49</a></td><td><code>From self[e] to self._vertex_to_element(e).</code>
</td></tr></table>
Ticket
jmantysalo, Fri, 18 Sep 2015 14:28:10 GMT
https://trac.sagemath.org/ticket/19197#comment:28
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:26" title="Comment 26">ncohen</a>:
</p>
<blockquote class="citation">
<p>
I think that you should prefer <code>_element_to_vertex(x)</code> in this situation.
</p>
</blockquote>
<p>
Corrected. (To <code>_vertex_to_element</code> of course, not <code>_element_to_vertex</code>.)
</p>
Ticket
ncohen, Fri, 18 Sep 2015 14:31:12 GMT: status changed
https://trac.sagemath.org/ticket/19197#comment:29
<ul>
<li><strong>status</strong>
changed from <em>needs_review</em> to <em>positive_review</em>
</li>
</ul>
<p>
Oops, right. Good to go then. Good work, really.
</p>
<p>
Nathann
</p>
Ticket
ncohen, Fri, 18 Sep 2015 14:34:15 GMT
https://trac.sagemath.org/ticket/19197#comment:30
<p>
By the way: it seems to me that, theoretically speaking, having breadth <code>>=b</code> is equivalent to:
</p>
<pre class="wiki">sage: G1 = my_lattice.hasse_diagram().transitive_closure()
sage: G2 = posets.BooleanLattice(b).hasse_diagram().transitive_closure()
sage: G1.subgraph_search(G2,induced=True)
True
</pre><p>
But well. Does not seem like it would help on the computational side.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Fri, 18 Sep 2015 15:59:44 GMT
https://trac.sagemath.org/ticket/19197#comment:31
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:30" title="Comment 30">ncohen</a>:
</p>
<blockquote class="citation">
<p>
By the way: it seems to me that, theoretically speaking, having breadth <code>>=b</code> is equivalent to:
</p>
<pre class="wiki">sage: G1 = my_lattice.hasse_diagram().transitive_closure()
sage: G2 = posets.BooleanLattice(b).hasse_diagram().transitive_closure()
sage: G1.subgraph_search(G2,induced=True)
True
</pre></blockquote>
<p>
And that is the same as <code>L.has_isomorphic_subposet(Posets.BooleanLattice(b))</code>. I guess you are right. (But I'm not sure; I must think about this.)
</p>
<p>
I made an example of smallest lattice with breadth 4 with <code>completion_by_cuts()</code> and did not notice that the result was just <code>BooleanLattice(4)</code>. Oops.
</p>
<p>
And that means that my example of a lattice with breadth 4 is stupid, as it hides the structure in a dig6 string.
</p>
Ticket
ncohen, Fri, 18 Sep 2015 16:57:34 GMT
https://trac.sagemath.org/ticket/19197#comment:32
<p>
Hello,
</p>
<blockquote class="citation">
<p>
And that is same as <code>L.has_isomorphic_subposet(Posets.BooleanLattice(b))</code>
</p>
</blockquote>
<p>
Oh, sorry. I tend to turn everything into graphs, and work with that.
</p>
<blockquote class="citation">
<p>
I made an example of smallest lattice with breadth 4 with <code>completion_by_cuts()</code> and did not notice that the result was just <code>BooleanLattice(4)</code>. Oops.
</p>
<p>
And that means that my example of lattice with breadth 4 is stupid, as it hides the structure to dig6 string.
</p>
</blockquote>
<p>
I see. Well, you can fix this in any other ticket if you wish anyway.
</p>
<p>
Nathann
</p>
Ticket
jmantysalo, Fri, 18 Sep 2015 17:22:12 GMT
https://trac.sagemath.org/ticket/19197#comment:33
<p>
Replying to <a class="ticket" href="https://trac.sagemath.org/ticket/19197#comment:32" title="Comment 32">ncohen</a>:
</p>
<blockquote class="citation">
<p>
I see. Well, you can fix this in any other ticket if you wish anyway.
</p>
</blockquote>
<p>
I'll do that after <a class="closed ticket" href="https://trac.sagemath.org/ticket/19123" title="enhancement: LatticePoset: add is_vertically_decomposable (closed: fixed)">#19123</a> on <a class="closed ticket" href="https://trac.sagemath.org/ticket/19190" title="enhancement: LatticePoset: add atoms, coatoms, doubly irreducibles etc. (closed: fixed)">#19190</a>.
</p>
Ticket
vbraun, Fri, 18 Sep 2015 19:10:41 GMT: status, branch changed; resolution set
https://trac.sagemath.org/ticket/19197#comment:34
<ul>
<li><strong>status</strong>
changed from <em>positive_review</em> to <em>closed</em>
</li>
<li><strong>resolution</strong>
set to <em>fixed</em>
</li>
<li><strong>branch</strong>
changed from <em>u/jmantysalo/latticeposet__add_breadth__</em> to <em>9c05f49e95504acc1a37ea942764886c53f57198</em>
</li>
</ul>
Ticket