Opened 2 years ago
Last modified 15 months ago
#25091 closed enhancement
Expose some normaliz features — at Version 64
Reported by:  jipilab  Owned by:  

 Priority:  major  Milestone:  sage-8.8 
Component:  geometry  Keywords:  polytope, normaliz, IMAPolyGeom 
Cc:  vdelecroix, moritz, mkoeppe, ghsebasguts, Winfried, ghbraunmath, tscrim  Merged in:  
 Authors:  Jean-Philippe Labbé  Reviewers:  
Report Upstream:  N/A  Work issues:  
Branch:  public/some_normaliz_features (Commits)  Commit:  dbf7db846182a3bc496458d8eea1bf10e8792b0a 
Dependencies:  #22984, #25090, #20382  Stopgaps: 
Description (last modified by )
Normaliz can compute several things that are currently not interfaced. This ticket implements:
- Integral point generators
- Euclidean volume and volume
- Triangulation
- Hilbert series
- Ehrhart series of compact polyhedra
- Ehrhart quasipolynomials of compact polyhedra

FOLLOW-UP:
- Make ehrhart_polynomial uniform (in the spirit of .volume(), which can have several engines).
- Move the lattice-related methods to Polyhedra_normaliz_QQ.
Change History (64)
comment:1 Changed 2 years ago by
 Branch set to u/jipilab/normaliz_features
 Commit set to e421ff89087f9d231056f163885b5ce251ed5d3b
 Dependencies set to #22984, #25090
comment:2 Changed 2 years ago by
 Commit changed from e421ff89087f9d231056f163885b5ce251ed5d3b to ac35212bdcb4b4671defd9e25ab90634d13f118d
comment:3 Changed 2 years ago by
 Branch changed from u/jipilab/normaliz_features to u/jipilab/some_normaliz_features
 Commit changed from ac35212bdcb4b4671defd9e25ab90634d13f118d to bf8b172f1a58a6d71b857575b00dd19b82f9d2e4
New commits:
bf8b172  Merge branch 'u/mkoeppe/pynormaliz2'

comment:4 Changed 2 years ago by
 Branch changed from u/jipilab/some_normaliz_features to public/some_normaliz_features
 Commit changed from bf8b172f1a58a6d71b857575b00dd19b82f9d2e4 to 9a6af25046563f288b4c93e61f0e3ab1fa593bf0
comment:5 Changed 2 years ago by
 Commit changed from 9a6af25046563f288b4c93e61f0e3ab1fa593bf0 to 3d3f64ab940e973a79557a37ddb7bb346b6b5898
Branch pushed to git repo; I updated commit sha1. New commits:
3d3f64a  Handle default behavior

comment:6 Changed 2 years ago by
 Commit changed from 3d3f64ab940e973a79557a37ddb7bb346b6b5898 to f376748a6c5330ea2f23d5e82de289ed7a575433
comment:7 followup: ↓ 21 Changed 2 years ago by
Note that is_package_installed is being deprecated, see #20382.
comment:8 Changed 2 years ago by
 Commit changed from f376748a6c5330ea2f23d5e82de289ed7a575433 to 55c9a04b9865076dc03dc3d701237ddaf5ee9614
Branch pushed to git repo; I updated commit sha1. New commits:
55c9a04  Added Hilbert series

comment:9 Changed 2 years ago by
 Commit changed from 55c9a04b9865076dc03dc3d701237ddaf5ee9614 to 720dc046b8831170f06fe341bdfa42fefeb240c4
Branch pushed to git repo; I updated commit sha1. New commits:
720dc04  Made tests pass

comment:10 Changed 2 years ago by
 Commit changed from 720dc046b8831170f06fe341bdfa42fefeb240c4 to a4ddeac32690b8be6a5edf66ff77979a1e9a2b3f
Branch pushed to git repo; I updated commit sha1. New commits:
a4ddeac  Cropped feature methods

comment:11 Changed 2 years ago by
 Commit changed from a4ddeac32690b8be6a5edf66ff77979a1e9a2b3f to 44549fa150a08fa327c7ae34f322bf4927b2eeec
comment:12 Changed 2 years ago by
 Commit changed from 44549fa150a08fa327c7ae34f322bf4927b2eeec to e32c9fddc39dd6780fea791796c5a536ecf4d42c
Branch pushed to git repo; I updated commit sha1. New commits:
e32c9fd  Added triangulate with normaliz

comment:13 Changed 2 years ago by
 Description modified (diff)
comment:14 Changed 2 years ago by
 Commit changed from e32c9fddc39dd6780fea791796c5a536ecf4d42c to 45c39b85af9485489e081b14af08fcde297c4bd5
Branch pushed to git repo; I updated commit sha1. New commits:
45c39b8  Some small edits

comment:15 Changed 2 years ago by
 Commit changed from 45c39b85af9485489e081b14af08fcde297c4bd5 to 56798216576de3521fc2276c734cbf6c70f14c62
Branch pushed to git repo; I updated commit sha1. New commits:
5679821  Added handling the triangulation with point configurations

comment:16 Changed 2 years ago by
 Commit changed from 56798216576de3521fc2276c734cbf6c70f14c62 to b045caf912268156ab2ea91b9efc425b8510df09
Branch pushed to git repo; I updated commit sha1. New commits:
b045caf  Adapted inhomogeneous case

comment:17 Changed 2 years ago by
 Description modified (diff)
comment:18 followup: ↓ 19 Changed 2 years ago by
It's probably best to add all lattice-point related methods to Polyhedron_QQ_normaliz rather than Polyhedron_normaliz in anticipation of #25097 (which moves integral_points and integral_hull there).
comment:19 in reply to: ↑ 18 ; followup: ↓ 20 Changed 2 years ago by
comment:20 in reply to: ↑ 19 Changed 2 years ago by
Replying to jipilab:
Replying to mkoeppe:
It's probably best to add all lattice-point related methods to Polyhedron_QQ_normaliz rather than Polyhedron_normaliz in anticipation of #25097 (which moves integral_points and integral_hull there).

I see. Yes, I can do that.
As discussed with mkoeppe: The move of the rational polyhedron methods will be done in #25097. The merge should go without conflicts.
comment:21 in reply to: ↑ 7 ; followup: ↓ 30 Changed 2 years ago by
 Dependencies changed from #22984, #25090 to #22984, #25090, #20382
 Description modified (diff)
Replying to vdelecroix:
Note that is_package_installed is being deprecated, see #20382.
Oh! Wow! Thanks for the timely warning! I set it as a dependency. I will change the check once it is merged...
comment:22 Changed 2 years ago by
 Commit changed from b045caf912268156ab2ea91b9efc425b8510df09 to e7be14a7c0f2b3f2fc13cc2196b6f4cbad3bccdd
Branch pushed to git repo; I updated commit sha1. New commits:
e7be14a  Fixed some polytopes and added some tests

comment:23 Changed 2 years ago by
 Commit changed from e7be14a7c0f2b3f2fc13cc2196b6f4cbad3bccdd to 3615f8d33d9fb37ca0a905195dd21c8e17a05a7d
comment:24 Changed 2 years ago by
 Commit changed from 3615f8d33d9fb37ca0a905195dd21c8e17a05a7d to 8bff81f233fb570e8477db0ea1d898a8a611512d
comment:25 Changed 2 years ago by
 Commit changed from 8bff81f233fb570e8477db0ea1d898a8a611512d to d0e2498cf6a3e028c856702c23316e709fe17b00
Branch pushed to git repo; I updated commit sha1. New commits:
d0e2498  Added the Ehrhart Series and QuasiPolynomial

comment:26 Changed 2 years ago by
 Commit changed from d0e2498cf6a3e028c856702c23316e709fe17b00 to 668fdfc3d93a3c567c3258b7d8eaa2c36bcc5400
Branch pushed to git repo; I updated commit sha1. New commits:
668fdfc  Added an example

comment:27 Changed 2 years ago by
 Description modified (diff)
comment:28 Changed 2 years ago by
 Milestone changed from sage-8.2 to sage-8.4
comment:29 Changed 2 years ago by
Out of interest: Given a monomial ideal in Sage, how can one compute its Hilbert series using normaliz?
comment:30 in reply to: ↑ 21 ; followup: ↓ 37 Changed 2 years ago by
Replying to jipilab:
Replying to vdelecroix:
Note that is_package_installed is being deprecated, see #20382.

Oh! Wow! Thanks for the timely warning! I set it as a dependency. I will change the check once it is merged...
It is really a pity that #20382 doesn't explain (in the ticket description!!!) what one is supposed to do instead of is_package_installed.
So, how would one change the current branch to make it use #20382?
comment:31 followup: ↓ 32 Changed 2 years ago by
Replying to SimonKing:
Normaliz can compute Hilbert series of the integral closures of monomial ideals, not of arbitrary monomial ideals. If one wants to apply it: input the monomials as vertices, the polynomial ring as cone (= unit matrix), and put the weights of the indeterminates into the grading.
Example of Normaliz input (integral closure of (x^2, y^2), deg x = 2, deg y = 1):
amb_space 2
vertices 2
2 0 1
0 2 1
cone 2
1 0
0 1
grading
2 1
HilbertSeries
comment:32 in reply to: ↑ 31 ; followup: ↓ 34 Changed 2 years ago by
Replying to Winfried:
Replying to SimonKing:
Normaliz can compute Hilbert series of the integral closures of monomial ideals, not of arbitrary monomial ideals. If one wants to apply it: input the monomials as vertices, the polynomial ring as cone (= unit matrix), and put the weights of the indeterminates into the grading.
Example of Normaliz input (integral closure of (x^2, y^2), deg x = 2, deg y = 1):
amb_space 2
vertices 2
2 0 1
0 2 1
cone 2
1 0
0 1
grading
2 1
HilbertSeries
Thank you. But my question was really: Given a monomial ideal in Sage, how to compute the Hilbert series of it (or maybe of its integral closure) using normaliz as a backend (not: in normaliz).
I.e.:
sage: n = 4; m = 11; P = PolynomialRing(QQ, n*m, "x"); x = P.gens(); M = Matrix(n, x)
sage: I = P.ideal(M.minors(2))
sage: J = P*[m.lm() for m in I.groebner_basis()]
and what now? How to use the new interface to normaliz to compute the Hilbert series of (the integral closure of) J?
Since I never worked with the integral closure of a monomial ideal: How are the Hilbert series of the ideal and its integral closure related?
comment:33 Changed 23 months ago by
"What now?" is a question that Jean-Philippe should answer. I have no overview of which of the Normaliz computation goals can be reached from Sage. However, all of them can be reached via PyNormaliz. For example, Normaliz can take a binomial ideal (suitably encoded) as input. It finds the affine monoid A defined by the binomial ideal, and then computes invariants of the normalization of A.
I think there is no way to relate the Hilbert series of an ideal I and the Hilbert series of its integral closure in a useful way.
comment:34 in reply to: ↑ 32 Changed 23 months ago by
Thank you. But my question was really: Given a monomial ideal in Sage, how to compute the Hilbert series of it (or maybe of its integral closure) using normaliz as a backend (not: in normaliz).
I.e.:
sage: n = 4; m = 11; P = PolynomialRing(QQ, n*m, "x"); x = P.gens(); M = Matrix(n, x)
sage: I = P.ideal(M.minors(2))
sage: J = P*[m.lm() for m in I.groebner_basis()]
and what now? How to use the new interface to normaliz to compute the Hilbert series of (the integral closure of) J?
Since I never worked with the integral closure of a monomial ideal: How are the Hilbert series of the ideal and its integral closure related?
My knowledge of the theory is lacking, so I cannot give you a satisfactory answer to your questions.
The present ticket allows you to get the Hilbert series of a rational polyhedron. Whether this Hilbert series is or corresponds to the Hilbert series of a monomial ideal should be clarified. In case they match, what I would do in Sage is:
1) write a function that transfers the data of the monomial ideal to a polyhedron that uses the backend normaliz, and compute its Hilbert series (a function implemented in this ticket); or
2) use the PyNormaliz package directly, as the normaliz backend of polyhedron currently does. This is not really well documented, as its usage is currently only in the backend for polyhedron.
For this, the present ticket provides some useful functions to deal with PyNormaliz that build on #25090.
comment:35 Changed 23 months ago by
Would this compute what you want? I did this with the current ticket and merge with the latest sage.
sage: n = 4; m = 11; P = PolynomialRing(QQ, n*m, "x"); x = P.gens(); M = Matrix(n, x)
sage: I = P.ideal(M.minors(2))
sage: J = P*[m.lm() for m in I.groebner_basis()]
sage: P = Polyhedron(rays=[m.degrees() for m in J.gens()], backend='normaliz')
sage: P
A 42-dimensional polyhedron in ZZ^44 defined as the convex hull of 1 vertex and 330 rays
sage: P.hilbert_series()
Well, I'd say this is a quite large computation involving triangulating a 42-dimensional polyhedron with 330 vertices...
comment:36 Changed 23 months ago by
What one really wants is not the Hilbert series of the ideal. The goal is the Hilbert series of the residue class ring defined by the ideal. The residue class ring can be identified with a monomial algebra of Krull dimension 14. The Hilbert series is easily calculated. If one wants to use Normaliz for it, then the best approach is to input the binomial ideal to Normaliz.
You make a matrix of integer vectors whose rows represent the binomials in the form v-w, corresponding to x^v - x^w. The input type is "lattice_ideal". You must also define a grading. Simply say "total_degree". In this case it represents a vector with 44 entries equal to 1.
That one can go this way is due to the fact that the residue class ring is a normal monoid algebra. In general this is not the case.
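The encoding Winfried describes (a binomial x^v - x^w becomes the integer vector v - w, one row of the lattice_ideal input matrix) can be sketched in plain Python. Note that binomial_to_row is a hypothetical helper for illustration only, not part of Sage or PyNormaliz:

```python
def binomial_to_row(v, w):
    """Encode the binomial x^v - x^w as the integer vector v - w,
    i.e. one row of Normaliz's 'lattice_ideal' input matrix."""
    return [a - b for a, b in zip(v, w)]

# A 2x2-minor relation x0*x3 - x1*x2 in 4 variables:
row = binomial_to_row([1, 0, 0, 1], [0, 1, 1, 0])  # -> [1, -1, -1, 1]
```

The grading "total_degree" then corresponds to the all-1's vector of length equal to the number of variables.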
comment:37 in reply to: ↑ 30 Changed 23 months ago by
Replying to SimonKing:
Replying to jipilab:
Replying to vdelecroix:
Note that is_package_installed is being deprecated, see #20382.

Oh! Wow! Thanks for the timely warning! I set it as a dependency. I will change the check once it is merged...

It is really a pity that #20382 doesn't explain (in the ticket description!!!) what one is supposed to do instead of is_package_installed.
Indeed.
comment:38 followup: ↓ 39 Changed 23 months ago by
Concerning the is_package_installed thing, it seems that it is not deprecated when we want to check whether a package was installed using sage -i, which is what we want here, I guess.
So I would remove it from the necessary things to do for this ticket.
comment:39 in reply to: ↑ 38 ; followup: ↓ 40 Changed 23 months ago by
Replying to jipilab:
Concerning the is_package_installed thing, it seems that it is not deprecated when we want to check whether a package was installed using sage -i, which is what we want here, I guess.

So I would remove it from the necessary things to do for this ticket.
I tend to disagree. is_package_installed is used here to test whether pynormaliz, normaliz, or latte_int is available. And it may be available for reasons that have nothing to do with sage -i.
comment:40 in reply to: ↑ 39 ; followups: ↓ 41 ↓ 42 Changed 23 months ago by
Replying to SimonKing:
Replying to jipilab:
Concerning the is_package_installed thing, it seems that it is not deprecated when we want to check whether a package was installed using sage -i, which is what we want here, I guess.

So I would remove it from the necessary things to do for this ticket.

I tend to disagree. is_package_installed is used here to test whether pynormaliz, normaliz, or latte_int is available. And it may be available for reasons that have nothing to do with sage -i.

Okay. Nevertheless, the mentioned ticket does not provide a way to check if they are available or not. Right?
comment:41 in reply to: ↑ 40 Changed 23 months ago by
Replying to jipilab:
Okay. Nevertheless, the mentioned ticket does not provide a way to check if they are available or not. Right?
Yes, it doesn't. But I guess that's because the author of that ticket thought that the solution is relatively straightforward: When you code in Python, how do you use a package? You do so by importing stuff. Hence, you should test whether the package is available by trying to import something from that package (sooner or later, you will do so anyway!), surrounded with a try: ... except ImportError: ... clause.
Indeed, in your code, you do import PyNormaliz. So, this could be:
try:
    import PyNormaliz
except ImportError:
    raise ImportError("some meaningful error message, telling the user at least one way to install PyNormaliz")
For latte_int, I am not so sure what you do, because I don't see a related import statement in your code. However, in some place of your code, you do use it; you should try and see what happens if latte_int is not available: Will there be an ImportError, or a TypeError, or a ValueError, etc.? Then, instead of is_package_available, you could try to use latte_int in a dry run, and if an error occurs, you take it as an answer (namely the answer that latte_int is not available).
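The try/except pattern suggested in this comment can be wrapped in a small helper. This is a minimal sketch in plain Python; module_is_available is a hypothetical name used only for illustration, not Sage API:

```python
def module_is_available(name):
    """Return True if the module ``name`` can be imported.

    This follows the try/except ImportError pattern suggested above:
    availability is tested by actually importing, not by asking the
    package manager (sage -i) whether it installed something.
    """
    try:
        __import__(name)
        return True
    except ImportError:
        return False

# 'PyNormaliz' is the optional dependency discussed in this ticket;
# the same helper works for any importable module name.
have_pynormaliz = module_is_available("PyNormaliz")
```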
comment:42 in reply to: ↑ 40 Changed 23 months ago by
Replying to jipilab:
Replying to SimonKing:
Replying to jipilab:
Concerning the is_package_installed thing, it seems that it is not deprecated when we want to check whether a package was installed using sage -i, which is what we want here, I guess.

So I would remove it from the necessary things to do for this ticket.

I tend to disagree. is_package_installed is used here to test whether pynormaliz, normaliz, or latte_int is available. And it may be available for reasons that have nothing to do with sage -i.

Okay. Nevertheless, the mentioned ticket does not provide a way to check if they are available or not. Right?
I think what may be intended here is to subclass sage.features.PythonModule.
comment:43 Changed 22 months ago by
 Commit changed from 668fdfc3d93a3c567c3258b7d8eaa2c36bcc5400 to d5fdf168e49a1a4260dc804d29998f33da3a4aba
Branch pushed to git repo; I updated commit sha1. New commits:
d5fdf16  Merge branch 'develop' into HEAD

comment:44 Changed 22 months ago by
 Commit changed from d5fdf168e49a1a4260dc804d29998f33da3a4aba to 20d2fde78750f859770f05473f49d7de460d6bb1
Branch pushed to git repo; I updated commit sha1. New commits:
20d2fde  Fix typo in docstring.

comment:45 Changed 22 months ago by
 Cc ghbraunmath added
comment:46 Changed 18 months ago by
I looked back at this now, and I'm getting a weird result from an example in Normaliz' manual:
sage: P = Polyhedron(vertices=[(1/2,1/2),(1/3,1/3),(1/4,1/2)], backend='normaliz')
sage: P
A 2-dimensional polyhedron in QQ^2 defined as the convex hull of 3 vertices
sage: P.hilbert_series()
0
From calling PyNormaliz, I get:
sage: h = PyNormaliz.NmzResult(new_cone, "HilbertSeries")
sage: h
[[], [], 1L]
So it's not the parsing from PyNormaliz to Sage; rather, something weird is happening here. It has been a while since I wrote the code, so I am a bit lost trying to figure out what is going on...
I'll try to do some more searching...
comment:47 followup: ↓ 48 Changed 18 months ago by
I guess you want the Ehrhart series. The Normaliz computation goal HilbertSeries requires the existence of a grading, which is not defined with this input.
This is an intricate point, but it is inevitable: the grading is defined on the space in which the polyhedron lives, not on the cone over it.
comment:48 in reply to: ↑ 47 Changed 18 months ago by
Replying to Winfried:
I guess you want the Ehrhart series. The Normaliz computation goal HilbertSeries requires the existence of a grading, which is not defined with this input.
Yes, in the end I want the Ehrhart series, but I was doing both computations to understand their difference again...
This is an intricate point, but it is inevitable: the grading is defined on the space in which the polyhedron lives, not on the cone over it.
Yes, agreed. I took this example from p.23 of the Normaliz manual. I thought that HilbertSeries would work out of the box like in the example. That's why I am asking myself what I did wrong.
In the code for hilbert_series, if no grading is given, we canonically give the all-1's grading on the ambient space and then ask for the HilbertSeries, in which case we get the empty result(!?). (Is that a good choice of canonical grading?)
So, my question would be, how do I get the result from the Normaliz manual?
This issue also makes me wonder whether the results in the current examples are actually true (because maybe I am doing something wrong). For example, what would be the expected outputs for the 3-dimensional 0/1 cube?
Currently it gives me:
sage: C = Polyhedron(vertices=[[0,0,0],[0,0,1],[0,1,0],[0,1,1],[1,0,0],[1,0,1],
....:                          [1,1,0],[1,1,1]], backend='normaliz')
sage: C.hilbert_series()
t^3 + 3*t^2 + 3*t + 1
sage: C.ehrhart_series()
(t^2 + 4*t + 1)/(t^10 - t^9 - t^8 + 2*t^5 - t^2 - t + 1)
sage: C.ehrhart_series().denominator().factor()
(t + 1)^2 * (t - 1)^4 * (t^2 + 1) * (t^2 + t + 1)
sage: C.ehrhart_quasipolynomial()
t^3 + 3*t^2 + 3*t + 1
The Hilbert series and Ehrhart quasipolynomial look fine, but the denominator of the Ehrhart series is not what I would expect. Is that related to the discussion on p.24 of Normaliz' manual?
I would have expected (1-t)^4, so that the degree would be -2, so that when we dilate by a factor of two, we see the first interior integer point. Right?
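As a sanity check independent of Sage and Normaliz, the Ehrhart quasipolynomial of the 0/1 cube quoted in this comment can be verified by brute-force lattice point counting. This is a throwaway pure-Python sketch, not ticket code:

```python
from itertools import product

def cube_lattice_points(t, dim=3):
    """Count the lattice points in the t-th dilate t*[0,1]^dim of the unit cube."""
    return sum(1 for _ in product(range(t + 1), repeat=dim))

# For dim = 3 this matches the quoted C.ehrhart_quasipolynomial(),
# t^3 + 3*t^2 + 3*t + 1 = (t+1)^3:
for t in range(6):
    assert cube_lattice_points(t) == t**3 + 3*t**2 + 3*t + 1
```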
comment:49 Changed 18 months ago by
HilbertSeries works out of the box, provided you use the right input type:
polytope defines a cone and a grading, and the Hilbert series of the cone under this grading is the Ehrhart series of the polytope that arises as the intersection of the cone and the hyperplane of degree 1 elements. polytope is a homogeneous input type.
The inhomogeneous type vertices only defines a polytope P, or in connection with cone a potentially unbounded polyhedron. We can have a grading on the space in which the polytope lives, and then HilbertSeries makes sense. This Hilbert series is the generating function of the lattice point enumerator of the points in P. You can ask Normaliz to compute the Ehrhart series of P if P is bounded. It constructs a cone with a grading behind the scenes and computes the Hilbert series of that cone.
So far, so good.
In your cube example: the vertices are in principle rational vectors. What are your denominators? Does Sage choose them automatically? If so, what are they?
Moreover, it seems to me that Sage implicitly defines a grading on the space of the polytope, taking the sum of the coordinates. This explains why you get a Hilbert series and why it is exactly the result shown: 1 point of degree 0, 3 points of degree 1, 3 of degree 2, and 1 of degree 3.
The denominator of the Ehrhart series is the one for a 3-dimensional polytope that has vertices of degrees 4, 3, 2, and 1.
Before I can say more, I must know what happens on the way from Sage input to Normaliz input.
comment:50 Changed 18 months ago by
PS. My question about the denominators must be understood from the viewpoint of Normaliz input. For the 3-cube we need vertices with four coordinates in Normaliz, where the last coordinate represents a denominator for the first three. (This was introduced before Normaliz allowed fractions in input; it would be superfluous now.)
If you send the matrix with the 8 vertices as it is to Normaliz as input type vertices, then the result will be a BadInputException, because we have the denominator 0 for some vertices. In other words: Something must have happened to the Sage vertices before they go into Normaliz.
I think you want the denominator 1 attached to each vertex. If we do that, the Normaliz input becomes
amb_space auto
vertices [[0,0,0,1],[0,0,1,1],[0,1,0,1],[0,1,1,1],[1,0,0,1],[1,0,1,1],[1,1,0,1],[1,1,1,1]]
EhrhartSeries
and the output contains
Ehrhart series:
1 4 1
denominator with 4 factors:
1: 4
as one expects.
Again my question: what happens on the way from Sage input to Normaliz input?
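The homogenization step Winfried asks about (appending a common denominator as the last coordinate of each vertex) can be sketched in plain Python. homogenize_vertex is a hypothetical helper written only to illustrate the format; it assumes Python 3.9+ for math.lcm:

```python
from fractions import Fraction
from math import lcm

def homogenize_vertex(v):
    """Turn a vertex with rational coordinates into Normaliz's 'vertices'
    row format: integer numerators followed by a common denominator,
    e.g. (0, 0, 1) -> [0, 0, 1, 1]."""
    fracs = [Fraction(x) for x in v]
    d = lcm(*(f.denominator for f in fracs))
    return [int(f * d) for f in fracs] + [d]

homogenize_vertex([Fraction(1, 2), Fraction(1, 3)])  # -> [3, 2, 6]
```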
comment:51 Changed 18 months ago by
Dear Winfried,
Thanks for your quick reply. This explanation is really good; I should probably add it at the top to shortly describe how the translation from Sage to Normaliz is done.
Here is the verbose output of the example for the cube, where we can see the input given by Sage to Normaliz. The current way is to always use the inhomogeneous input types (vertices, cone, and subspace). Eventually, inhomogeneous equations and inequalities are added.
sage: C = Polyhedron(vertices=[[0,0,0],[0,0,1],[0,1,0],[0,1,1],[1,0,0],[1,0,1],[1,1,0],[1,1,1]],
....:                backend='normaliz', verbose=True)
# Calling PyNormaliz.NmzCone(**{'subspace': [], 'vertices': [[0, 0, 0, 1], [0, 0, 1, 1], [0, 1, 0, 1], [0, 1, 1, 1], [1, 0, 0, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 1]], 'cone': []})
# ----8<---- Equivalent Normaliz input file ----8<----
amb_space 3
subspace 0
vertices 8
 0 0 0 1
 0 0 1 1
 0 1 0 1
 0 1 1 1
 1 0 0 1
 1 0 1 1
 1 1 0 1
 1 1 1 1
cone 0
# ----8<-------8<-------8<----
Once I have this, I compute the Hilbert series without giving any grading. In this case, Sage assigns the grading [1,...,1], passes it to Normaliz (as you observed), and gets:
sage: C.hilbert_series()
t^3 + 3*t^2 + 3*t + 1
Now, for the Ehrhart series, I get:
sage: C.ehrhart_series()
(t^2 + 4*t + 1)/(t^10 - t^9 - t^8 + 2*t^5 - t^2 - t + 1)
As you said:
It constructs a cone with a grading behind the scenes and computes the Hilbert series of that cone.
What is the grading in this case?
So there seems to be a difference between the above input and the one you mentioned:
amb_space auto
vertices [[0,0,0,1],[0,0,1,1],[0,1,0,1],[0,1,1,1],[1,0,0,1],[1,0,1,1],[1,1,0,1],[1,1,1,1]]
EhrhartSeries
because they give different output for the Ehrhart series?
comment:52 followup: ↓ 53 Changed 18 months ago by
The Normaliz input is the desired one. If I run it with the computation goal EhrhartSeries I get the right answer, and if I run it with the computation goal HilbertSeries, then the answer is that it cannot be computed. Everything is o.k.
But it is also clear that this workflow is not exactly that of Sage/PyNormaliz: the latter constructs a cone and then calls functions that trigger computations and return results. For a better analysis, the next step is to put the same data into PyNormaliz and to see what happens.
comment:53 in reply to: ↑ 52 Changed 18 months ago by
Replying to Winfried:
The Normaliz input is the desired one. If I run it with the computation goal EhrhartSeries I get the right answer, and if I run it with the computation goal HilbertSeries, then the answer is that it cannot be computed. Everything is o.k.
Ok. Good to know!
But it is also clear that this workflow is not exactly that of Sage/PyNormaliz: the latter constructs a cone and then calls functions that trigger computations and return results. For a better analysis, the next step is to put the same data into PyNormaliz and to see what happens.
Ok, I will look into that.
comment:54 Changed 18 months ago by
Ok, I think that I have figured out something that I did not expect.
sage: C = Polyhedron(vertices=[[0,0,0],[0,0,1],[0,1,0],[0,1,1],[1,0,0],[1,0,1],[1,1,0],[1,1,1]],
....:                backend='normaliz', verbose=True)
# Calling PyNormaliz.NmzCone(**{'subspace': [], 'vertices': [[0, 0, 0, 1], [0, 0, 1, 1], [0, 1, 0, 1], [0, 1, 1, 1], [1, 0, 0, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 1]], 'cone': []})
# ----8<---- Equivalent Normaliz input file ----8<----
amb_space 3
subspace 0
vertices 8
 0 0 0 1
 0 0 1 1
 0 1 0 1
 0 1 1 1
 1 0 0 1
 1 0 1 1
 1 1 0 1
 1 1 1 1
cone 0
# ----8<-------8<-------8<----
So far I simply created the 0/1 cube using normaliz. Then, I get the normaliz cone and the enclosed data:
sage: cone = C._normaliz_cone; cone
<capsule object "Cone" at 0x7fcc880cf210>
sage: data = C._get_nmzcone_data()
From there, if I compute the Hilbert series directly, it will complain because it does not have a grading:
sage: PyNormaliz.NmzResult(cone, "HilbertSeries")
---------------------------------------------------------------------------
error                                     Traceback (most recent call last)
<ipython-input-26-93c9989a5377> in <module>()
----> 1 PyNormaliz.NmzResult(cone, "HilbertSeries")
error: Could not compute: No grading specified and cannot find one. Cannot compute some requested properties!
So far so good. Then, I ask for the Ehrhart series, and then it unlocks the Hilbert series:
sage: PyNormaliz.NmzResult(cone, "EhrhartSeries")
[[1L, 4L, 1L], [1L, 1L, 1L, 1L], 0L]
sage: PyNormaliz.NmzResult(cone, "HilbertSeries")
[[1L, 4L, 1L], [1L, 1L, 1L, 1L], 0L]
For now, there is no way to access the difference in the normaliz cone. It is now equipped with a grading in the background, but it seems that we cannot extract it:
sage: new_C = C.parent().element_class._from_normaliz_cone(parent=C.parent(), normaliz_cone=cone)
sage: PyNormaliz.NmzResult(new_C._normaliz_cone, "HilbertSeries")
[[1L, 4L, 1L], [1L, 1L, 1L, 1L], 0L]
sage: PyNormaliz.NmzResult(new_C._normaliz_cone, "Grading")
---------------------------------------------------------------------------
error                                     Traceback (most recent call last)
<ipython-input-43-809cb91899e5> in <module>()
----> 1 PyNormaliz.NmzResult(new_C._normaliz_cone, "Grading")
error: Could not compute: Grading !
But, when creating a whole new cone from the data and adding the grading by hand (which is what is done in hilbert_series in Sage), we get the enumeration of the lattice points with respect to this grading:
sage: C.hilbert_series()
t^3 + 3*t^2 + 3*t + 1
This can be reproduced manually as:
sage: data['grading'] = [1] * C.ambient_dim()
sage: new_cone = C._make_normaliz_cone(data)
sage: h = PyNormaliz.NmzResult(new_cone, "HilbertSeries"); h
[[1L, 3L, 3L, 1L], [], 0L]
So, right now, we get two potential outputs for the Hilbert series.
Further, if we actually change the normaliz cone while doing the computation of the Hilbert series (instead of doing it on a distinct cone created on the side), it then breaks the Ehrhart series:
sage: C._normaliz_cone = new_cone
sage: C.ehrhart_series()
---------------------------------------------------------------------------
error                                     Traceback (most recent call last)
<ipython-input-85-0117d93a819f> in <module>()
----> 1 C.ehrhart_series()
/home/jplabbe/sage/local/lib/python2.7/site-packages/sage/geometry/polyhedron/backend_normaliz.pyc in ehrhart_series(self, variable)
    636
    637         cone = self._normaliz_cone
--> 638         e = PyNormaliz.NmzResult(cone, "EhrhartSeries")
    639
    640         from sage.rings.polynomial.polynomial_ring_constructor import PolynomialRing
error: Some error in the normaliz input data detected: Grading not allowed with Ehrhart series in the inhomogeneous case
Oi Oi Oi...
comment:55 Changed 18 months ago by
Everything above is as it should be, with 2 exceptions:
1) The computation of the Ehrhart series should not cause the Hilbert series to be computed. This requires a correction in libnormaliz.
There is no grading in the background. The Ehrhart series was computed on an auxiliary cone.
2) Sage should not add a default grading if one asks for the Hilbert series. This is not in my hands.
Further remark: if one changes the grading later on (or defines one), all grading-dependent results are erased.
comment:56 Changed 18 months ago by
PS. The last observation is the intended behavior of Normaliz: HilbertSeries and EhrhartSeries are not allowed on the same input. One intended obstruction to this is that we do not even allow a grading if EhrhartSeries is to be computed.
comment:57 Changed 18 months ago by
The next Normaliz release will store the Hilbert series and the Ehrhart series separately and provide a specific access function for the Ehrhart series. Then the confusion that we see in the preceding remarks will not arise anymore.
comment:58 Changed 18 months ago by
Ok!!
So I went back to PyNormaliz and had a look at the tutorial mentioned in Appendix E of the normaliz manual:
https://mybinder.org/v2/gh/Normaliz/NormalizJupyter/master
In there, the PyNormaliz output of .hilbert_series is explained. Importantly, it seems that the output is different from the one of normaliz(!). This was the source of one of the problems above.
[[1L, 4L, 1L], [1L, 1L, 1L, 1L], 0L]
is to be interpreted as t^2 + 4t + 1 divided by (1-t)(1-t)(1-t)(1-t), with a shift of 0.
So, it's all good. But it indicates that the code should be more explicit about those output formats... I'll put it on my TODO list to add some comments on the PyNormaliz side...
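The interpretation of the PyNormaliz triple discussed in this comment can be made executable. eval_pynormaliz_series is a hypothetical helper, and it assumes the numerator coefficients are listed in ascending order of degree (which agrees with the palindromic [1, 4, 1] here either way):

```python
from fractions import Fraction

def eval_pynormaliz_series(triple, t):
    """Evaluate a PyNormaliz series triple [num_coeffs, denom_exponents, shift]
    at a rational point t, reading it as
    t^shift * (sum_i num_coeffs[i] * t^i) / prod_e (1 - t^e)."""
    num_coeffs, denom_exps, shift = triple
    t = Fraction(t)
    num = sum(Fraction(c) * t**i for i, c in enumerate(num_coeffs))
    den = Fraction(1)
    for e in denom_exps:
        den *= 1 - t**e
    return t**shift * num / den

# The cube's series [[1, 4, 1], [1, 1, 1, 1], 0] reads as
# (t^2 + 4t + 1)/(1-t)^4; at t = 1/2 this gives (13/4) / (1/16) = 52.
value = eval_pynormaliz_series([[1, 4, 1], [1, 1, 1, 1], 0], Fraction(1, 2))
```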
comment:59 followup: ↓ 61 Changed 18 months ago by
In the Normaliz output file we use a multiset notation. I think this is difficult to represent in Python. Therefore it is translated into a vector format where each element of the multiset is repeated as often as it occurs in the multiset.
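The translation described in this comment (multiset with multiplicities into a repeated-element vector) is short to sketch in Python; multiset_to_vector is a hypothetical illustration, not part of PyNormaliz:

```python
def multiset_to_vector(multiset):
    """Expand Normaliz's multiset notation (element -> multiplicity) into
    the repeated-element list used in PyNormaliz output."""
    out = []
    for elem, mult in sorted(multiset.items()):
        out.extend([elem] * mult)
    return out

# The cube's denominator "1: 4" (the factor 1 - t^1 with multiplicity 4)
# becomes the vector [1, 1, 1, 1] seen in the PyNormaliz output.
multiset_to_vector({1: 4})  # -> [1, 1, 1, 1]
```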
comment:60 Changed 18 months ago by
 Commit changed from 20d2fde78750f859770f05473f49d7de460d6bb1 to f36bfeb89fd9293a021a0bb2f8f9584ba1a5ca10
comment:61 in reply to: ↑ 59 Changed 18 months ago by
Replying to Winfried:
In the Normaliz output file we use a multiset notation. I think this is difficult to represent in Python. Therefore it is translated into a vector format where each element of the multiset is repeated as often as it occurs in the multiset.
Yes, it makes sense now. Somehow I got it wrong from the beginning. I just committed an update that should cover the mistake. I added several examples coming from the normaliz manual to be able to compare.
I still have to smooth out the usage for the user. But now, at least, things look like they work and make sense...
comment:62 Changed 18 months ago by
 Description modified (diff)
comment:63 Changed 18 months ago by
 Commit changed from f36bfeb89fd9293a021a0bb2f8f9584ba1a5ca10 to dbf7db846182a3bc496458d8eea1bf10e8792b0a
Branch pushed to git repo; I updated commit sha1. New commits:
dbf7db8  Fixed the Feature of Latte

comment:64 Changed 18 months ago by
 Description modified (diff)
Last 10 new commits:
Updating patch with upstream fix for wrong number of lattice points.
Merge branch 'u/tscrim/upgrade_noramliz_pynormaliz22984' of git://trac.sagemath.org/sage into u/tscrim/upgrade_noramliz_pynormaliz22984
Upgrade Normaliz to 3.5.2.
Adding tests from comment:24,25 of #22984.
Upgrade PyNormaliz to 1.12
Adapted the polyhedron docstring
Upgrade normaliz to 3.5.3
Merge branch to get docstring adaptation
Merge branch 'develop' into test_normaliz
First version of integral pts gen