Opened 4 years ago

Closed 3 years ago

# Polyhedron_normaliz.save

• Reported by: mkoeppe
• Priority: major
• Milestone: sage-9.1
• Component: geometry
• Cc: jipilab, Winfried, tscrim, gh-kliem
• Authors: Jonathan Kliem
• Reviewers: Travis Scrimshaw
• Merged in: N/A
• Commit: 57beae3
• Dependencies: #28639

```
sage: P = polytopes.dodecahedron(backend='normaliz')
sage: P
A 3-dimensional polyhedron in (Number Field in sqrt5 with defining polynomial x^2 - 5)^3 defined as the convex hull of 20 vertices
sage: P.save('dodecahedron.sobj')
TypeError: can't pickle PyCapsule objects
```

We fix this by removing the cone with `__getstate__` on pickling.

On unpickling we use `__setstate__` and `_cone_from_Vrepresentation_and_Hrepresentation` from #28639 to restore the cone.

Special care has to be taken in the following cases:

• no inequalities (the cone can only be initialized from Vrep),
• the empty polyhedron (cone is `None` in this case).
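
The mechanism can be illustrated in plain Python (the class and attribute names below are a toy stand-in, not Sage's actual implementation): an attribute that cannot be pickled, here a lambda playing the role of the PyNormaliz cone capsule, is dropped in `__getstate__` and rebuilt in `__setstate__`, with the empty case mapped to `None`:

```python
import pickle

class Poly:
    """Toy stand-in for Polyhedron_normaliz (names are illustrative)."""

    def __init__(self, vertices):
        self.vertices = vertices
        # Unpicklable member playing the role of the PyNormaliz cone
        # capsule: a lambda cannot be pickled, just like a PyCapsule.
        self._cone = self._build_cone() if vertices else None

    def _build_cone(self):
        return lambda: sorted(self.vertices)

    def __getstate__(self):
        state = self.__dict__.copy()
        del state['_cone']              # drop the unpicklable entry
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        # Empty polyhedron: the cone stays None; otherwise rebuild it.
        self._cone = self._build_cone() if self.vertices else None

P = Poly([(1, 0), (0, 1)])
Q = pickle.loads(pickle.dumps(P))     # round trip now succeeds
assert Q.vertices == P.vertices
assert Q._cone() == [(0, 1), (1, 0)]  # rebuilt cone is usable again
assert pickle.loads(pickle.dumps(Poly([])))._cone is None
```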

As the lines are recomputed, there is no guarantee that they appear in the same order in the normaliz cone. However, normaliz sorts the given lines anyway:

```
sage: Polyhedron(lines=[[1,0], [0,1]], backend='normaliz').lines()
(A line in the direction (1, 0), A line in the direction (0, 1))
sage: Polyhedron(lines=[[0,1], [1,0]], backend='normaliz').lines()
(A line in the direction (1, 0), A line in the direction (0, 1))
sage: Polyhedron(lines=[[1,1], [1,0]], backend='normaliz').lines()
(A line in the direction (1, 0), A line in the direction (0, 1))
```

Also, even if `_normaliz_cone` ends up with the lines in a different order, this should not be noticeable, as computations are invariant under the choice of line generators.

### comment:1 Changed 4 years ago by mkoeppe

• Description modified (diff)

### comment:2 Changed 3 years ago by gh-kliem

One could define in `Polyhedron_normaliz`:

```
def __getstate__(self):
    state = super(Polyhedron_normaliz, self).__getstate__()
    # Remove the unpicklable entries.
    del state[1]['_normaliz_cone']
    return state
```

This constructs an object just as

```
sage: P = P.base_extend(P.base_ring(), backend='field')
sage: P.base_extend(P.base_ring(), backend='normaliz')
```

(but saving computed results)

However, one would need a method to recover `_normaliz_cone` (this method is needed anyway, to make the second step above work).

### comment:4 Changed 3 years ago by gh-kliem

I think I know what to do about it.

1. As mentioned, one can simply remove the cone on pickling. The loaded object is then just as good as one obtained by changing the backend back and forth (and changing the backend to normaliz should also work and still give us a cone, or a way to retrieve the cone).
1. The next step would be to allow initialization of a cone from `Vrepresentation` and `Hrepresentation`. This works by homogenizing the input and explicitly giving a dehomogenization (this is the only way that Normaliz accepts precomputed data).

Note: I'm not proposing to allow to give both representations to normaliz by the user, but when changing fields or loading a stored object I think we should trust them to be correct.

1. Once this is done, one can set up the normaliz cone as a lazy attribute.
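
A lazy attribute means the cone is only rebuilt on first access, e.g. after unpickling. A minimal sketch with `functools.cached_property` as a stand-in (Sage itself would use `@lazy_attribute`, and the class below is illustrative, not Sage's):

```python
from functools import cached_property

calls = []  # records how often the cone is actually built

class Poly:
    """Illustrative stand-in; Sage would use @lazy_attribute instead."""

    def __init__(self, vertices):
        self.vertices = vertices

    @cached_property
    def _normaliz_cone(self):
        # Runs on first access (e.g. after unpickling), then is cached.
        calls.append('build')
        return tuple(sorted(self.vertices))

P = Poly([(1, 0), (0, 1)])
assert P._normaliz_cone == ((0, 1), (1, 0))
assert P._normaliz_cone == ((0, 1), (1, 0))   # cached: no second build
assert calls == ['build']
```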

### comment:5 Changed 3 years ago by gh-kliem

• Dependencies set to #28639

In #28639 I will implement a method that generates the cone from both Vrep and Hrep (recomputing the lines, but that's okay I guess). I have tested this with a few polyhedra, but I have no idea which examples might be tricky.

### comment:6 Changed 3 years ago by gh-kliem

• Description modified (diff)
• Milestone changed from sage-8.4 to sage-9.0

### comment:7 Changed 3 years ago by gh-kliem

• Authors set to Jonathan Kliem
• Branch set to public/26363
• Commit set to 984cc62eb2b46b245dc93f5ca9dec983ec478531

New commits:

c42c907 `method to obtain cone from Vrep and Hrep`
cc17f07 `added documentation to cone from normaliz data`
a54cfd9 `Merge branch 'public/28639' of git://trac.sagemath.org/sage into public/28639-reb`
fc4c596 `alignment fix in docs`
f42a4e2 `removed pickling restriction on normaliz tests`
984cc62 `fixed pickling/unpickling of normaliz polyhedra`

### comment:8 Changed 3 years ago by git

• Commit changed from 984cc62eb2b46b245dc93f5ca9dec983ec478531 to a75d3b22d3ece100a0cdfa925644d9bd040cba3c

Branch pushed to git repo; I updated commit sha1. New commits:

a75d3b2 `take care of special cases`

### comment:9 Changed 3 years ago by gh-kliem

• Description modified (diff)
• Status changed from new to needs_review

New commits:

a75d3b2 `take care of special cases`

### comment:10 Changed 3 years ago by git

• Commit changed from a75d3b22d3ece100a0cdfa925644d9bd040cba3c to 12b90604672457b96b8bb26d4b133478734d4c85

Branch pushed to git repo; I updated commit sha1. New commits:

12b9060 `fixing optional flags, added doctest for number field polyhedron pickling`

### comment:11 Changed 3 years ago by Winfried

I plan to extend Normaliz as suggested. The essential point is that Normaliz can skip a convex hull computation, but nevertheless can produce the facet/extreme-ray incidence vectors and keep them for further operations, like the modification of the original cone.

I am not sure whether one can give up the requirement to input the precomputed data in homogenized form with an added dehomogenization if they come from an inhomogeneous computation.
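
For concreteness, the homogenization mentioned above lifts an inhomogeneous V-representation to cone generators: each vertex gets an extra coordinate 1, each ray an extra 0, and the dehomogenization is the linear functional selecting that last coordinate. A sketch with a hypothetical helper (`homogenize` is not Sage's or Normaliz's interface, just an illustration of the arithmetic):

```python
def homogenize(vertices, rays):
    """Lift an inhomogeneous V-representation to cone generators.

    Hypothetical helper for illustration only: vertices get an extra
    coordinate 1 and rays an extra 0; the dehomogenization is the
    functional selecting that last coordinate.
    """
    gens = ([list(v) + [1] for v in vertices]
            + [list(r) + [0] for r in rays])
    dehomogenization = [0] * (len(gens[0]) - 1) + [1]
    return gens, dehomogenization

# Half-strip [0,1] x [0,oo): two vertices and one ray.
gens, dehom = homogenize(vertices=[(0, 0), (1, 0)], rays=[(0, 1)])
assert gens == [[0, 0, 1], [1, 0, 1], [0, 1, 0]]
assert dehom == [0, 0, 1]
```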

### comment:12 Changed 3 years ago by embray

• Milestone changed from sage-9.0 to sage-9.1

Ticket retargeted after milestone closed

### comment:13 Changed 3 years ago by gh-kliem

• Branch changed from public/26363 to public/26363-reb
• Commit changed from 12b90604672457b96b8bb26d4b133478734d4c85 to 57beae33ef606c604a22fffabd1cfaa02765e091

New commits:

 ​7681b24 `removed pickling restriction on normaliz tests` ​5a9decf `fixed pickling/unpickling of normaliz polyhedra` ​38dff79 `take care of special cases` ​57beae3 `fixing optional flags, added doctest for number field polyhedron pickling`

### comment:14 Changed 3 years ago by tscrim

• Reviewers set to Travis Scrimshaw
• Status changed from needs_review to positive_review

If there are any extensions to Normaliz that can be utilized in the future (as per comment:11), we can do further changes then. This is a nice improvement. LGTM.

### comment:15 Changed 3 years ago by vbraun

• Branch changed from public/26363-reb to 57beae33ef606c604a22fffabd1cfaa02765e091
• Resolution set to fixed
• Status changed from positive_review to closed

### comment:16 Changed 3 years ago by Winfried

• Commit 57beae33ef606c604a22fffabd1cfaa02765e091 deleted

Meanwhile I have implemented the use of precomputed data. Version 3.8.4 will very soon be published. Normaliz accepts `Type::extreme_rays` and `Type::support_hyperplanes` as precomputed data.

However, these do not always define the cone (or polyhedron) in which we want to compute, since nontrivial coordinate transformations may come into play. These are restored via `Type::generated_lattice` (also in the algebraic case) and `Type::maximal_subspace`.

It is also important that these precomputed data are considered homogeneous input types, so the polyhedron must be defined via `Type::dehomogenization` or `Type::grading`.

See Sections 6.21 and D.8.16 in Normaliz.pdf (as of today).
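
As a rough illustration only (the syntax is reconstructed from memory of the Normaliz manual and may be inexact; consult the sections cited above for the authoritative format), a Normaliz input file feeding precomputed data for the homogenized triangle with vertices (0,0), (1,0), (0,1) might look like:

```
amb_space 3
extreme_rays 3
0 0 1
1 0 1
0 1 1
support_hyperplanes 3
 1  0 0
 0  1 0
-1 -1 1
dehomogenization
0 0 1
```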
