Changes between Version 17 and Version 18 of Ticket #31991
- Timestamp: 06/19/21 21:27:42 (12 months ago)
Ticket #31991 – Description
Modified:

v17:
Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, an idea originally named "Density Matrix Renormalization Group (DMRG)" and later popularized by Prof. Ivan Oseledets.

v18:
Following the progress with Ticket #30307, we propose to add computational functionalities with tensors via the tensor train decomposition, an idea originally named Matrix Product States in the quantum physics community and later popularized by Prof. Ivan Oseledets. As a computational tool, it represents tensors in a compressed format that enables tensor operations whose cost scales only linearly in the number of dimensions.

Modified:

v17:
This functionality is especially useful for high-dimensional tasks, to prevent the curse of dimensionality (such as in Riemannian optimization), and is currently supported in MATLAB, Python, and C++. The MATLAB version is currently the most versatile and robust.

We may consider internalizing this computational library for Sage, as an effort to formalize the currently available implementations. This addition can either be an additional child class under `tensor` that overrides the arithmetic, or a separate module. The backend is currently proposed to be TensorFlow; however, this may change / be flexible based on how the refactoring of `Components` progresses.

v18:
- Affleck et al., Valence bond ground states in isotropic quantum antiferromagnets (https://link.springer.com/article/10.1007/BF01218021)
- Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3)

This functionality is especially useful for high-dimensional tasks, to prevent the curse of dimensionality (such as in Riemannian optimization). Numerical packages are currently supported in MATLAB, Python, and C++. The MATLAB version is currently the most versatile and robust.

We may consider internalizing this computational library for Sage, as an effort to formalize the currently available implementations. This addition can either be an additional child class under `tensor` that overrides the arithmetic, or a separate module. The backend is currently proposed to be `tensorflow`; however, this may change / be flexible based on how the refactoring of `Components` progresses.

Unmodified:

One can look at this GitHub page for an example implementation in MATLAB (https://github.com/oseledets/TT-Toolbox).

…

- Quantized tensor train (see section 3.5.1): https://refubium.fu-berlin.de/handle/fub188/3366

Removed (v17):

For the theory behind why one can reduce a normally exponential (in the number of dimensions) task to a linear one, see Oseledets, Tensor-Train Decomposition, SIAM J. Sci. Comput., 33(5), 2295–2317 (https://epubs.siam.org/doi/abs/10.1137/090752286?journalCode=sjoce3).

Unmodified:

'''Tickets:'''
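For a concrete sense of the compression the description refers to, the following is a minimal sketch of the TT-SVD procedure from the Oseledets reference, written against plain NumPy rather than any proposed Sage or `tensorflow` interface. The function names `tt_svd` and `tt_to_full`, the tolerance-based truncation rule, and the toy example are illustrative choices and are not part of the ticket.

{{{#!python
import numpy as np

def tt_svd(tensor, eps=1e-10):
    """Decompose a dense tensor into TT cores via sequential truncated SVDs.

    Minimal sketch of the TT-SVD algorithm (Oseledets 2011); the truncation
    rule (drop singular values below eps * largest) is a simplification of
    the paper's error-balanced criterion.
    """
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = max(1, int(np.sum(s > eps * s[0])))  # truncated TT rank r_k
        cores.append(u[:, :new_rank].reshape(rank, dims[k], new_rank))
        # carry the remainder to the next mode: diag(s) @ Vt, refolded
        mat = (s[:new_rank, None] * vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into a dense tensor (to check the sketch)."""
    result = cores[0]                     # shape (1, n1, r1)
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=1)
    return result.squeeze(axis=(0, -1))

# Example: a 6-dimensional tensor with 4**6 = 4096 entries whose TT ranks
# are at most 2, so the TT format needs only a few dozen parameters.
grids = np.meshgrid(*[np.linspace(0, 1, 4)] * 6, indexing="ij")
full = sum(grids)                         # f(x1,...,x6) = x1 + ... + x6
cores = tt_svd(full)
print([c.shape for c in cores])           # (1,4,2), (2,4,2), ..., (2,4,1)
print(np.allclose(tt_to_full(cores), full))  # True up to the truncation tolerance
print(sum(c.size for c in cores), "parameters vs", full.size, "dense entries")
}}}

In TT format, a d-dimensional tensor with mode size n and TT ranks bounded by r is stored with at most d·n·r² numbers instead of n^d dense entries, which is the linear-versus-exponential scaling that the removed v17 sentence and the Oseledets reference point to; in the toy example above this is 80 parameters against 4096 entries.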