Special Issue: Recent Advances in Fast Matrix and Tensor Computations
Guest Editors
Prof. Xian-Ming Gu
School of Economic Mathematics, Southwestern University of Finance and Economics, Chengdu 611130, China
Email: guxm@swufe.edu.cn
Prof. Junfeng Yin
Department of Mathematics, Tongji University, Shanghai 200092, China
Email: yinjf@tongji.edu.cn
Prof. Bruno Carpentieri
Faculty of Computer Science, Free University of Bozen-Bolzano, 39100 Bolzano, Italy
Email: Bruno.Carpentieri@unibz.it
Dr. Yong-Liang Zhao
School of Mathematical Sciences, Sichuan Normal University, Chengdu 610068, China
Email: ylzhao@sicnu.edu.cn
Manuscript Topics
Matrix and tensor computations are at the core of a multitude of applications in diverse domains of scientific computing and data science. These computations present several challenges due to their complexity, high computational cost, and large memory footprint. While general-purpose high-performance libraries for matrix computations on multicore CPUs and GPUs are widely available and widely used, the same is not true for tensor computations. At present, scientific advances in tensor and matrix computations remain scattered, and although there is a growing community working in this area, researchers in the field have rather limited visibility and tend to work in a compartmentalized fashion.
This Special Issue aims to bring together experts from distinct domains, such as the numerical solution of PDEs, fractional differential and integro-differential equations, image processing, and computational electromagnetics, to name a few, to uncover computational challenges, bottlenecks, and advances in high-performance tensor and matrix computations arising in those disciplines. The goal is to enhance understanding of the similarities and differences in the tensor operations and computational tasks across these fields, and to seek pathways toward general-purpose software libraries and frameworks for high-performance tensor computations. This also takes into account the analytical and algebraic foundations of structured tensor and matrix decompositions and approximations. In the long term, the vision is to create a framework for tensor and matrix computations that is accessible to practitioners and can be used to carry out efficient, parallel, and high-performance tensor and matrix calculations.
We are interested in topics including but not limited to the following:
• Fractional integro-differential equations (of any type) and their generalizations
• Methodologies for solving (variable-order) fractional differential equations
• Applications of fractional calculus to real-world problems
• Parallel-in-time algorithms and applications
• Numerical linear algebra with applications
• Computational electromagnetics
• Image processing models and efficient numerical methods for them
• Tensor analysis and computations
• Low-rank matrix and tensor approximations
• Randomized matrix algorithms
Keywords
Fractional differential equations, partial differential equations, numerical and approximation methods, parallel-in-time methods, applications of fractional calculus, numerical linear algebra, matrix and tensor computations, image processing, computational electromagnetics.
For instructions for authors, please visit
https://www.aimspress.com/era/news/solo-detail/instructionsforauthors
Please submit your manuscript to the online submission system
https://aimspress.jams.pub/