2010, arXiv preprint arXiv:1008.2923
…
In this paper we propose a generalized spectral theory for tensors. Our proposed factorization decomposes a tensor into a product of orthogonal and diagonal tensors. At the same time, our factorization offers an expansion of a tensor as a summation of lower ...
Annales de la faculté des sciences de Toulouse Mathématiques, 2011
In this paper we propose a general spectral theory for tensors. Our proposed factorization decomposes a tensor into a product of orthogonal and scaling tensors. At the same time, our factorization yields an expansion of a tensor as a summation of outer products of lower order tensors. Our proposed factorization shows the relationship between the eigen-objects and the generalised characteristic polynomials. Our framework is based on a consistent multilinear algebra which explains how to generalise the notion of matrix Hermiticity, matrix transpose, and most importantly the notion of orthogonality. Our proposed factorization for a tensor in terms of lower order tensors can be recursively applied so as to naturally induce a spectral hierarchy for tensors.
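For orientation only (this is not the authors' tensor construction), the order-two special case that such a factorization generalizes is the familiar symmetric eigendecomposition, in which the product of orthogonal and scaling factors and the expansion as a sum of rank-1 outer products coincide. A minimal NumPy sketch:

# Order-2 analogue of the factorization described above: a symmetric matrix
# decomposes as A = U D U^T (orthogonal x diagonal x orthogonal-transpose),
# which is equivalently an expansion as a sum of rank-1 outer products.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))
A = (X + X.T) / 2                       # symmetric (real "Hermitian") matrix

eigvals, U = np.linalg.eigh(A)          # U orthogonal, eigvals the scaling factors
D = np.diag(eigvals)
assert np.allclose(A, U @ D @ U.T)      # product form

# Expansion form: summation of lower-rank (rank-1) terms.
expansion = sum(eigvals[k] * np.outer(U[:, k], U[:, k]) for k in range(4))
assert np.allclose(A, expansion)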
2010
In this paper we propose a generalized spectral theory for tensors. Our proposed factorization decomposes a symmetric tensor into a product of an orthogonal and a diagonal tensor. At the same time, our factorization offers an expansion of a tensor as a summation of lower rank tensors that are obtained through an outer product defined on matrices. Our proposed factorization shows the relationship between the eigen-objects (eigen matrices and eigen vectors for order-3 tensors) and the generalized determinant and trace operators. Our framework is based on a consistent multilinear algebra that explains how we can generalize the notion of Hermitian matrices, the notion of transpose, and most importantly the notion of orthogonality for tensors.
2013
In general, a tensor is a multilinear transformation defined over an underlying finite dimensional vector space. In this brief introduction, tensor spaces of all integral orders will be defined inductively. Initially, the underlying vector space, V, will be assumed to be an inner product space in order to simplify the discussion. Subsequently, the presentation will be generalized to vector spaces without an inner product. Usually, bold-face letters, a, will denote vectors in V and upper case letters, A, will denote tensors. The inner product on V will be denoted by a · b.
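As a hedged illustration of these conventions in coordinate (array) form — the names a, b, A follow the notation above, while the dimension and entries are ad hoc — an order-2 tensor acts as a bilinear map on pairs of vectors:

# Illustrative sketch only: an order-2 tensor A over an inner product space V
# acts as a bilinear map (a, b) -> A(a, b), computed by contracting A against
# its two vector arguments; a . b is the inner product on V.
import numpy as np

a = np.array([1.0, 2.0, 3.0])           # vectors in V (bold-face a, b above)
b = np.array([0.5, -1.0, 2.0])
A = np.arange(9.0).reshape(3, 3)        # an order-2 tensor on V

inner = a @ b                           # the inner product a . b
bilinear = np.einsum('ij,i,j->', A, a, b)   # A applied to the pair (a, b)
print(inner, bilinear)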
2011
Operations with tensors, or multiway arrays, have become increasingly prevalent in recent years. Traditionally, tensors are represented or decomposed as a sum of rank-1 outer products using either the CANDECOMP/PARAFAC (CP) or the Tucker models, or some variation thereof. Such decompositions are motivated by specific applications where the goal is to find such an approximate representation for a given multiway array. The specifics of the approximate representation (such as how many terms to use in the sum, orthogonality constraints, etc.) depend on the application.
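A minimal sketch of the CP model itself (not of any fitting algorithm; the sizes and the rank R are arbitrary), assembling a third-order tensor as a sum of R rank-1 outer products:

# CP (CANDECOMP/PARAFAC) model: T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r].
import numpy as np

I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Rank-R tensor assembled from its factor matrices in one contraction.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# The same tensor built term by term as a sum of rank-1 outer products.
T_sum = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r]) for r in range(R))
assert np.allclose(T, T_sum)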
2001
Tensors are mathematical objects that generalize vectors and matrices. They describe geometrical quantities and are used in various applied settings, including mathematical physics. The indicial notation of tensors permits us to write an expression in a compact manner and to use simplifying mathematical operations. In a large number of problems in differential geometry and general relativity, the straightforward but time-consuming algebraic manipulation clearly plays a major role. Thus, tensor computation came into existence and became necessary and desirable at the same time. Over the past 25 years, a few algorithms have appeared for simplifying tensor expressions. Among the most important tensor computation systems, we can mention SHEEP, the Macsyma Tensor Package, MathTensor and GRTensorII. Meanwhile, graph theory, which had been lying almost dormant for hundreds of years since the time of Euler, started to explode by the turn of the 20th century. It has now grown into a major discipline in mathematics, which has branched off today in various directions such as coloring problems, Ramsey theory, factorization theory and optimization, with problems permeating into many scientific areas such as physics, chemistry, engineering, psychology, and of course computer science. Investigating some of the tensor computation packages will show that they have some deficiencies. Thus, rather than building a new system and adding more features to it, an objective of this thesis was to develop efficient algorithms, based on graph theory, that remove most, if not all, of the restrictions found in other packages. A summary of the implementation and the advantages of this system is also included.

I am thankful to my supervisor, Professor Stephen Watt, who taught me everything I needed to know about tensor expressions, for accepting to supervise me, for his kindness and generous contributions of time, and for his careful commentary on my thesis. Without him, this work would never have been completed. I would also like to thank every person in the SCL lab for their helpful advice and useful comments on this thesis. Furthermore, I am sincerely grateful to my parents, who kept supporting me regardless of the consequences, and to my family, especially my wife, for their endless support and love. I shall never forget that.
Mathematics, 2023
In the present paper, we study two different approaches to tensor decomposition. The first part studies some properties of tensors that result from the fact that some components vanish in certain coordinates. It is proven that these conditions allow tensor decomposition, especially for (1, σ)-tensors, σ = 1, 2, 3. We apply the results to special tensors such as the Riemann, Ricci, Einstein, and Weyl tensors and the deformation tensors of affine connections. Thereby, we find new criteria for Einstein spaces, spaces of constant curvature, and projectively and conformally flat spaces. Further, the proof of the theorem of Mikeš and Moldobayev is repaired; it has been used in many works and is a generalization of the criteria formulated by Schouten and Struik. The second part deals with the properties of a special differential operator with respect to the general decomposition of tensor fields on manifolds with affine connection. It is shown that the properties of special differential operators are transferred to the components of a given decomposition.
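For orientation, the classical pointwise decomposition of the Riemann tensor into its Weyl (conformal), trace-adjusted Ricci, and scalar curvature parts, which the special tensors named above enter, reads (n is the dimension, R_{ab} the Ricci tensor, R the scalar curvature, C_{abcd} the Weyl tensor):

R_{abcd} = C_{abcd} + \frac{1}{n-2}\left( g_{ac} R_{bd} - g_{ad} R_{bc} + g_{bd} R_{ac} - g_{bc} R_{ad} \right) - \frac{R}{(n-1)(n-2)}\left( g_{ac} g_{bd} - g_{ad} g_{bc} \right).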
Journal of Mathematical Sciences, 2009
Elementary information on polynomials with tensor coefficients and operations with them is given. A generalized Bezout theorem is stated and proved, and on this basis the Hamilton-Cayley theorem is proved; another proof of the latter theorem is also considered. Several important theorems are proved, which apply in deducing the formula expressing the adjunct tensor B(λ) of the tensor binomial λ·(2p)E − A in terms of the tensor A ∈ C2p(Ω) (elements of this module are complex tensors of rank 2p) and its invariants. Furthermore, the definitions are given of the minimal polynomial of a tensor of the module C2p(Ω), of a tensor of the module Cp(Ω) (whose elements are complex tensors of rank p), and of a tensor of the module Cp(Ω) with respect to a given tensor of the module C2p(Ω). Here, Ω is some domain of the n-dimensional Euclidean (Riemannian) space. Some theorems concerning minimal polynomials are stated and proved. Moreover, the first, second, and third theorems on the splitting of the module Cp(Ω) into invariant submodules are given. Special attention is paid to theorems on adjoint, normal, Hermitian, and unitary tensors of the modules C2p(Ω) and R2p(Ω) (elements of the latter module are real tensors of rank 2p). The theorem on polar decomposition [4, 6, 9, 13, 14], the Schur theorem [6], and the existence theorems for a general complete orthonormal system of eigentensors for a finite or infinite set of pairwise commuting normal tensors of the modules C2p(Ω) and R2p(Ω) are generalized to tensors of a complex module of an arbitrary even order. Canonical representations of normal, conjugate, Hermitian, and unitary tensors of the module C2p(Ω) are given (the definition of this module can be found in [3, 17]). Moreover, the Cayley formulas for linear operators [6] are generalized to tensors of the module C2p(Ω).

Definition 1.1. The tensor B(λ) ∈ C2p(Ω), whose components are polynomials with respect to λ, is called a polynomial with tensor coefficients, or a tensor polynomial, or a λ-tensor. By virtue of the definition, the components of the tensor polynomial B(λ) ∈ C2p(Ω) are represented in the form B^{i_1}…
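For orientation, the matrix case of the Hamilton-Cayley theorem that is generalized here to tensors of even rank can be verified numerically; a minimal NumPy sketch (the matrix case only, not the tensor-module construction of the paper):

# Hamilton-Cayley for matrices: a square matrix satisfies its own
# characteristic polynomial, p(A) = 0.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

coeffs = np.poly(A)                      # characteristic polynomial coefficients, leading term first
pA = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - k) for k, c in enumerate(coeffs))
print(np.max(np.abs(pA)))                # ~0 up to round-off, as the theorem asserts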
2021
The extensions of the Riemannian structure include the Finslerian one, which has in recent years provided successful models in various fields such as Biology, Physics, GTR, Monolayer Nanotechnology, and the Geometry of Big Data. The present article provides the necessary notions on tensor spectral data and on the HO-SVD and Candecomp tensor decompositions, and further studies several aspects related to the spectral theory of the main symmetric Finsler tensors, the fundamental and the Cartan tensor. In particular, two Finsler models used in Langmuir-Blodgett Nanotechnology and in Oncology are addressed. As well, the HO-SVD and Candecomp decompositions are exemplified for these models, and metric extensions of the eigenproblem are proposed.
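A minimal NumPy sketch of the generic HO-SVD construction mentioned above (background only, not the Finsler-specific models of the article): the factor matrices are the left singular vectors of the mode unfoldings, and the core tensor is obtained by contracting against their transposes.

# HO-SVD sketch: T = S x_1 U1 x_2 U2 x_3 U3 with orthogonal U_n.
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 4, 5))

# Orthogonal factor matrices from the SVD of each mode unfolding.
U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(T.ndim)]

# Core tensor S = T x_1 U1^T x_2 U2^T x_3 U3^T.
S = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])

# The factorization reconstructs T exactly.
T_rec = np.einsum('abc,ia,jb,kc->ijk', S, U[0], U[1], U[2])
assert np.allclose(T, T_rec)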
2011
The tensor decomposition addressed in this paper may be seen as a generalisation of the Singular Value Decomposition of matrices. We consider general multilinear and multihomogeneous tensors. We show how to reduce the problem to a truncated moment matrix problem and give a new criterion for flat extension of Quasi-Hankel matrices. We connect this criterion to the commutation characterisation of border …
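A toy illustration of the moment-matrix idea in the simplest symmetric setting (a binary cubic, in the spirit of Sylvester's classical algorithm; this is background, not the quasi-Hankel flat-extension criterion of the paper): the rank of the Hankel (catalecticant) matrix built from the form's coefficients bounds the number of rank-1 terms in its decomposition.

# Coefficients a_k of f(x, y) = sum_k binom(3, k) a_k x^(3-k) y^k.
# Here f = x^3 + y^3, which decomposes into two cubes of linear forms.
import numpy as np

a = np.array([1.0, 0.0, 0.0, 1.0])

H = np.array([[a[0], a[1], a[2]],
              [a[1], a[2], a[3]]])       # Hankel (catalecticant) matrix of the form

print(np.linalg.matrix_rank(H))          # 2, matching the two-term decomposition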