2010
This paper explores Lie algebras and their corresponding linear groups, focusing on critical concepts in linear algebra that are foundational for understanding these mathematical structures. Key definitions and theorems related to properties of matrices, including norms, self-adjoint matrices, and eigenvalues, are presented. Additionally, the paper discusses the properties of linear Lie algebras and algebra homomorphisms, illustrating the relationships between these mathematical constructs and their applications in various fields.
Linear Algebra and its Applications, 2018
This note is concerned with isometries on the spaces of selfadjoint traceless matrices. We compute the group of isometries with respect to any unitary similarity invariant norm. This completes and extends the result of Nagy on Schatten p-norm isometries. Furthermore, we point out that our proof techniques could be applied to obtain an old result concerning isometries on skew-symmetric matrices.
Advanced Information and Knowledge Processing, 2014
Linear Algebra and its Applications, 1997
Given a square, nonnegative, symmetric matrix A, the Rayleigh quotient of a nonnegative vector u under A is given by Q_A(u) = uᵀAu / uᵀu. We show that Q_A(√(u ∘ Au)) is not less than Q_A(u), where √ denotes the coordinatewise square root and ∘ is the Hadamard product, but that Q_A(Au) may be smaller than Q_A(u).
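The claimed inequality Q_A(√(u ∘ Au)) ≥ Q_A(u) can be checked numerically. A minimal sketch in pure Python, on one concrete nonnegative symmetric matrix; the helper names (`rayleigh`, `matvec`, `dot`) are ours, not the paper's:

```python
import math

def matvec(A, u):
    # ordinary matrix-vector product
    return [sum(a * x for a, x in zip(row, u)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def rayleigh(A, u):
    # Rayleigh quotient Q_A(u) = u^T A u / u^T u
    return dot(u, matvec(A, u)) / dot(u, u)

# a nonnegative symmetric matrix and a nonnegative vector
A = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 2.0],
     [0.0, 2.0, 1.0]]
u = [1.0, 0.5, 2.0]

Au = matvec(A, u)
v = [math.sqrt(x * y) for x, y in zip(u, Au)]  # coordinatewise sqrt of u ∘ Au

print(rayleigh(A, v) >= rayleigh(A, u))
```

On this example Q_A(u) ≈ 2.24 while Q_A(√(u ∘ Au)) ≈ 3.75, consistent with the abstract's claim; one instance is of course an illustration, not a proof.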
Mathematics and Computers in Simulation, 1989
Global Journal of Science Frontier Research, 2018
This paper begins the study of bicomplex matrices. In this paper, we have defined bicomplex matrices, the determinant of a bicomplex square matrix, and singular and non-singular matrices in C₂. We have proved that the set of all bicomplex square matrices of order n is an algebra. We have given some definitions and results regarding the adjoint and inverse of a matrix in C₂. We have defined three types of conjugates and three types of tranjugates of a bicomplex matrix. With the help of these conjugates and tranjugates, we have also defined symmetric and skew-symmetric matrices, and Hermitian and skew-Hermitian matrices in C₂.
Mathematical Sciences and Applications E-Notes, 2018
We present a new study of the square roots of real 2 × 2 matrices, with a special view towards examples, some of them inspired by geometry. We begin with a general matrix A = [a b; c d] ∈ M₂(R) and ask: is there a matrix B ∈ M₂(R) such that B² = A? Such a matrix B is called a square root of A. We point out that the more complicated case of a real matrix of order 3 is discussed in [4]. Although the case we consider is also well studied (according to the bibliography of [3]), we add several examples and facts concerning this notion, as well as a series of geometrical applications.

The Euclidean example. We recall the n-orthogonal group O(n) = {A ∈ Mₙ(R) : Aᵗ · A = Iₙ}; it is the invariance group of the Euclidean inner product ⟨·,·⟩ (yielding the usual Euclidean norm ‖·‖). If A ∈ O(n), then (det Aᵗ)(det A) = det Iₙ = 1 implies det A = ±1. Hence, the orthogonal group splits into two components, O(n) = SO(n) ⊔ O⁻(n), where SO(n) contains the matrices from O(n) having det A = 1 and O⁻(n) those with det A = −1; ⊔ denotes the disjoint union of sets. SO(n) is a subgroup of O(n) and is called the n-special orthogonal group. O⁻(n) is not closed under products: A₁, A₂ ∈ O⁻(n) implies A₁A₂ ∈ SO(n). Since M₁(R) = R we have O(1) = {±1}, with SO(1) = {1} and O⁻(1) = {−1}; we remark that O(1) contains the integer roots of unity! We know O(2) as well:

R(t) = [cos t  −sin t; sin t  cos t] ∈ SO(2),  S(t) = [cos t  sin t; sin t  −cos t] ∈ O⁻(2),  t ∈ R.

Hence we have S(t)² = [cos t  sin t; sin t  −cos t] · [cos t  sin t; sin t  −cos t] = I₂, which means that every S(t) is a square root of the unit matrix I₂. We recall that, from a geometrical point of view, a square root of the unit matrix is called an almost product structure; see for example [6].
Geometrical significance: R(t) is the matrix of the rotation of angle t in the trigonometric sense (i.e. counterclockwise) around the origin, and S(t) is the matrix of the axial symmetry with respect to d_{t/2}, the line in the plane R² which passes through the origin O and makes the oriented angle t/2 with Ox. We have S(t₂) · S(t₁) = R(t₂ − t₁), while S(t₁) · S(t₂) = R(t₁ − t₂).
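The two identities above (every S(t) squares to I₂, and a product of two reflections is a rotation by the difference of angles) can be verified directly on the 2 × 2 matrices. A minimal numeric sketch; the helper names `matmul` and `close` are ours:

```python
import math

def R(t):
    # rotation by angle t (counterclockwise)
    return [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]

def S(t):
    # axial symmetry with respect to the line through O at angle t/2
    return [[math.cos(t),  math.sin(t)],
            [math.sin(t), -math.cos(t)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def close(A, B, eps=1e-12):
    return all(abs(a - b) < eps
               for ra, rb in zip(A, B) for a, b in zip(ra, rb))

t1, t2 = 0.7, 1.9
assert close(matmul(S(t1), S(t1)), [[1, 0], [0, 1]])   # S(t)^2 = I_2
assert close(matmul(S(t2), S(t1)), R(t2 - t1))         # reflection ∘ reflection = rotation
```

Note the order matters: S(t₁)·S(t₂) gives R(t₁ − t₂), the inverse rotation, so the two reflections commute only when the rotation is an involution.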
2015
We explore properties and representations of first and second order spectral shift functions evaluated for pairs of self-adjoint matrices guided by examples. We examine the conditions sufficient for the first and second order spectral shift functions to be identically 0 in 3.11, 4.10, and 4.13.
Documenta Mathematica
Chapter 1. Generalities. Definitions of group, isomorphism, representation, vector space and algebra. Biographical notes on Galois, Abel and Jacobi are given.
Chapter 2. Lie groups and Lie algebras. Lie groups, infinitesimal generators, structure constants, Cartan's metric tensor, simple and semisimple groups and algebras, compact and non-compact groups. Biographical notes on Euler, Lie and Cartan are given.
Chapter 3. Rotations: SO(3) and SU(2). Rotations and reflections, connectivity, center, universal covering group.
Chapter 4. Representations of SU(2). Irreducible representations, Casimir operators, addition of angular momenta, Clebsch–Gordan coefficients, the Wigner–Eckart theorem, multiplicity. Biographical notes on Casimir, Weyl, Clebsch, Gordan and Wigner are given.
Chapter 5. The so(n) algebra and Clifford numbers. Spin(n), spinors and semispinors, Schur's lemma. Biographical notes on Clifford and Schur are given.
Chapter 6. Reality properties of spinors. Conjugate, orthogonal and symplectic representations.
Chapter 7. Clebsch–Gordan series for spinors. Antisymmetric tensors, duality.
Chapter 8. The center and outer automorphisms of Spin(n). Inversion; Z₂, Z₄ and Z₂ × Z₂ centers. A biographical note on Dynkin is given.
SIAM Journal on Matrix Analysis and Applications, 2004
Given an n-by-n Hermitian matrix A and a real number λ, index i is said to be Parter (resp. neutral, downer) if the multiplicity of λ as an eigenvalue of A(i) is one more (resp. the same, one less) than that in A. In case the multiplicity of λ in A is at least 2 and the graph of A is a tree, there are always Parter vertices. Our purpose here is to advance the classification of vertices and, in particular, to relate classification to the combinatorial structure of eigenspaces. Some general results are given and then used to deduce some rather specific facts, not otherwise easily observed. Examples are given.
Linear Algebra and Its Applications, 2006
Basic classes of matrices or linear transformations in finite dimensional quaternionic vector spaces with a regular indefinite inner product are introduced and studied. The classes include plus matrices, selfadjoint, skew-adjoint, and unitary matrices. In particular, results are proved concerning extensions of invariant semidefinite subspaces. Canonical form for unitary matrices is developed and subsequently applied to stability of periodic Hamiltonian systems.
The purpose of these lectures is to report on the recent solution of a 50-year-old problem: describing the set of eigenvalues of a sum of two Hermitian matrices with prescribed eigenvalues.

1. Statement of the problem. For a field F, denote by Fⁿ the vector space of column vectors f = (f₁, ..., fₙ)ᵀ with entries in F. We will mostly assume that F is either the field of reals R or of complexes C. We view Rⁿ and Cⁿ as inner product spaces with the inner product (x, y) equal to yᵀx or y*x, respectively. Set Rⁿ≥ := {x = (x₁, ..., xₙ)ᵀ ∈ Rⁿ : x₁ ≥ x₂ ≥ ⋯ ≥ xₙ}. Let Sₙ ⊂ Hₙ be the real vector spaces of n × n real symmetric and Hermitian matrices, respectively. Note that Sₙ and Hₙ describe the spaces of self-adjoint operators on Rⁿ and Cⁿ, respectively, with respect to the standard inner product (·,·). Let A ∈ Hₙ. It is well known that Cⁿ has an orthonormal basis consisting entirely of eigenvectors of A:
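The flavor of the problem can be illustrated on 2 × 2 real symmetric matrices, where eigenvalues have a closed form: the trace constraint forces the eigenvalue sums to add, while Weyl-type inequalities bound the extreme eigenvalues of A + B. A small sketch under those standard facts; the helper `eigs_sym2` is ours:

```python
import math

def eigs_sym2(M):
    # eigenvalues of a 2x2 real symmetric matrix [[a, b], [b, d]], largest first
    a, b, d = M[0][0], M[0][1], M[1][1]
    m = (a + d) / 2.0
    r = math.sqrt(((a - d) / 2.0) ** 2 + b * b)
    return [m + r, m - r]

A = [[3.0, 1.0], [1.0, 2.0]]
B = [[1.0, -2.0], [-2.0, 4.0]]
C = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]  # C = A + B

la, lb, lc = eigs_sym2(A), eigs_sym2(B), eigs_sym2(C)

# trace is linear, so eigenvalue sums add exactly
assert abs(sum(lc) - (sum(la) + sum(lb))) < 1e-12
# Weyl inequalities bound the extreme eigenvalues of the sum
assert lc[0] <= la[0] + lb[0] + 1e-12   # λ1(A+B) ≤ λ1(A) + λ1(B)
assert lc[1] >= la[1] + lb[1] - 1e-12   # λ2(A+B) ≥ λ2(A) + λ2(B)
```

The full solution of the problem (the Horn conjecture) characterizes exactly which eigenvalue triples can occur; the checks above show only two of the necessary constraints.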
Linear Algebra and its Applications, 2013
The identity ray, αI for α > 0, can be seen as the center ray of the cone of symmetric and positive definite (SPD) matrices. In that sense, the angle that any SPD matrix forms with the identity plays a very important role to understand the geometrical structure of the cone. In this work, we extend this relationship, and analyze the geometrical structure of symmetric matrices including the location of all orthogonal matrices, not only the identity matrix. This geometrical understanding leads to new results in the subspace of symmetric matrices. We also extend some of the geometrical results for the case of general (not necessarily symmetric) nonsingular matrices.
2014
In this paper we prove that matrix groups are manifolds and use them as a special case to introduce the concepts of Lie groups, Lie algebras, and the exponential map.
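The exponential map mentioned above sends a matrix Lie algebra into its Lie group; for instance, it maps the skew-symmetric matrices so(2) onto the rotation group SO(2). A minimal sketch using the truncated power series exp(A) = Σ Aᵏ/k! (the helper names are ours, and the fixed truncation is an illustrative shortcut, not a production-quality algorithm):

```python
import math

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(A, terms=40):
    # matrix exponential via the truncated power series sum_k A^k / k!
    n = len(A)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]          # running term A^k / k!
    for k in range(1, terms):
        term = matmul(term, A)
        term = [[x / k for x in row] for row in term]
        result = [[r + t for r, t in zip(rr, tr)]
                  for rr, tr in zip(result, term)]
    return result

t = 0.6
X = [[0.0, -t], [t, 0.0]]   # skew-symmetric: an element of the Lie algebra so(2)
G = expm(X)                 # lands in the Lie group SO(2): rotation by angle t
assert abs(G[0][0] - math.cos(t)) < 1e-9
assert abs(G[1][0] - math.sin(t)) < 1e-9
```

This illustrates the general pattern of the paper: tangent-space (algebra) data exponentiates to group elements on the manifold.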
Let a ⊕ b = max(a, b) and a ⊗ b = a + b for a, b ∈ R̄ := R ∪ {−∞}. By max-algebra we understand the analogue of linear algebra developed for the pair of operations (⊕, ⊗), extended to matrices and vectors. The symbol A^⊗k stands for the kth max-algebraic power of a square matrix A. Let us denote by ε the max-algebraic "zero" vector, all the components of which are −∞. The max-algebraic eigenvalue–eigenvector problem is the following: given A ∈ R̄ⁿˣⁿ, find all λ ∈ R̄ and vectors x ≠ ε such that A ⊗ x = λ ⊗ x.
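In max-algebra the matrix-vector product replaces sums by maxima and products by sums: (A ⊗ x)ᵢ = maxⱼ (aᵢⱼ + xⱼ), and λ ⊗ x just adds λ to every component. A minimal sketch of the eigenproblem on a hand-worked 2 × 2 example (the matrix, λ and x below are our illustrative choices, with λ equal to the maximum cycle mean of A):

```python
def maxplus_matvec(A, x):
    # max-algebraic product: (A ⊗ x)_i = max_j (a_ij + x_j)
    return [max(a + xj for a, xj in zip(row, x)) for row in A]

A = [[0.0, 3.0],
     [-1.0, 1.0]]
lam = 1.0          # maximum cycle mean of A: cycles have means 0, 1 and (3 - 1)/2
x = [2.0, 0.0]     # a max-algebraic eigenvector found by hand

# A ⊗ x = λ ⊗ x, i.e. each component of Ax equals λ + x_i
assert maxplus_matvec(A, x) == [lam + xi for xi in x]
```

Here row 1 gives max(0 + 2, 3 + 0) = 3 = 1 + 2 and row 2 gives max(−1 + 2, 1 + 0) = 1 = 1 + 0, so x is an eigenvector for λ = 1.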
Linear Algebra and its Applications, 1997
Let ~ be a conjugate-homogeneous reflector on a vector space V (over R or C) with ~.~ a pointed cone contained in spec~(V). A mapping on V × V whose range is contained in ~, and which generalizes the usual inner product properties, is called a vectorial inner product. A certain family of these vectorial inner products on matrices (which we call matricial inner products) is used to generate a set of pointed cones in the ambient space of hermitian-preserving linear transformations. Some basic results on these cones [including ~'(PSD), ~r(PSD)*, and ~'.~] and on the partial orders that they induce are given.