Matrix-Vector Multiplication

* In the above exercise, substitute real values. 
* A state vector |A> is represented in general as |A> = Σ_{i=1 to n} (αi |i_i>) ....(1), where n is the dimension of the Hilbert space / Euclidean space. In a 3D vector space, n = 3 and |A> = α1|i_1> + α2|i_2> + α3|i_3> = α1 i + α2 j + α3 k, where i, j, k are the orthogonal basis vectors (orthonormal vectors) along the x, y, z coordinate axes, with i_1 = i, i_2 = j, i_3 = k. The αi are in general complex numbers and are called the components of the state vector. |A> is called a ket vector, and its components (complex numbers) are represented as a column matrix whose number of rows equals the dimension of the vector space, with a single column. Its counterpart is <A|, called a bra vector, whose components (the complex conjugates of the components of the ket vector) are represented as a row matrix with a single row. 
* A state vector |A> is said to be normalized if <A|A> = 1. A state vector |A> and another state vector |B> are said to be orthogonal if <A|B> = <B|A> = 0. 
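The ket/bra correspondence and the normalization condition above can be sketched numerically. This is a minimal illustration with hypothetical component values, not taken from the text:

```python
import numpy as np

# A ket |A> as a column vector of complex components (hypothetical values).
A = np.array([[1 + 1j], [2 - 1j], [0 + 3j]])

# The corresponding bra <A| is the conjugate transpose: a row vector.
bra_A = A.conj().T

# <A|A> is real and equals the squared norm of the vector.
norm_sq = (bra_A @ A)[0, 0]

# Normalize: |A_N> = |A> / sqrt(<A|A>), so that <A_N|A_N> = 1.
A_N = A / np.sqrt(norm_sq.real)
```

Note that `<A|A>` is automatically real and nonnegative because each term is αi* αi = |αi|².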
* The inner product of an orthonormal basis vector (bra vector) with a ket vector is the corresponding component of the ket vector: <i|A> = αi ; <j|A> = αj ; <k|A> = αk ; |A> = αi i + αj j + αk k, where αi, αj, αk are the components of the ket vector in 3D Hilbert space (a vector space over the complex numbers). In general, <i_i|A> = αi ....(2) 
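Eq. (2) — projecting a ket onto an orthonormal basis bra to recover a component — can be checked directly. The component values here are hypothetical:

```python
import numpy as np

# Orthonormal basis kets |i_1>, |i_2>, |i_3> in a 3D complex vector space:
# the columns of the identity matrix.
basis = np.eye(3, dtype=complex)

# A ket |A> with hypothetical complex components.
A = np.array([2 + 1j, -1j, 3 + 0j])

# Eq. (2): <i_i|A> = alpha_i — the inner product with each basis bra
# returns the corresponding component of |A>.
components = np.array([basis[:, n].conj() @ A for n in range(3)])
```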
* If M is a linear operator acting on a state vector |λ> and M|λ> = λ|λ> .....(3), where λ is a number, then |λ> is called an eigenvector and λ is called an eigenvalue. Operators in quantum mechanics have certain characteristic features: they are the objects we use to calculate eigenvalues and eigenvectors. They act on state vectors, which are abstract mathematical entities, and not on actual physical systems. When an operator acts on a state vector, it produces a new state vector. 
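The eigenvalue equation (3) can be verified numerically for a small operator. The 2×2 Hermitian matrix here is a hypothetical example:

```python
import numpy as np

# A hypothetical Hermitian operator M.
M = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

# numpy.linalg.eigh is the solver for Hermitian matrices; it returns
# real eigenvalues and orthonormal eigenvectors (as columns).
eigvals, eigvecs = np.linalg.eigh(M)

# Eq. (3): M|lambda> = lambda |lambda> for each eigenpair.
v0 = eigvecs[:, 0]
check = M @ v0 - eigvals[0] * v0   # should be (numerically) zero
```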
* If |λ> is a state vector and |λ_N> is the corresponding normalized state vector, then |λ_N> = √[1/Σ_{i} |αi|^{2}] |λ> .......(4) and |λ> = √[Σ_{i} |αi|^{2}] |λ_N> ........(5). Since normalization only rescales the vector, the eigenvalue is unchanged: M|λ_N> = λ|λ_N> ........(6). The normalized components are αiN = αi / √[Σ_{i} |αi|^{2}] .......(7), so that |λ_N> = Σ_{i=1 to n} (αiN |i_i>) .....(8) 
* Σ_{j} (<k|M|j>) αj = βk, or Σ_{j} Mkj αj = βk, where |j>, |k> are basis vectors, αj is the jth component of |A> and βk is the kth component of |B>. Since (αj) and (βk) are both column vectors of the same dimension, say n, the operator M has to be an n x n matrix; we call its elements Mkj. If the direction of |B> = M|A> is the same as the direction of |A>, then |A> is called an eigenvector of M, and in such cases |B> = λ|A>, where λ is called the eigenvalue. 
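The component relation βk = Σ_j Mkj αj is exactly matrix-vector multiplication. A minimal sketch, with a hypothetical 3×3 operator chosen so that |A> happens to be an eigenvector:

```python
import numpy as np

# A hypothetical operator written as an n x n matrix M_kj.
M = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 2]], dtype=complex)

alpha = np.array([1, 1, 0], dtype=complex)   # components of |A>

# beta_k = sum_j M_kj alpha_j — the components of |B> = M|A>.
beta = M @ alpha

# Here |B> points in the same direction as |A>, so |A> is an
# eigenvector of M with eigenvalue lambda = 1.
lam = beta[0] / alpha[0]
```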
* In quantum mechanics, an operator M is a linear Hermitian matrix: M = M^{†}, where M^{†} = (M^{T})*. M dagger, the Hermitian conjugate of M, is obtained by first taking the transpose M^{T} (changing rows to columns and columns to rows) and then replacing each entry with its complex conjugate, i.e. mij = (mji)*, where * stands for complex conjugation. The diagonal elements of a Hermitian matrix are real. The real-number counterpart of a Hermitian matrix is a symmetric square (n x n) matrix. The product of two Hermitian matrices A, B is Hermitian if and only if AB = BA. Every Hermitian matrix is a normal matrix, i.e. MM^{†} = M^{†}M; for the real counterpart, MM^{T} = M^{T}M. Among real matrices, all orthogonal, symmetric and skew-symmetric matrices are normal. Among complex matrices, all Hermitian, skew-Hermitian and unitary matrices are normal. The converses are not necessarily true. 
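The defining properties above (M = M†, real diagonal, normality) are easy to verify in code. The matrix here is a hypothetical Hermitian example:

```python
import numpy as np

def dagger(M):
    """Hermitian conjugate: transpose, then complex-conjugate each entry."""
    return M.T.conj()

# A hypothetical Hermitian matrix: m_ij = (m_ji)*.
M = np.array([[1, 2 - 1j],
              [2 + 1j, 5]])

is_hermitian = np.allclose(M, dagger(M))               # M = M†
diag_real = np.allclose(np.diag(M).imag, 0)            # diagonal entries real
is_normal = np.allclose(M @ dagger(M), dagger(M) @ M)  # MM† = M†M
```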
* A complex square matrix M is said to be unitary if M^{†} = M^{-1}, or MM^{†} = I, where I is the identity matrix. Unitary matrices are important in quantum mechanics because they preserve norms and thus probability amplitudes. If x, y are two complex vectors, multiplication by a unitary matrix U preserves their inner product: <Ux, Uy> = <x, y>. This implies U is normal; U is also diagonalizable, and |det(U)| = 1. U can be written as U = e^{iH}, where e^{iH} is a matrix exponential and H is a Hermitian matrix. The real analogue of a unitary matrix is an orthogonal matrix. The eigenvectors of a Hermitian operator form an orthonormal basis and also form a complete set. But sometimes we encounter a set of linearly independent eigenvectors that do not form an orthonormal set. This typically happens when a system has degenerate states: distinct states having the same eigenvalue. In those cases, we create an orthonormal set from the linearly independent vectors. 
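The unitary properties above — UU† = I, |det(U)| = 1, preservation of inner products, and the form U = e^{iH} — can be checked numerically. The Hermitian matrix H and the test vectors are hypothetical; e^{iH} is built here from the spectral decomposition of H rather than a general-purpose matrix exponential:

```python
import numpy as np

# A hypothetical Hermitian matrix H.
H = np.array([[1, 1j],
              [-1j, 2]])

# Build U = e^{iH} via the spectral decomposition H = V diag(w) V†:
# exponentiate the (real) eigenvalues on the diagonal.
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(1j * w)) @ V.conj().T

# Unitarity: U U† = I, and |det(U)| = 1.
unitary_ok = np.allclose(U @ U.conj().T, np.eye(2))
det_mod = abs(np.linalg.det(U))

# Inner-product preservation: <Ux, Uy> = <x, y> for arbitrary x, y.
x = np.array([1 + 1j, 2j])
y = np.array([3, 1 - 1j])
inner_before = x.conj() @ y
inner_after = (U @ x).conj() @ (U @ y)
```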
* If X is an n x n real or complex matrix, the matrix exponential e^{X} is given by the power series e^{X} = Σ_{k=0 to ∞} (X^{k} / k!). Such a series always converges, so e^{X} is well defined. Finding reliable and accurate methods to compute the matrix exponential is difficult, and this is still a topic of considerable current research in mathematics and numerical analysis. Matlab, GNU Octave, and SciPy all use the Padé approximant.^{[7]}^{[8]}^{[9]} 
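As a didactic sketch of the power-series definition (not a production method — as noted above, real implementations such as SciPy's `scipy.linalg.expm` use the Padé approximant), a truncated series can be summed directly. The diagonal test matrix is a hypothetical example:

```python
import numpy as np

def expm_series(X, terms=30):
    """Matrix exponential e^X via the truncated power series
    sum_{k=0}^{terms-1} X^k / k!  (a didactic sketch only)."""
    result = np.eye(X.shape[0], dtype=complex)
    term = np.eye(X.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ X / k        # X^k / k! built incrementally
        result = result + term
    return result

# Sanity check: for a diagonal X, e^X is elementwise exp on the diagonal.
X = np.diag([1.0, 2.0]).astype(complex)
E = expm_series(X)
```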
General expression of a 2 x 2 unitary matrix is
U = [[a, b], [-e^{iφ} b*, e^{iφ} a*]], with |a|^{2} + |b|^{2} = 1,
which depends on 4 real parameters (the phase of a, the phase of b, the relative magnitude between a and b, and the angle φ). The determinant of such a matrix is det(U) = e^{iφ}. The subgroup of such elements with det(U) = 1 is called the special unitary group SU(2). An alternative form is
U = e^{iφ/2} [[e^{iφ_1} cos θ, e^{iφ_2} sin θ], [-e^{-iφ_2} sin θ, e^{-iφ_1} cos θ]].
By introducing φ_{1} = ψ + Δ and φ_{2} = ψ − Δ, this takes the following factorization:
U = e^{iφ/2} [[e^{iψ}, 0], [0, e^{-iψ}]] [[cos θ, sin θ], [-sin θ, cos θ]] [[e^{iΔ}, 0], [0, e^{-iΔ}]].
This expression highlights the relation between 2 x 2 unitary matrices and 2 x 2 orthogonal matrices of angle θ. Many other factorizations of a unitary matrix into basic matrices are possible. 
* For any nonnegative integer n, the set of all n-by-n unitary matrices with matrix multiplication forms a group, called the unitary group U(n). Any square matrix with unit Euclidean norm is the average of two unitary matrices.^{[1]} 
* The eigenvectors of a Hermitian operator form a complete set: any vector in the space on which the operator acts can be expanded as a linear combination of the eigenvectors. 
* If two eigenvalues λ1 and λ2 of a Hermitian operator are different, the corresponding eigenvectors are orthogonal to each other. 
* If the same Hermitian operator acting on two different state vectors yields the same eigenvalue, then any state vector that is a linear combination of those two state vectors yields the same eigenvalue when operated upon by that operator. Example: if M|λ1> = λ|λ1>, M|λ2> = λ|λ2> and |A> = α|λ1> + β|λ2>, where α, β are complex numbers, then M|A> = λ|A>. Here |λ1> and |λ2> are two linearly independent state vectors, but not necessarily orthogonal to each other. 
* If two eigenvalues of the same Hermitian operator are equal, the corresponding eigenvectors can be chosen to be orthogonal to each other. Example: from |λ1> and |λ2> we can create two orthonormal state vectors |i>, |j>, both having the same eigenvalue λ: |i> = |λ1> / ‖λ1‖ and |j> = |λ2_{⊥}> / ‖λ2_{⊥}‖, where |λ2_{⊥}> = |λ2> − <i|λ2> |i>; further orthogonal vectors with eigenvalue λ can then be built from these orthonormal vectors. 
* The situation in which two different eigenvectors of the same operator have the same eigenvalue is called degeneracy. 
* Unambiguously distinguishable states are represented by orthogonal vectors. 
* Sometimes we encounter a set of linearly independent eigenvectors that do not form an orthonormal set. This typically happens when the system has degenerate states: distinct states having the same eigenvalue. In that case, we use the linearly independent vectors to create an orthonormal set that spans the same space. The method is called the Gram-Schmidt procedure. 
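The Gram-Schmidt procedure mentioned above can be sketched as follows; the two input vectors are hypothetical linearly independent, non-orthogonal vectors:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: turn linearly independent vectors
    (e.g. from a degenerate eigenspace) into an orthonormal set
    spanning the same space."""
    ortho = []
    for v in vectors:
        w = v.astype(complex).copy()
        for u in ortho:
            w = w - (u.conj() @ w) * u   # subtract projection onto u
        ortho.append(w / np.linalg.norm(w))
    return ortho

# Two linearly independent but non-orthogonal vectors (hypothetical).
vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(vs)
```

Each step mirrors the |λ2_⊥> construction above: subtract from each new vector its projections onto the vectors already orthonormalized, then normalize.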
* If |A> is the state vector of a system and L is an observable to be measured, the probability of observing λi (where the λi are the eigenvalues of L and |λi> the corresponding eigenvectors) is P(λi) = <A|λi><λi|A> = |<λi|A>|^{2}. 
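The probability rule P(λi) = |<λi|A>|² can be illustrated with a small hypothetical observable and a normalized state:

```python
import numpy as np

# A hypothetical observable L (Hermitian) and a normalized state |A>.
L = np.array([[1, 0],
              [0, -1]], dtype=complex)
A = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Eigenvalues lambda_i and eigenvectors |lambda_i> of L.
eigvals, eigvecs = np.linalg.eigh(L)

# P(lambda_i) = |<lambda_i|A>|^2; the probabilities sum to 1
# because the eigenvectors form a complete orthonormal basis.
probs = np.abs(eigvecs.conj().T @ A) ** 2
```

For this equal-superposition state, each of the two outcomes occurs with probability 1/2.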
* Suppose L is a Hermitian operator (observable) and M is another Hermitian operator; then [L, M] = LM − ML is called the commutator of L with M. 
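As a sketch of the commutator definition, here it is evaluated for the Pauli matrices, a standard example (not taken from the text) where the commutator is nonzero: [σx, σy] = 2i σz.

```python
import numpy as np

def commutator(L, M):
    """[L, M] = LM - ML."""
    return L @ M - M @ L

# The Pauli matrices: Hermitian operators that do not commute.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

C = commutator(sx, sy)   # equals 2i * sz
```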