Quantum One: Lecture 16.


Hermitian, Anti-Hermitian, and Unitary Operators

In the last lecture, we saw how the completeness relation for ONBs allows one to easily obtain representation dependent equations from representation independent ones, and vice versa. We also defined the matrix elements of an operator between different states, and extended the action of operators so that they can act either to the right on kets or to the left on bras. In the process, we were led to the notion of Hermitian conjugation, which allows us to identify any relation in the space of kets with the corresponding relation in the space of bras, and vice versa, and we developed rules for "taking the Hermitian adjoint" of any expression formulated in the Dirac notation. In this lecture, we use these ideas to introduce some important new concepts. We start with the definition of a Hermitian operator.

Definition: An operator A is said to be Hermitian or self-adjoint if A is equal to its Hermitian adjoint, i.e., if A = A⁺. In terms of matrix elements, the property 〈φ|A⁺|ψ〉 = 〈ψ|A|φ〉*, which is true for any operator, reduces for Hermitian operators to the relation 〈φ|A|ψ〉 = 〈ψ|A|φ〉*. As a special case (taking |φ〉 = |ψ〉) this implies that 〈ψ|A|ψ〉 = 〈ψ|A|ψ〉*. Thus, the expectation values of Hermitian operators are strictly real.
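As a concrete numerical illustration (a sketch of mine, not part of the lecture), the following checks in numpy that a Hermitian matrix equals its conjugate transpose and that its expectation values come out real; the particular matrix and state are arbitrary choices.

```python
import numpy as np

# A 2x2 matrix built to be Hermitian: real diagonal, conjugate off-diagonal pair.
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])

assert np.allclose(A, A.conj().T)         # A = A+

psi = np.array([1.0 + 2.0j, 0.5 - 1.0j])  # an arbitrary (unnormalized) ket
expval = np.vdot(psi, A @ psi)            # <psi|A|psi>; vdot conjugates psi
assert abs(expval.imag) < 1e-12           # the expectation value is strictly real
print(expval.real)
```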

Anti-Hermitian Operators

Definition: An operator A is said to be anti-Hermitian if it is equal to the negative of its Hermitian adjoint, i.e., if A = -A⁺. In terms of matrix elements, the property 〈φ|A⁺|ψ〉 = 〈ψ|A|φ〉*, which is true for any operator, reduces for anti-Hermitian operators to the relation 〈φ|A|ψ〉 = -〈ψ|A|φ〉*. As a special case this implies that 〈ψ|A|ψ〉 = -〈ψ|A|ψ〉*. Thus, expectation values of anti-Hermitian operators are strictly imaginary.

Note that if A is any operator, it may be written in the form A = A_{H} + A_{A}, where A_{H} = (1/2)(A + A⁺) (the Hermitian part of A) is Hermitian and A_{A} = (1/2)(A - A⁺) (the anti-Hermitian part of A) is anti-Hermitian.

Thus an arbitrary operator can be decomposed into Hermitian and anti-Hermitian parts, very similar to the way that an arbitrary complex number z = x + iy can be decomposed into real and imaginary parts.
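A minimal sketch (my own, using a randomly chosen matrix) of this decomposition: split a generic complex matrix into its Hermitian and anti-Hermitian parts and verify the stated properties, including the purely imaginary expectation values of the anti-Hermitian part.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))  # a generic operator

A_H = 0.5 * (A + A.conj().T)   # Hermitian part
A_A = 0.5 * (A - A.conj().T)   # anti-Hermitian part

assert np.allclose(A, A_H + A_A)         # A = A_H + A_A
assert np.allclose(A_H, A_H.conj().T)    # A_H is Hermitian
assert np.allclose(A_A, -A_A.conj().T)   # A_A is anti-Hermitian

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
assert abs(np.vdot(psi, A_A @ psi).real) < 1e-12  # <A_A> is strictly imaginary
```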

Unitary Operators

Definition: An operator U is said to be unitary if its adjoint is equal to its inverse. Thus, for a unitary operator U⁺ = U⁻¹, or equivalently, U⁺U = UU⁺ = 1. Unitary operators (or the transformations they induce) play the same role in quantum mechanical Hilbert spaces that orthogonal transformations play in R³. Indeed, note that if the states {|n〉} form an ONB for S, and if |n′〉 ≡ U|n〉, then 〈n′|m′〉 = 〈n|U⁺U|m〉 = 〈n|m〉 = δ_{nm}.

Thus, a unitary operator maps any complete ONB of states onto another complete ONB of states for the space, and generally preserves vector relationships in state space, the way that orthogonal transformations do in R³.
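The ONB-to-ONB property is easy to verify numerically. In this sketch (an illustration of mine, not from the lecture), a random unitary is obtained by QR-factorizing a random complex matrix; its action on the standard basis kets leaves them orthonormal.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(Z)                   # QR factorization yields a unitary U

assert np.allclose(U.conj().T @ U, np.eye(3))   # U+ U = 1, i.e., U+ = U^-1

new_basis = U @ np.eye(3)                # columns are the transformed kets U|n>
overlaps = new_basis.conj().T @ new_basis
assert np.allclose(overlaps, np.eye(3))  # <n'|m'> = delta_nm: still an ONB
```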

Ket-Bra and Matrix Representation of Operators

Matrix Representation of Operators: Let {|n〉} be an ONB for the space S and let A be an operator acting in the space. From the trivial identity A = 1A1 = Σ_{n} |n〉〈n| A Σ_{n′} |n′〉〈n′|, we write A = Σ_{n,n′} |n〉〈n|A|n′〉〈n′|, or A = Σ_{n,n′} A_{nn′} |n〉〈n′|, where A_{nn′} = 〈n|A|n′〉. This gives what is called a ket-bra expansion for this operator in this representation, and completely specifies the linear operator A in terms of its matrix elements connecting the basis states of this representation.
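To make the ket-bra expansion tangible, here is a small check (my construction, with an arbitrary random matrix): rebuilding A from its matrix elements A_{nn′} = 〈n|A|n′〉 and the outer products |n〉〈n′| of the standard basis reproduces the original matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

kets = [np.eye(3)[:, n] for n in range(3)]   # the ONB kets |0>, |1>, |2> (real)

# A = sum over n, n' of A_nn' |n><n'|, with A_nn' = <n|A|n'>;
# no conjugation is needed here because the basis kets are real.
A_rebuilt = sum((kets[n] @ A @ kets[m]) * np.outer(kets[n], kets[m])
                for n in range(3) for m in range(3))
assert np.allclose(A, A_rebuilt)
```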

The operator A, therefore, is completely determined by its matrix elements in any ONB. Thus, suppose that |χ〉 = A|ψ〉 for some states |ψ〉 and |χ〉. The expansion coefficients for the states |ψ〉 and |χ〉 are clearly related. Note that if ψ_{n} = 〈n|ψ〉 and χ_{n} = 〈n|χ〉, then χ_{n} = 〈n|A|ψ〉 = Σ_{n′} 〈n|A|n′〉〈n′|ψ〉, which can be written χ_{n} = Σ_{n′} A_{nn′} ψ_{n′}.

But this is of the form of a matrix multiplying a column vector; i.e., χ_{n} = Σ_{n′} A_{nn′} ψ_{n′} is just the nth component of the matrix equation [χ] = [A][ψ] that one obtains through the operation of the square matrix [A] on the column vector [ψ].
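In code, this component relation is exactly matrix-vector multiplication. A quick check (mine, with random data):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
psi = rng.normal(size=3) + 1j * rng.normal(size=3)   # components psi_n = <n|psi>

chi = A @ psi    # components chi_n of |chi> = A|psi>

# the same thing written out as the explicit sum chi_n = sum_n' A_nn' psi_n'
chi_explicit = np.array([sum(A[n, m] * psi[m] for m in range(3))
                         for n in range(3)])
assert np.allclose(chi, chi_explicit)
```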

Thus we see that in the row-vector/column-vector representation induced by any discrete ONB, an operator A is naturally represented by a square matrix [A], with entries A_{nn′} = 〈n|A|n′〉 that are just the matrix elements of A connecting the different basis states of that representation. This matrix representation facilitates computing quantities related to A itself.

Consider the matrix element of A between arbitrary states |ψ〉 and |φ〉. Inserting our expansion for A, this becomes 〈φ|A|ψ〉 = Σ_{n,n′} 〈φ|n〉〈n|A|n′〉〈n′|ψ〉, where φ_{n} = 〈n|φ〉 and ψ_{n′} = 〈n′|ψ〉, so that we can write 〈φ|A|ψ〉 = Σ_{n,n′} φ_{n}* A_{nn′} ψ_{n′}, which is readily expressed in terms of the product of a row vector, a square matrix, and a column vector.

That is, the expression 〈φ|A|ψ〉 = Σ_{n,n′} φ_{n}* A_{nn′} ψ_{n′} is equivalent to the matrix product [φ]⁺[A][ψ], in which the row vector [φ]⁺ = (φ₁*, φ₂*, …) multiplies the square matrix [A] acting on the column vector [ψ].
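A numerical rendering of this row-matrix-column sandwich (again my own sketch; conjugating the first vector supplies the bra):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
phi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

sandwich = phi.conj() @ A @ psi      # <phi|A|psi> = [phi]+ [A] [psi]

# the equivalent explicit double sum over matrix elements
double_sum = sum(phi[n].conj() * A[n, m] * psi[m]
                 for n in range(3) for m in range(3))
assert np.isclose(sandwich, double_sum)
```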

As another example, consider the operator product of A = Σ_{n,n′} |n〉A_{nn′}〈n′| and B = Σ_{n,n′} |n〉B_{nn′}〈n′|. The product operator C = AB has a similar expansion, i.e., C = Σ_{n,n′} |n〉C_{nn′}〈n′|, where C_{nn′} = 〈n|AB|n′〉 = Σ_{n″} 〈n|A|n″〉〈n″|B|n′〉, so C_{nn′} = Σ_{n″} A_{nn″} B_{n″n′}.

But this is just of the form of a matrix multiplication, i.e., C_{nn′} = Σ_{n″} A_{nn″} B_{n″n′} is equivalent to the statement that the matrix of elements C_{nn′} is the product of the matrix of elements A_{nn″} with the matrix of elements B_{n″n′}, which we will write as [C] = [A][B], where as before, [A] stands for "the matrix representing A", etc.
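A check that operator composition really is matrix multiplication of the representing matrices (a sketch with arbitrary random matrices):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

C = A @ B    # [C] = [A][B]

# the same entries from the explicit sum C_nn' = sum_n'' A_nn'' B_n''n'
C_explicit = np.array([[sum(A[n, k] * B[k, m] for k in range(3))
                        for m in range(3)] for n in range(3)])
assert np.allclose(C, C_explicit)
```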

As a final example, consider the matrix representing the adjoint of an operator. If A = Σ_{n,n′} |n〉A_{nn′}〈n′|, then by the two-part rule we developed for taking the adjoint, it follows that A⁺ = Σ_{n,n′} |n′〉A_{nn′}*〈n|. We can now switch the summation indices to find that A⁺ = Σ_{n,n′} |n〉A_{n′n}*〈n′|, from which we deduce that (A⁺)_{nn′} = A_{n′n}*.

Thus, the matrix representing A⁺ is the complex-conjugate transpose of the matrix representing A. We thus define the adjoint of a matrix to be this combined operation (which obviously applies to single column or row vectors as well). A Hermitian operator is equal to its adjoint, so the matrix elements representing such an operator obey the relation A_{nn′} = A_{n′n}*. Thus, Hermitian operators are represented by Hermitian matrices, which satisfy [A] = [A]⁺.
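Numerically, the adjoint is just the conjugate transpose, and the defining property 〈φ|A|ψ〉* = 〈ψ|A⁺|φ〉 can be checked directly (my sketch, random data):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A_dag = A.conj().T                     # matrix representing the adjoint A+

phi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

lhs = (phi.conj() @ A @ psi).conjugate()   # <phi|A|psi>*
rhs = psi.conj() @ A_dag @ phi             # <psi|A+|phi>
assert np.isclose(lhs, rhs)
```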

It is an interesting fact that neither the transpose nor the complex conjugate of an operator is a well-defined concept: given a linear operator A, there is no operator that can be uniquely identified with the transpose of A. Although one can form the transpose of the matrix [A] representing A in any basis, the operator associated with the transposed matrix will not correspond to the operator associated with the transpose of the matrix representing A in other bases. Similarly, complex conjugation can be performed on matrices, or on matrix elements, but it is not an operation that is defined for the operators themselves. The Hermitian adjoint, which in a sense combines these two representation dependent operations, yields an operator that is independent of representation.
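This basis dependence can be seen numerically. In the sketch below (my own construction, not from the lecture), the same operator is represented in two bases related by a unitary change of basis [A′] = U⁺[A]U; conjugate-transposing commutes with the basis change, while plain transposition does not.

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
Z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(Z)                 # a random unitary change of basis

A_new = U.conj().T @ A @ U             # matrix of the same operator, new basis

# the adjoint defines a single operator: its matrices in the two bases
# are related by the same change of basis
assert np.allclose(A_new.conj().T, U.conj().T @ A.conj().T @ U)

# the transpose does not: these two matrices generically differ
print(np.allclose(A_new.T, U.conj().T @ A.T @ U))   # False (generically)
```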

This again emphasizes one of the basic themes, which is that bras, kets, and operators are not row vectors, column vectors, and matrices. The former may be represented by the latter, but they are conceptually different things, and it is always important to keep that in mind. Matrix representations of this form were developed extensively by Heisenberg and gave rise to the term "matrix mechanics", in analogy to the "wave mechanics" developed by Schrödinger, which focuses on a wave function representation of the underlying space. Clearly, however, whether one has a wave mechanical or a matrix mechanical representation depends simply upon the choice of basis (i.e., discrete or continuous) in which one is working.

In this lecture, we defined what we mean by Hermitian operators, anti-Hermitian operators, and unitary operators, and saw how any operator can be expressed in terms of its Hermitian and anti-Hermitian parts. We then used the completeness relation for a discrete ONB to develop ket-bra expansions and matrix representations of general linear operators defined on the state space. We then saw how these matrix representations can be used to directly compute quantities related to the operators they represent. Finally, we saw how to construct the matrix representing the adjoint of an operator, and how Hermitian operators are represented by Hermitian matrices. In the next lecture, we show how these discrete representations can be extended to continuously indexed basis sets, in which linear operators are represented by ket-bra expansions in integral form.