Quantum One: Lecture 19

Representation Independent Properties of Linear Operators

In the last lecture, we derived the canonical commutation relations obeyed by the Cartesian components of the position and wavevector (or momentum) operators. We then began a study of unitary operators and showed that any two sets of orthonormal basis vectors are connected by a unitary operator and by its adjoint. We saw how to transform between two discrete representations, using the matrices that represent the unitary operators connecting them, and extended this idea to continuous representations, noting that the Fourier transform relation between the position and momentum representations is itself a unitary transformation between them. In this lecture we study a number of properties shared by all of the matrices that represent a given linear operator, i.e., representation independent properties. We begin by introducing the trace of an operator.

The Trace of an Operator

The trace of an operator A, denoted by Tr(A), is the sum of the diagonal elements of any matrix [A] representing the operator (which we also write as Tr([A])); i.e., for any discrete ONB of states {|i〉}, it is the sum of the diagonal matrix elements 〈i|A|i〉. It is easy to show that the trace of a product of matrices (or operators) is invariant under a cyclic permutation of the factors in the product.
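Written out explicitly (the slide's displayed equations were not transcribed, so this is a standard rendering in the notation used above), the definition and the cyclic-invariance property read

Tr(A) ≡ Tr([A]) = Σᵢ 〈i|A|i〉 = Σᵢ Aᵢᵢ,   Tr(ABC) = Tr(BCA) = Tr(CAB).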

Note that if [A] represents a linear operator A in some representation, and [U] is a unitary matrix with adjoint [U⁺], then [A′] = [U][A][U⁺] is the matrix representing A in some other representation, and cyclic invariance gives Tr([A′]) = Tr([U][A][U⁺]) = Tr([A][U⁺][U]) = Tr([A]), which shows that the trace of any matrix [A], [A′], [A′′], … that represents A is the same. Thus, Tr(A) is a representation independent property. In a representation in which [A] is diagonal, the diagonal elements are just the eigenvalues of A, and so Tr(A) is equal to the sum of the eigenvalues of A.
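Both statements are easy to check numerically. The following is an illustrative sketch of my own (not part of the lecture), using NumPy with an arbitrary random Hermitian matrix and a random unitary obtained from a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random Hermitian matrix A, and a random unitary U from a QR decomposition.
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (M + M.conj().T) / 2
U, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

A_prime = U @ A @ U.conj().T   # the matrix representing A in another representation

# The trace is the same in both representations...
print(np.isclose(np.trace(A), np.trace(A_prime)))
# ...and equals the sum of the eigenvalues of A.
print(np.isclose(np.trace(A), np.sum(np.linalg.eigvalsh(A))))
```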

The Determinant of an Operator

The determinant of an operator A, denoted by det(A), or more simply by |A|, is the determinant of any matrix that represents that operator; i.e., if in some discrete representation of states {|i〉} the matrix [A] represents A, then det(A) = det([A]). Basic familiarity with the general properties of the determinant of a matrix will be assumed.
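For reference (this is the standard definition, not necessarily the form displayed on the slide), for an N-dimensional representation with matrix elements Aᵢⱼ = 〈i|A|j〉 the determinant can be written as a signed sum over the N! permutations P of the indices:

det(A) = det([A]) = Σ_P sgn(P) A_{1,P(1)} A_{2,P(2)} ⋯ A_{N,P(N)}.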

For example, any determinant can be expanded in minors, until one gets down to 2×2 matrices; the determinant of a 2×2 matrix is given by the familiar formula written out below. The determinant of a diagonal matrix is just the product of its diagonal elements.
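The displayed 2×2 formula was not transcribed; writing the 2×2 matrix with rows (a, b) and (c, d), it reads

det [ a b ; c d ] = ad - bc,

and for a diagonal matrix, det diag(a₁, a₂, …, a_N) = a₁ a₂ ⋯ a_N.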

Thus, e.g., the identity operator, which in any discrete representation is represented by the identity matrix, has a determinant of unity: det(1) = 1. In addition, it turns out that the determinant of a product of matrices is equal to the product of their determinants, i.e., if [ABC] = [A][B][C], then det[ABC] = det([A][B][C]) = det[A] det[B] det[C].

It follows that if [A] and [A′] = [U][A][U⁺] represent a linear operator A in two different representations connected by the unitary operator U, then det[A′] = det[A], in which we have used the product rule in both directions: first to break the determinant of the product into a product of determinants, and then to recombine the product of determinants into the determinant of the product, i.e., det([U][U⁺]) = det(1) = 1. The full chain of equalities is written out below.
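The displayed chain of equalities was not transcribed; filled in from the surrounding discussion, it presumably reads

det[A′] = det([U][A][U⁺]) = det[U] det[A] det[U⁺] = det[A] det([U][U⁺]) = det[A] det(1) = det[A],

so det(A), like Tr(A), is a representation independent property.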

Note that in a representation in which A is diagonal, the diagonal elements are just the eigenvalues of A, and so det(A) is equal to the product of the eigenvalues of A. Finally, you may recall that a necessary and sufficient condition for the inverse of a matrix to exist is that its determinant not vanish. This condition extends to any operator represented by such a matrix, i.e.,

If det(A) = 0, then A is non-invertible, or singular.
If det(A) ≠ 0, then there exists an inverse operator A⁻¹ such that AA⁻¹ = A⁻¹A = 1.
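The displayed relation for the diagonal representation was not transcribed; it presumably expresses the determinant as the product of the eigenvalues,

det(A) = a₁ a₂ ⋯ a_N,

with each eigenvalue repeated according to its degeneracy.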

Eigenvalues, Eigenvectors, and Eigenspaces

Recall that a nonzero vector |χ〉 is said to be an eigenvector of the operator A with eigenvalue a (where, generally, a ∈ ℂ) if it satisfies the eigenvalue equation A|χ〉 = a|χ〉. The set of eigenvalues {a} for which solutions to this equation exist is referred to as the spectrum of the operator A, and we write spectrum(A) = {a}. The spectrum of an arbitrary operator can be real, complex, continuous, discrete, mixed, bounded, or unbounded.

Comment: A number of basic features follow from the eigenvalue equation.

1. If |χ〉 is an eigenvector of A, then so is any multiple λ|χ〉 of |χ〉. This follows from the fact that A is a linear operator, so that A(λ|χ〉) = λA|χ〉 = λa|χ〉 = a(λ|χ〉).

Thus, only the direction in Hilbert space of a given eigenvector is unique. This means that we are always free to construct eigenvectors that are appropriately normalized.

2. By taking the adjoint of the eigenvalue equation A|χ〉 = a|χ〉, we see that if |χ〉 is an eigenket of A with eigenvalue a, then 〈χ|A⁺ = a*〈χ|, which implies that 〈χ| is an eigenbra of A⁺ with eigenvalue a*.
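As a quick numerical illustration of these remarks (a sketch of my own in NumPy, not part of the lecture), one can check that the spectrum of a generic operator is complex in general, and that the conjugated eigenvector is indeed an eigenbra of A⁺ with eigenvalue a*:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))   # a generic (non-Hermitian) operator

vals, vecs = np.linalg.eig(A)
a, chi = vals[0], vecs[:, 0]      # one eigenpair: A|chi> = a|chi>

print(vals)   # the spectrum of a generic operator is, in general, complex
# The corresponding bra <chi| is an eigenbra of the adjoint, <chi|A^+ = a* <chi|:
print(np.allclose(chi.conj() @ A.conj().T, np.conj(a) * chi.conj()))
```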

It is easy to show that any linear superposition of linearly independent eigenstates of A with the same eigenvalue is also an eigenstate of A with that eigenvalue; i.e., if each member of a set of linearly independent states |χᵢ〉 satisfies A|χᵢ〉 = a|χᵢ〉, then the action of A on any linear combination of these vectors simply returns that combination multiplied by a, as written out below.
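The displayed computation was not transcribed; with expansion coefficients cᵢ (the labels i and cᵢ are my own), it presumably reads

A ( Σᵢ cᵢ|χᵢ〉 ) = Σᵢ cᵢ A|χᵢ〉 = Σᵢ cᵢ a|χᵢ〉 = a ( Σᵢ cᵢ|χᵢ〉 ).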

As we have seen, any set of linearly independent vectors forms a basis for a subspace of S, namely, the subspace formed from all possible linear combinations of those vectors. Thus any set of linearly independent eigenvectors of A with the same degenerate eigenvalue a forms a basis for a subspace, of dimension equal to the degeneracy of that eigenvalue, each vector of which is an eigenvector of A with that eigenvalue. We refer to this subspace as the eigenspace of A associated with the eigenvalue a.

Within each such eigenspace we may form linear combinations of the linearly independent eigenvectors, using the Gram-Schmidt orthogonalization procedure, to construct an orthonormal basis of eigenvectors for that eigenspace; here τ is a discrete index that labels the different orthonormal basis states that span the subspace (a standard way of writing such a basis is given after the summary below).

Thus each linear operator A of the state space can be associated with

1. a set of eigenvalues (each of which has a certain degeneracy),
2. a set of eigenvectors,
3. and a set of eigenspaces associated with each of the different eigenvalues of that operator.
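The notation referred to above appeared in an untranscribed equation; a standard way of writing such an orthonormal basis of degenerate eigenvectors (the |a, τ〉 labels here are my reconstruction, not necessarily the slide's) is

A|a, τ〉 = a|a, τ〉,   〈a, τ|a, τ′〉 = δ_ττ′,   τ = 1, 2, …, up to the degeneracy of a.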

Eigenvalues and Eigenvectors of Hermitian Operators

The second postulate associates observables with linear Hermitian operators A. The reason for this largely stems from the special properties of such operators. These include:

Reality of the Eigenvalues: If A is a Hermitian operator, so that A = A⁺, and |χ〉 is one of its eigenvectors, then A|χ〉 = a|χ〉, and so 〈χ|A|χ〉 = a〈χ|χ〉. But for a Hermitian operator the adjoint of this last equation expresses the same matrix element in terms of the complex conjugate a*. Comparing the last two relations, we deduce that a = a*, as spelled out below.
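The displayed steps were not transcribed; filled in from the surrounding argument, they presumably read

〈χ|A|χ〉 = a〈χ|χ〉,   〈χ|A⁺|χ〉 = a*〈χ|χ〉,

and since A⁺ = A and 〈χ|χ〉 ≠ 0, it follows that a = a*.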

Thus, we see that the eigenvalues of Hermitian operators are necessarily real. Previously we showed that expectation values of Hermitian operators are real. The two statements are obviously closely related, and reduce to the same thing in any representation in which the Hermitian operator is diagonal. The requirement that measurable quantities be real-valued motivates the identification of observables with Hermitian operators. Note that, because of the reality of the eigenvalues, the adjoint of the eigenvalue equation for a Hermitian operator takes the simple form 〈χ|A = a〈χ|.

Orthogonality of Eigenvectors for Hermitian Operators

Let |χ〉 and |χ′〉 be eigenvectors of a Hermitian operator A corresponding to eigenvalues a and a′, respectively, so that we can write A|χ〉 = a|χ〉 and A|χ′〉 = a′|χ′〉. Taking the inner product of the first of these with |χ′〉, we find that 〈χ′|A|χ〉 = a〈χ′|χ〉. But the adjoint of the eigenvalue equation for a′ is 〈χ′|A = 〈χ′|a′, where we have used the reality of the eigenvalues deduced above. Taking the inner product of this equation on the right with |χ〉, we find that 〈χ′|A|χ〉 = a′〈χ′|χ〉. Equating these two expressions for the matrix element 〈χ′|A|χ〉, we find that

〈χ′|A|χ〉 = a〈χ′|χ〉 = a′〈χ′|χ〉,

so a〈χ′|χ〉 = a′〈χ′|χ〉, or (a - a′)〈χ′|χ〉 = 0. There are two ways in which this product can vanish. Either a = a′, in which case we haven't found out anything, or a ≠ a′, in which case 〈χ′|χ〉 = 0, showing that eigenstates of a Hermitian operator corresponding to different eigenvalues are necessarily orthogonal.
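Both of these properties of Hermitian operators are easy to verify numerically. The following sketch (my own, using NumPy's Hermitian eigensolver on an arbitrary random matrix) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# A random Hermitian matrix.
M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (M + M.conj().T) / 2

vals, vecs = np.linalg.eigh(A)   # eigenvalues and orthonormal eigenvectors (as columns)

print(vals)                                          # the eigenvalues come out real
print(np.allclose(vecs.conj().T @ vecs, np.eye(n)))  # the eigenvectors are mutually orthonormal
```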

In this lecture, we defined a number of representation independent properties of linear operators, including the trace and the determinant of a linear operator, which are equal to the corresponding values for any matrix that represents it. These two properties were seen to be closely related to the spectral properties of the operator, in that each can be expressed in terms of its collection of eigenvalues, i.e., in terms of its spectrum. We thus began a discussion of the so-called eigen-properties of a linear operator, which include its spectrum of eigenvalues (and their degeneracies), its eigenvectors, and its eigenspaces, the subspaces of S containing all the eigenvectors of a linear operator A having the same eigenvalue. We then proved two important facts about Hermitian operators: (1) that they have real eigenvalues, and (2) that eigenvectors and eigenspaces associated with different eigenvalues are necessarily orthogonal.

In the next lecture, we address the question of how one actually goes about solving the eigenvalue equation for a given linear operator A, in order to find its eigenvalues (and their degeneracies) and eigenvectors.
