Quantum One: Lecture 16

Hermitian, Anti-Hermitian, and Unitary Operators


In the last lecture, we saw how the completeness relation for ONBs allows one to easily obtain representation-dependent equations from representation-independent ones, and vice versa. We also defined the matrix elements of an operator between different states, and extended the action of operators so that they could act either to the right on kets or to the left on bras. In the process, we were led to the notion of Hermitian conjugation, which allows us to identify any relation in the space of kets with the corresponding relation in the space of bras, and vice versa, and we developed some rules for "taking the Hermitian adjoint" of any expression formulated in the Dirac notation. In this lecture, we use this idea to introduce some important new concepts. We start with the definition of a Hermitian operator.

Definition: An operator A is said to be Hermitian or self-adjoint if A is equal to its Hermitian adjoint, i.e., if A = A⁺. In terms of matrix elements, the property ⟨φ|A|ψ⟩* = ⟨ψ|A⁺|φ⟩, which is true for any operator, reduces for Hermitian operators to the relation ⟨φ|A|ψ⟩* = ⟨ψ|A|φ⟩. As a special case this implies that ⟨ψ|A|ψ⟩* = ⟨ψ|A|ψ⟩. Thus, the expectation values of Hermitian operators are strictly real.
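The reality of Hermitian expectation values is easy to verify numerically. A minimal numpy sketch (the random matrix, state, and seed are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Any M plus its conjugate transpose is Hermitian by construction.
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = M + M.conj().T
assert np.allclose(H, H.conj().T)

# A random normalized state |psi>.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# The expectation value <psi|H|psi> has a vanishing imaginary part.
expval = psi.conj() @ H @ psi
assert abs(expval.imag) < 1e-12
```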


Anti-Hermitian Operators

Definition: An operator A is said to be anti-Hermitian if it is equal to the negative of its Hermitian adjoint, i.e., if A = −A⁺. In terms of matrix elements, the property ⟨φ|A|ψ⟩* = ⟨ψ|A⁺|φ⟩, which is true for any operator, reduces for anti-Hermitian operators to the relation ⟨φ|A|ψ⟩* = −⟨ψ|A|φ⟩. As a special case this implies that ⟨ψ|A|ψ⟩* = −⟨ψ|A|ψ⟩. Thus, expectation values of anti-Hermitian operators are strictly imaginary.


Note that if A is any operator, it may be written in the form A = A_H + A_A, where A_H = (1/2)(A + A⁺) (the Hermitian part of A) is Hermitian and A_A = (1/2)(A − A⁺) (the anti-Hermitian part of A) is anti-Hermitian.


Thus an arbitrary operator can be decomposed into Hermitian and anti-Hermitian parts, very similar to the way that an arbitrary complex number z = x + iy can be decomposed into real and imaginary parts, with x = (1/2)(z + z*) and y = (1/2i)(z − z*).
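The decomposition into Hermitian and anti-Hermitian parts can be sketched in numpy; the random operator below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

A_H = 0.5 * (A + A.conj().T)   # Hermitian part of A
A_A = 0.5 * (A - A.conj().T)   # anti-Hermitian part of A

assert np.allclose(A_H, A_H.conj().T)    # A_H equals its adjoint
assert np.allclose(A_A, -A_A.conj().T)   # A_A equals minus its adjoint
assert np.allclose(A, A_H + A_A)         # A is recovered exactly
```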


Unitary Operators

Definition: An operator U is said to be unitary if its adjoint is equal to its inverse. Thus, for a unitary operator U⁺ = U⁻¹, or equivalently, U⁺U = UU⁺ = 1. Unitary operators (or the transformations they induce) play the same role in quantum mechanical Hilbert spaces that orthogonal transformations play in R³. Indeed, note that if the states {|n⟩} form an ONB for S, and if |n′⟩ = U|n⟩, then ⟨n′|m′⟩ = ⟨n|U⁺U|m⟩ = ⟨n|m⟩ = δ_{nm}.


Thus, a unitary operator maps any complete ONB of states onto another complete ONB of states for the space, and generally preserves vector relationships in state space, the way that orthogonal transformations do in R³.
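That a unitary operator carries an ONB to another ONB can be checked numerically. A sketch using numpy (the construction of U via a QR decomposition is one convenient way to manufacture a unitary matrix, not something from the lecture):

```python
import numpy as np

rng = np.random.default_rng(2)

# The Q factor of a QR decomposition of a random complex matrix is unitary.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
U, _ = np.linalg.qr(M)
assert np.allclose(U.conj().T @ U, np.eye(3))

# Map the standard basis kets |n> (columns of the identity) through U ...
new_basis = U @ np.eye(3)

# ... and check the images are again orthonormal: <n'|m'> = delta_nm.
gram = new_basis.conj().T @ new_basis
assert np.allclose(gram, np.eye(3))
```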


Ket-Bra and Matrix Representation of Operators

Matrix Representation of Operators: Let {|n⟩} be an ONB for the space S and let A be an operator acting in the space. From the trivial identity A = 1·A·1, with 1 = Σ_n |n⟩⟨n|, we write A = Σ_n Σ_{n′} |n⟩⟨n|A|n′⟩⟨n′|, or A = Σ_n Σ_{n′} A_{nn′} |n⟩⟨n′|, where A_{nn′} = ⟨n|A|n′⟩. This gives what is called a ket-bra expansion for this operator in this representation, and completely specifies the linear operator A in terms of its matrix elements connecting the basis states of this representation.
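The ket-bra expansion can be sketched directly in numpy: rebuild an operator from its matrix elements between standard basis kets. The random operator and dimension are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
dim = 3
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))

# Standard basis kets |n> as columns of the identity.
kets = [np.eye(dim)[:, n] for n in range(dim)]

# Ket-bra expansion: A = sum_{n,n'} |n> A_{nn'} <n'| with A_{nn'} = <n|A|n'>.
A_rebuilt = np.zeros((dim, dim), dtype=complex)
for n in range(dim):
    for m in range(dim):
        A_nm = kets[n].conj() @ A @ kets[m]                    # <n|A|m>
        A_rebuilt += A_nm * np.outer(kets[n], kets[m].conj())  # |n><m| term

assert np.allclose(A, A_rebuilt)
```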


Matrix Representation of Operators: The operator A, therefore, is completely determined by its matrix elements in any ONB. Thus, suppose that |ψ⟩ = Σ_n ψ_n |n⟩ and |φ⟩ = Σ_n φ_n |n⟩ for some states |ψ⟩ and |φ⟩. If |φ⟩ = A|ψ⟩, the expansion coefficients for the states |ψ⟩ and |φ⟩ are clearly related. Note that if |φ⟩ = A|ψ⟩, then φ_n = ⟨n|φ⟩ = ⟨n|A|ψ⟩ = Σ_{n′} ⟨n|A|n′⟩⟨n′|ψ⟩, which can be written φ_n = Σ_{n′} A_{nn′} ψ_{n′}.


But this is of the form of a matrix multiplying a column vector: φ_n = Σ_{n′} A_{nn′} ψ_{n′} is just the nth component of the equation [φ] = [A][ψ] that one obtains through the operation |φ⟩ = A|ψ⟩.
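The component equation and the matrix product are the same thing, which a short numpy sketch makes explicit (random data, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

# phi_n = sum_{n'} A_{nn'} psi_{n'}, written out component by component ...
phi = np.array([sum(A[n, m] * psi[m] for m in range(3)) for n in range(3)])

# ... is exactly the matrix-times-column-vector product [A][psi].
assert np.allclose(phi, A @ psi)
```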


Thus we see that in the row-vector/column-vector representation induced by any discrete ONB, an operator A is naturally represented by a square matrix [A], with entries A_{nn′} = ⟨n|A|n′⟩ that are just the matrix elements of A connecting the different basis states of that representation. This matrix representation facilitates computing quantities related to A itself.


Consider the matrix element of A between arbitrary states |φ⟩ and |ψ⟩. Inserting our expansion for A, this becomes ⟨φ|A|ψ⟩ = Σ_n Σ_{n′} ⟨φ|n⟩ A_{nn′} ⟨n′|ψ⟩, where φ_n = ⟨n|φ⟩ and ψ_{n′} = ⟨n′|ψ⟩, so that we can write ⟨φ|A|ψ⟩ = Σ_n Σ_{n′} φ_n* A_{nn′} ψ_{n′}, which is readily expressed in terms of the product of a row vector, a square matrix, and a column vector.


That is, the expression ⟨φ|A|ψ⟩ = Σ_n Σ_{n′} φ_n* A_{nn′} ψ_{n′} is equivalent to the row-vector, matrix, column-vector product [φ]⁺[A][ψ].
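This equivalence of the double sum and the row-matrix-column product can be checked with random data (illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
phi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

# The double sum over matrix elements ...
elem = sum(phi[n].conj() * A[n, m] * psi[m]
           for n in range(3) for m in range(3))

# ... equals the row-vector * matrix * column-vector product.
assert np.isclose(elem, phi.conj() @ A @ psi)
```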


As another example, consider the operator product of A = Σ_n Σ_{n′} A_{nn′} |n⟩⟨n′| and B = Σ_n Σ_{n′} B_{nn′} |n⟩⟨n′|. The product operator C = AB has a similar expansion, i.e., C = Σ_n Σ_{n′} C_{nn′} |n⟩⟨n′|, where C_{nn′} = ⟨n|AB|n′⟩ = Σ_{n″} ⟨n|A|n″⟩⟨n″|B|n′⟩, so C_{nn′} = Σ_{n″} A_{nn″} B_{n″n′}.


But this is just of the form of a matrix multiplication, i.e., C_{nn′} = Σ_{n″} A_{nn″} B_{n″n′} is equivalent to [C] = [A][B], where, as before, [A] stands for "the matrix representing A", and so on.
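The index sum for C_{nn′} is the ordinary matrix product, as a small numpy sketch confirms (random matrices, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# C_{nn'} = sum_{n''} A_{nn''} B_{n''n'}, built element by element ...
C = np.array([[sum(A[n, k] * B[k, m] for k in range(3)) for m in range(3)]
              for n in range(3)])

# ... is the ordinary matrix product [C] = [A][B].
assert np.allclose(C, A @ B)
```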


As a final example, consider the matrix representing the adjoint of an operator. If B = A⁺, then by the two-part rule we developed for taking the adjoint, it follows that B_{nn′} = ⟨n|A⁺|n′⟩ = ⟨n′|A|n⟩* = A*_{n′n}, so that B = Σ_n Σ_{n′} A*_{n′n} |n⟩⟨n′|. We can now switch the summation indices to find that B = Σ_n Σ_{n′} A*_{nn′} |n′⟩⟨n|, from which we deduce that the matrix [A⁺] has entries (A⁺)_{nn′} = A*_{n′n}.


Thus, the matrix representing A⁺ is the complex-conjugate transpose of the matrix representing A. We thus define the adjoint of a matrix to be this combined operation (which applies to single column/row vectors as well, obviously). A Hermitian operator is equal to its adjoint, so that the matrix elements representing such an operator obey the relation A_{nn′} = A*_{n′n}. Thus, Hermitian operators are represented by Hermitian matrices, which satisfy [A] = [A]⁺.
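The element-wise adjoint relation and the Hermitian-matrix condition can both be checked numerically (random matrix, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

# The matrix of the adjoint is the complex-conjugate transpose:
# entry (n, n') of [A+] equals the conjugate of entry (n', n) of [A].
A_dag = A.conj().T
for n in range(3):
    for m in range(3):
        assert np.isclose(A_dag[n, m], A[m, n].conj())

# A Hermitian operator, e.g. H = A + A+, equals its own adjoint matrix.
H = A + A_dag
assert np.allclose(H, H.conj().T)
```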


It is an interesting fact that neither the transpose nor the complex conjugate of an operator is a well-defined concept; i.e., given a linear operator A, there is no operator that can be uniquely identified with the transpose of A. Although one can form the transpose of the matrix [A] representing A in any basis, the operator associated with the transposed matrix will not correspond to the operator associated with the transpose of the matrix representing A in other bases. Similarly, complex conjugation can be performed on matrices, or on matrix elements, but it is not an operation that is defined for the operators themselves. The Hermitian adjoint, which in a sense combines these two representation-dependent operations, yields an operator that is independent of representation.


This again emphasizes one of the basic themes, which is that bras, kets, and operators are not row vectors, column vectors, and matrices. The former may be represented by the latter, but they are conceptually different things, and it is always important to keep that in mind. Matrix representations of this form were developed extensively by Heisenberg and gave rise to the term "matrix mechanics", in analogy to the "wave mechanics" developed by Schrödinger, which focuses on a wave function representation for the underlying space. Clearly, however, whether one has a wave mechanical or matrix mechanical representation depends simply upon the choice of basis (i.e., discrete or continuous) in which one is working.


In this lecture, we defined what we mean by Hermitian operators, anti-Hermitian operators, and unitary operators, and saw how any operator can be expressed in terms of its Hermitian and anti-Hermitian parts. We then used the completeness relation for a discrete ONB to develop ket-bra expansions and matrix representations of general linear operators defined on the state space. We then saw how these matrix representations can be used to directly compute quantities related to the operators they represent. Finally, we saw how to construct the matrix corresponding to the adjoint of an operator, and how Hermitian operators are represented by Hermitian matrices. In the next lecture, we show how these discrete representations can be extended to continuously indexed basis sets, where linear operators can be represented by ket-bra expansions in integral form.



