
Quantum One

Ket-Bra Expansions and Integral Representations of Operators

In the last segment, we defined what we mean by Hermitian operators, anti-Hermitian operators, and unitary operators, and saw how any operator can be expressed in terms of its Hermitian and anti-Hermitian parts. We then used the completeness relation for a discrete ONB to develop ket-bra expansions and matrix representations of general linear operators, and saw how these matrix representations can be used to directly compute quantities related to the operators they represent. Finally, we saw how to represent the matrix corresponding to the adjoint of an operator, and how Hermitian operators are represented by Hermitian matrices. In this segment, we extend some of these ideas to continuously indexed basis sets, and develop integral representations of linear operators.

Continuous Ket-Bra Expansion of Operators: Let the states |α⟩ form a continuous ONB for the space, so that ∫dα |α⟩⟨α| = 1, and let A be an operator acting in the space. Then from the trivial identity A = 1·A·1 we can write

A = ∫dα ∫dα′ |α⟩⟨α|A|α′⟩⟨α′| = ∫dα ∫dα′ A(α,α′) |α⟩⟨α′|, where A(α,α′) = ⟨α|A|α′⟩.

This gives what we call a ket-bra expansion for this operator in this representation, and completely specifies the linear operator A in terms of its matrix elements taken between the basis states of this representation.
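The continuous expansion above can be made concrete by discretizing the index α on a uniform grid, so that integrals become Riemann sums and the kernel A(α,α′) becomes an ordinary matrix. The grid parameters and the Gaussian example kernel below are illustrative assumptions, not part of the lecture:

```python
import numpy as np

# Sample the continuous index alpha on a uniform grid; integrals over
# alpha become Riemann sums with weight d_alpha, and a linear operator
# A becomes the matrix of its kernel values A(a, a') = <a|A|a'>.
n, d_alpha = 200, 0.05
alpha = np.arange(n) * d_alpha

# Assumed example kernel, for illustration: a Gaussian smoothing operator.
A = np.exp(-(alpha[:, None] - alpha[None, :]) ** 2)

# The completeness relation 1 = ∫ d_alpha |a><a| becomes, on the grid,
# the delta-function kernel delta(a - a') ≈ I / d_alpha:
identity_kernel = np.eye(n) / d_alpha

# Acting with the identity kernel reproduces any wave function:
psi = np.sin(alpha)
assert np.allclose(identity_kernel @ psi * d_alpha, psi)
```

The division by d_alpha in the identity kernel is the discrete shadow of the delta function: its "height" must grow as the grid spacing shrinks so that its integral stays equal to one.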

Integral Representation of Operators: Thus, in the wave function representation induced by any continuous ONB, an operator A is naturally represented by an integral kernel, which is a function of two continuous indices, or arguments, the values of which are just the matrix elements of A connecting the different members of the basis states defining that continuous representation. As with the matrices associated with discrete representations, knowledge of the kernel facilitates computing quantities related to A itself.

Integral Representation of Operators: Thus, suppose that |χ⟩ = A|ψ⟩ for some states |χ⟩ and |ψ⟩. The expansion coefficients χ(α) = ⟨α|χ⟩ and ψ(α) = ⟨α|ψ⟩ for the states |χ⟩ and |ψ⟩ are then clearly related. Note that if |χ⟩ = A|ψ⟩, then

χ(α) = ⟨α|A|ψ⟩ = ∫dα′ ⟨α|A|α′⟩⟨α′|ψ⟩,

which can be written, rather like a continuous matrix operation,

χ(α) = ∫dα′ A(α,α′) ψ(α′).
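On a grid, the relation χ(α) = ∫dα′ A(α,α′)ψ(α′) is literally a matrix-vector product weighted by the grid spacing. The kernel and wave function below are assumed examples for the sketch:

```python
import numpy as np

# Discretize: chi(a) = ∫ da' A(a, a') psi(a') becomes a weighted
# matrix-vector product.
n, da = 100, 0.1
a = np.arange(n) * da
A = np.exp(-np.abs(a[:, None] - a[None, :]))   # assumed example kernel
psi = np.exp(-a)                               # assumed example wave function

chi = (A * da) @ psi   # the discretized integral over a'

# Spot-check one component against an explicit Riemann sum:
assert np.isclose(chi[3], np.sum(A[3, :] * psi) * da)
```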

Integral Representation of Operators: Consider the matrix element of A between arbitrary states |χ⟩ and |ψ⟩. Inserting our expansion for A, this becomes

⟨χ|A|ψ⟩ = ∫dα ∫dα′ ⟨χ|α⟩ A(α,α′) ⟨α′|ψ⟩,

where, identifying the wave functions χ*(α) = ⟨χ|α⟩ and ψ(α′) = ⟨α′|ψ⟩ for the two states involved, we can write

⟨χ|A|ψ⟩ = ∫dα ∫dα′ χ*(α) A(α,α′) ψ(α′),

which is the continuous version of the product of a row vector, a square matrix, and a column vector.
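The row-vector · matrix · column-vector structure can be checked numerically; the kernel and states below are assumed examples, with χ chosen complex so the conjugation matters:

```python
import numpy as np

# <chi|A|psi> = ∫∫ da da' chi*(a) A(a, a') psi(a') discretizes to
# (row vector) @ (matrix) @ (column vector), weighted by da**2.
n, da = 80, 0.1
a = np.arange(n) * da
A = np.cos(a[:, None] - a[None, :])   # assumed example kernel
psi = np.exp(-a)                      # assumed example wave functions
chi = np.exp(-2 * a) * (1 + 1j * a)   # complex, to exercise the conjugate

elem = chi.conj() @ A @ psi * da**2

# Same number via the explicit double Riemann sum:
direct = np.sum(np.conj(chi)[:, None] * A * psi[None, :]) * da**2
assert np.isclose(elem, direct)
```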

Integral Representation of Operators: As another example, consider the operator product of A and B. The product operator C = AB has a similar expansion, i.e.,

C = ∫dα ∫dα′ C(α,α′) |α⟩⟨α′|, where C(α,α′) = ⟨α|AB|α′⟩ = ∫dα″ A(α,α″) B(α″,α′),

which gives the continuous analog of a matrix multiplication.

Integral Representation of Operators: So if we know the kernels A(α,α′) and B(α,α′) representing A and B, we can compute the kernel representing C = AB through the integral relation C(α,α′) = ∫dα″ A(α,α″) B(α″,α′).
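Discretized, the composition integral is an ordinary matrix product carrying one factor of the grid spacing (one weight per integration variable). The random kernels below are placeholders for illustration:

```python
import numpy as np

# Kernel composition C(a, a') = ∫ da'' A(a, a'') B(a'', a') becomes a
# matrix product weighted by the grid spacing da''.
n, da = 60, 0.1
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))   # assumed placeholder kernels
B = rng.standard_normal((n, n))

C = A @ B * da

# Check one entry against the explicit discretized integral:
assert np.isclose(C[2, 5], np.sum(A[2, :] * B[:, 5]) * da)
```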

Integral Representation of Operators: As a final example, consider the integral kernel representing the adjoint of an operator. If A = ∫dα ∫dα′ A(α,α′) |α⟩⟨α′|, then by the two-part rule we developed for taking the adjoint, it follows that

A⁺ = ∫dα ∫dα′ A*(α,α′) |α′⟩⟨α|.

We can now switch the primes on the integration variables, and reorder, to find that

A⁺ = ∫dα ∫dα′ A*(α′,α) |α⟩⟨α′|,

from which we deduce that A⁺(α,α′) = A*(α′,α).

This, obviously, is just the continuous analog of the complex-conjugate transpose of a matrix. A Hermitian operator is equal to its adjoint, so the integral kernels representing Hermitian operators obey the relation A(α,α′) = A*(α′,α).
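In the discretized picture the adjoint kernel is simply the conjugate transpose of the kernel matrix, and the defining property ⟨χ|A|ψ⟩ = ⟨ψ|A⁺|χ⟩* can be checked directly. The random matrix below is an assumed stand-in for a sampled kernel:

```python
import numpy as np

# The adjoint kernel A_dag(a, a') = A*(a', a) is the conjugate
# transpose of the kernel matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A_dag = M.conj().T

# Symmetrizing any kernel produces a Hermitian one, equal to its adjoint:
H = 0.5 * (M + M.conj().T)
assert np.allclose(H, H.conj().T)

# The discrete analog of <chi|A|psi> = <psi|A_dag|chi>* :
psi, chi = rng.standard_normal(5), rng.standard_normal(5)
assert np.isclose(chi @ M @ psi, np.conj(psi @ A_dag @ chi))
```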

Examples: As an example, in 3D, the position operator R has as its matrix elements in the position representation

⟨r|R|r′⟩ = r′ δ(r − r′).

This allows us to construct the expansion

R = ∫d³r ∫d³r′ r′ δ(r − r′) |r⟩⟨r′| = ∫d³r r |r⟩⟨r|,

where the double integral has been reduced to a single integral because of the delta function. The operator R is said to be diagonal in the position representation, because it has no nonzero matrix elements connecting different basis states.

This concept of diagonality extends to arbitrary representations. An operator A is said to be diagonal in the discrete representation {|n⟩} if its matrix elements take the form A_{nn′} = a_n δ_{nn′}, so that

A = Σ_{n,n′} A_{nn′} |n⟩⟨n′| = Σ_n a_n |n⟩⟨n|,

which only has one summation index, in contrast to the general form, which requires two.

In a discrete representation, an operator that is diagonal in that representation is represented by a diagonal matrix, i.e., if A = Σ_n a_n |n⟩⟨n|, then the matrix representing A has the values a_n along its main diagonal and zeros everywhere else.
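A minimal sketch of this, with assumed example eigenvalues a_n:

```python
import numpy as np

# A diagonal operator in a discrete representation, A = sum_n a_n |n><n|,
# is literally a diagonal matrix.
a_n = np.array([1.0, 2.0, 3.0, 4.0])   # assumed example diagonal values
A = np.diag(a_n)

# No off-diagonal elements survive:
assert np.allclose(A, A * np.eye(4))

# Acting on a vector just scales each component by a_n:
psi = np.array([1.0, 1.0, 1.0, 1.0])
assert np.allclose(A @ psi, a_n * psi)
```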

Similarly, in a continuous representation an operator A is diagonal if A(α,α′) = a(α) δ(α − α′), so that

A = ∫dα ∫dα′ A(α,α′) |α⟩⟨α′| = ∫dα a(α) |α⟩⟨α|,

which ends up with only one integration variable, in contrast to the general form, which requires two.

It is easy to show that in any basis in which an operator is diagonal, it is what we referred to earlier as a multiplicative operator in that representation. That is, if G = ∫dα g(α) |α⟩⟨α| is diagonal in the {|α⟩} representation, and if |χ⟩ = G|ψ⟩, then

χ(α) = ⟨α|G|ψ⟩ = g(α) ψ(α),

which shows that a diagonal operator G acts in the {|α⟩} representation to multiply the wave function ψ(α) by g(α).
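The equivalence of "diagonal kernel" and "multiplicative action" survives discretization: the kernel g(α)δ(α − α′) becomes a diagonal matrix divided by the grid spacing, and applying it reduces to pointwise multiplication. The choice g(α) = α² is an assumed example:

```python
import numpy as np

# Diagonal (hence multiplicative) operator on a grid: the kernel
# g(a) delta(a - a') becomes diag(g) / da.
n, da = 50, 0.1
a = np.arange(n) * da
g = a**2                       # assumed example: G multiplies by a^2

G = np.diag(g) / da            # delta(a - a') ≈ identity / da on the grid
psi = np.sin(a)
chi = (G * da) @ psi           # discretized ∫ da' G(a, a') psi(a')

# Same result as direct pointwise multiplication g(a) psi(a):
assert np.allclose(chi, g * psi)
```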

We list below ket-bra expansions and matrix elements of important operators.

The position operator: R = ∫d³r r |r⟩⟨r|, with ⟨r|R|r′⟩ = r′ δ(r − r′).
The potential energy operator: V = ∫d³r V(r) |r⟩⟨r|, with ⟨r|V|r′⟩ = V(r′) δ(r − r′).
The wavevector operator: K = ∫d³k k |k⟩⟨k|, with ⟨k|K|k′⟩ = k′ δ(k − k′).
The momentum operator: P = ℏK = ∫d³k ℏk |k⟩⟨k|.
The kinetic energy operator: T = P²/2m = ∫d³k (ℏ²k²/2m) |k⟩⟨k|.
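Because the kinetic energy operator is diagonal, and hence multiplicative, in the wavevector representation, it can be applied to a wave function by Fourier transforming, multiplying by ℏ²k²/2m, and transforming back. The sketch below assumes units ℏ = m = 1 and a periodic grid of length 2π:

```python
import numpy as np

# Apply T = (hbar^2 k^2) / 2m via the k-representation, where it is
# multiplicative.  Units hbar = m = 1 are an assumption for the sketch.
n, L = 64, 2 * np.pi
x = np.arange(n) * L / n
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers on the grid

def kinetic(psi):
    # transform to k-space, multiply by k^2 / 2, transform back
    return np.fft.ifft(0.5 * k**2 * np.fft.fft(psi))

# A plane wave e^{i k0 x} is an eigenstate with eigenvalue k0^2 / 2:
k0 = 3
psi = np.exp(1j * k0 * x)
assert np.allclose(kinetic(psi), 0.5 * k0**2 * psi)
```

This "diagonalize, multiply, transform back" pattern is exactly the numerical counterpart of choosing the representation in which an operator is diagonal.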

Another nice thing about ket-bra expansions of this sort, particularly in a representation in which the operator is diagonal, is that it is very easy to determine whether or not the operator is Hermitian. For example, we can take the Hermitian adjoint of the position operator by replacing each term in this continuous summation with its adjoint. Thus we easily see that

R⁺ = ∫d³r (r |r⟩⟨r|)⁺ = ∫d³r r |r⟩⟨r| = R,

since r is real and (|r⟩⟨r|)⁺ = |r⟩⟨r|, so the position operator (and each of its components) is clearly Hermitian.

You should verify to yourself that each of the basic operators whose diagonal ket- bra expansion we previously displayed is also Hermitian. It follows that the wavevector operator is also Hermitian, and so the operator D=iK, satisfies the relation D⁺=-iK⁺=-iK=-D Thus we see that the operator D, which takes the gradient in the position representation, is actually an anti-Hermitian operator, That’s why we traded it in for the wavevector operator.

As an additional example, we work out below the matrix elements of the wavevector operator K in the position representation. Recall that for any state |ψ〉, with position wave function ψ(r) = 〈r|ψ〉, the state K|ψ〉 has a position wave function given by the expression 〈r|K|ψ〉 = -i∇ψ(r). But we can also write 〈r|K|ψ〉 = ∫d³r′ 〈r|K|r′〉 ψ(r′) = ∫d³r′ K(r,r′) ψ(r′), where K(r,r′) = 〈r|K|r′〉. The right hand side of this last expression is then the wave function in the position representation for the state K|ψ〉.

Reminding ourselves of the position eigenfunctions 〈r|r′〉 = δ(r − r′), we see that, evidently, K(r,r′) = 〈r|K|r′〉 = -i∇δ(r − r′), i.e., the kernel is -i times the gradient of the delta function. The properties of this not-so-frequently encountered object are reviewed in the appendix at the end of the first chapter, the most important of which is that for any function f(r), ∫d³r′ ∇δ(r − r′) f(r′) = ∇f(r).
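The delta-function-derivative property quoted above has a direct discrete analogue (again a sketch of my own, assuming a periodic grid): a matrix whose rows stand in for ∇δ(x_j − x′), integrated against samples of f, reproduces the gradient of f.

```python
import numpy as np

# Discrete stand-in for delta'(x_j - x'): a central difference divided by
# the quadrature weight dx, so that sum_k kernel[j, k] * f[k] * dx
# approximates the integral of delta'(x - x') against f(x').
n, L = 256, 2 * np.pi
dx = L / n
x = np.arange(n) * dx

kernel = (np.eye(n, k=1) - np.eye(n, k=-1)) / (2 * dx**2)
kernel[0, -1] = -1 / (2 * dx**2)   # periodic wrap
kernel[-1, 0] = 1 / (2 * dx**2)

f = np.sin(x)
grad_f = (kernel @ f) * dx   # discrete version of int dx' delta'(x - x') f(x')

# The integral against delta' reproduces the gradient of f, here cos(x).
assert np.allclose(grad_f, np.cos(x), atol=1e-3)
```

So "integrating against the derivative of a delta function" is nothing more exotic than a differentiation stencil hiding inside an integral kernel.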

We deduce, therefore, that K can be expanded in the position representation in the form K = ∫d³r ∫d³r′ |r〉 K(r,r′) 〈r′| = -i ∫d³r ∫d³r′ |r〉 ∇δ(r − r′) 〈r′|, so that when we apply this to any state |ψ〉, we obtain 〈r|K|ψ〉 = -i ∫d³r′ ∇δ(r − r′) ψ(r′) = -i∇ψ(r), consistent with our original definition.

In a similar fashion, one finds corresponding kernel expansions for the other basic differential operators introduced earlier.

In this segment, we used the completeness relation for continuous ONBs to develop ket-bra expansions and integral representations of linear operators. We then saw how the integral kernel associated with these representations can be used to directly compute quantities related to the operators they represent. We also introduced the notion of diagonality of an operator in a given representation, and developed expansions for the basic operators of a single particle in representations in which they are diagonal. Finally, we saw how differential operators can be expressed as ket-bra expansions with integral kernels that involve derivatives of the delta function, so that we could understand how a linear operator acting on kets can end up replacing the wave function with its derivative or gradient. In the next segment we consider still other, representation independent, properties of linear operators.
