To better prepare ourselves to explore the capabilities and limitations of quantum circuits, we now introduce some additional mathematical concepts — namely the inner product between vectors (and its connection to the Euclidean norm), the notions of orthogonality and orthonormality for sets of vectors, and projection matrices, which will allow us to introduce a handy generalization of standard basis measurements.
Recall that when we use the Dirac notation to refer to an arbitrary column vector as a ket, such as
$$|\psi\rangle = \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix},$$
the corresponding bra vector is the conjugate transpose of this vector:
$$\langle \psi | = \bigl(|\psi\rangle\bigr)^{\dagger} = \begin{pmatrix} \overline{\alpha_1} & \overline{\alpha_2} & \cdots & \overline{\alpha_n} \end{pmatrix}. \tag{1}$$
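As a quick numerical illustration (not part of the text's formal development), the bra can be computed from the ket with NumPy's conjugate transpose; the entries below are arbitrary example values:

```python
import numpy as np

# An example ket |psi> as a 3 x 1 column vector (entries chosen arbitrarily).
ket_psi = np.array([[1 + 1j],
                    [2j],
                    [3]])

# The bra <psi| is the conjugate transpose (dagger) of the ket:
# a 1 x 3 row vector whose entries are complex conjugated.
bra_psi = ket_psi.conj().T
```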
Alternatively, if we have some classical state set Σ in mind, and we express a column vector as a ket,
such as
$$|\psi\rangle = \sum_{a \in \Sigma} \alpha_a |a\rangle,$$
then the corresponding row (or bra) vector is the conjugate transpose
$$\langle \psi | = \sum_{a \in \Sigma} \overline{\alpha_a} \langle a |. \tag{2}$$
We also have that the product of a bra vector and a ket vector, viewed as matrices either having a single row or a single column, results in a scalar.
Specifically, if we have two column vectors
$$|\psi\rangle = \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix} \quad\text{and}\quad |\phi\rangle = \begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_n \end{pmatrix},$$
so that the row vector ⟨ψ∣ is as in equation (1), then
$$\langle \psi | \phi \rangle = \overline{\alpha_1}\beta_1 + \cdots + \overline{\alpha_n}\beta_n.$$
Alternatively, if ∣ψ⟩ and ∣ϕ⟩ are expressed in terms of a classical state set Σ as
$$|\psi\rangle = \sum_{a \in \Sigma} \alpha_a |a\rangle \quad\text{and}\quad |\phi\rangle = \sum_{b \in \Sigma} \beta_b |b\rangle,$$
then
$$\langle \psi | \phi \rangle = \Bigl(\sum_{a \in \Sigma} \overline{\alpha_a} \langle a |\Bigr)\Bigl(\sum_{b \in \Sigma} \beta_b |b\rangle\Bigr) = \sum_{a \in \Sigma} \sum_{b \in \Sigma} \overline{\alpha_a}\beta_b \langle a | b \rangle = \sum_{a \in \Sigma} \overline{\alpha_a}\beta_a,$$
where the last equality follows from the observation that ⟨a∣a⟩=1 and ⟨a∣b⟩=0 for classical states a and b satisfying a≠b.
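As a sketch of this computation in NumPy (the vectors below are arbitrary examples), `np.vdot` conjugates its first argument, which matches the bra-ket inner product:

```python
import numpy as np

# Arbitrary example vectors |psi> and |phi>.
ket_psi = np.array([1 + 1j, 2, 1j])
ket_phi = np.array([2, 1 - 1j, 3])

# <psi|phi> via np.vdot, which conjugates its first argument.
inner = np.vdot(ket_psi, ket_phi)

# The same value from the explicit formula: the sum of conj(alpha_a) * beta_a.
explicit = sum(a.conjugate() * b for a, b in zip(ket_psi, ket_phi))
```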
The value ⟨ψ∣ϕ⟩ is called the inner product between the vectors ∣ψ⟩ and ∣ϕ⟩.
Inner products are critically important in quantum information and computation;
we would not get far in understanding quantum information at a mathematical level without them.
Let us now collect together some basic facts about inner products of vectors.
Relationship to the Euclidean norm. The inner product of any vector
$$|\psi\rangle = \sum_{a \in \Sigma} \alpha_a |a\rangle$$
with itself is
$$\langle \psi | \psi \rangle = \sum_{a \in \Sigma} \overline{\alpha_a}\alpha_a = \sum_{a \in \Sigma} |\alpha_a|^2 = \bigl\| |\psi\rangle \bigr\|^2.$$
Thus, the Euclidean norm of a vector may alternatively be expressed as
$$\bigl\| |\psi\rangle \bigr\| = \sqrt{\langle \psi | \psi \rangle}.$$
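A numerical check of this identity (the vector is an arbitrary example; NumPy's built-in Euclidean norm is used for comparison):

```python
import numpy as np

ket_psi = np.array([3 + 4j, 0, 0])  # arbitrary example vector

# <psi|psi> is real and nonnegative; its square root is the Euclidean norm.
inner = np.vdot(ket_psi, ket_psi)
norm_via_inner = np.sqrt(inner.real)

# NumPy's built-in Euclidean norm agrees.
norm_builtin = np.linalg.norm(ket_psi)
```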
Notice that the Euclidean norm of a vector must always be a nonnegative real number.
Moreover, the only way the Euclidean norm of a vector can be equal to zero is if every one of the entries is equal to zero, which is to say that the vector is the zero vector.
We can summarize these observations like this: for every vector ∣ψ⟩ we have
⟨ψ∣ψ⟩≥0,
with ⟨ψ∣ψ⟩=0 if and only if ∣ψ⟩=0.
This property of the inner product is sometimes referred to as positive definiteness.
Conjugate symmetry. For any two vectors
$$|\psi\rangle = \sum_{a \in \Sigma} \alpha_a |a\rangle \quad\text{and}\quad |\phi\rangle = \sum_{b \in \Sigma} \beta_b |b\rangle,$$
we have
$$\langle \psi | \phi \rangle = \sum_{a \in \Sigma} \overline{\alpha_a}\beta_a \quad\text{and}\quad \langle \phi | \psi \rangle = \sum_{a \in \Sigma} \overline{\beta_a}\alpha_a,$$
and therefore
$$\overline{\langle \psi | \phi \rangle} = \langle \phi | \psi \rangle.$$
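Conjugate symmetry can be checked numerically on arbitrary example vectors (a sketch using NumPy, assuming nothing beyond the formulas above):

```python
import numpy as np

# Arbitrary example vectors.
ket_psi = np.array([1 + 2j, 3 - 1j])
ket_phi = np.array([2j, 1 + 1j])

ip_psi_phi = np.vdot(ket_psi, ket_phi)  # <psi|phi>
ip_phi_psi = np.vdot(ket_phi, ket_psi)  # <phi|psi>

# Conjugating one inner product yields the other.
```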
Linearity in the second argument (and conjugate linearity in the first).
Let us suppose that ∣ψ⟩, ∣ϕ1⟩, and ∣ϕ2⟩ are vectors and α1 and α2 are complex numbers. If we define a new vector
$$|\phi\rangle = \alpha_1 |\phi_1\rangle + \alpha_2 |\phi_2\rangle,$$
then
$$\langle \psi | \phi \rangle = \alpha_1 \langle \psi | \phi_1 \rangle + \alpha_2 \langle \psi | \phi_2 \rangle.$$
That is to say, the inner product is linear in the second argument.
This can be verified either through the formulas above or simply by noting that matrix multiplication is linear in each argument (and specifically in the second argument).
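Linearity in the second argument can likewise be verified numerically; the vectors and scalars below are arbitrary examples (a sketch using NumPy):

```python
import numpy as np

# Arbitrary example vectors and scalars.
ket_psi = np.array([1j, 2])
ket_phi1 = np.array([1, 1 + 1j])
ket_phi2 = np.array([2 - 1j, 0])
alpha1, alpha2 = 2 + 1j, -3j

# Define |phi> = alpha1 |phi1> + alpha2 |phi2>.
ket_phi = alpha1 * ket_phi1 + alpha2 * ket_phi2

# Linearity: <psi|phi> = alpha1 <psi|phi1> + alpha2 <psi|phi2>.
lhs = np.vdot(ket_psi, ket_phi)
rhs = alpha1 * np.vdot(ket_psi, ket_phi1) + alpha2 * np.vdot(ket_psi, ket_phi2)
```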
Combining this fact with conjugate symmetry reveals that the inner product is conjugate linear in the first argument. That is, if ∣ψ1⟩, ∣ψ2⟩, and ∣ϕ⟩ are vectors and α1 and α2 are complex numbers, and we define
$$|\psi\rangle = \alpha_1 |\psi_1\rangle + \alpha_2 |\psi_2\rangle,$$
then
$$\langle \psi | \phi \rangle = \overline{\alpha_1} \langle \psi_1 | \phi \rangle + \overline{\alpha_2} \langle \psi_2 | \phi \rangle.$$
Two vectors ∣ϕ⟩ and ∣ψ⟩ are said to be orthogonal if their inner product is zero:
⟨ψ∣ϕ⟩=0.
Geometrically, we can think about orthogonal vectors as vectors at right angles to each other.
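For instance (an illustrative check, not part of the text), the qubit state vectors ∣+⟩ and ∣−⟩ are orthogonal:

```python
import numpy as np

# The plus and minus states: (|0> + |1>)/sqrt(2) and (|0> - |1>)/sqrt(2).
ket_plus = np.array([1, 1]) / np.sqrt(2)
ket_minus = np.array([1, -1]) / np.sqrt(2)

# Their inner product is zero, so they are orthogonal.
inner = np.vdot(ket_plus, ket_minus)
```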
A set of vectors {∣ψ1⟩,…,∣ψm⟩} is called an orthogonal set if every vector in the set is orthogonal to every other vector in the set.
That is, this set is orthogonal if