In
physics, a
photon is an
elementary particle, the
quantum of the
electromagnetic field and the basic
"unit" of
light and all other forms of
electromagnetic radiation.
It is also the
force carrier for the
electromagnetic force. The
effects of this
force are easily observable at
both the
microscopic and
macroscopic level, because the photon has
no
rest mass; this allows for
interaction at long distances. Like
all elementary particles, photons are governed by
quantum mechanics and will exhibit
wave–particle duality – they
exhibit properties of both
waves and
particles. For example, a single photon may be
refracted by a
lens or exhibit
wave
interference, but also act as a particle giving a definite
result when its position is measured.
The modern concept of the photon was developed gradually by
Albert Einstein to explain
experimental observations that did not fit the classical
wave model of light. In
particular, the photon model accounted for the frequency dependence
of light's energy, and explained the ability of
matter and
radiation to be in
thermal equilibrium. It also accounted
for anomalous observations, including the properties of
black body radiation, that other
physicists, most notably
Max Planck, had
sought to explain using
semiclassical models, in which
light is still described by
Maxwell's equations, but the material
objects that emit and absorb light are quantized. Although these
semiclassical models contributed to the development of quantum
mechanics, further experiments proved Einstein's hypothesis that
light itself is
quantized; the
quanta of light are photons.
In the modern
Standard Model of
particle physics, photons are
described as a necessary consequence of physical laws having a
certain
symmetry at every point
in
spacetime. The intrinsic properties of
photons, such as
charge,
mass and
spin,
are determined by the properties of this
gauge symmetry.
The photon concept has led to momentous advances in experimental
and theoretical physics, such as
lasers,
Bose–Einstein
condensation,
quantum field
theory, and the
probabilistic
interpretation of quantum mechanics. It has been applied to
photochemistry,
high-resolution microscopy,
and
measurements
of molecular distances. Recently, photons have been studied as
elements of
quantum computers and
for sophisticated applications in
optical communication such as
quantum cryptography.
Nomenclature
In 1900, Max Planck was working on blackbody radiation and
suggested that the energy in electromagnetic waves could only be
released in "packets" of energy; he called these
quanta
(singular
quantum). Later, in 1905
Albert Einstein went further by suggesting
that EM waves could only exist in these discrete wavepackets. He
called such a
wavepacket
the light
quantum (German:
das Lichtquant). The name
photon derives from the
Greek
word for light, φῶς (transliterated
phôs), and was coined
in 1926 by the physical chemist
Gilbert
Lewis, who published a speculative theory in which photons were
"uncreatable and indestructible". Although Lewis' theory was never
accepted—being contradicted by many experiments—his new name,
photon, was adopted immediately by most physicists.
Isaac Asimov credits
Arthur Compton with defining quanta of energy
as photons in 1927.
In physics, a photon is usually denoted by the symbol
γ
(the
Greek letter gamma). This symbol for the photon probably derives
from
gamma rays, which were discovered and
named in 1900 by
Paul Villard
and shown to be a form of
electromagnetic radiation in 1914
by
Ernest Rutherford and
Edward Andrade. In
chemistry and
optical engineering, photons are usually
symbolized by
hν, the energy of a photon, where
h
is
Planck's constant and the
Greek letter ν (
nu) is the photon's
frequency. Much less commonly, the photon can be
symbolized by
hf, where its frequency is denoted by
f.
Physical properties
The photon is massless, has no electric charge, and does not decay spontaneously in empty space. (The mass of the photon is believed to be exactly zero, based on experiment and the theoretical considerations described in this article. Some sources also refer to the "relativistic mass" concept, which is just the energy scaled to units of mass: for a photon with wavelength λ or energy E, this is h/λc or E/c^{2}. This usage of the term "mass" is no longer common in scientific literature. Further info: What is the mass of a photon? http://math.ucr.edu/home/baez/physics/ParticleAndNuclear/photon_mass.html) A
photon has two possible
polarization
states and is described by exactly three continuous parameters: the
components of its
wave vector, which
determine its wavelength
λ and its direction of
propagation. The photon is the
gauge
boson for
electromagnetism, and
therefore all other quantum numbers of the photon (such as
lepton number,
baryon
number, and
flavour
quantum numbers) are zero.
Photons are emitted in many natural processes. For example, when a
charge is
accelerated it emits
synchrotron radiation. During a
molecular,
atomic or
nuclear transition to a lower
energy level, photons of various energies
will be emitted, from
infrared light
to
gamma rays. A photon can also be
emitted when a particle and its corresponding
antiparticle are
annihilated (see
Electron–positron
annihilation for an example).
In empty space, the photon moves at
c (the
speed of light) and its
energy and momentum are related by E = pc, where
p
is the
magnitude of the
momentum
vector
p. For comparison, the corresponding equation for
particles with
mass m is:
 \ E^{2} = p^{2} c^{2} + m^{2} c^{4}.
The energy and momentum of a photon depend only on its
frequency (
ν) or equivalently, its
wavelength (
λ):
 \ E = \hbar\omega = h\nu = \frac{h c}{\lambda}
 \mathbf{p} = \hbar\mathbf{k},
where
k is the
wave
vector (with the wave number
k = 2π/
λ as its
magnitude), ω = 2πν is the angular frequency, and ħ = h/2π is the reduced Planck constant.
Since
p points in the direction of the photon's
propagation, the magnitude of the momentum is
 \ p = \hbar k = \frac{h\nu}{c} = \frac{h}{\lambda}.
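These relations can be checked numerically. The sketch below is not part of the article; the 532 nm wavelength is an illustrative choice. It evaluates E = hc/λ and p = h/λ and verifies that E = pc holds for a massless particle:

```python
# Photon energy and momentum from wavelength; constants are CODATA values.
h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s

def photon_energy(wavelength_m):
    """Photon energy E = h*c/lambda, in joules."""
    return h * c / wavelength_m

def photon_momentum(wavelength_m):
    """Photon momentum magnitude p = h/lambda, in kg*m/s."""
    return h / wavelength_m

lam = 532e-9  # green laser light, 532 nm (illustrative value)
E = photon_energy(lam)
p = photon_momentum(lam)
# E = p*c must hold for a massless particle:
assert abs(E - p * c) < 1e-30
print(E / 1.602176634e-19)  # energy in electron-volts, about 2.33 eV
```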
The photon also carries
spin angular
momentum that does not depend on its frequency. The magnitude
of its spin is \sqrt{2} \hbar and the component
measured along its direction of motion, its
helicity, must be ±ħ. These two
possible helicities, called right-handed and left-handed,
correspond to the two possible
circular polarization states of the
photon.
To illustrate the significance of these formulae, the annihilation
of a particle with its antiparticle in free space must result in
the creation of at least
two photons for the following
reason. In the
center of mass
frame, the colliding
antiparticles have no net momentum, whereas a single photon always
has momentum (since it is determined, as we have seen, only by the
photon's frequency or wavelength, which cannot be zero). Hence,
conservation of momentum (or equivalently,
translational invariance)
requires that at least two photons are created, with zero net
momentum. (However, if the system interacts with another particle or field, annihilation can produce a single photon: when a positron annihilates with a bound atomic electron, only one photon may be emitted, because the nuclear Coulomb field breaks translational symmetry.) The energy of the two
photons—or, equivalently, their frequency—may be determined from
conservation of four-momentum. Seen
another way, the photon can be considered as its own antiparticle.
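As a numeric illustration (a sketch, not from the article), electron–positron annihilation at rest yields two photons, each carrying the electron rest energy m_e c², about 511 keV:

```python
# Each photon from e+ e- annihilation at rest carries the electron rest
# energy m_e * c^2; constants are standard CODATA values.
m_e = 9.1093837015e-31      # electron mass, kg
c = 2.99792458e8            # speed of light, m/s
h = 6.62607015e-34          # Planck's constant, J*s
eV = 1.602176634e-19        # joules per electron-volt

E_photon = m_e * c**2               # energy of each photon, J
wavelength = h * c / E_photon       # corresponding gamma-ray wavelength, m
print(E_photon / eV / 1e3)          # ~511 keV
print(wavelength)                   # ~2.4e-12 m (the Compton wavelength)
```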
The reverse process,
pair
production, is the dominant mechanism by which high-energy
photons such as
gamma rays lose energy
while passing through matter. That process is the reverse of
"annihilation to one photon" allowed in the electric field of an
atomic nucleus.
The classical formulae for the energy and momentum of
electromagnetic radiation can be
re-expressed in terms of photon events. For example, the
pressure of electromagnetic radiation on
an object derives from the transfer of photon momentum per unit
time and unit area to that object, since pressure is force per unit
area and force is the change in
momentum
per unit time.
Historical development
In most theories up to the eighteenth century, light was pictured
as being made up of particles. One of the earliest particle
theories was described in the
Book of
Optics (1021) by
Alhazen,
who held
light rays to be streams of
minute particles that "lack all sensible qualities except energy."
Since
particle models cannot easily account
for the
refraction,
diffraction and
birefringence of light, wave theories of light
were proposed by
René Descartes
(1637),
Robert Hooke (1665), and
Christiaan Huygens (1678); however,
particle models remained dominant, chiefly due to the influence of
Isaac Newton. In the early nineteenth
century,
Thomas Young and
Augustin-Jean Fresnel clearly
demonstrated the
interference and
diffraction of light and by 1850 wave models were generally
accepted. In 1865,
James Clerk
Maxwell's
prediction that
light was an electromagnetic wave—which was confirmed
experimentally in 1888 by
Heinrich
Hertz's detection of
radio waves—seemed to
be the final blow to particle models of light.
The
Maxwell wave
theory, however, does not account for
all properties
of light. The Maxwell theory predicts that the energy of a light
wave depends only on its
intensity, not on
its
frequency; nevertheless, several
independent types of experiments show that the energy imparted by
light to atoms depends only on the light's frequency, not on its
intensity. For example,
some chemical
reactions are provoked only by light of frequency higher than a
certain threshold; light of frequency lower than the threshold, no
matter how intense, does not initiate the reaction. Similarly,
electrons can be ejected from a metal plate by shining light of
sufficiently high frequency on it (the
photoelectric effect); the energy of
the ejected electron is related only to the light's frequency, not
to its intensity. (It is to be understood that "no matter how intense" refers to intensities below approximately 10^13 W/cm^2, at which point perturbation theory begins to break down. In the intense regime, which for visible light is above approximately 10^14 W/cm^2, the classical wave description correctly predicts the energy acquired by electrons, called the ponderomotive energy. By comparison, sunlight is only about 0.1 W/cm^2.)
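The frequency threshold of the photoelectric effect can be made concrete with a short sketch (illustrative, not from the article; the work function value is an assumed round number in the range typical of alkali metals). The maximum ejected-electron energy follows Einstein's relation E_max = hν − W:

```python
# Photoelectric effect: ejected-electron energy depends on frequency,
# not intensity. The work function W below is an assumed value.
h_eV = 4.135667696e-15   # Planck's constant in eV*s
c = 2.99792458e8         # speed of light, m/s

def max_kinetic_energy_eV(wavelength_m, work_function_eV):
    """Return max ejected-electron energy in eV, or None below threshold."""
    photon_eV = h_eV * c / wavelength_m
    if photon_eV <= work_function_eV:
        return None   # no emission, no matter how intense the light
    return photon_eV - work_function_eV

W = 2.3  # assumed work function, eV (roughly that of an alkali metal)
print(max_kinetic_energy_eV(400e-9, W))   # violet light: electrons ejected
print(max_kinetic_energy_eV(700e-9, W))   # red light: below threshold
```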
At the same time, investigations of
blackbody radiation carried out over
four decades (1860–1900) by various researchers culminated in
Max Planck's
hypothesis that the energy of
any
system that absorbs or emits electromagnetic radiation of frequency
\nu is an integer multiple of an energy quantum E = h\nu . As shown
by
Albert Einstein, some form of
energy quantization
must be assumed to account for the
thermal equilibrium observed between matter and
electromagnetic radiation; for
this explanation of the
photoelectric effect, Einstein received
the 1921
Nobel Prize in Physics.
Since the Maxwell theory of light allows for all possible energies
of electromagnetic radiation, most physicists assumed initially
that the energy quantization resulted from some unknown constraint
on the matter that absorbs or emits the radiation. In 1905,
Einstein was the first to propose that energy quantization was a
property of electromagnetic radiation itself. Although he accepted
the validity of Maxwell's theory, Einstein pointed out that many
anomalous experiments could be explained if the
energy of
a Maxwellian light wave were localized into pointlike quanta that
move independently of one another, even if the wave itself is
spread continuously over space. In 1909 and 1916, Einstein showed
that, if
Planck's
law of blackbody radiation is accepted, the energy quanta must
also carry
momentum p=h/\lambda, making
them full-fledged
particles.
This photon momentum was observed experimentally by
Arthur Compton, for which he received the
Nobel Prize in 1927. The pivotal
question was then: how to unify Maxwell's wave theory of light with
its experimentally observed particle nature? The answer to this
question occupied
Albert Einstein
for the rest of his life, and was solved in
quantum electrodynamics and its
successor, the
Standard Model (see
Second quantization and
The photon as a gauge
boson, below).
Early objections
Einstein's 1905 predictions were verified experimentally in several
ways in the first two decades of the 20th century, as recounted in
Robert Millikan's Nobel lecture.
However, before
Compton's
experiment (1922), which showed that photons carry momentum proportional to their wave number (or frequency), most
physicists were reluctant to believe that
electromagnetic radiation itself
might be particulate. (See, for example, the Nobel lectures of
Wien,
Planck
and Millikan). Instead, there was a widespread belief that energy quantization resulted from some unknown constraint on the matter that absorbs or emits radiation. Attitudes changed gradually.
In part, the change can be traced to experiments such as
Compton scattering, where it was much
more difficult not to ascribe quantization to light itself to
explain the observed results.
Even after Compton's experiment, Bohr,
Hendrik Kramers and
John Slater made one last attempt to preserve
the Maxwellian continuous electromagnetic field model of light, the
so-called
BKS model. To account for the
thenavailable data, two drastic hypotheses had to be made:
 Energy and momentum are conserved only on the average
in interactions between matter and radiation, not in elementary
processes such as absorption and emission. This allows one
to reconcile the (at the time believed to be) discontinuously
changing energy of the atom (jump between energy states) with the
continuous release of energy into radiation. (It is now known that
this is actually a continuous process, the combined atom-field system evolving in time according to Schrödinger's equation.)
 Causality is abandoned. For example, spontaneous emissions are merely
emissions induced by a "virtual"
electromagnetic field.
However, refined Compton experiments showed that energymomentum is
conserved extraordinarily well in elementary processes; and also
that the jolting of the electron and the generation of a new photon
in
Compton scattering obey
causality to within 10
ps. Accordingly,
Bohr and his coworkers gave their model "as honorable a funeral as
possible". Nevertheless, the failures of the BKS model inspired
Werner Heisenberg in his
development of
matrix
mechanics.
A few physicists persisted in developing semiclassical models in
which
electromagnetic
radiation is not quantized, but matter appears to obey the laws
of
quantum mechanics. Although the
evidence for photons from chemical and physical experiments was
overwhelming by the 1970s, this evidence could not be considered as
absolutely definitive, since it relied on the interaction
of light with matter, a sufficiently complicated theory of matter
could in principle account for the evidence. Nevertheless,
all semiclassical theories were refuted definitively in
the 1970s and 1980s by photoncorrelation experiments. Hence,
Einstein's hypothesis that quantization is a property of light
itself is considered to be proven.
Wave–particle duality and uncertainty principles
Photons, like all quantum objects, exhibit both wave-like and particle-like properties. Their dual wave–particle nature can be difficult to visualize. The photon displays clearly wave-like
phenomena such as
diffraction and
interference on the length scale of its
wavelength. For example, a single photon passing through a
double-slit experiment lands on the screen with an interference pattern, but only if no measurement is made of which slit the photon passed through. In the particle interpretation, this pattern is a probability distribution that nonetheless behaves according to Maxwell's equations. However, experiments confirm that the photon is
not a short pulse of electromagnetic radiation; it does
not spread out as it propagates, nor does it divide when it
encounters a
beam splitter. Rather,
the photon seems to be a
pointlike
particle since it is absorbed or emitted
as a whole by
arbitrarily small systems, systems much smaller than its
wavelength, such as an atomic nucleus (≈10^-15 m across)
or even the pointlike
electron.
Nevertheless, the photon is
not a pointlike particle
whose trajectory is shaped probabilistically by the
electromagnetic field, as conceived by
Einstein and others; that hypothesis
was also refuted by the photoncorrelation experiments cited above.
According to our present understanding, the electromagnetic field
itself is produced by photons, which in turn result from a local
gauge symmetry and the laws of
quantum field theory (see the
Second quantization and
Gauge boson
sections below).
A key element of
quantum mechanics
is
Heisenberg's uncertainty principle, which forbids
the simultaneous measurement of the position and momentum of a
particle along the same direction. Remarkably, the uncertainty
principle for charged, material particles
requires the
quantization of light into photons, and even the frequency
dependence of the photon's energy and momentum. An elegant
illustration is Heisenberg's
thought
experiment for locating an electron with an ideal microscope.
The position of the electron can be determined to within the
resolving power of the
microscope, which is given by a formula from classical
optics
\Delta x \sim \frac{\lambda}{\sin \theta}
where \theta is the
aperture angle
of the microscope. Thus, the position uncertainty \Delta x can be
made arbitrarily small by reducing the wavelength \lambda. The
momentum of the electron is uncertain, since it received a "kick"
\Delta p from the light scattering from it into the microscope. If
light were
not quantized into photons, the uncertainty
\Delta p could be made arbitrarily small by reducing the light's
intensity. In that case, since the wavelength and intensity of
light can be varied independently, one could simultaneously
determine the position and momentum to arbitrarily high accuracy,
violating the
uncertainty
principle. By contrast, Einstein's formula for photon momentum
preserves the uncertainty principle; since the photon is scattered
anywhere within the aperture, the uncertainty of momentum
transferred equals \Delta p \sim p_{\mathrm{photon}} \sin\theta = \frac{h}{\lambda} \sin\theta,
giving the product \Delta x \Delta p \, \sim \, h, which is
Heisenberg's uncertainty principle. Thus, the entire world is
quantized; both matter and fields must obey a consistent set of
quantum laws, if either one is to be quantized.
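The cancellation at the heart of this argument, that λ and θ drop out of the product Δx·Δp, can be verified numerically. This is a sketch under the article's order-of-magnitude formulas; the parameter values are arbitrary:

```python
# Heisenberg's microscope: dx ~ lambda/sin(theta), dp ~ (h/lambda)*sin(theta).
# Their product is ~h regardless of the wavelength and aperture chosen.
import math

h = 6.62607015e-34   # Planck's constant, J*s

def uncertainty_product(wavelength_m, aperture_angle_rad):
    dx = wavelength_m / math.sin(aperture_angle_rad)          # position blur
    dp = (h / wavelength_m) * math.sin(aperture_angle_rad)    # photon kick
    return dx * dp

# lambda and theta cancel out of the product:
for lam, theta in [(500e-9, 0.3), (1e-12, 1.0)]:
    assert abs(uncertainty_product(lam, theta) - h) < 1e-40
```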
The analogous uncertainty principle for photons forbids the
simultaneous measurement of the number n of photons (see
Fock state and the
Second quantization section
below) in an electromagnetic wave and the phase \phi of that
wave
\Delta n \Delta \phi > 1
See
coherent state and
squeezed coherent state for more
details.
Both photons and material particles such as electrons create
analogous
interference patterns when
passing through a
double-slit
experiment. For photons, this corresponds to the interference
of a
Maxwell light
wave whereas, for material particles, this corresponds to the
interference of the
Schrödinger wave equation.
Although this similarity might suggest that
Maxwell's equations are simply
Schrödinger's equation for photons, most physicists do not agree.
For one thing, they are mathematically different; most obviously,
Schrödinger's one equation solves for a
complex field,
whereas Maxwell's four equations solve for
real fields. More generally, the normal concept
of a Schrödinger
probability
wave function cannot be applied to
photons. Being massless, they cannot be localized without being
destroyed; technically, photons cannot have a position eigenstate
|\mathbf{r}\rangle, and, thus, the normal Heisenberg uncertainty principle \Delta x \Delta p > \hbar/2 does not pertain to photons. A
few substitute wave functions have been suggested for the photon,
but they have not come into general use. Instead, physicists
generally accept the second-quantized theory of photons described
below,
quantum
electrodynamics, in which photons are quantized excitations of
electromagnetic modes.
Bose–Einstein model of a photon gas
In 1924,
Satyendra Nath Bose
derived
Planck's
law of blackbody radiation without using any electromagnetism,
but rather a modification of coarse-grained counting of
phase space. Einstein showed that this
modification is equivalent to assuming that photons are rigorously
identical and that it implied a "mysterious nonlocal interaction",
now understood as the requirement for a
symmetric quantum mechanical state. This
work led to the concept of
coherent
states and the development of the laser. In the same papers,
Einstein extended Bose's formalism to material particles (
bosons) and predicted that they would condense into
their lowest quantum state at low enough temperatures; this
Bose–Einstein
condensation was observed experimentally in 1995.
The modern view on this is that photons are, by virtue of their
integer spin,
bosons (as opposed to
fermions with half-integer spin). By the
spinstatistics theorem, all bosons
obey Bose–Einstein statistics (whereas all fermions obey
Fermi–Dirac statistics).
Stimulated and spontaneous emission
In 1916, Einstein showed that Planck's radiation law could be
derived from a semiclassical, statistical treatment of photons and
atoms, which implies a relation between the rates at which atoms
emit and absorb photons. The condition follows from the assumption
that light is emitted and absorbed by atoms independently, and that
the thermal equilibrium is preserved by interaction with atoms.
Consider a cavity in
thermal
equilibrium and filled with
electromagnetic radiation and
atoms that can emit and absorb that radiation. Thermal equilibrium
requires that the energy density \rho(\nu) of photons with
frequency \nu (which is proportional to their
number density) is, on average, constant in
time; hence, the rate at which photons of any particular frequency
are
emitted must equal the rate of
absorbing
them.
Einstein began by postulating simple proportionality relations for
the different reaction rates involved. In his model, the rate
R_{ji} for a system to
absorb a photon of frequency \nu
and transition from a lower energy E_{j} to a higher energy E_{i}
is proportional to the number N_{j} of atoms with energy E_{j} and
to the energy density \rho(\nu) of ambient photons with that
frequency,
R_{ji} = N_{j} B_{ji} \rho(\nu) \!
where B_{ji} is the
rate constant for
absorption. For the reverse process, there are two possibilities:
spontaneous emission of a photon, and a return to the lowerenergy
state that is initiated by the interaction with a passing photon.
Following Einstein's approach, the corresponding rate R_{ij} for
the emission of photons of frequency \nu and transition from a
higher energy E_{i} to a lower energy E_{j} is
R_{ij} = N_{i} A_{ij} + N_{i} B_{ij} \rho(\nu) \!
where A_{ij} is the rate constant for
emitting a photon spontaneously, and
B_{ij} is the rate constant for emitting it in response to ambient
photons (
induced or stimulated
emission). In thermodynamic equilibrium, the number of atoms in
state i and that of atoms in state j must, on average, be constant;
hence, the rates R_{ji} and R_{ij} must be equal. Also, by
arguments analogous to the derivation of
Boltzmann statistics, the ratio of
N_{i} and N_{j} is \frac{g_i}{g_j}\exp\left(\frac{E_j - E_i}{kT}\right), where g_{i,j} are
the
degeneracy of the state i and that of
j, respectively, E_{i,j} their energies, k the
Boltzmann constant and T the system's
temperature. From this, it is readily
derived that g_i B_{ij} = g_j B_{ji} and A_{ij} = \frac{8 \pi h \nu^{3}}{c^{3}} B_{ij}. The A's and B's are collectively known as the Einstein coefficients.
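The detailed-balance argument can be replayed numerically. The sketch below is not from the article; it assumes equal degeneracies g_i = g_j and an arbitrary B (which cancels), solves R_ji = R_ij for ρ(ν), and checks that the result is Planck's law:

```python
import math

h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck_density(nu, T):
    """Planck spectral energy density, J*s/m^3."""
    return (8 * math.pi * h * nu**3 / c**3) / (math.exp(h * nu / (k * T)) - 1)

def rho_from_balance(nu, T):
    """Solve N_j * B * rho = N_i * (A + B * rho) for rho, with g_i = g_j."""
    B = 1.0                                    # arbitrary; cancels out
    A = (8 * math.pi * h * nu**3 / c**3) * B   # Einstein's A/B relation
    ratio = math.exp(h * nu / (k * T))         # N_j / N_i from Boltzmann
    return A / (B * ratio - B)

nu, T = 5.0e14, 3000.0   # illustrative frequency and temperature
assert abs(planck_density(nu, T) - rho_from_balance(nu, T)) < 1e-30
```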
Einstein could not fully justify his rate equations, but claimed
that it should be possible to calculate the coefficients A_{ij},
B_{ji} and B_{ij} once physicists had obtained "mechanics and
electrodynamics modified to accommodate the quantum hypothesis". In
fact, in 1926,
Paul Dirac derived the
B_{ij} rate constants using a semiclassical approach, and, in
1927, succeeded in deriving
all the rate constants from
first principles within the framework of quantum theory. Dirac's
work was the foundation of quantum electrodynamics, i.e., the
quantization of the electromagnetic field itself. Dirac's approach
is also called
second quantization or
quantum field theory; earlier quantum
mechanical treatments only treat material particles as quantum
mechanical, not the electromagnetic field.
Einstein was troubled by the fact that his theory seemed
incomplete, since it did not determine the
direction of a
spontaneously emitted photon. A probabilistic nature of
lightparticle motion was first considered by
Newton in his treatment of
birefringence and, more generally, of the
splitting of light beams at interfaces into a transmitted beam and
a reflected beam. Newton hypothesized that hidden variables in the
light particle determined which path it would follow. Similarly,
Einstein hoped for a more complete theory that would leave nothing
to chance, beginning his separation from quantum mechanics.
Ironically,
Max Born's
probabilistic interpretation of the
wave function was inspired by
Einstein's later work searching for a more complete theory.
Second quantization
In 1910,
Peter Debye derived
Planck's law of blackbody
radiation from a relatively simple assumption. He correctly
decomposed the electromagnetic field in a cavity into its
Fourier modes, and assumed that the energy in
any mode was an integer multiple of h\nu, where \nu is the
frequency of the electromagnetic mode. Planck's law of blackbody
radiation follows immediately as a geometric sum. However, Debye's
approach failed to give the correct formula for the energy
fluctuations of blackbody radiation, which were derived by Einstein
in 1909.
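Debye's step, that each mode of frequency ν can only hold energies nhν, can be sketched numerically (illustrative values; not from the article). The Boltzmann-weighted mean energy of such a mode is a ratio of geometric sums, hν/(e^{hν/kT} − 1), rather than the classical kT:

```python
import math

h = 6.62607015e-34   # Planck's constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K

def mean_mode_energy(nu, T, n_max=2000):
    """Boltzmann-weighted mean energy of a mode restricted to E = n*h*nu."""
    x = h * nu / (k * T)
    weights = [math.exp(-n * x) for n in range(n_max)]
    energies = [n * h * nu for n in range(n_max)]
    Z = sum(weights)   # partition function (truncated geometric sum)
    return sum(w * E for w, E in zip(weights, energies)) / Z

nu, T = 5.0e14, 3000.0
closed_form = h * nu / (math.exp(h * nu / (k * T)) - 1)
assert abs(mean_mode_energy(nu, T) - closed_form) < 1e-28
```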
In 1925,
Born,
Heisenberg and
Jordan reinterpreted Debye's concept in a key
way. As may be shown classically, the
Fourier modes of the
electromagnetic field—a
complete set of electromagnetic plane waves indexed by their wave
vector
k and polarization state—are equivalent to
a set of uncoupled
simple
harmonic oscillators. Treated quantum mechanically, the energy
levels of such oscillators are known to be E = nh\nu, where \nu is
the oscillator frequency. The key new step was to identify an
electromagnetic mode with energy E = nh\nu as a state with n
photons, each of energy h\nu. This approach gives the correct
energy fluctuation formula.
Dirac took this one step further. He
treated the interaction between a charge and an electromagnetic
field as a small perturbation that induces transitions in the
photon states, changing the numbers of photons in the modes, while
conserving energy and momentum overall. Dirac was able to derive
Einstein's A_{ij} and B_{ij} coefficients from first principles,
and showed that the Bose–Einstein statistics of photons is a
natural consequence of quantizing the electromagnetic field
correctly (Bose's reasoning went in the opposite direction; he
derived
Planck's
law of black body radiation by
assuming BE
statistics). In Dirac's time, it was not yet known that all bosons,
including photons, must obey BE statistics.
Dirac's second-order
perturbation theory
can involve
virtual photons,
transient intermediate states of the electromagnetic field; the
static
electric and
magnetic interactions are mediated by such virtual
photons. In such
quantum field
theories, the
probability
amplitude of observable events is calculated by summing over
all possible intermediate steps, even ones that are
unphysical; hence, virtual photons are not constrained to satisfy E
= pc, and may have extra
polarization
states; depending on the
gauge used,
virtual photons may have three or four polarization states, instead
of the two states of real photons. Although these transient virtual
photons can never be observed, they contribute measurably to the
probabilities of observable events. Indeed, such second-order and higher-order perturbation calculations can give apparently
infinite contributions to the sum. Such unphysical
results are corrected for using the technique of
renormalization. Other virtual particles may
contribute to the summation as well; for example, two photons may
interact indirectly through virtual
electron–positron pairs. In fact, such photon–photon scattering, as well as electron–photon scattering, is meant to be one of the modes of operation of the planned particle accelerator,
the
International Linear
Collider.
In modern physics notation, the
quantum
state of the electromagnetic field is written as a
Fock state, a
tensor
product of the states for each electromagnetic mode

|n_{k_0}\rangle \otimes |n_{k_1}\rangle \otimes \dots \otimes |n_{k_n}\rangle \dots
where |n_{k_i}\rangle represents the state in which n_{k_i} photons are in the mode k_i. In this notation, the creation of a new photon in mode k_i (e.g., emitted from an atomic transition) is written as |n_{k_i}\rangle \rightarrow |n_{k_i}+1\rangle. This
notation merely expresses the concept of Born, Heisenberg and
Jordan described above, and does not add any physics.
The photon as a gauge boson
The electromagnetic field can be understood as a
gauge theory, i.e., as a field that results
from requiring that symmetry hold independently at every position
in
spacetime. For the
electromagnetic field, this gauge
symmetry is the
Abelian U(1) symmetry of a
complex number, which reflects the ability to
vary the
phase of a complex number
without affecting
observables or real-valued functions made from it, such
as the
energy or the
Lagrangian.
The quanta of an
Abelian gauge field
must be massless, uncharged bosons, as long as the symmetry is not
broken; hence, the photon is predicted to be massless, and to have
zero
electric charge and integer
spin. The particular form of the
electromagnetic interaction
specifies that the photon must have
spin ±1; thus, its
helicity must be \pm \hbar.
These two spin components correspond to the classical concepts of
right-handed and left-handed
circularly polarized light. However, the transient
virtual photons of
quantum electrodynamics may also
adopt unphysical polarization states.
In the prevailing
Standard Model of
physics, the photon is one of four
gauge
bosons in the
electroweak
interaction; the
other three are
denoted W^{+}, W^{−} and Z^{0} and are
responsible for the
weak
interaction. Unlike the photon, these gauge bosons have
invariant mass, owing to a
mechanism that breaks their
SU(2) gauge symmetry. The unification of
the photon with W and Z gauge bosons in the electroweak interaction
was accomplished by
Sheldon Glashow,
Abdus Salam and
Steven Weinberg, for which they were awarded
the 1979
Nobel Prize in Physics.
Physicists continue to hypothesize
grand unified theories that connect
these four
gauge bosons with the eight
gluon gauge bosons of
quantum chromodynamics; however, key
predictions of these theories, such as
proton decay, have not been observed
experimentally.
Contributions to the mass of a system
The energy of a system that emits a photon is
decreased by
the energy E of the photon as measured in the rest frame of the
emitting system, which may result in a reduction in mass in the
amount {E}/{c^2}. Similarly, the mass of a system that absorbs a
photon is
increased by a corresponding amount. As an
application, the energy balance of nuclear reactions involving
photons is commonly written in terms of the masses of the nuclei
involved, and terms of the form {E}/{c^2} for the gamma photons
(and for other relevant energies, such as the recoil energy of
nuclei).
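The E/c² bookkeeping is easy to make concrete. The sketch below (not from the article; the 1 MeV photon energy is an illustrative figure typical of nuclear gamma transitions) computes the mass lost by an emitter in its rest frame:

```python
# A system emitting a photon of energy E loses mass E/c^2 in its rest frame.
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electron-volt
u = 1.66053906660e-27   # atomic mass unit, kg

E_gamma = 1.0e6 * eV            # assumed 1 MeV gamma photon
delta_m = E_gamma / c**2        # mass lost by the emitter, kg
print(delta_m / u)              # roughly 1.07e-3 atomic mass units
```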
This concept is applied in key predictions of
quantum electrodynamics (QED, see
above). In that theory, the mass of electrons (or, more generally,
leptons) is modified by including the mass contributions of virtual
photons, in a technique known as
renormalization. Such "radiative
corrections" contribute to a number of predictions of QED, such as
the
magnetic dipole
moment of
leptons, the
Lamb shift, and the
hyperfine structure of bound lepton
pairs, such as
muonium and
positronium.
Since photons contribute to the
stress–energy tensor, they exert a
gravitational attraction on other objects,
according to the theory of
general
relativity. Conversely, photons are themselves affected by
gravity; their normally straight trajectories may be bent by warped
spacetime, as in
gravitational lensing, and
their frequencies may be lowered by
moving to a higher
gravitational
potential, as in the
Pound–Rebka experiment. However,
these effects are not specific to photons; exactly the same effects
would be predicted for classical
electromagnetic waves.
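The size of the Pound–Rebka effect can be estimated from the weak-field formula Δν/ν ≈ gh/c²; the 22.5 m height below is the standard figure for that experiment's tower, and g is taken at its usual surface value.

```python
# Fractional gravitational redshift for light climbing a height h:
# Δν/ν ≈ g·h / c² (weak-field approximation).

G_ACCEL = 9.81          # Earth surface gravity, m/s²
C = 299_792_458.0       # speed of light in vacuum, m/s

def redshift_fraction(height_m: float) -> float:
    """Fractional frequency shift Δν/ν for light rising by height_m."""
    return G_ACCEL * height_m / C**2

print(f"{redshift_fraction(22.5):.2e}")  # → 2.46e-15
```

A shift of a few parts in 10¹⁵ is why the experiment needed the extremely narrow Mössbauer gamma line to resolve it.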
Photons in matter
(Visible) light that travels through transparent matter does so at
a lower speed than
c, the speed of light in a vacuum.
X-rays, on the other hand, usually have a phase velocity above c,
as evidenced by
total external
reflection. In addition, light can also undergo
scattering and
absorption. There are
circumstances in which heat transfer through a material is mostly
radiative, involving emission and absorption of photons within it.
An example is the core of the Sun, where energy can take about a
million years to reach the surface. However, this phenomenon is
distinct from scattered radiation passing diffusely through matter,
as it involves local equilibration between the radiation and the
temperature; the time thus describes how long the energy takes to
be transferred, not the photons themselves. Once in open space, a photon
from the Sun takes only 8.3 minutes to reach Earth. The factor by
which the speed of light is decreased in a material is called the
refractive index of the material.
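The relation v = c/n and the quoted 8.3-minute Sun-to-Earth travel time can be checked directly; the refractive index of water used below is a standard textbook figure, not a value from the text.

```python
# Light in a medium of refractive index n travels at v = c/n;
# in vacuum it crosses 1 AU (Sun to Earth) in about 8.3 minutes.

C = 299_792_458.0        # speed of light in vacuum, m/s
AU = 1.495_978_707e11    # mean Sun–Earth distance, m

def speed_in_medium(n: float) -> float:
    """Speed of light in a medium of refractive index n."""
    return C / n

travel_minutes = AU / C / 60     # vacuum travel time, Sun to Earth
v_water = speed_in_medium(1.33)  # water, n ≈ 1.33
print(f"{travel_minutes:.1f} min")  # → 8.3 min
```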
In a classical wave picture, the slowing can be explained by the
light inducing
electric
polarization in the matter, the polarized matter radiating new
light, and the new light interfering with the original light wave
to form a delayed wave. In a particle picture, the slowing can
instead be described as a blending of the photon with quantum
excitations of the matter (
quasiparticles such as
phonons and
excitons) to form
a
polariton; this polariton has a nonzero
effective mass, which means that it
cannot travel at
c.
Alternatively, photons may be viewed as
always traveling
at
c, even in matter, but they have their phase shifted
(delayed or advanced) upon interaction with atomic scatterers: this
modifies their wavelength and momentum, but not speed. A light wave
made up of these photons does travel slower than the speed of
light. In this view the photons are "bare", and are scattered and
phase shifted, while in the view of the preceding paragraph the
photons are "dressed" by their interaction with matter, and move
without scattering or phase shifting, but at a lower speed.
Light of different frequencies may travel through matter at
different speeds; this is
called
dispersion. In some
cases, it can result in
extremely slow speeds
of light in matter. The effects of photon interactions with
other quasiparticles may be observed directly in
Raman scattering and
Brillouin scattering.
Photons can also be
absorbed by nuclei,
atoms or molecules, provoking transitions between their
energy levels. A classic example is the
molecular transition of
retinal
C_{20}H_{28}O, which is responsible for
vision, as discovered in 1958 by Nobel
laureate
biochemist George Wald and co-workers. The absorption
provokes a
cis–trans isomerization that, in combination with other
such transitions, is transduced into nerve impulses. The absorption
of photons can even break chemical bonds, as in the
photodissociation of
chlorine; this is the subject of
photochemistry. Analogously,
gamma rays can in some circumstances dissociate
atomic nuclei in a process called
photodisintegration.
Technological applications
Photons have many applications in technology. These examples are
chosen to illustrate applications of photons
per se,
rather than general optical devices such as lenses, etc. that could
operate under a classical theory of light. The laser is an
extremely important application and is discussed above under
stimulated emission.
Individual photons can be detected by several methods. The classic
photomultiplier tube exploits the
photoelectric effect: a photon
landing on a metal plate ejects an electron, initiating an
ever-amplifying avalanche of electrons.
Charge-coupled device chips use a
similar effect in
semiconductors: an
incident photon generates a charge on a microscopic
capacitor that can be detected. Other detectors
such as
Geiger counters use the
ability of photons to
ionize gas molecules,
causing a detectable change in
conductivity.
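The photoelectric step behind the photomultiplier follows Einstein's relation KE = hν − φ, where φ is the work function of the plate. A sketch, assuming a cesium photocathode (its ≈ 2.14 eV work function is a tabulated value, not from the text):

```python
# Max kinetic energy of a photoelectron: KE = hc/λ − φ.
# A result ≤ 0 means the photon cannot eject an electron.

H = 6.626_070_15e-34    # Planck constant, J·s
C = 299_792_458.0       # speed of light, m/s
EV = 1.602_176_634e-19  # 1 eV in joules

def ejected_electron_ev(wavelength_nm: float, work_function_ev: float) -> float:
    """Maximum photoelectron energy (eV) for given wavelength and work function."""
    return H * C / (wavelength_nm * 1e-9) / EV - work_function_ev

ke = ejected_electron_ev(400.0, 2.14)  # 400 nm light on a Cs-like photocathode
print(f"{ke:.2f} eV")                  # → 0.96 eV
```

Note the threshold behavior: for red light around 700 nm the same expression goes negative, so no electron is ejected no matter how intense the beam, which is exactly the frequency dependence the photon model explains.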
Planck's energy formula E = hν is often used by engineers and
chemists in design, both to compute the change in energy resulting
from a photon absorption and to predict the frequency of the light
emitted for a given energy transition. For example, the
emission spectrum of a
fluorescent light bulb can be designed
using gas molecules with different electronic energy levels and
adjusting the typical energy with which an electron hits the gas
molecules within the bulb.
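A minimal sketch of this design calculation, converting a wavelength into a photon energy via E = hν = hc/λ (the 532 nm example wavelength is illustrative, not from the text):

```python
# Photon energy from vacuum wavelength: E = hν = hc/λ.

H = 6.626_070_15e-34    # Planck constant, J·s
C = 299_792_458.0       # speed of light, m/s
EV = 1.602_176_634e-19  # 1 eV in joules

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given vacuum wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

print(f"{photon_energy_ev(532):.2f} eV")  # green light → 2.33 eV
```

Run in reverse, the same relation gives the wavelength emitted when a gas molecule drops between two electronic levels of known separation.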
Under some conditions, an energy transition can be excited by "two"
photons that individually would be insufficient. This allows for
higher resolution microscopy, because the sample absorbs energy
only in the region where two beams of different colors overlap
significantly, which can be made much smaller than the excitation
volume of a single beam (see
two-photon excitation
microscopy). Moreover, these photons cause less damage to the
sample, since they are of lower energy.
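The energy bookkeeping behind two-photon excitation is simple: each photon carries half the transition energy, and therefore twice the wavelength. A sketch, where the 480 nm one-photon transition is an assumed example value:

```python
# Two photons at twice the wavelength jointly supply the energy
# of one photon at the original wavelength: 2 · hc/(2λ) = hc/λ.

H = 6.626_070_15e-34    # Planck constant, J·s
C = 299_792_458.0       # speed of light, m/s

def photon_energy_j(wavelength_nm: float) -> float:
    """Photon energy in joules, E = hc/λ."""
    return H * C / (wavelength_nm * 1e-9)

one_photon_nm = 480.0              # hypothetical one-photon transition
two_photon_nm = 2 * one_photon_nm  # 960 nm: each photon has half the energy
assert abs(2 * photon_energy_j(two_photon_nm)
           - photon_energy_j(one_photon_nm)) < 1e-25
```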
In some cases, two energy transitions can be coupled so that, as
one system absorbs a photon, another nearby system "steals" its
energy and re-emits a photon of a different frequency. This is the
basis of
fluorescence resonance
energy transfer, a technique that is used in
molecular biology to study the interaction
of suitable
proteins.
Several different kinds of
hardware random number
generator involve the detection of single photons. In one
example, for each bit in the random sequence that is to be
produced, a photon is sent to a
beamsplitter. In such a situation, there are
two possible outcomes of equal probability. The actual outcome is
used to determine whether the next bit in the sequence is "0" or
"1".
Recent research
Much research has been devoted to applications of photons in the
field of
quantum optics. Photons seem
well-suited to be elements of an extremely fast
quantum computer, and the
quantum entanglement of photons is a
focus of research.
Nonlinear optical
processes are another active research area, with topics such as
two-photon absorption,
self-phase modulation,
modulational instability
and
optical parametric
oscillators. However, such processes generally do not require
the assumption of photons
per se; they may often be
modeled by treating atoms as nonlinear oscillators. The nonlinear
process of
spontaneous parametric
down-conversion is often used to produce single-photon states.
Finally, photons are essential in some aspects of
optical communication, especially for
quantum cryptography.
See also
Notes
References
 . A partial English
translation is available from Wikisource.
 Role as gauge boson and polarization section 5.1 in
 See p.31 in .
 See section 1.6 in
 Electromagnetic radiation is made of
photons
 This property was experimentally verified by Raman and
Bhagavantam in 1931.
 E.g. section 1.3.3.2 in
 E.g. section 9.3 in
 E.g. Appendix XXXII in
 . An English translation is available from Project
Gutenberg
 This article followed a presentation by Maxwell on 8 December
1864 to the Royal Society.
 Frequency dependence of luminescence p. 276f., photoelectric
effect section 1.4 in
 Presentation speech by Svante Arrhenius for the 1921 Nobel Prize
in Physics, December 10, 1922. Online text from nobelprize.org, The Nobel
Foundation 2008. Accessed 2008-12-05.
 . An
English translation is available from Wikisource.
 Also Physikalische Zeitschrift, 18,
121–128 (1917).
 Also Zeitschrift für Physik,
24, 69 (1924).
 These experiments produce results that cannot be explained by
any classical theory of light, since they involve anticorrelations
that result from the quantum measurement
process. In 1974, the first such experiment was carried out by
Clauser, who reported a violation of a classical Cauchy–Schwarz inequality.
In 1977, Kimble et al. demonstrated an analogous
antibunching effect of photons interacting with a beam splitter;
this approach was simplified and sources of error eliminated in the
photon-anticorrelation experiment of Grangier et al.
(1986). This work is reviewed and simplified further in Thorn
et al. (2004). (These references are listed below under
Additional references.)
 B. E. A. Saleh and M. C. Teich, Fundamentals of
Photonics (Wiley, 1991)
 E.g. p. 10f. in .
 Section 1.4 in .
 P. 322 in :
 Specifically, Born claimed to have been inspired by Einstein's
never-published attempts to develop a "ghost-field" theory, in
which point-like photons are guided probabilistically by ghost
fields that follow Maxwell's equations.
 Photon–photon scattering section 7-3-1, renormalization chapter
8-2 in
 .
 Sheldon Glashow Nobel lecture, delivered 8
December 1979.
 Abdus Salam Nobel lecture, delivered 8 December
1979.
 Steven Weinberg Nobel lecture, delivered 8
December 1979.
 E.g. chapter 14 in
 E.g. section 10.1 in
 Radiative correction to electron mass section 7-1-2, anomalous
magnetic moments section 7-2-1, Lamb shift section 7-3-2 and
hyperfine splitting in positronium section 10-3 in
 E. g. sections 9.1 (gravitational contribution of photons) and
10.5 (influence of gravity on light) in
 Ch 4 in
 Polaritons section 10.10.1, Raman and Brillouin scattering
section 10.11.3 in
 E.g. section 11-5 C in
 Nobel lecture given by G. Wald on December 12, 1967, online at
nobelprize.org: The Molecular Basis of Visual Excitation.
 Photomultiplier section 1.1.10, CCDs section 1.1.8, Geiger
counters section 1.3.2.1 in
 An example is U.S. Patent No. 5,212,709.
 Introductorylevel material on the various subfields of
quantum optics can be found in
Additional references
By date of publication:
 Special supplemental issue of Optics and Photonics
News (vol. 14, October 2003) article web link
Education with single photons: