Question: If the energy can change only by quanta of hf, calculate the actual change in amplitude when the energy changes by one quantum. Why does the decay or increase of the oscillatory motion of springs, pendulums, and the like generally seem to us continuous when it is really quantized? (Holton & Brush, 2001).
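The arithmetic behind the question is worth setting out before turning to Feynman. For a mass on a spring, E = ½mω²A², so a change of one quantum hf alters the amplitude by approximately ΔA = hf/(mω²A). The following is a minimal numerical sketch, assuming a hypothetical laboratory oscillator (1 kg mass, 1 Hz frequency, 0.1 m amplitude; values chosen only for illustration):

```python
import math

# Hedged illustration: a hypothetical 1 kg mass oscillating at 1 Hz with 0.1 m amplitude.
h = 6.626e-34            # Planck's constant (J s)
m, f, A = 1.0, 1.0, 0.1  # mass (kg), frequency (Hz), amplitude (m) -- illustration values

omega = 2 * math.pi * f
E = 0.5 * m * omega**2 * A**2   # oscillator energy, E = (1/2) m w^2 A^2
dE = h * f                      # one quantum of energy, hf
dA = dE / (m * omega**2 * A)    # from dE = m w^2 A dA

print(f"E  = {E:.3e} J")        # about 2e-1 J
print(f"hf = {dE:.3e} J")       # about 7e-34 J
print(f"dA = {dA:.3e} m")       # about 2e-34 m, far too small to observe
```

Because the amplitude changes by only about 10⁻³⁴ m per quantum, the decay or growth of a spring or pendulum appears perfectly continuous to us.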
This question is about the nature of energy, which may be described as continuous or discrete. Currently, many physics textbook authors state that energy is ultimately composed of indivisible (or irreducible) tiny lumps. For example, some authors elaborate that particles can have only certain amounts of energy, such as e, 2e, 3e, and so on, but cannot have 1.5e or any other fractional multiple of e. Furthermore, some physicists consider the nature of physical quantities such as space and time to be fundamentally discrete rather than continuous. Thus, physics students may form the incorrect idea that energy must be discrete rather than continuous.
Historically speaking, Planck is sometimes known as the reluctant father of quantum theory. For instance, Kragh (1999) writes that “Planck did not really understand the introduction of energy elements as a quantization of energy, i.e., that the energy of the oscillators can attain only discrete values” (p. 62). However, Planck was rightly cautious about the nature of energy. It is possible for the deceleration of “free” electrons to result in the emission of a continuous spectrum of electromagnetic radiation; in 1909, Sommerfeld coined the term Bremsstrahlung, which means “braking radiation,” for this process.
How would Feynman answer?
Feynman would answer that the nature of energy can be discrete or continuous. We should understand his position from the perspective of “bound/unbound state,” “energy band,” and “definition of energy.”
1. Bound/unbound state: No physical law states that energy must always come in definite quanta or amounts. Feynman mentions that “[y]ou may have heard that photons come out in blobs and that the energy of a photon is Planck’s constant times the frequency. That is true, but since the frequency of light can be anything, there is no law that says that energy has to be a certain definite amount” (Feynman et al., 1963, section 4–4 Other forms of energy). It is simply not true that the energy of a photon must come in certain discrete lumps. Technically speaking, the energy of electromagnetic radiation emitted by atoms is quantized because electrons bound in atoms have discrete energy levels. Therefore, the energy emitted by the atoms is related to the difference or spacing between these energy levels, and it is numerically equal to E = hf.
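As a small illustrative sketch of these two points, one can compare E = hf for arbitrary frequencies with the spacing of discrete atomic levels. The hydrogen-like formula E_n = −13.6 eV/n² used below is a standard textbook value, not something taken from Feynman’s passage:

```python
h = 6.626e-34    # Planck's constant (J s)
eV = 1.602e-19   # joules per electron-volt

# The frequency of light can be anything, so E = hf can take any value:
for f in (4.0e14, 4.3e14, 5.55e14):   # arbitrary sample frequencies (Hz)
    print(f"f = {f:.2e} Hz  ->  E = hf = {h * f / eV:.3f} eV")

# By contrast, an electron bound in a hydrogen atom has discrete levels
# (textbook value E_n = -13.6 eV / n^2), and the emitted photon carries the spacing:
def E_n(n):
    return -13.6 / n**2   # level energy in eV

spacing = E_n(2) - E_n(1)   # about 10.2 eV
print(f"E2 - E1 = {spacing:.2f} eV, emitted photon frequency f = {spacing * eV / h:.3e} Hz")
```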
Importantly, Feynman provides an excellent analogy for the discreteness of energy: “if sound is confined to an organ pipe or anything like that, then there is more than one way that the sound can vibrate, but for each such way there is a definite frequency. Thus, an object in which the waves are confined has certain resonance frequencies. It is, therefore, a property of waves in a confined space—a subject which we will discuss in detail with formulas later on—that they exist only at definite frequencies. And since the general relation exists between frequencies of the amplitude and energy, we are not surprised to find definite energies associated with electrons bound in atoms” (Feynman et al., 1963, section 38–5 Energy levels). In short, the resonant frequencies of confined sound waves are analogous to the definite energy levels of bound electrons.
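A quick sketch of the organ-pipe analogy, assuming an idealized pipe open at both ends, whose resonance frequencies are f_n = n·v/(2L); the pipe length and sound speed below are hypothetical illustration values:

```python
v = 343.0   # speed of sound in air (m/s)
L = 0.5     # pipe length (m), illustration value

# Only these definite frequencies resonate in the confined pipe,
# just as a bound electron has only certain definite energies.
for n in range(1, 5):
    print(f"mode n = {n}: f = {n * v / (2 * L):.0f} Hz")
```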
In general, a free particle that is not bound to an atom can have continuous (or unrestricted) energies. For instance, “[w]hen the electron is free, i.e., when its energy is positive, it can have any energy; it can be moving at any speed. But bound energies are not arbitrary” (Feynman et al., 1963, section 38–5 Energy levels). In other words, “[i]f the energy E is above the top of the potential well, then there are no longer any discrete solutions, and any possible energy is permitted. Such solutions correspond to the scattering of free particles by a potential well” (Feynman et al., 1966, section 16–6 Quantized energy levels). To summarize, bound particles have discrete energy levels, whereas free particles have continuous energy levels (see Fig. 1).
Fig. 1: Energy-level diagram contrasting the discrete levels of bound particles with the continuous energies of free particles.
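To make Fig. 1 concrete, here is a sketch based on the idealized infinite square well (a standard textbook model rather than anything in Feynman’s passage). A confined electron has only the levels E_n = n²h²/(8mL²), whereas a free electron’s kinetic energy ½mv² can take any value:

```python
h  = 6.626e-34   # Planck's constant (J s)
me = 9.109e-31   # electron mass (kg)
eV = 1.602e-19   # joules per electron-volt
L  = 1.0e-9      # well width (m), a hypothetical illustration value

# Bound: only discrete levels are allowed.
for n in range(1, 4):
    E = n**2 * h**2 / (8 * me * L**2)
    print(f"bound level n = {n}: E = {E / eV:.3f} eV")

# Free: any speed, hence any positive energy, is allowed.
v = 1.234e5      # an arbitrary speed (m/s)
print(f"free electron at v = {v:.3e} m/s: E = {0.5 * me * v**2 / eV:.4f} eV")
```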
2. Energy band: In the band theory of solids, an energy band consists of a large number of energy levels that are very close together. In the Feynman Lectures on Computation, the theory is summarized as follows: “this theory predicts that the possible physical states that can be occupied by electrons within a material are arranged into a series of (effectively continuous) strata called ‘bands,’ each characterized by a specific range of energies for the allowed electron energy levels within it” (Feynman et al., 1998, p. 213). Loosely speaking, textbook authors may describe the energy levels of electrons in a semiconductor as essentially continuous. With current instruments and techniques, however, we can empirically determine the energy gap between the conduction band and the valence band, but not the discreteness of the energy levels within a band.
Theoretically speaking, physicists may deduce the energy gaps and energy bands (continuous energy levels) by using the Kronig–Penney (one-dimensional lattice) model. For example, Feynman explains that “you can see from the figure, the energy can go from (E0 − 2A) at k = 0 to (E0 + 2A) at k = ±π/b. The graph is plotted for positive A; if A were negative, the curve would simply be inverted, but the range would be the same. The significant result is that any energy is possible within a certain range or ‘band’ of energies, but no others” (Feynman et al., 1966, section 13–2 States of definite energy). Nevertheless, in a finite lattice model, the number of possible energy levels of electrons in a semiconductor crystal is comparable to the number of atoms, which is related to Avogadro’s constant and is of the order of 10²³.
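A short sketch of the dispersion quoted above, E(k) = E0 − 2A cos(kb). The values of E0, A, and the chain length N below are hypothetical; the point is that a finite chain of N atoms supports exactly N allowed k values, so the “continuous” band is really N closely spaced levels between E0 − 2A and E0 + 2A:

```python
import math

E0, A = 0.0, 1.0   # site energy and hopping amplitude (eV), illustration values
N = 8              # number of atoms in the finite chain

# Periodic boundary conditions allow N values of k, hence N levels inside the band.
levels = sorted(E0 - 2 * A * math.cos(2 * math.pi * j / N) for j in range(N))
for E in levels:
    print(f"E = {E:+.3f} eV")
print(f"band runs from {min(levels):+.1f} eV to {max(levels):+.1f} eV")
# With N of the order of 10^23, the levels are so closely spaced
# that the band is effectively continuous.
```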
3. Definition of energy: According to Feynman, “[i]t is important to realize that in physics today, we have no knowledge of what energy is” (Feynman et al., 1963, section 4–1 What is energy?). This does not mean that Feynman had no knowledge of energy, nor did he explicitly say that energy cannot be defined; indeed, he had a deep knowledge of energy, including its quantization. Moreover, in his Ph.D. thesis, Feynman (1942) proves the conservation of energy by using a time-displacement transformation. Interestingly, Feynman adopted a more general approach than Noether, and he did not know about Noether’s theorem at the time (Mehra, 1994).
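As a standard textbook sketch of this connection (not Feynman’s thesis derivation itself): if the Lagrangian has no explicit time dependence, time-displacement symmetry implies that the energy function is conserved.

```latex
% For L = L(q, \dot{q}) with no explicit dependence on t:
\frac{dL}{dt}
  = \frac{\partial L}{\partial q}\,\dot{q} + \frac{\partial L}{\partial \dot{q}}\,\ddot{q}
  = \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}}\right)\dot{q}
    + \frac{\partial L}{\partial \dot{q}}\,\ddot{q}
  = \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}}\,\dot{q}\right),
% where the Euler-Lagrange equation is used in the second step. Hence
\frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}}\,\dot{q} - L\right) = 0,
% so E = \dot{q}\,\partial L/\partial \dot{q} - L is a constant of the motion.
```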
Currently, physics textbooks define energy as the “ability (or capacity) to perform work” or as “a measure of change” (e.g., Hecht, 2003). These textbook definitions describe possible “effects” of energy, but they do not specify its nature: the term “ability” or “capacity” does not tell us what energy is. In his autobiography, Feynman (1985) mentions that “[i]t’s also not even true that ‘energy makes it go,’ because if it stops, you could say, ‘energy makes it stop’ just as well” (p. 298). Thus, it is likely that Feynman did not agree with these common definitions of energy.
However, Feynman clarifies that “we do not understand this energy as counting something at the moment, but just as a mathematical quantity, which is an abstract and rather peculiar circumstance” (Feynman et al., 1963, section 4–4 Other forms of energy). Essentially, energy is an abstract mathematical quantity. Furthermore, energy is not something concrete like children’s toy blocks (Feynman et al., 1963, section 4–1 What is energy?). In other words, energy is not a material substance; it is given meaning through mathematical calculations.
To conclude, Feynman would explain that bound particles have discrete energy levels, whereas free particles have continuous energy levels. In addition, he would elaborate that an energy band consists of a large number of energy levels that are very close together, or effectively continuous. However, Feynman disagreed with common definitions of energy and regarded energy as a mathematical abstraction.
Note:
1. It is inappropriate to quote Feynman that “we have no knowledge of what energy is” and then conclude that the concept of energy cannot be defined at all. In Feynman’s words, “[d]uring the war, I didn’t have time to work on these things very extensively, but wandered about on buses and so forth, with little pieces of paper, and struggled to work on it and discovered indeed that there was something wrong, something terribly wrong. I found that if one generalized the action from the nice Lagrangian forms (2) to these forms (1) then the quantities which I defined as energy, and so on, would be complex. The energy values of stationary states wouldn’t be real and probabilities of events wouldn’t add up to 100%. That is, if you took the probability that this would happen and that would happen - everything you could think of would happen, it would not add up to one” (Feynman, 1965, p. 22).
2. In A Survey of Physical Theory, Planck (1925) writes that “[e]nergy itself cannot be measured, but only a difference of energy. Therefore, one did not previously deal with energy, but with work, and Ernst Mach, who was concerned to a great extent with the conservation of energy, but avoided all speculations outside the domain of observation, has always refrained from talking of energy itself” (pp. 106–107).
References:
1. Feynman, R. P. (1942). Feynman thesis: A New approach to Quantum Theory. Singapore: World Scientific.
2. Feynman, R. P. (1965). The development of the space-time view of quantum electrodynamics. In Brown, L. M. (ed.), Selected papers of Richard Feynman. Singapore: World Scientific.
3. Feynman, R. P. (1985). Surely you’re joking, Mr. Feynman. New York: Norton.
4. Feynman, R. P., Hey, J. G., & Allen, R. W. (1998). Feynman lectures on computation. Reading, Massachusetts: Addison-Wesley.
5. Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.
6. Feynman, R. P., Leighton, R. B., & Sands, M. L. (1966). The Feynman Lectures on Physics, Vol III: Quantum mechanics. Reading, MA: Addison-Wesley.
7. Hecht, E. (2003). Physics: Algebra/Trigonometry (3rd ed.). Pacific Grove, California: Brooks/Cole Publishing.
8. Holton, G. and Brush, S. G. (2001). Physics the Human Adventure: From Copernicus to Einstein and beyond. New Brunswick: Rutgers University Press.
9. Kragh, H. (1999). Quantum generations: A history of physics in the twentieth century. Princeton: Princeton University Press.
10. Mehra, J. (1994). The Beat of a Different Drum: The life and science of Richard Feynman. Oxford: Oxford University Press.
11. Planck, M. (1925/1993). A Survey of Physical Theory. Ontario: Dover.