Sunday 26 June 2016

The blue sky (Rayleigh scattering?)


Question: Why is the sky blue?


The sky may appear red or orange near sunrise and sunset, so the question could be rephrased more precisely as “Why does the sky appear blue during a cloudless day?” The appearance of the blue sky is often simply attributed to Rayleigh scattering; however, merely invoking the phrase “Rayleigh scattering” does not demonstrate a genuine understanding of the phenomenon. In the words of John William Strutt (Lord Rayleigh), “[w]henever the particles of the foreign matter are sufficiently fine, the light emitted laterally is blue in colour, and, in a direction perpendicular to that of the incident beam, is completely polarized (Strutt, 1871, p. 107).” Simply put, small particles in the atmosphere are set in oscillation by incoming light rays, and the particles re-emit polarized light in various directions.

How would Feynman answer?

Feynman would answer that the sky appears blue because of the scattering of sunlight by the electrons in atoms, and because of the human eye’s sensitivity to blue light. More importantly, Feynman would elaborate on his answer in terms of the definition of scattering, the scatterers (the objects involved in scattering), and the human visual system (a problem in defining blue).

1. The definition of scattering: Based on the classical theory, light scattering is a phenomenon in which “a light wave from some source can induce a motion of the electrons in a piece of material, and these motions generate their own waves (Feynman et al., 1963, section 30–2 The diffraction grating).” In other words, the light rays set the electrons in atoms oscillating, and their motions can generate light waves in various directions. Essentially, the oscillations of an electron can be modeled by using Newton’s law of motion and thus expressed as d²x/dt² + γ(dx/dt) + ω₀²x = F/m (Feynman et al., 1963, section 23–2 The forced oscillator with damping).
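
As a minimal numerical sketch of this model (the frequencies, damping constant, and force are arbitrary illustrative values, not atomic constants), the steady-state amplitude of the forced, damped oscillator can be evaluated directly from the standard solution:

```python
import numpy as np

# Steady-state amplitude of the driven, damped oscillator
#   d2x/dt2 + gamma*(dx/dt) + omega0^2 * x = (F/m) * cos(omega*t),
# namely |x| = (F/m) / sqrt((omega0^2 - omega^2)^2 + (gamma*omega)^2).
def amplitude(omega, omega0=1.0, gamma=0.1, F_over_m=1.0):
    """Steady-state amplitude as a function of the driving frequency."""
    return F_over_m / np.sqrt((omega0**2 - omega**2)**2 + (gamma * omega)**2)

# The response peaks sharply near the resonant frequency omega0.
for w in [0.5, 0.9, 1.0, 1.1, 2.0]:
    print(f"omega = {w:.1f}: amplitude = {amplitude(w):.3f}")
```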

By solving the equation, we can derive the total amount of light energy per second scattered in all directions by a single atom: P = (½ε₀cE₀²)(8πr₀²/3)[ω⁴/(ω² − ω₀²)²]. This means that, to a first approximation (for frequencies well below resonance), the intensity of scattered light is proportional to the fourth power of the frequency (or to 1/λ⁴). Feynman explains that “light which is of higher frequency by, say, a factor of two, is sixteen times more intensely scattered, which is a quite sizable difference. This means that blue light, which has about twice the frequency of the reddish end of the spectrum, is scattered to a far greater extent than red light. Thus when we look at the sky it looks that glorious blue that we see all the time! (Feynman et al., 1963, section 32-5 Scattering of light)”
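
The ω⁴ (equivalently 1/λ⁴) scaling is easy to check numerically. A short sketch, using 450 nm and 650 nm as representative blue and red wavelengths (illustrative choices, not values from the text):

```python
# Rayleigh scaling: scattered intensity ~ omega^4 ~ 1/lambda^4,
# valid when the light frequency is well below the atomic resonance.
blue_nm, red_nm = 450.0, 650.0   # representative wavelengths (assumed)

ratio = (red_nm / blue_nm) ** 4
print(f"blue is scattered ~{ratio:.1f}x more strongly than red")

# Feynman's example: doubling the frequency gives 2^4 = 16x more scattering.
print(f"factor-of-two frequency increase: {2**4}x more scattering")
```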

Alternatively, the scattering mechanism can be explained by quantum theory. In short, Feynman considers the scattering of light as a two-step process: “[t]he photon is absorbed, and then is re-emitted (Feynman et al., 1966, section 18-2 Light scattering).” The photon, whether left-hand or right-hand circularly polarized (RHC), is first absorbed by an atom; the photon is then re-emitted in another direction by the oscillating electric charge of the atom. For example, “with an incoming beam of RHC light the intensity of the RHC light in the scattered radiation will vary as (1 + cos θ)² (Feynman et al., 1966, section 18-2 Light scattering).” In other words, the polarization of the re-emitted photon follows the oscillation of the atom, which is in turn set up by the incident photon.
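
A quick way to see what the (1 + cos θ)² dependence implies is to tabulate it at a few scattering angles; this minimal sketch simply evaluates the quoted formula:

```python
import math

# Relative intensity of RHC light scattered from an incoming RHC beam,
# following the (1 + cos(theta))^2 dependence quoted above: strongest in
# the forward direction, zero for direct back-scattering.
for deg in [0, 45, 90, 135, 180]:
    theta = math.radians(deg)
    intensity = (1 + math.cos(theta)) ** 2
    print(f"theta = {deg:>3} deg: relative RHC intensity = {intensity:.3f}")
```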

2. The scatterers: The scattering mechanism depends on the scatterers, such as hydrogen atoms. In the Alix G. Mautner Memorial Lectures, Feynman (1985) clarifies that “[a]toms that contain more than one proton and the corresponding number of electrons also scatter light (atoms in the air scatter light from the sun and make the sky blue)! (p. 100)” In addition, the scattering amplitude can be calculated by using quantum electrodynamics: “[t]he total amplitude for all the ways an electron can scatter a photon can be summed up as a single arrow, a certain amount of shrink and turn. This amount depends on the nucleus and the arrangement of the electrons in the atoms, and is different for different materials (Feynman, 1985, p. 101).”
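
Feynman’s “arrows” are complex amplitudes: the arrows for all the ways an event can happen are added, and the squared length of the resultant gives the probability. The amplitudes below are invented purely for illustration, not taken from any real QED calculation:

```python
# Toy illustration of summing Feynman's "arrows" as complex numbers.
# The individual amplitudes are made up for demonstration only.
amplitudes = [0.3 + 0.1j, -0.2 + 0.25j, 0.05 - 0.15j]

total = sum(amplitudes)          # one resultant arrow ("shrink and turn")
probability = abs(total) ** 2    # squared length gives the probability
print(f"resultant arrow = {total:.3f}, probability ~ {probability:.4f}")
```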

The scattering of light also depends on the size of the scatterer and its resonant frequency: the closer the frequency of the incident light is to the resonant frequency, the more vigorously the atom oscillates. Feynman proposes the following experiment: “[w]e can make particles that are very small at first, and then gradually grow in size. We use a solution of sodium thiosulfate with sulfuric acid, which precipitates very fine grains of sulfur. As the sulfur precipitates, the grains first start very small, and the scattering is a little bluish. As it precipitates more it gets more intense, and then it will get whitish as the particles get bigger… That is why the sunset is red, of course, because the light that comes through a lot of air, to the eye has had a lot of blue light scattered out, so it is yellow-red (Feynman et al., 1963, section 32-5 Scattering of light).”

The amount of scattering also varies with the locations of the atoms. To cite Feynman, “if the atoms are very beautifully located in a nice pattern, it is easy to show that we get nothing in other directions, because we are adding a lot of vectors with their phases always changing, and the result comes to zero. But if the objects are randomly located, then the total intensity in any direction is the sum of the intensities that are scattered by each atom, as we have just discussed (Feynman et al., 1963, section 32-5 Scattering of light).” Moreover, the atoms are in motion, so the relative phase between any two atoms keeps changing and there is no sustained constructive interference (no stronger scattering) in any particular direction.
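
This point can be illustrated with a toy one-dimensional phasor sum (the wavelength, spacing, and number of scatterers are arbitrary choices for demonstration): for random positions the summed intensity is of order N, while for a regular array the phasors nearly cancel in an off-axis direction.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
k = 2 * np.pi / 0.7              # wavenumber for an arbitrary wavelength

# Random positions: uncorrelated phases, so intensities add (~N on average).
x_random = rng.uniform(0.0, 100.0, N)
I_random = abs(np.sum(np.exp(1j * k * x_random))) ** 2

# Regularly spaced positions (unit spacing): off-axis phasors nearly cancel.
x_ordered = np.arange(N) * 1.0
I_ordered = abs(np.sum(np.exp(1j * k * x_ordered))) ** 2

print(f"random positions : intensity ~ {I_random:.0f} (compare N = {N})")
print(f"ordered positions: intensity ~ {I_ordered:.2f}")
```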

3. The human visual system: The sensation of a “blue sky” depends on the human visual system (rods and cones). The sky does not appear violet (the shortest visible wavelength) because our eyes are less sensitive to violet light. In the words of Feynman, “we do not try to define what constitutes a green sensation, or to measure in what circumstances we get a green sensation, because it turns out that this is extremely complicated... Then we do not have to decide whether two people see the same sensation in different circumstances (Feynman et al., 1963, section 35-3 Measuring the color sensation).” Simply phrased, different human beings may not experience the same sensation of blue.

Blue can be specified as light with wavelengths between 440 and 492 nm (Hoeppe, 2007). However, it is difficult to distinguish blue from green in the dark, because the sensation of a color depends on the light intensity. Feynman explains that “[i]f we are in the dark and can find a magazine or something that has colors and, before we know for sure what the colors are, we judge the lighter and darker areas, and if we then carry the magazine into the light, we may see this very remarkable shift between which was the brightest color and which was not (Feynman et al., 1963, section 35-2 Color depends on intensity).” This phenomenon is related to the Purkinje effect, and it poses a problem for defining the color blue precisely.
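
A back-of-envelope sketch can combine the 1/λ⁴ scattering law with the eye’s daytime sensitivity. The sensitivity numbers below are rough CIE 1931 photopic values quoted from memory, and the sketch ignores the solar spectrum and the full trichromatic response, so it only illustrates the direction of the effect:

```python
# Violet is scattered more strongly than blue (1/lambda^4), but the eye's
# photopic sensitivity to violet is far lower, so blue dominates perception.
violet = {"wavelength_nm": 420.0, "sensitivity": 0.004}  # rough CIE value
blue   = {"wavelength_nm": 470.0, "sensitivity": 0.091}  # rough CIE value

def perceived_weight(color):
    scattering = color["wavelength_nm"] ** -4   # Rayleigh 1/lambda^4
    return scattering * color["sensitivity"]

ratio = perceived_weight(blue) / perceived_weight(violet)
print(f"blue contributes ~{ratio:.0f}x more than violet to the sky's brightness")
```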

Finally, it is worth mentioning the visual systems of other living beings and how they perceive sunlight. Feynman elaborates that “bees can apparently tell the direction of the sun by looking at a patch of blue sky, without seeing the sun itself. We cannot easily do this. If we look out the window at the sky and see that it is blue, in which direction is the sun? The bee can tell, because the bee is quite sensitive to the polarization of light, and the scattered light of the sky is polarized (Feynman et al., 1963, section 36-4 The compound (insect) eye).” However, more research is needed on the sensitivity of the visual systems of other living beings.

In summary, the light rays from the Sun set the electrons in atoms oscillating, and their motions re-emit light waves in all directions. The amount of scattering depends on the scatterers, such as hydrogen atoms, as well as on their size and random locations. Importantly, the sky appears blue rather than violet partly because the human visual system is less sensitive to violet, which has a shorter wavelength.

Note:
1. A Feynman diagram of light scattering can be found on page 100 of QED: The strange theory of light and matter (Feynman, 1985).

2. The color of the sky can be used to estimate the size of air molecules by assuming the density fluctuations in the atmosphere. To quote Einstein (1910), “[a]s a rough calculation shows, this formula might very well explain why the light given off by the irradiated atmosphere is predominantly blue. In this connection it is worth noting that our theory does not make any direct use of the assumption of the discrete distribution of matter (p. 247).”

3. Feynman’s explanation of the polarized sky: “The first example of the polarization effect that we have already discussed is the scattering of light. Consider a beam of light, for example from the sun, shining on the air. The electric field will produce oscillations of charges in the air, and motion of these charges will radiate light with its maximum intensity in a plane normal to the direction of vibration of the charges. The beam from the sun is unpolarized, so the direction of polarization changes constantly, and the direction of vibration of the charges in the air changes constantly. If we consider light scattered at 90°, the vibration of the charged particles radiates to the observer only when the vibration is perpendicular to the observer’s line of sight, and then light will be polarized along the direction of vibration. So scattering is an example of one means of producing polarization (Feynman et al., 1963, section 33-2 Polarization of scattered light).”
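
For unpolarized incident light, the standard Rayleigh result for the degree of linear polarization of the scattered light is P(θ) = sin²θ / (1 + cos²θ), which is maximal at 90°, consistent with Feynman’s description. A minimal sketch evaluating this textbook formula (not taken from the Lectures themselves):

```python
import math

# Degree of linear polarization of Rayleigh-scattered unpolarized light:
# zero in the forward/backward directions, complete at 90 degrees.
for deg in [0, 30, 60, 90, 120, 180]:
    t = math.radians(deg)
    polarization = math.sin(t) ** 2 / (1 + math.cos(t) ** 2)
    print(f"theta = {deg:>3} deg: degree of polarization = {polarization:.2f}")
```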

References:
1. Einstein, A. (1910). The Theory of Opalescence of Homogeneous Fluids and Liquid Mixtures near the Critical State. Annalen der Physik, 33, 1275–1298. In The Collected Papers of Albert Einstein, Volume 3: The Swiss Years: Writings 1909-1911 (Translated by A. Beck & D. Howard). Princeton: Princeton University Press. 
2. Feynman, R. P. (1985). QED: The strange theory of light and matter. Princeton: Princeton University Press.
3. Feynman, R. P., Leighton, R. B., & Sands, M. L. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley. 
4. Feynman, R. P., Leighton, R. B., & Sands, M. L. (1966). The Feynman Lectures on Physics, Vol III: Quantum Mechanics. Reading, MA: Addison-Wesley. 
5. Hoeppe, G. (2007). Why the Sky Is Blue: Discovering the Color of Life. Princeton, NJ: Princeton University Press. 
6. Strutt, J. W. (1871). XV. On the light from the sky, its polarization and colour. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 41(271), 107-120.

Sunday 12 June 2016

Entropy (remains unchanged or increases?)


Question: Explain the change, if any, in the entropy of an ideal gas when it has completed one cycle.


In general, the entropy of an ideal gas is a state function and is therefore path independent. Thus, the entropy of the ideal gas returns to its initial value after a complete cycle. Interestingly, Swendsen (2011) provided a list of physicists’ disagreements on the meaning of entropy:

1. The basis of statistical mechanics: theory of probability or other theory?
2. The entropy of an ideal classical gas of distinguishable particles: extensive or not extensive?
3. The properties of macroscopic classical systems with distinguishable and indistinguishable particles: same or different?
4. The entropy of a classical ideal gas of distinguishable particles: additive or not additive?
5. Boltzmann defined (or did not define?) the entropy of a classical system by the logarithm of a volume in phase space.
6. Is the symbol W in the equation S = k log W, which is inscribed on Boltzmann’s tombstone, a volume in phase space or the German word “Wahrscheinlichkeit” (probability)?
7. The entropy should be defined in terms of the properties of an isolated system or a composite system?
8. The validity of thermodynamics: a finite system or in the limit of infinite system size?
9. Extensivity is (or is not?) essential to thermodynamics.

However, there are further disagreements, such as whether we should define entropy by using dS = dQ/T. For instance, one may argue that the entropy of an isolated system can be well defined even when its temperature is unknown, undefined, or zero.
(e.g. https://www.av8n.com/physics/thermo/entropy.html#sec-not-dq)

How would Feynman answer?


Feynman would explain that the entropy of the ideal gas itself returns to its initial value after one complete cycle, whereas the total entropy of the gas and its surroundings remains constant or increases, depending on whether the cycle is reversible. However, it is more meaningful to understand his explanations from the perspectives of “state function,” “idealized process,” and “problems of defining entropy.”


1. Entropy is a function of the condition: Generally speaking, the concept of entropy may be defined as a measure of disorder. Thus, Feynman explains that “[i]f we have white and black molecules, how many ways could we distribute them among the volume elements so that white is on one side, and black on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure ‘disorder’ by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy (Feynman et al., 1963, section 46–5 Order and entropy).” However, entropy is also dependent on our knowledge of the possible locations of the molecules. In Feynman’s (1996) words, “[t]his concept of ‘knowledge’ is extremely important, and central to the concept of entropy (p. 142).”
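
Feynman’s counting can be made concrete with a toy lattice model (the numbers of molecules and sites are arbitrary): count the arrangements with white confined to one side versus arrangements with no restriction, and take logarithms.

```python
from math import comb, log

# "The logarithm of that number of ways is the entropy" (in units of k).
n_white, n_black = 10, 10
sites = n_white + n_black

# Segregated: whites fill the left half, blacks fill the right half.
W_ordered = comb(sites // 2, n_white) * comb(sites // 2, n_black)  # = 1

# Unrestricted: any 10 of the 20 sites may hold the white molecules.
W_mixed = comb(sites, n_white)

print(f"ordered: W = {W_ordered}, entropy ~ {log(W_ordered):.2f} (in units of k)")
print(f"mixed  : W = {W_mixed}, entropy ~ {log(W_mixed):.2f} (in units of k)")
```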

Furthermore, Feynman mentions that “we have found another quantity which is a function of the condition, i.e., the entropy of the substance. Let us try to explain how we compute it, and what we mean when we call it a “function of the condition (Feynman et al., 1963, section 44-6 Entropy).” That is, entropy is a state function and the change in entropy is dependent on the initial and final state of a system. Importantly, Feynman states that “it applies only to reversible cycles. If we include irreversible cycles, there is no law of conservation of entropy (Feynman et al., 1963, section 44-6 Entropy).” In other words, the change in entropy is dependent on the process such as a reversible process or irreversible process.
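
The conservation of entropy over a reversible cycle can be checked with a Carnot cycle. The reservoir temperatures and heat input below are illustrative numbers; in a reversible cycle the rejected heat satisfies Q_cold = Q_hot × T_cold/T_hot, so the gas’s entropy change over one cycle vanishes.

```python
# Entropy bookkeeping for one reversible (Carnot) cycle; values are made up.
T_hot, T_cold = 500.0, 300.0         # reservoir temperatures (K)
Q_hot = 1000.0                       # heat absorbed at T_hot (J)
Q_cold = Q_hot * T_cold / T_hot      # heat rejected reversibly at T_cold (J)

dS_gas = Q_hot / T_hot - Q_cold / T_cold
print(f"heat rejected: {Q_cold:.0f} J")
print(f"net entropy change of the gas over one cycle: {dS_gas:.6f} J/K")
```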

2. A reversible process is an idealized process: A reversible process is a quasi-static process whose direction can be reversed by means of infinitesimal changes. Feynman clarifies that “in any process that is irreversible, the entropy of the whole world is increased. Only in reversible processes does the entropy remain constant. Since no process is absolutely reversible, there is always at least a small gain in the entropy; a reversible process is an idealization in which we have made the gain of entropy minimal (Feynman et al., 1963, section 44-6 Entropy).” In short, a reversible process is an idealization that is not strictly attainable in the real world. Thus, in Feynman Lectures on Computation, Feynman (1996) elaborates that “[f]or an irreversible process, the equality is replaced by an inequality, ensuring that the entropy of an isolated system can only remain constant or increase (p. 141).” In practice, the entropy of the universe still increases even in a nominally reversible process, because no real process is perfectly reversible.

Importantly, the concept of entropy could be defined in terms of reversible engines. According to Feynman, “we will lose something if the engines contain devices in which there is friction. The best engine will be a frictionless engine. We assume, then, the same idealization that we did when we studied the conservation of energy; that is, a perfectly frictionless engine (Feynman et al., 1963, section 44-3 Reversible engines).” That is, the reversible engines should be ideally frictionless. Therefore, Feynman explains that “the ideal engine is a so-called reversible engine, in which every process is reversible in the sense that, by minor changes, infinitesimal changes, we can make the engine go in the opposite direction. That means that nowhere in the machine must there be any appreciable friction (Feynman et al., 1963, section 44-3 Reversible engines).” Simply phrased, the reversible engine is a theoretical idealization.

3. Problems of defining entropy: Currently, there is no consensus among physicists on the definition of entropy. In a footnote of Feynman Lectures on Computation, it is stated that “[l]egend has it that Shannon adopted this term on the advice of John von Neumann, who declared that it would give him ‘... a great edge in debates because nobody really knows what entropy is anyway’ (Feynman, 1996, p. 123).” Thus, we may expect the concept of entropy to be redefined in the future. More importantly, Feynman emphasizes that the expression ∆S = ∫dQ/T does not completely define the entropy; it defines only the difference of entropy between two different physical conditions. He explains that “[o]nly if we can evaluate the entropy for one special condition can we really define S absolutely (Feynman et al., 1963, section 44-6 Entropy).” For example, we can calculate the final entropy of an isolated system by adding the change in entropy to the initial entropy, but one still needs to determine the initial entropy, such as the entropy of the system at absolute zero.
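
For an ideal gas, the entropy difference between two states follows from ∆S = ∫dQ/T along any reversible path, giving the standard result ∆S = nC_V ln(T₂/T₁) + nR ln(V₂/V₁). A minimal sketch (monatomic gas assumed; the states are illustrative) shows that the formula yields only differences, never an absolute entropy:

```python
from math import log

# Entropy *difference* of n moles of a monatomic ideal gas between two
# states; the reference-state entropy itself is left undetermined.
R = 8.314                 # gas constant, J/(mol K)
n, Cv = 1.0, 1.5 * R      # monatomic ideal gas assumed

def delta_S(T1, V1, T2, V2):
    return n * Cv * log(T2 / T1) + n * R * log(V2 / V1)

# Isothermal doubling of the volume: Delta S = n*R*ln(2) ~ 5.76 J/K.
print(f"Delta S = {delta_S(300, 0.01, 300, 0.02):.2f} J/K")
```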

Interestingly, the concept of entropy is related to a problem of defining reversible processes. For example, in Samiullah’s (2007) words, “reversible processes are idealized processes in which entropy is exchanged between a system and its environment and no net entropy is generated (p. 609).” Essentially, he proposes to define a reversible process in terms of the constancy of entropy such that it distinguishes reversible processes from quasi-static processes. However, there is a circularity problem if the reversible process is defined in terms of entropy, whereas the concept of entropy is defined in terms of the reversible process. Perhaps one may still argue whether this problem of circularity is trivial or unavoidable.

In summary, the entropy of the ideal gas returns to its initial value after a cyclic process, while the total entropy of the gas and its surroundings remains constant (reversible cycle) or increases (irreversible cycle). Importantly, a reversible process is an idealized process whose direction can be “reversed” by infinitesimally small and extremely slow steps. However, Feynman would also discuss the problems of defining entropy.

Note:
You may want to take a look at this website:
http://physicsassessment.blogspot.sg/2016/06/ib-physics-2015-higher-level-paper-2_9.html

References:
1. Feynman, R. P. (1996). Feynman lectures on computation (J. G. Hey & R. W. Allen, Eds.). Reading, MA: Addison-Wesley.
2. Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.
3. Samiullah, M. (2007). What is a reversible process? American Journal of Physics, 75(7), 608-609.
4. Swendsen, R. H. (2011). How physicists disagree on the meaning of entropy. American Journal of Physics, 79(4), 342-348.