Sunday 12 June 2016

Entropy (remains unchanged or increases?)


Question: Explain the change, if any, in the entropy of an ideal gas when it has completed one cycle.


In general, the entropy of an ideal gas is a state function and is therefore path independent. Thus, the entropy of the ideal gas remains unchanged after a cyclic process, because the gas is returned to its initial state. Interestingly, Swendsen (2011) provided a list of physicists’ disagreements on the meaning of entropy:

1. The basis of statistical mechanics: theory of probability or other theory?
2. The entropy of an ideal classical gas of distinguishable particles: extensive or not extensive?
3. The properties of macroscopic classical systems with distinguishable and indistinguishable particles: same or different?
4. The entropy of a classical ideal gas of distinguishable particles: additive or not additive?
5. Boltzmann defined (or did not define?) the entropy of a classical system by the logarithm of a volume in phase space.
6. The symbol W in the equation S = k log W, which is inscribed on Boltzmann’s tombstone, is a volume in phase space or the German word “Wahrscheinlichkeit” (probability)?
7. The entropy should be defined in terms of the properties of an isolated system or a composite system?
8. The validity of thermodynamics: a finite system or in the limit of infinite system size?
9. Extensivity is (or is not?) essential to thermodynamics.

There are further disagreements, such as whether we should define entropy using dS = dQ/T. One may argue that the entropy of an isolated system can be well defined even when its temperature is unknown, undefined, or zero.
(e.g. https://www.av8n.com/physics/thermo/entropy.html#sec-not-dq)
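
To make the state-function claim in the opening paragraph concrete, here is a short calculation (a sketch that uses the standard textbook expression for the entropy of a fixed amount of ideal gas with constant heat capacity; it is not quoted from the sources above):

S(T, V) = nC_V ln T + nR ln V + S_0

Because S depends only on the state variables T and V, any cyclic process that returns the gas to its initial temperature and volume gives ∆S_gas = S(T_1, V_1) − S(T_1, V_1) = 0, no matter what path the gas followed in between.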

How would Feynman answer?


Feynman would explain that the entropy remains constant or increases after the ideal gas has completed one cycle. However, it is more meaningful to understand his explanations from the perspectives of “state function,” “idealized process,” and “problems of defining entropy.”


1. Entropy is a function of the condition: Generally speaking, the concept of entropy may be defined as a measure of disorder. Thus, Feynman explains that “[i]f we have white and black molecules, how many ways could we distribute them among the volume elements so that white is on one side, and black on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure ‘disorder’ by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy (Feynman et al., 1963, section 46–5 Order and entropy).” However, entropy is also dependent on our knowledge of the possible locations of the molecules. In Feynman’s (1996) words, “[t]his concept of ‘knowledge’ is extremely important, and central to the concept of entropy (p. 142).”
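
Feynman’s counting argument can be made quantitative with a small worked example (a sketch that assumes a simple model of N distinguishable cells, introduced here only for illustration). With all the white molecules confined to the left half of the cells there is essentially one arrangement, whereas with no restriction the number of ways to choose which N/2 cells hold the white molecules is

W = N!/[(N/2)!(N/2)!]

Using S = k ln W and Stirling’s approximation, the unrestricted (“disordered”) macrostate therefore has roughly ∆S ≈ Nk ln 2 more entropy than the separated (“ordered”) one.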

Furthermore, Feynman mentions that “we have found another quantity which is a function of the condition, i.e., the entropy of the substance. Let us try to explain how we compute it, and what we mean when we call it a ‘function of the condition’ (Feynman et al., 1963, section 44-6 Entropy).” That is, entropy is a state function, and the change in entropy depends only on the initial and final states of a system. Importantly, Feynman states that “it applies only to reversible cycles. If we include irreversible cycles, there is no law of conservation of entropy (Feynman et al., 1963, section 44-6 Entropy).” In other words, although the change in entropy of the gas depends only on its end states, whether the total entropy is conserved depends on the process, that is, on whether it is reversible or irreversible.
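
The two quotations above can be summarized with the Clausius relations (a standard sketch, not a quotation from Feynman). For a reversible path between states a and b,

∆S = S_b − S_a = ∫ dQ_rev/T

depends only on the end states. Around a complete reversible cycle, ∮ dQ_rev/T = 0, so no entropy is created. For an irreversible cycle, the Clausius inequality gives ∮ dQ/T < 0: the gas still returns to its initial entropy, but the surroundings end up with more entropy than they started with, so there is no conservation law for the total entropy.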

2. A reversible process is an idealized process: A reversible process is a quasi-static process whose direction can be reversed by means of an infinitesimal change in the external conditions. Feynman clarifies that “in any process that is irreversible, the entropy of the whole world is increased. Only in reversible processes does the entropy remain constant. Since no process is absolutely reversible, there is always at least a small gain in the entropy; a reversible process is an idealization in which we have made the gain of entropy minimal (Feynman et al., 1963, section 44-6 Entropy).” In short, a reversible process is an idealization that cannot be fully realized in the real world. Thus, in Feynman Lectures on Computation, Feynman (1996) elaborates that “[f]or an irreversible process, the equality is replaced by an inequality, ensuring that the entropy of an isolated system can only remain constant or increase (p. 141).” Nevertheless, because no real process is perfectly reversible, the entropy of the universe still increases slightly even in a process that we treat as reversible.
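
A concrete example of such an irreversible change (a standard illustration, not one taken from the quoted passages) is the free expansion of an ideal gas into a vacuum. No heat flows and no work is done, yet evaluating the entropy change along a reversible isothermal path between the same two states gives

∆S_gas = nR ln(V_2/V_1) > 0

Since the surroundings are unaffected, the entropy of the “whole world” increases by this amount, and no infinitesimal adjustment can make the gas flow back into the smaller volume.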

Importantly, the concept of entropy can be defined in terms of reversible engines. According to Feynman, “we will lose something if the engines contain devices in which there is friction. The best engine will be a frictionless engine. We assume, then, the same idealization that we did when we studied the conservation of energy; that is, a perfectly frictionless engine (Feynman et al., 1963, section 44-3 Reversible engines).” That is, a reversible engine is ideally frictionless. Therefore, Feynman explains that “the ideal engine is a so-called reversible engine, in which every process is reversible in the sense that, by minor changes, infinitesimal changes, we can make the engine go in the opposite direction. That means that nowhere in the machine must there be any appreciable friction (Feynman et al., 1963, section 44-3 Reversible engines).” Simply phrased, the reversible engine is a theoretical idealization.
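
The bookkeeping for an ideal, frictionless engine can be made explicit (a sketch based on the standard Carnot-cycle results, with reservoir temperatures T_H and T_C assumed for illustration). A reversible engine absorbing heat Q_H at T_H and rejecting heat Q_C at T_C satisfies

Q_H/T_H = Q_C/T_C

so the entropy lost by the hot reservoir exactly equals the entropy gained by the cold reservoir, and the total entropy change per cycle is zero. Any friction or other irreversibility forces Q_C/T_C > Q_H/T_H, so a real engine generates a net amount of entropy in every cycle.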

3. Problems of defining entropy: Currently, there is no consensus among physicists on the definition of entropy. In a footnote of Feynman Lectures on Computation, it is stated that “[l]egend has it that Shannon adopted this term on the advice of John von Neumann, who declared that it would give him ‘... a great edge in debates because nobody really knows what entropy is anyway’ (Feynman, 1996, p. 123).” Thus, we may expect the concept of entropy to be refined or redefined in the future. More importantly, Feynman emphasizes that the expression ∆S = ∫dQ/T does not completely define the entropy; it defines only the difference of entropy between two different physical conditions. He explains that “[o]nly if we can evaluate the entropy for one special condition can we really define S absolutely (Feynman et al., 1963, section 44-6 Entropy).” For example, we can calculate the final entropy of an isolated system by adding the change in entropy to its initial entropy, but this requires knowing the initial entropy, for instance the entropy of the system at absolute zero.
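
One common way of fixing the reference value that Feynman says is needed (a sketch that invokes the third law of thermodynamics, which is not discussed in the quoted passage) is

S(T) = S(0) + ∫_0^T [C(T′)/T′] dT′

where C(T′) is the heat capacity measured along the heating path; setting S(0) = 0 for a perfect crystal at absolute zero then assigns an absolute entropy rather than only an entropy difference.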

Interestingly, the concept of entropy is related to a problem of defining reversible processes. For example, in Samiullah’s (2007) words, “reversible processes are idealized processes in which entropy is exchanged between a system and its environment and no net entropy is generated (p. 609).” Essentially, he proposes to define a reversible process in terms of the constancy of entropy, a definition that distinguishes reversible processes from merely quasi-static processes. However, there is a circularity problem if the reversible process is defined in terms of entropy while the concept of entropy is defined in terms of reversible processes. Perhaps one may still debate whether this circularity is trivial or unavoidable.

In summary, the entropy could remain constant or increase after the ideal gas completes a cyclic process: the entropy of the gas itself returns to its initial value because entropy is a state function, whereas the total entropy of the gas and its surroundings stays constant only for an idealized reversible cycle and increases for any real cycle. Importantly, a reversible process is an idealized process whose direction can be “reversed” by infinitesimally small and extremely slow steps. In addition, Feynman would also discuss the problems of defining entropy.

Note:
You may want to take a look at this website:
http://physicsassessment.blogspot.sg/2016/06/ib-physics-2015-higher-level-paper-2_9.html

References:
1. Feynman, R. P., Hey, A. J. G., & Allen, R. W. (1996). Feynman Lectures on Computation. Reading, MA: Addison-Wesley.
2. Feynman, R. P., Leighton, R. B., & Sands, M. (1963). The Feynman Lectures on Physics, Vol I: Mainly mechanics, radiation, and heat. Reading, MA: Addison-Wesley.
3. Swendsen, R. H. (2011). How physicists disagree on the meaning of entropy. American Journal of Physics, 79(4), 342-348.
