Saturday, October 16, 2010

Entropy as Time's Arrow

One of the ideas involved in the concept of entropy is that nature tends from order to disorder in isolated systems. Picture two snapshots of gas molecules in a box: one with the molecules clustered in an ordered arrangement, and one with them spread uniformly throughout. Entropy tells us that the ordered snapshot came first. Using Newton's laws to describe the motion of the molecules would not tell you which came first, because those laws are time-reversible.

Entropy and Disorder

If you assert that nature tends to take things from order to disorder and give an example or two, then you will get almost universal recognition and assent. It is a part of our common experience. Spend hours cleaning your desk, your basement, your attic, and it seems to spontaneously revert back to disorder and chaos before your eyes. So if you say that entropy is a measure of disorder, and that nature tends toward maximum entropy for any isolated system, then you do have some insight into the ideas of the second law of thermodynamics.
Some care must be taken about how you define "disorder" if you are going to use it to understand entropy. A more precise way to characterize entropy is to say that it is a measure of the "multiplicity" associated with the state of the objects. If a given state can be achieved in many more ways, then it is more probable than one which can be achieved in only a few ways. When throwing a pair of dice, a seven is more probable than a two because you can produce a seven in six different ways and there is only one way to produce a two. So a seven has a higher multiplicity than a two, and we could say that a seven represents higher "disorder" or higher entropy.
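The dice counting above is easy to check by brute force. A minimal sketch, enumerating all 36 outcomes of a pair of dice and tallying the multiplicity of each total:

```python
from itertools import product

# Count the multiplicity (number of ways) of each total for two dice.
ways = {}
for a, b in product(range(1, 7), repeat=2):
    ways[a + b] = ways.get(a + b, 0) + 1

print(ways[7])  # 6 ways: 1+6, 2+5, 3+4, 4+3, 5+2, 6+1
print(ways[2])  # 1 way: 1+1
```

Seven has six times the multiplicity of two, which is exactly why it turns up six times as often.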
For a glass of water the number of molecules is astronomical. The jumble of ice chips may look more disordered in comparison to the glass of water which looks uniform and homogeneous. But the ice chips place limits on the number of ways the molecules can be arranged. The water molecules in the glass of water can be arranged in many more ways; they have greater "multiplicity" and therefore greater entropy.
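The link between multiplicity and entropy can be made quantitative with Boltzmann's relation S = k_B ln W, where W is the number of microscopic arrangements (the post does not state this formula, but it is the standard expression of the idea). A minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(multiplicity: float) -> float:
    """S = k_B * ln(W): entropy from the number of microstates W."""
    return K_B * math.log(multiplicity)

# A state achievable in more ways has higher entropy,
# as with the water versus the ice chips:
print(boltzmann_entropy(1e6) > boltzmann_entropy(1e3))  # True
```

The logarithm is what tames the "astronomical" numbers: multiplying the number of arrangements merely adds to the entropy.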

Sunday, October 3, 2010

Second Law of Thermodynamics

The second law of thermodynamics states that the total entropy of any system will not decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system tends not to decrease. It follows that heat will not flow from a colder body to a hotter body without the application of work (the imposition of order) to the colder body. It also follows that no device operating on a cycle can produce net work from a single temperature reservoir; producing net work requires a flow of heat from a hotter reservoir to a colder reservoir. As a result, a perpetual motion machine of the second kind is impossible. Finally, a process that generates less entropy, such as a chemical reaction, is energetically more efficient.

It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of that air. But the heat expelled from the room by the operation of the air conditioner always makes a larger contribution to the entropy of the environment than the decrease in the entropy of the room's air. Thus, the entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.
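The air-conditioner bookkeeping can be made concrete. A minimal sketch with hypothetical numbers: heat Q_c leaves the room at temperature T_c, and Q_h = Q_c + W (the extracted heat plus the compressor's work) is dumped outdoors at the higher temperature T_h:

```python
# Entropy bookkeeping for an idealized air conditioner (all numbers assumed).
T_c = 293.0    # room temperature, K
T_h = 308.0    # outdoor temperature, K
Q_c = 1000.0   # heat removed from the room, J
W = 200.0      # work input by the compressor, J
Q_h = Q_c + W  # heat rejected outdoors (energy conservation)

dS_room = -Q_c / T_c      # the room's entropy decreases
dS_outdoors = Q_h / T_h   # the environment's entropy increases by more
dS_total = dS_room + dS_outdoors

print(dS_total > 0)  # True: total entropy still increases
```

The room loses about 3.41 J/K of entropy while the outdoors gains about 3.90 J/K, so the total rises, just as the second law demands.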

Entropy

Entropy is a macroscopic property of a thermodynamic system that is a measure of the microscopic disorder within the system. It is defined by the second law of thermodynamics. Thermodynamic systems are made up of microscopic objects, e.g., atoms or molecules, which carry energy. According to the second law of thermodynamics, the thermodynamic entropy is a measure of the amount of energy which is unavailable for doing work during energy conversions.

From a thermodynamic point of view, machines are energy conversion devices, so they can only be driven by convertible energy. Thermal energy combined with entropy is energy that has already been converted. This is why Rudolf Clausius coined the term entropy in 1865, from the Greek εντροπία [entropía], "a turning toward," from εν- [en-] (in) and τροπή [tropē] (turn, conversion).

The dimension of entropy is energy divided by temperature, and its SI unit is joules per kelvin.
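The joules-per-kelvin unit falls directly out of the defining relation for a reversible heat transfer, ΔS = Q/T. A minimal sketch with hypothetical numbers:

```python
# For reversible heat transfer Q at constant absolute temperature T,
# the entropy change is dS = Q / T. Units: J / K = J/K.
Q = 500.0  # heat added, J (assumed)
T = 300.0  # absolute temperature, K (assumed)
dS = Q / T
print(dS)  # about 1.667 J/K
```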