Entropy

 
     
Today, entropy (from the Greek for ‘transformation’) can be thought of in three ways. In thermodynamics, it is intimately related to the fact that naturally occurring processes have a direction. On a microscopic level, it is related to the number of ways in which a system can arrange itself. And in modern communication theory, it is a measure of information.

The first aspect of entropy is concerned with the second law of thermodynamics. This law has been stated in various forms, but it can be summarized by saying that nothing ever works with perfect efficiency. Any engine, motor or other system that converts energy from one form to another will not convert 100 per cent of it; some will be lost to friction, heat or other forms of wastage. Examples of such systems are power stations (converting the chemical energy in fuel to electrical energy), living creatures and petrol engines. The entropy change of such a system is defined as the heat transferred to it divided by the temperature at which the transfer takes place.
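To make the definition concrete, here is a minimal sketch of heat flowing from a hot body to a cold one; the temperatures and the amount of heat are illustrative values, not taken from the article. The hot body loses entropy, the cold body gains more, and the total rises, which is the directionality the second law describes.

```python
# Minimal sketch of the definition above: entropy change = heat transferred / temperature.
# The numbers are illustrative assumptions, not values from the article.

def entropy_change(heat_joules, temperature_kelvin):
    """Entropy change when heat enters (positive) or leaves (negative) a body at fixed temperature."""
    return heat_joules / temperature_kelvin

Q = 1000.0      # joules of heat flowing from the hot body to the cold one
T_hot = 500.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = entropy_change(-Q, T_hot)    # hot body loses heat:  -2.00 J/K
dS_cold = entropy_change(+Q, T_cold)  # cold body gains heat: +3.33 J/K

print(dS_hot + dS_cold)  # about +1.33 J/K: the total entropy has increased
```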

Left to themselves, systems tend towards the state of maximum entropy. Entropy should not be confused with energy: the energy of the world is constant, while its entropy tends towards a maximum value.

On a microscopic scale, entropy is a measure of the number of ways in which a system can arrange itself. This counting relies on quantum mechanics: only a system whose particles have a finite number of available ‘positions’ or energies to occupy has a finite, countable number of arrangements.

To see why entropy tends towards a maximum, consider several particles in a box. We can arrange them so that they are all in one corner, or distribute them more evenly around the box. There are far more ways to arrange them so that they are evenly distributed. This makes sense: if we put some gas into a box, it does not gather in a corner but spreads out to fill the box. The entropy of the disordered, evenly distributed arrangement is much higher than that of an orderly bunching of particles in a corner, so the system is overwhelmingly likely to be found in the state of highest entropy.
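The counting can be made explicit with a small sketch: divide the box into two halves and count the ways of assigning labelled particles to them. The particle count of twenty is an assumption chosen purely for illustration.

```python
# Counting arrangements of N labelled particles between the two halves of a box.
# N = 20 is an illustrative assumption; the article does not specify a number.

from math import comb

N = 20

all_in_one_half = comb(N, 0)     # every particle crowded into one half: 1 arrangement
evenly_spread = comb(N, N // 2)  # ten particles in each half: 184,756 arrangements

print(all_in_one_half, evenly_spread)
```

Even for twenty particles the evenly spread state can be realised in vastly more ways than the bunched one, and for the roughly 10^23 particles in a real sample of gas the imbalance is astronomically larger; that is why the gas is found spread throughout the box.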

Entropy is a measure of disorder within a system. A system with all its particles neatly in one place is an ordered system, while one that has its particles randomly spread out is far more disordered.

Entropy may also be considered as a measure of information. If we know that all the particles are in one corner, we have considerable information about them; if they are spread out, all we know about a particular particle is that it is somewhere in the box! Claude Shannon showed that entropy and the transmission of information are closely related.
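The link can be illustrated with Shannon's formula for the entropy of a probability distribution, H = −Σ p log₂ p, measured in bits. The ten-cell box below is an assumption made only for illustration: if the particle is known to be in one cell we have full information and zero entropy, while if it could be in any cell with equal probability our missing information is at its maximum.

```python
# Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits.
# The ten-cell box is an illustrative assumption, not part of the article.

from math import log2

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

cells = 10
known_corner = [1.0] + [0.0] * (cells - 1)  # particle certainly in the first cell
spread_out = [1.0 / cells] * cells          # particle equally likely to be in any cell

print(shannon_entropy(known_corner))  # 0.0 bits: nothing left to learn
print(shannon_entropy(spread_out))    # ~3.32 bits: maximum missing information
```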

The last quantity that entropy is closely linked with is the direction of time. The entropy of an isolated system always increases with time, which gives time its direction. Taken to its logical conclusion, this means that the universe is tending toward a state of maximum entropy. JJ
 
 

 

 

 
 