Entropy Gradient Forcing -- A Simple Example


Consider a box divided in half by a permeable partition. With each tick of a clock, each particle has probability P of crossing from its half of the box to the other. If n1 and n2 are the numbers of particles in the left and right halves, then a given particle lies on the left side with probability p1=n1/(n1+n2) and on the right side with probability p2=n2/(n1+n2). The entropy of this probability distribution is

S = -[p1 log2(p1) + p2 log2(p2)]


The above animation shows how the particles and the entropy S evolve from the initial values n1=100 and n2=0. Initially S=0, because S measures the deficit of information between knowledge of (p1,p2) and a complete specification of the system: when all particles lie in the left-hand side, there is no uncertainty about the location of any individual particle. As the particles "mix" across the partition, S approaches one bit per particle. This is because when p1=p2=1/2, an individual particle is equally likely to lie in either half of the box, so one bit is needed to specify the location of each particle. Notice that S fluctuates because the number of particles is finite. These fluctuations disappear as the number of particles approaches infinity, or if averages are taken over an infinite ensemble of statistically identical systems.
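
For readers who want to reproduce the animation's behaviour, here is a minimal Monte Carlo sketch in Python. The per-tick crossing probability is left unspecified above, so P = 0.05 is an assumed value; N = 100 as in the animation.

    import math
    import random

    def entropy_bits(n1, n2):
        # S = -p1*log2(p1) - p2*log2(p2), in bits, with 0*log2(0) taken as 0
        s = 0.0
        for n in (n1, n2):
            p = n / (n1 + n2)
            if p > 0:
                s -= p * math.log2(p)
        return s

    N, P, TICKS = 100, 0.05, 200   # P = 0.05 is an assumed value
    n1 = N                         # all particles start on the left, so S = 0
    for t in range(TICKS):
        # each particle independently crosses the partition with probability P
        left_to_right = sum(random.random() < P for _ in range(n1))
        right_to_left = sum(random.random() < P for _ in range(N - n1))
        n1 += right_to_left - left_to_right
        print(t, n1, round(entropy_bits(n1, N - n1), 3))

Run for a few hundred ticks, n1 settles near 50 while S climbs toward, and then fluctuates just below, one bit, as described above.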

We can think of the ensemble mean <n1> as driven toward its equilibrium value <n1>=50 by an entropic "force". Sufficiently near equilibrium, the force is proportional to the gradient of entropy with respect to <n1>, so that

d<n1>/dt = K(dS/d<n1>)
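
Explicitly, holding the total N = <n1> + <n2> fixed and differentiating S gives

dS/d<n1> = (1/N) log2(<n2>/<n1>)

which vanishes at the equilibrium <n1> = <n2> = 50 and is positive (negative) when <n1> is below (above) 50, so the entropic force always points toward equilibrium.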

Here K is a constant proportional to the rate at which particles switch sides of the box. The next figure compares d<n1>/dt and K(dS/d<n1>) as functions of <n1>, with units chosen such that K=1:


The linearization about equilibrium is accurate to within a few percent for <n1> between about 20 and 80.
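
As a rough numerical check on this comparison, the following sketch tabulates both quantities. The assumptions: N = 100 with the same assumed P as in the simulation above; the exact drift of the ensemble mean, d<n1>/dt = P(<n2> - <n1>), follows from the crossing rule; and K = P N^2 ln(2)/2 is one natural normalization, chosen here so the two curves have equal slopes at <n1> = 50.

    import math

    N, P = 100, 0.05                  # P = 0.05 is an assumed value
    K = P * N**2 * math.log(2) / 2    # matches the slopes at <n1> = 50

    def drift(n1):
        # exact drift of the ensemble mean: d<n1>/dt = P*(<n2> - <n1>)
        return P * ((N - n1) - n1)

    def entropy_force(n1):
        # K*(dS/d<n1>), using dS/d<n1> = (1/N)*log2(<n2>/<n1>)
        return K * math.log2((N - n1) / n1) / N

    for n1 in (10, 20, 30, 40, 45, 55, 60, 70, 80, 90):
        print(f"<n1>={n1:3d}  d<n1>/dt={drift(n1):+6.2f}  "
              f"K*dS/d<n1>={entropy_force(n1):+6.2f}")

Other normalizations of K shift where along the <n1> axis the agreement between the two curves is best.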

This page reflects contributions from Bill Merryfield.
