%path = "physics/S=E*t" %kind = 1 %level = 12

Entropy is the information in a system: entropy is the number of states of the system. What a state means is of no importance; it has been abstracted away. Only the number counts here.

The states must actually occur, i.e. the system must change (information events). Energy is the number of states per time. Energy defines time by these changes.

\[S = Et\]
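
As a toy numeric sketch of this reading (the rate and duration are made-up values, and treating energy directly as a count of information events per unit time is the assumption of this note, not a standard unit convention):

```python
# Toy reading of S = E * t: energy as a rate of information events,
# entropy as the total number of state changes that have occurred.

def entropy_as_events(event_rate, duration):
    """Number of state changes: rate of changes (energy) times elapsed time."""
    return event_rate * duration

E = 10.0   # assumed: 10 information events per second
t = 5.0    # assumed: observed for 5 seconds
S = entropy_as_events(E, t)
print(S)   # 50.0 events accumulated in this toy picture
```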

A constant amount of energy can either result in many small state changes or in fewer, larger ones.

But a system consists of layers.

Assuming an ideal gas, the energy is \(Q = TS\).

This divides the system into two layers.

The logarithm in entropy only comes up when we distribute the information over several variables of the same kind (e.g. bits). In the other direction, this is the reason for the exponent in \(e^S\).
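
A small sketch of this, with arbitrary state counts: combining subsystems multiplies the number of states, the information measured in bits adds, and the exponent recovers the state count.

```python
import math

# Two subsystems with Omega1 and Omega2 states each.
Omega1, Omega2 = 8, 4          # assumed state counts
Omega = Omega1 * Omega2        # state counts multiply when systems are combined

# Information measured in variables "of the same kind" (here: bits) adds.
bits = math.log2(Omega1) + math.log2(Omega2)   # 3 + 2 = 5 bits
assert 2 ** bits == Omega                      # the exponent recovers the state count

# Same in natural units: S = ln(Omega), Omega = e^S.
S = math.log(Omega)
print(bits, S, math.exp(S))    # 5.0  ~3.4657  ~32.0
```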

The particles’ direction of motion partitions the kinetic energy of the \(N\) particles over three independent spatial directions; by equipartition each direction contributes \(1/2\,kT\) per particle on average.

Therefore:

\[Q = TS = \tfrac{3}{2} N k T = \tfrac{3}{2} n R T = \tfrac{3}{2} p V\]

For an ideal gas the internal energy is equal to \(3/2\,pV\), i.e. \(3/2\) times the work \(pV\) done on the surroundings.
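
A numeric check of these relations, assuming one mole of a monatomic ideal gas at an example temperature of 300 K:

```python
# Internal energy of a monatomic ideal gas: U = 3/2 N k T = 3/2 p V,
# using the ideal gas law p V = N k T. Constants are the exact SI values.
k  = 1.380649e-23      # Boltzmann constant, J/K
NA = 6.02214076e23     # Avogadro constant, 1/mol

N = 1.0 * NA           # assumed: one mole of particles
T = 300.0              # assumed: 300 K

U  = 1.5 * N * k * T   # 3/2 N k T
pV = N * k * T         # p V = N k T
print(U, 1.5 * pV)     # both ~3741 J, since 3/2 pV = 3/2 N k T
```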

The average energy per particle \(E\) is:

\[E = \tfrac{1}{2} m \overline{v^2} = \tfrac{3}{2} k T\]
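
For example, solving \(E = \tfrac{1}{2} m \overline{v^2} = \tfrac{3}{2} kT\) for the root-mean-square speed, assuming a nitrogen molecule at 300 K (the gas and temperature are arbitrary choices):

```python
import math

k = 1.380649e-23               # Boltzmann constant, J/K
m = 28.0 * 1.66053906660e-27   # assumed: mass of an N2 molecule, kg
T = 300.0                      # assumed temperature, K

E_per_particle = 1.5 * k * T                  # average kinetic energy per particle
v_rms = math.sqrt(2.0 * E_per_particle / m)   # from E = 1/2 m v^2
print(E_per_particle, v_rms)                  # ~6.2e-21 J, ~517 m/s
```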

Boltzmann’s constant \(k\) is merely a conversion factor between units of temperature and units of energy. \(v^2\) relates more directly to micro events per time than \(T\) does, but even \(E\) is only the energy of that layer, not the ultimate unit of information events per time.

The ultimate information event is given by the Planck constant. The sum of all such events creates space and time: E-t, x-v, …
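
If one reads the Planck constant as the action of a single elementary event (the speculative step of this note), the number of such events carried by an energy \(E\) over a time \(t\) would be of order \(Et/h\). A toy estimate with arbitrary numbers:

```python
h = 6.62607015e-34      # Planck constant, J*s (exact SI value)

E = 1.0e-19             # assumed: an example energy of 1e-19 J (about 0.6 eV)
t = 1.0                 # assumed: one second

n_events = E * t / h    # action in units of h -> "number of elementary events"
print(n_events)         # ~1.5e14 events in this toy picture
```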