%path = "physics/S=E*t" %kind = 1 %level = 12
Entropy is the information in a system: it is the number of states of the system. What a state means is of no importance; that has been abstracted away. Only the number counts here.
The states must actually occur, i.e. the system must change (information events). Energy is the number of states per time. Energy defines time by these changes.
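As a formula (the \(S = E \cdot t\) of this note's title), with \(S\) taken as a plain count of information events:
\[
E = \frac{\Delta S}{\Delta t} \quad\Longleftrightarrow\quad \Delta S = E\,\Delta t
\]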
A constant amount of energy can either result in
- a few states cycling fast: \(\Delta S\) and \(\Delta t\) small, or
- a lot of states cycling slowly: \(\Delta S\) and \(\Delta t\) large.
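A minimal numeric sketch of this trade-off (illustrative numbers only, not from the note): with the energy \(E = \Delta S/\Delta t\) held fixed, a small \(\Delta S\) pairs with a small \(\Delta t\) and a large \(\Delta S\) with a large \(\Delta t\).

```python
E = 10.0  # states per unit time (hypothetical value)

few_fast = {"dS": 5.0, "dt": 5.0 / E}       # few states, short cycle
many_slow = {"dS": 500.0, "dt": 500.0 / E}  # many states, long cycle

for name, case in (("few states, fast", few_fast), ("many states, slow", many_slow)):
    assert abs(case["dS"] / case["dt"] - E) < 1e-9  # same energy in both regimes
    print(name, case)
```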
But a system consists of layers.
Assuming an ideal gas, the energy \(Q=TS\) is given by:
- the temperature \(T\): the average kinetic energy of one particle
- the entropy \(S\)
This divides the system into two layers:
- \(T\) encodes the information events (energy) of the layer below.
- \(S\) counts the events in the current layer.
The logarithm in entropy only comes up when we distribute the information over several variables of the same kind (e.g. bits). Going in the other direction, this is the reason the number of states appears as the exponential \(e^S\).
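A small sketch of that point, assuming binary variables (bits): the states of independent parts multiply, the entropy (their logarithm) adds, and going back from \(S\) to the state count gives the exponential.

```python
import math

# Number of states of a system built from n independent bits.
n_bits = 8
states = 2 ** n_bits          # states multiply: 2 * 2 * ... * 2
S_bits = math.log2(states)    # entropy in bits: the logarithm turns the product into a sum
S_nats = math.log(states)     # the same entropy in natural units (nats)

assert S_bits == n_bits                        # one bit of entropy per binary variable
assert abs(math.exp(S_nats) - states) < 1e-9   # going back: e**S recovers the state count
print(states, S_bits, S_nats)
```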
The particle’s direction of motion partitions the number \(N\) of particles
- by direction: factor \(1/2\), since the two directions along an axis are exclusive
- by orientation: factor \(3\), since \(T\), through averaging, acts on all three orientations simultaneously
Therefore, for an ideal gas the internal energy is equal to the work done on the surroundings, \(\frac{3}{2}pV\).
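Combining the two factors above with the ideal gas law \(pV = NkT\):
\[
U = N \cdot 3 \cdot \tfrac{1}{2}\,kT = \tfrac{3}{2}\,NkT = \tfrac{3}{2}\,pV
\]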
The average energy per particle \(E\) is:
\[
E = \tfrac{3}{2}kT = \tfrac{1}{2}m\,\overline{v^2}
\]
Boltzmann’s constant \(k\) is only a conversion factor between units of temperature and energy. \(\overline{v^2}\) can be related more directly to micro events per time than \(T\), but \(E\), too, is only the energy in that layer and not the ultimate unit of information events per time.
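A numeric sketch of these quantities (standard kinetic gas theory; nitrogen at room temperature is a hypothetical example, not from the note):

```python
import math

k = 1.380649e-23      # Boltzmann constant in J/K (conversion factor between T and energy)
T = 300.0             # temperature in K (hypothetical example)
m = 4.65e-26          # mass of one N2 molecule in kg (approximate)

E = 1.5 * k * T               # average kinetic energy per particle: E = 3/2 kT
v_rms = math.sqrt(2 * E / m)  # from E = 1/2 m <v^2>

print(f"E = {E:.3e} J per particle, v_rms = {v_rms:.0f} m/s")
```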
The ultimate information event is given by the Planck constant. The sum of all such events creates space and time: E-t, x-v, …
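A back-of-the-envelope sketch following this framing (an assumption, not a derivation from the note): counting quanta of action \(E\,\Delta t/\hbar\) as elementary events.

```python
hbar = 1.054571817e-34   # reduced Planck constant in J*s

E = 6.21e-21   # energy of one gas particle at ~300 K, in J (from the example above)
dt = 1.0       # one second of "cycling"

n_events = E * dt / hbar   # number of elementary (Planck-scale) information events
print(f"{n_events:.2e} elementary events")
```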