Entropy measures a system's thermal energy per unit temperature that is unavailable for doing useful work. It is also a measure of the system's molecular randomness or disorder, since work is obtained from ordered molecular motion. The principle of entropy provides a fascinating insight into the direction of spontaneous change in many everyday phenomena. The German physicist Rudolf Clausius introduced entropy in 1850. Entropy is a thermodynamic property, like temperature, pressure and volume, but unlike them it cannot be visualized easily.
Properties of Entropy
Entropy is a state function: its change depends only on the initial and final states of the system, not on the path followed between them. It is also an extensive property, and its SI unit is joules per kelvin (J/K).
Entropy Changes and Calculations
The entropy change of a process is defined as the heat absorbed reversibly divided by the absolute temperature at which it is transferred. The formula for the entropy change is given below:
∆S = qrev/T
If we supply the same amount of heat at a lower temperature, the resulting increase in randomness, and hence the entropy change, is greater. In other words, for a given quantity of heat, the entropy change is inversely proportional to the absolute temperature.
Total entropy change, ∆Stotal = ∆Ssurroundings + ∆Ssystem
Note that the total change in entropy is the sum of the entropy changes of the system and the surroundings.
Suppose the system loses heat q at a temperature T1, and this heat is absorbed by the surroundings at a temperature T2.
Therefore, ∆Stotal can be calculated as:
∆Ssystem = -q/T1
∆Ssurrounding = q/T2
∆Stotal = -q/T1+q/T2
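The calculation above can be sketched numerically. The function and values below are illustrative assumptions, not part of the original text:

```python
def total_entropy_change(q, T1, T2):
    """Entropy changes for heat q (in J) leaving a system at T1 (in K)
    and entering the surroundings at T2 (in K)."""
    dS_system = -q / T1        # system loses heat q at temperature T1
    dS_surroundings = q / T2   # surroundings gain heat q at temperature T2
    return dS_system, dS_surroundings, dS_system + dS_surroundings

# Example: 1000 J flows from a system at 500 K to surroundings at 300 K
dS_sys, dS_surr, dS_tot = total_entropy_change(1000.0, 500.0, 300.0)
```

Since T2 < T1 here, ∆Stotal comes out positive, as expected for spontaneous heat flow from hot to cold.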
The entropy change for some common processes can be calculated as follows:
Change in Entropy During Reversible Isothermal Expansion of an Ideal Gas
According to the first law of thermodynamics,
∆U = q + w
For the isothermal expansion of an ideal gas, ∆U = 0, so q = -w:
qrev = -wrev = nRTln(V2/V1)
∆S = nRln(V2/V1)
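This formula can be checked with a short sketch; the example values are illustrative assumptions:

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def isothermal_entropy_change(n, V1, V2):
    """Entropy change (J/K) for reversible isothermal expansion of n mol
    of an ideal gas from volume V1 to V2. Temperature cancels out,
    since qrev = nRT ln(V2/V1) and dS = qrev/T."""
    return n * R * math.log(V2 / V1)

# Example: 1 mol of ideal gas doubles its volume
dS = isothermal_entropy_change(1.0, 1.0, 2.0)  # nR ln 2, about 5.76 J/K
```

Note that the entropy change is positive for an expansion (V2 > V1) and negative for a compression.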
Change in Entropy During Reversible Adiabatic Expansion
Heat exchange would be zero for an adiabatic phase (q = 0), so reversible adiabatic expansion occurs at a constant entropy (isentropic),
q = 0
∆S = 0
While reversible adiabatic expansion is isentropic, irreversible adiabatic expansion is not:
∆S ≠ 0
Entropy is the thermodynamic function used to measure a system's randomness and disorder. The entropy of a solid (whose particles are tightly packed) is less than that of a gas (whose particles are free to move). Scientists have also concluded that entropy increases in a spontaneous process. The total entropy includes the entropy of the system and the entropy of the surroundings. There are several equations to calculate the entropy:
1. When the process occurs at a constant temperature, the entropy would be:
ΔSsystem = qrev/T
where ΔS is the entropy change,
qrev is the heat exchanged reversibly,
and T is the absolute temperature in kelvin.
2. If the reaction is known, we can use a table of standard entropy values to find ΔSrxn.
ΔSrxn = ΣΔSproducts - ΣΔSreactants
where ΔSrxn is the standard entropy change of the reaction,
ΣΔSproducts is the sum of the standard entropies of the products,
and ΣΔSreactants is the sum of the standard entropies of the reactants.
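A minimal sketch of this products-minus-reactants calculation, using approximate tabulated standard molar entropies for the reaction 2 H2(g) + O2(g) → 2 H2O(g); the table and function names are illustrative assumptions:

```python
# Approximate standard molar entropies at 298 K, in J/(mol K)
S_STANDARD = {"H2(g)": 130.7, "O2(g)": 205.2, "H2O(g)": 188.8}

def reaction_entropy(products, reactants):
    """ΔSrxn = Σ(coef × S°products) - Σ(coef × S°reactants).
    products/reactants map species names to stoichiometric coefficients."""
    s_prod = sum(coef * S_STANDARD[sp] for sp, coef in products.items())
    s_react = sum(coef * S_STANDARD[sp] for sp, coef in reactants.items())
    return s_prod - s_react

# 2 H2(g) + O2(g) -> 2 H2O(g)
dS_rxn = reaction_entropy({"H2O(g)": 2}, {"H2(g)": 2, "O2(g)": 1})
```

The result is negative, which makes sense: three moles of gas become two, so disorder decreases.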
3. It is also possible to use the Gibbs free energy (ΔG) and the enthalpy (ΔH) to find ΔS.
ΔG = ΔH - TΔS
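Rearranging gives ΔS = (ΔH - ΔG)/T, which can be sketched as follows; the vaporization example uses rounded illustrative values:

```python
def entropy_from_gibbs(dH, dG, T):
    """Rearranged from ΔG = ΔH - TΔS:  ΔS = (ΔH - ΔG) / T.
    dH and dG in J/mol, T in K; returns ΔS in J/(mol K)."""
    return (dH - dG) / T

# Example: vaporization of water at its boiling point (373 K),
# where ΔG ≈ 0 and ΔH ≈ 40,700 J/mol (rounded values)
dS_vap = entropy_from_gibbs(40700.0, 0.0, 373.0)  # about 109 J/(mol K)
```

At equilibrium ΔG = 0, so the entropy of vaporization reduces to ΔH/T.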