What is Entropy? Definition, Meaning, Equation, Units, Formula, Law, Thermodynamics

The word “entropy” comes from the Greek word entropia, meaning “a turning toward” or “transformation”. The concept of entropy provides deep insight into the phenomena taking place in the world around us.

What is Entropy? Definition & Meaning

Entropy is the measure of the randomness or disorder of a thermodynamic system. The German physicist Rudolf Clausius coined the term to describe this measure of disorder, and it first appeared in English in 1868. Entropy indicates how much thermal energy is unavailable for useful work.

Entropy Meaning & Explanation

To do work on a system, or to extract work from it, its particles must be arranged in an ordered way. The less ordered the arrangement of particles, the higher the entropy of the system. A system naturally moves toward higher entropy: in a reversible process the total entropy remains constant, while in an isolated system undergoing irreversible change the entropy always increases.

The German scientist Rudolf Clausius coined the term entropy in 1865, deriving it from the Greek en- (“in”) plus trope (“a turning”). The name bears a deliberate etymological resemblance to the word energy; it is believed to have been chosen to describe the kind of energy that eventually and inevitably degrades into useless heat.

History of Entropy

The notion was influenced by Sadi Carnot’s early statement of what is now known as the second law of thermodynamics. Entropy was introduced into statistical mechanics by the Austrian physicist Ludwig Boltzmann and the American scientist Willard Gibbs (around 1875). Max Planck later expanded on the concept.

The concept of entropy was later applied to quantum mechanics. Subsequently, Andrei Kolmogorov proposed the idea of entropy in dynamical systems, which Yakov Sinai refined; the resulting quantity is known as the Kolmogorov-Sinai entropy.

The formulation of Maxwell’s demon paradox by James C. Maxwell (around 1871) sparked a search for the physical meaning of information. That search culminated in Rolf Landauer’s discovery (1961) of the heat equivalent of erasing one bit of information, which brought together the notions of entropy in thermodynamics and information theory.

The term entropy is now employed in various other fields (such as sociology), sometimes far from physics or mathematics, where it loses its strict quantitative character. In such uses it usually refers to confusion, chaos, the loss of diversity, or a tendency toward a uniform distribution of types.


Explanation of Entropy

Entropy is a mathematical concept that expresses the intuitive sense that certain processes are impossible even though they would not violate the fundamental law of energy conservation. A block of ice placed on a hot stove, for example, will undoubtedly melt as the stove cools. This process is irreversible because no slight modification will cause the melted water to revert into ice while the stove heats back up.

In contrast, depending on whether a tiny amount of heat is given to or taken from the system, a block of ice placed in an ice-water bath will melt or freeze a bit more. It is reversible because just a minuscule amount of heat is required to reverse the process from progressive freezing to progressive thawing.

Similarly, compressed gas confined in a cylinder might either expand freely into the atmosphere (an irreversible process) if a valve were opened, or it could do useful work by pushing a movable piston against the force required to confine the gas. The latter is reversible because a minor increase in the restraining force can switch the direction of the process from expansion to compression. For reversible processes the system is in equilibrium with its surroundings; for irreversible processes it is not.

Formula or Equation of Entropy

According to the Clausius definition, if a quantity of heat Q flows into a large heat reservoir at a temperature T above absolute zero, the entropy increase is ΔS = Q/T. This equation effectively provides an alternative definition of temperature that agrees with the usual one. Suppose there are two heat reservoirs R1 and R2 at temperatures T1 and T2 (such as the stove and the block of ice). If heat Q flows from R1 to R2, the net entropy change for the two reservoirs is:

ΔS = Q(1/T2 − 1/T1)

This quantity is positive provided T1 > T2. Thus the observation that heat never flows spontaneously from cold to hot is equivalent to the requirement that a spontaneous flow of heat produce a positive net entropy change. If T1 = T2, the reservoirs are in equilibrium, no heat flows, and ΔS = 0.
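To make this concrete, here is a minimal Python sketch of the relation above; the heat quantity and temperatures are arbitrary illustrative values, not taken from the text.

```python
# Net entropy change when heat Q flows from a hot reservoir at T1
# to a cold reservoir at T2: dS = Q * (1/T2 - 1/T1).

def net_entropy_change(Q, T1, T2):
    """Return the entropy change in J/K for heat Q (in J) flowing from T1 to T2 (in K)."""
    return Q * (1.0 / T2 - 1.0 / T1)

# Illustrative values: 1000 J flowing from a 500 K stove to ice at 273 K.
dS = net_entropy_change(1000.0, 500.0, 273.0)
print(f"dS = {dS:.3f} J/K")  # positive, so the heat flow is spontaneous
```

Running this gives ΔS ≈ 1.663 J/K, a positive value, consistent with the heat flow being spontaneous.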

The condition ΔS = 0 establishes the highest attainable efficiency of heat engines, which are systems such as gasoline or steam engines that do work cyclically. Suppose that over one complete cycle a heat engine absorbs heat Q1 from R1 and exhausts heat Q2 to R2; the work obtained per cycle is then W = Q1 − Q2.

The net entropy change per cycle becomes:

ΔS = Q2/T2 − Q1/T1

To make W as large as possible, Q2 should be as small as possible relative to Q1. However, Q2 cannot be zero, because that would make ΔS negative and thereby violate the second law. The smallest feasible value of Q2 corresponds to the condition ΔS = 0. As a result,

(Q2/Q1)min = T2/T1

This is the basic equation limiting the efficiency of all heat engines. A process with ΔS = 0 is reversible, since a slight modification would be enough to make the heat engine run backward as a refrigerator.
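As a quick numerical illustration of this limit, the sketch below (with arbitrarily chosen reservoir temperatures) computes the maximum efficiency η = W/Q1 = 1 − Q2/Q1 = 1 − T2/T1:

```python
# Maximum (Carnot) efficiency implied by (Q2/Q1)_min = T2/T1:
# eta_max = W/Q1 = 1 - Q2/Q1 = 1 - T2/T1.

def carnot_efficiency(T_hot, T_cold):
    """Maximum fraction of the absorbed heat convertible to work (temperatures in K)."""
    return 1.0 - T_cold / T_hot

# Illustrative reservoirs at 600 K and 300 K:
print(f"eta_max = {carnot_efficiency(600.0, 300.0):.2f}")  # 0.50
```

With the hot reservoir at twice the cold reservoir's temperature, at most half of Q1 can be converted into work.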

The same approach can be used to calculate the entropy change of the working substance in a heat engine, such as a gas in a cylinder with a movable piston. When the gas absorbs a quantity of heat dQ from a heat reservoir at temperature T and expands reversibly against the highest feasible restraining pressure P, it does the maximum work dW = P dV, where dV is the change in volume. As the gas expands, its internal energy may change by dU, so energy conservation requires dQ = dU + P dV. Because the net entropy change for system plus reservoir is zero when maximum work is done, the entropy of the reservoir decreases by dSreservoir = dQ/T, and this decrease is counterbalanced by an equal increase

dSsystem = (dU + P dV)/T = dQ/T

for the working gas, so that dSsystem + dSreservoir = 0.

Because less than the maximum amount of work is done in any practical process (due to friction, for example), the actual quantity of heat dQ′ absorbed from the heat reservoir is less than the maximum dQ. The gas may, for example, be allowed to expand freely into a vacuum and do no work. As a result, it is possible to assert that

dSsystem = (dU + P dV)/T ≥ dQ’/T

This equation defines entropy as a thermodynamic state variable, meaning that its value is entirely determined by the system's present state rather than by how that state was reached. Entropy is an extensive property, so its magnitude depends on the amount of material in the system.

For a system in thermodynamic equilibrium, the entropy S is proportional to the natural logarithm of a quantity Ω, the number of microscopic configurations through which the macroscopic state corresponding to S can be realized; that is, S = k ln Ω, where k is the Boltzmann constant relating entropy to molecular energy.
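As a toy illustration of S = k ln Ω (the system and microstate count here are assumptions chosen for simplicity, not an example from the text), consider N independent two-state particles, for which Ω = 2^N:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical toy system: N independent two-state particles, so the number
# of microstates is Omega = 2**N and S = k ln(Omega) = N * k * ln(2).
# Computing ln(2**N) directly would overflow for large N, so we use N * ln(2).
N = 6.022e23  # roughly one mole of particles
S = N * k_B * math.log(2.0)
print(f"S = {S:.3f} J/K")  # about 5.76 J/K
```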

Because all spontaneous processes are irreversible, the world’s entropy is said to be growing, meaning that more and more energy becomes unavailable for conversion into work. The universe is supposed to be “running down” as a result of this.


Units of Entropy

The entropy change is ΔS = qrev/T (the change in the value of entropy is called the entropy change). The unit of entropy is J K⁻¹ mol⁻¹.

The units J K⁻¹ mol⁻¹ mean joules of (unavailable) energy per kelvin of temperature per mole of substance.

Types of Entropy

Entropy takes many forms in general, but in thermodynamics two formulations are commonly distinguished.

Thermodynamic Entropy

The first of these is thermodynamic entropy. In thermodynamics, a physical system is a collection of objects (bodies) whose state is defined by attributes such as density distribution, pressure, temperature, velocity, and chemical potential. When a physical system transitions from one state to another, its entropy changes:

ΔS=∫dQ/T

Here dQ signifies a heat element absorbed by the body (or expelled, in which case it carries a negative sign), T is the absolute temperature of the body at that moment, and the integration runs over all heat elements involved in the transition. This formula can be used to compare the entropies of different states of a system, and hence to determine the entropy of each state up to an additive constant (which suffices in most cases). The third law of thermodynamics fixes the absolute value of entropy.

Notice that when an element of heat dQ is transferred from a warmer body at temperature T1 to a cooler one at temperature T2, the entropy of the first body changes by −dQ/T1, while that of the second rises by dQ/T2. Since T2 < T1, the latter fraction is larger in absolute value, so the entropy of the two-body system as a whole increases (while the total energy remains the same).
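The integral ΔS = ∫dQ/T can also be evaluated numerically. The sketch below assumes, purely for illustration, a body with a constant heat capacity C, so that dQ = C dT and the exact answer is C ln(T2/T1):

```python
import math

def entropy_change(C, T1, T2, steps=100_000):
    """Numerically integrate dQ/T with dQ = C * dT (C in J/K, temperatures in K)."""
    dT = (T2 - T1) / steps
    S, T = 0.0, T1
    for _ in range(steps):
        S += C * dT / (T + 0.5 * dT)  # midpoint rule for each heat element dQ/T
        T += dT
    return S

# Illustrative values: about 1 kg of water (C ~ 4184 J/K) heated from 300 K to 350 K.
C, T1, T2 = 4184.0, 300.0, 350.0
print(f"numerical:         {entropy_change(C, T1, T2):.3f} J/K")
print(f"exact C ln(T2/T1): {C * math.log(T2 / T1):.3f} J/K")
```

The two printed values agree to several decimal places, as expected for a smooth integrand.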

A system is considered isolated if it does not interact with its environment (i.e., is not influenced by it in any way). An isolated system, for instance, exchanges neither energy, nor matter, nor even information with its surroundings. According to the first law of thermodynamics, an isolated system can only transition between states of the same total energy (the principle of conservation of energy).

The second law of thermodynamics establishes the irreversibility of evolution: an isolated system cannot pass from a higher-entropy state to a lower-entropy one. Equivalently, the second law states that it is impossible to carry out a process whose sole result is the transfer of heat from a cooler medium to a warmer one. Any such transfer must involve outside work, and the elements performing that work also change their states, raising the total entropy.

The first and second laws of thermodynamics together imply that an isolated system will tend to the state of maximal entropy among all states of the same energy. This state is called the equilibrium state, and reaching it is interpreted as the thermodynamic death of the system. The energy distributed in this state is incapable of any further activity.

Boltzmann Entropy

Boltzmann defined entropy in terms of the total number of accessible microstates of a system containing many particles, specified by the macroscopic parameters energy E, volume V, and particle number N. Consider a gas of N simple particles in a volume V, with each particle's microstate represented by its position vector Ri and velocity vector vi.

Simple particles are defined as those with no internal degrees of freedom; atoms such as argon and neon are good examples. (Strictly, they do have internal degrees of freedom, but these are assumed to remain unchanged in the processes considered here.) Assuming the gas is highly dilute, so that particle interactions can be ignored, the system's total energy is simply the sum of the particles' kinetic energies.

He made this precise by postulating the formula:

S = kB ln W

Here kB is the Boltzmann constant (1.380 × 10⁻²³ J/K), W is the number of accessible microstates of the system, and the logarithm is the natural logarithm. Boltzmann's entropy looks quite different from Clausius' entropy; nonetheless, in all circumstances where entropy changes can be computed, the values given by the two approaches agree. Boltzmann's definition has, however, caused a great deal of debate about whether entropy is a subjective quantity.

The following misconception appears in many popular science books: "The entropy of a system is thought to be connected to our knowledge of its state, so if we know the system is in a particular state, its entropy is zero." This leaves the reader unsure whether entropy depends on the observer's knowledge of the system's state. The confusion stems from a misinterpretation of W, the system's total number of accessible microstates. If W = 1, the entropy is indeed zero (as it is for many substances at 0 K); but if there are W accessible states and we happen to know which one the system is in, the entropy is still k ln W rather than zero.
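The point is easy to verify with a minimal sketch of S = kB ln W (the microstate counts here are arbitrary examples):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B * ln(W) for a system with W accessible microstates."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))      # 0.0 J/K: only one accessible microstate
print(boltzmann_entropy(10**6))  # ~1.9e-22 J/K, regardless of whether we
                                 # happen to know which state is occupied
```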

In general, Boltzmann's definition does not yield an explicit entropy function. For some specific systems, however, one can be derived; the most famous case is the ideal gas, for which an explicit entropy function exists.

Entropy of a System

The entropy of a system measures its random disorder, and the difference in this measure between two states is the entropy change. When the chaos in an isolated system grows, its entropy rises as well; in chemical processes, entropy increases when reactants break down into a larger number of products. Likewise, a system at a higher temperature is more disordered than one at a lower temperature. From these examples it is clear that entropy increases as regularity decreases.

Entropy Change and Calculations

The entropy change of a process is defined as the quantity of heat emitted or absorbed isothermally and reversibly, divided by the absolute temperature. The relevant formulas are as follows:

Total entropy change, ∆Stotal = ∆Ssurroundings + ∆Ssystem

Suppose the system loses heat q at temperature T1, and that heat is received by the surroundings at temperature T2.

Then ∆Stotal can be calculated:

  • ∆Ssystem = −q/T1
  • ∆Ssurroundings = q/T2
  • ∆Stotal = −q/T1 + q/T2

If ∆Stotal is positive, the process is spontaneous; if it is negative, the process is nonspontaneous; and if it is zero, the process is at equilibrium.
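The following minimal sketch applies these formulas; the heat and temperature values are arbitrary illustrative choices.

```python
def total_entropy_change(q, T1, T2):
    """dS_total when a system at T1 (K) loses heat q (J) to surroundings at T2 (K)."""
    dS_system = -q / T1
    dS_surroundings = q / T2
    return dS_system + dS_surroundings

# Illustrative values: 500 J lost at 400 K to surroundings at 300 K.
dS = total_entropy_change(500.0, 400.0, 300.0)
if dS > 0:
    print(f"dS_total = {dS:.3f} J/K > 0: spontaneous")
elif dS < 0:
    print(f"dS_total = {dS:.3f} J/K < 0: nonspontaneous")
else:
    print("dS_total = 0: equilibrium")
```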

The entropy change during an isothermal reversible expansion of an ideal gas is:

  • ∆S = qrev,iso/T

According to the first law of thermodynamics,

  • ∆U=q+w

For the isothermal expansion of an ideal gas, ∆U = 0

  • qrev = −wrev = nRT ln(V2/V1)

Therefore,

  • ∆S = nR ln(V2/V1)
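A short sketch of this result, using an assumed example of one mole of gas doubling its volume:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def isothermal_entropy_change(n, V1, V2):
    """dS = n * R * ln(V2/V1) for reversible isothermal expansion of an ideal gas."""
    return n * R * math.log(V2 / V1)

# Illustrative case: 1 mol of gas doubling its volume.
print(f"dS = {isothermal_entropy_change(1.0, 1.0, 2.0):.3f} J/K")  # ~5.763 J/K
```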

Importance of Entropy

Entropy holds great importance in physics, chemistry, and daily life; existence without entropy is impossible to conceive.

  • One of the foremost significances of entropy is that it shows the direction in which heat flows. Spontaneous processes of this kind are irreversible, and all natural processes are spontaneous.
  • Another important significance of entropy is that it helps determine an object's thermodynamic condition. A little thought reveals that a spontaneous process carries the system from a less probable state to a more probable one.
  • Like temperature, volume, and pressure, entropy describes the state of a body.
  • It measures the system's instability or randomness: when a gas expands into a vacuum, when water flows out of a reservoir, or when a spontaneous chain reaction begins, disorder increases and hence so does entropy.

Applications of Entropy in Daily Life

  • Entropy has various applications in the medical, physical, and chemical fields. A common everyday example involves gas molecules confined in a closed container. Releasing a confined gas, such as steam from a pressure cooker, reduces the agitation (and hence the temperature) of the steam molecules: if we remove the weight from the pressure cooker's top and let the steam fill a large balloon, the temperature of the steam in the balloon is much lower than that of the steam in the cooker. Once released, the steam expands and its entropy increases, while the enthalpy (energy) it contains is spread over a larger volume.
  • Another application of entropy is in the four-stroke diesel engine. During the first (intake) stroke, the piston moves from top to bottom and pulls air into the cylinder; this air is then compressed far more strongly (a ratio of roughly 1:20) than in a four-stroke petrol engine. This drop in entropy without a loss of enthalpy (i.e., without dissipating heat into the surrounding matter or environment) produces a useful rise in air temperature of roughly 600 degrees Fahrenheit, which is high enough to ignite the diesel sprayed in at that moment.
  • Entropy also appears in chemistry, where oxidation is one of its preferred routes. Entropy works on the chemical structure of every physical thing humanity builds, whether made mostly of metal or of other elements. The chemical composition eventually deteriorates to the point that the object's original purpose is no longer feasible, rust being the best-known example. Liquids are another typical case: when not used for its original purpose, almost every liquid, whether related to food or to industrial operations, begins to deteriorate and becomes unusable fairly quickly.
  • Entropy also has applications in heat processes. With each degree of temperature rise, entropy promotes the breakdown, degradation, and destruction of the object in question; heat is entropy's preferred way of rendering manufactured machinery and electronics obsolete. Entropy is minimized by lowering the temperature toward 0.01 K and is maximized as the temperature rises to millions or billions of degrees.
  • Wherever feasible, entropy combines these mechanisms to speed up the destruction process, often exponentially. The finest example is friction creating heat, which causes expansion, which causes more friction, which creates more heat, and so on, with the inevitable and sometimes rapid destruction of the affected item. This is most commonly seen in manufactured devices with moving parts.

Conclusion

In short, by considering the applications, significance, types, and equations of entropy, we come to see that entropy is a common and important phenomenon in thermodynamic systems and in the universe at large.

The entropy equation, its importance in everyday life, and its change during routine and isothermal expansion of a gas leave ample room for scientists to discover more about it. Entropy can be regarded as one of the key phenomena producing change in the universe.
