
Entropy

  • In layman's terms, entropy is a measure of the randomness or disorder of a system. The more disordered or random a system is, the higher its entropy, and vice versa.

  • A very simple example to understand entropy: if you put a bird in a cage, it is limited to moving within the volume of that cage. But when you free that bird, it has a huge volume of atmosphere to move in. So we can easily conclude that the bird is less random, i.e. possesses less entropy, when it is confined to a cage, while it is highly random, i.e. possesses very high entropy, when it is free to move in the surrounding atmosphere.


  • But what factor decides the level of disorder? What determines whether a system has high or low entropy?

  • This is where the number of arrangements, or configurations, of a system plays the main role.

  • If a system’s particles have a very large number of different arrangements or configurations that all result in the same state of the system, then that state is highly disordered.

  • Similarly, if a system’s particles have only a few arrangements or configurations that result in the same state of the system, then that state is highly ordered.

  • In this manner, the entropy of any system in a given state is determined. Here are two examples to illustrate this principle:

  1. Imagine that you are in a room with a fan mounted on the ceiling. The moving fan changes the positions, i.e. the arrangement or configuration, of the air particles at every instant. Yet in every arrangement the fan produces, you can still breathe without any problem. There are millions upon millions of such arrangements, so this state is highly disordered, and its entropy keeps increasing at every instant. Now, hypothetically, suppose all the air particles were concentrated in one corner of the room in a very small volume. There would be only a few arrangements the particles could have in such a small volume, so it would be a highly ordered state. In terms of probability, the high-entropy state has far more arrangements and is therefore far more likely, while a state with only a few arrangements of particles is very unlikely to occur by itself, because it is highly ordered.

  2. Imagine there are 100 coins on a table:

    • Suppose all are facing heads up; call this event A. There is only 1 arrangement of this type.

    • Suppose all are facing heads up except one coin; call this event B. There are exactly 100 arrangements of this type.

    • Suppose half the coins are facing heads up; call this event C. There are around 10^29 such arrangements.

    • There are 101 such events in total (0 heads up through 100 heads up). Let's compare the two extreme cases: all heads up and half the coins heads up. In event A the number of arrangements is only 1, so it is highly ordered, while in event C the number of arrangements is around 10^29, so it is highly disordered. Event C is therefore far more likely to occur by itself than event A.

    • From the above examples, we can conclude that systems naturally tend toward disordered states, i.e. states with high entropy.
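The coin counts above can be checked directly: the number of arrangements with k coins heads up out of n is the binomial coefficient C(n, k). A quick Python sketch (the variable names are just for illustration):

```python
import math

# Count the arrangements (microstates) behind each event (macrostate)
# for n = 100 coins. A macrostate is defined only by how many coins
# show heads, not by which particular coins they are.
n = 100

event_a = math.comb(n, n)       # all 100 heads up      -> 1 arrangement
event_b = math.comb(n, n - 1)   # 99 heads, 1 tail      -> 100 arrangements
event_c = math.comb(n, n // 2)  # 50 heads, 50 tails    -> the maximum

print(event_a)            # 1
print(event_b)            # 100
print(f"{event_c:.2e}")   # about 1.01e+29
```

Because every individual arrangement is equally likely, the half-heads event is roughly 10^29 times more probable than all-heads, which is exactly why disordered states dominate.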
