One thing I’ve enjoyed learning about, lately, is the concept of entropy.

In the dictionary sense, entropy is a “state of disorder, confusion, and disorganisation.” In physics, entropy is a measure of disorder or randomness in a system: the higher the entropy, the greater the disorder. This idea sits at the heart of the second law of thermodynamics.

Entropy is an interesting concept because it explains why there are immensely more ways for things to go wrong than to go right. In a closed system (that is, one without any input of energy), entropy can only increase or, at best, stay the same; it never decreases.
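To make that “more ways to go wrong” idea concrete, here is a small back-of-the-envelope sketch in Python (my own illustration, using coin flips as a stand-in for the microstates of a system): there is exactly one way for 100 coins to land all heads, but an astronomical number of ways for them to land half heads and half tails.

```python
from math import comb

# Treat 100 coin flips as a toy "system": each particular
# arrangement of heads and tails is one microstate.
n = 100

# Exactly one arrangement is perfectly "ordered" (all heads),
# while the "disordered" fifty-fifty split can happen in
# comb(100, 50) different ways.
ordered = 1
disordered = comb(n, n // 2)

print(f"All-heads arrangements:   {ordered}")
print(f"Fifty-fifty arrangements: {disordered:.3e}")
# Roughly 1.009e+29 — disordered outcomes overwhelmingly
# outnumber the single ordered one, which is why, left alone,
# systems drift toward disorder.
```

The numbers speak for themselves: disorder is not a mysterious force, it is simply what overwhelming probability looks like.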
