One thing I’ve enjoyed learning about, lately, is the concept of entropy.

In everyday usage, entropy means a “state of disorder, confusion, and disorganisation.” In physics, entropy is a measure of the disorder or randomness in a system: the higher the entropy, the greater the disorder. This idea sits at the heart of the second law of thermodynamics.

Entropy is an interesting concept because it explains why there are immensely more ways for things to go wrong than to go right. In a closed system (that is, one with no input of energy), entropy can only increase.
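The point about “more ways to go wrong than to go right” can be made concrete with a quick count of microstates. Here is a small illustration of my own (not from the post): for 100 coin flips, there is exactly one arrangement that is all heads, but an astronomical number of arrangements that are an even mix of heads and tails.

```python
from math import comb

n = 100

# Perfectly "ordered" outcome: all 100 coins land heads.
# There is only one arrangement that achieves this.
ordered = comb(n, n)

# "Disordered" outcome: 50 heads and 50 tails, in any order.
# The number of such arrangements is the binomial coefficient C(100, 50).
mixed = comb(n, n // 2)

print(ordered)  # 1
print(mixed)    # roughly 1.0 x 10**29 arrangements
```

Left to chance, a system is overwhelmingly likely to end up in one of the vastly more numerous disordered arrangements, which is the intuition behind entropy tending to increase.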
