One thing I’ve enjoyed learning about, lately, is the concept of entropy.
In everyday dictionary terms, entropy means a “state of disorder, confusion, and disorganisation.” In physics, entropy is a measure of the disorder or randomness in a system: the higher the entropy, the greater the disorder. This idea sits at the heart of the second law of thermodynamics.
Entropy is an interesting concept because it explains why there are immensely more ways for things to go wrong than to go right: ordered arrangements are rare, while disordered ones are overwhelmingly common. In a closed system (that is, one without any input of energy), entropy can only increase.
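To make the “more ways to go wrong” idea concrete, here is a quick sketch (an illustration of Boltzmann’s relation S = k ln W, not something from a specific source) using 100 coin flips. The perfectly ordered macrostate “all heads” corresponds to exactly one microstate, while the disordered “50 heads” macrostate corresponds to roughly 10^29 of them, so its entropy is far higher.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(n_coins: int, n_heads: int) -> int:
    """Number of distinct arrangements with exactly n_heads heads."""
    return math.comb(n_coins, n_heads)

def boltzmann_entropy(w: int) -> float:
    """Boltzmann entropy S = k * ln(W) for W microstates."""
    return k_B * math.log(w)

n = 100
for heads in (100, 90, 50):
    w = microstates(n, heads)
    print(f"{heads:3d} heads: W = {w:.3e}, S = {boltzmann_entropy(w):.3e} J/K")
```

Left to chance, the coins land in one of the vastly more numerous disordered arrangements; order is the statistical exception, which is exactly what the second law formalises.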