en·tro·py /ˈɛntrəpi/ [en-truh-pee] n. 1. Thermodynamics. (in statistical mechanics) a measure of the randomness of the microscopic constituents of a thermodynamic system. Symbol: S

My definition of entropy came from Doc Hebden, my chem 11 and 12 professor back at Kam High - I'll come back to him later. Entropy is a measure of the increasing tendency toward randomness. That is, if you left things to their own devices, the end state would be the most random distribution possible: lacking order, even verging on chaotic.
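As an aside, this tendency has a standard formalization in statistical mechanics (not something from the original dictionary entry, but the textbook version of the idea): Boltzmann's entropy formula, which says entropy grows with the number of ways a system can be arranged.

```latex
% Boltzmann's entropy formula: S is entropy, k_B is the Boltzmann
% constant, and W is the number of microstates consistent with the
% system's macroscopic state. More possible arrangements (more W)
% means more entropy -- which is why "cluttered" states dominate:
% there are vastly more of them than tidy ones.
S = k_B \ln W
```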
Take my kitchen counter island (peninsula, really). I built this really nice spot between the kitchen and the "not so great room": a granite countertop, perfect for hanging out during those miscellaneous daily activities like having a coffee or reading e-mail. Ideally, it's a clean space, with only a telephone, a laptop and a notepad on it. But over time it becomes a repository for whatever is in your arms that needs a place to be set down. Over the course of a few days or a week, we've put so much stuff on there that you can't even see the counter anymore! No matter how hard we try, the natural state of the countertop is clutter.
So every once in a while, you have to fight the chaos: turn back the clock on entropy and clean up the mess.
Originally Posted on: Oct 22, 2007