Entropy (Information Theory)

Initially written as a Tweet, but to be expounded on in this blog post.

When the data source produces a low-probability value (i.e., when a low-probability event occurs), the event carries more “information” than when the data source produces a high-probability value.
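As a quick illustration (a sketch not from the original post), this idea is captured by Shannon's self-information, −log₂(p): the lower the probability of an event, the more bits of information its occurrence carries.

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# A rare event carries more information than a common one.
print(self_information(0.9))   # high-probability event: few bits
print(self_information(0.01))  # low-probability event: many bits
```

A fair coin flip (p = 0.5) carries exactly 1 bit; an event with p = 0.01 carries over 6 bits.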

See the Wikipedia article on Entropy and Claude Shannon's "A Mathematical Theory of Communication".

In the context of relationships

This translates to relationships because people put their best foot forward. More information is conveyed when low-probability events occur: for example, an emotional outburst triggered by a stressful situation, or a slip of the tongue excused as banter.


Call to action

Be vigilant. Observe your surroundings and your environment. Analyze patterns.
