Hi Caglar, thanks for leaving me a note.

Uncertainty and information are *not* negatively correlated. If there are many possible scenarios/cases (this is what “information” refers to in information theory), there is more uncertainty. :-)
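To make that concrete, here's a minimal sketch (my own illustration, not from the original post) computing Shannon entropy for uniform distributions: more equally likely outcomes means higher entropy, i.e. more uncertainty.

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p))
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distributions: more possible outcomes -> higher entropy.
print(entropy([0.5, 0.5]))    # 2 outcomes -> 1.0 bit
print(entropy([0.25] * 4))    # 4 outcomes -> 2.0 bits
print(entropy([0.125] * 8))   # 8 outcomes -> 3.0 bits
```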

For me, referring to entropy as “(all possible) information” sounds clearer than “uncertainty”. If the word “information” doesn’t click with you, maybe adding “all possible” would make it click?

I’m an Engineering Manager at Scale AI and this is my notepad for Applied Math / CS / Deep Learning topics. Follow me on Twitter for more!
