Hi Caglar, thanks for leaving me a note.

Uncertainty and information are *not* negatively correlated. If there are many possible scenarios/cases (which is what “information” refers to in information theory), that means *more* uncertainty. :-)

For me, referring to entropy as “(all possible) information” sounds clearer than “uncertainty”. If the word “information” alone doesn’t click with you, maybe prefixing it with “all possible” will help?
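To make the point concrete, here’s a quick sketch (my own illustration, not from the original post) computing Shannon entropy: a uniform distribution over more outcomes — more possible scenarios — has strictly higher entropy, i.e. more uncertainty.

```python
import math

def entropy(p):
    # Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Uniform over 2 outcomes: 1 bit of uncertainty
print(entropy([0.5, 0.5]))    # 1.0

# Uniform over 8 outcomes: 3 bits — more scenarios, more uncertainty
print(entropy([1 / 8] * 8))   # 3.0
```

A fair coin carries 1 bit of entropy; a fair 8-sided die carries 3 bits, because there are more equally likely cases to be uncertain about.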

I’m an Engineering Manager at Scale AI and this is my notepad for Applied Math / CS / Deep Learning topics. Follow me on Twitter for more!