math - How to distinguish low entropy and high entropy with the information produced using Shannon entropy


I calculated the entropy of a user's behaviour over one day, where the possible states of occurrence are h := home, w := work, and e := elsewhere. The user has one state for each hour of the day: {h,h,h,h,h,h,h,h,h,w,w,w,w,w,w,w,w,w,e,e,h,h,h,h}

p(h) = 13/24 ≈ 0.54

p(w) = 9/24 ≈ 0.38

p(e) = 2/24 ≈ 0.08

Using Shannon's entropy, I wanted to calculate the user's entropy level to figure out how predictable the user is.

H(A) = -p(h)*log2(p(h)) - p(w)*log2(p(w)) - p(e)*log2(p(e))

     = 0.48 + 0.53 + 0.30 ≈ 1.31 bits
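The calculation above can be checked with a short Python sketch (the state string and function name are just illustrative choices, not part of the question):

```python
from collections import Counter
from math import log2

# hourly states for one day: h = home, w = work, e = elsewhere
states = list("hhhhhhhhhwwwwwwwwweehhhh")  # 13 h, 9 w, 2 e

def shannon_entropy(seq):
    """H(A) = -sum over states x of p(x) * log2(p(x))."""
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in Counter(seq).values())

H = shannon_entropy(states)
print(round(H, 2))  # prints 1.31
```

Counting the states from the sequence instead of hard-coding the probabilities avoids the rounding error that creeps in when 13/24 is truncated to 0.54.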

How do I distinguish whether this value in bits represents low entropy or high entropy?

Maximum entropy is achieved when all events are equally probable, i.e. when the outcome has the highest uncertainty.


In my case there are three possible states, so the maximum entropy is:

Hmax = log2(3) ≈ 1.585 bits

The minimum is 0. I cannot decide whether my result should be classified as high or low entropy.
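One common way to decide is to normalize the observed entropy by the maximum entropy for the number of distinct states, giving a value in [0, 1] (sometimes called entropy efficiency). The 0.5 cut-off below is an arbitrary illustrative threshold, not a standard:

```python
from math import log2

H = 1.3085              # entropy computed above, in bits
n_states = 3            # h, w, e
H_max = log2(n_states)  # maximum entropy: all states equally likely

efficiency = H / H_max  # normalized entropy in [0, 1]
print(round(efficiency, 2))  # prints 0.83
print("high" if efficiency > 0.5 else "low")  # prints high
```

By this measure the user's behaviour sits at about 83% of maximum uncertainty, so relative to the three-state alphabet it would count as high entropy (i.e. not very predictable), even though 1.31 bits sounds small in absolute terms.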

