IEVref: | 171-07-15 | ID: | |

Language: | en | Status: Standard | |

Term: | entropy, <in information theory> | ||

Synonym1: | average information content [Preferred] | ||

Synonym2: | negentropy [Deprecated] | ||

Synonym3: | |||

Symbol: | H(X) | ||

Definition: | mean value of the information content of the events in a finite set of mutually exclusive and jointly exhaustive events $H=\sum_{i=1}^{n} p(x_i)\cdot I(x_i)=\sum_{i=1}^{n} p(x_i)\cdot \log\left(\frac{1}{p(x_i)}\right)$ where $X=\{x_1,\dots,x_n\}$ is the set of events $x_i\ (i=1,\dots,n)$, $I(x_i)$ are their information contents and $p(x_i)$ the probabilities of their occurrences, subject to $\sum_{i=1}^{n} p(x_i)=1$ EXAMPLE Let $\{a,b,c\}$ be a set of three events and let $p(a)=0{,}5$, $p(b)=0{,}25$ and $p(c)=0{,}25$ be the probabilities of their occurrences. The entropy of this set is $H(X)=p(a)\cdot I(a)+p(b)\cdot I(b)+p(c)\cdot I(c)=1{,}5\ \mathrm{Sh}$. | ||
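The defining formula and the worked example can be checked with a minimal sketch (not part of the IEV record; the function name `entropy_sh` is illustrative). With the logarithm taken to base 2, the result is expressed in shannons (Sh), matching the example:

```python
import math

def entropy_sh(probs):
    """Entropy H(X) in shannons: sum of p(x_i) * log2(1 / p(x_i))."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing (p * log(1/p) -> 0).
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Example from the definition: p(a) = 0,5; p(b) = 0,25; p(c) = 0,25
print(entropy_sh([0.5, 0.25, 0.25]))  # → 1.5
```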

Publication date: | 2019-03-29 | ||

Source | IEC 80000-13:2008, 13-25, modified – Addition of information useful for the context of the IEV, and adaptation to the IEV rules | ||

Replaces: | |||

Internal notes: | |||

CO remarks: | |||

TC/SC remarks: | |||

VT remarks: | |||

Domain1: | |||

Domain2: | |||

Domain3: | |||

Domain4: | |||

Domain5: | |||
