Term | entropy |
Definition | entropy (熵, 平均信息量, "average information content"): In information theory, the mean value of the measure of information conveyed by the occurrence of any one of a finite number of mutually exclusive and jointly exhaustive events of definite probabilities. In mathematical notation, this mean $H(x)$ for a set of events $x_1, \dots, x_n$ with probabilities $p(x_1), \dots, p(x_n)$ equals the mathematical expectation, or mean value, of the information content $I(x_i)$ of the individual events; that is: $H(x) = \sum_{i=1}^{n} p(x_i)\, I(x_i) = \sum_{i=1}^{n} p(x_i) \log \frac{1}{p(x_i)} = -\sum_{i=1}^{n} p(x_i) \log p(x_i)$ |
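As a worked illustration of the formula above, the following is a minimal Python sketch (not part of the original dictionary entry) that computes $H(x)$ for a discrete probability distribution. The function name `shannon_entropy`, the choice of base-2 logarithm, and the example distributions are illustrative assumptions, not taken from the source.

```python
import math

def shannon_entropy(probabilities, base=2):
    """Compute H(x) = -sum_i p(x_i) * log p(x_i) for a discrete distribution.

    The events are assumed mutually exclusive and jointly exhaustive,
    i.e. the probabilities are non-negative and sum to 1. Terms with
    p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    if not math.isclose(sum(probabilities), 1.0, rel_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# Example (base 2, so the unit is bits): a fair coin toss carries
# 1 bit of information on average, while a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

With base 2 the result is in bits; passing `base=math.e` would give nats instead.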