
Cross-Entropy, Explained in Plain Language

English answer:

Cross-entropy is a measure of the difference between two probability distributions. It is often used as a loss function in machine learning, where the goal is to minimize the difference between the predicted distribution and the true distribution.
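As a concrete illustration of that use, here is a minimal sketch in PyTorch (one common framework choice, used here only for illustration). torch.nn.functional.cross_entropy takes raw logits plus integer class labels, applies softmax internally, and returns the mean cross-entropy loss over the batch:

```python
import torch
import torch.nn.functional as F

# Raw model outputs (logits) for a batch of 2 examples over 3 classes,
# and the true class index for each example.
logits = torch.tensor([[2.0, 0.5, 0.1],
                       [0.2, 0.3, 3.0]])
targets = torch.tensor([0, 2])

# cross_entropy applies softmax to the logits internally, so the predicted
# distribution q is softmax(logits); the true distribution p is the one-hot
# vector for each target, so each example contributes -log q(true class).
loss = F.cross_entropy(logits, targets)
print(loss.item())  # mean loss over the batch
```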

The cross-entropy of two probability distributions p and q is defined as:

H(p, q) = -Σₓ p(x) log q(x)

where the sum runs over every value x that the random variable can take.
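Translated directly into code, the definition is a one-liner. A minimal NumPy sketch (the function name and the eps guard against log(0) are illustrative choices, not a standard API):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    # H(p, q) = -sum_x p(x) * log(q(x)), in nats (use np.log2 for bits).
    # eps guards against log(0) when q assigns zero probability to some x.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + eps))

# A one-hot "true" distribution p and a model's predicted distribution q:
p = [0.0, 1.0, 0.0]
q = [0.1, 0.7, 0.2]
print(cross_entropy(p, q))  # -log(0.7) ≈ 0.357
```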

The cross-entropy is always non-negative, and it is minimized, not driven to zero, when q = p: in general H(p, q) ≥ H(p), with equality if and only if q = p. The gap H(p, q) − H(p) is the KL divergence D_KL(p‖q), which is zero exactly when the two distributions coincide.

The cross-entropy measures the cost of using q to approximate p: it is the average number of bits (with log base 2; nats with the natural log) needed to encode outcomes drawn from p using a code optimized for q. The worse the approximation, the higher the cross-entropy, and the more information is lost to the mismatch.
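The "minimized exactly when q = p" property stated above is Gibbs' inequality. A quick numeric check rather than a proof, reusing the same definition (natural logarithms, so values are in nats):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                  # treat 0 * log(0) as 0, by convention
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(q[mask]))

p = [0.5, 0.25, 0.25]
q = [0.8, 0.10, 0.10]

print(entropy(p))           # H(p)    ≈ 1.040 nats
print(cross_entropy(p, p))  # H(p, p) = H(p): no extra cost when q matches p
print(cross_entropy(p, q))  # H(p, q) ≈ 1.263 nats > H(p)
```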

Chinese answer:

What is cross-entropy?

Cross-entropy is a way to measure the difference between two probability distributions. In machine learning it is often used as a loss function, with the goal of minimizing the difference between the predicted distribution and the true distribution.

The cross-entropy formula

The cross-entropy of two probability distributions p and q is defined as:

H(p, q) = -Σₓ p(x) log q(x)

where the sum runs over every value x that the random variable can take.

What cross-entropy means

Cross-entropy is always non-negative, and it is minimized, rather than reaching zero, when q = p: H(p, q) ≥ H(p), with equality if and only if the two distributions coincide.

Cross-entropy measures how much information is lost when q is used to approximate p: the larger the cross-entropy, the more information is lost.
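That "lost information" has a precise name: it is the KL divergence, D_KL(p‖q) = H(p, q) − H(p). A short sketch of this decomposition using scipy.stats.entropy, which returns H(p) when given one distribution and D_KL(p‖q) when given two (the numbers here are made up for illustration):

```python
import numpy as np
from scipy.stats import entropy  # entropy(p) -> H(p); entropy(p, q) -> D_KL(p || q)

p = np.array([0.5, 0.25, 0.25])  # true distribution
q = np.array([0.8, 0.10, 0.10])  # approximation of p

h_p  = entropy(p)                # entropy of p, in nats
d_kl = entropy(p, q)             # information lost by using q in place of p
h_pq = -np.sum(p * np.log(q))    # cross-entropy H(p, q)

print(np.isclose(h_pq, h_p + d_kl))  # True: H(p, q) = H(p) + D_KL(p || q)
```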

