For a probability distribution $P$ with density $p$, the entropy is given by

\[H(p) = -\Ex_P(\log p).\]
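
For a discrete distribution on $K$ points this expands to

\[H(p) = -\sum_{i=1}^{K} p_i \log p_i;\]

for example, a fair coin has entropy $\log 2$.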

Note that

\[H(p) = -\KL(p \,\Vert\, u) + \textrm{Const},\]

where $u$ is the uniform distribution on the support of $P$ and the constant depends only on that support. In other words, up to an additive constant, the negative entropy is the Kullback-Leibler divergence from $P$ to the maximum entropy (uniform) distribution.
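
To see where the constant comes from, take the discrete case with $u_i = 1/K$:

\[\KL(p \,\Vert\, u) = \sum_{i=1}^{K} p_i \log \frac{p_i}{1/K} = \sum_{i=1}^{K} p_i \log p_i + \log K = -H(p) + \log K,\]

so $H(p) = -\KL(p \,\Vert\, u) + \log K$. The same computation over a bounded continuous support gives the log of its volume as the constant.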