Distance entropy (DE) quantifies node centrality based on the distribution of shortest-path lengths from a node to all other nodes in the network [2]. The distance entropy of node \(i\) is defined as
\begin{equation*}
c_{\text{DE}}(i) = -\frac{1}{\log (M_i - m_i + 1)} \sum_{k=m_i}^{M_i} p_k^{(i)} \log p_k^{(i)},
\end{equation*}
where \(M_i = \max_j d_{ij}\) and \(m_i = \min_j d_{ij}\) are the maximum and minimum shortest-path distances from node \(i\), and \(p_k^{(i)}\) is the probability that the distance from \(i\) to another node equals \(k\) (when \(M_i = m_i\), all distances coincide, the distribution is degenerate, and \(c_{\text{DE}}(i)\) is set to zero). Specifically, in a connected graph \(G\), if node \(i\) is at distance \(k\) from \(n_k\) other nodes, then
\[
p_k^{(i)} = \frac{n_k}{N-1},
\]
where \(N\) is the total number of nodes in the network.
Distance entropy captures how evenly the shortest-path distances from a node are distributed: values close to one indicate that the node reaches comparable numbers of nodes at every distance, whereas values close to zero indicate that most other nodes lie at the same few distances. This property is exploited in [2] to chart node centrality, including across the layers of multilayer networks.
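The definition above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the adjacency-list graphs and function names are hypothetical, shortest paths are computed by plain breadth-first search, and the degenerate case \(M_i = m_i\) is mapped to zero as described above.

```python
import math
from collections import deque

def bfs_distances(adj, source):
    """Shortest-path lengths (in hops) from source to all reachable nodes."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_entropy(adj, i):
    """Normalized Shannon entropy of the distance distribution of node i.

    Distances k run from m_i to M_i, i.e. over M_i - m_i + 1 possible
    values, and the entropy is normalized by the log of that count so the
    result lies in [0, 1]. A degenerate distribution (M_i = m_i) gives 0.
    """
    dist = bfs_distances(adj, i)
    lengths = [d for node, d in dist.items() if node != i]
    n = len(lengths)  # equals N - 1 in a connected graph
    counts = {}
    for d in lengths:
        counts[d] = counts.get(d, 0) + 1  # n_k for each distance k
    span = max(lengths) - min(lengths) + 1
    if span == 1:
        return 0.0  # all distances coincide: degenerate distribution
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(span)

# Path graph 0-1-2-3: from end node 0 the distances are 1, 2, 3 --
# one node at each distance, a uniform distribution, so c_DE is maximal.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(distance_entropy(path, 0))  # ≈ 1.0
```

For contrast, the centre of a star graph sees every other node at distance 1, so its distance distribution is degenerate and its distance entropy is zero.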

References

[1] Shvydun, S. (2025). Zoo of Centralities: Encyclopedia of Node Metrics in Complex Networks. arXiv:2511.05122. doi: 10.48550/arXiv.2511.05122.
[2] Stella, M., & De Domenico, M. (2018). Distance entropy cartography characterises centrality in complex networks. Entropy, 20(4), 268. doi: 10.3390/e20040268.