Entropy and mutual information-based centrality (EMI)
Entropy and mutual information-based centrality
(EMI) quantifies the importance of a node by combining its structural entropy with the mutual information it shares with its neighbors, which can highlight nodes whose importance is misjudged when connectivity alone is considered [2]. For an unweighted, undirected network, the centrality of node \(i\) is defined as
\begin{equation*}
c_{\mathrm{EMI}}(i) = S(i) + MI(i),
\end{equation*}
where \(S(i)\) is the
structural entropy
of node \(i\):
\begin{equation*}
S(i) = - \sum_{j \in \mathcal{N}(i)} \frac{d_j}{\sum_{l \in \mathcal{N}(i)} d_l}
\log \frac{d_j}{\sum_{l \in \mathcal{N}(i)} d_l},
\end{equation*}
with \(d_j\) being the degree of neighbor \(j\). The term \(MI(i)\) captures the mutual information between node \(i\) and its neighbors:
\begin{equation*}
MI(i) = \sum_{j \in \mathcal{N}(i)} \frac{|\mathcal{N}(i) \cap \mathcal{N}(j)|}{|\mathcal{N}(i)| |\mathcal{N}(j)|} \log \frac{|\mathcal{N}(i)|+|\mathcal{N}(j)|-|\mathcal{N}(i) \cap \mathcal{N}(j)|}{|\mathcal{N}(i)| |\mathcal{N}(j)|}.
\end{equation*}
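The two formulas above translate directly into code. The following is a minimal sketch in Python, assuming the graph is given as an adjacency list mapping each node to the set of its neighbors (the function name and input format are illustrative, not from the source):

```python
from math import log

def emi_centrality(adj):
    """EMI centrality for an unweighted, undirected graph.

    `adj` maps each node to the set of its neighbors (illustrative format).
    Returns a dict mapping each node to S(i) + MI(i).
    """
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    scores = {}
    for i, nbrs in adj.items():
        # Structural entropy S(i): Shannon entropy of the degree
        # distribution over the neighbors of i.
        total = sum(deg[j] for j in nbrs)
        S = -sum((deg[j] / total) * log(deg[j] / total) for j in nbrs) if total else 0.0
        # MI(i): neighborhood-overlap term summed over the neighbors of i,
        # following the displayed formula term by term.
        MI = 0.0
        for j in nbrs:
            common = len(adj[i] & adj[j])  # |N(i) ∩ N(j)|
            MI += (common / (deg[i] * deg[j])) * log(
                (deg[i] + deg[j] - common) / (deg[i] * deg[j])
            )
        scores[i] = S + MI
    return scores
```

On a triangle (the complete graph \(K_3\)), every node has two neighbors of degree 2, so \(S(i) = \log 2\) and \(MI(i) = 2 \cdot \tfrac{1}{4} \log \tfrac{3}{4}\), giving the same score for all three nodes, as symmetry requires.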