Pukyong National University
Complex systems, from brain networks to financial markets, are often represented as weighted networks where the edge weights indicate the strength of interactions between nodes. However, traditional network analysis methods may fail to capture subtle dependencies between these weights, potentially missing crucial information about the system's behavior and structure.
In this study, we introduce an information-theoretic approach to uncovering hidden dependencies in weighted networks based on information entropy. Our method quantifies the amount of information contained in a network's weight distribution and identifies patterns of dependency that are not apparent from conventional network measures.
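The core quantity can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes edge weights normalized to [0, 1] and a simple equal-width histogram estimator of the Shannon entropy of the weight distribution (the bin count and range are illustrative choices):

```python
import numpy as np

def weight_entropy(weights, bins=10, value_range=(0.0, 1.0)):
    """Shannon entropy (in bits) of an edge-weight distribution,
    estimated from an equal-width histogram over `value_range`."""
    counts, _ = np.histogram(weights, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log2(p)))

# A broad (near-uniform) weight distribution carries more
# information than one concentrated around a single value.
rng = np.random.default_rng(0)
uniform_w = rng.uniform(0.0, 1.0, 1000)   # spread-out weights
peaked_w = rng.normal(0.55, 0.01, 1000)   # tightly peaked weights
print(weight_entropy(uniform_w) > weight_entropy(peaked_w))  # True
```

A histogram estimator is the simplest choice here; in practice the bin width (or a kernel/k-nearest-neighbor estimator) would need to be selected carefully, since the entropy estimate depends on the discretization.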
We apply our approach to various empirical networks, including brain connectivity networks, financial correlation networks, and social interaction networks. Our results reveal significant hidden dependencies that conventional methods overlook. In brain networks, we identify subtle patterns of neural connectivity that correlate with cognitive functions. In financial networks, we detect interdependencies between market sectors that become particularly pronounced during periods of market stress.
The proposed entropy-based framework provides a powerful tool for analyzing weighted networks across diverse domains, offering new insights into the complex dependencies that shape system behavior. This approach has potential applications in understanding brain function, predicting financial market dynamics, and analyzing other complex networked systems where weighted interactions play a crucial role.