Abstract
We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdős–Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
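Two of the phenomena named in the abstract can be illustrated with a short, self-contained sketch: the emergence of a giant component in Erdős–Rényi graphs above the threshold p = 1/n, and the fact that Shannon entropy and compressibility (a crude, computable upper bound on Kolmogorov complexity) measure different things. This is an illustrative sketch only, not code from the paper; the sampler, union-find helper and threshold constants below are our own assumptions.

```python
import random
import zlib
from math import log2

def er_graph(n, p, rng):
    """Sample an Erdos-Renyi G(n, p) graph as an edge list (assumed helper)."""
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p]

def largest_component(n, edges):
    """Size of the largest connected component, via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    sizes = {}
    for node in range(n):
        r = find(node)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values())

def entropy_bits(bits):
    """Shannon entropy in bits per symbol of a 0/1 sequence."""
    p1 = sum(bits) / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    return -p1 * log2(p1) - (1 - p1) * log2(1 - p1)

rng = random.Random(0)
n = 300
# Below the threshold p = 1/n the largest component stays small (O(log n));
# above it, a giant component of size O(n) emerges.
sub = largest_component(n, er_graph(n, 0.5 / n, rng))  # subcritical
sup = largest_component(n, er_graph(n, 2.0 / n, rng))  # supercritical

# Entropy vs. compressibility: a perfectly periodic sequence and a random one
# have nearly identical per-symbol Shannon entropy (~1 bit), yet a compressor
# separates them sharply.
alt = [i % 2 for i in range(1000)]              # 0101... perfectly regular
rnd = [rng.randrange(2) for _ in range(1000)]   # pseudorandom
c_alt = len(zlib.compress(bytes(alt)))
c_rnd = len(zlib.compress(bytes(rnd)))
print(sub / n, sup / n, entropy_bits(alt), entropy_bits(rnd), c_alt, c_rnd)
```

The compressed lengths act only as an upper-bound proxy for Kolmogorov complexity: the periodic string compresses to a few bytes while the random one does not, even though entropy alone cannot tell them apart.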
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 32-43 |
| Number of pages | 12 |
| Journal | Seminars in Cell and Developmental Biology |
| Volume | 51 |
| DOIs | |
| State | Published - Mar 1 2016 |
| Externally published | Yes |
Keywords
- Algorithmic probability
- Algorithmic randomness
- Biological networks
- Complex networks
- Information theory
- Kolmogorov complexity
ASJC Scopus subject areas
- Developmental Biology
- Cell Biology