Logical depth
Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of the algorithm with nearly minimal length, rather than the length of the minimal algorithm.
Informally, the logical depth of a string $x$ to a significance level $s$ is the time required to compute $x$ by a program no more than $s$ bits longer than the shortest program that computes $x$.[1]
Formally, let $p^*$ be the shortest program that computes a string $x$ on some universal computer $U$. Then the logical depth of $x$ to the significance level $s$ is given by
$$\min\{T(p) : (|p| - |p^*| < s) \wedge (U(p) = x)\},$$
where $T(p)$ is the number of computation steps that $p$ made on $U$ to produce $x$ and halt.
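As a rough illustration of the definition above, the following is a minimal toy sketch (not from the source): it uses a tiny hand-rolled interpreter as a stand-in for the universal computer $U$ and brute-forces the minimisation over programs at most $s$ bits longer than a shortest one. Genuine logical depth is defined relative to a true universal machine and is uncomputable; the names here (`U`, `logical_depth`, the unary program encoding) are hypothetical choices made only for this example.

```python
# Toy sketch of logical depth at significance level s (illustration only).
# Assumptions not in the source: programs are bit strings, and U below is a
# hand-rolled stand-in for a universal computer that returns (output, steps).
# True logical depth is defined over a genuine universal machine and is
# uncomputable; this bounded brute-force search merely mirrors the formula
#   depth_s(x) = min{ T(p) : |p| - |p*| < s  and  U(p) = x }.

from itertools import product

def U(program: str):
    """Toy 'universal computer': a program of n ones prints 'x' n times in n steps;
    any program containing a zero is treated as non-halting (no output)."""
    if "0" in program:
        return None, None
    n = len(program)
    return "x" * n, n

def logical_depth(x: str, s: int, max_len: int = 12):
    # Step 1: find the length of a shortest program p* with U(p*) = x
    # (a bounded, Kolmogorov-style search).
    shortest_len = None
    for length in range(1, max_len + 1):
        for bits in product("01", repeat=length):
            out, _ = U("".join(bits))
            if out == x:
                shortest_len = length
                break
        if shortest_len is not None:
            break
    if shortest_len is None:
        return None  # no program within the search bound produces x

    # Step 2: minimise running time T(p) over programs with |p| - |p*| < s and U(p) = x.
    best = None
    for length in range(1, shortest_len + s):  # strict inequality: |p| < |p*| + s
        for bits in product("01", repeat=length):
            out, steps = U("".join(bits))
            if out == x and (best is None or steps < best):
                best = steps
    return best

print(logical_depth("xxx", s=2))  # -> 3 in this toy model
```

In this toy model the only program printing "xxx" is "111", so the depth is simply its running time; with a richer instruction set the fastest program within $s$ bits of minimal length need not be the shortest one, which is exactly the slack the significance level is meant to allow.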
References
- ^ Antunes, Luís; Bauwens, Bruno; Souto, André; Teixeira, Andreia (2017-02-01). "Sophistication vs Logical Depth". Theory of Computing Systems. 60 (2): 280–298. arXiv:1304.8046. doi:10.1007/s00224-016-9672-6. ISSN 1433-0490. S2CID 253745548.
- Bennett, Charles H. (1988), "Logical Depth and Physical Complexity", in Herken, Rolf (ed.), The Universal Turing Machine: a Half-Century Survey, Oxford U. Press, pp. 227–257, CiteSeerX 10.1.1.70.4331
- Craig, Edward (1998), "Computability and Information, Section 6: Logical depth", Routledge Encyclopedia of Philosophy, Vol. 10: Index, Taylor & Francis, p. 481, ISBN 9780415073103