logic, languages, information and computability

Mathematician and information theorist Gregory Chaitin, inspired by the recent appearance of three popular books on the Riemann Hypothesis, discusses the possibility that it might in some sense be appropriate to treat it as an axiom.

"The traditional view held by most mathematicians is that... the RH cannot be taken as [an] axiom, and cannot require new axioms, we simply must work much harder to prove them. According to the received view, we're not clever enough, we haven't come up with the right approach yet. This is very much the current concensus. However this majority view completely ignores the incompleteness phenomenon discovered by Gödel, by Turing, and extendedby my own work on information-theoretic incompleteness. What if there is no proof?"



"Gödel's result was a major body blow to mathematicians everywhere. There were so many statements about numbers, and especially prime numbers, which appeared to be true but we had no idea how to prove. Goldbach: that every number is the sum of two prime numbers; Twin Primes: that there are infinitely many primes differing by 2, such as 17 and 19. Were these going to be statements that we couldn't prove from the existing axiomatic foundations?

There is no denying that it was an unnerving state of affairs. Maybe the Riemann Hypothesis was simply unprovable within our current axiomatic description of what we think we mean by arithmetic. Many mathematicians consoled themselves with the belief that anything that is really important should be provable, that it is only tortuous statements with no valuable mathematical content that will end up being one of Gödel's unprovable statements.

But Gödel was not so sure. In 1951, he questioned whether our current axioms were sufficient for many of the problems of number theory.

"One is faced with an infinite series of axioms which can be extended further and further, without any end being visible...It is true that in the mathematics of today the higher levels of this hierarchy are practically never used...it is not altogether unlikely that this character of present-day mathematics may have something to do with its inability to prove certain fundamental theorems, such as, for example, Riemann's Hypothesis."

from M. du Sautoy, The Music of the Primes (Fourth Estate, 2003)
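
The two conjectures mentioned in the passage are at least easy to state computationally. The following minimal Python sketch (illustrative only, and of course no substitute for a proof) verifies Goldbach's statement for even numbers up to 1000 and lists the first few twin prime pairs; the bound of 1000 is an arbitrary choice.

    # Illustrative check of the two conjectures quoted above, over a small range;
    # no finite computation can settle either of them.

    def primes_up_to(n):
        """Simple sieve of Eratosthenes."""
        sieve = [True] * (n + 1)
        sieve[0] = sieve[1] = False
        for p in range(2, int(n ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p::p] = [False] * len(sieve[p * p::p])
        return [i for i, is_p in enumerate(sieve) if is_p]

    N = 1000
    primes = primes_up_to(N)
    prime_set = set(primes)

    # Goldbach: every even number greater than 2 is a sum of two primes (checked up to N).
    for n in range(4, N + 1, 2):
        assert any((n - p) in prime_set for p in primes if p <= n // 2), n

    # Twin primes: pairs of primes differing by 2, such as (17, 19).
    twins = [(p, p + 2) for p in primes if p + 2 in prime_set]
    print(twins[:8])    # (3, 5), (5, 7), (11, 13), (17, 19), ...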



A.S. Karpenko, Lukasiewicz Logics and Prime Numbers (English translation), Logical Studies 13 (2005) (special issue)

[abstract:] "In this book we investigate functional properties of finite-valued Lukasiewicz logics Ln+1. The truth-functional approach to the study of Ln+1 allowed to give a definition of prime numbers in algebraic-logical terms (Finn's theorem). As a result, there emerged an opportunity to clarify the structure of prime numbers. It turns out that prime numbers can be represented as rooted trees. We present an algorithm that for every prime number n builds a rooted tree with natural numbers as nodes and n as a root. Furthermore, we define finite-valued logics Kn+1 that have tautologies if and only if n is a prime number. Kn+1 turn out to have the same functional properties as Ln+1 whenever n is a prime number. Thus, Kn+1 is "logic for prime numbers." It is interesting that a combination of logics for prime numbers helps discover a law of generation of classes of prime numbers. Lastly, along with categorization of prime numbers we also give characterization, in terms Lukasiewicz's logical matrices, of powers of primes, odd numbers, and even numbers, the last of which is of particular interest.

In algebraic terms, we investigate the clones of finite MV-chains."

A.S. Karpenko, Lukasiewicz's Logic and Prime Numbers (introduction and contents of Russian language book), Logical Studies 9 (2002)

[abstract:] "For the first time in the world literature this monograph shows a direct relation between logic and prime numbers. Although many-valued Lukasiewicz's logics are the result of refutation of Aristotle's fatalistic argument their functional properties have a theoretically-numeric nature. Consideration of this fact allowed to give the definition of prime number in logical terms. So real possibility to clarify a structure of prime numbers appeared. Consequently, prime numbers can be presented in the form of rooted trees. A combination of different logical definitions of prime number leads to the construction of algorithm for creation of classes of prime numbers. Computer programs for the algorithm as well as for the trees are elaborated in this monograph. Also different tableaux of numbers are published for the first time."

[as of 02/06 the book has been published in English by Luniver Press (UK)]

A.S. Karpenko, "Characterization of prime numbers by Lukasiewicz's many-valued logics", Bulletin of the Section of Logic 13/2 (1984)

A.S. Karpenko, "Sheffer's stroke for prime numbers", Bulletin of the Section of Logic 23/3 (1994)

A.S. Karpenko, "Characterization of prime numbers in Lukasiewicz's logical matrix", Studia Logica



S.W. Golomb, "Probability, information theory, and prime number theory", Discrete Mathematics 106-107 (1992) 219-229

[abstract:] "For any probability distribution D = {\alpha(n)} on Z+, we define. . . the probability in D that a 'random' integer is a multiple of m; and . . . the probability in D that a 'random' integer is relatively prime to k. We specialize this general situation to three important families of distributions . . . Several basic results and concepts from analytic prime number theory are revisited from the perspective of these families of probability distributions, and the Shannon entropy for each of these families is determined."



K.H. Knuth, "Deriving laws from ordering relations", In: G.J. Erickson, Y. Zhai (eds.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings 707 (2003) 204-235.

[abstract:] "The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' Theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics."

K.H. Knuth, "Lattice duality: The origin of probability and entropy", Neurocomputing 67 C (2005) 245-274

[author's description:] "This paper shows how a straightforward generalization of the zeta function of a distributive lattice gives rise to bi-valuations that represent degrees of belief in Boolean lattices of assertions and degrees of relevance in the distributive lattice of questions. The distributive lattice of questions originates from Richard T. Cox's definition of a question as the set of all possible answers, which I show is equivalent to the ordered set of down-sets of assertions. Thus the Boolean lattice of assertions is shown to be dual to the distributive lattice of questions in the sense of Birkhoff's Representation Theorem. A straightforward correspondence between bi-valuations generalized from the zeta functions of each lattice gives rise to bi-valuations that represent probabilities in the lattice of assertions and bi-valuations that represent entropies and higher-order informations in the lattice of questions."
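
To make the phrase "zeta function of a lattice" concrete: for any finite poset the zeta function is zeta(x, y) = 1 if x <= y and 0 otherwise, and its inverse in the incidence algebra is the Möbius function. The sketch below computes both for the divisor lattice of 60, where the poset Möbius function mu(a, b) reduces to the classical number-theoretic mu(b/a). This is standard order-theoretic machinery, offered only as background; it does not reproduce Knuth's bi-valuation construction.

    # Zeta and Moebius functions of a finite poset, illustrated on the divisor
    # lattice of 60 (divisors ordered by divisibility).  For this lattice the
    # poset Moebius function mu(a, b) equals the classical number-theoretic
    # Moebius function of b // a.
    from functools import lru_cache

    n = 60
    divisors = [d for d in range(1, n + 1) if n % d == 0]

    def zeta(a, b):
        """Zeta function of the poset: 1 if a <= b (i.e. a divides b), else 0."""
        return 1 if b % a == 0 else 0

    @lru_cache(maxsize=None)
    def mobius(a, b):
        """Poset Moebius function: mu(a, a) = 1, mu(a, b) = -sum_{a <= z < b} mu(a, z)."""
        if a == b:
            return 1
        if not zeta(a, b):
            return 0
        return -sum(mobius(a, z) for z in divisors if zeta(a, z) and zeta(z, b) and z != b)

    def classical_mu(m):
        """Classical Moebius function via trial division."""
        result, p = 1, 2
        while p * p <= m:
            if m % p == 0:
                m //= p
                if m % p == 0:          # squared prime factor
                    return 0
                result = -result
            p += 1
        return -result if m > 1 else result

    for a in divisors:
        for b in divisors:
            if zeta(a, b):
                assert mobius(a, b) == classical_mu(b // a)
    print("poset Moebius function agrees with the classical one on the divisor lattice of", n)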



P. Kumar, P.C. Ivanov, H.E. Stanley, "Information entropy and correlations in prime numbers"

[abstract:] "The difference between two consecutive prime numbers is called the distance between the primes. We study the statistical properties of the distances and their increments (the difference between two consecutive distances) for a sequence comprising the first 5 x 107 prime numbers. We find that the histogram of the increments follows an exponential distribution with superposed periodic behavior of period three, similar to previously-reported period six oscillations for the distances."

Nature article on this research (24/03/03)
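
The quantities studied in this paper are easy to reproduce on a smaller scale. The Python sketch below uses the first 10^5 primes (rather than the paper's 5 x 10^7), computes the distances and their increments, and prints a crude histogram of the increments; the sieve limit and sample size are arbitrary choices made to keep the computation small.

    # Prime "distances" (gaps) and their increments, on a much smaller sample
    # than the paper's 5 x 10^7 primes.
    from collections import Counter

    def first_primes(count, limit=2_000_000):
        """Return the first `count` primes (the sieve limit must be generous enough)."""
        sieve = bytearray([1]) * (limit + 1)
        sieve[0] = sieve[1] = 0
        for p in range(2, int(limit ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
        primes = [i for i in range(limit + 1) if sieve[i]]
        assert len(primes) >= count, "increase the sieve limit"
        return primes[:count]

    primes = first_primes(100_000)
    distances = [q - p for p, q in zip(primes, primes[1:])]
    increments = [e - d for d, e in zip(distances, distances[1:])]

    # Histogram of the increments; the paper reports an exponential envelope with
    # a superposed period-three oscillation.
    histogram = Counter(increments)
    for value in sorted(histogram):
        if abs(value) <= 18:
            print(f"{value:+4d}  {histogram[value]}")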



C. Bonanno and M.S. Mega, "Toward a dynamical model for prime numbers", Chaos, Solitons and Fractals 20 (2004) 107-118

[abstract:] "We show one possible dynamical approach to the study of the distribution of prime numbers. Our approach is based on two complexity methods, the Computable Information Content and the Entropy Information Gain, looking for analogies between the prime numbers and intermittency."

The main idea here is that the Manneville map T_z exhibits a phase transition at z = 2, at which point the mean Algorithmic Information Content of the associated symbolic dynamics grows like n/log n, where n is a kind of iteration number. For this to work, the domain [0,1] of T_z must be partitioned as [0, 0.618...] U [0.618..., 1], where 0.618... is the reciprocal of the golden mean 1.618....

The authors attempt to exploit the resemblance of this n/log n growth to the approximating function in the Prime Number Theorem, and in some sense to model the distribution of primes in dynamical terms, i.e. to relate the sequence of primes (as a binary string) to the orbits of the Manneville map T_2. Certain refinements of this are then explored.
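
For those who want to experiment, the following sketch iterates the Pomeau-Manneville map in its commonly used form T_z(x) = x + x^z (mod 1) at z = 2, reads off a binary symbol sequence using the partition at 1/phi = 0.618..., and compares symbol densities with the prime indicator string. This is only one plausible reading of the setup described above, not a reconstruction of the authors' computation.

    # Symbolic dynamics of the Pomeau-Manneville map T_z(x) = x + x^z (mod 1)
    # at z = 2, with the unit interval partitioned at 1/phi = 0.618...
    # This is only an illustrative reading of the construction described above.
    import math

    PHI_INV = (math.sqrt(5) - 1) / 2       # 0.6180..., reciprocal of the golden mean

    def manneville_symbols(x0, steps, z=2.0):
        """Iterate T_z from x0, recording 0 for the cell [0, 1/phi] and 1 otherwise."""
        x, symbols = x0, []
        for _ in range(steps):
            symbols.append(0 if x <= PHI_INV else 1)
            x = (x + x ** z) % 1.0
        return symbols

    def prime_indicator(length):
        """Binary sequence whose n-th entry (starting at n = 2) is 1 iff n is prime."""
        sieve = [True] * (length + 2)
        sieve[0] = sieve[1] = False
        for p in range(2, int((length + 1) ** 0.5) + 1):
            if sieve[p]:
                for q in range(p * p, length + 2, p):
                    sieve[q] = False
        return [1 if sieve[n] else 0 for n in range(2, length + 2)]

    N = 100_000
    orbit = manneville_symbols(x0=0.31, steps=N)
    primes = prime_indicator(N)

    # The paper's comparison is made at the level of information content, not raw
    # symbol densities, so this is only a first orientation.
    print("fraction of 1s in the Manneville symbol string:", sum(orbit) / N)
    print("fraction of 1s in the prime indicator string:  ", sum(primes) / N)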

"We remark that this approach to study prime numbers is similar to the probabilistic approach introduced by Cramér...that is we assume that the [binary] string [generated by the sequence of primes]...is one of a family of strings on which there is a probability measure..."



K.K. Nambiar, "Information-theoretic equivalent of Riemann Hypothesis"

[abstract:] "Riemann Hypothesis is viewed as a statement about the capaicity of a communication channel as defined by Shannon."



B. Long, The structure of information (preprint 09/03)

[abstract:] "A formal model of the structure of information is presented in five axioms which define identity, containment, and joins of infons. Joins are shown to be commutative, associative, provide inverses of infons, and, potentially, have many identity elements, two of which are multiplicative and additive. Those two types of join are distributive. The other identity elements are for operators on entwined states. Multiplicative joins correspond to adding or removing new bits to a system while additive joins correspond to a change of state. The order or size of an infon is defined. This groundwork is intended to be used to model continuous and discreet information structures through time, especially in closed systems."

[personal communication regarding number theory content:] "The connection to prime numbers is this: From five axioms that define infons in terms of the identity relation and a sort of containment, I define a join operator for joining two infons under containment and show that infon joins are commutative, associative, etc. and that, depending upon the intersection between the joined infons, joins can be multiplicative, additive, or in-between (for entwined states). Multiplicative joins occur whenever disjoint infons are joined. Because they are multiplicative, repeated divisions into disjoint parts end at infons of prime size. This is interesting because, due to the relation between states and bits, infons are analogous to very small closed state systems."



J. Shallit, "Number theory and formal languages" in Emerging Applications of Number Theory, eds. D. Hejhal, et. al., IMA volumes in Mathematics and its Applications 109 (Springer, 1999)

[abstract:] "I survey some of the connections between formal languages and number theory. Topics discussed include applications of representation in base k, representation by sums of Fibonacci numbers, automatic sequences, transcendence in finite characteristic, automatic real numbers, fixed points of homomorphisms, automaticity, and k-regular sequences."

P. Enflo, A. Granville, J. Shallit and S. Yu, "On sparse languages $L$ such that $LL = \Sigma^{*}$", Discrete Applied Mathematics 52 (1994) 275-285

[abstract:] "A language $L \in \Sigma^{*}$ is said to be sparse if $L$ contains a vanishingly small fraction of all possible strings of length $n$ in $\Sigma^{*}$. C. Ponder asked if there exists a sparse language $L$ such that $LL = \Sigma^{*}$. We answer this question in the affirmative. Several different constructions are provided, using ideas from probability theory, fractal geometry, and analytic number theory. We obtain languages that are optimally sparse, up to a constant factor. Finally, we consider the generalization $L^{j} = \Sigma^{*}."



C.S. Calude and M. Stay, "Natural halting probabilities, partial randomness, and zeta functions" (preprint 01/06)

[abstract:] "We introduce the natural halting probability and the natural complexity of a Turing machine and we relate them to program-size complexity and Chaitin's halting probability. A classification of Turing machines according to their natural (Omega) halting probabilities is proposed: divergent, convergent and tuatara. We prove the existence of universal convergent and tuatara machines. Various results on randomness and partial randomness are proved. For example, we show that the natural halting probability of a universal tuatara machine is c.e. and random. A new type of partial randomness, asymptotic randomness, is introduced. Finally we show that in contrast to classical (algorithmic) randomness - which cannot be characterised in terms of plain complexity - various types of partial randomness admit such characterisations."



B. Poonen, "Undecidability in number theory", Notices of the AMS 55 (2008)


