Ω is my own personal favorite transcendental number. Ω isn't really a specific number, but rather a family of related numbers with bizarre properties. It's the one real transcendental number that I know of that comes from the theory of computation, that is important, and that expresses meaningful fundamental mathematical properties. It's also deeply non-computable, meaning that not only is it non-computable, but even computing meta-information about it is non-computable. And yet, it's almost computable. It's just all around awfully cool.
So. What is Ω?
It's sometimes called the halting probability. The idea of it is that it encodes the probability that a given infinitely long random bit string contains a prefix that represents a halting program.
It's a strange notion, and I'm going to take a few paragraphs to try to elaborate on what that means, before I go into detail about how the number is generated, and what sorts of bizarre properties it has.
Remember that in the theory of computation, one of the most fundamental results is the non-computability of the halting problem. The halting problem is the question "Given a program P and input I, if I run P on I, will it ever stop?" You cannot write a program that reads P and I and gives you the answer to the halting problem. It's impossible. And what's more, the statement that the halting problem is not computable is actually equivalent to the fundamental statement of Gödel's incompleteness theorem.
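To see why, here's the classic diagonalization argument sketched in Python. The halts function below is a hypothetical stand-in: we assume, purely for the sake of contradiction, that such a decider could be written.

    # Sketch of the classic diagonalization argument. Assume, for
    # contradiction, that halts(program, source) returns True exactly when
    # running `program` on the input `source` eventually stops. The stub
    # below is a stand-in; the argument shows it can't actually be written.

    def halts(program, source):
        raise NotImplementedError("no such decider can exist")

    def paradox(p):
        # Do the opposite of whatever the supposed decider predicts about
        # running the program p on its own source.
        if halts(p, p):
            while True:     # predicted to halt, so loop forever
                pass
        else:
            return          # predicted to loop, so halt immediately

    # Now ask: does paradox(paradox) halt? If halts(paradox, paradox)
    # answers yes, then paradox(paradox) loops forever; if it answers no,
    # then paradox(paradox) halts. Either way the decider is wrong, so no
    # correct halts() can ever be written.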
Ω is something related to the halting problem, but stranger. The fundamental question of Ω is: if you hand me a string of 0s and 1s, and I'm allowed to look at it one bit at a time, what's the probability that eventually the part that I've seen will be a program that eventually stops?
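(To connect that to an actual probability: a particular program whose encoding is n bits long shows up as the prefix of a uniformly random bit string with probability exactly 2^(-n), since each of its n bits has to come up right - for example, a specific 3-bit program is hit with probability 2^(-3) = 1/8. Ω is just the total probability mass of all the programs that halt.)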
When you look at this definition, your reaction should be "Huh? Who cares?"
The catch is that this number - this probability - is easy to define; but it's not computable; it's completely uncompressible; and it's normal.
Let's take a moment and look at those properties:
Non-computable: no program can compute Ω. In fact, beyond a certain bit position N (which is itself non-computable!), you cannot compute the value of any bit of Ω.
Uncompressible: there is no way to represent Ω finitely; in fact, there is no way to represent any prefix of Ω in significantly fewer bits than that prefix itself contains.
Normal: the digits of Ω are completely random and unpatterned; the value of any digit in Ω is equally likely to be a zero or a one; any selected pair of digits is equally likely to be any of the 4 possibilities 00, 01, 10, 11; and so on.
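Just to make the meaning of normality concrete (this illustrates the definition, it doesn't prove anything about Ω), here's a tiny Python sketch that tallies the frequencies of adjacent bit pairs in a bit string; for the binary expansion of a normal number, all four frequencies tend toward 1/4 as you look at more and more bits:

    from collections import Counter

    def pair_frequencies(bits):
        """Tally the overlapping 2-bit patterns in a string of '0's and '1's."""
        pairs = Counter(bits[i:i + 2] for i in range(len(bits) - 1))
        total = sum(pairs.values())
        return {p: pairs[p] / total for p in ("00", "01", "10", "11")}

    # For a normal number's binary expansion, each of these four frequencies
    # approaches 0.25 as the number of bits examined grows.
    print(pair_frequencies("0110100110010110"))   # a short, purely illustrative string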
So, now we know a little bit about why Ω is so strange, but we still haven't really defined it precisely. What is Ω? How does it get these bizarre properties?
Well, as I said at the beginning, Ω isn't a single number; it's a family of numbers. The value of an Ω is based on two things: an effective (that is, Turing-equivalent) computing device, and a prefix-free encoding of programs for that computing device as strings of bits.
(The prefix-free bit is important, and it's also probably not familiar to most people, so let's take a moment to define it. A prefix-free encoding is an encoding where, for any string which is valid in the encoding, no proper prefix of that string is also a valid string in the encoding. If you're familiar with data compression, Huffman codes are a common example of a prefix-free encoding.)
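Here's a tiny Python sketch of what that condition means, with a couple of made-up codeword sets:

    def is_prefix_free(codewords):
        """Return True if no codeword is a proper prefix of another codeword."""
        for a in codewords:
            for b in codewords:
                if a != b and b.startswith(a):
                    return False
        return True

    print(is_prefix_free({"0", "10", "110", "111"}))   # True: a valid prefix-free code
    print(is_prefix_free({"0", "01", "110"}))          # False: "0" is a prefix of "01"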
So let's assume we have a computing device, which we'll call φ. We'll write the result of running φ on a program encoded as the bit string p as φ(p). And let's assume that we've set up φ so that it only accepts programs in a prefix-free encoding, which we'll call ε; the set of programs coded in ε, we'll call E; and we'll write φ(p)↓ to mean "φ(p) halts". Then we can define Ω as:
Ω_{φ,ε} = Σ_{p ∈ E : φ(p)↓} 2^(-len(p))
So: for each program p in our prefix-free encoding, if φ(p) halts, we add 2^(-len(p)) to Ω.
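To make that sum concrete, here's a minimal Python sketch of the standard way to approximate Ω from below: dovetail over programs and step bounds, and add 2^(-len(p)) each time some program is observed to halt. The machine here is entirely made up - the programs_of_length and halts_within callbacks are illustrative stand-ins for a real φ and ε, not anybody's actual implementation - but the structure of the enumeration is the real point: every program that ever halts eventually contributes its term, so the estimates creep up toward Ω, yet you never know how close you are.

    # A sketch of approximating Ω from below by dovetailing. The two
    # callbacks stand in for a concrete machine φ and prefix-free encoding ε:
    #   programs_of_length(n)  -> iterable of the valid n-bit program codes
    #   halts_within(p, steps) -> True if program p halts within `steps` steps

    def omega_lower_bounds(programs_of_length, halts_within, max_rounds):
        """Yield a nondecreasing sequence of lower bounds on Ω."""
        halted = set()
        total = 0.0
        for bound in range(1, max_rounds + 1):
            # Round `bound`: try every program of length up to `bound`
            # for up to `bound` steps.
            for length in range(1, bound + 1):
                for p in programs_of_length(length):
                    if p not in halted and halts_within(p, bound):
                        halted.add(p)
                        total += 2.0 ** (-len(p))   # this program's contribution
            yield total

    # Toy demo with a completely made-up machine: the only valid programs are
    # the prefix-free codes "0", "10", "110", "1110", ..., and we pretend a
    # program halts (after len(p) steps) exactly when it has an even number
    # of 1s. For that toy machine, Ω = 1/2 + 1/8 + 1/32 + ... = 2/3.
    toy_programs = lambda n: ["1" * (n - 1) + "0"]
    toy_halts = lambda p, steps: steps >= len(p) and p.count("1") % 2 == 0
    for lower_bound in omega_lower_bounds(toy_programs, toy_halts, 10):
        print(lower_bound)   # climbs toward 2/3, one round at a time

As a worked example of the sum itself: if the only programs that halted were the codes 10 and 110, Ω would be 2^(-2) + 2^(-3) = 0.375. The real sum, over all halting programs of a universal machine, is exactly the kind of thing you can creep up on from below but never pin down.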
Ω is an incredibly odd number. As I said before, it's random, uncompressible, and has a fully normal distribution of digits.
But where it gets fascinatingly bizarre is when you start considering its computability properties.
Ω is definable. We can (and just did) provide a specific, precise definition of it. We've even described a procedure by which you can conceptually generate it. But despite that, it's deeply uncomputable. There are procedures by which we can compute a finite prefix of it. But there's a limit: there's a point beyond which we cannot compute any digits of it, and there is no way to compute where that point is. So, for example, there's a very interesting paper where the authors computed the value of Ω for a Minsky machine; they were able to compute 84 bits of it, but the last 20 are unreliable, because it's uncertain whether or not they're actually beyond the limit, and so they may be wrong. But we can't tell!
What does Ω mean? It's actually something quite meaningful. It's a number that encodes some of the very deepest information about what it's possible to compute. It gives a way to measure the probability of computability. In a very real sense, it represents the overall nature and structure of computability in terms of a discrete probability. It's an amazingly dense container of information - as an infinitely long, thoroughly uncompressible number, it contains an immeasurable quantity of information. And we can't get near most of it!