so, you wanna understand the universe, do you? Step right up, sit right down, and follow along, it's time for a brief lesson in science!
We start with a bit of logic:
- Let us assume that (P OR Q) is logically equivalent to (Q OR P)
Not all that controversial, right?
From there, we can start playing with logical expressions:
- P OR (Q OR R) = (P OR Q) OR R
- NOT P = P NOR P
- (P NOR Q) = NOT (P OR Q)
(note that with the first two principles, we can start making (as yet uninteresting) inferences, like P OR (R OR Q) = (P OR Q) OR R.)
Now we start upping the degree of difficulty: (P NOR Q) NOR (P NOR (NOT Q)) = P, which is a very roundabout expression of the Robbins Axioms which, if you're really into tedious logical proof, can be shown to be equivalent to standard Boolean logic. Rounding out our basic logical toolbox, we can define the concepts of AND, IMPLIES (allowing us to do "if-then" statements) and IFF ("if-and-only-if"):
- (P AND Q) = ((NOT P) NOR (NOT Q))
- (P IMPLIES Q) = ((NOT P) OR Q)
- (P IFF Q) = (P IMPLIES Q) AND (Q IMPLIES P)
To be conservative, one may note (P IFF Q) = (P = Q).
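Since all of this is finite, you can brute-force check it. Here's a small Python truth-table sketch (just verification over every assignment, not a proof in the logician's sense):

```python
from itertools import product

def NOR(p, q):                 # the single primitive connective
    return not (p or q)

def NOT(p):        return NOR(p, p)
def AND(p, q):     return NOR(NOT(p), NOT(q))
def IMPLIES(p, q): return NOT(p) or q
def IFF(p, q):     return AND(IMPLIES(p, q), IMPLIES(q, p))

for p, q in product([False, True], repeat=2):
    assert NOT(p) == (not p)                        # NOT P = P NOR P
    assert NOR(p, q) == (not (p or q))              # (P NOR Q) = NOT (P OR Q)
    assert NOR(NOR(p, q), NOR(p, NOT(q))) == p      # the Robbins-flavored identity
    assert AND(p, q) == (p and q)
    assert IMPLIES(p, q) == ((not p) or q)
    assert IFF(p, q) == (p == q)                    # (P IFF Q) = (P = Q)

print("every identity holds on every truth assignment")
```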
OK so far?
You might be wondering why we're starting with concepts like "Does not exist." Well, asserting anything risks self-contradiction, which (under the usual rules of logic) leads to...
Malaclypse the Younger: Everything is true.
Greater Poop: Even false things?
Malaclypse the Younger: Even false things are true.
Greater Poop: How can that be?
Malaclypse the Younger: I don't know man, I didn't do it.
- from The Principia Discordia

This generally doesn't allow for further non-Absurdist discussion. Therefore, logicians presume "no" unless "yes" is explicitly allowed.
Moving on!
Presuming you don't have a burning desire to go through the
proof of the Robbins conjecture, we have Boolean Logic, including useful things like (P OR (NOT P)) and NOT (P AND (NOT P)).
We also want to grab the
rules of universal/existential introduction/elimination (see link) for the ideas of "for all" and "exists", so as to head next to the basics of
Zermelo-Fraenkel set theory:
- Axiom of Extensionality: If X and Y have the same elements, then X=Y.
- Axiom of the Unordered Pair: For any a and b there exists a set {a,b} that contains exactly a and b. (also called Axiom of Pairing)
- Axiom of Subsets: If phi is a property (with parameter p), then for any X and p there exists a set Y={u in X:phi(u,p)} that contains all those u in X that have the property phi. (also called Axiom of Separation or Axiom of Comprehension)
- Axiom of the Sum Set: For any X there exists a set Y = union X, the union of all elements of X; formally, forall X exists Y forall u (u in Y iff exists z (z in X ^ u in z)). (also called Axiom of Union)
- Axiom of the Power Set: For any X there exists a set Y = P(X), the set of all subsets of X.
- Axiom of Infinity: There exists an infinite set.
- Axiom of Replacement: If F is a function, then for any X there exists a set Y=F[X]={F(x):x in X}.
- Axiom of Foundation: Every nonempty set X has an ∈-minimal element, i.e. some y in X that shares no members with X. (also called Axiom of Regularity)
I recommend looking at Wolfram's page, as it's prettier than I care to mark up in HTML here, and he also has the immediate next step (the existence of the empty set) on the page. Note that we're omitting the Axiom of Choice, because it has any number of strange aspects to it; among other things, it leads to a theorem that I regard as mathematics' version of the g-o-a-t-s-e pic, otherwise known as the
Banach-Tarski Sphere Dissection, which is enough to make most people's heads asplode. (Take a sphere, cut it into finitely many pieces, and reassemble those pieces into two spheres, each the same size as the original...)
The Axiom of Foundation is a bit daunting, but it has to do with making sure you don't start talking about sets which may-or-may-not contain themselves. EG: "Q: 'Does the catalog of all catalogs that do not contain themselves, contain itself or not?' A: 'STFU&GBTW.'"
So, as mentioned, Wolfram takes the first step of showing the existence of the empty set, {}. We can show via the Axiom of Infinity that the set containing the empty set, {{}}, is in the infinity axiom's infinite set, that the set of the set of the empty set, {{{}}}, is as well, and even, by the Axiom of Replacement, define a functional relationship ++(x) = {x} and show that ++(x) is necessarily in the infinite set S whenever x is.
I believe (but am a bit shakier on this) that it can next be shown that there is no x such that ++(x) = {} (possibly via Foundation?). Then, using the Axiom of Subsets, we can pull a set N out of S such that {} is in N, and ++(x) is in N for every x in N.
So, any problems with my having N?
So, by constructing our set N, we now have N - the set of Natural Numbers.
We may then define the relationship "+" on N, such that for all x and y in N:
- x + {} = x (additive identity)
- x + (++y) = ++(x + y)
and eventually prove Commutativity (a+b = b+a), and Associativity ((a+b)+c = a+(b+c)) for addition, along with fun things like
- {{{}}} + {{{}}} = {{{{{}}}}}
- which is to say, '2+2=4'.
Surprise - you've reached first grade math!
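If you want to watch 2+2=4 actually fall out of those two rules, here's a toy Python sketch (hedged: frozensets are just playing the role of the sets ZF hands us, with ++(x) = {x} as above; the real construction lives in set theory, not in Python):

```python
ZERO = frozenset()                       # {}
def succ(x):                             # ++(x) = {x}
    return frozenset({x})

ONE, TWO = succ(ZERO), succ(succ(ZERO))
THREE, FOUR = succ(TWO), succ(succ(TWO))

def add(x, y):
    # x + {} = x  ;  x + ++(y) = ++(x + y)
    if y == ZERO:
        return x
    (pred,) = y                          # y = {pred}, so pred is y's predecessor
    return succ(add(x, pred))

assert add(TWO, TWO) == FOUR             # {{{}}} + {{{}}} = {{{{{}}}}}, i.e. 2 + 2 = 4
assert add(THREE, ZERO) == THREE         # additive identity
assert add(ONE, TWO) == add(TWO, ONE)    # a spot check of commutativity
print("2 + 2 = 4, as advertised")
```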
We may next define an operator '*' on N, such that for all x and y in N,
- x * {{}} = x (multiplicative identity)
- x * (++y) = (x * y) + x
And get Commutativity (a*b = b*a) and Associativity ((a*b)*c = a*(b*c)) for multiplication, and left-and-right Distributivity of multiplication over addition ((a+b)*c=ac+bc and a*(b+c)=ab+ac).
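And a matching sketch for '*' (the earlier definitions are repeated so this runs on its own; note I've grounded the recursion at x * {} = {}, an assumption of mine that is consistent with the multiplicative-identity rule above):

```python
ZERO = frozenset()
def succ(x):                             # ++(x) = {x}
    return frozenset({x})
def pred(y):                             # y = {p}  ->  p
    (p,) = y
    return p
def num(n):                              # convenience: Python int -> nested-set numeral
    return ZERO if n == 0 else succ(num(n - 1))

def add(x, y):
    return x if y == ZERO else succ(add(x, pred(y)))

def mul(x, y):
    # x * {} = {} (assumed base case);  x * ++(y) = (x * y) + x
    return ZERO if y == ZERO else add(mul(x, pred(y)), x)

assert mul(num(3), num(1)) == num(3)                          # multiplicative identity
assert mul(num(2), num(3)) == mul(num(3), num(2)) == num(6)   # commutativity spot check
assert mul(num(2), add(num(3), num(4))) == add(mul(num(2), num(3)), mul(num(2), num(4)))
print("multiplication distributes, commutes, and generally behaves")
```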
So far so good?
From what we have achieved, we may use the idea of additive and multiplicative inverses (x + -x = 0 and x * x^-1 = 1) to extend from the Natural numbers to the Integers and Rationals, and define exponentiation to eventually construct the Algebraics. While the Real numbers would seem the obvious next step from math classes (if not overdue), we actually then want "tuples", that is to say, ordered pairs (or higher groups) of numbers or sets; that is, things like (1, {2, 3}, {{},4,7}, 23, 2). My understanding is they can be justified via the Axiom of Replacement, but I've not been dragged through this stage personally by a math professor in at least a few years, if ever. (Possibly in one of my conversations with Altaz...)
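For what it's worth, one standard encoding I'm aware of is the Kuratowski pair, (a, b) := {{a}, {a, b}}, which only needs machinery we already have. Here's a toy check (frozensets again) that it really behaves like an ordered pair:

```python
from itertools import product

def kpair(a, b):
    # Kuratowski encoding of the ordered pair (a, b)
    return frozenset({frozenset({a}), frozenset({a, b})})

items = [0, 1, 2, "x"]
for a, b, c, d in product(items, repeat=4):
    # equal as sets exactly when both coordinates match, in order
    assert (kpair(a, b) == kpair(c, d)) == (a == c and b == d)
print("Kuratowski pairs keep their coordinates in order")
```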
Why? Well, things are about to go to heck in a handbasket and a hurry....
College-level math now, although a good AP CompSci class might touch on it. We need tuples for
Turing Machines. With Turing Machines, we can do lots of tricks. We can make a machine that reads in the symbols comprising a mathematical proof from a (finite) list of axioms, and indicates whether the proof is valid. We can make a Turing machine that can be fed as input the description of another Turing machine and its input, and emulate its behavior. This can be done even when working with "more complicated" varieties - a one-tape machine can emulate a two-tape machine, a machine with a two-dimensional tape, or one with a multiple-dimensional tape, among many others.
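To make that a bit more concrete, here's a minimal Turing machine simulator sketched in Python (the machine, states, and encoding are all made up for illustration; real universality constructions are far fussier):

```python
from collections import defaultdict

def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Run a one-tape machine: transitions maps (state, symbol) -> (write, move, next_state)."""
    cells = defaultdict(lambda: blank, enumerate(tape))   # tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = transitions[(state, cells[head])]
        cells[head] = write
        head += {"L": -1, "R": +1, "S": 0}[move]
    result = "".join(cells[i] for i in sorted(cells)).strip(blank)
    return result, state

# Example machine: flip every bit, halt at the first blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "S", "halt"),
}
print(run_tm(flipper, "10110"))   # -> ('01001', 'halt')
```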
And now we approach the precipice that made Gödel famous....
We also start reaching territory where I'm not completely sure I understand it at all, let alone well enough to explain to other people. Let's see how well I do.
So, once we have universal Turing machines, we can try to find something they can't do. The definitive example is the "Halting Problem" - that is, make a Turing machine that can determine (in a fashion that WILL certainly halt with a "yes/no" answer) whether a tape input describing a universal Turing machine and its input will halt in a finite number of steps. This can be shown impossible by making a modified version that, upon detecting a "would halt" answer, instead jumps into an infinite loop; encode that machine, and feed it to itself...
![](http://www.watercure2.org/images/fat_cat.jpg)
(URP?!?)
...giving you a program that halts if and only if... it doesn't halt. Since that's impossible, the durn thing couldn't have been a Turing machine in the first place.
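Here's the same diagonal trick in Python clothing (hedged: ordinary functions standing in for Turing machines, and halts() is the impossible gadget we're assuming only to knock it down):

```python
def halts(program, data):
    """Pretend this magically answers: does program(data) ever halt?"""
    raise NotImplementedError("this is exactly the thing that cannot exist")

def spite(program):
    # Do the opposite of whatever the supposed halting-detector predicts
    # about 'program' run on its own description.
    if halts(program, program):
        while True:        # predicted to halt? loop forever instead
            pass
    return "halted"        # predicted to loop forever? halt immediately

# Now ask about spite(spite): if halts(spite, spite) says "yes", spite loops;
# if it says "no", spite halts. Either way the prediction is wrong, so no
# such halts() could have existed in the first place.
```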
The splitting headache that resulted from this was bad enough. However, Kurt Gödel also showed something even nastier: you can do something similar with mathematical systems. It is "relatively" (i.e., a good way to earn a Masters or maybe a PhD) easy to show that many systems are just as consistent as the system used to construct the natural numbers. However, Gödel showed that any sufficiently powerful finite axiom system able to establish a proof of its own consistency... is inconsistent. Which was deeply disappointing, as it trashed nearly a half-century's work as "hopeless" (if now provably so).
However, it set the present "working" standard, that systems are shown equivalent in consistency to the natural numbers. This can be done for the Real (and Complex) numbers... which was the prime point of this digression. It also means there are things (like the Axiom of Choice) which may be asserted OR refuted as an assumption, and still leave you with a system just as consistent as before.
Having the Reals (which are constructed by requiring that any nonempty set with an upper bound must have a least upper bound, and likewise a greatest lower bound for sets bounded below) allows for fun things like "pi" to get definitions, and other places where limits come in handy... like calculus. (There's a toy illustration of that least-upper-bound trick just after this paragraph.) Probability also becomes easier, with
axioms I elect to merely wave hello to before moving on temporarily, as most people have some familiarity with them. I'll also briefly note that it's possible to construct a set somewhat like "super-integers" that includes sorta-but-not-really-infinite integers called the "Cantor Ordinals", which will be only of passing and DAFT import. Things like "omega" and "epsilon-0" make even grad students scream at their professors "WHAT'S THE POINT?!?", so I decline to go into them in
any detail, beyond noting (a) they include all the ordinary integers and (b) they include things somehow bigger than that. (Besides, this gives me something to ramble on about in a future essay!)
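Here's the promised toy illustration of the least-upper-bound property (floats as a crude stand-in for honest Reals): the set {x : x*x < 2} has plenty of rational upper bounds but no rational least one; completeness says the Reals supply it, and bisection sneaks up on it.

```python
def least_upper_bound(belongs, lo, hi, steps=60):
    # shrink [lo, hi] around the boundary of {x : belongs(x)};
    # lo stays inside the set, hi stays an upper bound
    for _ in range(steps):
        mid = (lo + hi) / 2
        if belongs(mid):
            lo = mid
        else:
            hi = mid
    return hi

print(least_upper_bound(lambda x: x * x < 2, 0.0, 2.0))   # ~1.4142135..., i.e. sqrt(2)
```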
Heading back to Turing machines, one can posit a hypothetical gizmo called an Oracle, which can solve the halting problem. "Ah," you breathe in a sigh of relief... but only briefly. Because one can then build a first-level Hypercomputer by making a Turing machine that can (by going to a special state) ask an Oracle whether a Turing-machine-and-tape encoded on the tape halts or not, and have it output a simple "yes" or "no" answer. We can then rebuild our halting problem with a hypercomputer, and are back with an unsolvable problem. We may then get a hyper-oracle... and have started our way down an infinite regress.
Becoming bored and being clever, we may eventually posit a queen oracle that can solve any oracle problem at any level in the natural numbers (which may ominously be corresponded with a hypercomputer of Cantor Ordinal level "omega"). Unfortunately... it falls back into the halting problem and comes up with a queen halting problem. This leads to the need for an Empress oracle... and here, we let the white-jacketed men with the funny long-sleeved shirts come for a few of our number, as we wander off to infinite infinities.
There are more subtle layers of complexity than this
![](http://people.virginia.edu/~abb3w/Images/Fark/diagram.jpg)
[POP!] ...most of which have been skipped (see back around "Linear Bounded Automata"). However, this messy Halting part is referred to as the Arithmetic Hierarchy, beginning at a Turing Level of zero (no levels of oracle required), and working our way up through levels that (as far as I've been able to understand from my studies and conversations with Altaz) correspond to the Cantor Ordinals (starting, of course, with the Naturals).
Having conjured all this complexity... we proceed to ignore most of it, until we have need of a way to summon nice young men in their clean white coats for someone really, really annoying.
And now, we are ready to go to the BIG assumption, followed by the crown jewel that I can barely outline the result, and only say "Go read this now!" for the process.
Anyone not yet asleep?
Speaking of brains hurting, I've mentioned in passing the
Banach-Tarski Sphere Dissection. While requiring the semi-controversial axiom of choice, it at least serves as a painful example of how things in mathematics may have no sensible and obvious connection to "reality". This brings us to the one simple assumption where I expect controversy to be sensible: a proposed relationship between mathematics and reality.
Most of the twisted complexity hierarchy in the green diagram above can be handled by an "ordinary" Turing machine. In fact, most of mathematics can be handled by a Turing machine; so much so that the usual shorthand for a function that is "Turing Computable" is just "Computable". If you have a finite starting point, and can transform it a finite number of times via a finite number of rules (ooh, like... proofs?), it can be recognized via Turing machine. The results are about as complicated as can be imagined... and no one has figured out yet how to build the conjectured "oracle" or "hypercomputer".
This brings us to a principle known as the Strong Church-Turing Universe Thesis, which roughly says that the universe we perceive (directly and indirectly) via the senses is at most Turing Level Zero Complicated; it does not require (nor allow) a Hypercomputer.
For the terminally anal, I actually hold as Faith a weaker thesis: that the Complexity is a Cantor Ordinal - although I use "zero" as a working assumption, for reasons I'll get to if anyone asks. This assumption is not undisputed. As was recently pointed out on a previous Evolution thread, Carol Cleland at the University of Colorado (who has a BS in math and worked as a programmer before moving to Philosophy of Science) has written in a highly critical fashion against the Thesis and the entire associated historical background I've covered. (However, I disagree with her position.) At the least, however, SCTUT has not been falsified yet.
Anyone who wants to scream about this may do so. And then we will come to the time....
Now, it is time for me to go to the last vestiges of my competence, in the hopes of achieving science. I've run across the following pair of papers, which I hope to at least read through more fully over the upcoming break, and possibly even get an inkling of understanding of. They appear to be pure mathematics or computer science... until we pull everything together.
Both papers may be found online in Postscript format. Linux, Mac (X.3+), and Unix systems normally turn PS into PDF pretty much automagically; Windows users may acquire the programs GhostScript and GhostView
thisaways; you need to install both, but can then view and convert to PDF. The two papers are:
"Minimum Message Length and Kolmogorov Complexity", by C. S. Wallace and D. L. Dowe "Minimum description length induction, Bayesianism, and Kolmogorov Complexity", by P.M.B. Vitanyi and Ming Li THESE are the point of this whole exercise that began from P OR Q.
To be blunt, I'm not currently ready to drag anyone through the proofs therein. I'm barely able (and in no way formally qualified) to follow them. However, I can give y'all the general idea. All real-world data consists of some experimental measurement and an associated uncertainty value. One can encode this into a compact message, consisting of a program for a universal Turing machine and a data input set for it. One can achieve the most concise results essentially by compressing the data - that is, describing how it relates, and then indicating deviations from the expected results. And the papers show (via the probability I waved hello to and ignored until now, and with some edge cases I quite frankly don't understand the implications of) that the representation consisting of the shortest message-and-data is the one most probable to predict future results correctly.
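To give just the flavor (and only the flavor), here's a crude toy in Python: zlib standing in for a real universal code, a made-up dataset, and two candidate "theories" compared by total message length - the theory's description plus whatever deviations are left over:

```python
import json, zlib

# Made-up observations: a straight line plus a small alternating wobble.
data = [round(3 * x + 1 + ((-1) ** x) * 0.5, 1) for x in range(200)]

def message_length(obj):
    # length in bytes of a (crudely) compressed encoding of the message
    return len(zlib.compress(json.dumps(obj).encode()))

# Theory A: "no structure, here are the raw numbers."
theory_a = message_length({"theory": "none", "data": data})

# Theory B: "it's y = 3x + 1, plus these residuals."
residuals = [round(y - (3 * x + 1), 1) for x, y in enumerate(data)]
theory_b = message_length({"theory": "y = 3x + 1", "residuals": residuals})

print(theory_a, theory_b)   # theory B's total message should come out much shorter
```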
Or, in other words, provided the Strong Church-Turing Universe Thesis holds, they prove a mathematically formalized version of Occam's Razor. One should not needlessly multiply entities; the simplest explanation is most likely to be true.
Furthermore, we have Observed data, candidate Theories, and a rigorous metric for Judging between them. Can you say "scientific method", children?
There's the minor downer that finding the optimum candidate theory may be an uncomputable (Halting Problem equivalent) function... but we can at least tell something is more likely crap than something else.
Thus, I close this discussion:
Aren't you glad you asked?