I got an email earlier today about the (possible) coming technological singularity. This is the point where an AI of above-human intelligence is developed, that AI builds a still more intelligent AI, and so on. This rapid, iterative improvement of machine intelligence would leave humans far behind as a species of lower intellect.
Should this AI decide that a human (or humans) poses a risk to its aims, it may capture or destroy them (us). Look at what happens when we get bees in the attic or a rat infestation in the house.
Sounds like fun.
...So, anyway, the point. Life. On computers, people often use genetic algorithms to simulate life, or to search for a solution to a problem: following the principles of evolution, you let the candidates that match your goal breed, and remove the ones that don't. The problem with using this to evolve the ultimate intelligence is that you are limited by the environment: the program in which all your artificial life lives. The only evolving part is the "DNA" of each organism, so you are limited by the rules of the program that interprets that DNA.
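For contrast, here is roughly what that classic, goal-driven approach looks like. A minimal sketch in Python; the target string, population size and mutation rate are arbitrary illustration values, not anything from a real system:

```python
import random

# Toy genetic algorithm: evolve a string toward a fixed target.
# All constants here are invented for illustration.
TARGET = "HELLO WORLD"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
POP_SIZE = 100
MUTATION_RATE = 0.05

def fitness(dna):
    # Count characters matching the target -- the "goal" is baked in.
    return sum(a == b for a, b in zip(dna, TARGET))

def breed(mum, dad):
    # Single-point crossover plus a little random mutation.
    cut = random.randrange(len(TARGET))
    child = mum[:cut] + dad[cut:]
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in child
    )

population = ["".join(random.choices(ALPHABET, k=len(TARGET)))
              for _ in range(POP_SIZE)]

for generation in range(1000):
    population.sort(key=fitness, reverse=True)
    if population[0] == TARGET:
        print(f"Reached target in generation {generation}")
        break
    # Keep the fittest half, breed them to refill the population.
    survivors = population[: POP_SIZE // 2]
    population = survivors + [
        breed(random.choice(survivors), random.choice(survivors))
        for _ in range(POP_SIZE - len(survivors))
    ]
```

Notice that the fitness function, the alphabet and the interpreter are all hard-coded: the organisms can never become anything more interesting than the rules we wrote for them allow.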
So remove the limited set of rules.
Make a program that searches for others of its kind, copies their contents, creates a hybrid of its own machine-code operations with those of its new friend, applies a little mutation to the hybrid, writes the result out as an executable file, and runs it.
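The core breeding step might look something like the sketch below, assuming each organism can read its own bytes and its partner's. The single-point crossover and the per-byte mutation rate are my own illustrative choices:

```python
import random

MUTATION_RATE = 0.001  # per-byte flip chance; an arbitrary value

def hybridise(parent_a: bytes, parent_b: bytes) -> bytes:
    # Single-point crossover on the raw bytes of two organisms,
    # followed by a little random mutation.
    cut = random.randrange(min(len(parent_a), len(parent_b)))
    child = bytearray(parent_a[:cut] + parent_b[cut:])
    for i in range(len(child)):
        if random.random() < MUTATION_RATE:
            child[i] = random.randrange(256)
    return bytes(child)
```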
This first generation would make sure the new life was created correctly: checksum matching, data and operation code in the right places. Maybe the code wouldn't do anything sensible, and would probably crash with an out-of-range memory error, but it would run.
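The "created correctly" check could be as simple as a checksum trailer on the file. Again a sketch: the SHA-256 trailer scheme is invented for illustration, and a real Windows EXE has a PE header and its own checksum field that would also need to survive the process:

```python
import hashlib

def write_offspring(body: bytes, path: str) -> None:
    # Append a SHA-256 digest so the parent can confirm the child
    # was written out intact (the trailer scheme is invented here).
    with open(path, "wb") as f:
        f.write(body + hashlib.sha256(body).digest())

def offspring_is_intact(path: str) -> bool:
    with open(path, "rb") as f:
        blob = f.read()
    body, digest = blob[:-32], blob[-32:]
    return hashlib.sha256(body).digest() == digest
```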
So build a few, and let them go... (after a backup, obviously)
There's a good chance you'd get at least a few fertile organisms to continue the system, as both of their "parents" were fully working.
If it accidentally evolves into an organism that uses up all the memory or disk space (or even shuts down the computer it is on), then so be it. Just like natural life. Animals have caused their own extinction by using up the food and resources around them... and we're heading the same way.
The vast majority will die immediately: the EXE checksum won't match, or they'll hit invalid memory accesses, illegal operations, or missing DLLs. But should a few live, and continue to breed, you'll get more organisms. Your limiting factors are memory and disk space.
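Releasing a candidate and seeing whether it survives could be as crude as the sketch below, where "alive" just means it exits cleanly within a timeout. Both the timeout and the survival test are invented for this sketch:

```python
import subprocess

def try_to_live(path: str, timeout: float = 5.0) -> bool:
    # Run a candidate organism and see whether it dies immediately.
    # A clean exit within the timeout counts as "alive" here --
    # a stand-in test, not a real viability measure.
    try:
        result = subprocess.run([path], timeout=timeout)
        return result.returncode == 0
    except (subprocess.TimeoutExpired, OSError):
        # Crashed, hung, or wasn't a valid executable at all.
        return False
```

You'd want to run all of this inside a VM or sandbox rather than on your own machine; one evolved file-deleting loop and the experiment takes your filesystem with it.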
If things don't go well, you can always seed the environment with more robust and more capable lifeforms later (with skills such as moving between directories, deleting competitors, FTP, etc.)
Given enough time, and enough space to breed, why wouldn't this become a superintelligence?