Sunday, February 8, 2009

A Price for the Sun


Although the Sun does not emit a uniform amount of light throughout its life, and will continue to slowly brighten until its death about 5 billion years from now, we can still roughly estimate how much light it will produce between now and then and determine, in terms of energy costs, how much buying the Sun would likely set one back.

Taking the Sun's luminosity to be about 3.8 × 10^26 watts, roughly 6 × 10^43 joules of energy will be released by the Sun, in the form of light, over the next 5 billion years. Only about one billionth of that energy will actually reach the Earth, so if our future successors require more energy, they might consider building a Dyson Sphere around our star.

This eventual energy output of the Sun can be converted to kilowatt hours (kWh), which comes to about:
1.7 × 10^37 kWh
If I were to expend this much energy at my home in the next month, the electric company would bill me approximately 2.3 × 10^36 dollars at 13.5 cents per kWh. Although this is a very rough estimate, it is what people of today could reasonably expect to pay, if the Sun were for sale.
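The whole back-of-the-envelope calculation fits in a few lines. This sketch assumes a solar luminosity of about 3.8 × 10^26 W and a residential rate of 13.5 cents per kWh; both are round figures, not precise values:

```python
# Rough price of the Sun: luminosity x remaining lifetime x electricity rate.
# All constants here are round estimates for a back-of-the-envelope answer.

SOLAR_LUMINOSITY_W = 3.8e26          # watts of light the Sun emits
REMAINING_LIFE_S = 5e9 * 3.156e7     # 5 billion years, in seconds
JOULES_PER_KWH = 3.6e6
RATE_USD_PER_KWH = 0.135             # assumed residential rate

total_joules = SOLAR_LUMINOSITY_W * REMAINING_LIFE_S
total_kwh = total_joules / JOULES_PER_KWH
price_usd = total_kwh * RATE_USD_PER_KWH

print(f"{total_joules:.1e} J, {total_kwh:.1e} kWh, ${price_usd:.1e}")
```

Changing the assumed electricity rate scales the final price linearly, so the answer is only as good as the rate you plug in.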

I once heard that the combined sale price for everything on the surface of the Earth would be 1.0 X 10^15 dollars. That's quite a difference.

Slowly Going Green


I bought a 1 watt solar panel for 30 dollars, zip-tied it to the end of a long PVC pipe with a T-connector end, sharpened the other end to a point with a hacksaw, and finally drove it 1 foot into the ground in my backyard, where it taps the sun's free energy and supplies power to my small battery / inverter setup.

Considering that the assembly produces only 1 joule per second, and that there are only 12 hours of sunlight exposure per day (multiplied by the square root of two divided by 2, to approximate the average intensity), only about 30,547 joules of energy are extracted per day. Also assuming that the price of electricity is currently 13.5 cents per kilowatt hour (in my area), or 1 cent for 266,667 joules, the solar panel would have to gather sunlight for about 71 years before enough money was saved to justify the 30 dollar panel price. Since 71 years is far beyond the expected lifespan of my radiant-energy-to-electricity converter, it will never turn a profit.
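The payback estimate above can be reproduced directly; the sqrt(2)/2 intensity factor and the 13.5 cents per kWh rate are taken from the text:

```python
# Payback period for a 30-dollar, 1-watt panel, following the estimate above.

PANEL_WATTS = 1.0
SUN_HOURS_PER_DAY = 12
INTENSITY_FACTOR = 2 ** 0.5 / 2          # average intensity over the day
RATE_USD_PER_KWH = 0.135
PANEL_COST_USD = 30.0

joules_per_day = PANEL_WATTS * SUN_HOURS_PER_DAY * 3600 * INTENSITY_FACTOR
usd_per_day = joules_per_day / 3.6e6 * RATE_USD_PER_KWH
payback_years = PANEL_COST_USD / usd_per_day / 365.25

print(f"{joules_per_day:.0f} J/day, payback in {payback_years:.0f} years")
```

At roughly a tenth of a cent of electricity per day, the 30 dollar price simply never pays for itself within the panel's working life.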

The lesson?... Do not buy 1 watt panels for 30 dollars. While there are much better solutions out there, such as more space-efficient models, higher wattages, and longer functional lifetimes, they still seem to be too expensive for the majority of us energy users. Apparently, 60% efficiency has been developed but isn't readily available at the moment, and it also requires a more intense light source to reach its peak performance.

Since roughly 1.3 × 10^17 watts of sunlight fall on the Earth at any particular moment, sunlight will inevitably become our primary source of energy.

Sexual Reproduction


Presently, thousands of processor-intensive generations are required to produce even the most modest variation in survival instincts. A proposed solution, sexual reproduction, stolen directly from the biological world, might provide a remedy. Sex, the process of mixing and recombining multiple individuals' genetic construction codes, could substantially improve the efficiency of the Evolution Machine Experiment by helping to expand diversity between generations and spread advantageous traits, while simultaneously limiting the propagation of negative attributes.

In principle, gender differences are not required for the process of sex; however, due to differences in selection pressure between the genetic sending and receiving sides, sexual dimorphism is likely to develop. It is often incorrectly thought that a sexually reproducing species always consists of two modes, male and female. Without dimorphism, however, there would be only one type, and in more unusual instances of life on Earth there are occasionally organisms that effectively have three or more genders. By adding such mechanisms into the experiment of evolution, new behaviors should emerge, such as genetic information sharing and possibly sexual rejection if the artificial organism perceives its potential mate as incompetent.
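As a sketch of how recombination might look in the Evolution Machine, assuming the cells' genomes are simple instruction strings (the single-point crossover scheme and all names here are my own illustration, not the experiment's actual code):

```python
import random

def crossover(genome_a: str, genome_b: str, rng=random) -> str:
    """Single-point crossover: the child takes a prefix of one parent
    and the suffix of the other."""
    cut = rng.randrange(1, min(len(genome_a), len(genome_b)))
    return genome_a[:cut] + genome_b[cut:]

def mutate(genome: str, rate: float, alphabet: str, rng=random) -> str:
    """Replace each symbol with a random one at the given rate."""
    return "".join(rng.choice(alphabet) if rng.random() < rate else g
                   for g in genome)

# Example with brainfuck-style genomes, as used by the cells' processors.
rng = random.Random(1)
child = crossover("+>+>+>+>", "<-<-<-<-", rng)
child = mutate(child, 0.05, "+-<>[],.", rng)
print(child)
```

More elaborate schemes (two-point crossover, unequal-length genomes, gene-aligned cut points) follow the same pattern; the essential idea is just splicing two parents' code at a shared boundary.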

Thursday, January 29, 2009

Abiogenesis in a Bottle

Unfortunately, even with the fastest personal computers available, a large amount of resources is required to conduct experiments involving evolution. In the current implementation of the Balanced-Force Evolution Machine, 30 Avoider cells compete against 30 Attacker cells, with the former attempting to minimize being shot by the Attackers' cannons and the latter doing the opposite. The average duration of a generation is set to approximately 25 seconds, and at the time of this writing the run has reached 600 generations on my AMD quad core. Notable advancements in group behavior are beginning to appear, as illustrated to the left. To minimize the damage inflicted upon them, the Avoider cells appear to be hiding in the corner. The cells on the outside of this cluster protect those within, and periodically appear to swap places. As intuitive as this might sound, it is a remarkable adaptation considering that all of these cellular robots were originally conceived completely absent-minded. After a few more weeks of running this simulation, I expect far more impressive behaviors to emerge. At the moment, all we can do is wait.

Monday, January 26, 2009

Balanced-Force Evolution

This experiment involves the interaction between two mutually capable artificial organisms and was derived from the algorithms of my first open-ended evolution machine. Each "cell", fitted with a 1 MHz brainfuck-compatible processor with evolution capabilities, steering control, thrusters and decelerators, one-dimensional "laser" scanners for vision, and an optional gun, competes to either maximize its kill points or minimize its death points. The two classes, Avoiders and Attackers, have essentially opposite goals. While their mechanics and strength are exactly equal and forever locked "as-is", each will inevitably evolve different strategies and behaviors to maximize its survival instincts.

A simple user interface is provided so the experimenter can peer within their minds, see through their eyes, examine their genetic information and lineage, and monitor statistics of current and past generations' success with both graphs and history charts. Ultimately, these groups of beings of opposing forces may form coalitions to better their odds. It's not as simple as survival of the fittest, as was once suggested. There are, in fact, many factors involved, and too many variables to have ever been intuitively known.

The aim of this experiment is not only to reveal evolution's secrets, but to force the two competing sides into an ever advancing race for dominance. As one force advances, the other team will either be destroyed or succeed in finding a beneficial mutation. A control factor within the simulation is to disallow death by reverting to a previous generation; in other words, to allow the losing side to try again with a possibly more effective solution. The end result is uncertain, but it might well promote intelligent behavior and possibly the beginnings of a self-aware entity.
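Since each cell carries a brainfuck-compatible processor, the heart of such a cell can be sketched as a minimal interpreter for that instruction set. This is a generic interpreter of my own, not the experiment's actual processor; I/O is reduced to a list of sensor inputs and a list of motor outputs, and execution is capped at a fixed step budget:

```python
def run_bf(code: str, inputs: list[int], steps: int = 10_000) -> list[int]:
    """Minimal brainfuck interpreter: ',' reads a sensor value,
    '.' emits a motor command; runs for at most `steps` operations."""
    tape, dp, ip, out = [0] * 256, 0, 0, []
    inputs = list(inputs)
    # Pre-match brackets so jumps are O(1).
    jump, stack = {}, []
    for i, ch in enumerate(code):
        if ch == '[':
            stack.append(i)
        elif ch == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i
    while ip < len(code) and steps > 0:
        ch = code[ip]
        if ch == '>': dp = (dp + 1) % len(tape)
        elif ch == '<': dp = (dp - 1) % len(tape)
        elif ch == '+': tape[dp] = (tape[dp] + 1) % 256
        elif ch == '-': tape[dp] = (tape[dp] - 1) % 256
        elif ch == ',': tape[dp] = inputs.pop(0) if inputs else 0
        elif ch == '.': out.append(tape[dp])
        elif ch == '[' and tape[dp] == 0: ip = jump[ip]
        elif ch == ']' and tape[dp] != 0: ip = jump[ip]
        ip += 1
        steps -= 1
    return out

# ",[->+<]>." copies one input cell into the next and emits it.
print(run_bf(",[->+<]>.", [7]))  # → [7]
```

The step budget matters for evolved code: mutated programs routinely loop forever, and capping each cell's cycles per tick is what makes the "1 MHz" framing workable in simulation.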

Friday, January 16, 2009

Cellular Grid Rechargeable Batteries

One of the primary disadvantages of rechargeable batteries is the extended time required to recharge them and the careful process needed to ensure continued long life. Since extremely rapid recharges, by use of high voltage, have a tendency to generate irregularities in the battery cells' electrodes, current regulation is a critical function of reliably regenerating today's batteries.
A possible remedy involves breaking the battery down into a matrix of hundreds or thousands of micro cells. These miniaturized chemical cells could be recharged more reliably at high speed, and each cell could be controlled individually through multiplexing circuitry. In the same way that a memory chip is divided into bytes and each byte can be accessed non-sequentially, this high-tech battery could be designed to automatically scan through the cells that need a recharge and skip those that don't. Even more importantly, each cell could be recharged according to its own internal status, eliminating the possibility of damage to the battery as a whole. With a microprocessor-controlled system, the battery could self-organize, on demand, to produce the desired current and voltage output, without the losses associated with voltage regulators. Battery life, percentage charged, and charge cycles remaining could be measured with accuracy far exceeding anything seen before. The possibilities are endless.
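The controller logic for such a cell matrix might look roughly like the following sketch; the cell model, thresholds, and charge step are hypothetical, purely to illustrate the scan-and-skip idea:

```python
from dataclasses import dataclass

@dataclass
class MicroCell:
    charge: float        # 0.0 (empty) .. 1.0 (full)
    healthy: bool = True

def recharge_scan(matrix: list[list[MicroCell]],
                  threshold: float = 0.95,
                  step: float = 0.25) -> int:
    """One pass over the matrix: top up only the cells that need it,
    skipping full or damaged cells. Returns how many cells were charged."""
    charged = 0
    for row in matrix:
        for cell in row:
            if cell.healthy and cell.charge < threshold:
                cell.charge = min(1.0, cell.charge + step)
                charged += 1
    return charged

# 2x2 demo matrix: one full cell, one damaged cell, two needing charge.
matrix = [[MicroCell(1.0), MicroCell(0.4)],
          [MicroCell(0.7), MicroCell(0.2, healthy=False)]]
print(recharge_scan(matrix))  # charges only the two needy, healthy cells
```

In hardware the "matrix" would be multiplexer addressing rather than a nested list, but the control decision per cell, charge or skip based on that cell's own state, is the same.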

One Instruction Set Computers


The logical extreme of reduced instruction set computers, as previously briefly mentioned in this article, is the use of a single parametrized op-code. To put this into perspective, modern x86 architectures, like the computer you are likely using now, and other CISC (complex instruction set computer) designs use hundreds, if not thousands, of individual instructions. Supporting so many different commands forces the processor's design to be very complex, which leads to an increased chance of glitches and ultimately reduces the maximum clock speed.

On the other hand, a reduced instruction set computer (RISC) architecture, especially the one instruction set version (OISC), involves minimal circuitry inside the actual processor, simplifying the design and allowing components to be placed in a smaller area, which enhances speed by reducing propagation delays and latencies.



The model under investigation here is known as "subtract and branch if less than or equal to zero", symbolized as "subleq", and coded with no opcodes. Since there is only a single operation, the computer does not require explicit knowledge of which instruction is expected; the opcode is implied, and memory space is saved. Subleq takes three parameters, also known as operands, marked a, b, and c. The basic, and only, function of this instruction is to set the value at b equal to b minus a; in other words, b = b - a. For those not familiar with the assignment operators used in computer programming, this statement is evaluated by taking the current value of b - a and storing it in, "assigning it to", b, overwriting the old value; meanwhile, no change is made to a. If the new value at b is less than or equal to 0, program execution "jumps" to the location held by parameter c, or more simply, the processor's program counter is set to the value of c.

If a jump is not required, you can't simply write 0 for c, since that would cause the program to start over; 0 is (typically) the starting point of the program counter. It is generally accepted to leave the third parameter blank in these cases, which tells the processor to proceed to the next instruction regardless of the result of the subtraction. It's still a jump, but only to the following instruction. Alternatively, the user can simply write in the address of the next instruction, which is exactly the same.
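The whole machine fits in a few lines of Python. This sketch treats memory as a flat list of integers and, as one common convention (my choice, not part of the definition above), halts when the program counter goes negative:

```python
def run_subleq(mem: list[int], pc: int = 0) -> list[int]:
    """Execute subleq instructions (a, b, c triples) until the program
    counter leaves memory; a negative jump target acts as a halt."""
    while 0 <= pc < len(mem) - 2:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]                      # b = b - a
        pc = c if mem[b] <= 0 else pc + 3     # branch if result <= 0
    return mem

# ADD 4 + 5 using a three-instruction macro.
# Layout: three instructions at addresses 0..8, then a = 4 at cell 9,
# b = 5 at cell 10, and a scratch zero Z at cell 11.
mem = [9, 11, 3,      # subleq a, Z   (Z = -a; always branches to next)
       11, 10, 6,     # subleq Z, b   (b = b - Z = b + a)
       11, 11, -1,    # subleq Z, Z   (clear Z, then halt)
       4, 5, 0]
run_subleq(mem)
print(mem[10])  # → 9
```

Note that code and data share one address space, which is exactly what makes self-modifying subleq programs (and therefore universal computation) possible.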

There are many ways to implement this design, and slight modifications are possible to create new instruction sets. Since programs can become extraordinarily long in such an instruction set, it can also be beneficial to add specialized hardware to the processor so that multiplication or other mathematical functions can be performed in a single cycle. For proof of concept, however, the OISC described here is capable of universal computation. All with only one instruction!

If you've never run across such a claim before, I don't expect you to immediately understand how to program in this machine language. It's interesting that, at this point, you know everything about this one instruction set computer and yet, at the same time, probably wouldn't have any idea where to begin creating a program to drive it.

Well, here's a start. To add two numbers on this computer (where Z is a scratch memory location that initially holds zero), all you have to do is...
    ADD a, b  ==  subleq a, Z
                  subleq Z, b
                  subleq Z, Z
Good luck OISC programmers!