Alexander Chislenko, <sasha1@netcom.com>, writes:
> Suppose we have an intelligent entity with a space/time footprint of,
> say, one light-hour in diameter and complexity on the level of 10^100
> bits and similar computational power.
This is hard to evaluate. How do you define complexity? There are fewer
than 10^70 cubic nanometers in the volume you describe. How do you
define computational power, such that it is measured in bits?
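For reference, here is the back-of-envelope arithmetic behind that figure (a rough sketch only, taking c ~ 3.0e8 m/s and a sphere one light-hour across; the exact constants hardly matter at this precision):

    # Cubic nanometers in a sphere one light-hour in diameter.
    # Assumes c ~ 3.0e8 m/s; radius is half a light-hour.
    c = 3.0e8                       # m/s (rounded)
    radius_m = c * 3600.0 / 2.0     # ~5.4e11 m
    volume_m3 = (4.0 / 3.0) * 3.14159265 * radius_m ** 3
    volume_nm3 = volume_m3 * 1e27   # 1 m^3 = 1e27 nm^3
    print("%.1e cubic nanometers" % volume_nm3)   # ~6.6e62

So even at one bit per cubic nanometer, the volume falls dozens of orders of magnitude short of 10^100 bits.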
> Every minute the entity doubles
> in terms of effective complexity (in terms of features of architecture,
> not the number of bits)
Is this "effective complexity" different from the "complexity" you
mentioned earlier, which was measured in bits? How do you measure/define
effective complexity?
Is it really realistic to expect a constant doubling in effective
complexity every minute (going on for years or centuries, the time
scale necessary for even nearby interstellar exploration)? What kinds
of laws of physics are you assuming here?
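Just to give a sense of scale (purely illustrative arithmetic, nothing about the underlying physics):

    import math
    # Doubling every minute, sustained for a single year.
    minutes_per_year = 60 * 24 * 365            # 525,600 doublings
    exponent = minutes_per_year * math.log10(2) # growth factor as a power of 10
    print("growth factor ~ 10^%d after one year" % round(exponent))
    # ~10^158,000

A factor of roughly 10^158,000 in one year, which is far beyond any bound I know of on physically realizable complexity.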
Hard to say more without clarifying these points...
Hal
Received on Mon Dec 22 16:40:41 1997