Re: poly: Controlling an SI?

From: Peter C. McCluskey <pcm@rahul.net>
Date: Mon Jun 08 1998 - 09:10:56 PDT

 bostrom@ndirect.co.uk ("Nick Bostrom") writes:
>Why do you think that a legal system that included the extra clause
>(saying that human-level intelligences don't have rights) would
>be less economically efficient than one which didn't? Consider:
>(1) Once the humans are annihilated, nobody needs to think about the
>law any longer.

 That ignores the costs of changing moral systems (see below), and assumes
that no digital agents of human-level intelligence will be valuable
afterwards. I anticipate that software of virtually every level of
intelligence will serve useful purposes.

 Where will humans whose minds include some digital enhancements fit
into your predictions?

>(2) A line has to be drawn somewhere. Why is it inefficient to draw
>it between humans and superintelligences, but not between, say,
>slightly uplifted animals and humans?

 They sound equally inefficient. Any intelligence-based threshold seems
like it would create a need for something like high-priced lawyers
spending significant resources arguing over which entities meet that
threshold.
 I don't think such a line needs to be drawn. I think that any entity
which claims to have purchased, homesteaded, or been given a piece
of property should have that claim evaluated without regard to the nature
of the entity making the claim.

>(3) Maybe superintelligences can manage their capital more
>efficiently than somewhat irrational humans.

 I assume so, but don't see its relevance.

>> Yes, trying to claim ownership of things over which you have no control
>> can undermine your ability to defend your property rights where they matter
>> most.
>
>And if humans are effectively powerless and don't have
>direct "control" over anything?

 I don't understand. Are you asking what happens if humans no longer have
property rights? About the difference between direct and indirect control?

>> I suspect you underestimate the costs of achieving agreement about a
>> fundamental moral change.
>
>Maybe. What does this cost consist in, and why do you think that it
>will be big?

 Intelligences have to model the society that would exist under each
alternative proposed rule - not just the immediate effects of applying
the rule, but such things as whether it is stable. For instance, with
the IQ-based rule that you seem to hint at, it would seem wise to ask
whether there is a slippery slope: would the majority holding the
highest 75% of IQs at any one time keep redefining the threshold
upwards to steal from those with the lowest IQs? Modelling enough of
the social dynamics to determine which rules are safe is not something
I see any hope of doing reliably, so I foresee intelligences continuing
to devote a significant fraction of the resources at stake towards
securing the outcomes that favor them.
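 To make the ratchet concrete, here is a toy sketch in Python (the
25%-per-round exclusion rate is purely my illustrative assumption, not
part of any rule you proposed):

    # Toy model of the threshold ratchet: each round, the 75% of
    # remaining rights-holders with the highest IQs vote to move the
    # threshold up, excluding the bottom quarter.
    protected = 1.0  # fraction of the original population still protected
    for n in range(10):
        protected *= 0.75
        print("after round %2d: %5.1f%% still hold rights" % (n + 1, protected * 100))
    # After ten rounds only about 5.6% of the original population
    # remains above the threshold (0.75**10 ~= 0.056).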
 Just look at the effort people put into imposing the right rules concerning
abortion and doctor-assisted suicide. Or if you still want to use
intelligence as a criterion, look at how much verbiage and emotion
IQ tests have generated.

 History provides some examples of powerful groups stealing from weak
groups (usually making claims that the weak group was inferior in a way
that at least superficially resembles your claim). For example, the
Nazis taking property from Jews. It is probably possible to find
examples where the victors were clearly better off, but I think in most
cases the societies suffered from property rights less secure than is
optimal for maximizing wealth.

>Note that reducing the costs of coexistence is something *it*
>does,

 Both sides can influence the cost (e.g. by whether they claim resources
that haven't been homesteaded yet).

>at all. (Though as long as it is restricted to this planet and its
>near surroundings, then the cost of keeping humans in physical
>reality might amount to a large fraction of its capital.)

 You talk as if humans would own a fixed set of physical assets.
I think they would trade some of those assets for services the
superintelligences would provide, making peaceful trade compatible
with superintelligences owning most of earth's resources.

-- 
------------------------------------------------------------------------
Peter McCluskey          | Critmail (http://crit.org/critmail.html):
http://www.rahul.net/pcm | Accept nothing less to archive your mailing list