From: Nick Bostrom <bostrom@ndirect.co.uk>

Date: Thu Apr 23 1998 - 18:53:02 PDT

When we discuss future possibilities, we say that some things are
probable and others are not. Sometimes we assign numerical
probabilities to hypotheses such as (H) "Nanotech will come before
superintelligence." How are we to understand such probability
statements?

I know some people who think that many hypotheses we discuss are at
the moment so widely out of reach for the scientific method that it
doesn't make sense to assign probabilities to them. I don't think this
is the right attitude. But I do think that there are some things that
need to be clarified before such probability assignments will be very
informative. In particular, I think we have to distinguish between
several different senses of probability.

When I say "The probability of H is 40%," I could mean to:

1. Say something about my personal subjective probability. This
would be equivalent to saying: "Given the evidence that I have, the
rational betting odds for H would be 40%." Note that on this
interpretation, you could say "I think the probability of H is 1%,"
and we would not have disagreed with each other, and we could both
be perfectly rational (at least if we didn't know about each other's
statements).
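The link between a subjective probability and rational betting odds can be made concrete. A minimal sketch (the function names fair_odds and expected_value are mine, for illustration only):

```python
# A probability p corresponds to fair odds against of (1 - p)/p:
# at exactly those odds, a bet on H has zero expected value, so an
# agent whose subjective probability is p is indifferent to taking it.

def fair_odds(p):
    """Fair odds against H for an agent whose probability of H is p."""
    return (1 - p) / p

def expected_value(p, stake, payout):
    """Expected gain from staking `stake` to win `payout` if H occurs."""
    return p * payout - (1 - p) * stake

p = 0.40
odds = fair_odds(p)                            # 1.5, i.e. 3:2 against H
ev = expected_value(p, stake=1.0, payout=odds) # 0.0 at the fair odds
```

At any payout better than the fair odds the bet has positive expected value, which is why two people with different evidence (sense 1) can rationally accept different odds without contradicting each other.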

2. Say something about the common subjective probability relative to
the present knowledge base of humankind: i.e. that the rational
betting odds, given the sum total of present human knowledge, are
40%. (This is ambiguous: what human knowledge should be included? Only
general scientific knowledge, or also facts that are known to only
one person, such as the hidden intentions of some government
official?) (This is the sense of probability that would hopefully be
approximated by the actual betting odds in a real-money incarnation
of Robin's idea futures. Given such an institution (if it worked as
intended), the probability in sense 1 would tend to approach the
probability in sense 2.)

3. Say something about the objective probability of the event. I
could, for example, mean that if we were to create one million
planets almost exactly similar to present-day earth, then on 40%
of them there would be nanotech before superintelligence. An
ambiguity here is "almost exactly similar". It could mean:
(a) "exactly the same", (b) "the same as far as the state of
present-day earth is determined by my present evidence", or (c) "the
same as far as the present state of the earth is determined by the
totality of evidence available to humans today".

(A further ambiguity, for each of these interpretations, is how much
computing power to assume that the rational estimator has.)
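The million-planets reading in sense 3 is essentially a long-run frequency, which can be sketched with a toy simulation (the objective chance of 0.40 per planet is assumed purely for illustration):

```python
import random

# Toy model of sense 3: each of one million "planets" independently
# develops nanotech before superintelligence with objective chance
# 0.40; the observed frequency across the ensemble then approximates
# that objective chance.
random.seed(1998)
N = 1_000_000
objective_chance = 0.40
hits = sum(random.random() < objective_chance for _ in range(N))
frequency = hits / N   # close to 0.40
```

The ambiguities (a)-(c) above correspond to different choices of which ensemble of planets to simulate; the frequency is only well defined once that choice is fixed.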

Depending on which sense of probability we have in mind, we might

give very different numerical estimates. Consider the question: "What

is the probability that nanotechnology will cause the extinction of

earth-descendent intelligent life?"

In sense 1, I might answer: "25%".

In sense 2, I might say: "I don't know, but I would guess somewhere

between 2% and 70%."

In sense 3, I might say: "I don't know, but I think it more probable
that it is somewhere between 99.9% and 100% than that it is between
50% and 60%."

_____________________________________________________

Nick Bostrom

Department of Philosophy, Logic and Scientific Method

London School of Economics

n.bostrom@lse.ac.uk

http://www.hedweb.com/nickb

Received on Fri Apr 24 01:01:00 1998
