Robin writes:
> Here's an example: insurance. The possibility of adverse selection
> happens because you might know things about your risk that insurance
> companies don't. Adverse selection can result in all insurees buying
> less insurance than they otherwise would, which hurts them all without
> helping insurance companies.
Thanks, that's a very helpful example. As you said, if people were given
the opportunity before they knew whether they would turn out to be good
or bad insurance risks, they would prefer to commit to a system where
everyone would have to reveal all the relevant facts about their
riskiness. Chances are that they would turn out to be in the majority
which has relatively low risk, so there is a high probability that they
would benefit from such a commitment policy. Of course, if it does turn
out that they have high risk, they may regret their decision, but a
priori the chances of that happening are low, so they are willing to
take that gamble.
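To make the mechanism concrete, here is a toy simulation of my own (the
numbers and the 1.2x valuation are assumptions, not anything from Robin's
post): when the insurer can't see individual risk and must charge one
pooled premium, the low-risk majority finds the premium overpriced and
drops out, leaving only the high-risk buyers insured.

```python
# Toy adverse-selection model (hypothetical numbers, for illustration only).
# Two risk types; each buyer values coverage at expected_loss * (1 + risk_premium),
# i.e. they are mildly risk-averse. The insurer cannot tell types apart, so it
# must charge one pooled price equal to the average expected loss of whoever buys.

def pooled_market(types, risk_premium=0.2):
    """types: list of expected losses. Returns (buyers, price) at equilibrium."""
    buyers = list(types)
    while buyers:
        price = sum(buyers) / len(buyers)      # actuarially fair pooled premium
        staying = [t for t in buyers if t * (1 + risk_premium) >= price]
        if staying == buyers:                  # no one else wants to drop out
            return buyers, price
        buyers = staying                       # low-risk types exit; re-price

# 90 low-risk people (expected loss 100) and 10 high-risk (expected loss 1000).
population = [100] * 90 + [1000] * 10

buyers, price = pooled_market(population)
print(len(buyers), price)   # -> 10 1000.0 : only the high-risk stay insured

# With full disclosure, each person pays their own expected loss (100 or 1000),
# everyone values coverage at 1.2x that, so all 100 buy. Secrecy shrank the
# market from 100 insured down to 10, without the insurer gaining anything.
```

The first pooled price here is 190, well above the 120 the low-risk
buyers are willing to pay, so they all exit at once; with a spread of
intermediate risk types the same loop produces a gradual "death spiral"
instead.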
It's a bit harder to apply this to real situations, like the gay
youth in an oppressive, small-town community in the Bible Belt. If we
force him to reveal his sexual preference, it does give some benefit to
community members. Previously they were saddened by the thought that
there might be a gay boy hiding among the supposedly upstanding youths
of their community. Now if we suppose there is some method to expose
gays, then the community can drive them out of town and sleep easier,
knowing that there are no longer any homosexuals around. This may be a
net economic gain if the percentage of anti-gay bigots is large enough,
but it is not an outcome that I find attractive.
Privacy forces society to be more tolerant, in effect, by allowing people
to hide those aspects of their lives which society would not approve of.
Economics may not be able to say whether it is good for a society to be
more tolerant, but this is a value which I personally favor. (I think
someone else already made this point in the thread, but I can't find it
now.)
> If you insist on a general intuition, try this. When there are secrets,
> the need for "incentive compatibility" limits the feasible institutions,
> and hence the feasible outcomes. If you want outcomes to be correlated
> with the secret info people possess (and you usually do), you have to
> make it in people's interest to tell you what they know. Without secrets,
> this constraint doesn't apply.
Yes, I got a book from the library called "Incentives: Motivation and
the Economics of Information" and it discusses these kinds of issues.
It has a lot of examples, but so far none that are quite like the
traditional privacy issues (except for the insurance case).
Hal
Received on Thu May 21 19:31:51 1998