Re: [unrev-II] Is "bootstrapping" part of the problem?

From: Paul Fernhout (pdfernhout@kurtz-fernhout.com)
Date: Thu Dec 21 2000 - 07:16:51 PST


    John-

    Great comments. Let me see if I can summarize them (hopefully without
    caricaturing them too much; let me know if I inadvertently have):

    1) An important core value is "Respect for Human Autonomy".
    2) Humans have built-in evolved ways of operating in a "marketplace"
    situation with some individuals rising and some falling based on their
    exploits.
    3) Statistical changes emerging from individual choices are likely to be
    better than changes mandated from a central authority.
    4) Some views of the DKR are authoritarian (such as ones that talk about
    relative priorities for the world as a whole).
    5) All systems have limits, even those growing exponentially. There is
    not a pressing need to deal with any threats from exponential growth.
    6) Rising technological progress lifts all boats, and this is more
    important than charity.
    7) Innovation is especially needed in areas of energy supply, material
    acquisition and disposal, and advanced material processing.
    8) World hunger is due to distribution problems, but attempting to
    correct this directly would lead to other more serious problems.
    9) Social innovations like micro-credit can solve much of the problem of
    poverty in a capitalist setting.
    10) It would be useful to have a DKR containing development success
    stories.
    11) Linking the DKR to changing human nature will cause it to fail.
    12) (Implied) Making statements about supporting alternate lifestyles or
    economics rather than modern-day capitalism is equivalent to a demand
    for "re-arranging human character". Basically, the economic and
    technical system, as it is, is working as well as could be expected, and
    the world is getting better because of it, now and in the future.

    I don't have much time today to give all the comments by you or others
    the attention they deserve. However, let me address each point briefly,
    with my summary after the original statement and my response after the
    summary.

    John "sb" Werneken wrote:
    > Paul, I respect your point of view but I do not share it.
    >
    > Desires, intelligence, values, - rings true, up to a point. But it's sort
    > of like id, ego, superego. A way of naming and thus discussing something
    > about mind or personality that seems real to many of us, but not terribly
    > informative about how we function or could or should function.
    >
    > I suggest that if we are affirming values (widely shared or otherwise) we
    > ought to affirm Respect for Human Autonomy. The idea that one individual may
    > better determine that individual's needs, wants, and means than another. The
    > idea that one may be mistaken, especially about other people.

    1) An important core value is "Respect for Human Autonomy".

    Agreed.

    However, there are phases in a person's life when they need assistance to
    grow into a member of a particular society, or to survive in that
    society despite calamity. If they do not get that assistance, they do
    not become a productive member of that society. One value to society of
    helping the downtrodden is in part that it creates a better social
    dynamic for everyone (i.e. people don't live in fear of poverty, and
    thus are not driven to certain extreme behaviors such as financial
    obesity.)

    This is especially true for children, naturally. However, it may still
    be true for adults who have grown up in different cultures who did not
    gain the requirements (ideas, skills, capital, outlook) to participate
    in the society under discussion. (This is not to say individuals from
    disadvantaged backgrounds may not bring other valuable things with them
    from their heritage.)

    We need to distinguish the desire to create opportunity from forcing
    people to act on the opportunity. I would not interpret "Respect for
    Human Autonomy" to mean let the "weak" (however that is currently
    defined) die from neglect. And that's what we are talking about to an
    extent with 840 million malnourished people.

    The people trying to take away autonomy (IMHO) are more likely those
    Congress people passing stuff like:
      http://cryptome.org/hr46.htm
    http://slashdot.org/article.pl?sid=00/12/20/1620204&mode=thread&threshold=2
    where bill HR46 may allow the seizure of your computer if it is
    suspected of being "used in ... intellectual property theft" before you
    are convicted or even charged (someday perhaps with IP theft interpreted
    as web surfing past a web site that makes unauthorized use of a Disney
    character?)

    > I believe the evolutionary environment humans have adapted to, is that of
    > other humans. Hence the skill at detecting cheaters, free riders etc. - as
    > well as the adulation for those whose exploits demonstrate both a
    > specialness as an individual and a major contribution to the community as a
    > whole - success in two major areas of human endeavor. I think our great
    > teachers refer to this when they talk about harmony, about finding peace as
    > a part of the whole.

    2) Humans have built-in evolved ways of operating in a "marketplace"
    situation with some individuals rising and some falling based on their
    exploits.

    Agreed. However, I think the behavior of a world economy, because of the
    huge quantitative difference in the number of actors, is also
    qualitatively different from face-to-face human situations. Thus the
    emergent behavior of a world economy may be greatly different than in a
    tribal situation.

    > Another related value: voluntary decisions aren't only more whole-hearted
    > than compelled ones, they are more likely to be correct for the precise
    > individual circumstances. Summed over all humanity, these individual
    > decisions - sometimes competitive, sometimes cooperative, are the
    > well-spring of progress, both material and moral.

    3) Statistical changes emerging from individual choices are likely to be
    better than changes mandated from a central authority.

    I'd say yes and no.

    Yes, because this is a model of decentralized decision making, which in
    general I like.

    Also yes, because individuals may create their own new playing fields
    (for example, new industries as yet unregulated, or new living spaces in
    the ocean or in space where different laws may be enacted, as was the
    case with the Pilgrims fleeing to America to avoid religious persecution).

    However, also no. We do have government mandates -- for example, laws
    derived from the U.S. Constitution that are based on shared values. We
    also have individual choices within the mandated framework. The point is
    that the structure of the playing field (and changes in that structure)
    is something that large organizations shape: governments in terms of
    laws, and corporations in terms of the work environment and permissible
    work.

    > I see too much authoritarian direction in some views of the DKR - the right
    > thing made known and enforced upon the people, for example if we forbid
    > lipstick until all eat well.

    4) Some views of the DKR are authoritarian (such as ones that talk about
    relative priorities for the world as a whole).

    Possibly.

    However, in the lipstick example I didn't mean people should be forced
    to support starving African villages instead of buying cosmetics. I
    meant that even if it came down to deciding to go without one lipstick
    tube (for an otherwise very affluent person) in order that a village of
    10,000 people in Africa could avoid starvation for all of eternity,
    there is no guarantee that the village would be helped.

    I am not advocating forcing people to spend their money a certain way
    (our government already taxes people to do that, thank you). However, I
    am suggesting the possibility that some individuals could invest their
    own resources in creating a possible alternative framework (including a
    knowledge base) that may be of more benefit both to themselves and
    others in daily life.

    > I haven't seen any exponential growth yet. The lily in the pond taking 39
    > days to cover half the pond, and one day to cover the rest, like Moore's
    > law, may look exponential in the short run - in this case, over 40 days. But
    > the limits still exist.
     
    5) All systems have limits, even those growing exponentially. There is
    not a pressing need to deal with any threats from exponential growth.

    I agree that systems have limits, and typically exponential growth
    becomes an S-curve at some point.

    We obviously do disagree somewhat here though. I feel what is at issue here
    is whether there will be enough clear water in the machine-intelligence
    lily pond over the next few decades for survival. And I feel strongly
    that this should be a major focus of serious attention, rather than
    advocating business as usual up until day 39.
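
    To make the day-39 arithmetic concrete, here is a minimal sketch in
    Python (purely illustrative; the 40-day pond is your hypothetical, not
    data):

      # A lily pond whose coverage doubles daily and fills on day 40.
      # Work backwards from full coverage to see how late the limit
      # becomes visible.
      for day in range(35, 41):
          coverage = 0.5 ** (40 - day)          # fraction of pond covered
          print(day, round(coverage * 100, 1))  # day 35: 3.1 ... day 39: 50.0, day 40: 100.0
      # Five days before the end, about 97% of the pond still looks clear.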

    > In my not so humble opinion, I believe that we continue to need
    > technological progress rather more than we need more humane distribution of
    > its fruits. As wealth grows, the willingness and the ability to lift those
    > of us lower on the food chain grows as well. More importantly, the prospect
    > of many coming to enjoy what formerly only a few could dream of remains
    > real.

    6) Rising technological progress lifts all boats.

    Agreed to an extent.

    However, in terms of, say, the time parents spend with children in the
    U.S., the last twenty years of technological progress have been a step
    backwards. So too have been the unforeseen side effects of technology
    (like plastics producing estrogen mimics). And so too might be the
    implications of advanced warfare and machine intelligence.

    I personally would agree more if we said that technological progress
    lifts a lot of boats, whereas realistically some will get swamped. The
    boats that do get swamped need our charity.

    Also, I do think that if things like food or water got cheaper, charity
    aimed at helping people survive or bootstrap themselves to self-reliance
    would be cheaper too, and so charitable people could accomplish more
    (perhaps 840 million people's worth on a billion dollars, which would
    certainly be feasible to raise as a charity today if it would for sure
    mean the immediate end to basic suffering).
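
    Just to make the arithmetic behind that parenthetical explicit (a rough
    back-of-envelope only, not a claim about actual program costs):

      # Illustrative division only: spreading one billion dollars across
      # 840 million malnourished people.
      budget = 1_000_000_000.0   # dollars
      people = 840_000_000.0     # malnourished people, the figure cited above
      print(budget / people)     # about 1.19 dollars per person, which is
                                 # why the point hinges on food and water
                                 # first becoming much cheaper.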

    > In practical terms, I think continued progress on three fronts to be
    > mandatory.
    >
    > (1) We need more energy. Probably solar energy converted cheaply in space to
    > microwaves to then become electricity on the ground. Perhaps thermonuclear
    > or others not yet understood.
    >
    > (2) We need more material resources, both as input materials and as a means
    > of disposing (at least temporarily) of wastes. Asteroid bodies come to mind
    > as one practical target here.
    >
    > (3) We also need better ways of organizing and utilizing the energy and
    > material we do have. Nano-technology, genetic technology, and computational
    > technology are examples.

    7) Innovation is especially needed in areas of energy supply, material
    acquisition and disposal, and advanced material processing.

    Agreed. However, I think that some of these innovations are somewhat
    redundant. That is, with enough energy, everything is recyclable. With
    better manufacturing processes, you need less energy and materials.

    > To instead retreat to a closed cycle model would require a complete change
    > in human nature, is not necessary, and is contrary to the desires of all but
    > the already well-off.
    >
    > On the politics of meeting human needs, it has been shown that famine is a
    > distribution effect, not a supply and demand effect. Nonetheless a top-down
    > redistribution would kill the golden goose of competitive autonomous
    > innovation.

    8) World hunger is due to distribution problems, but attempting to
    correct this directly would lead to other more serious problems.

    Reservations about tinkering with a system that has produced so much
    certainly sound reasonable. But I think in the case of distribution we
    are talking more politics than anything else. The greatest number of
    starving people (outside of those in, say, America) are starving due to
    the consequences of civil war or selfish dictators (or a related culture
    of dictatorship).

    Some of this gets down to feelings about innovation, which is often cited
    as the reason for competition. I respectfully disagree on this. I think
    people and communities are fundamentally innovative. Historically, we
    have seen the development of fire, "alphabet technology" to quote Rod,
    and the plow, even before extensive modern-day competitive capitalism.
    I'm not saying competition can't spur innovation, just that it may not
    be strictly required.

    > There are other ways of attacking the distribution issue, for example by
    > extending to the poorest the benefits of capitalism.
    >
    > In Bangladesh, the Grameen Bank has pioneered micro-lending managed at the
    > village level. Through the efforts of individuals empowered by its
    > micro-loans, extreme poverty has reduced from 1/3 of villagers to 10%.

    9) Social innovations like micro-credit can solve much of the problem of
    poverty in a capitalist setting.

    Agreed.

    However, it is not clear whether this will scale if these people cannot
    keep up as the rest of the economy continues to move ahead. That is, the
    market for handicrafts may only be so large in the face of other goods.
     
    > Perhaps the DKR could lead to early recognition of many more successes like
    > this winner of many international prizes.

    10) It would be useful to have a DKR containing development success
    stories.

    Agreed.
     
    > But I don't think the DKR itself will prosper, if the concept gets linked to
    > the idea of re-arranging human character.

    11) Linking the DKR to changing human nature will cause it to fail.

    Agreed. However, human nature is variable, and I think creating DKRs
    that allow other aspects of human nature (e.g. charity) to flourish is
    worthwhile.

    12) (Implied) Making statements about supporting alternate lifestyles or
    economics rather than modern-day capitalism is equivalent to a demand
    for "re-arranging human character". Basically, the economic and
    technical system, as it is, is working as well as could be expected, and
    the world is getting better because of it, now and in the future.

    I hope I'm not distorting your points too much, especially with this
    last one. I would disagree with the implication that choosing anything
    other than our economics as it is requires changing human nature. And I
    should make clear I am not advocating changing human nature. I am
    suggesting perhaps more diversity in its expression in terms of
    economics. I am also suggesting we consider creating alternative
    technical infrastructures (and related libraries of tools and knowledge)
    that may work in more harmony with certain aspects of human nature (e.g.
    the "Buddhist Economics" mentioned in "Small Is Beautiful" -- in this
    context, for example, work as spiritual growth).

    I think the current economic system is in for a radical disruption
    whether desired or not -- both in the elimination of the need for
    "supply chains" and in the final implications of machine intelligences
    designed to win competitive arms races (economically for corporations,
    or militarily for the DOD). So, further, I'm suggesting that the current
    unguided technological explosion may not have room for any humans at all
    unless we act positively to innovate in such a way as to make room for
    many or all people. Conversely, if we do act positively, there may be
    room for trillions more people (and AIs and augmented people, etc.)
    in the solar system.

    To reiterate my main points, which perhaps are getting lost in the
    debate over future economics:
    1) Technological (as opposed to social or political) progress, while
    desirable for many reasons, is not required to solve basic human
    problems.
    2) The exponential growth of technology is both a threat and a blessing;
    at this point it is a given, and like fire we need to do what we can with
    it for good ends (however we define those, and we may not all agree).
    3) To the extent that the exponential growth of technology may help meet
    the human needs of the disenfranchised through reduced costs, it may seem
    desirable, but it is not required to do so. This means people driving
    technological innovation, including the Bootstrap Institute, should be
    clearer about what it is they are trying to accomplish. Is it simply to
    escalate the infotech arms race, is it to make charity more effective,
    is it in some belief in "progress", or is it for other reasons?

    -Paul Fernhout
    Kurtz-Fernhout Software
    =========================================================
    Developers of custom software and educational simulations
    Creators of the Garden with Insight(TM) garden simulator
    http://www.kurtz-fernhout.com




