Paul, I respect your point of view but I do not share it.
(Insert apology: my DSL ISP changed its name, and I am fighting with eGroups
over accepting the new address; hopefully multiples won't be received.
Apologies if they are.)
Desires, intelligence, values - it rings true, up to a point. But it's sort
of like id, ego, superego: a way of naming and thus discussing something
about mind or personality that seems real to many of us, but not terribly
informative about how we function, or could, or should function.
I suggest that if we are affirming values (widely shared or otherwise) we
ought to affirm Respect for Human Autonomy: the idea that one individual may
better determine that individual's own needs, wants, and means than another
can; the idea that one may be mistaken, especially about other people.
I believe the evolutionary environment humans have adapted to is that of
other humans. Hence the skill at detecting cheaters, free riders, etc. - as
well as the adulation for those whose exploits demonstrate both a
specialness as an individual and a major contribution to the community as a
whole - success in two major areas of human endeavor. I think our great
teachers refer to this when they talk about harmony, about finding peace as
a part of the whole.
Another related value: voluntary decisions aren't only more whole-hearted
than compelled ones, they are more likely to be correct for the precise
individual circumstances. Summed over all humanity, these individual
decisions - sometimes competitive, sometimes cooperative - are the
well-spring of progress, both material and moral.
I see too much authoritarian direction in some views of the DKR - the right
thing made known and enforced upon the people, for example if we forbid
lipstick until all eat well.
I haven't seen any true exponential growth yet. The lily in the pond, taking
39 days to cover half the pond and then a single day to cover the rest, like
Moore's law, may look exponential in the short run - in this case, over 40
days. But the limits still exist.
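The lily-pond arithmetic can be made concrete with a small sketch (my illustration, not part of the original post): coverage that doubles daily sits at only half the pond on day 39, fills it on day 40, and then stops, because the limit - the pond itself - still exists.

```python
# Illustrative sketch (not from the original post): lily coverage that
# doubles daily, starting from 1/2**40 of the pond. The pond is only half
# covered on day 39, yet completely covered one day later -- and growth
# stops there, capped by the size of the pond.

def coverage(day, start_fraction=2.0 ** -40, pond_area=1.0):
    """Fraction of the pond covered after `day` doublings, capped at full."""
    return min(start_fraction * (2.0 ** day) * pond_area, pond_area)

def day_fully_covered(start_fraction=2.0 ** -40):
    """First day on which the pond is entirely covered."""
    day = 0
    while coverage(day, start_fraction) < 1.0:
        day += 1
    return day
```

Because the start fraction and the doubling factor are exact powers of two, `coverage(39)` is exactly 0.5 and `coverage(40)` is exactly 1.0 - the growth looks negligible for weeks and then ends in a single day.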
In my not so humble opinion, we continue to need
technological progress rather more than we need more humane distribution of
its fruits. As wealth grows, the willingness and the ability to lift those
of us lower on the food chain grow as well. More importantly, the prospect
of many coming to enjoy what formerly only a few could dream of remains
real. In practical terms, I think continued progress on three fronts to be
essential:
(1) We need more energy. Probably solar energy converted cheaply in space to
microwaves to then become electricity on the ground. Perhaps thermonuclear,
or other sources not yet understood.
(2) We need more material resources, both as input materials and as a means
of disposing (at least temporarily) of wastes. Asteroid bodies come to mind
as one practical target here.
(3) We also need better ways of organizing and utilizing the energy and
material we do have. Nano-technology, genetic technology, and computational
technology are examples.
To instead retreat to a closed-cycle model would require a complete change
in human nature; it is not necessary, and it is contrary to the desires of
all but the already well-off.
On the politics of meeting human needs, it has been shown that famine is a
distribution effect, not a supply-and-demand effect. Nonetheless, a top-down
redistribution would kill the golden goose of competitive, autonomous
decision-making.
There are other ways of attacking the distribution issue, for example by
extending to the poorest the benefits of capitalism.
In Bangladesh, the Grameen Bank has pioneered micro-lending managed at the
village level. Through the efforts of individuals empowered by its
micro-loans, extreme poverty has been reduced from one-third of villagers
to 10%. Perhaps the DKR could lead to early recognition of many more
successes like this one, the winner of many international prizes.
But I don't think the DKR itself will prosper, if the concept gets linked to
the idea of re-arranging human character.
> Date: Tue, 19 Dec 2000 09:46:39 -0500
> From: Paul Fernhout <firstname.lastname@example.org>
>Subject: Re: Is "bootstrapping" part of the problem?
>Rod Welch wrote:
>> As usual, I am impressed by the depth of your analysis. In this
>> your point is not clear.
>First let me summarize: there is more to living than "intelligence".
>Intelligence doesn't call one to act, "desire" does that.
>> that are not being done, or that should be done differently?
>1) Value Affirmation. There should be an affirmation of core human
>2) Understanding Exponential Growth.
>3) Accepting the Politics of Meeting Human Needs.
>If corporations now doing IT have the major goal of profit as opposed to
>"meeting unmet social needs" (to quote William C. Norris)
>then corporations whether they do IT or KM are irrelevant to human
>survival. They are effectively machine intelligences with their own ends
>(the ethic of profit maximization, or "bucks is beautiful") to which
>humans are only relevant in well defined "roles" to the extent they are
>currently required for service or markets. If they could be replaced at
>less cost by automation, they will be -- nay, by the corporation's rules
>in a competitive landscape, they must be (except union jobs?). The only
>hope to resist this is some form of government intervention or worker
>(individual or union) resistance. These decisions will all be made in
>bits and pieces, each one seemingly sensible at the time. Consider the
>starting replacement of telephone support people by voice recognition
>systems.
>The corporate social form has had little time to evolve (a few hundred
>years?) so there is no guarantee that contemporary corporate
>organization forms will be capable of doing more than exhausting
>convenient resources (passing on external costs when possible) and then
>moving on.
>Obviously, to the extent KM could transform an organization like GE into
>one that makes good on their corporate slogan "if we can dream it we can
>do it" and deliver on their implied promises in their 1986 Disney Epcot
>center pavilion (underwater cities, space habitats) then KM will be
>useful. It is always "Knowledge about what?" For an alternative to a
>world view producing organizations that refuse to clean up PCBs they
>dumped in the Hudson, consider The Venus project's world view:
>There is one obvious exception to saying KM won't change the direction
>of organizations, which is to the extent humans as individuals in
>corporations have access to KM tools and might see the bigger picture
>and act as individuals. The only other hope is that a general increase
>in organizational capacity in large corporations or governments will let
>some small amount leak through for unsanctioned human ends (but the cost
>in human suffering to that approach is high -- witness as one example
>the 840 million people now in hunger.) But be very clear, this secondary
>effect is not the reason organizations will adopt KM. They will adopt
>KM for competitive advantage in business as usual (barring a cultural
>shift for other reasons.)
>As I saw this weekend on "DebatesDebates" with a debate on "Is the Good
>one of the debaters made the point that even if capitalism is good at
>generating wealth, it is not good at distributing it. That is why I say
>capitalism without charity is evil. Taken to an extreme when machine
>intelligence is possible on a human level, capitalism as we now know it may
>leave (most) people behind, while at the same time owning or controlling
>all the resources, preventing most people from earning a living
>("shading them out"). Historically, this has happened many times before
>-- for example, the enclosure acts driving the English peasantry
>(initially) into poverty and starvation.
>> I propose a single, breakthrough, solution by enhancing alphabet
>> using a continual "intelligence" process that turns information
>> thus the goal to move up a notch on the cognitive scale from IT
>> to KM. You seem
>> to suggest today that bootstrapping, while intending to solve
>> in some respects, be said to compound the problem it seeks to solve.
>I am not leveling this criticism directly at "bootstrapping" as the
>Bootstrap Institute and Doug try to define it. What I am trying to say
>is that "bootstrapping" in terms of exponential growth of technology
>(which enables more technology etc.) is already happening. Bootstrapping
>is the given. So the issue is, how do we use related exponential growth
>processes to deal with this? To the extent Doug's techniques are used
>just to drive the technological innovation process faster, in no
>specific direction, they are potentially just making things worse. To
>the extent such techniques are used for specific human ends (example,
>dealing with world hunger, making medical care more accessible, ensuring
>children don't grow up in ignorance and poverty, reducing conflicts and
>arms races) they make things better. The thing is, in a world where
>competition (the arms race) has moved from physical weapons to infotech
>(both corporate and military), simply saying you will speed the arms
>race is not enough. In my thinking, it is the arms race itself that is
>the potential enemy of humankind, and the issue is transcending the arms
>race (whatever grounds it is fought on -- nuclear, biological, or
>informational).
>> I think
>> there is another way to explain bootstrapping that avoids this
>conflict, but you
>> seem to be arguing against it. Can you clarify?
>I don't have a conflict in thinking about an OHS/DKR or working towards
>one. I accept the possibility that this bootstrap process may end badly
>for most of humanity. It is a shame, and humanity should try to avoid
>this looming disaster, and may well, but I have accepted that one
>cannot save everyone.
>For over a decade I have wanted to build a library of human knowledge
>related to sustainable development. I as a small mammal am using the
>crumbs left over by the dinosaurs to try to do so (not with great
>success, but a little, like our garden simulator intended to help people
>learn to grow their own food). I spent a year hanging around Hans
>Moravec's Mobile Robot Lab at CMU, and I turned my back on
>self-replicating robotics work -- not because I thought it was sci-fi,
>but because I saw it was quite feasible, and wanted to do something else
>that was more likely to ensure human survival (self-replicating
>habitats, for space, water, and land). I also did not want to speed the
>process along. Now fifteen years later, this process is effectively
>unstoppable, so I have fewer qualms about doing a little that might
>hasten it if the payoff might be some type of refugia for humans.
>The way to put it is that "bootstrapping" has linked itself conceptually
>to an exponential growth process happening right now in our
>civilization. Almost all explosions entail some level of exponential
>growth. So, in effect, our civilization is exploding. The meaning of
>that as regards human survival is unclear, but it is clear people are
>only slowly coming to take this seriously.
>As one example among many such trends:
>Lou Gerstner (IBM's Chairman) was recently quoted as talking about a near
>term e-commerce future of 10X users, 100X bandwidth, 1000X devices, and
>1,000,000X data. Obviously, IBM wants to sell the infrastructure to
>support that. But I think the bigger picture is lost.
>Even when people see the "trees" of individual quantitative changes, the
>"forest" -- that these quantitative changes amount to a qualitative change
>in the business or human landscape -- is ignored. Or if people see it, it
>is the "elephant in the living room" no one talks about (well obviously
>a few like Kurzweil or Moravec or Joy). More of everything yes, but
>always business as usual.
Community email addresses:
Post message: unrev-II@onelist.com
List owner: unrev-IIemail@example.com
This archive was generated by hypermail 2b29 : Wed Dec 20 2000 - 19:14:50 PST