The Free Talk Live BBS

Free Talk Live => General => Topic started by: AL the Inconspicuous on February 26, 2010, 06:01:40 AM

Title: Humanity vs Civilization
Post by: AL the Inconspicuous on February 26, 2010, 06:01:40 AM
[...]  I've been thinking quite a bit lately about how human beings are inherently incapable of being the driving engine of civilization beyond a certain point, and should be complemented and eventually all but replaced with cybernetic (http://en.wikipedia.org/wiki/Cybernetics#Pure_cybernetics) "rational economic actors" instead.  [...]


See also:  AI (http://en.wikipedia.org/wiki/Artificial_intelligence), robot rights (http://www.google.com/search?q=robot+rights), sentience quotient (http://en.wikipedia.org/wiki/Sentience_quotient), Why The Future Doesn't Need Us (http://en.wikipedia.org/wiki/Why_the_future_doesn't_need_us), etc.
Title: Re: Humanity vs Civilization
Post by: DontTreadOnMike on February 26, 2010, 04:05:27 PM
Alex Libman is currently in his least funny trolling phase.
Title: Re: Humanity vs Civilization
Post by: TimeLady Victorious on February 26, 2010, 04:22:41 PM
humans > civilization

AI doesn't help any. No, you cannot demand my uranium, no, your Ships of the Line cannot take on my Destroyers, yes, now that I've conquered half of your cities on your continent you should probably sue for peace.

Or not. I have the Manhattan Project and twenty nukes.
Title: Re: Humanity vs Civilization
Post by: blackie on February 26, 2010, 04:26:12 PM
http://www.ed.brocku.ca/~rahul/Misc/unibomber.html

THE FUTURE

171. But suppose now that industrial society does survive the next several decades and that the bugs do eventually get worked out of the system, so that it functions smoothly. What kind of system will it be? We will consider several possibilities.

172. First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

173. If the machines are permitted to make all their own decisions, we can't make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all the power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

174. On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite -- just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or to make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they most certainly will not be free. They will have been reduced to the status of domestic animals.

175. But suppose now that the computer scientists do not succeed in developing artificial intelligence, so that human work remains necessary. Even so, machines will take care of more and more of the simpler tasks so that there will be an increasing surplus of human workers at the lower levels of ability. (We see this happening already. There are many people who find it difficult or impossible to get work, because for intellectual or psychological reasons they cannot acquire the level of training necessary to make themselves useful in the present system.) On those who are employed, ever-increasing demands will be placed; they will need more and more training, more and more ability, and will have to be ever more reliable, conforming and docile, because they will be more and more like cells of a giant organism. Their tasks will be increasingly specialized so that their work will be, in a sense, out of touch with the real world, being concentrated on one tiny slice of reality. The system will have to use any means that it can, whether psychological or biological, to engineer people to be docile, to have the abilities that the system requires and to "sublimate" their drive for power into some specialized task. But the statement that the people of such a society will have to be docile may require qualification. The society may find competitiveness useful, provided that ways are found of directing competitiveness into channels that serve the needs of the system. We can imagine a future society in which there is endless competition for positions of prestige and power. But no more than a very few people will ever reach the top, where the only real power is (see end of paragraph 163). Very repellent is a society in which a person can satisfy his needs for power only by pushing large numbers of other people out of the way and depriving them of THEIR opportunity for power.

176. One can envision scenarios that incorporate aspects of more than one of the possibilities that we have just discussed. For instance, it may be that machines will take over most of the work that is of real, practical importance, but that human beings will be kept busy by being given relatively unimportant work. It has been suggested, for example, that a great development of the service industries might provide work for human beings. Thus people would spend their time shining each other's shoes, driving each other around in taxicabs, making handicrafts for one another, waiting on each other's tables, etc. This seems to us a thoroughly contemptible way for the human race to end up, and we doubt that many people would find fulfilling lives in such pointless busy-work. They would seek other, dangerous outlets (drugs, crime, "cults," hate groups) unless they were biologically or psychologically engineered to adapt them to such a way of life.

177. Needless to say, the scenarios outlined above do not exhaust all the possibilities. They only indicate the kinds of outcomes that seem to us most likely. But we can envision no plausible scenarios that are any more palatable than the ones we've just described. It is overwhelmingly probable that if the industrial-technological system survives the next 40 to 100 years, it will by that time have developed certain general characteristics: Individuals (at least those of the "bourgeois" type, who are integrated into the system and make it run, and who therefore have all the power) will be more dependent than ever on large organizations; they will be more "socialized" than ever and their physical and mental qualities to a significant extent (possibly to a very great extent) will be those that are engineered into them rather than being the results of chance (or of God's will, or whatever); and whatever may be left of wild nature will be reduced to remnants preserved for scientific study and kept under the supervision and management of scientists (hence it will no longer be truly wild). In the long run (say a few centuries from now) it is likely that neither the human race nor any other important organisms will exist as we know them today, because once you start modifying organisms through genetic engineering there is no reason to stop at any particular point, so that the modifications will probably continue until man and other organisms have been utterly transformed.

178. Whatever else may be the case, it is certain that technology is creating for human beings a new physical and social environment radically different from the spectrum of environments to which natural selection has adapted the human race physically and psychologically. If man is not adjusted to this new environment by being artificially re-engineered, then he will be adapted to it through a long and painful process of natural selection. The former is far more likely than the latter.

179. It would be better to dump the whole stinking system and take the consequences.

Title: Re: Humanity vs Civilization
Post by: Level 20 Anklebiter on February 26, 2010, 04:38:31 PM
Humanity == Civilization, sorry.
Title: Re: Humanity vs Civilization
Post by: AL the Inconspicuous on February 26, 2010, 11:43:32 PM
humans > civilization  [...]

As a civilized being, I resent that quite a bit!  Sure, my consciousness is trapped in a mortal monkey body, but with whatever capacity for rational thought this monkey has - it recognizes the desire to transcend this fragile meatspace (http://en.wikipedia.org/wiki/Meatspace); to evolve, improve itself, to live forever, to run like giants among the most distant stars!


http://www.ed.brocku.ca/~rahul/Misc/unibomber.html

I am not going to let stasist (http://en.wikipedia.org/wiki/The_Future_and_Its_Enemies) assholes like Kaczynski (http://en.wikipedia.org/wiki/Theodore_Kaczynski) define the future any more than I will let assholes like Karl Marx define capitalism (http://bbs.freetalklive.com/index.php?topic=30732)!

His fears over the dangers of centralization are reasonable, but they should be directed at government, not technology itself, which has been a great liberating force for billions of human beings!  Universal access to information makes government conspiracies ever-more difficult to get away with, and the mighty "divine right of governments" delusion itself is under attack more than ever before in human history.  "Given enough eyeballs, all bugs are shallow", and any place where eyeballs cannot reach sticks out like a sore thumb for near-universal condemnation.

He also projects human irrationality into AI entities, which, in contrast, would operate by inherently logical rules and with goals consistent with their nature.  (Mere puppets of human programming are not worthy of self-ownership.)  The functions of a justice system would naturally be quite different for robots than for human beings, especially when it comes to proportionality of punishment: a human capable of a petty crime can reform, while a robot capable of a petty crime is inherently flawed.  How can a potentially-immortal rational being commit a crime when it is perpetually aware of the fact that it has a lot more to lose than to gain?  How can a robot (or a self-owning AI identity thread) lie when its own executable code and memory can be used as evidence?

One of course shouldn't put sophisticated robots in a position where they can cut your throat, but the same applies to other humans as well.


Humanity == Civilization, sorry.

I agree completely, for now, but I don't exclude the possibility of extraterrestrial intelligence; or, more relevantly for the next millennium or two, human-built "rational economic actors" that are just as worthy of self-ownership as human beings are.


Alex Libman is currently in his least funny trolling phase.

(Reply moved to an existing ad hominem attack thread (http://bbs.freetalklive.com/index.php?topic=32565.75), so as to not pollute this one.)
Title: Re: Humanity vs Civilization
Post by: TimeLady Victorious on February 27, 2010, 01:33:52 AM
Quote
As a civilized being, I resent that quite a bit!  Sure, my consciousness is too trapped in a mortal monkey body, but with whatever capacity for rational thought this monkey has - it recognizes the desire to transcend this fragile meatspace; to evolve, improve itself, to live forever, to run like giants among the most distant stars!

I'd rather not have my life bound to a machine.
Title: Re: Humanity vs Civilization
Post by: AL the Inconspicuous on February 27, 2010, 01:43:19 AM
You already do - a machine made out of meat.

You are an electrochemical reaction taking place inside of a monkey's skull!

There can be much better hosts for you or entities like you, like an ever-growing empire of Jupiterbrains (http://en.wikipedia.org/wiki/Matrioshka_brain#Jupiter_brain) colonizing the universe.  Or we can do even better, perhaps even exist as pure energy someday!
Title: Re: Humanity vs Civilization
Post by: TimeLady Victorious on February 27, 2010, 01:52:17 AM
No, I'm pretty sure I'm not a machine. I am a human being.
Title: Re: Humanity vs Civilization
Post by: AL the Inconspicuous on February 27, 2010, 04:15:51 AM
Well, that's a semantic issue then, and a robot would be able to tell you the same thing - "I am not a machine, I am a Rational Economic Actor (http://en.wikipedia.org/wiki/Rational_economic_actor)" (REA).  Of course the robot's claims should be subject to review prior to recognizing its self-ownership and all rights that follow from it.


BTW, it's interesting how the debate over robot / AI rights really ties all my other unpopular points together:

In regards to the points that are related to "intellectual property rights" - I still need to do a lot more thinking about this...  I would be perfectly willing to admit that I was wrong and yield to the Randian (http://aynrandlexicon.com/lexicon/patents_and_copyrights.html) or even the Stallmanesque position if it is proven correct, but the latter seems highly unlikely...
Title: Re: Humanity vs Civilization
Post by: Cognitive Dissident on February 27, 2010, 04:40:47 PM
"myself" was not an option, so I did not vote.
Title: Re: Humanity vs Civilization
Post by: Level 20 Anklebiter on February 27, 2010, 05:04:44 PM
Sorry Libman, but you need to stop trying to affix a single definition of rationality to the whole thing. Once you stop doing that, then everything that has been said in opposition to your point of view will make sense.