From: Charles Hottel on 22 Jan 2010 21:05

"Pete Dashwood" <dashwood(a)removethis.enternet.co.nz> wrote in message news:7rtoeoFd3qU1(a)mid.individual.net...

> <snip>
> I think we part company here, Charlie. I don't believe people are evil (I do believe there are aberrated people who behave very badly, but I don't believe they were born that way...). I don't believe in "original sin", and I can't believe a newborn baby is anything but a clean slate that life experience and the intellect of the child will write on.
> <snip>

I worded this poorly. I believe people are capable of both 'good' actions and 'bad' actions, depending upon their circumstances, the pressures they are under, their ability to control their emotional and impulsive responses, and a host of other factors.

Suppose that nanotechnology existed that could be fed a good but imperfect terrorist profile as input, and could seek out, identify, and count all the people who fit the profile. Further suppose that some innocent people would be counted as terrorists and some potential terrorists would escape detection, but that a limit to the margin of error could be determined. Suppose it could be determined what the number of auxiliary casualties would be, such as children of terrorists who might not survive if their parents were eliminated. Lastly, suppose that the capability to kill all those identified as a potential threat exists.

You say the 'good' guys would not deploy this technology. If it were the day after 9/11, would you still say that? What if there was intelligence that a dirty bomb attack on New York City was in the process of being set up and that one million people would die? What if it were two million, or three million? What if it was a coordinated attack on 3 cities, or 5 cities, or 10 cities, or more?

At some point the 'good' guys would weigh their potential casualties against the number of potential 'enemy' casualties. What if killing all the "enemy" would result in the death of fewer people than if nothing were done? What would be the most moral choice?

I think the answer to what would be done is clear just from the history of previous wars. WWII did not involve nanotechnology, but decisions were made that resulted in the deaths of 50 to 70 million people, and they are still just as dead even though it was more conventional technology that caused their deaths. The only difference is that nanotechnology is potentially more powerful and possibly more discriminating in its selection of whom to kill. Today we use smart bombs and say they cause fewer collateral casualties.

As our power to create ever more powerful technologies increases, our wisdom to decide whether or not to use them is not increasing at the same pace. I prefer beings that have God-like power to also have God-like wisdom.

I have played God, not out of choice, and I hated it. I had to decide whether and when to have my dog put to sleep. I made the best decision that I could. Was it right and timely? I do not know. Did I wait too long, and did my dog suffer more than she should have? Did I act too soon and take away good days of remaining life? I don't know, and I don't know how I could have known the best time to act. I just have to live with my choice.

As you know, I went through the experience of losing a baby. I will skip the details of all the decisions that had to be made, the variable quality of the information we had to base those decisions on, how that information changed over time, and the conflicting opinions of the various doctors. These decisions not only affected our baby but also my wife and her potential to have a baby in the future. In retrospect we made some bad decisions, and we were lucky that circumstances played out so that it would at least not be dangerous for her to have a baby in the future. In fact, the actual consequences we experienced were the same as if all our decisions had been 100% correct. We were very, very lucky in that part at least.

Will politicians and military strategists agonize over the potential deaths of millions as much as I did over my wife, baby, and dog? I doubt it. Perhaps I am too tender-hearted or possibly even emotionally immature, but perhaps the people deciding on the proper uses of nanotechnology are too cynical or too concerned with maintaining their positions. Only time will tell, and the future of the human race may rest upon the decisions that are made. I only wish I thought and felt that we are ready to make these momentous decisions. The time to get ready is growing very short, and the majority of people have no clue that this is approaching.
From: Pete Dashwood on 22 Jan 2010 21:37

Alistair wrote:
> On Jan 21, 10:29 pm, "Pete Dashwood" <dashw...(a)removethis.enternet.co.nz> wrote:
>> HeyBub wrote:
>>> Pete Dashwood wrote:
>>>>> It's difficult. It's difficult because man is a "pack animal" and is compelled innately to be a member of a group.
>>>>> The need, nay, the necessity, to belong to a herd is what drives devotees of Manchester United to do silly things; it explains nationalism, religious extremism, and maybe even stamp collecting.
>>>>> People who hate Negroes will root for them nevertheless if they are players on the home team. Descendants of both Irish Catholics and Irish Protestants will defend each other when sharing a foxhole.
>>>> That is a beautifully expressed paragraph.
>>>>> The urge to bond and defend your group against all others is a survival mechanism and deeply ingrained in the lizard brain.
>>>> Not sure about that. Lizards are not gregarious... :-)
>>>> I think it might have more to do with raising children, where there is more safety in numbers. The more like-minded adults around, the better chance the young have of surviving.
>>>>> It cannot be denied. The best that can be done is to bend it to a larger group, and an example of that is the military, where all races serve in (mostly) harmony. Of course the faggots better not try to join...
>>>> :-)
>>>> Jerry, I generally enjoy your posts and appreciate your wry humour and common sense, even if I don't always agree with your position.
>>>> This is a very nice piece of writing. Thanks for posting it.
>>> As for "lizard brain," I meant the Limbic System. As we know, the Limbic System is responsible for the four "F's": Fight, Flight, Feeding, and Reproduction.
>>> And you're correct about the "safety/strength in numbers." If there is a switch in the brain somewhere that got flipped once upon a time to encourage group bonding, that bonding would prove evolutionarily beneficial; the "switch" would stand a better chance of being passed on than the "loner" switch.
>>> Consider dogs. Dogs are pack animals, and the survival of the "pack" works best when all the members simultaneously attack the prey or the enemy. (That's why yappy lap dogs get gobbled up by 'gators in Florida so easily - damned pooches run right up to the reptile's mouth and try to bark the one-ton lizard into submission. Zip! Right down the hatch. The dogs can't help it - they were wired that way.)
>>> In my view, an awful lot of human inclinations can be laid at the doorstep of biology.
>> I agree.
>> I'm currently reading Richard ("The God Delusion") Dawkins' latest book, "The Greatest Show on Earth". He is primarily a Natural Historian who got pretty pissed off with Creationists and Fundamentalists undermining his life's work. Sometimes his frustration boils over, but mostly his books are extremely logical, well informed and readable. He shows in easily understandable language how Evolution has made us (and a number of other species) what we are over a VERY long period of time.
>> Whatever religious views a person has, it is very interesting reading.
> Certainly interesting; I finished it some time last year and told a Christian fundamentalist friend that I found the book to be a little preachy. I was amused to see that, when the Humanist Society in London put posters on the side of buses (with Dawkins' support), the posters read "There is probably no God...". Looking for a cop-out? There is none bigger. The posters should have read "There is no God".
> After you have finished Dawkins, try "Why I am not a Muslim" by Ibn Warraq (a one-time Muslim). It details the foundation and rise of Islam and refutes the notion of Islam being a font of science and the arts. It also underlines the lies and hypocrisy surrounding Islam, but I shouldn't preach to you; just read the book.

Thanks Alistair, I'll put it on the list...

Pete.
--
"I used to write COBOL...now I can do anything."
From: Pete Dashwood on 22 Jan 2010 21:42

Alistair wrote:
> On Jan 22, 1:44 pm, "Pete Dashwood" <dashw...(a)removethis.enternet.co.nz> wrote:
>> Fortunately, as time goes by, people on both sides of this conflict are getting wiser. Eventually (and it will be a very long time) they will realise that continued warfare is in nobody's interest and they'll start trading and living together. As people become better educated, they are less likely to be strictly religious, and the religious grounds for war recede.
> Witness the CIA bomber (a doctor) in Afghanistan.

A fair point. Maybe exposure to the Military affects the reasoning. :-) (I too was a soldier... :-))

Maybe some people have religion so deeply ingrained in them that no amount of education will get them thinking for themselves. I accept your example, but I hope my point is still generally true.

Pete.
--
"I used to write COBOL...now I can do anything."
From: Pete Dashwood on 22 Jan 2010 21:46

Howard Brazee wrote:
> On Fri, 22 Jan 2010 08:05:17 -0800 (PST), Alistair <alistair(a)ld50macca.demon.co.uk> wrote:
>> On Jan 22, 1:44 pm, "Pete Dashwood" <dashw...(a)removethis.enternet.co.nz> wrote:
>>> Fortunately, as time goes by, people on both sides of this conflict are getting wiser. Eventually (and it will be a very long time) they will realise that continued warfare is in nobody's interest and they'll start trading and living together. As people become better educated, they are less likely to be strictly religious, and the religious grounds for war recede.
>> Witness the CIA bomber (a doctor) in Afghanistan.
> And certainly people educated in theology.
> I haven't seen a good correlation between education or intelligence and Righteousness that says the other guys are wrong and need to be stopped or punished. This seems to be more innate.

Yes, I thought more about that after reading Alistair's post. Obviously, if the education is received in a religious school, then the opposite effect to the one I hoped for is likely.

Nevertheless, I still hold education to be the main light that will bring people out of darkness. Of course, that education doesn't have to be in Universities. It can be in response to the yearning for learning that drives people to "go and find out"...

Pete.
--
"I used to write COBOL...now I can do anything."
From: Pete Dashwood on 22 Jan 2010 21:55
Charles Hottel wrote:
> "Pete Dashwood" <dashwood(a)removethis.enternet.co.nz> wrote in message news:7rtoeoFd3qU1(a)mid.individual.net...
> <snip>
>> I think we part company here, Charlie. I don't believe people are evil (I do believe there are aberrated people who behave very badly, but I don't believe they were born that way...). I don't believe in "original sin", and I can't believe a newborn baby is anything but a clean slate that life experience and the intellect of the child will write on.
> <snip>
> I worded this poorly. I believe people are capable of both 'good' actions and 'bad' actions, depending upon their circumstances, the pressures they are under, their ability to control their emotional and impulsive responses, and a host of other factors.
> Suppose that nanotechnology existed that could be fed a good but imperfect terrorist profile as input, and could seek out, identify, and count all the people who fit the profile. Further suppose that some innocent people would be counted as terrorists and some potential terrorists would escape detection, but that a limit to the margin of error could be determined. Suppose it could be determined what the number of auxiliary casualties would be, such as children of terrorists who might not survive if their parents were eliminated. Lastly, suppose that the capability to kill all those identified as a potential threat exists.
> You say the 'good' guys would not deploy this technology. If it were the day after 9/11, would you still say that? What if there was intelligence that a dirty bomb attack on New York City was in the process of being set up and that one million people would die? What if it were two million, or three million? What if it was a coordinated attack on 3 cities, or 5 cities, or 10 cities, or more?
> At some point the 'good' guys would weigh their potential casualties against the number of potential 'enemy' casualties. What if killing all the "enemy" would result in the death of fewer people than if nothing were done? What would be the most moral choice?
> I think the answer to what would be done is clear just from the history of previous wars. WWII did not involve nanotechnology, but decisions were made that resulted in the deaths of 50 to 70 million people, and they are still just as dead even though it was more conventional technology that caused their deaths. The only difference is that nanotechnology is potentially more powerful and possibly more discriminating in its selection of whom to kill. Today we use smart bombs and say they cause fewer collateral casualties.
> As our power to create ever more powerful technologies increases, our wisdom to decide whether or not to use them is not increasing at the same pace. I prefer beings that have God-like power to also have God-like wisdom.
> I have played God, not out of choice, and I hated it. I had to decide whether and when to have my dog put to sleep. I made the best decision that I could. Was it right and timely? I do not know. Did I wait too long, and did my dog suffer more than she should have? Did I act too soon and take away good days of remaining life? I don't know, and I don't know how I could have known the best time to act. I just have to live with my choice.
> As you know, I went through the experience of losing a baby. I will skip the details of all the decisions that had to be made, the variable quality of the information we had to base those decisions on, how that information changed over time, and the conflicting opinions of the various doctors. These decisions not only affected our baby but also my wife and her potential to have a baby in the future. In retrospect we made some bad decisions, and we were lucky that circumstances played out so that it would at least not be dangerous for her to have a baby in the future. In fact, the actual consequences we experienced were the same as if all our decisions had been 100% correct. We were very, very lucky in that part at least.
> Will politicians and military strategists agonize over the potential deaths of millions as much as I did over my wife, baby, and dog? I doubt it. Perhaps I am too tender-hearted or possibly even emotionally immature, but perhaps the people deciding on the proper uses of nanotechnology are too cynical or too concerned with maintaining their positions. Only time will tell, and the future of the human race may rest upon the decisions that are made. I only wish I thought and felt that we are ready to make these momentous decisions. The time to get ready is growing very short, and the majority of people have no clue that this is approaching.

A really well written post, Charlie.

I thought about your points long and hard, and have nothing to add. I also think your conclusion that we may be about to obtain weapons we are not emotionally ready for is an excellent one.

The best analogy I could think of is that it's like giving an Uzi submachine gun to a 9-year-old who is "normally well behaved". So you'd just say: "Don't give him the gun..."

But this is not a gun. It is a technology that could help millions of people. We cannot suppress it. The best we can do is educate the child to the enormity of the power in his hands.

I agree that it's a frightening thought.

Pete.
--
"I used to write COBOL...now I can do anything."