Tuesday, November 24, 2009

Robotic Ethics Part II

The Three Laws of Robotics, as written by Isaac Asimov:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These are the Three Laws of Robotics as written in I, Robot and other books. If you have ever read the actual book, you will realize that we as humans may never be able to control our own creations. So how do we treat our creations? Are they equals? Are they subject to the same rules, laws, and rights as a human? We created them, so shouldn't they share the rights of their creators? These may be some of the questions that will have to be answered within half a century or so.

I know many people believe there will be some great war once robots realize how destructive we really are, but I believe this is just another fantasy from people who want a war. Humans naturally want an enemy to triumph over...why not one that is not human? One that you can destroy without thinking twice or feeling sorry about it.

I don't think there will ever be a war. I think that one day we will sit side by side with our creations and hopefully work together to solve this world's problems. Maybe they can be our saviors, helping to rid society of its corruption. Having no desire for greed, maybe they can be society's natural regulators: a white light in the distance, putting humans back on a peaceful track, one we have never seen in our whole existence.
