The three laws of robotics are: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. (2) A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

I believe that AI will happen because the science of robotics is advancing, but only within limits. I think it will be an overall good thing. The worst-case scenario I can think of is that a robot may kill someone; the best-case scenario I can think of is that robots will save lives.

I feel that the "Measure of a Man" episode was cool. I do not think it is really realistic, but it did stimulate thoughts about possible problems with robotics, and I think it was a valuable use of class time. I think robots should be considered a crossroads case because they are man-made: humans should be able to own them to some extent, but robots should be given freedom in other respects. I think they should be treated more like a pet, because they cannot do everything humans can.