Anyway, I think Asimov felt that, just as people have a little trouble adjusting to someone coming home from war (the nagging notion that "he's a killer"), the effect would be exaggerated with robots: people would never accept into their communities the same robots that had been used for combat (the nagging belief that one might "go off" and kill the people around it). The only way people would accept robots is if they posed absolutely no threat.
Actually, judging from statements Asimov made while he was still walking the Earth, he thought the Three Laws were a pretty dumb idea in real life - unworkable and unenforceable - and that they existed solely as a plot device for his stories, in which they frequently don't work. He grumbled that people just didn't get the point.