If we were able to make intelligent and sentient AIs, wouldn’t that mean we would have to stop programming them? It would be unethical for me to force you to do my will, so wouldn’t the same be true of AIs? [Not that it is ever going to happen, but…]