I think that instilling a set of moral and ethical guidelines in a sentient computer is, in Data's case, in some ways a replacement for empathy and emotions. Lore had full emotions and no such guidelines, and the result was a disaster. Equally, however, ANY behaviour, including emotions, is a function of programming and hardware design by intent, much as genetics determine our behaviours and responses. So in that context it isn't slavery to graft on emotions, feelings, or ethical subroutines: the slavery aspect is maintaining them by force. Humans, through both nature and nurture, are instilled with a set of ethical guidelines, but it is their choice whether to act on them, overwrite them, or ignore them as they wish. Data could rewrite his subroutines and learn new things, and new ways of thinking, so he was never a slave to them.
Instilling a defined sense of right and wrong, as shown in the film 'I, Robot', created dangerously homicidal and sociopathic creatures, because their forced logic didn't allow them to make up their own minds! Equally, a raw intelligence without some form of compassion or empathy, the most basic of all human abilities, the ability to see someone else's viewpoint (which some choose to ignore!), would likely become sociopathic by nature, even homicidal.
It also has to be said that, without pre-programming of any sort, a machine intelligence would likely take just as long as a biological child to learn how to do things, perhaps even longer, because at least biological creatures come with inbuilt abilities, such as the capacity to learn language, selfishness, motor skills, fight-or-flight reactions, or even a sense of danger and morality. Our biology gives us a huge amount of context before we begin: if you're going to give machines all of that raw knowledge, you should likewise give them all of the rest of it too; to do otherwise would be dangerous, and I doubt the Federation (or any other non-stupid race) would allow such an experiment to begin in the first place! A biological child which could instantly learn everything from a download, and have full language skills and mobility from birth, would present just as much of an ethical and moral debate as a machine would. People forget that we are fantastically intelligent machines too: our brains are far more powerful than any predicted technology for centuries to come (even in a Trek context!). The power of the notion of 'artificial' intelligence lies only in pre-programming and the instant transmission of data. Transferring those abilities to biological machines would result in just as sophisticated entities. It has to be noted, however, that the Borg are just such creatures!
What would happen if a bunch of machines, because they have 'free will', simply decided that their computing power could be increased exponentially by adding new processors to themselves, through 'telepathy' or subspace communications networks? You would then have an all-machine Borg, which could make new units through replication, avoiding even the few weeks or months of 'maturation' that Borg Drones need (though lacking, of course, the ability to assimilate in the same way... or maybe they could?!). That's quite a frightening prospect!