http://johnnypate.livejournal.com/ (johnnypate.livejournal.com) wrote in roxybisquaint's journal 2009-04-21 07:16 am (UTC)

I think you're conflating two entirely different things here. It seems perfectly plausible to develop a robot soldier with parameters for "shoot" and "no shoot" targets built in - which is essentially what you're referring to when you appeal to morality based on systems of logic. But a fixed system of rules is not morality; that's why the law on its own is often so useless and always requires a judge and jury to interpret the particular circumstances if justice is to be done. If what you say were true, there would be no need for any kind of decision-making in the legal process - simply the discovery of the facts pertinent to the parameters required for an algorithm that arrives at "guilty" or "innocent". The reason the jury exists is that this does not, and cannot, work.
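To make concrete what such built-in parameters amount to, here's a toy sketch (the target attributes and the rule are invented purely for illustration, not anyone's actual design):

    # Hypothetical rule-based "shoot" / "no shoot" decision.
    # The attributes and the rule below are made up for this sketch.
    def decide(target: dict) -> str:
        if target.get("armed") and target.get("hostile") and not target.get("civilian"):
            return "shoot"
        return "no shoot"

    print(decide({"armed": True, "hostile": True, "civilian": False}))   # -> shoot
    print(decide({"armed": True, "hostile": False, "civilian": True}))   # -> no shoot

The whole "decision" is a fixed lookup over whatever facts the parameters happen to name; there is no judgment anywhere in it, which is exactly why the same move fails for "guilty" or "innocent".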

I'm referring to the actual underlying emotional context that drives humans to want to construct a system of morality. Yes, tribes would compete for scarce resources and kill as necessary, but they had empathy and compassion and a sense of shared humanity to give them a context for their actions. Robots can never have that, for they are simply machines. That morality is an accident of our biology, not an essential feature of the phenomenal universe, is my point (well, unless you believe in the Christian God, that is).
