Self-driving cars have to be programmed to resolve those kinds of “trolley problems,” as they’re known.
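To make that concrete: any collision-avoidance planner ends up encoding a preference ordering over bad outcomes, whether or not the engineers frame it morally. Here’s a minimal Python sketch of that idea; the Outcome class, the harm scores, and choose_trajectory are all invented for illustration and don’t reflect how any real vehicle is actually programmed.

```python
# Hypothetical sketch: a planner choosing between two bad outcomes.
# The outcome classes and harm weights below are invented for
# illustration; real systems are vastly more complex.

from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    expected_harm: float  # invented scalar "harm" score; 0 = no harm


def choose_trajectory(stay: Outcome, swerve: Outcome) -> Outcome:
    """Pick whichever outcome the harm model scores lower.

    The moral choice is hidden inside expected_harm: whoever
    assigns those numbers has already answered the trolley problem.
    """
    return stay if stay.expected_harm <= swerve.expected_harm else swerve


if __name__ == "__main__":
    stay = Outcome("brake hard, risk striking the pedestrian ahead", 0.9)
    swerve = Outcome("swerve, risk injuring the passenger", 0.4)
    print(choose_trajectory(stay, swerve).description)
```

The point being: whoever picks those harm numbers has already taken a stance on the trolley problem, which is exactly the kind of specific case a general position on moral justification needs to handle.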
DWV - I think you’re the one being circular with your logic.
You also keep asking for very broad stances, but so far you haven’t dealt with the minutiae.
Frankly, it’s starting to feel like you don’t even know exactly what it is you’re asking.
Quote:
This all sounds very circular...
"I'm justified in my position because it's moral to me, which makes it justified"
The reason it isn’t circular is that the onus is on the person who takes action (or chooses not to act) to make an acceptable case for why that action is justifiable.
I’ve been avoiding saying this because the “semantics” argument gets tiresome, but all you’re really asking is whether “justifiable” and “moral” are synonyms. And yes, the meanings overlap.
Like elph, who pointed toward the trolley problem, I think it’s time for you to get into specifics. Tell us how your position applies there.