I want to point out, regarding the first paragraph, that 'yourself > others' and 'others > yourself' are not the only options--I think it is generally possible to ascribe approximately equal moral value to yourself and to all other morally-relevant organisms. This is obviously difficult to do in practice (as is much of morality), but it is tenable as an ideal.

Furthermore, even if you do value yourself more than others, I don't think it necessarily follows that you'll rank other morally-relevant organisms by their similarity to yourself. E.g. I don't ascribe different moral value to an educated European adult than to a San child, even though I'm far more similar to the European adult in terms of cognitive development and genetics.

That's an interesting approach, though I don't currently see how it solves the dilemma. If the premises are that...

1) A random event cannot be sufficiently rational or connected to an individual's character

2) A determined decision cannot be free

...and if both of these effectively reduce (as I think they do) to...

1) We cannot control (choose between) random action selections

2) We cannot control (choose between) deterministic action selections

...I'm not sure how two things which we cannot control can combine into something which we can.

For example, I cannot significantly influence the weather, nor can I significantly influence the orbit of Saturn. No admixture of these two variables yields a result that I can influence any more than I can influence each of them individually. Likewise, if I cannot freely choose actions that are random or actions that are deterministic, I also cannot freely choose actions possessing some degree of both aspects.