Eliezer wonders about the thread of conscious experience: "I don't think that, if I were really selfish, I could jump off a cliff knowing smugly that a different person would experience the consequence of hitting the ground."
Instead of wondering whether we should be selfish towards our future selves, let's reverse the question. Let's define our future selves as agents that we can strongly influence, and that we strongly care about. Other aspects round out our intuitive idea of future selves (such as having the same name and possessions, and a continuous thread of conscious experience), but influence and care seem the most fundamental.
In the future, this definition may help clarify issues of personal identity once copying is widespread:
These two future copies, Mr Jones, are they both 'you'? "Well yes, I care about both, and can influence them both."
Mr Jones Alpha, do you feel that Mr Jones Beta, the other current copy, is 'you'? "Well no, I only care a bit about him, and have little control over his actions."
Mr Evolutionary-Jones Alpha, do you feel that Mr Evolutionary-Jones Beta, the other current copy, is 'you'? "To some extent; I care strongly about him, but I only control his actions in an updateless way."
Mr Instant-Hedonist-Jones, how long have you lived? "Well, I don't care about myself in the past or in the future, beyond my current single conscious experience. So I'd say I've lived a few seconds, a minute at most. The other Mr Instant-Hedonist-Joneses are strangers to me; do with them what you will. Though I can still influence them strongly, I suppose; tell you what, I'll sell my future self into slavery for a nice ice-cream. Delivered right now."