As far as I can tell, epistemic and instrumental rationality are two related Arts, even to EY, but both fall under the banner of "Rationality" because they work toward the same goal: optimal thinking. (I can't cite any specific examples right now, but I'll throw it out there anyway.)
Another reason for the comparative inefficiency of x-rationality could be lack of information. Epistemic rationality is the Art of filtering and modifying information for greater accuracy; instrumental rationality is the Art of using all available information to maximize your values. Both techniques increase the benefit you gain from information. But when you don't know all that much, the fine-tuning techniques of x-rationality have an extremely low return, since they increase your benefit by such a small percentage. There IS an element of akrasia here, in that we could go learn more if we weren't so lazy, but it's not really the same thing.
Goals are yet another problem, which you mentioned already. People just don't need rationality for routine tasks; that's what habits are for! Would you think rationally about how to brush your teeth? More than once? And many of our plans for the future require large amounts of patience but not much thinking to get a 'good enough' result, so most of our focus goes to being patient, which is the rational course of action.
There's no reason to change your goals just for the sake of getting to use rationality, but some other ways of getting more out of it (not necessarily the best ones, of course) could be:
- Low-"short-term"-investment tasks that would force you to study (like installing a productivity program that blocks access to certain sites, as many people have already done)
- Increasing the entertainment value of studying, the clichéd option (OpenStax CNX has made textbooks that are slightly more interesting than normal, but I don't think that will be enough for most of the population)
- Meditation, another cliché: it increases patience, and you can work on analyzing and fixing stray beliefs you find floating around your brain
- Recording your thoughts, observations, actions, reasons for those actions, etc. in some sort of portable device (like a notebook or phone). I know Yvain already mentioned this, but I want to keep everything in a single list.
- (If you're willing to do so) Posting those recorded thoughts on LessWrong, especially the actions and their reasons, for critical review
Any other ideas?