Instrumental rationality: achieving your values. Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as "winning".
In my opinion, Wikipedia puts things much better here:
Rationality is a central principle in artificial intelligence, where a rational agent is specifically defined as an agent which always chooses the action which maximises its expected performance, given all of the knowledge it currently possesses.
The advantage Wikipedia has is that it talks about expected performance on the basis of the available information, not about actual performance. That emphasis is correct - rationality is (or should be) defined by whether the operations performed on the available information constitute correct use of the tools of induction and deduction, not by whether the information the agent happens to have is accurate or useful.
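The distinction can be made concrete with a toy decision problem. In this sketch (all states, actions, payoffs, and probabilities are illustrative, not from the thread), the agent maximises *expected* payoff under its current beliefs, and that choice can still lose to another action once the actual state of the world is revealed:

```python
# The agent's credences over states of the world - these may well be wrong,
# but they are all the information the agent has.
beliefs = {"rain": 0.3, "sun": 0.7}

# payoffs[action][state]: how well each action performs in each state.
payoffs = {
    "take_umbrella": {"rain": 10, "sun": 4},
    "leave_umbrella": {"rain": 0, "sun": 6},
}

def expected_payoff(action):
    """Expected performance of an action, weighted by the agent's beliefs."""
    return sum(p * payoffs[action][state] for state, p in beliefs.items())

# The rational choice maximises expected payoff given available information.
rational_choice = max(payoffs, key=expected_payoff)

# Suppose the actual state turns out to be "sun": with hindsight,
# "leave_umbrella" would have scored higher. The choice was still rational.
actual_state = "sun"
best_in_hindsight = max(payoffs, key=lambda a: payoffs[a][actual_state])
```

Here `rational_choice` is `"take_umbrella"` (expected payoff 5.8 vs 4.2), while `best_in_hindsight` is `"leave_umbrella"` - the gap between trying to win and winning.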
This has been discussed many times: there is a distinction between trying to win and winning.
Exactly. Rationality is a property of our understanding of our thinking, not the thinking itself.
Being rational doesn't mean choosing correctly; it means having a justified expectation that the choices you're making are correct.
I wrote an Admin page for "What do we mean by 'Rationality'?" since this has risen to the status of a FAQ. Comments can go here.