Martial arts can be good training for ensuring your personal security, if you assume the worst about your tools and environment. If you expect to find yourself unarmed in a dark alley, or fighting hand to hand in a war, it makes sense. But most people do a lot better at ensuring their personal security by coordinating to live in peaceful societies and neighborhoods; they pay someone else to learn martial arts. Similarly, while "survivalists" plan and train to stay warm, dry, and fed given worst-case assumptions about the world around them, most people achieve these goals by participating in a modern economy.
The martial arts metaphor for rationality training seems popular at this website, and most discussions here about how to believe the truth seem to assume an environmental worst case: how to figure out everything for yourself given fixed info and assuming the worst about other folks. In this context, a good rationality test is a publicly-visible personal test, applied to your personal beliefs when you are isolated from others' assistance and info.
I'm much more interested in how we can join together to believe truth, and it actually seems easier to design institutions which achieve this end than to design institutions to test individual isolated general tendencies to discern truth. For example, with subsidized prediction markets, we can each specialize on the topics where we contribute best, relying on market consensus on all other topics. We don't each need to train to identify and fix each possible kind of bias; each bias can instead have specialists who look for where that bias appears and then correct it.
Perhaps martial-art-style rationality makes sense for isolated survivalist Einsteins forced by humanity's vast stunning cluelessness to single-handedly block the coming robot rampage. But for those of us who respect the opinions of enough others to want to work with them to find truth, it makes more sense to design and field institutions which give each person better incentives to update a common consensus.
Robin Hanson has identified a breakdown in the metaphor of rationality as martial art: skillful violence can be more or less entirely deferred to specialists, but rationality is one of the things that everyone should know how to do, even if specialists do it better. Even though paramedics are better trained and equipped than civilians at the scene of a heart attack, a CPR-trained bystander can do more to save the victim's life, simply because the paramedics take time to arrive. Prediction markets are great for governments, corporations, or communities, but if an individual's personal life has gotten bad enough to need the help of a professional rationalist, a little training in "cartography" could have nipped the problem in the bud.
To put it another way, thinking rationally is something I want to do, not have done for me. I would bet that Robin Hanson, and indeed most people, respect the opinions of others in proportion to how rational those others are. So the individual impulse toward learning to be less wrong is not only a path to winning, but a basic value of a rationalist community.
Another thing that you must do for yourself is politics; sadly EY is right that we can't start discussing that here.