Update: Beliefs that are about the world in general, and not about yourself in particular (i.e., things you don't want to say about yourself).
Any belief that is the opposite of a social construct that most people around me have internalized. I'd give an example if I could post anonymously.
As for the "if I had" part: first, my political views, no matter how nuanced, because of the mushiness of the territory. Secondly, my opinions on life and how to live it; it is too burdensome for me to bear the weight of guiding other people down a path about which I myself don't have a clue. And finally, my views on the effects of progress, technological or otherwise. I think all of these boil down to anxiety about being judged rather than actually being judged. Anyway, listing the strong ones below:
Abolish income tax completely in developing countries until they catch up with the first world. (I feel it would encourage risk-taking, a primary bottleneck for growth, and thus strengthen small- and medium-scale companies.)
Stop commoditizing startup wisdom; I feel it creates more failures than successes. The big guns of the startup world need to hold their horses whenever they feel the urge to share their unparalleled insight into what approach made their startups succeed.
The social sciences (psychology, behavioral econ, etc.) have become a kind of porn for the intellectual community, and we should reduce the number of offerings in these fields. Correlation ≠ causation. The statistical evidence that you fetishize and shove down my throat in the name of changing the world and tackling preventable issues does not prevent anything. The worst event will be worse than anything before it, and you won't have seen it coming precisely because nothing like it has ever happened: the Lucretius fallacy.
AI/ML/DL is starting to tread into the realm of unbounded risk, both spatially and temporally. I am not talking in the conventional sense of overlords and biases, but in a more abstract sense of expansion. The prevalent use of a black-box technology whose inner workings we don't understand needs shock absorbers, like those of financial markets, not alignment policies. My point is that we do not need more AI; we need a significant sample of the effects of AI within a mature market like the USA over a significant amount of time (temporal and spatial evidence for its credibility as a reliable technology), with the USA or China acting as the shock absorber.