It is indeed often the case, but there are real-life scenarios where it's false (e.g. a 480p DVD probably looks better than YouTube's 720p, because the latter has a much lower bitrate), and the causal relationship arguably runs in the opposite direction: lower-resolution video needs less bitrate to look decent, so lower resolutions tend to be chosen when people need to cut down on bitrate.
It's worth noting the mathematically fairly simple and obvious but intuitively annoying fact that if you've tried the 10% chance 9 times with no success, you do not have a 63% chance of succeeding on your next attempt. The ~63% figure is (roughly) the upfront probability of getting at least one success somewhere across all ten attempts (1 - 0.9^10 ≈ 65%, or 1 - 1/e ≈ 63% in the limit); since the attempts are independent, each individual attempt still succeeds with probability 10%, no matter how many failures came before it.
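If you want to check the numbers yourself, here's a quick plain-Python sketch (the 10%/10-attempt figures are just the ones from this thread):

```python
import random

p = 0.10   # chance of success on a single attempt
n = 10     # number of attempts

# A priori chance of at least one success across all n attempts.
at_least_one = 1 - (1 - p) ** n
print(f"P(at least one success in {n} tries) = {at_least_one:.3f}")  # ~0.651

# Chance of success on the next attempt after 9 failures:
# attempts are independent, so it's still just p.
print(f"P(success on attempt 10 | 9 failures) = {p:.3f}")            # 0.100

# Monte Carlo sanity check: keep only runs that start with 9 failures
# and see how often the 10th attempt succeeds.
random.seed(0)
conditional_successes = trials = 0
while trials < 100_000:
    attempts = [random.random() < p for _ in range(n)]
    if not any(attempts[:9]):          # condition on 9 initial failures
        trials += 1
        conditional_successes += attempts[9]
print(f"Simulated: {conditional_successes / trials:.3f}")             # ~0.100
```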
I mean, the ideal settings depend on the content, the use case, and even personal preference when it comes to the details, so it's probably not fundamentally possible to design an ideal video encoding algorithm that automatically produces the best possible quality at the desired file size without any manual fine-tuning. And as long as we're fine-tuning the settings by hand, the output resolution is one of the settings we can change, so in that sense all the encoders already implement that feature.
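As a rough illustration of what "manually fine-tuning the settings" looks like in practice, here's a minimal Python sketch that shells out to ffmpeg with libx264 (assuming ffmpeg is installed; the file names and the resolution/CRF/preset values are purely illustrative, not recommendations):

```python
import subprocess

# Hand-picked settings -- exactly the kind of per-video fine-tuning the
# comment is talking about. None of these values are universally "correct";
# they depend on the content and the size/quality target.
input_file = "input.mkv"      # hypothetical source file
output_file = "output.mp4"
target_height = 720           # output resolution is just another knob
crf = 23                      # lower = better quality, bigger file
preset = "slow"               # slower presets squeeze more quality per bit

subprocess.run([
    "ffmpeg", "-i", input_file,
    "-vf", f"scale=-2:{target_height}",   # downscale, keep aspect ratio
    "-c:v", "libx264", "-crf", str(crf), "-preset", preset,
    "-c:a", "copy",                       # leave the audio untouched
    output_file,
], check=True)
```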