The entropy of a message (H) increases with the uniformity of its probability distribution (P(x) for each possible x). If the message has just two possible values, H is greater the closer the split between their probabilities is to 50/50, and maximal (one full bit) at exactly 50/50.
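A quick sketch of the two-value case, using the standard Shannon formula (function name mine):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a two-outcome distribution (p, 1 - p)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Entropy peaks at the 50/50 split and falls off as the split skews:
for p in (0.5, 0.7, 0.9, 0.99):
    print(f"p = {p:>4}: H = {binary_entropy(p):.3f} bits")
```

A 50/50 message carries exactly 1 bit; a 90/10 one carries only about 0.47 bits.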
The entropy of a message is, intuitively, its expected information content. Thus you can learn more efficiently by seeking messages generated in higher-entropy ways.
Assuming that people ask questions to get information, and that questions are strictly yes-or-no, or otherwise have two main answers (as in "which direction — east or west — leads to our destination?"), the best questions are those which separate options of roughly equal prior probability to the questioner.
The prior probability does not necessarily match the intuitive (but kinda meaningless) "objective probability". E.g. with no specific information for the scenario, the "which direction" question is a 50/50 split, but if you're near the east coast of an island containing an otherwise-unknown destination, your priors should be biased in favour of "west".
There are at least two uses of this maxim. You can use it yourself to guide your choice of questions. And you can assume others follow it: when they seem to violate it by asking weird binary questions, infer that at least one of the maxim's assumptions is false:
Alas, I don't (yet) know the relative frequency of those listed confusion modes.
Option 5: the questioner is optimizing a metric other than what appears to be the post's implicit one, "get max info with a minimal number of questions, ignoring communication overhead", which is IMHO a weird metric to optimize for in the first place. Not only does it not take the length/complexity of each question into account, it also ignores things like maintaining the answerer's willingness to keep answering questions, not annoying the answerer, and ensuring proper context so that a question is not misunderstood. And that's before considering the possibility that a questioner who does care about getting the information might simultaneously care about other things as well.