My intuition (probably widely shared) suggests that uncertainty about the future is stress-inducing and that reducing uncertainty about the future is helpful because it allows us to plan. So I started trying to invent thought experiments that would begin to help me quantify how much I (and others) value uncertainty reduction... and then I began to get confused. Below I'll share two examples focused on knowledge about the next five years of one's career, but similar psychological/philosophical issues would arise in many other contexts.

Thought Experiment #1: the risk of job loss

Imagine the true probability that you involuntarily lose your job at some point in the next 5 years is either 0% or 50%, and that your current best guess is that you have a 25% chance of losing your job. For a price, an oracle will tell you whether the truth is 0% or 50%. How much will you pay?

If you think you can answer this question, please do so. Part of my confusion is that knowing the probability I will lose my job seems certain to affect the probability that I lose my job. If the oracle told me the probability was 50%, I'd respond with a combination of working overtime and looking for other jobs that should reduce that probability. And if the probability remains 50% despite those efforts, knowing it puts me in a much less pleasant situation than the one where the probability is 50% but I believe it's a more reasonable 25%.
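For what it's worth, if we set aside exactly the complication above and assume (questionably) that learning the probability does not itself change it, the question has a standard "expected value of perfect information" answer. Here's a minimal sketch with entirely made-up numbers: losing the job costs 100 utility units, and a costly mitigation effort (overtime plus job searching, costing 10 units) halves the firing probability.

```python
# Toy value-of-perfect-information calculation for Thought Experiment #1,
# under the simplifying assumption that knowing the probability does not
# change it. All numbers are arbitrary and for illustration only.

LOSS = 100.0           # cost (in utility units) of losing the job
MITIGATE_COST = 10.0   # cost of overtime / job-search effort
MITIGATE_FACTOR = 0.5  # mitigation halves the firing probability

def expected_cost(p, mitigate):
    """Expected cost given firing probability p and a mitigation choice."""
    if mitigate:
        return MITIGATE_COST + MITIGATE_FACTOR * p * LOSS
    return p * LOSS

def best_cost(p):
    """Cost of the better of the two actions at believed probability p."""
    return min(expected_cost(p, False), expected_cost(p, True))

# Without the oracle, you act on the 25% best guess.
cost_without = best_cost(0.25)

# With the oracle, the true p is 0% or 50% with equal odds, and you act
# optimally in each branch.
cost_with = 0.5 * best_cost(0.0) + 0.5 * best_cost(0.5)

value_of_information = cost_without - cost_with
print(cost_without, cost_with, value_of_information)  # 22.5 17.5 5.0
```

Under these made-up numbers, the oracle's answer is worth 5 units: it lets you skip the mitigation cost entirely in the lucky branch. But this calculation assumes away the feedback loop that's confusing me.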

Thought Experiment #2: uncertainty about future earnings

Imagine your estimate of your total income over the next 10 years is unbiased, and that the random error in your estimate is normally distributed. (Admittedly, a normally distributed error term is unrealistic in this problem, but bear with me for simplicity.) What's a reasonable standard deviation? Let's say 3 years' worth of income. How much would you pay to reduce that standard deviation to 1.5 years of income?

Once again, go ahead and answer this if you can, but I've gotten myself confused here as well... I'm trying to get at the present value of reducing uncertainty about the future, but in this example it appears I'm getting an offer to reduce the actual risk of *experiencing* a much lower than expected income, at the expense of reducing the chance that I earn a much higher than expected income, not just an offer to reduce uncertainty.
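One way to see that this offer prices *risk* rather than uncertainty per se: a risk-averse agent will pay for the variance reduction, while a risk-neutral one won't pay anything. Here's a sketch assuming CARA (exponential) utility, for which a normal gamble has the closed-form certainty equivalent mu - a * sigma^2 / 2; the risk-aversion coefficient a = 0.1 is an arbitrary illustrative choice.

```python
# Thought Experiment #2 for a risk-averse agent, assuming CARA utility,
# under which a Normal(mu, sigma) income has certainty equivalent
# CE = mu - a * sigma**2 / 2. All parameter values are made up.

MU = 10.0  # expected income over 10 years, in years-of-income units
A = 0.1    # CARA risk-aversion coefficient (illustrative)

def certainty_equivalent(mu, sigma, a=A):
    """Certainty equivalent of a Normal(mu, sigma) income under CARA utility."""
    return mu - a * sigma**2 / 2

ce_wide = certainty_equivalent(MU, 3.0)    # sd = 3 years of income
ce_narrow = certainty_equivalent(MU, 1.5)  # sd = 1.5 years of income

# The gap is the most this agent would pay for the variance reduction.
# With a = 0 (risk neutrality), the gap is exactly zero.
max_payment = ce_narrow - ce_wide
print(ce_wide, ce_narrow, max_payment)
```

Under these assumptions the agent would pay about a third of a year's income, and the payment scales directly with the risk-aversion coefficient, which supports the reading that the offer is about risk, not uncertainty as such.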

Any insight into what's going on with my thought experiments would be greatly appreciated. I see some parallels between them and Newcomb's Paradox, but I'm not sure what to make of Newcomb's Paradox either. Relevant references to the philosophy literature would be great; relevant references to the judgment and decision-making or economics literature would be even better.

+1 and many thanks for wading into this with me... I've been working all day and I'm still at work, so I can't necessarily respond in full...

I agree that these problems are a lot simpler if reducing my uncertainty about X cannot help me affect X. That's not a minor class of problems, and I'd love to have better information for many of them. That said, many of the problems it seems most worthwhile for me to spend time and money reducing my uncertainty about are ones where I play a non-trivial role in how they turn out. Assuming I do have some causal power over X, I think I'd pay a lot more to know the "equilibrium" probability of X after I've digested the information the oracle gave me; anything else seems like stale information... but learning that equilibrium probability seems weird as well. If I'm surprised by what the oracle says, I imagine I'd ask myself questions like: how am I likely to react to this information? What must the probability have been before I knew it, such that the current probability is what it is? It feels like I'm losing freedom... to what extent is the experience of uncertainty tied to the experience of freedom?

The equilibrium probability might not be well defined. (E.g., if for whatever reason you form a sufficiently firm intention to falsify whatever the oracle tells you.)

And yes, if the oracle tells you something about your own future actions -- which it has to, to give you an equilibrium probability -- it's unsurprising that you're going to feel a loss of freedom. Either that, or disbelieve the oracle.