[ not a utilitarian; discount my opinion appropriately ]
This hits one of the thorniest problems with Utilitarianism: expected value-over-time comes out very differently depending on the timescale and assumptions you pick.
If one is thinking truly long-term, it's hard to imagine a resource more valuable than knowledge and good epistemics. I guess tradeoffs have to be made about WHICH knowledge to gain or lose, but that's an in-category comparison, not a cross-category one. Oh, and trading it away to prevent the total annihilation of all thinking/feeling beings is probably right.