If hospitals are overwhelmed, it's valuable to have a component of the hospital pneumonia treatment plan on hand to treat yourself or others who have an especially bad case. One of these is an oxygen concentrator, which is not sold out yet and runs ~$400 on Amazon. This doesn't deal with especially severe cases, but for cases which fall in the "shortness of breath, low blood oxygen" class without further medical complications, it'd probably be useful if you can't or don't want to go to a hospital due to overload. https://www.who.int/publications-detail/clinical-management-of-severe-acute-respiratory-infection-when-novel-coronavirus-(ncov)-infection-is-suspected mentions oxygen treatment as the first thing to do for low blood oxygen levels.
I found a paper about this exact sort of thing. Escardó and Oliva call that type signature a "selection functional", the type signature (A→B)→B is called a "quantification functional", and there are several interesting things you can do with them, like combining multiple selection functionals into one in a way that looks reminiscent of game theory. (I.e., if ϵ has type signature (A→C)→A, and δ has type signature (B→C)→B, then ϵ⊗δ has type signature ((A×B)→C)→(A×B).)
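That product of selection functionals can be sketched in Python (a sketch under my own naming: `argmax_sel` and `tensor` are hypothetical names, and I'm assuming the result type C is ordered so an argmax-style selection functional makes sense):

```python
def argmax_sel(xs):
    """Selection functional of type (A -> C) -> A that picks an argmax over xs."""
    return lambda p: max(xs, key=p)

def tensor(eps, delta):
    """Combine eps : (A -> C) -> A and delta : (B -> C) -> B into a selection
    functional of type ((A x B) -> C) -> (A x B)."""
    def combined(q):
        # Choose a as if delta will respond to it with its own selection.
        a = eps(lambda a_: q((a_, delta(lambda b_: q((a_, b_))))))
        # Then choose b in response to the chosen a.
        b = delta(lambda b_: q((a, b_)))
        return (a, b)
    return combined

# Both components pick from {0, 1}; the payoff q rewards coordinating on (1, 1).
pair = tensor(argmax_sel([0, 1]), argmax_sel([0, 1]))(lambda ab: ab[0] * ab[1])
print(pair)  # (1, 1)
```

The first component is chosen as if the second will respond optimally to it, which is where the game-theoretic flavor comes from.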
Oh, I see what the issue is. Propositional tautology given A means A⊢pcϕ, not A⊢ϕ. So yeah, when A is a boolean that is equivalent to ⊥ via boolean logic alone, we can't use that A for the exact reason you said, but if A isn't equivalent to ⊥ via boolean logic alone (although it may be possible to infer ⊥ by other means), then the denominator isn't necessarily small.
Yup, a monoid, because ϕ∨⊥=ϕ and A∪∅=A, so it acts as an identity element, and we don't care about the order. Nice catch.
You're also correct about what propositional tautology given A means.
Yup! The subscript is the counterfactual we're working in, so you can think of it as a sort of conditional pricing.
The prices aren't necessarily unique; we set them anew on each turn, and there may be multiple valid prices for each turn. Basically, the prices are just set so that the supertrader doesn't earn money in any of the "possible" worlds that we might be in. Monotonicity is just "the price of a set of possibilities is greater than the price of a subset of possibilities".
If there's a short proof of ϕ from ψ and a short proof of ψ from ϕ and they both have relatively long disproofs, then counterfacting on ϕ, ψ should have a high value, and counterfacting on ψ, ϕ should have a high value.
The way to read ⊢ is that the stuff on the left is your collection of axioms (A is a finite collection of axioms and A,ϕ just means we're using the stuff in A as well as the statement ϕ as our axioms), and it proves some statement.
For the first formulation of the value of a statement, the value would be 1 if adding ϕ doesn't provide any help in deriving a contradiction from A. Or, put another way, the shortest way of proving ¬ϕ, assuming A as your axioms, is to derive ⊥ and use the principle of explosion. It's "independent" of A, in a sense.
There's a technicality for "equivalent" statements. We're considering "equivalent" as "propositionally equivalent given A" (i.e., it's possible to prove an iff statement with only the statements in A and boolean algebra alone; for example, ¬(ϕ∨ψ)=¬ϕ∧¬ψ is a statement provable with boolean algebra alone). If you can prove the iff but you can't do it with boolean algebra alone, it doesn't count as equivalent. Unless ϕ is propositionally equivalent to ψ, ϕ∨ψ is not equivalent to ϕ (because maybe ϕ is false and ψ is true), which renders the equality you wrote wrong, as well as making the last paragraph incoherent.
In classical probability theory, P(ϕ)+P(ψ)=P(ϕ∨ψ) holds iff P(ϕ∧ψ) is 0. Ie, if it's impossible for both things to happen, the probability of "one of the two things happen" is the same as the sum of the probabilities for event 1 and event 2.
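That identity can be sanity-checked on a finite probability space (a sketch; the four-world space here is made up for illustration):

```python
from fractions import Fraction

# Four equally likely worlds.
worlds = ["w1", "w2", "w3", "w4"]
P = lambda event: Fraction(len(event), len(worlds))

phi = {"w1", "w2"}   # event 1
psi = {"w3"}         # event 2, disjoint from phi
assert P(phi & psi) == 0
assert P(phi) + P(psi) == P(phi | psi)   # disjoint, so the sum rule holds

psi2 = {"w2", "w3"}  # overlaps phi, so the plain sum overshoots by P(phi & psi2)
assert P(phi) + P(psi2) == P(phi | psi2) + P(phi & psi2)
```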
In our thing, we only guarantee equality when ϕ=¬ψ (assuming A). This is because VA(ϕ∨ψ)=VA(ϕ∨¬ϕ)=VA(⊤)=VA(⊤∧A)=VA(A)=1 (the first two = by propositionally equivalent statements getting the same value, the third = by ⊤ being propositionally equivalent to ⊤∧A assuming A, the fourth = by ⊤∧A being propositionally equivalent to A, and the final = by unitarity). Equality may hold in some other cases, but you don't have a guarantee of such, even if the two events are disjoint, which is a major difference from standard probability theory.
The last paragraph is confused, as previously stated. Also, there's a law of boolean algebra that ϕ∨(ψ∧Z) is the same as (ϕ∨ψ)∧(ϕ∨Z). Also, the intuition is backwards: VA(ϕ∧ψ) should be less than VA(ϕ), because the probability that event 1 happens is greater than the probability that both event 1 and event 2 happen.
Highlighting something and pressing ctrl-4 turns it into LaTeX.
Yup, this turned out to be a crucial consideration that makes the whole project look a lot less worthwhile. If ventilation at a bad temperature is available, it's cheaper to just get a heat exchanger, ventilate away, and eat the increased heating costs during winter than to run a CO2 stripper.
There's still a remaining use case for rooms without windows that aren't amenable to just feeding an air duct outside, but that's a lot more niche than my original expectations. Gonna edit the original post now.
Also, a paper on extremely high-density algal photobioreactors quotes algal concentration by volume as being as high as 6% under optimal conditions. The dry mass is about 1/8 of the wet mass of algae, so that's 0.75% concentration by weight. If the algal inventory in your reactor is 9 kg dry mass (you'd need to waste about 3 kg/day of dry weight, or 24 kg/day of wet weight, to keep up with 2 people worth of CO2; that's a third of the algae each day), that's 1200 kg of water in your reactor. Since a gallon is about 4 kg of water, that's... 300 gallons, or six 55-gallon drums, footprint 4 ft x 6 ft x 4 ft high, at a bare minimum (probably 3x that volume in practice), so we get the same general sort of result from a different direction.
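The sizing arithmetic above, as a checkable sketch (all figures are the ones quoted in this comment):

```python
# Reactor sizing from the quoted figures.
vol_conc = 0.06                        # 6% algae by volume under optimal conditions
dry_frac_of_wet = 1 / 8                # dry mass is ~1/8 of wet algal mass
wt_conc = vol_conc * dry_frac_of_wet   # ~0.0075 dry-mass fraction by weight

inventory_dry_kg = 9                   # dry algal inventory needed for 2 people
water_kg = inventory_dry_kg / wt_conc  # ~1200 kg of water in the reactor
gallons = water_kg / 4                 # a gallon is ~4 kg of water -> ~300 gallons
drums_55gal = gallons / 55             # ~5.5, i.e. six 55-gallon drums

print(round(water_kg), round(gallons), round(drums_55gal, 1))
```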
I'd be quite surprised if you could do that in under a thousand dollars.
[EDIT: I see numbers as high as 4 g/L/day quoted for algae growth rates, I updated the reasoning accordingly]
The numbers don't quite add up on an algae bioreactor for personal use. The stated growth rate for chlorella algae is 0.6 g/L/day, and there are about 4 liters in a gallon, so 100 gallons of algae solution is 400 liters, which is 240 g of algae grown per day, and since about 2/3rds of new biomass comes from CO2 via the 6CO2+6H2O->C6H12O6+6O2 reaction, that's 160 g of CO2 locked up per day, or... about 1/6 of a person worth of CO2 in a 24 hour period. [EDIT: 1 person worth of CO2 in a 24 hour period, looks more plausible]
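The same arithmetic as a sketch (the ~1 kg/day of CO2 exhaled per person is my assumption, not a figure from the comment):

```python
growth_g_per_L_day = 0.6          # stated chlorella growth rate
volume_L = 100 * 4                # 100 gallons is about 400 liters
biomass_g_day = growth_g_per_L_day * volume_L  # 240 g of algae grown per day

co2_fraction = 2 / 3              # ~2/3rds of new biomass comes from CO2 (as above)
co2_g_day = biomass_g_day * co2_fraction       # 160 g of CO2 locked up per day

person_co2_g_day = 1000           # assumption: ~1 kg CO2 exhaled per person per day
print(co2_g_day / person_co2_g_day)            # ~0.16, about 1/6 of a person
```

With the edited 4 g/L/day growth rate, the same calculation gives roughly 1 person worth of CO2 per day.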
Plants are inefficient at locking up CO2 relative to chemical reactions!
Also you wouldn't be able to just have the algae as a giant vat, because light has to penetrate in, so the resulting reactor to lock up 1/6 [EDIT: 1] of a person worth of CO2 would be substantially larger than the footprint of 2 55-gallon drums.
I have the relevant air sensor, it'd be really hard to blind the experiment because the machine makes noise, and the behavioral effects thing is a good idea, thank you.
It's not currently with me.
I think the next thing to do is build the 2.0 design, because it should perform better and will also be with me, then test the empirical CO2 reduction and behavioral effects (although, again, blinding will be difficult), and reevaluate at that point.