The ideal gas does have a mathematical definition of entropy; Boltzmann used it in his statistical derivation of the second law: https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
Here is an account of Boltzmann's work and the first objections to his conclusions: https://plato.stanford.edu/entries/statphys-Boltzmann/
I think you are not considering some relevant points:
1) the artificial system we are considering (an ideal gas in a box) (a) is often used as an example to illustrate, and even to derive, the second law of thermodynamics by means of mathematical reasoning (Boltzmann's H-theorem), and (b) this is because it actually appears to be a prototype for the idea of the second law of thermodynamics: it is not just a random example, it is the root of our intuition about the second law
2) the post is talking about the logic behind the arguments which are used to justify the second law of thermodynamics
3) The core point of the post is this:
An ideal gas in a box is an ergodic system. The Poincaré recurrence theorem states that a volume-preserving dynamical system (i.e. any conservative system in classical physics) returns infinitely often to any neighbourhood (as small as you want) of almost any point of phase space.
"What mechanism exists to cause the particles to vary in speed (given the magical non-deforming non-reactive box we are containing things in)?"The system is a compact deterministc dynamical system and Poincarè recurrence applies: it will return infinitely many times close to any low entropic state it was before. Since the particles are only 3 the time needed for the return is small.
"conditional on any given (nonmaximal) level of entropy, the vast majority of states have increasing entropy"I don't think this statement can be true in any sense that would produce a non-symmetric behavior over a long time, and indeed it has some problem if you try to express it in a more accurate way:1) what does "non-maximal" mean? You don't really have a single maximum, you have a an average maximum and random oscillations around it2) the "vast majority" of states are actually little oscillations around an average maximum value, and the downward oscillations are as frequent as the upward oscillations3) any state of low entropy must have been reached in some way and the time needed to go from the maximum to the low entropy state should be almost equal to the time needed to go from the low entropy to the maximum: why shold it be different if the system has time symmetric laws?
In your graph the transition from high entropy to low entropy takes very little time compared to the time needed to reach high entropy again, but would this make the high-to-low transition look more natural or more "probable"? Maybe it would look even more unnatural and improbable!
Good point, but gravity could be enough to keep the available positions in a bounded set.
You do have spontaneous entropy decreases in very "small" environments. For a gas in a box with 3 particles, entropy fluctuates on human timescales.
In order to apply Poincaré recurrence, it is the set of available points of the phase space that must be "compact", and this is likely the case if we assume that the total energy of the universe is finite.
Entropy "reversal" - i.e. decrease - must be equally frequent as entropy increases: you cannot have an increase if you didn't have a decrease before. My graph is not quantitatively accurate for sure but with a rescaling of times it should be ok.
Ok, but even if I remove the idea of "entropy" from my argument, the core problematic issue is still there: there is a 50% probability that our universe is evolving in the opposite direction and an incredibly long chain of unbelievably improbable events is happening; and even if it is not happening right now, it would happen with the same frequency as the standard "probable" evolution.