You can substitute "the laws of physics" for "Omega" in your argument, and if it proves you will not decide rationally in the Omega situation, then it proves you will not decide --anything-- rationally in real life.
Presumably (or at least hopefully), if you are a rational agent with a certain DT, then a long and accurate description of the ways that "the laws of physics" affect your decision-making process breaks down into
It's not clear how a reduction like this could work in your example.
This is part of a sequence titled "An introduction to decision theory". The previous post was Newcomb's Problem: A problem for Causal Decision Theories
For various reasons I've decided to finish this sequence on a separate blog. This is principally because a large number of people seemed to feel that this sequence either wasn't up to the Less Wrong standard or was simply covering ground that had already been covered on Less Wrong.
The decision to post it on another blog, rather than simply discontinue it, came down to the fact that other people seemed to feel that the sequence had value. Those people can continue reading it at "The Smoking Lesion: A problem for evidential decision theory".
Alternatively, there is a sequence index available: Less Wrong and decision theory: sequence index