This is an automated rejection. LLM-generated, heavily AI-assisted/co-written, or otherwise AI-reliant work is not accepted.
This post was co-authored with an AI to help structure my thoughts and translate them from Spanish, as I am an independent researcher without a formal academic background. I am disclosing this in compliance with the AI policy.
The Argument
I have spent the last few months obsessed with a single idea: what if time is not a background dimension, but the energy/information "noise" generated by matter as it resists entropy? I call this Entropic Mutation Theory (EMT).
The core model I want to present is that physical systems don't just "exist" in time; they generate duration by actively processing information to maintain their structural identity. If a system stops resisting its "entropic mutation," its local time stops. I suspect there is a minimum informational cost for this update, which I’ve tentatively placed at 0.5 meV, though this is where my confidence intervals widen significantly.
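As a quick sanity check on that 0.5 meV figure (my number, not an established constant), it can be compared against Landauer's bound, the standard minimum energy cost of erasing one bit, kT·ln 2. A minimal sketch:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K (exact, SI 2019)
EV = 1.602176634e-19    # 1 eV in joules (exact)

def landauer_cost_mev(temp_k: float) -> float:
    """Landauer bound: minimum energy to erase one bit at temp_k, in meV."""
    return K_B * temp_k * math.log(2) / EV * 1e3

# Landauer bound at room temperature (300 K): about 17.9 meV
room = landauer_cost_mev(300.0)

# Temperature at which kT*ln2 equals the conjectured 0.5 meV: about 8.4 K
t_match = 0.5e-3 * EV / (K_B * math.log(2))

print(f"Landauer bound at 300 K: {room:.1f} meV")
print(f"kT*ln2 = 0.5 meV at T ~ {t_match:.1f} K")
```

So 0.5 meV sits roughly 36× below the room-temperature Landauer bound, and would only coincide with it near 8 K; any audit of the model should ask which temperature, if any, the 0.5 meV figure is supposed to refer to.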
Why this is relevant to LessWrong
I am posting this here because I’ve reached a dead end with my tools. I’ve used LLMs to help me formalize these intuitions, and I’ve noticed a disturbing trend: the AI is very good at making my theory sound like physics (using terms like "Lunar Recession" and "Landauer’s Principle"), but I suspect it has started "hallucinating" technical consistency to please my prompts.
I want to use this community as a "sanity check." Is there a valid seed in the idea of Time as Informational Resistance, or have I been led down a rabbit hole of plausible-sounding nonsense by an LLM?
Key points of the model:
Structural Persistence: Duration is the byproduct of the work done to prevent instantaneous state-decay.
Mass-Time Link: Mass is simply "dense information" that requires more "processing power" to maintain, which we perceive as gravitational time dilation.
The AI Trap: I have published 14 documents on Zenodo (Entropic Mutation Theory). I now believe Appendix IX (Numerical Predictions) might be a stochastic collage of existing papers rather than a valid derivation. I need help deconstructing this.
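Whatever the informational story, the Mass-Time link has to reproduce standard general relativity in tested regimes. As a concrete benchmark (textbook GR, not part of EMT), the Schwarzschild clock-rate factor at Earth's surface:

```python
import math

G = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8       # speed of light, m/s (exact)

def gravitational_time_dilation(mass_kg: float, radius_m: float) -> float:
    """Clock rate at radius_m from a mass, relative to a distant clock
    (Schwarzschild metric): dtau/dt = sqrt(1 - 2GM/(r c^2))."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (radius_m * C**2))

# Earth (mass ~5.972e24 kg, radius ~6371 km): surface clocks run slow
# relative to distant ones by roughly 7 parts in 10^10
deficit = 1.0 - gravitational_time_dilation(5.972e24, 6.371e6)
print(f"fractional slowdown at Earth's surface: {deficit:.2e}")
```

Any "processing power" account of time dilation needs to recover exactly this factor, which is confirmed by GPS clock corrections and atomic-clock altitude experiments; that is the kind of quantitative constraint I would like the audit to hold Appendix IX against.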
I am not an academic; I am an office worker trying to apply a rational model to a persistent intuition. I am looking for a rigorous audit of the core logic, specifically where it intersects with Information Thermodynamics.