This is much more likely to be because people with more severe disease are more likely to be noticed in the early stages of an outbreak, while in the later stages everyone, including those with mild disease, is noticed.
I re-read the studies originally included in this post looking for data relating illness severity to fever. Two of the studies compared the rate of fever in ICU vs non-ICU COVID-19 patients. This one found that 13/13 ICU patients and 27/28 non-ICU patients had fever; this one found that 36/36 ICU patients and 100/102 non-ICU patients had fever. This study broke its patients down into a hospitalized subset, in which 6/6 had fever, and a non-hospitalized subset, in which 58.3% had fever; I had mis-read this one on the first pass when writing the post, and wrote down the percent with fever in the hospitalized subset (100%).
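For reference, here is a quick sketch of my own (not from any of the studies) that pools the counts from the two ICU comparisons above and checks how different the fever rates actually are; the numbers are exactly the ones quoted above.

# Quick arithmetic check (my own sketch, using only the counts quoted above):
# pooled fever rates in the ICU vs non-ICU groups of the two comparisons.
from scipy.stats import fisher_exact

icu_fever, icu_total = 13 + 36, 13 + 36            # 49/49 ICU patients with fever
non_icu_fever, non_icu_total = 27 + 100, 28 + 102  # 127/130 non-ICU patients with fever

print(f"ICU:     {icu_fever}/{icu_total} = {icu_fever / icu_total:.1%}")
print(f"non-ICU: {non_icu_fever}/{non_icu_total} = {non_icu_fever / non_icu_total:.1%}")

# 2x2 table of (fever, no fever) by group; Fisher's exact test handles the zero cell.
table = [[icu_fever, icu_total - icu_fever],
         [non_icu_fever, non_icu_total - non_icu_fever]]
_, p_value = fisher_exact(table)
print(f"Fisher exact p = {p_value:.2f}")

Fever is nearly universal in both ICU comparisons, so most of the severity signal in these studies comes from the hospitalized versus non-hospitalized split.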
This does seem to be consistent with fever being more common among the subset of patients who are severe enough to be hospitalized. I have edited the post to reflect this interpretation, and corrected the one study that was mis-extracted; you can see the old version of the post here. This weakens my belief that the rate of fever at onset has already declined, though I still think it is somewhat likely, and I still expect it to decline in the future.
If that's the reason, then the same study-series should also have a declining rate of ARDS. I'll check.
Hmm, can you think of a plausible biological mechanism by which a virus could evolve to not cause fever, or to cause fever later than usual? My initial reaction is to be skeptical that fever screening would produce the effects you suggest, mainly because whether or not you get a fever is mostly a function of your innate immune system kicking in, not a function of the virus; it is mostly out of the virus's control, so to speak. The virus could perhaps evolve methods of evading innate immunity, but the other examples I've seen of viral adaptation to innate immunity seem to involve complex mechanisms, which I would guess would not evolve on timescales as short as the ones we're concerned with here (although here I'd welcome correction from someone with more experience in these matters).
But even if there were potential evolutionary solutions close at hand for a virus to evolve evasion of host innate immune responses, I'm not sure that fever screening would really accelerate the discovery of those solutions, given that the virus is already under such extreme selection pressure to evade host immunity. After all, the virus has to face host immune systems in literally every host, whereas fever screening only applies to a tiny fraction of them.
If I follow your argument, fever screening will catch those infected with strains of the virus that tend to cause a higher fever than other strains. As we control those cases, the others go on to reproduce, and so the virus population mutates toward not causing fever. Is that the basic position?
If so, that seems to assume that each person reacts to the infection in the same way, rather than the virus being the same but people reacting to it differently. If it's the latter, we will miss those who don't respond to the infection with a fever (or with as strong a fever), but that should not create any selection pressure for a low-fever version of the virus.
Is my thinking here nonsense?
If I follow your argument, fever screening will catch those infected with strains of the virus that tend to cause a higher fever than other strains. As we control those cases, the others go on to reproduce, and so the virus population mutates toward not causing fever. Is that the basic position?
Yes.
If so, that seems to assume that each person reacts to the infection in the same way, rather than the virus being the same but people reacting to it differently. If it's the latter, we will miss those who don't respond to the infection with a fever (or with as strong a fever), but that should not create any selection pressure for a low-fever version of the virus.
Yes. The relevant thing is how much of the variance in fever is caused by mutations in the virus, versus being caused by differences between people. If people vary more in whether they have a fever than the virus varies in whether it causes a fever, this would weaken the evolutionary pressure; if they vary enough more, then my prediction of declining fever-screening effectiveness will be wrong.
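To make that concrete, here is a toy sketch of my own (not anything from the thread, and the parameter values are made up): model each infection's fever tendency as a viral-strain effect plus a host effect, remove the infections that cross a screening threshold, and see how far the mean strain effect shifts among the survivors. That shift, the per-generation selection response on the virus, shrinks as host variance comes to dominate.

import random

def strain_shift(virus_sd, host_sd, threshold=1.0, n=200_000, seed=0):
    # Each infection's fever tendency = strain effect + host effect.
    # Screening removes infections whose tendency exceeds the threshold;
    # return how far the mean strain effect moves among the survivors.
    rng = random.Random(seed)
    all_effects, surviving_effects = [], []
    for _ in range(n):
        v = rng.gauss(0, virus_sd)   # contribution from the viral strain
        h = rng.gauss(0, host_sd)    # contribution from the individual host
        all_effects.append(v)
        if v + h < threshold:        # not caught by fever screening
            surviving_effects.append(v)
    return (sum(surviving_effects) / len(surviving_effects)
            - sum(all_effects) / len(all_effects))

for virus_sd, host_sd in [(1.0, 0.1), (0.5, 0.5), (0.1, 1.0)]:
    print(f"virus_sd={virus_sd}, host_sd={host_sd}: shift = {strain_shift(virus_sd, host_sd):+.3f}")

In this toy setup the strain-level response is roughly proportional to the share of the variance that is viral rather than host, the same intuition as the breeder's equation in quantitative genetics.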
I have heard anecdotal evidence that this has happened, but the only official numbers I can find are from the early Wuhan study. Does someone have a better sense of whether or not this has happened?
Evading fever screening, on the other hand, involves greater selective pressure and so may happen on a faster time scale, possibly fast enough to significantly influence the shape of the pandemic this year.
Fever is part of the immune response: it creates a less hospitable environment for pathogens when the body is at a temperature above normal (actually a good thing, even if it makes you feel shit). Pathogens have always had to deal with it.
How effective is fever-screening now?
Do you have any data?
(I've had a quick look at some of the figures you linked to; they seem hospital-based, so I'm not sure how they connect to screening.)
What are the effects of screening?
I would think people who feel ill are more likely to stay home if they think they might get 'caught' by a screener. That is, the presence of screening is more likely to change the behaviour of the infected than the evolution of the virus.
Attitudes to illness have shifted: if I've learned anything from adverts for cold and flu remedies, it's that you can take one and just get back to work with a smile, never mind whether you're contagious or not...
Drug resistance evolves slowly
Can you justify that statement? It feels like words used to justify your thinking, but is it grounded in fact? What drugs, pathogens, and mechanisms of resistance are you talking about? So many possibilities...
COVID-19 Is Under Strong Selective Pressure
Fever screening is the practice of checking the temperature of everyone passing through a checkpoint, such as at an airport or public gathering, to detect fever. This can be done individually with regular thermometers, or on entire crowds with infrared cameras. Fever screening has been widely deployed against COVID-19, especially in China, since fever is its most common and often chronologically first symptom.
Fever screening applies an evolutionary pressure on pathogens to not cause fever, or to cause fever later relative to when the disease becomes transmissible. Pathogens mutate. In many diseases, this results in drug resistance; antibiotics, for example, lose effectiveness over time.
Drug resistance evolves slowly, because partial resistance is only a slight benefit to a pathogen which acquires it; most patients who receive antibiotics do not infect anyone after their treatment has started. Evasion of public-health screening procedures, on the other hand, is likely to evolve much faster, because any incremental improvement in a pathogen's ability to evade screening will give it a large advantage.
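As a rough back-of-envelope sketch of how large this advantage can be (my own illustration, with made-up parameter values): if screening catches a fraction s of fever-presenting carriers, and being caught blocks a fraction f of a carrier's remaining transmission, then a variant that is never flagged keeps those transmissions and gains a relative fitness advantage of roughly 1/(1 - s*f) per transmission generation.

# Back-of-envelope sketch (my own, with illustrative made-up numbers):
# relative fitness of a screening-evading variant versus one that is caught.
def relative_fitness(screening_coverage, fraction_blocked_if_caught):
    lost = screening_coverage * fraction_blocked_if_caught  # share of transmission removed
    return 1.0 / (1.0 - lost)

for s in (0.1, 0.3, 0.6):
    for f in (0.25, 0.5):
        print(f"coverage {s:.0%}, blocked if caught {f:.0%}: "
              f"~{relative_fitness(s, f):.2f}x advantage per generation")

Because the advantage compounds every transmission generation, even the smaller values in this range add up quickly over the course of an outbreak.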
Evidence This Is Already Happening
I searched Google Scholar and the citations on this page for papers recording the percentage of patients who had a fever. For each paper, I extracted the percentage of patients reported to have a fever, and the date range of cases covered by the data. If the paper did not state the date range of its cases, I wrote down the paper's publication date instead. I excluded papers which didn't report a percentage of patients with fever, or which had n<20.
I found seven studies, listed below sorted by end date.
Dec 16-Jan 2: 98%
Jan 1-20: 83%
Jan 1-28: 98.6%
Jan 6-31: 58.3%
Pub. Feb 18: 78.2%
Jan 21-Feb 15: 85.7%
Before Feb 20: 87.9%
This is suggestive of a negative correlation between study date and percent of patients with fever. This is what you would expect if COVID-19 were evolving to evade fever screening; however, it could also be explained by later studies finding earlier and less severe cases. This is weak evidence compared to what could be obtained by someone with access to a good dataset; these papers come from different sample populations and different points in the disease progression, and don't all specify what their cutoff temperature was for diagnosing fever. If someone has access to a suitable dataset, I ask that they do the analysis and publicly state whether they see a declining rate of fever.
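For what it's worth, here is a minimal sketch of the eyeball test above (my own, treating the list position as a crude proxy for end date, since the reported date ranges aren't directly comparable):

# Minimal sketch (my own): rank correlation between each study's position in the
# list above (a crude proxy for end date) and its reported fever rate.
from scipy.stats import spearmanr

fever_pct = [98, 83, 98.6, 58.3, 78.2, 85.7, 87.9]  # in the order listed above
order = list(range(1, len(fever_pct) + 1))
rho, p = spearmanr(order, fever_pct)
print(f"Spearman rho = {rho:.2f}, p = {p:.2f}")

With only seven heterogeneous studies, nothing this crude will be conclusive either way; the dataset analysis requested above is the real test.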
Other Properties Will Likely Also Change
It is well known that most diseases evolve to become less severe over time, because patients with severe cases are easier to detect and will take greater precautions. However, this takes place on very slow time scales. Evading fever screening, on the other hand, involves greater selective pressure and so may happen on a faster time scale, possibly fast enough to significantly influence the shape of the pandemic this year.
I don't know how fever-screening evasion relates to disease severity; I can see plausible mechanisms by which evading screening could make severity either increase or decrease.