It wouldn't have made a lot of sense to predict any doublings for transistors in an integrated circuit before 1960, because I think that is when they were invented.
This claim doesn't make much sense from the outset. Look at your specific example of transistors. In 1965, an electronics magazine wanted to figure out what would happen over time with electronics and transistors, so they called up an expert: the director of research at Fairchild Semiconductor. Gordon Moore (that director of research) proceeded to coin Moore's law and tell them the doubling would continue for at least a decade, probably more. Moore wasn't an outsider; he was an expert.
You then generalize from an incorrect anecdote.
I'm not sure the connotation of the term (i.e. that a black person being successful at anything is so shocking it's entertainment value all on its own) makes the statement any better. Especially when discussing, say, one of the most important American musicians of all time (among others).
I thought the heuristic was "if I think I passed the hotel, I was going too fast to notice. I better slow down so I see it when I come up on it, or so I might recognize a landmark/road that indicates I went too far." We slow down not because we are splitting the difference between turning around and continuing on. We slow down to make it easier to gather more information, a perfectly rational response.
Sure, not 100% unique to academia, there are also industrial research environments.
My PhD was in physics, and there were lots of examples: weird tricks for aligning optics benches, semi-classical models that gave good order-of-magnitude estimates despite a lack of rigour, which estimates from the literature were trustworthy (and which were garbage). Biophysics and materials science labs had all sorts of rituals around sample and culture growth and preparation. Many were voodoo, but there were good reasons for a lot of them as well.
Even tricks for using equipment: such and such piece of equipment might need really good impedance matching at one connection, but you could get by being sloppy on other connections for reasons A, B, and C, etc.
A friend of mine in math was stuck trying to prove a lemma for several months when famous professor Y suggested to him that famous professor Z had probably proven it but never bothered to publish.
In STEM fields, there is a great deal of necessary knowledge that simply is not in journals or articles, and is carried forward as institutional knowledge passed around among grad students and professors.
Maybe someday someone clever will figure out how to disseminate that knowledge, but it simply isn't there yet.
No, the important older theories led to better theories.
Newton's gravitational physics made correct predictions of limited precision, and Newton's laws led to the development of Navier-Stokes, kinetic theories of gases, etc. Even phlogiston led to the discovery of oxygen and the modern understanding of oxidation. You don't have to be 100% right to make useful predictions.
Vitalism, on the other hand, like astrology, didn't lead anywhere useful.
But quantum theory also makes correct predictions, and mainstream physics does not en masse advocate quackery. Vitalism never worked, and it led the entire medical community to advocate actively harmful quackery for much of the 19th century.
No, vitalism wasn't just a dead end, it was a wrong alley that too many people spent time wandering down. Vital theories were responsible for a lot of the quack ideas of medical history.
I don't think that is true. There is a huge contingent of evangelicals (last I checked, a bit under half of Americans believe in creationism), and it only takes a few non-creationist but religious Christians to get to a majority.