(Originally posted at Secretum Secretorum)
Consider the following regions in the landscape of possible minds:
(1) All possible human minds
(2) All human minds that have ever existed
(3) All human minds that currently exist
(4) All currently existing human minds that participate in science
It should be clear that (1) is a minuscule patch of dirt in the vast mindscape, as (2) is of (1), and so forth. What scientifically productive regions of the mindscape (i.e. minds that could usefully contribute to scientific research) are found in (1), (2), or (3) but not in (4)? What factors are limiting the diversity of scientific minds?
In our current technological environment, nothing has influenced our psychology more deeply than computers, the internet, and related digital technologies. The internet is usually seen as an unequivocal good for the advancement of science, but I suspect that we do not fully appreciate the psychological costs, which may be small but compounding over the long run. I can think of at least three reasons why the internet might have negative effects on creativity and diversity in scientific thinking:
(1) There is too much knowledge available at one’s fingertips. This may have a subtle inhibitory effect on creativity, in essence making it harder for us to ever come at a problem with “fresh eyes”. Perhaps we are collectively spending too much time in the library of the internet, like the very, very smart guy that Richard Hamming (mathematician and former member of the Manhattan Project and Bell Labs) discussed in a 1986 colloquium.
Question: “How much effort should go into library work?”
Hamming: “It depends upon the field. I will say this about it. There was a fellow at Bell Labs, a very, very, smart guy. He was always in the library; he read everything. If you wanted references, you went to him and he gave you all kinds of references. But in the middle of forming these theories, I formed a proposition: there would be no effect named after him in the long run. He is now retired from Bell Labs and is an Adjunct Professor. He was very valuable; I’m not questioning that. He wrote some very good Physical Review articles; but there’s no effect named after him because he read too much. If you read all the time what other people have done you will think the way they thought. If you want to think new thoughts that are different, then do what a lot of creative people do - get the problem reasonably clear and then refuse to look at any answers until you’ve thought the problem through carefully how you would do it, how you could slightly change the problem to be the correct one.”
(2) It is too easy to know if your ideas are considered fringe and unusual, or have already been studied and “proved”. This makes people less likely to do the kind of thinking that can overturn conventional wisdom or show something to be true which was highly unlikely to be so, resulting in a sort of global chilling effect on intellectual risk-taking.
(3) The internet has also had the effect of homogenizing cultures across the world. German, Russian, American, and Chinese cultures are much more similar now than they were 50 years ago (and were much more similar 50 years ago than they were 100 years ago); accordingly, German, Russian, American, and Chinese science (in terms of their organization, goals, norms, values, etc.) are much more similar as well. The internet has also likely played a similar role in reducing the political diversity of modern academia (see “Political diversity will improve social psychological science”, 2015).
I wonder if the overall effect of the internet has been to create a cognitive environment which is better at producing minds that do “normal science” (working towards known and fixed goals within a given paradigm, “puzzle-solving” as Thomas Kuhn called it) and worse at producing the kinds of minds that do revolutionary, paradigm-shifting science.
Modern science education is highly homogeneous. Of course there is some variation within and between countries, but the vast majority of students are funneled through educational systems in which the same subjects are taught in the same sequences and students take the same assessments (AP, IB, SAT, etc.) with the same goal of getting into a top university. By and large, becoming a scientist means doing well in school; this creates a significant channeling of psychological diversity - most future scientists spend significant amounts of time in school environments, having the experiences typical of those environments. To the degree that doing well in school (which requires intelligence and creativity, but also social skills, conscientiousness, agreeableness, conformity, and a strong work ethic) and being a good scientist require different psychologies, we are not crafting our mental diversity “portfolio” in an optimal way for the advancement of science (I of course recognize that there are many other educational goals and that tradeoffs must be made between them).
In the U.S., nearly all of our future scientists take biology as freshmen, chemistry as sophomores, and physics as juniors in high school (the reverse order also exists but is less common). This narrow set of educational trajectories through science amounts to another significant channeling of mental diversity. We would do well to create educational systems that allow students to develop more idiosyncratic sets of knowledge and skills (why must we learn the natural sciences before psychology and the social sciences?).
The homogenization and systematization of scientific training goes hand-in-hand with the professionalization of modern science. Virtually all scientists conduct their work with the mindset of a professional; the necessity of obtaining a position, pursuing funding, and publishing research implicitly, and often explicitly, sets the direction and mode of thinking. Kyle Stanford discusses at length how the professionalization of science has restricted creativity and diversity of thought compared to the pre-modern era of independent amateur scientists. From “Unconceived alternatives and conservatism in science: the impact of professionalization, peer-review, and Big Science”:
“Most critically, where gentlemanly specialists had been largely free to conduct their research in whatever way and on whatever subjects they liked, this emerging class of scientific professionals depended for their livelihoods on the estimation of the achievements and further promise of their research by their professional colleagues, especially following the incorporation of science into the changing academic curriculum of the nascent modern research university. Thus, after the middle decades of the 1800’s, scientists could no longer afford to be indifferent to those colleagues’ collective assessment of the interest and importance of their own scientific research because that research was how they made a living. There is surely much to celebrate in the emergence of such professional communities and many ways in which these developments presumably improved the quality of the resulting scientific work. But such a community of scientific professionals is also, almost by definition, far more homogeneous in its thinking, in its assumptions, in its motives, and in the dimensions of its creative freedom, than a community made up largely of gentlemanly amateurs supported by independent wealth, aristocratic patronage, and the like…”
“Early modern gentleman scholars engaged in science in order to cultivate their own intellects, to impress other members of the gentlemanly class, and perhaps most importantly to establish reputations by the originality of their scientific contributions. With the advent of scientific professionalization, it would seem that scientists themselves became substantially less free to simply satisfy their own curiosities on their own terms, to ride idiosyncratic hobbyhorses, to grind ideological axes, and to pursue lines of research and/or theoretical suggestions that their colleagues might regard as fundamentally misconceived, unpromising, or uninteresting.”
Science may benefit from becoming more amateurish, whether by modifying incentives and relaxing pressures on professionals, or by finding ways to involve true amateurs in research in more substantial ways.
The vast majority of modern scientists do not believe in God or gods, an afterlife, spirits, ghosts, or anything regarded as supernatural or paranormal (belief in these phenomena seems to be <10% among scientists). This represents an unusual situation in that (1) most people alive today, (2) most people throughout history, and (3) most scientists throughout history believed in one or more of these phenomena (see “Brilliant Scientists Are Open-Minded about Paranormal Stuff, So Why Not You?” for some famous examples).
Certain psychological attributes make one more likely to exhibit religious and magical beliefs; at the same time, these beliefs can exert significant influence on cognition and perception (modifying one’s interests, values, attitudes, goals, etc.) - see the work of Ara Norenzayan on the psychology of religion; his paper “Theodiversity” is a good starting point. We can think of religio-magical belief and behavior as a kind of psychological dimension, one on which only a minority of scientists score highly. It is an open question whether increasing this form of psychological diversity would benefit or harm science; however, one possibility that should be considered is that scientists high in this religio-magical dimension may possess a cognitive lens (i.e. a worldview) that allows them to consider a wider range of the hypothesis space, thereby allowing for a certain kind of creativity and openness to possibility that is difficult for more hard-headed secular scientists to muster.
The life and work of Isaac Newton provides an instructive example (from Paul Graham’s essay “The Risk of Discovery”).
“In Newton's day the three problems - physics, alchemy, and theology - seemed roughly equally promising. No one knew yet what the payoff would be for inventing what we now call physics; if they had, more people would have been working on it. And alchemy and theology were still then in the category Marc Andreessen would describe as "huge, if true.”
Newton made three bets. One of them worked. But they were all risky.”
Consider Newton’s worldview: nothing was off the table; there was a mystery and magic to the world. Gods, spirits, and forces (like gravity) all seemed equally improbable (or probable). One way to characterize this form of psychological diversity may be as “enchantedness”. In the early 20th century, Max Weber described the modern world as disenchanted - secularized and rationalized, the paranormal and supernatural no longer admitted to reality, or at least no longer the default view.
“In Western society, according to Weber, scientific understanding is more highly valued than belief, and processes are oriented toward rational goals, as opposed to traditional society in which ‘the world remains a great enchanted garden’.”
Newton, by virtue of his era and unique intellect, may have possessed a particularly enchanted mind and was able to develop his theories in part because of this quality. John Maynard Keynes, a noted Newton admirer who acquired many of his personal notes, shared a similar view of the man:
“In the 18th century and since, Newton came to be thought of as the first and greatest of the modern age of scientists, a rationalist, one who taught us to think along the lines of cold and untinctured reason. I do not see him in this light. I do not think anyone who has pored over the contents of the box he packed up when he finally left Cambridge in 1696 and which, though partly dispersed, have come down to us, can see him like that. Newton was not the first of the age of reason. He was the last of the magicians, the last of the Babylonians and the Sumerians, the last great mind who looked out at the intellectual and visible world with the same eyes as those who began to build our intellectual inheritance rather less than 10,000 years ago.”
Maybe there is some way in which we can cultivate and encourage “enchantedness” in the next generation of scientists, perhaps through radical educational schemes aimed at developing this sense of the world as a “great enchanted garden”.
Many of the greatest scientific ideas and discoveries have required considerable leaps of faith - it may be that scientists high in faithfulness are more likely to take certain intellectual risks than their more skeptical and epistemically conservative counterparts. Barry Marshall and Robin Warren’s discovery that H. pylori plays a major role in peptic ulcers and stomach cancer provides another instructive example.
“In 1982, they performed the initial culture of H. pylori and developed their hypothesis related to the bacterial cause of peptic ulcer and gastric cancer. It has been claimed that the H. pylori theory was ridiculed by established scientists and doctors, who did not believe that any bacteria could live in the acidic environment of the stomach. Marshall was quoted as saying in 1998 that "everyone was against me, but I knew I was right."
“After failed attempts to infect piglets in 1984, Marshall, after having a baseline endoscopy done, drank a broth containing cultured H. pylori, expecting to develop, perhaps years later, an ulcer. He was surprised when, only three days later, he developed vague nausea and halitosis (due to the achlorhydria, there was no acid to kill bacteria in the stomach, and their waste products manifested as bad breath), noticed only by his mother. On days 5–8, he developed achlorhydric (no acid) vomiting. On day eight, he had a repeat endoscopy, which showed massive inflammation (gastritis), and a biopsy from which H. pylori was cultured, showing it had colonised his stomach.”
I don’t know if Barry Marshall was a religious person or not, but it wouldn’t surprise me if he was (or at least had some of the psychological characteristics that sometimes manifest as religious behavior/belief).
Drugs are another tool for exploring the mindscape and increasing psychological diversity. Slime Mold Time Mold reviews the history of drug use by scientists in the essay “Higher than the Shoulders of Giants; Or, a Scientist’s History of Drugs” and proposes a bold hypothesis: the so-called Great Stagnation - the slowdown in economic, scientific, and technological innovation starting roughly in the 1970s - was caused by the 1970 Controlled Substances Act in the U.S., which made psychedelics and amphetamines highly illegal, launched the “war on drugs”, and prompted many other countries to pass stricter drug legislation as well.
“We’ve heard a lot of moral and social arguments for legalizing drugs. Where are the scientific and economic arguments? Drugs are linked with great scientific productivity.”
“So the foundational technologies driving innovation can be either literal technologies, new techniques and discoveries, or even perspectives like “innovation.” When we cut off the supply and discovery of new drugs, it’s like outlawing the electric motor or the idea of a randomized controlled trial. Without drugs, modern people have stopped making scientific and economic progress. It’s not a dead stop, more like an awful crawl. You can get partway there by mixing red bull, alcohol, and sleep deprivation, but that only gets you so far.”
“Is the Controlled Substances Act really responsible for the general decline since 1970? We’re not sure, but what is clear is that drugs are foundational technologies, like the motor, combustion engine, semiconductor, or the concept of an experiment. New drugs lead to scientific revolutions. Some of those drugs, like coffee, continue to fuel fields like mathematics and computer science, even some hundreds of years later. With apologies to Newton, “If I seem higher than other men, it is because I am standing on the shoulders of giants.”
In 2013, philosopher Thomas Metzinger noted an internet-fueled proliferation in the number of new psychoactive compounds and suggested that this was just the beginning of what is to come. Balancing the potential risks and benefits of this explosion of new drugs while also respecting individual liberty is a major challenge that will have significant consequences for nearly all aspects of society in the coming decades, including science.
“Technology is neither good nor bad; nor is it neutral.” – Kranzberg’s First Law
One theme of this article has been the effect of technology – things conventionally thought of as technology, like the internet and computers, but also cultural technologies like educational systems, and neurotechnologies like drugs – on the psychological diversity of our species. Looking to the future, there is one inevitability – technology will continue to modify the human psyche, and powerfully so – and two possibilities – technology will act to enhance psychological diversity and move us towards better areas of the mental landscape, or we may (intentionally or unwittingly) create a mental monoculture and/or cognitively cripple ourselves in some irreversible manner. As we develop more advanced technologies for modifying our psychology and technology itself becomes more mind-like, the task of creating a healthy cognitive ecosystem will only become more difficult and more essential. We would be wise to remember that it is the incredible diversity of humanity that gives us the resilience, creativity, and adaptability that have served us so well in the past, and will no doubt do the same in the future.
“And afterward Moses and Aaron went in, and told Pharaoh, Thus saith the Lord God of Israel, Let my people go, that they may hold a feast unto me in the wilderness”
See Kevin Kelly’s articles “The Taxonomy of Minds” and “The Landscape of Possible Intelligences”.
Where is the boundary between human minds and non-human minds?
It is beyond the scope of this article to provide a full analysis of each limiting factor; I leave it up to the reader to decide whether psychological variation is constrained in a harmful or beneficial manner (of course, not all psychological diversity is good for science, and there are broader societal costs to increased levels of diversity) and whether there are any desirable solutions to each limiting factor.
This trend towards a global monoculture may also have had the effect of minimizing the cross-pollination between distant cultures that often leads to bursts of innovation. Gwern writes about how golden ages of artistic creativity tend to happen in the sweet spot between too little cultural exchange and too much. Perhaps we have now entered an era of history in which we simply know too much about each other; nothing is so exotic as to produce a certain kind of unique creative inspiration.
Eliezer Yudkowsky and Audrey Tang come to mind as examples of innovative thinkers who did not attend conventional schools.
There are many educational trajectories one could imagine that might give a young scientist a unique perspective and set of skills. Could we teach the history and philosophy of science in high school for several years before teaching any current scientific theory? Could we have a specialized high school program that spends junior and senior year teaching only computer science, data science, and neuroscience? Some curriculum sequences might be strange and impractical, but there still could be students for whom learning science in that sequence would produce a truly unique mind. In general, I think we are probably too afraid of specializing at young ages. It seems to me that most of the time our general pedagogical philosophy is to build a broad base of knowledge and then narrow our focus; however, this may be suboptimal for certain students.
One could also think of gods, spirits, and spiritual forces (e.g. karma) as “belief technologies” that allow us to achieve individual and collective goals that we would not otherwise be able to achieve - e.g. belief in a moralizing god prevents individuals from breaking norms and reduces the societal costs of enforcement (see Joseph Henrich’s work on the “big gods” hypothesis).
Grappling with advances in medicine/neurotech/AI may require us to adopt a technological philosophy that emphasizes the augmentation and supplementation of our mental abilities rather than their replacement. David Krakauer provides a useful distinction when he contrasts competitive vs. complementary cognitive artifacts. As an example, he compares GPS and the calculator to maps and the abacus; after using the former, we are left worse off than we were before - our calculative and navigational abilities atrophied from underuse - whereas the use of maps and the abacus enhances our reasoning abilities in various domains (mathematical, topological, geometric). Krakauer warns that we would be wise to remain aware of the complexity and interconnectedness of the mind - gains or losses in one domain may have unintended consequences for another.
“So if I give you a fork or chopsticks or a knife, it’s true that you’re better able to manipulate and eat your food, but you also develop dexterity, and that dexterity can be generalized to new instances…. And for me, the concern is also the indirect, diffusive impact of eliminating a complementary cognitive artifact, like a map, on other characteristics we have.”
“It’s been known for a long time that if you become competent at the abacus, you’re not just competent at arithmetic. It actually has really interesting indirect effects on linguistic competence and geometric reasoning. It doesn’t have a firewall around it such that its functional advantages are confined to arithmetic. And in fact, I think that’s generally true for all interesting complementary cognitive artifacts.”
James Evans provides another valuable perspective when he recommends that we reimagine Computational Social Science as Social Computing. Social Computing recognizes, “societies as emergent computers of more or less collective intelligence, innovation and flourishing” and imagines “a socially inspired computer science that enables us to build machines not merely to substitute for human cognition, but radically complement it” leading to “a vision of social computing as an extreme form of human computer interaction, whereby machines and persons recursively combine to augment one another in generating collective intelligence, enhanced knowledge, and other social goods unattainable without each other.” He also advises re-conceptualizing Artificial Intelligence as Alien Intelligence - our goal should be to create intelligences “not most but least like humans and human groups in order to achieve cognitive diversity for social computing—to help human collaborators think differently, bigger and better.”
I quite like this post. I don't agree or expect to end up agreeing with most ideas you present, but it's definitely proposing exciting alternatives to some of the ways I'm used to thinking about science and creativity, and those alternatives are already stimulating!
(I'm going to make one comment thread per topic/part, because I'm trying something and that seems better for making conversation doable)
Glad you liked it! I certainly think there is a lot of room for disagreement; I'll respond to a few of your comments.
Slightly tangential: to expand on Hamming's point, any problem handed to you is almost certainly formulated wrong. Why can you be confident in that? If it were formulated right, it would already be solved and would not be coming into your awareness as a problem. This is helpful for scientific problems, but also for personal problems. Traversing the same representation of your problem for the nth time isn't going to do much other than agitate you. This is part of why new self-help techniques work for a time and then stop working: the problems you had that were amenable to those representations are now solved.
It's so hard for me (and I expect for many around here) to read this section because it promotes at almost every turn a confusion that is infuriating when discussing people debating science, knowledge, and the concept of truth. I'm not saying that you necessarily share these confusions, but I feel like there's a better way of making your point without pushing most readers to discard you as someone saying ridiculous stuff.
What are these confusions?
I just don't really see it as that problematic if a small percentage of scientists spend their time thinking about and working on the paranormal/supernatural because (1) scientists throughout history did this and we still made progress. Maybe it wasn't necessary that Newton believed in alchemy/theology, but he did, and belief in these things is certainly compatible with making huge leaps in knowledge like he did. (2) I'm not sure that believing in the possibility of ghosts is more ridiculous than the idea that space and time are the same thing and can be warped (I'm not a physicist :). UFOs would probably have been lumped into these categories as well, and now we know that there are credible reports of anomalous phenomena. Whether they are aliens or not, who knows, but it is possible that studying them could lead to an understanding of new phenomena (I think it has already led us to understand new rare forms of lightning, but I'm forgetting the specifics).
Look, I don't really believe in these things and I don't behave as if I did, but I am open to the possibility. The main argument here is that being open to the possibility, having a sense of mystery and epistemic humility, does make a difference in how we think and do science. This goes back to the discussion of paradigm-shifting science vs. normal science: if absolutely no one believes that a paradigm shift is possible, then it will never happen. I'm of the opinion that it's important for us to maintain a kernel of doubt in the hard-headed materialist atheist perspective. In truth, I think we are pretty closely aligned and I am just playing devil's advocate :)
Re: normal science
That theory doesn't seem to hold as well as Kuhn's, which is that the paradigmatisation of science creates the possibility of normal science, which, being eminently scalable, makes most scientists normal scientists. Or put differently, if normal science is a possibility, we should expect almost all scientists to almost always conduct normal science.
Hmm yea I see your point. I guess what I was saying is that there are certain thought patterns and styles of cognition which may be more likely to stumble on the kind of ideas or do the kind of work that can potentially lead to paradigm shifts. Whether or not we are less able to think in this way now is definitely an open question but I think one we should worry about.
Agreed that it matters a lot to have people working on new paradigms. I guess the reason I'm absolutely not worried about lacking people like that in today's scientific climate is that I don't expect scientific education can get that out of someone. From my experience, there's a small category of science students who care a lot about asking weird questions and questioning everything, and they almost never end up doing normal science.
Re: the cost of internet
Another way of coming at this intuition appears when newcomers enter a preparadigmatic field.
I find more and more that people (especially me) sin more by not searching for knowledge than by overdoing it. Or more explicitly, your point about preprocessed knowledge doesn't really apply to data: having a lot of data is incredibly useful to see a problem with "fresh eyes". The issue of course is that so much data comes with a perspective attached, which makes creative perspective harder (but not impossible).
First reaction: that sounds like a good thing more than a bad thing. And actually, I would argue for the exact opposite effect: fringe and unusual topics are way more popular thanks to internet and the ability to converse with other people caring about them, even if you're only a handful spread over the world. One just has to look at the explosion of conspiracy theories.
This point sounds true, and that's definitely not something I usually consider. But given the title of your referenced paper, would that also be a problem for pure sciences or maths? I don't see how political diversity could matter there, though I'd be really interested in good counterarguments.
I actually would disagree with your last point. Certainly cultural/political diversity will matter more for psych/social sciences but I think it will have an effect on what kinds of topics people care about in the first place when it comes to harder sciences and math. I can imagine a culture which has a more philosophical bent to it leading to more people doing theoretical work and a culture which has a greater emphasis on engineering and practicality doing more applied work. I could also imagine a more authoritarian culture leading to people doing physics in a certain style - perhaps more of a search for unifying "theory of everything" type ideas vs. a more democratic and diverse culture leading to a more pluralistic view of the universe. Not saying these would be huge effects necessarily but on the margins it could make a difference.
So your point is something like "political inclinations and culture in general are systemic biases in the search algorithms of researchers, even in pure science"?
That's an interesting take; I just don't know how to go about checking it. Certainly, we see many examples of both theoretical and applied work in many sciences, showing that in this regard the diversity is enough.
About the unifying theory of physics, I'm not that sure about the link with authoritarian culture. But once again, in actual science, there are so many viewpoints and theories and approaches that it would take days to list them for only the smallest subfield of physics. So I'm not convinced that we are lacking diversity in this regard.
Certainly the authoritarian link is highly speculative, but I think in general we underestimate how politics/culture/psychology influence what we care about and how we think in science. A more extreme version of the question is: how similar would we expect alien science to be to ours? Obviously it would be different if they were much more advanced, but assuming equal levels of progress, how would their very different minds (who knows how different) and culture lead them to think about science differently? In an extreme version, maybe they don't even see and instead use something like echolocation - how would this influence their scientific investigation?
"Certainly, we see many examples of both theoretical and applied work in many sciences, showing that in this regard the diversity is enough.
About the unifying theory of physics, I'm not that sure about the link with authoritarian culture. But once again, in actual science, there are so many viewpoints and theories and approaches that it would take days to list them for only the smallest subfield of physics. So I'm not convinced that we are lacking diversity in this regard."
I don't see how you can reach this conclusion; we don't know what the counterfactual is. Obviously there is a lot of diversity of theories/approaches, but that doesn't mean that we wouldn't have different theories/approaches if science had been born in a different cultural background.
Again, I think these are all open questions, but I think it is reasonable to conclude that it might make a difference on the margins. Really we are asking: how contingent is scientific progress? The answer might be "not very much", but over the long run of history it may add up.
Re: possible minds
That 1) is but a minuscule fraction of mindspace sounds reasonable, if we define mindspace to be broader than human minds (which should be possible IMO). But that doesn't seem true of the inclusion of 3) in 2), for example. Based on the estimates of 2) I found on the Wikipedia page (from 100 to 110 billion), that makes 3) a fraction of between 6 and 7% of 2). Not the majority, but hardly a minuscule fraction. The statement does seem valid for 4) relative to 3) (numbers from this UNESCO report give something like 0.1%).
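For what it's worth, the arithmetic here is easy to sanity-check with round figures; a quick sketch (the specific population and researcher counts below are my own assumptions for illustration, not numbers taken from the thread):

```python
# Back-of-the-envelope check of the fractions discussed above.
# Assumed round figures (illustrative, not from the source):
#   ~7e9 people alive, 100-110e9 humans ever (the Wikipedia range),
#   ~8.8e6 researchers worldwide (a UNESCO-style estimate).
alive = 7.0e9
ever_low, ever_high = 100e9, 110e9
researchers = 8.8e6

share_3_of_2 = (alive / ever_high, alive / ever_low)  # ~6.4% to ~7.0%
share_4_of_3 = researchers / alive                    # ~0.13%

print(f"(3) as a share of (2): {share_3_of_2[0]:.1%} to {share_3_of_2[1]:.1%}")
print(f"(4) as a share of (3): {share_4_of_3:.2%}")
```

So both claims hold under these assumptions: currently living humans are a single-digit percentage of all humans ever, and working scientists are roughly a thousandth of the living.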
My initial intuition said that you were confused here, as only people in 3) can participate in 4). But I think you want to point towards trends and thought patterns of potentially dead humans that might be valuable to incorporate into the scientific landscape and education.
That being said, I feel your point is more about the uniformity of 4) than its size, isn't it? After all, there would be diminishing returns if half the population did science, and I don't think that would be an especially good idea.
I can't shake off the impression that you're missing an intermediate category between 3) and 4): the human minds that could be excited about exploring and thinking about the world. Only if this category is significantly more heterogeneous than 4) would there be a possible problem along the lines you describe, right?
One point of confusion that I think is running through your comments (and this is my fault for not being clear enough) is how I am conceiving of "mind". In my conception, a mind is the genetics and all of the environment/past experiences, but also the current context of the mind. So, for example, in one sense you would have the same mind whether you were doing science in a university or as an independent scientist, but in another sense not, because the thoughts you are willing and able to think would differ under those very different constraints/incentives. Hope this helps.
Hum, okay. But thinking about the equivalence classes of such minds would be more relevant, no? Like, if two different combinations lead to basically the same ideas, we would want to merge them. Then the crux of this debate would be whether almost all modern scientists are in the same equivalence class, and whether science could benefit from the inclusion of more equivalence classes.
If this hypothetical scientist were able to actually get a job in a university, I would expect next to no difference between the two. First because it's still a job, but also because science is not a random personal exploration; it's a shared endeavor. And so you care about communities or specific people finding your work interesting and/or important. That's the most relevant incentive IMO, and I don't see how it changes between these two settings.
Re: professionalization of science
I'm confused by this argument, because a scientific community composed only of rich white guys from European countries sounds incredibly more homogeneous and uniform than what we have now. Maybe you combine this with the uniformity of science training around the world, but that seems to enormously overestimate the uniformity of actual scientists, who research weird and varied topics - far more, actually, than any gentleman-scientist did AFAIK.
It depends on what diversity you are measuring: if you are measuring diversity as variability in skin colour, there has definitely been an increase in diversity. But if you are measuring the diversity of how scientists approach their work, I would certainly agree with the paragraph of the OP.
Yeah, that's the idea. I'm not saying that the scientific community in the past was better, but there were some ways in which it allowed for more diversity of thought than our current system. All else being equal (which it never is), a scientific community that is 100% people working at modern universities and competing for the same jobs/journals is worse than a community that has some niches where people can work with very different motivations and approaches.
Good idea to point out this possible confusion. I brought it on myself by saying "white". :p
Yet I wonder if there isn't also a confusion in considering the past community more diverse. My point was that at least between the 1600s and the 1800s, many aspects of the scientific community were pushing heavily for similarity of thought.
None of this tells us that scientific diversity was less then than now, but in my mind it makes the proposal that it was far better much less obvious.
That statement makes a lot of sense. But I feel there is a big confusion if you think this is actually how science happens. There are many journals and conferences for many different niches. And almost all countries (especially the US, which you take as the measure of everything else) have a tenure mechanism that allows researchers to do literally whatever they want. In France we even have tenure by default. (This doesn't completely deal with the need to get money and funding to do cool things - going to conferences or hiring PhD students - but it greatly lessens the urgency.)
"So little actual knowledge that almost everyone was a 'Renaissance man' (and so they literally all shared the same sources)"
Interesting thought - now that everyone has to specialize, there are fewer people who have different combinations of knowledge within a given discipline. As I talked about with education, I think it's worth thinking more about how our education systems homogenize our collective portfolio of minds.
Re: tenure - it's a good point, and certainly we do have some diversity of scientific niches. It's an open question whether we have enough or not; my point, more than anything, is just to note that this form of diversity also matters.
Radical proposal: we need scientific monasteries, isolated from the world, with celibate science monks dedicated to growing knowledge above all else :)
So we need Anathem? ^^