Preface

I want a PhD before I die.

Why? Same reason(s) other people might. Something something passion.

Cynical reasons aside-- there's nothing else useful to do with the major; it's a bureaucratic checkmark; I want tenure and to coast for the rest of my life-- people want to have an impact with their work and contribute to the world, and a PhD is what many view as an avenue to do that.

And thus, Feibelman knocks these starry-eyed graduates down a peg-- It's Not Enough. "Enough" to do what? To Do Great Things, Peter J. Feibelman answers.

Well, I paraphrased. Here's his direct answer (emphasis mine)--

"This book is meant for those who will not be lucky  
enough to find a mentor early, for those who naively  
suppose that getting through graduate school, doing a  
postdoc, etc., are enough to guarantee a scientific ca-  
reer. I want you to see what stands between you and a  
career, to help you prepare for the inevitable obstacles  
before they overwhelm you." -- Preface p. xiii

Or, as I interpret it, you won't change the world with just a PhD. Hence the book's title.

My immediate questions as I read this were the following-- who are you, and why should I keep reading this book?

Peter J. Feibelman is a rather renowned (retired?) physicist in the materials science world. Here is his Google Scholar profile, and you can immediately tell he's made significant inroads into... honestly, I'm not entirely sure. Graphene research? I'm not a materials science expert. He has also been honored by the American Institute of Physics.

The takeaway here is that Feibelman is a very accomplished, intelligent researcher who has spent decades uplifting the materials science world from his corner at Sandia National Laboratories.

But he's no von Neumann. No Terence Tao, or Gauss. He is not a generational genius who very likely would have succeeded regardless of academia and makes discoveries while placing gold crowns in bathtubs.

But, just because you haven't heard of him, does that mean he didn't do some Great Things (TM)? Do you think he was some no-name researcher who didn't advance the state of the world and push our understanding of X thing? 

If you do, then this review will be pointless for you, because the biggest appeal of the book is its approachability and realism.

We stand on the shoulders of giants, but you don't have to be an actual giant; you can lay yourself at the altar of science as a humble stepping stone. My goal in getting a PhD is not to become some superstar public intellectual famous for being smart. It's to push our understanding of some crazy niche field and push it further beyond. The widespread recognition is nice, I'm sure, but I don't think most PhD graduates' goals are this.

And that's what appealed to me about this book. He's not an Einstein. And neither am I. But you don't have to be one to do Great Things (TM).

Which leads into the first point--

I. Being Smart Isn't Enough

You certainly have to be smart to get a PhD, or if you're more cynical, you have to be a certain type of individual to jump through the hoops and filters to get the credential.

In 2019, there were about 400,000 doctoral students and doctorate holders in science and engineering disciplines in the US (sorry, other disciplines! I don't have data for you, but given that this is a science-directed advice book, tough luck).

Let's just assume every student gets a PhD (ha!!) and round up generously to make life easy, so we can say that 1% of the population has a PhD. Thus, any PhD holder is in the 99th percentile of "talent", whatever their field.

But 99th is not good enough for your goals (or 95th, a la Dan Luu). Could you get by on raw talent alone at the 99.99th percentile? Sure. Just be Euler. Otherwise, if you want to change the world, you need more than just raw skill.
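
(As a back-of-the-envelope sanity check-- my own arithmetic, not Feibelman's, assuming a US population of roughly 330 million--)

\[
\frac{4 \times 10^{5}}{3.3 \times 10^{8}} \approx 0.12\%,
\]

so the "1%" above is already a generous rounding across all disciplines. Even so, a PhD only puts you around the 99th percentile, while "world-changing" at the 99.99th percentile means roughly 1 person in 10,000.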

He starts with himself. He, like many others, had the notion that his raw technical prowess would secure him the adoration of any future employer. He could code, he was first author in lots of places, and he had seven publications in seven different sub-fields, showing massive breadth. Every senior scientist he'd worked with would say that he got things done damn well. What's not to love?

People were so impressed with his work that they asked Feibelman to give a seminar on it. "Finally, my chance!" he thought. Four years into his PhD, this well-attended seminar was a chance to get stellar recommendations and start job hunting. And since one of those jobs was, ideally, at the same university, the seminar was even more... ah, seminal, to his career.

Have you ever seen those presentations that paste equation after equation and immediately assume you too know partial differential equations? That you should recognize how important it was that he used this obscure numerical method in just 12 easy simple fast steps? It was simple, really, if we first observe that for wavelengths of visible light, as particle sizes approach this wavelength with a Monte Carlo distribution of particles, the resulting dipole scattering will decouple, resulting in incoherence, and the residual intensity is actually the summed square of the inverse fourth power of said wavelength and the sixth power of the particle size and-- HEY, EXCUSE ME, why are you guys looking at your phones? Oh, it's been 30 minutes already and my seminar is over?
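
(For the curious-- this is my aside, not Feibelman's-- the scaling being mangled in that run-on is, in the small-particle Rayleigh limit, roughly)

\[
I_{\text{scattered}} \;\propto\; \frac{d^{6}}{\lambda^{4}},
\]

i.e., intensity grows as the sixth power of the particle size and falls off as the fourth power of the wavelength-- the standard "why the sky is blue" result.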

As Feibelman was preparing his slides, his professor evidently saw the terrible presentation taking shape--

the professor supervising my research, C., [...] expressed surprise at how poorly I had prepared my talk (though I don’t think he was surprised at all), how little grasp I seemed to have of the reasons that the problems we had worked out were meaningful, and consequently how uninterestingly I was going to present them to my audience.

What follows is a collection of tragedies that he has personally seen befall fiercely intelligent, technically impressive individuals. I will summarize some that I find especially applicable, but it's all in the first chapter if you wish to read the stories in their entirety.

(In Feibelman's recounting, each person is a single letter, e.g., T, L, N. I found this jarring, so I'm giving them random names instead. Any potential de-anonymization is coincidental.)

==============

Tom accepted a post-doc at a top-tier government lab under a leading scientist, having received great reviews for being an amazing sounding board and for an impressive thesis defense. His first project was a computational research task that first involved arriving at a numerically practical mathematical formulation of a problem, and then required a considerable computer programming effort.

Tom sank his life into it. Under immense pressure to impress his lead, since he really needed a permanent job, he'd spend 12 hours a day getting work done, a lifestyle drastically different from his time in grad school. But he needed the challenge. And the project needed 18 months.

Finally, he got some preliminary results. They were promising, but not enough to tell a full "story" supporting the lead's research. Furthermore, because he spent 12 hours a day sitting in a corner being a code monkey, few other scientists really knew him, and the only strong recommendation he'd have would be from his lead/advisor. Thus, he couldn't stay at the same lab.

Externally, his job offers were similarly lukewarm. While he pushed that he could have more substantial results in 6-12 months, other job candidates had had 2-3 projects done in the same timespan as Tom, and all he could muster up was a "maybe perhaps possibly"?

One might blame these labs for not being "ambitious" enough-- Tom's a genius!-- but there were several open questions in employers' minds. Could he handle "real world" science? Did his advisor, being incredibly recognizable in the community, do most of the thinking for Tom? Could he do things independently, or would he just be a technician, nothing else? What did Tom want for himself?

==============

Bob had a PhD in physics from a top-tier Midwestern school, having worked under two advisors. One was a Nobel prize winner you've probably heard of, the other a massively respected experimentalist. As a result, Bob had two really amazing offers at a major laboratory-- permanent jobs, too! And in this economy (~2010)? Wow!

One was to begin work under a senior staff scientist currently running a major experiment, whose pitch was that Bob could gain expertise in the practical side of the methods before doing independent research-- starting a lab, gathering equipment himself, etc.

Lest you think this sounds like Tom's story-- Bob's job here is already permanent. Bob would, effectively, be a postdoc with the security of a full-timer.

The other offer was to skip all that nonsense and just start independently straight up, where he would be nearly equivalent in role to said senior staff. 

He ended up accepting the latter and becoming independent directly out of school. He had, after all, worked under the aforementioned experimentalist-- who needs the expertise from the senior staff scientist's offer? Furthermore, all of his friends convinced him that, from a manager's perspective, come promotion or annual review, being an "assistant" would make him look too dependent, not driven, etc.

So he started his own program right out of school. He spent three years buying equipment, setting it up, etc., with little real research direction, before his employer put their foot down and, despite his impressive background, moved him elsewhere.

Feibelman recounts that, while Bob is doing well now, it took several extra years and a stressful divorce (!!) to get back on track.

==============

Larry had spent two years as a post-doc in a prestigious lab. He was originally hired to build a piece of equipment combining technology from his new area with that of his thesis work. He did it and got his name on some papers, but his employer recognized that Larry did not actually learn much beyond the technical details of the equipment-- how can this be applied further? What problems does this address? Why was it built in the first place?

Larry did get another offer for a full-time position where he was, again, asked to build an instrument. But he never fully integrated into the team. At seminars or planning committees, he did not contribute. Lastly, when asked about his future research plans, he could say no more than that he planned to look around for “interesting” problems. He was transitioned out of the research group, and Feibelman doesn't know what happened after that.

Feibelman speculates that, while it may be true that Larry's personality is just not that of a researcher, he deeply feels a mentor could have realized this-- and either drawn out Larry's own research direction, or helped him transition to a position actually suitable for him.

==============

ljh2 speaking again! I would encourage you to read the rest of the stories. 

Feibelman recounts these stories not as direct advice, but rather as warnings against relying too much on raw technical skill. A common theme is that these people are incredibly bright, but all failed to foster a research mission one way or another. He himself learned this lesson; hopefully these folks eventually did too.

Which leads into his first piece of advice--

II. Finding an Advisor/Mentor

An advisor is essentially a student's first mentor in science. The roles are not completely overlapping, but either way, choosing one is critical.

He first advises choosing an older professor/established scientist as a heuristic (easier said than done, Peter). The main benefits are 1) networking, obviously, and 2) the advisor is not your competition.

That's a weird statement-- isn't this the case with every advisor? To not compete with their students?

Not necessarily, Feibelman claims. A younger advisor, seeking tenure, more prominence, and generally with more to prove, might be leery of being shown up by a student and accordingly unlikely to share credit equally. Ultimately, for survival in science, a not-fully-established professor will prioritize themselves over their students.

If your postdoctoral adviser suggests that you work on a major, long-term project, you should at the very least ask for an estimate of what you will have to show for your efforts by the time your job hunt is to begin.

If your adviser insists that you devote yourself wholly to the long-term endeavor, remember that ultimately you are responsible for your success or failure as a scientist. If your adviser (especially your young adviser) places his or her interests above your own, do not be too surprised. Seek a different group to work in, one that offers you a more realistic opportunity to produce short-term, publishable output.

Contrast this with an advisor who has tenure-- there is the stability of not being evicted from the university, and they don't have much more to prove. Furthermore, because universities profit almost exclusively off reputation-- and therefore off this prominent professor-- they will often spare no expense to retain them.

Thus, all else being equal, one should choose an older professor.

Of course, things are not always equal, and we begin to qualify this thinly-veiled ageism (since tenure and "establishment" seem basically like proxies for age). The number one criterion, in Feibelman's view, is whether an advisor produces good researchers.

For instance, these qualities may override the tenure-Nobel-laureate-age heuristic--

Is the professor you are considering available to consult with students on a reasonably frequent basis and able to convey real guidance?

Is your intended adviser comfortable talking to people who are not scientific peers (i.e., beginners such as yourself)?

Does the group you wish to join have a sense of purpose?

Do its members interact with each other?

And does Professor Eminent teach survival skills? If you can learn the answers to the important questions in advance, by talking to current or former students, you may save yourself a lot of grief.

The type of culture and growth that an advisor fosters in their research group is critical--

Prof. E. was obsessive. He was obnoxious. I have heard it said that he didn’t know quantum mechanics. But his contributions to materials science were manifold—and his students have done wonderfully well. They knew what they wanted to learn, and they learned from each other. Thus, even if E. was often away consulting at industrial labs, his students thrived.

How do you find out in advance whether the group you are considering will be like E.’s? Visit the members. Ask them what they are doing. See if they can explain the big picture. If they cannot, find a different adviser.

Often a prominent scientist will lead a big group with, say, 15 or 20 experimental systems, enabling an equal number of graduate students to study trends. These students are guaranteed to finish their degrees in a reasonable period of time. They take their data, report their results, and get their degrees. It all seems so easy. Should you be part of this kind of group?

Again, the issue is whether the students have an inkling of the big picture. Is it only the adviser who knows what trend is being studied, while student A. is looking at rhodium, B. has a sample of ruthenium, and C. has some palladium? If the students cannot tell a good story, move on.

On finding a mentor-- this again may overlap with finding an advisor, but here he also emphasizes the importance of networking.

How do you become a member of the old-boy or old-girl network? Not by learning a secret handshake, but by taking advantage of opportunities to make yourself known.

Begin at your desk. Have you read a stimulating paper related to your work? Has it raised compelling questions? Engage the author in an email dialogue. When you start looking for a job, he or she might recall your thoughtful queries, or your critique, and be willing to help.

The best preparation you can make toward the goal of having a scientific career is to find yourself a “research aunt or uncle,” someone with little or no authority over you, who has enough experience to act as a sounding board and to give accurate advice. Do not be shy about getting to know people outside your adviser’s realm. The scientists at your lab will very likely cherish the human contact. They spend a lot of time behind the closed doors of lab and office, and everybody likes to give advice.

III. Dance for My Money 

So, you've done it. You've gotten your PhD despite being merely smart. Your advisor has put you on your path. You've finished your post-doc. Now you want a real job. Now what?

Well, if you want money-- a job, research grants, tenure, publications-- you need to tell a story. People, myself included, have very low attention spans. You have to give a performance.

Consider a seminar, or any other talks you may give during your interview process. Remember when Feibelman's professor C. crapped all over his initial seminar back in Section I of this review?

What was his advice for fixing this? (Emphasis mine.)

  1. There has to be a theme to your work—some objective—something you want to know. Do not start with, “I have been trying to explain the interesting wavelength dependence of light scattering from small particles,” but rather “There is a widespread need to explain to one’s kids why the sky is blue.”
  2. If you know why you have chosen to work on a particular problem, it is easy to present an absorbing seminar. Start out by telling your story, why the field you are working in is an important one, and what the main problems are. Give some historical material showing where the field is, the relative advantages of different methods, and so on. Then outline what you did, and describe your results. Conclude with a statement of how your results have advanced our understanding of nature, and perhaps give an inkling of the new directions that your work opens up. Do not assume that your audience comprises experts only.
  3. Lastly, rehearse your talk in front of one or two of your peers or professional supporters. Choose listeners who will not be shy about asking questions and offering constructive suggestions. Giving a seminar is serious business. Your future depends on the strong recommendations of your senior colleagues. If your talk is a hodgepodge of techniques or experiments or equations, if you seem to have no idea where you are headed, if you reek of deference to the experts in the audience, you will not be perceived as a rising star, a budding scientific leader. You will fail.

In other words-- tell a story, and craft an overall narrative. Because at the end of the day, people like stories. They don't want to follow the nitty-gritty details, and if they do, they will do that on their own time. During a seminar, a grant proposal, or any other presentation, you are synthesizing information that you, personally, are the expert in.

The ability to abstract and simplify into broad strokes is a critical skill for a scientist (something I probably need to improve, given how long this review is), and that's the entire point of venues like these. Researchers have only so much time, and their main method of knowledge acquisition is via talks. So you'd better make it damn entertaining, you dancing monkey you.

A significant part of making it not boring is not throwing dense slides and diagrams up everywhere.

Theoretical physicists, particularly inexperienced ones, often show slides covered with equations. (Molecular biologists show DNA sequences.) Except in very special cases, such as meetings of specialists devoted to technical advances, this is a bad idea. The audience cannot assimilate more than a small amount of information in an hour, to say nothing of ten minutes. A talk comprising detailed, technical slides is likely to be received as a deliberate attempt to persuade the listeners that because the material being presented is so complex as to be incomprehensible, it should be looked on as important. Save this for after your Nobel prize.

Put yourself in the place of an experimentalist among your listeners. Why would he want to hire you? More likely, he would prefer someone he thought he could talk to. To communicate with him, you need to convey not the details of your math but the basic concepts, the approximations, the results, and the predictions.

On "Publish or Perish", and Writing Grants

The biggest criticism of academia might be its publish-or-perish culture: that writing at such a yearly cadence necessarily forces short-term views, and that our scientists should be working more on the science and less on the writing.

I'll comment on the "short-termist" argument later, but Feibelman actually supports this publishing culture. One example is that of Tom above. At the end of the day, our time is limited, and success requires you to be both well known and widely appreciated. Publishing is one critical avenue to becoming well known.

Of course, it's also critical that you don't just add to the noise of sloppy papers and contribute to the replication crisis-- hence "widely appreciated". His point is that there is an important equilibrium to strike, but the culture of "publish or perish" is not itself a bad thing.

So when should one publish, and how should one write? 

Firstly, on when-- if a project cannot be distilled into a single paper, it must either be abandoned or broken down into several smaller, iterable chunks (that hopefully build toward an overarching narrative of your research direction). With every publication, you can restate portions of the context in the abstract, and perhaps even have a linked list of citations running back through your earlier papers.

He calls these chunks, half-jokingly, "publons"-- the smallest quantum of work that can produce a publication.

Secondly, in support of quick, relatively short publications: funding cycles are only a couple of years; the tenure clock is seven; promotions come every few years. The people who give you money need concrete evidence to continue giving you money. It's helpful if that evidence coincides with your long-term vision, but a single awesome magnum opus that takes 10 years is not worth the risk.

He does understand the frustration here; however, that's the reality of current productivity metrics--

If you have published twice as many articles, this “objective measure” of their impact will be roughly twice as great. You may find this idea crass. I do. But it is safe to assume that there will be bean counters among those who determine your future, and it certainly does you no harm to please them.

(That's not to say magnum opuses aren't worth it. More on working on hard problems in Section IV.)

Thirdly, writer's block is damn real. Maybe there's even a similar thing I'd call "reader's block". It's much easier to write kernels and modular results than it is to write and organize a whole document (believe me, I spent so long reordering this review). Even for readers, it's easier to digest.

Lastly, speed helps mitigate the risk of getting scooped. Having been a victim of that myself: it damn sucks. Feibelman comments on some interesting dynamics-- how referees are oftentimes your competitors and might be influenced by seeing your work-- but that's a whoooole other thing.

It's for all of these reasons that the current pacing is not something to be lambasted, Feibelman says.

On how-- it again goes back to how well you can tell a story.

For Feibelman, the title, the abstract, and the introduction are the most critical. These help both scientists hunting for papers and discovery via search terms and keyword alerts. Say what you want about clickbaity newspaper articles, but your papers should follow the same "attention-capturing" behavior and structure.

As in the case of titles, it is worth remembering that abstracts circulate more widely than the papers they summarize. They are the first item to pop up when one searches journal content and are generally available without charge, even when seeing a full article requires a subscription. A well-written abstract may thus make the difference between someone’s downloading your full text or emailing you for a copy, rather than just moving on.

(This matches my mental model as well. My advisor once told me in school that he reads the abstract of a paper, maybe the introduction if he's unfamiliar with the authors, and skims the conclusion/results to see if they vaguely corroborate the abstract's claims. Only if the results are counter-intuitive does he read closer. I've begun to do the same, frankly.)

Now, if the title and abstract are the bait, the introduction is where you really crank the reel. Spin the reel? Pull? I don't know, I don't fish. But many might struggle with the balance between giving more detail than the abstract and not droning on about technical details. Feibelman advises--

My solution to this problem is to start thinking about the first paragraph of an article when I begin a project rather than when I complete it. I would not embark on a scientific effort if I didn’t think it was important and that my work would answer a question of rather wide interest. The reasons that I found the project in question interesting enough to work on provide half the material I need for my introduction.

Sitting at the word processor, I imagine I am on the phone with a scientist friend whom I haven’t spoken to in some time. He asks me what I have been doing recently. I write down my imagined response. If, when you try this, you feel an attack of writer’s block coming on, turn on a recording device and actually call a friend.

(I like to talk to a rubber duck, but to each his own!)

This is also the place where you should religiously cite your references.

In writing your introduction, as well as the body of your paper, it is essential to place your work in context, not only by explaining what you did and why but also by citing the relevant literature. This is important, not only to provide your readers with a way of understanding your area of research, but also because your scientific colleagues are very eager to get credit for their achievements. (This is not just vanity. Scientists’ careers are built on the perceived importance or usefulness of their research results.) You have much to gain and little to lose by scrupulously citing your competitors’ work.

He somewhat glosses over writing the rest of the paper-- how apropos, as we just said the intro is the most important!--

Often it is a good idea to relegate detailed discussion of a technical aspect of the work to an appendix. That way, experts or interested parties can try to understand your arguments in full detail, whereas others do not have to guess how much of the text to skip to move on to the next idea.

Keep in mind that the function of a journal article is to communicate, not simply to indicate how wonderful your results are. In principle, a paper should provide enough information that an interested reader would be able to reproduce your work.

On the conclusion-- 

As in the preparation of a seminar, the last section of a paper should provide not just a summary of the results reported but also some idea of how they might affect the direction of future research. The goal of the conclusions section is to leave your reader thinking about how your work affects his or her own research plans. Good science opens new doors.

In other words, you should always have some comments on future work. I can honestly say that some of my own ideas have come from simply reading other people's "Future Work" sections.

Of course, Feibelman leaves us with an example of an introduction that he might use for a grant. It's high level enough, but also leaves granules of information to branch off of.

One of several reasons that research in surface science has been actively pursued for the past several decades is that vastly important chemical reactions, from the elimination of noxious gases in automobile exhaust to the production of petrochemicals, are catalyzed on the surfaces of appropriate powdered metals and oxides. Learning to make commercial catalysts cheaper and more efficient is thus a goal worth hundreds of millions of dollars to the world economy. Surface scientists often point to this fact, despite the common knowledge that forty-some years of surface science have not led directly to a single industrially significant, new catalyst material.

The reason for this “failure” is that chemical catalysis on surfaces is a very complex affair, and even the elementary processes that together comprise a catalytic reaction, such as the dissociation and sticking of a molecule to a surface, are not very well understood. One area where surface scientists have made significant progress is in developing tools to determine the arrangement of atoms at a surface. As a result of this progress, the atomic arrangements of quite a variety of crystal surfaces are now known. Surface science has therefore turned to the study of elementary molecule-surface interactions.

By pursuing this kind of work, for example, by studying both theoretically and experimentally how a simple molecule like H+ interacts with a relatively simple metal crystal surface, we believe that we are taking important first steps toward understanding the elements of molecular chemistry on catalyst surfaces.

IV. On Hard Problems

Look, everyone wants a Fields Medal or to solve one of the Millennium Problems. Is the best way to attain them to work on those exact problems for 20 or 30 years without publishing a single paper?

No. You need to break it down into publons! Especially if you want to get published/funded. Get me a grant, damn it! See Section III.

Hamming refers to this "hard problem" problem in a more philosophical way, but Feibelman takes a more practical approach-- do you want to keep your job, or not? 

Contrary to what I've said thus far, The System (TM) is not against you having long-term goals. It just needs to see some progress--

The most obvious is to aim at an important long-term goal by planning your work as a sequence of short-term projects. Each of the latter should yield an identifiable and publishable milestone (a “publon”; see Chapter 5). Your papers and oral presentations can then begin by identifying you and your work with an exciting research area, while the new kernel of knowledge that you describe will give confidence that you are a person who completes projects.

If you continue to insist that your project cannot be broken down into publon-sized papers, Feibelman has this to say--

Experience teaches that, important or not, a research endeavor becomes timely only once it can be approached with suitable technical infrastructure. Before then, a proposed long-term effort is likely to translate into fruitless weeks, months, or even years of struggling to make headway with inadequate tools.

Because beating your head against a wall is neither satisfying nor productive, you should be wary of embarking on long-term efforts, whether formulated by yourself or suggested by a mentor or collaborator. It may make better sense to put off work on that important problem until new techniques have been developed—perhaps by you, perhaps by somebody else—than pushing ahead, on the assumption that brute force will eventually lead to success.

Apart from whether you will be able to obtain significant results before your return to the job market or your consideration by a tenure committee, a serious peril of the brute-force approach is that a competitor will develop a labor-saving new technique and race to the goal while you are still struggling.

One such technique, he mentions later, might be nuclear magnetic resonance spectroscopy, whose invention almost immediately solved outstanding problems of the time. While it may be you who develops these new techniques, even gadget development itself can be segmented into publons and folded into the overarching narrative of your long-term goals.

Another very important strategy he mentions is having an adequate research pipeline, perhaps 2 or 3 projects at a time. When Project A stalls for whatever dumb reason-- I once had to wait 2 full weeks because our basement flooded-- Project B will hopefully be there, ready for fresh eyes.

Working on more than one project is the only way a young (or any!) scientist should undertake an inherently long-term project. I spent ten years (!!) writing a computer program to model the energetics of atoms and molecules on metal crystal surfaces. Although I was able to publish several pieces of technical progress along the way (e.g., mathematical tricks that made portions of the computation more efficient), the really significant science output could only be produced when the computer code was substantially complete. I survived this project scientifically by establishing collaborations in which the tools required to generate results were either completely or almost completely developed. By devoting about 50 percent of my time to short-term projects using these tools, I maintained a publication record—several new papers a year—adequate to persuade my peers and my employer that I was not brain-dead.

Furthermore, having multiple projects forces one to maintain scientific breadth. It's important to come out from under the sea of papers to survey the landscape.

Without at all wanting to argue that you should strive to be broad and shallow or that you should spread yourself so thin that you are unable to make progress in any area, I suggest that by having your fingers in several pies, you are more likely to prosper scientifically. As one area loses its scientific appeal, another with which you are already familiar may increase in importance. The clever ideas you learn or develop in one area may be applicable in another. This can be an extraordinarily efficient way to make progress.

Also, if you get scooped, you have more projects to turn to. This is especially likely if you're in a hot field. Scooping is such a real thing. It's nuts.

In fact, on the topic of hot fields--

Before moving into a fashionable field, you must ask yourself whether you have a realistic chance of emerging from the mob as someone who has made an important advance. If the problem is solved and this hot area is the only one you know well, how long will it take you to establish yourself in another one? Are your ideas sufficiently different from others’ that you can hope to beat the competition to the answer?

A less risky course is to try to lead rather than follow fashion. One way is to think how a recent technical advance may have made a problem ripe for solution that had previously been untimely and therefore pushed to the back burner. Another is to make the needed technical advance yourself. That may require hard work. But in compensation, you will likely not have to race to outdo competitors; few will want to invest the labor. If in the end you make a distinct advance in the technical state of the art, you will deserve, and win, considerable recognition.

Aside from working hard, you can reduce the risk inherent in undertaking a major project by making sure that enough money is spent on it. [...] In my own area of research, for example, great algorithmic advances have made it possible to compute the properties of solids in a fraction of the time that was previously required. Does this mean people are requesting smaller computer budgets? Not on your life! They have scaled up the size of the problems they propose to solve. They are asking for bigger computers than currently available and for more computer time.

In other words, perhaps you can be the one creating the new techniques in the field. And, of course, when all else fails, just throw a supercomputer at it :)

Summary

I'm sorry this is such a long review. I would have condensed it more, but I didn't have enough time. I also encourage you to read the book on your own-- it took me about a whole 5-hour flight to finish. There are also more tidbits of knowledge that I couldn't fit in here, e.g., on interviews and winning tenure.

If I could give a super extreme TL;DR of the book, it would be-- soft skills matter more than you think. But I don't think many people (myself included) realize how much they actually matter until they get into the "real world". They're called soft skills! Why should I bother with them?! Is hard not better than soft??

I find that most of these tips are applicable even for normal industry work life. 

Your technical skill matters less than you think (unless you're Linus Torvalds). Networking is important. People have to like you (again, unless you're Linus).

View yourself from an external perspective. Your manager has to see some metrics somehow, even if you grumble about it.

Committing code in small chunks, and often, is far better than one giant pull request.

Nobody wants to hear you present about the insanely cool complex distributed microkernel written in WASM. Just tell me how to run DOOM.

Lastly-- I, and Feibelman, have some final notes. He himself went "on the market" into a recession in '73, and had some very real fears. He was almost a research advisor under San Francisco mayor Joseph Alioto. It's possible to do everything right and still fail. Sometimes that's life.

No matter how well you do in these regards, you will certainly still experience difficult times, have regrets about some of your choices, and possibly fail anyway. Nevertheless, your chances for having a scientific career will be greatly improved.

I wish you every success!

Comments

It's to push our understanding of some crazy niche field and push it further beyond. The widespread recognition is nice, I'm sure, but I don't think most PhD graduates' goals are this.

Insufficiently Hansonpilled.

Sorry, what does "hansonpilled" mean? Does Robin Hanson have some insight on this as well?

I randomly decided to google “hansonpilled” today to see if anyone had coined the term, congratulations on being one of two results.

While he pushed that he could have more substantial results in 6-12 months, other job candidates had had 2-3 projects done in the same timespan as Tom, and all he could muster up was a "maybe perhaps possibly"?

This trap reminded me of this gem I recently stumbled upon. Here's the bottom line, so you can determine whether this seems unintuitive/worth exploring:

Many of our default intuitions about how to pursue uncertain ideas are counterproductive:

We often try easier tasks first, when instead we should try the most informative tasks first. We often conflate a high-level approach with a low-level instantiation of the approach. We are often too slow to try to disprove our own ideas.

Building frameworks that reify the research process as a concrete search problem can help unearth these incorrect intuitions and replace them with systematic reasoning.