In which case you have to ask yourself, are you so brilliant that you’ve found an important topic that no one has considered yet, or have all the brilliant people already figured out that topic isn’t worthy of study?
It’s the same with the startup world. If you’re the only one doing a thing, are you brilliant or foolish?
Most scientists work on topics that are quite niche. Most of those topics lead to nothing. A lot of good research took years to ripen enough to be of actual value. A lot of popular topics started out in a niche. Most of mathematics took decades to fully come to fruition. Can you decide beforehand which one will be the next big thing?
Today, most scientists go for the popular topics and whatever is on the government research plan to get funding.* Whenever the wind changes direction they change their topics because they need that funding.
*: This might seem to contradict the statement that most scientists work on niche topics, but only on the surface. In order to get published you have to do something novel. So you choose a popular topic and then research a rather unpopular side aspect of it, like how a specific chemical behaves when applied to the popular topic. If you're successful you publish and continue. Citations come later or they don't, but the next round of funding comes with publishing. After a few years without many citations you move on to the next thing.
With government plans, you often just need to publish and then it's done. The citations only matter long-term, if at all. Most scientists don't achieve anything of greater value; they are happy if they can publish at all. If the institute has a few scientists with a high citation count, that carries all the rest of them.
No, the answer is that there is a limited number of scientists and a limitless number of research directions. This doesn't have to be correlated with brilliance.
In fact, it can be easier to research some of the less popular paths because there is less competition and more low-hanging fruit.
There are so many gaps in our knowledge, it's ridiculous. Go look for papers studying how to kill the eggs of canine roundworms (e.g. in veterinary settings) or whether surgery is an effective treatment for exotropia. The literature is SPARSE.
Two economists are walking down the street. One sees a $20 bill and starts to bend over to pick it up. The other economist says "Don't bother - if it were really worth $20, someone else would have picked it up."
There's countless numbers of scientists that made great strides working on topics others deemed rediculous. Heck, many of the Nobel prize winners were ridiculed by their colleagues as borderline wack-jobs at the time they were working on their research. Even after winning the prize, some still were with their later work (Crick's search for consciousness comes to mind, and why it would be so worthless a search does not).
If anything, the hubris of the scientific community would be as deafening as the pseudo-science BS and would hold back progress just as much, if not more, except for one key thing: the scientific method.
Luckily, we have a process by which crackpots get differentiated from geniuses. So let's not leave $20 on the ground assuming others would have picked it up, especially when that $20 represents collective progress for the entire species.
> Heck, many Nobel Prize winners were ridiculed by their colleagues as borderline whack-jobs at the time they were working on their research. Even after winning the prize, some still were ridiculed for their later work (Crick's search for consciousness comes to mind, and why it would be so worthless a search does not).
Do you have any good examples of being considered wack-jobs before their winning?
Semmelweis. He never received the Nobel prize, but I think he counts towards the point.
> Dr. Ignaz Semmelweis discovered in 1847 that hand-washing with a solution of chlorinated lime reduced the incidence of fatal childbed fever tenfold in maternity institutions. However, the reaction of his contemporaries was not positive; his subsequent mental disintegration led to him being confined to an insane asylum, where he died in 1865.
Heber Curtis was not a Nobel prize winner (somewhat shockingly, given the contribution), and he wasn't (to my understanding) so much ridiculed; but when The Great Debate occurred, a debate about whether or not galaxies besides our own existed,
> if Andromeda were not part of the Milky Way, then its distance must have been on the order of 10^8 light years—a span most contemporary astronomers would not accept.
the size and distance of these objects (galaxies) seemed far too absurdly large to one side of the debate to be accurate; it would mean the size of the universe would be absolutely enormous. Of course,
> it is now known that the Milky Way is only one of as many as an estimated 200 billion (2×10^11)[1] to 2 trillion (2×10^12) or more galaxies[2][3], proving Curtis the more accurate party in the debate.
It isn't too hard to find examples of scientists who were ridiculed for their ideas and eventually won the Nobel Prize:
>...Stanley B. Prusiner, a maverick American scientist who endured derision from his peers for two decades as he tried to prove that bizarre infectious proteins could cause brain diseases like “mad cow disease” in people and animals, has been awarded the ultimate in scientific vindication: the Nobel Prize in medicine or physiology.
>...Prusiner said the only time he was hurt by the decades of skepticism “was when it became personal.” After publication of an especially ridiculing article in Discover magazine 10 years ago, for example - which Prusiner Monday called the “crown jewel” of all the derogatory articles ever written about him - he stopped talking to the press. The self-imposed media exile became increasingly frustrating to science journalists over the past decade as his theories gained scientific credibility.
>...The winner of the recent 2011 Nobel Prize in Chemistry, Daniel Shechtman, experienced a situation even more vexing. When he made his discovery of quasicrystals in 1982, thirty years ago, the research institution that hosted him fired him because he "threw discredit on the University with his false science."
>...He was the subject of fierce resistance from one of the greatest scientists of the 20th century, Linus Pauling, Nobel Laureate in Chemistry and Nobel Peace Laureate. In 1985, he wrote: "Daniel Shechtman is talking nonsense. There are no quasi-crystals, there are only quasi-scientists!"
An example that is pretty well known is Barry Marshall:
>...In 1984, 33-year-old Barry Marshall, frustrated by responses to his work, ingested Helicobacter pylori, and soon developed stomach pain, nausea, and vomiting -- all signs of the gastritis he had intended to induce.
>...Marshall wrote in his Nobel Prize autobiography, "I was met with constant criticism that my conclusions were premature and not well supported. When the work was presented, my results were disputed and disbelieved, not on the basis of science but because they simply could not be true."
It was Max Planck who said, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." So this isn't a new issue, and things are probably better now than they were in the past.
You're seriously underestimating the number of things there are to study and overestimating how many people there are to do the study.
This is nothing like tech startups, where you have tons of people sharing a relatively small problem space (creating tech tools companies want).
Consider for a moment: there are well over 350,000 different species of beetles. There's just too much to study and too few people doing the work to expect there to always be a plethora of external research to draw upon.
This is actually a very interesting question that might take one down the rabbit hole.
Acquiring knowledge (I should say 'beliefs about valid knowledge') and brainstorming (and certainly collaboration and getting an advisor to adopt you) appear to be social activities, as much as purely logical and analytic activities.
Social activities like this, for social or herd creatures, are subject to flock or swarming patterns.
Maybe all the brilliant people are swarming around a locus of interest? It's certainly a good way to have the population explore the ins and outs of a, well, locus of interest. It's also a good way to have a loner get shunned by wandering off and poking at an uninteresting pile of dung.
I guess my point is: why not both? (Mathematically, statistically, egotistically, I know the idea that I am the foolish one is almost certainly more likely to be the case)
It seems like PhD candidates work on peripheral elements of their sponsor/tutor/professor's work; if that professor at some point makes a significant step, then one of those PhDs will be along for the ride, not necessarily the genius one.
At the time, chytrids were about as obscure as a topic in science can be. Though fungi compose an entire organismal kingdom, on a level with plants or animals, mycology was and largely still is an esoteric field. Plant biologists are practically primetime television stars compared to mycologists. Only a handful of people had even heard of chytrids, and fewer still studied them. There was no inkling back then of the great significance they would later hold.
Longcore happened to know about chytrids because her mentor at the University of Michigan, the great mycologist Fred Sparrow, had studied them. Much yet remained to be learned—just in the course of her doctoral studies, Longcore identified three new species and a new genus—and to someone with a voracious interest in nature, chytrids were appealing. Their evolutionary origins date back 600 million years; though predominantly aquatic, they can be found in just about every moisture-rich environment; their spores propel themselves through water with flagella closely resembling the tails of sperm. Never mind that studying chytrids was, to use Joyce’s own word, “useless,” at least by the usual standards of utility. Chytrids were interesting.
The university gave Joyce an office and a microscope. She went to work: collecting chytrids from ponds and bogs and soils, teaching herself to grow them in cultures, describing them in painstaking detail, mapping their evolutionary trees. She published regularly in mycological journals, adding crumbs to the vast storehouse of human knowledge.
And so it might have continued but for a strange happening at the National Zoo in Washington, D.C., where blue poison dart frogs started dying for no evident reason. The zoo’s pathologists, Don Nichol and Allan Pessier, were baffled. They also happened to notice something odd growing on the dead frogs. A fungus, they suspected, probably aquatic in origin, though not one they recognized. An internet search turned up Longcore as someone who might have some ideas. They sent her a sample, which she promptly cultured and characterized as a new genus and species of chytrid: Batrachochytrium dendrobatidis, she named it, or Bd for short.
That particular chytrid would prove to cause a disease more devastating than, as best as scientists can tell, any other in the story of life on Earth. After Longcore’s initial characterization, she and Nichol and Pessier proceeded to show that frogs exposed to Bd died. Other scientists soon linked Bd and its disease, dubbed chytridiomycosis, to massive, inexplicable die-offs of amphibians in Costa Rica, Australia, and the western United States. No disease had ever been known to cause a species to go extinct; as of this writing, chytridiomycosis has driven dozens to extinction, threatens hundreds more, and has been found in more than 500 species.
Almost overnight Longcore went from obscurity to the scientific center of an amphibian apocalypse. “Had I not been studying the ‘useless’ chytrids,” she says, “we wouldn’t have known how to deal with them.” Her research has been crucial—not only the initial characterization, but also her understanding of the systematics and classification of chytrids, which helped provide a conceptual scaffold for questions about Bd: Where did it come from? What made it so strange and so terrible? Why does it affect some species differently than others?
>In which case you have to ask yourself, are you so brilliant that you’ve found an important topic that no one has considered yet, or have all the brilliant people already figured out that topic isn’t worthy of study?
Imagine how many inventions we would have missed if all inventors had shared your mindset.
What? These are perfectly valid questions to be asking and do not inherently stop you from researching what you're working on.
I know some scientists that define their research direction by asking these questions first before pursuing an idea. Many great inventions like optogenetics or expansion microscopy came from this investigative strategy. It can help keep your resources and energy in check.
The purpose of a PhD is to move human knowledge forward. You have to do an analysis of something that, in all likelihood, nobody has done before (or not thoroughly enough to be considered settled).