The real reason prestigious journals (still) exist

Now that we're in the 21st century, a talented scientist should be able to post their important research online so that people who will benefit from the work have immediate access to it, and the scientist can move on to their next project without spending a year or more battling with editors and reviewers. But how do we know the research is important? Taking a step back, how do we even know the scientist is talented and that we should read their work?

Herein lies a core problem of science today. There are so many scientists publishing so many papers across such a wide range of subjects that even a scientist with a diverse background will have difficulty picking up a random paper and deciding whether the work was well done and advances the field.

Why does this matter? When universities perform job searches, they have to sort through hundreds of candidates, each of whom has published multiple papers. Even if a university had the manpower to look through every publication, it would be unable to ascertain the quality of the work. So what do they do? They limit their search to the candidates with the most publications and, importantly, the most "high profile" publications[1].

As an aside, you can make a strong argument that this is exactly how universities should be performing candidate searches. If a researcher has a history of publishing in high profile journals, perhaps he/she will continue to publish in high profile journals and be more likely to obtain grant money for the university and attract talented post-docs and other scientists, resulting in more money and prestige for the university, which is the ultimate goal after all.

However, does it really make sense to judge a scientist by how many Nature, Science, and Cell papers they have? In general, what determines whether a paper is "high impact" is whether it will be of interest to the majority of readers.
This can mean the paper makes a huge advance in a field, or it can mean the paper makes a very minor advance, if any, in a very popular area (how many papers do we really need showing that CRISPR has off-target effects? Wouldn't that effort be better spent elsewhere?). It is therefore the sexiness of the project, not its quality, replicability, or reproducibility, that determines whether it gets into a high profile journal. And because projects are determined largely by the lab, not the researcher performing them, whether a researcher publishes "well" is often determined mainly by the lab they worked in, and is an inaccurate measure of how talented they are.

But what other option is there for determining the quality of a scientist? In addition to publications, an applicant will undoubtedly have multiple letters of recommendation from previous mentors. But former mentors have an incentive to write good letters: labs want a track record of placing former post-docs in professor-level positions in order to attract more post-docs, so they would be shooting themselves in the foot if they didn't write amazing letters.

Well, what about post-grad course grades? Surely talented scientists would separate themselves from their peers in their courses, or perhaps have amazing test scores. Nope. After undergraduate studies, researchers take very few courses, most of which are a joke, and they no longer have to take any standardized tests.

Finally we are reaching the root of the problem. In a given department or lab, it is very obvious who the best researchers are. They give the clearest presentations, ask insightful questions and give advice during lab meetings and seminars, schedule their days to squeeze in as many experiments as possible, send papers to other lab members that might be important for their work, are sought out for advice on how to troubleshoot experiments, and are frequently asked to collaborate on projects.
However, it is possible that none of this will be reflected in their publication record or even their letters of recommendation.

But what if we just assumed that scientists trained at prestigious universities are good scientists? Surely it was very competitive just to get into those schools. Funny you should mention that. In fact, it is extremely easy to get into graduate school: I personally don't know anyone who hasn't been accepted into a prestigious school, and I have seen very mediocre students accepted into the most prestigious universities. The truth is that grad schools don't care about undergrad GPA or GRE scores. They only want to know whether you are likely to be a reliable source of cheap labor for around six years, and they judge that solely by how much previous research experience you have and by your letters of recommendation.

So basically you're telling me we are accepting practically anyone into graduate school programs, only requiring them to take a few underwater-gum-chewing courses, and giving them a PhD as long as they eventually write a thesis that no one reads? Even the medical system, where many classes are pass/not pass and nearly 100% of students graduate, at least has licensing exams to determine whether a student should be able to practice medicine. Yes, that's correct! We are accepting students into graduate school who have no chance of becoming successful scientists, and we are hiding this fact by having no requirements for graduation. These students may end up doing a post-doc, but they will never obtain a faculty position, and will end up working in biotech, as a journal editor, or in a field completely unrelated to science.

But why would universities do this? Graduate students represent a very cheap source of labor, and are very important for new labs that have not yet recruited any post-docs. Not only are graduate students paid near minimum wage, but the NIH gives universities training grants that support students for two years.
And in my experience, there are more training grants than students, so every student is guaranteed to be free for two years, and if the university is lucky, the student will get a fellowship and be free for five. Ah, things are starting to make sense; I was wondering why universities offer so little training in the form of courses and workshops. It's because the university is only pretending to train students. Kind of like how Byron Scott is pretending to try to win games...actually it's really hard to tell; he's either the worst coach ever or the best actor ever.

So what's the solution? The solution to a system where only 10% of students eventually get the job they are supposedly being trained for? Well, I would start by accepting far fewer students. Medical schools limit their number of students based on how many they are able to train, so why can't graduate schools do the same? This would have numerous benefits. Fewer students means fewer training grants are needed, which means the NIH will have more money. Fewer students also means labs will be smaller and have fewer active projects, thus requiring fewer R01s, which again leaves the NIH with more money. Fewer graduate students per lab will also allow more mentoring per student, leading to fewer improperly conducted experiments, which means savings for the lab.

I would also have NIH grants to universities, whether training grants or R01s, be highly influenced by the eventual success of the university's former students. This would have multiple effects. First, universities would be highly incentivized to accept only the number of students they feel they can successfully train, which should result in them accepting fewer, and overall higher quality, students. Second, universities would actually have a monetary reason to train their students, which would result in higher quality courses, etc.
Universities may also finally start having strict graduation requirements, because they wouldn't want an official alum to negatively impact their record. With a degree no longer guaranteed, undergraduates will have less incentive to apply to graduate school, naturally decreasing the number of graduate students and limiting the pool to students truly interested in science. With the NIH having more money, and labs in general being smaller but more numerous, there will be more jobs for scientists, and fewer scientists applying for those jobs. Where a scientist trained will actually start to mean something, and how well a scientist can train students will be just as important as where he/she publishes when it comes to obtaining grants.

All of this would result in less emphasis on publishing in high profile journals...or maybe Nature, Science, and Cell are never going anywhere. Thoughts?

1. In my experience, they then invite the researchers to give talks, and give offers to the candidates whose work they were able to understand.