Academia Is Eating Its Young (pressbuttongoboink.com)
151 points by irollboozers on March 23, 2013 | 107 comments



For a while now, I've pondered the creation of a new scientific society, something which has the same 'vibe' as the Royal Society did in the late 1800s and early 1900s. Sort of a 'gentlepersons group' of critical thinkers. The promise is freeing science to once again flourish; the danger is something like TEDx.

A society to promote the pursuit of science by the common man or woman, with rigorous debate and discussion. A society where everyone agreed on the ground rules about what constituted 'science' and what constituted quackery.

We are almost to the point where we have enough multi-billionaire individuals that such a society could be endowed to create a place for science and scientists. Research without the requirement to teach undergraduates, publication without the requirement of a Ph.D. I could imagine that people who could learn the discipline to do the research could be supported in that research by some facilities and a modest stipend. Not the crazy big-investment stuff like fusion reactors, but smaller problems like characterizing digestive flora in developed and undeveloped countries.

It's probably a pipe dream. But I wish such a group existed.


One existing institution that is fairly close to what you're imagining is the HHMI.

However, I think you are making two assumptions about research that are incorrect.

First, it isn't at all clear that separating teaching from research entirely would be a good thing; researchers need to be trained, and trainees bring diverse skills and perspectives to the labs in which they work.

Second, many of the "smaller problems" have been getting bigger, in that they depend on large expenditures for equipment and professional staff to keep things running. To execute your example of characterizing digestive flora from different countries, you'll need a facility with the latest sequencing equipment and people to run it, computational infrastructure, and boots on the ground to collect the data in all those developing countries... it'll quickly turn into a multi-million-dollar collaborative project.


Yes, I think the funding is the key. There are already funders who see it as their mission to support scientists and allow them to tackle interesting questions. But in the end, they have a limited pot of money to dish out, and they have to somehow make a decision about what's worth funding. That brings us back to spending a lot of time writing grant applications and trying to justify the work.

Ultimately, 'gentleman scientists' were probably able to follow their interests because they were rich enough that they didn't need to worry about earning a daily wage. That's still possible, but we'd much rather let science be done by anyone who's interested and good at it.

The OP mentions Microryza, their crowdfunding platform for science. That's a very interesting approach, but I think it only works if you already have some kind of paid job in science: the projects I've looked at are all asking for much less than a year's salary for one person.


There are places a bit like this, but they are not generally liked by the entire scientific community because they can sort of swallow highly talented individuals and take the pressure to produce off them, which does reduce their output. At least that is my (very limited) understanding.

One example of such a facility is Janelia Farm, part of the HHMI, which was endowed by Howard Hughes ('The Aviator'). Another one is Woods Hole, endowed by the Rockefellers.


Gentleman scientists tended to be free of the responsibilities of the common man. The vibe of the Royal Society was a bunch of stuck-up nobility debating worthless issues. There is a vein of thought that says science was promoted during the Enlightenment precisely because it distracted the nobility from aspiring to social change. Voltaire mocked this in Candide with a debate, if I recall correctly, on sheep color.


I'm sorry, but your characterisation of the issues as being worthless seems to entirely miss the point of intellectual thought. It is very difficult to predict what is fruitful and what is not when we are unsure of the destination.


> The vibe of the Royal Society was a bunch of stuck up nobility debating worthless issues.

To many, that wouldn't sound wholly unlike academia.


> Research without the requirement to teach undergraduates, publication without the requirement of a Ph.D.

A lot of institutions like this exist and if you compare them with traditional universities, the results are at best inconclusive. Compare IAS to Princeton, for example. Is the quality of research at the IAS better than at Princeton? I really doubt it.


There is also evidence that your graduate students do better research if they teach. See http://www.sciencemag.org/content/333/6045/1037.abstract


Perhaps not, but the quality of research at IAS is surely better than that at all but a handful of universities.

Also, the IAS does things like have "special years" devoted to a particular topic, and then a ton of professors in that area take their sabbatical that year to go to IAS for a year. It would be difficult for a university to do that.


> Perhaps not, but the quality of research at IAS is surely better than that at all but a handful of universities.

Fair enough, but they also have much "better" researchers than most other universities. My hypothesis is that if you tried to scale the IAS model to researchers outside the top 1%, it would be less successful.

I suspect the cross-pollination of ideas you get from working with other researchers and graduate students, combined with the clarity of expression and understanding you're forced to acquire when teaching undergraduates, is the reason why research universities have been so successful in advancing human knowledge. This isn't to say there is no place for other types of research institutions. However, I am skeptical of unsupported claims that there are "better" means of advancing human knowledge that are structured very differently from a research university.



It's not entirely clear that you know what the Royal Society actually did. It was hardly a model.

The Royal Institution is more along the lines of what you describe. Or, here in the US, the Woods Hole Oceanographic Institution, which was pretty much propped up by the Rockefellers.


A better historical parallel in America would be the American Philosophical Society [1], founded by Benjamin Franklin in 18th-century Philadelphia.

[1] http://en.wikipedia.org/wiki/American_Philosophical_Society


A fine institution; it served as kind of a national library, museum, even a pseudo-patent office until there were actual federal institutions funded for those purposes. Pretty significant no doubt, but it never conducted original research apart from "paying" for the publication of the Proceedings.

It wasn't until around the 1940s that they even started having grants and postdocs, but the stipends aren't significant enough to support original research.


Woods Hole and Janelia Farm were the two places that came to my mind, too.


A good proportion of the people who do biomedical research do it with little or no course teaching responsibilities. That is pretty much the model at most medical schools as far as I know, especially in basic science departments there.


Let's just do it.


Kary Mullis, Nobel laureate in Chemistry and the genius inventor of PCR, talks about this at length in an excellent essay in his book "Dancing Naked in the Mind Field". The book overall is just OK - some of the stories/opinions he holds are "alternative" - but it's a refreshing perspective.

"Because of science - not religion or politics - even people like you and me can have possessions that only a hundred years ago kings would have gone to war to own. Scientific method should not be take lightly.

The walls of the ivory tower of science collapsed when bureaucrats realized that there were jobs to be had and money to be made in the administration and promotion of science. Governments began making big investments just prior to World War II...

Science was going to determine the balance of power in the postwar world. Governments went into the science business big time.

Scientists became administrators of programs that had a mission. Probably the most important scientific development of the twentieth century is that economics replaced curiosity as the driving force behind research...

James Buchanan noted thirty years ago - and he is still correct - that as a rule, there is no vested interest in seeing a fair evaluation of a public scientific issue.

Very little experimental verification has been done to support important societal issues in the closing years of this century...People believe these things...because they have faith."


The article is rather vague on what he thinks are the problem with academia. All I can see is this:

> Instead, it's the politics, the inefficiency, and the in-bred hostility towards change has driven these incredible people out of academia.

Good luck getting politics out of human systems involving more than a few tens of people. It's just inevitable.

I don't know what he specifically means by inefficiency and in-bred hostility. One form of inefficiency I can think of is the constant grant writing professors have to do. But then this is just a product of penny pinching politicians and our anti-intellectual culture. I am not sure how academia can fix this.

I have no clue what he means by "in-bred hostility". I'm not the most sociable person in the world, but the positive interactions I've had with other researchers vastly outweigh the few negative interactions I've had with academic jerks.

The title "academia eating its young" suggested to me the lament of many in the biomedical fields who seem to go from postdoc to postdoc for years on end because permanent faculty positions are few and far between. Again, this is a direct result of the shrinking levels of funding going into research and higher education. This isn't academia eating its young; it's our society eating academia.


I think academia is eating the young if it lets people do that in exchange for producing papers that the PI gets the credit for. It isn't the case in all of academia, but in the parts where it is, I think it really is eating the young.


Some scientists are too old to change, and too proud to believe the results of their younger counterparts. It's stupid and it sucks, but it's crazy to think Microryza will fix this in any substantial way. Crowdfunding science encourages projects that are "sexy" but don't build toward real advances, and so are of questionable scientific value (two examples off the top of my head: http://www.nature.com/srep/2012/121115/srep00834/full/srep00... and https://www.nytimes.com/2013/03/01/science/new-research-sugg...). Moreover, projects that the average person can't understand are at a great disadvantage, and your success rate is tied to the ability to sell yourself, rather than the merit of your research or your abilities as a scientist, which the average person not in your field will be unable to assess. For all of the problems with the present grant funding system, I find it very difficult to believe that crowdfunding could allocate resources more efficiently.


This is part of a broader problem in society: baby boomers are not retiring or stepping aside for the next generation as has always happened before, and it has broken many systems.

Is this because boomers consistently voted for tax cuts over the safety net and then squandered that money, so they cannot afford to retire? Maybe. Is it because they're healthier than prior generations at the same age, and the size of their demographic bulge means there are just more folks who don't feel ready to retire? Possibly.

In the US, tenure systems are killers for younger talent. In my subject they led to one branch of physics - string theory - dominating tenure tracks, with no real opportunities for other ideas. Now, 20 years on, string theory appears to have been a terrible squandering of a lot of talent, with little predictive science emerging from all that work.

It's somewhat depressing.

I once had a letter published in New Scientist when I was an undergrad pointing out that there were no career paths for scientists as financially attractive as the most basic entry-level job available to non-scientists. But that was nearly 20 years ago. Things have only got worse since.


> This is part of a broader problem in society with baby boomers not retiring or stepping aside for the next generation as has always happened before and it has broken many systems.

-1. If a 60-year-old is doing good, productive work, enjoys it, and wishes to continue it, why should he or she retire?

Your argument, at least in the broader sense, seems to rely on the fallacy that there are a fixed number of jobs, and an older worker who chooses to continue working is "taking the place" of someone younger.

In more restricted contexts, such as academia, this is more or less true, and the tenure system does allow for older professors to stop pulling their weight if they choose. (Most of them don't do so.)

But are you prepared to demand of older, productive workers that they quit simply because you would like to take their place? I find such a sentiment to be profoundly selfish.


I didn't say they should. You're raising a straw man. I merely said this is the traditional structure on which our society has been working and it is now broken and having all sorts of repercussions.

To your other point, there absolutely are only a small number of academic jobs in relation to population size. You can't have a society made up entirely of academics, or of any job which requires a pyramid of supporting jobs to fund it and make it viable.


> You're raising a straw man.

Perhaps I have misunderstood your point. If so, then I apologize -- and I also invite you to explain further. What exactly is broken?


For generations there was a steady transition of senior roles to younger generations at a pretty regular pace. For a number of reasons this hasn't happened with the transition from the baby boomers, and it has led to a bunch of problems: a backlog of talent with no career progression opportunities. Because younger generations haven't had positions of seniority open to them to the same degree, they haven't progressively learned how to manage that transition either. This means that when the transitions do happen, people will be moving into roles both older and less well prepared. It's a big problem in all aspects of the public sector in particular.


The one person I feel most strongly about this is Elizabeth Iorns.

Elizabeth was named one of the 10 most important people by Nature last year, for her work in the Reproducibility Initiative and with Science Exchange (http://www.nature.com/news/366-days-nature-s-10-1.11997). She is one of the most genuinely passionate researchers I know, and she cares deeply about doing good science.

If she is not teaching the next generation of scientists by the time I die, all of my work will have been for naught.

I could go on and on about the number of brilliant scientists who are struggling in today's system. There is a huge bottleneck of innovation and it's entirely self-imposed. Someone or something is going to blow that bottleneck to shreds, and the world will start to see incredible things.


The entire college/university ecosystem is poised for collapse over the next couple of decades. After centuries of monopoly in the collection and dissemination of knowledge, they are being challenged by higher value free market alternatives on the internet, and they have absolutely no cultural or instinctive capability to react to it.

I'm not sure where Research will end up, but I don't think academic institutions as we know them today will be around a whole lot longer.


There is no higher value free market alternative to academic research. This does not exist. Publication is a different issue altogether.

What do you have in mind - huge companies like Microsoft trying things and occasionally letting out crumbs? People blogging their speculations about the cause of autism? The problem with depending 100% on non-academic research is that either real information is not being produced in a rigorous way, or details like methods are not published (so the work isn't reproducible), or the key stuff is withheld for business purposes.


Perhaps you have not heard of Microsoft Research. [1]

[1] http://en.wikipedia.org/wiki/Microsoft_Research


I took a random sample of 25 papers from:

http://research.microsoft.com/apps/catalog/default.aspx?p=1&...

68% of the authors hold academic positions.


Many of those academic positions are just for show, so that another university can't lay claim on them.

The University of Washington has many 'sponsored' positions, but basically what it amounts to is Google, Yahoo, and Microsoft giving money in exchange for that researcher's time. The latest example would be Babak Parviz, an EE prof who spends all his time at Google (http://news.cs.washington.edu/2013/01/26/babak-parviz-on-goo...)


I just checked. Most had verifiable course loads (i.e. personal websites with course information).


And perhaps he has, but this seems to be an outlier, and Microsoft is collaborating with researchers in numerous academic institutions. If those no longer existed, and thus Microsoft Research was no longer collaborating with academic institutions, would Microsoft Research continue to share reproducible results of their work with the public? We can only speculate at this point, but I'd find it hard to believe that the free market would share as much.


If there were many more MSRs, in many more areas of research, I could see them being a viable replacement. But there aren't, and I'm not sure how we'll get there. What would it take to get a few dozen more companies to fund an MSR-sized research lab, and in other fields besides CS? Better yet, something broader, like the old Bell Labs? I just don't see it happening.


Traditional academic institutions serve two purposes:

1) Credentialing

2) Capital-intensive research

There is nothing on the internet that is a threat to these two core functions.


Her website says she's an Assistant Professor at the University of Miami. Doesn't that count as teaching the next generation of scientists?


The reward structure we live by is oriented towards extrinsic motivation even when work produced by intrinsic motivation is known to be of higher quality.

The story goes that Edison was taught this lesson when he tried to sell his very first voting machine. He was told, "we don't want it if we can't mess with it". That's when he is supposed to have decided to only work on projects that people wanted.

Then there is Tesla, who worked on projects without much regard to their social or economic relevancy.

Do not make this into a debate about Edison vs. Tesla; this is just to show how fucked up this world we live in is. I think all of us, hackers, hustlers, and designers, have made a similar choice at one time or another when we've decided to do it for the x or be true to ourselves.


The problem with intrinsic motivation is that people use it as an excuse to exploit other people.


The problem with intrinsic motivation is that it often makes itself unaccountable to getting results.

And I don't mean economic results, I mean results of any kind. This is actually one of the reasons I'm in CS grad-school right now rather than anything else: I could spend my life hacking away at Cool Stuff as open-source projects, joining open-source projects, coding whatever for a day job. I could. But I've already done hobby stuff in realms that intersected partially with research, and found that it's just too easy to completely bullshit myself if I don't have real research training.

So I'm here getting the real research training, in hope of writing things that are less prone to self-delusion and bullshit.

No word yet on my final career plans, because the grandparent was definitely right about one thing: I do my best stuff when intrinsically motivated and I find money-motivated careerism more of a burden than anything else.


I always wonder when reading these just how much the author is rationalizing their choice to leave. If (and I'm assuming here) they were in grad school for CS and left, most of those problems really don't apply.

There isn't the glut of adjuncts and postdocs one sees in, e.g., microbiology, because industry absorbs most of them. Tech transfer isn't nearly as big a problem (though it may still be a problem). And reproducibility and data fraud are not nearly as pervasive, for the simple reason that a lot of CS work is not experimental in that sense.

This reads like someone's grab bag of generic issues with academia that don't all apply to any given field.


For all but the superstars with a desire to pursue academia, it is irrational to stay. Especially outside CS, a lot of quality people are heading towards oblivion (middle age with no job, no prospects) by doing a PhD. I know some of them, and it makes me sad to think about it. There just aren't enough jobs and funding dollars to go around. It's not to humanities adjunct-hell level yet, but it stopped being a sound economic decision a long time ago.

When people criticize academia I listen because I think everybody's got something to say, even if part of their message is self-justification. In non-CS sciences with fewer pathways to industry, it takes guts to speak out about the truth of the situation because the culture selects so strongly against it.


What we desperately need is far fewer people _wanting_ to do a PhD, and it's not because it would reduce competition - it's because then the entire postgrad student environment would be forced to seriously consider how it attracts students.

Unfortunately it's very hard to make a rational choice at the junction point where you choose, because by then you've already decided you want to do a PhD, and so any offer seems like a good one.

But you really shouldn't do this: you need to turn a critical eye to the facilities if you're looking at the physical sciences, get a feel for the environment (basically, if people joke that something is always broken, that's a red flag, because it won't be funny 18 months in), and try to figure out where your supervisor is at regarding publications - anyone who's established but, say, lacks a Nature paper, or has one, needs to be treated with suspicion, because their incentive to help you get papers published is low compared to their wanting you to pursue high-risk/high-reward activities. You also really need to pay attention to whether the project you're being asked to do matches up with what the lab/group you'll be in can do.

A PhD effectively means taking a massive pay cut for nebulous future earnings potential, not to mention asking a lot of your enthusiasm and creativity for a long duration.


>"it makes me sad to think about it."

I think this is the best way to sum up my original message. I wrote this blog post out of frustration.


Not sure how to answer your question, but one thing's for sure: CS is no longer just CS, isolated by itself.

CS is increasingly interdisciplinary, with cross pollination of students and teachers in other fields like bioinformatics, physics, math, economics, ecology, public health, etc. At least in these fields, it's pretty dinosaur like to just think of CS as its own thing instead of applied CS.

And per your assumption, I actually studied biochemistry and economics in college. I taught myself how to code as a way to get out of academia, but I'm pretty useless in anything related to CS research.


I think you hit the nail on the head with teaching yourself to code. It's a way out. Precisely because of that, most people in pure computer science PhD programs can leave and go to industry. This means there isn't a glut of postdocs and adjuncts. This doesn't necessarily mean you have any better a shot at actually getting the PhD or getting a tenure-track position, but it means you're not dead if you don't.


Yeah, but you have to realize 'industry' is still a dirty word. Industry is not the solution the adjuncts and post-docs are looking for; what they want is inclusion.


There really isn't anything substantive in this post. Almost all of the problems that are linked to are due to funding cuts.

"Science and academia are entirely broken today, but we can longer afford to wait for the dinosaurs to die." Please - what exactly is the author trying to say? What exactly is broken, who are the dinosaurs, and what do they have to do with it? Science works very well, as it long has, and in fact in works much much better than it ever has, due to increasing transparency and fairness in all aspects of the academic pipeline (admission, graduation, funding, peer review, publication, etc.)

Certainly, funding is tight, which might drive a lot of very creative people away - but a lot of very creative people remain, and not everybody wants to be in academia anyway.

Also, it's true that academia can be very inefficient, but that's not globally true, and neither is it an academic problem alone. Most importantly, it's something that can be fixed.


The comments for this article are loading on top of the article itself. It's completely unreadable. Anyone else seeing this?


Yes, but Safari Reader saved the day for me.


Just stop loading the page immediately after the article loads, and before the comments appear.


Sorry, someone else mentioned this. It's working on mine, but I'll get a fix in later.


It's still doing it.


can't read on chrome 20, ubuntu 12.04


All I saw in that post is basically:

- Science is broken and needs to be fixed

- A 22-year-old bioengineer taught himself how to code

- The author started Microryza

and now this is on the HN front page. Did I miss an important sentence somewhere, or is that the actual summary?


It fits the meme of "entrepreneur saves the world with a simple hack that revolutionizes a stagnant market", so it gets upvoted by people who aren't as familiar with academia (i.e. most people here) and thus can't see past their biases and through the platitudes.


Science is notoriously bad at fixing its own problems. The meme plays well on HN because most people here believe in building solutions for themselves.

We were scientists. We didn't want to deal with stupid shit. So we built something for ourselves. This is my cofounder crowdfunding her own project: https://www.microryza.com/projects/repurposing-potential-ant...


I can't wait to write a grant application targeting the average Internet user rather than experts in my field. That will surely be a productive use of my time, and will undoubtedly result in a more fair distribution of money.

Honestly, I'm not trying to be an asshole and I commend you for your spirit, but I just can't see how you and others think crowdsourcing science funding is a good idea. Science has problems, but Microryza solves none of them and introduces myriad new issues (well, it would introduce new issues if it was successful, but I find that to be a dubious proposition).


We openly tell researchers that crowdfunding isn't for everyone, at least not yet.

If it isn't valuable for you, that doesn't mean it can't be valuable for someone else. Just don't do it and wait around for others to validate it.

I don't believe you're being an asshole. You're just assuming that being a scientist automagically entitles you to know what's best for other scientists.


I'd have to share the parent's skepticism. I'm in biostatistics and neuroscience, and for the most part could evaluate a stats or neuro proposal. But if I were to look at, say, an astrophysics or physical chemistry research proposal, I'd probably have little to no idea of the proposal's merits. So I'm curious how Microryza will help non-domain experts evaluate proposals.


Whilst academia has its problems, they tend to be no different from those in any other walk of life. Yes there are politics, yes there are cliques and annoyances, and yes there are inefficiencies and crazy decisions made. But I don't think there are many institutions that don't suffer these problems, because generally they are staffed with human beings, and humans, being what they are, tend to be flawed, political, mistaken, and sometimes brilliant.

Academic institutions are unique in enabling people to work on research problems in an environment that is extremely rich and productive. It's not for everyone, it's certainly not going to make most people rich in cash, and its structures can infuriate as well as enlighten. Thankfully there are other options in life, and other opportunities, but before you rush to reduce the universities to ruins and homogenise research activities, let's just stop for a second and ask what it is the universities uniquely bring to the table.

Maybe, just maybe, there is something worth saving in the idea of a university. Something that may not fit certain people, that may pass over some brilliant people, that may even struggle to satisfy the instant, always-on consumerist education that is so fashionable today. Indeed, the university on a social level is a public good. And that public good transcends the inefficiencies, the politics, and the other complaints pointed toward the universities. When universities are working at their best, they are for the public good, contribute to the public good, and help make society a better place for everyone.


I think you're touching on a really interesting subject, which namank elaborated on in his comment. It's too early to tell, but it seems education is heading in a more 'intrinsic' direction, with less structure and more exploration. On the issues of publishing and funding scientific work, these are kind of the same problems facing all creators now. The rate of creation has gone up tremendously quickly, and we are still developing the tools to improve efficiency.


The article implies that Microryza funds independent scientists, but their website says they only fund scientists working at established academic institutions, and I verified in the last discussion of the subject [1] that that was indeed the case, at least for now.

[1] https://news.ycombinator.com/item?id=5280236


We're already experimenting with independent researchers, see: Elizabeth Iorns' BRCA project which will be outsourced through Science Exchange.


This article is wholly devoid of motivation and characterization.


It was intended for the HN crowd who believes in build first, ask questions later. Hoping to get more people to build solutions that fix science.


Besides all the reasons here why the system is broken, academia is also frequently a source of free/cheap labor from students trying to publish something or just get some experience. So, at least where I came from, under constant funding cuts, the PI just replaces whoever lost their grant with a volunteer student, or perhaps keeps that person working for free with the prospect of finishing and publishing the work, or with the vain hope that he/she'll get paid when the next round of funding comes. No matter what, the PI gets his name on every paper and also his salary (in most cases, that is). It seems not surprising that people start leaving this "profzi" scheme (http://www.phdcomics.com/comics/archive.php?comicid=1144).


Wow, this is an interesting article and I agree with much of it. I've had my PhD for about 2 years now (neurobiology) and I agree that academia is eating... itself. The inefficiency, politics, and hostility are some of the things, but much like any other governmental thing, it's just run poorly... and that trickles down to the PIs, students, departments, etc. [edit] this got really long, I'm going to create a blog post on this instead and host it another time [/edit]

Here is an example: I applied for a training grant (T32) that would pay my salary and help me through my PhD. It was not funded initially because my PI's funding was running out in a year. <angry bold>I was applying for funds to continue my education so my PI wouldn't have to pay me (so we could use that money for my research), but they wouldn't fund me because my PI was running out of funds... to fund me.</>

I resubmitted the exact same grant, with zero changes, but I included a letter stating that I'd be covered if my PI ran out of money. Then I got funded. Total time? One year! It took a year for a paragraph to propagate through the granting agency, as I had to resubmit the grant and wait 6 months to get a vague answer (priority score).

This is one of many examples I have that are consistent with the OP's article: the system is broken. And this is a widespread problem that is going to require all of science to really make a change happen. I intend to do so with my startup (just applied to YC) to bring transparency, efficiency, and opportunity to scientists. My goal is to bring the semantic web to academic science, to start linking data sets between fields: positive, negative, and pilot data all available in one spot, so I can run meta-analyses on these data and create preliminary data for a grant or future studies. (There is more to it, but this isn't the place to plug my site.)

What I'm most concerned about isn't the technology, it's the people using the technology. I recently got this comment: "Wow, that is a brilliant idea... but good luck getting scientists to use it." The dinosaurs are scared of change, and their fear (ignorance?) is by far the hardest thing I'm going up against. It's also one of the biggest problems we are going to have in science.

Still gotta try though:)


Thanks for the response. You included real world examples from academia, and your supportive attitude means a lot to us.

I'm really interested in what you're working on. If you need help with your YC app, send me an email and I'd be happy to look through it. (denny_at_microryza.com) I'm always happy to hear what other scientists are working on.

Keep going!


If (American) Academia isn't eating its young, it is chasing us away. This article explains exactly why I couldn't bear to do a PhD in the US - instead, I can do it with better funding in half the time without any coursework or comprehensive exams in the UK. It may not be as "valuable" in the US, but it keeps a number of options open for the future, while not boxing me into academia, which could end up being a dead end anyways if the bubble bursts.


It seems to me that Microryza itself is an example of one of the key problems facing Academia.

There are very successful sites in this space, and Microryza doesn't seem to offer much beyond them in exchange for its smaller user base and less validated templates and formats (please correct me if I am wrong). It seems to me that everyone on Microryza would have been better served by using Kickstarter or one of the other more established options in this space.

So why don't they? Why does Microryza exist?

To me the answer is the standard academic elitism. A pervasive thread in academia is that everything outside academia is inferior to what is done by the elite in the ivory tower. So accept the long hours and low pay, because you are a member of the elite. Research this arcane, irrelevant problem, because you are a member of the elite. Submit to only these few journals and conferences, because you are a member of the elite. Etc., etc. Microryza seems to be embracing this elitism.

Now, this isn't to bash on Microryza; if they can pull academia into this space, that would be fantastic. But I think the fact that they need to exist to make it happen is a shame.


(This will be hopeless to find in this massive thread, but...)

The OP's service (have laymen fund science) solves exactly the wrong problem (though he's on the right track). The problem is not too little money; it's too much research, because of the massive incentives to crank out crap and the inability of the expert funding committees to sort through it all. Kickstarting this will just lead to funding the flashiest pseudoscience. We need to allow researchers more freedom to produce rarely, not encourage hype even more.


1. Couldn't read the article, but I get the gist.

2. Years ago I got a B.A. in business.

3. I can honestly say 90% of the courses were pretty much a waste of time. I literally had one instructor tell his students, "I really don't want to be here, so I don't expect you to even show up". Being young and naive, I went to a few of his classes; he literally read, verbatim, from an accounting book.

4. The Internet can teach a person much more than most colleges, with the exception of a professional degree--medicine, or engineering. Even then, the degree will just open the door. I've met engineers that couldn't build a house.

5. If you have the money, go for a degree. The women are plentiful--that's about all I remember. Oh, yeah--learn what the placebo effect is, and don't let others take advantage of you. Be careful with the student loans--they are not dischargeable in bankruptcy--at least for now. Hopefully, that might change.

6. The most successful tech guys I have known dropped out of school and learned to program on their own.

7. I've met too many people who feel guilty about not completing the degree. I truly feel they are better off--really. I have seen too many Ivy League guys "skate" on projects, while the high school dropout works 2x as hard, and in the end, most managers do take notice of the "contributors" and eventually realize the guy in the Stanford sweatshirt doesn't do much besides look cool in the company pictures.


> 6. The most successful tech guys I have known dropped out of school and learned to program on their own

To a large extent, given the values of US universities, that's mostly inevitable.

What are such values? In a university, the profs are expected to be doing 'research', not just teaching what's already on the shelves of the libraries or in commercial products. In computer science, the 'research' is supposed to be finding the 'fundamentals' of computing, e.g., the question P versus NP.

So, there can be some first courses in computer science that concentrate on teaching a programming language, maybe Java, and then some later courses in algorithms, data structures, compilers, database, etc., but, still, turning out people ready to 'hit the ground running' in a serious programming team is not really the goal.

Now, there are community colleges, but there is a 'quality' problem: First, the better students tend to go to universities. Second, where is a community college going to get someone who is a right-up-to-date software team leader to teach and, also, do a really good job of it - preparing the course materials and giving individual attention to the students?

Net, at least in the US, it's long been the case in computing that people mostly or even entirely have to be self-taught. Heck, I've taught computing to undergraduates at Georgetown and to graduate students at Ohio State, but I essentially never took a course in computing and, instead, was self-taught, and before such teaching I had a good career going in industry.

Self-taught's largely where it's at. Sorry 'bout that!

Solution? If you can think of a topic that needs some good teaching, then get smart on the topic, write a book, sell it in some form, maybe Kindle, develop some lectures and put them on YouTube, have a blog, etc.


The good stuff is on the shelves of the research libraries, and it's not going anywhere.

If someone does a lot of good research early in their career and publishes it, then they can get funded to continue. In the US, there are billions a year for research from NSF, NIH, DoE, etc.

For teaching, did I mention that the good stuff is still on the shelves of the research libraries and is not going anywhere?


I read everything he said and then read it again substituting "law school" and "law" where needed and it still worked.


Look at the URL in the picture at the top. He did a search before doing the autocomplete.

This changes the results.

Try it. If you don't do the search for what you want to autocomplete, you will not get the same autocomplete options, in this particular case.

So he's basically posting a picture of doctored autocomplete options. If he just went to google.com and typed that in the search box, he would not get that dropdown.


I've been doing a lot of analysis of organizational decline over the past month, and what characterizes R&D (including academia) is extreme convexity: http://michaelochurch.wordpress.com/2013/03/14/gervais-macle...

Convexity pertains to the shape of the curve for output (profits, value added, impact) per input (investment, skill, effort, and just plain luck). It pertains to desirable risk behavior (convex => seek risk; concave => limit it). Most of the "fun" work (arts, sciences) is convex. It's creative and difficult and you usually don't get directly paid, but when you have a hit, it's Big. The problem with convex work is that everyday people can't handle that kind of income variability. Institutions can, and if they're working properly, they buy that risk.
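
A minimal numeric sketch of that "convex => seek risk" point, assuming a purely hypothetical cubic payoff curve: by Jensen's inequality, a high-variance strategy with the same average input yields a higher expected output than a steady one.

    import random

    # Hypothetical convex payoff: output grows much faster than input.
    def payoff(x):
        return x ** 3

    random.seed(0)
    n = 100000
    steady = [1.0] * n                                       # same effort every time
    risky = [random.choice([0.0, 2.0]) for _ in range(n)]    # same mean input, high variance

    print(sum(payoff(x) for x in steady) / n)   # 1.0
    print(sum(payoff(x) for x in risky) / n)    # ~4.0: risk wins under a convex curve

With a concave payoff the inequality flips, which is why risk-limiting makes sense there.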

Some areas of work are so convex that only altruistic financing (public or academic funding without repayment expectations, long-term implicit autonomy) is tenable. Science and most of what academia does will return value to society, with interest, but the convexity puts such a time gap between the creation and capture of value that an institution required to capture the value it generates (e.g. a for-profit corporation) wouldn't survive at it. You need implicit trust and autonomy for that.

I like the idea of academia and like the idea of saving it, but it'll be hard to do. The system is now generationally broken and it will take heroic efforts to heal it.

When you start studying institutional decline as I have, you learn a few things. First, institutions are all about moving risk-- finance, in other words, although sometimes of a more abstract kind than what happens on Wall Street. There's nothing wrong with that. Risk transfers are great. The professor gets stable financial mediocrity (which most people would accept, even me; I've met the $2M/year Wall Street crowd and they're just as unhappy as anyone else) while doing exciting work, and doesn't have to worry about capturing the value. However, the second thing you learn is that the MacLeod cartoon (Losers, Clueless, Sociopaths) is the truth (a Loser/Sociopath risk trade) and The Bad Guys really do exist. They turn what were once fair risk transfers into "heads, I win; tails, you lose" affairs. That's what academia is, these days. Professors no longer get the autonomy (freedom from market risk) they were promised until their most productive years (due to the unsustainable nature of what it takes to get tenure, and midlife burnout) are behind them. Instead, post-1980 they have a great deal more career risk than they should have, given the obvious convex value of what they (as a group) achieve. They work really hard for many years for someone else's benefit, and most get tossed aside at the end of it. Heads, I win. Tails, you lose.

Obviously, the "trickle-down economics" 1980s were horrible and started this looting and tearing down. The demolition of the academic job market started then and hasn't stopped. However, looking into it objectively, there is one thing that professors did that ruined their game (for most of them). They began, as a culture, to devalue teaching.

It's not the fault of the people who are 28 now and trying to get professorships. They weren't even alive when it started. But the Baby Boomers created an academic culture in which research (often esoteric or inaccessible to outsiders) was "the real work" and teaching was just commodity grunt work, to be tossed aside to $15/hour TAs and otherwise scaled back (200+ student classes). This "teaching is a commodity" attitude led to a greater society (unable to see the value of research) hitting back with, "then why the fuck are we paying so many of you?" Now academia is dying. When you have uneducated, right-wing idiot state senators who didn't feel like their professors gave a shit, they repay the favor in 20 years by cutting funding for 20 years.

The moral lesson (and it applies to us as programmers, too) is that teaching (for us, documentation and outreach) isn't commodity grunt work. It's vital. It's often where most of the value is added. If you blow teaching off, the world will lose interest in you and pull investment. I would say "you deserve it" but, in the academic sphere, it's a different (younger) set of people getting whacked for it.

Sadly, that karma was slow to act on academia. It was Baby Boomer careerist narcissists who copped that "fuck teaching" attitude, and Gen-X/Millennials who got the shaft... like so much else in society.


This is silly. It's purely a historical fluke that the role of professor (scientist/teacher) even exists.

There is no reason whatsoever why the same person should both create new knowledge and also teach calculus/cs 101/etc. I was good at research, but I sucked at teaching [1]. In contrast, many people are really terrible researchers, but actually pretty good teachers. (As you note, there is a glut of such people, so their wages are low.)

Separating these roles is a good thing.

[1] Due to a lack of standardized tests at the college level, I have no way to actually know this. But I do know my students didn't like me, complaining that I was too hard and unsympathetic.


My ideal world: Basic courses should be taught by professional teachers. Intermediate courses should be taught by practitioners who also have good teaching skills. So a junior-year software engineering course might be taught by a Google programmer who was good at teaching and was taking a one-year sabbatical from Google to teach. Or the course might be taught by a programming researcher who was good at teaching. Master-level courses should basically be the student doing an apprenticeship under an existing master practitioner. So someone wanting to be a computer science researcher would apprentice with an existing researcher. Someone wanting to be an engineer at Google would do an apprenticeship at Google.


It's hardly a fluke that the creation and dispersion of knowledge are tied together. The reason research universities exist is that undergrads were willing to pay money to learn from people at the forefront of knowledge. The problem isn't having (most of) those researchers teaching; it's that our society now wants to send 50% of people to college rather than the historical norm of order 1%. The researchers are spread too thin, and the mixed bag of lecturers must necessarily pick up the slack.

If you correct for the fact that the best researchers have zero incentive to teach (i.e. by comparing only those researchers and lecturers who are internally motivated to teach regardless of incentives), the researchers are way better on average.

There's a place for great researchers who suck at teaching or are so great at research that teaching is a waste, e.g. the institute for advanced study. (And frankly, most of the researchers at the institute should probably be teaching the best grad students.) But, as someone who was a grad student after going to a fancy undergrad school, I think it's clear that having most researchers teach is a fantastic Good Thing.


I don't think it is a fluke; it was just a tactic to extract a guaranteed baseline of real value from intellectuals, so they could jump from research project to research project for years without a return.

Which is perhaps the greatest bane of capitalism - that which has poor returns in the immediate time frame is always passed over for instant results. It hurts our long run dearly to be so inefficient.


"There is no reason whatsoever why the same person should both create new knowledge and also teach calculus/cs 101/etc"

How about that the person using the tool every day in the course of their work is probably going to be the person best placed to teach others how to use it? Obviously teaching is a skill and craft that has to be taught, but given two people of equal teaching ability, the person using the tool every day should be the better choice, surely?


Hmm, I think the problem is slightly different.

The job of a tenured professor in 1980 was pretty sweet. Upper middle class salary, great job security, a soap box from which to influence impressionable minds, freedom to pursue intellectual interests, etc. Naturally, as with any career track that looks pretty sweet, people rush to enter the field. Thus the great numbers of grad students.

There are basically three ways a guild profession can react to an influx of new demand: a) lower the salary and benefits of the master guildsmen to make the profession less desirable and to allow more overall hiring, until supply and demand balance out; b) make the entrance requirements more and more onerous in order to restrict entry; c) use a lottery system.

Option a) is out of the question because it hurts the powerful senior professors who run the existing system.

Option b) is the path the universities took. Earning tenure requires more years in grad school, more years as a junior professor, and more risk of never reaching the goal. And the actual research requires mastering extremely esoteric readings and/or mathematical techniques that have virtually no use for actually doing productive research. But requiring mastery of these techniques simply became the default filter. When you have too many grad students, the default is to make the tests harder until you have sufficiently reduced the number who can pass. They then become the new professors, and thus have an inordinate love and respect for the esoteric test material that allowed them to get their current job. The result is the death spiral you describe - burnt-out professors who are out of touch with reality.

Option c), a lottery, is the route academia probably should have taken. Set some sort of bar for admittance into grad school - great GREs, great grades, and great recommendations - and then simply lottery off the spots. Keep the spots few enough so that there is a clear path to a tenured professorship.

But the real underlying problem is that academia is not accountable. Academia is suffering from the problems that plague any self-selected priesthood - over time its practices grow more esoteric and out of touch with reality. There are a few good departments out there, but most departments neither do good teaching nor good research.

Unfortunately, the bodies that have the lawful right to regulate academia (Congress and state legislatures) are too dumb and too fickle to do the job well. What "reform" happens takes the form of temper tantrums that do little to fix the system.

In the end, I think fixing academia is thus a "regime change complete" problem - ie. not possible in the American political system as it currently exists.


I think that, in a sense, a) would have been the right solution here.

While I'm not sure this is the case in all fields, at least in mine, when you apply for grants, people care too much about how famous you already are and what you have published. People don't care about how many grants you already have, or how much money you burned through to produce your previous publications.

This is bad.

First, it creates the illusion that there are more scientists out there than can be sustained by grant funding, when in reality the problem isn't lack of funding, but the fact that the funding gets concentrated in the hands of the highly successful instead of being evenly distributed to the highly and moderately successful. Thus, pursuing a career in science becomes risky because you might not be able to get a job (as GP stated), and so people decide they don't want to do science.

Second, it's inefficient. A professor at my institution employs 50 people and produces about 1 paper per year. 15 moderately successful professors with 2 people working under each of them could do much better than that. But he has a Nobel prize and they don't.

Finally, it's horrible for the people in these labs. This is where accountability comes in. A professor with a small lab is accountable for the success of each of his students/postdocs. With larger labs, professors can allow students to fail, because their success is no longer necessary for success of the labs. Moreover, professors run out of ideas and waste their trainees' time. (I have heard rumors that a different Nobel prize winning scientist hired multiple people to work on the same project, not as collaborators, but as competitors. The person who won got a high-impact publication. The person who lost got nothing.)

There's no obvious solution here, since science can't be funded completely algorithmically, but to start I think that funding agencies should be requiring professors to disclose both the number of personnel they employ and their grant history when they apply for further grants, and take this into account during the review process.


A quick comment about point 2: A lot of academia is about quantity over quality these days. One high-impact, well-designed paper with strong conclusions could very well be worth more than 50 weaker papers. Unfortunately these days, many people, especially those outside of the field, may perceive the 50 weaker-paper lab(s) to be more "efficient" than the 1 strong-paper lab.


> One high-impact, well-designed paper with strong conclusions could very well be worth more than 50 weaker papers.

This is true. Assessing impact is difficult, and journal impact factor is not a great proxy. However, at least in my field, most Nature and Science papers aren't worth 50 second- or even third-tier papers.


In my field (chemistry), plenty of people utterly loathe high-impact-factor journal papers. They do something that looks impressive, but are a useless source of detail on _how_ it was actually accomplished, since more often than not they omit - not intentionally, really - something which turns out to be very important to the process.

Several simpler papers not trying to be as dramatic are going to be far more insightful since you get the descriptions of what was found _not_ to work about the method they were trying to replicate. Weaker papers - to me - are gold, since they're what you publish when you were trying to copy something and it just doesn't work as well as is being claimed.


Yes, the risk-taking strategy of working towards a potentially high-impact publication is not worth the reward (as measured by the citation rates of low-impact versus high-impact articles) - see Foster et al., "Tradition and Innovation in Scientists' Research Strategies": http://arxiv.org/pdf/1302.6906.pdf


Who exactly decides how to allocate these grants? Professional federal government bureaucrats? Fellow professors in the field? Do you have any insight on why they choose to allocate the grants in such an inefficient way? Is it the classic "no one ever gets blamed for buying the name brand" dynamic (where the famous professor is the "name" brand)? Or is it more sinister insider dealing and back-scratching?


I've never participated in the grant process, so I don't have the full information here, but I do know that grants are scored by other professors in the field.

Part of the problem is that the grants are numerically scored, and this scoring is primarily based on the merit of the proposal and the publication record of the lab. The lab budget isn't taken into account, and no one actually checks to make sure the project hasn't already been funded (see http://www.nytimes.com/2013/02/05/science/study-suggests-dup...). There is additional funding for young investigators, but otherwise the funding agencies treat the process like they are funding a project, when in reality they are also funding a lab. The assessment is of scientific impact, not scientific impact per dollar.

The "no one ever gets blamed for buying the name brand" dynamic is probably a factor, but I doubt it's a very big one. If you win a Nobel, it seems that you're guaranteed grants for life, but otherwise, your research still gets defunded if you don't produce anything for a while, even if you were once at the forefront of your field. People are probably more likely to vote to give grants to their friends, but I don't think there's systematic manipulation.


Interesting, thank you for sharing.

I think it's a major problem that there is no external accountability. The grants are scored by peer professors, based on publications (which again require approval by peers). This creates a bunch of problems: 1) it's very easy for an entire field or subfield to go down the wrong track, with no way of getting out of it, because the professors controlling grants and publications are the ones doing the bad research; 2) there is a bias towards publishing research that shows some new finding, rather than research showing that some previous result could not be reproduced. The entire idea of basing science on peer review is erroneous IMO; science should advance via replication.

In my ideal world, research would be funded in three independent streams: 1) Prize-based. The government would allocate a yearly budget of some very large amount of money - say $30 billion. It would create competitions and prizes in all sorts of fields. Example contests might include: a self-driving car race, a robot soccer olympics, longest battery life for $x of total materials used to make the battery, highest yield of a plant strain per acre, etc. Teams would receive money for their research in return for placing or winning in the contests, and for achieving objective milestones.

2) Commission-based. The government needs some new technology, maybe an upgrade of stealth tech for its fighter planes, so it pays a research institute to develop that tech. Unlike grant-based funding, the commission is granted by the outsiders who need the tech, not by professors in the field giving it to fellow professors.

3) Open-ended institutional funding. A certain number of reputable institutions would simply be given permanent annual funding. No grant process, no paperwork, no bureaucracy. The researchers would be overseen by an executive director, and the executive director would be overseen by a board of directors. The board would consist of outside experts with a proven track record; for example, the board of a computer science research university might have people like Google's Urs Hölzle or Guido or Trevor Blackwell. The researchers would not have to apply for grants. They would just have to convince the board that they were generally working on projects with potential benefit to humanity, or else they would get fired. Every institution would be actually independent, governed by the personal judgement of that institution's board of directors.


Addressing #1: Most academic research the government funds is basic science in expert domains, e.g. intercellular communication or neural firing patterns, and much of this basic science is exploratory. For most projects, it would be difficult for everyone to agree on the goal and metrics of a competition. Sure, this might work for high-profile projects, e.g. mapping some organism's genome or connectome, or for applied projects like your examples, but prize-based competitions would be difficult to implement on a large scale. On another note, who would judge these competitions? For most domain-specific research, it would be peers in those domains -- so not much would change.

Addressing #2: This already exists. The government does contract out many tech projects, e.g. to Lockheed Martin or Boeing. Life science labs I've been associated with have also received contracts from the NIH (as opposed to grants). While these aren't commission-based, these contracts are probably what you have in mind.

Addressing #3: This also already happens to a degree, typically with public research universities. Especially when a new center opens, the state and federal governments will give some minimum funds for the center or institute to keep the lights on. But no funding is ever "permanent", especially when you have politicians slashing research budgets. Finally, it's interesting that you mention these institutions would be more or less under the complete control of an executive director or a board of directors. The article mentions that having individuals with such power over a field (as these executives would have) stifles change, and I agree. It would be too optimistic to believe that these executives would not pick and choose projects to their liking, which is pretty much what currently happens. Plus, not having to submit grants virtually guarantees less accountability.


#1: I agree that prize-based research only works in certain areas. That is why I would make it only one of the three ways to fund research. I think any competition should be judged by outsiders; if a competition cannot be accurately judged by outsiders, then it is not a good competition. For instance, a self-driving car race can be judged by outsiders.

For #3, the accountability comes from the board of directors. Right now the highest-ranking professors are basically accountable to no one. If a field starts to stagnate, gets too esoteric, or closes off to new ideas, there is no one with the power to get it unstuck. The top professors have no incentive to get it unstuck, because they are the most successful in the status quo; their status is directly linked to the current direction of the field. An independent board of directors has no such perverse incentive. Also, each institution would be actually independent, unlike the grant system, in which grants are centrally allocated by a network of the existing professors. So if one institution goes in the wrong direction, another institution might try to make its name by exploring a novel area of research.


> Option b) is the path the universities took

You're sort of right, but we aren't deliberately weeding out people.

This year, I was on the hiring committee for a tenure-track mathematics hire in my academic department. The job market is extraordinarily competitive this year, and sucks rocks for candidates. It follows that we were able to hire somebody truly outstanding.


My question is: what did you pay him? If the job market is really such a buyer's market, then instead of hiring one guy at $100k/yr, why not hire two at $50k/yr?


I'm not sure - probably 75-80k. For the most part, pay is very inelastic.

Whether fairly or not, I anticipate that any university attempting to pay below 50% of the going rate would quickly get a very bad reputation. I would probably take a 50% pay cut to work at my dream institution, but I'm definitely in the minority (and I also don't have kids).


You might assume the total cost of those two options is equal, but when you look at the overhead and additional compensation (insurance, retirement, etc.) for each position, it might actually be cheaper to hire one at 100K than two at 50K.
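
A back-of-the-envelope sketch of how that can happen - the overhead figures here are pure assumptions, just to show the shape of the calculation:

    # Toy cost model: each position carries a flat per-head overhead
    # (insurance, admin, office space) plus a percentage load on salary.
    # Both figures below are assumptions for illustration only.
    FIXED_PER_HEAD = 15_000   # assumed flat annual cost per position
    LOAD_RATE = 0.25          # assumed fringe/benefits rate on salary

    def total_cost(salaries):
        """Total employer cost for a list of annual salaries."""
        return sum(s * (1 + LOAD_RATE) + FIXED_PER_HEAD for s in salaries)

    print(total_cost([100_000]))         # one hire at 100K -> 140000.0
    print(total_cost([50_000, 50_000]))  # two hires at 50K -> 155000.0

Under those assumptions, the two half-salary hires come out roughly 10% more expensive, simply because the per-head costs don't scale with salary.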


> it might actually be cheaper to hire 1 at 100K rather than 2 at 50K

You're totally missing the point. The GP was just giving an example. Adjust the numbers however you want so that the latter option is cheaper.


There is a bigger issue at work here. You can pay the one professor 100K and have a slightly more relaxed professor who is likely to be more productive, or hire two at 50K who end up spending a large amount of their time looking for better opportunities, worrying about money, losing self-esteem when comparing themselves to other professors making that 100K, etc. There is a threshold below which you are just throwing away money.


So what you're saying is that this is an overfitting problem?


> (due to the unsustainable nature of what it takes to get tenure, and midlife burnout)

I'm in academia right now, and this is indeed a problem. I generally only meet two kinds of academics: the perpetually tired (like myself) and the perpetually energetic. This may have something to do with the way our bodies metabolize the utterly ubiquitous stimulants (i.e. loads and loads of caffeine) we take to cope with our relatively long working hours (it's not unusual, for myself as a grad student or for many professors I've met, to start the day at 9:30am, finally leave for home at 7:30pm, and then still be answering emails and doing work at home until it's time to sleep).

I'm perpetually tired because I keep developing a tolerance for the stuff every time I find a stronger coffee, and the habit of Israeli institutions is to start early in the morning, so I keep having to override my natural sleep rhythm with coffee -- thus requiring me to find ever-stronger caffeine sources even though what I really need is to just wake up somewhere from 8:30-9am and have one cup of black tea.

> The moral lesson (and it applies to us as programmers, too) is that teaching (for us, documentation and outreach) isn't commodity grunt work. It's vital. It's often where most of the value is added. If you blow teaching off, the world will lose interest in you and pull investment. I would say "you deserve it" but, in the academic sphere, it's a different (younger) set of people getting whacked for it.

Hmmm.... I semi-object. It's not that teaching is unimportant, but that the whole dichotomy is really a spectrum. There's undergraduate-level teaching, graduate-level teaching, colleague-level teaching, documented research (ie: publishing), and your own research notes. The institutional problem is that you need a full spectrum of different methods and expectations for each role.

It's easy to say that we need to emphasize research less and undergrad teaching more, but what's really happened is that an originally unified role (the Full Professor of teaching and research) has been pulled apart, as the research frontier - and even the grad-level material taught to MSc and PhD students - moves steadily further away from what the undergraduate curriculum can cover.

IMNSHO, we need to have a lot more respect for teachers, but we also just need to eliminate middle school. Ok, now that I have your attention: the skill level required to do Real Work, in both industry and research, has gone up, so we need to find ways to stop wasting our youth's time on redundant so-called education they neither want nor need, and to get them properly specialized and skilled once they show some talent and direction, instead of just expecting to pump out Yet Another 1000 Engineering Grads.


By "middle school" do you mean the US 6th-8th grade (more or less)? Because in "middle school" I wanted to become a cartoonist, not the computer programmer than I ended up being. I personally think that specializing in middle school is way too early.


The point wouldn't necessarily be to start specializing that early, but that those years are simply unnecessary bullshit. Nobody I know remembers them being anything but a repetition of elementary school and a first introduction to the material of high school.

So it's probably easiest to just eliminate it, move kids directly on to high school or a vocational program, and graduate them three years earlier into higher ed, apprenticeships, or first jobs.


Taking your arguments, though, is it clear that the lesson is that teaching cannot be commoditized? Isn't it the lack of understanding and valuing of the research component that makes them see professors as adding little value? Programmers in most tech businesses do provide clear value through their work. So if anything, the lesson seems to be that it's not enough to do something valuable - you also have to make sure the stakeholders understand and value it.

And in this regard I do agree we have a huge disconnect between modern research and people. To some degree this is because of the increasing complexity of the frontiers of research, but I also feel we no longer have the public figures in the research community who act as bridges and translators between academia and the public.




