
I think in many ways, in terms of exploring scientific and engineering realms, we're more conservative now. Personally, I'm doing a PhD in computer science, and I find it incredibly frustrating. Publishing papers is all about minute incremental improvements, and you absolutely must have numerical results to prove that you beat the competition. You can't publish negative results either; that kind of work is considered worthless.

I'm in compilers and programming languages. I really wanted to create my own language as part of my PhD thesis, but I was basically told that this would be unpublishable. I mean, how can you hope to numerically show that your language is better than everything else around? Plus, it's all been done already; nothing new can possibly be invented in that realm.

Things weren't always like this. In the 1970s, we created things like Smalltalk, ML and LISP, which had a tremendous impact on the programming world. People also had bold ideas about artificial intelligence and nuclear-powered spaceships. In the 70s, people were allowed to just explore ideas, in the hopes that these ideas would lead to something good (and some did). Now it's much harder: you bring up an idea and people immediately try to shoot it down, ask you for proof that it will definitely work, and bring up the most asinine suggestions as to why your idea will definitely fail.

Today, the exploration has been scaled down. It's not because the exploration failed (we invented many great things as a result of it); it's largely, IMO, because we live in different economic times. The USA is no longer in an economic boom; things are no longer in expansion. There are cuts to scientific funding, cuts to education. People are being told not to be "wasteful". We live in a much more nearsighted world, in a sense. Being a dreamer isn't considered a virtue.



I've mentioned this a bunch already on HN, but I recommend reading "The Structure of Scientific Revolutions" by Thomas Kuhn. The book was published before most people had even heard the phrase "computer science," but I think that computer science still follows some of the patterns he talks about.

In Kuhn's view, "normal science" is mostly iterative and incremental. It is so because most people in the discipline agree on most of the big issues, and people are mostly refining those understandings. The periods where people don't agree on the big issues in a discipline are around the times of scientific revolutions: the solutions to such big issues are so different from previous approaches that accepting them requires a complete re-think of what the discipline is.

A lot of areas of CS are in the "normal science" part of that cycle, and I think compilers and languages are in there. (The biggest argument against that is concurrency and parallelism.) In the 60s and 70s, programming languages were new, and they changed computer science forever. We were exploring what these things could be.

I also recommend Cristina Videira Lopes's blog post, "The Evolution of CS Papers": http://tagide.com/blog/2014/02/the-evolution-of-cs-papers/


People are still creating new languages in academic contexts now -- including ones which go on to have influence outside of the academy. Likewise, I'm sure that even in the 1970s, someone's thesis advisor told them it had all been done.

You're trying to compare different times, but it's getting confused because you're actually comparing one incident now to an aggregate impression of the "1970s" (though, from the 3 specific examples cited, the "1970s" you are talking about are really something like 1958-1973).


To be fair, were any of Smalltalk, ML or LISP created as part of someone's thesis? I thought they were all academic (or academic/commercial) projects, but developed by academics at a more senior stage in their careers.

I'm pretty sure I've seen some recent theses that did describe the creation of languages, too. Though they had less of a theoretical computer science bent, and more of a focus on saying "this is a language for X problem domain".


Why do you want to create your own language? People create languages for many different purposes. What would be original and interesting about this language?

I am inclined to agree with your larger point that we have gotten too conservative, but I also think we are awash in programming languages. Before I would encourage someone to create a new language, I would want to see a good argument that none of the existing ones would suffice for some interesting purpose.


Implementing a new language was never as exciting a deal as many think. New languages happen more because of need than because of properties of the language itself. For example, C was created because a language with those features was necessary for the UNIX project. Similarly for Smalltalk and the environment at PARC. Lisp is a little different because it started as theory that was eventually implemented.


The modern language that comes to mind, is of course, Haskell. And "in between" we have Standard ML. And I think one could argue that the Dylan revival project and Rust are various extensions on the idea of invent(academically)-implement(pragmatically). And the Racket ball is still rolling, of course.

So, I don't think it's right to say that things are all different now. It might be that the maturation of computer science into a "science" field feels a bit like it's taking the joy out of things. But I think if you look at stuff that was published earlier, there's a divide between rather conservative work founded in logic and discrete mathematics, and more exploratory work in what would now be considered "computer science". I'm not convinced all of that would really count as "published research", though (as in qualifying for a Ph.D., etc.).

It's not like you'd be able to publish a study in medicine on the benefit of washing your hands before you deliver a baby, after you've done an autopsy -- we've already figured out a lot of the elegantly simple stuff.


I do think there is a big difference in academic computer science now and then. Rob Pike has written about it from an operating systems perspective [1], and Richard Gabriel from a programming language perspective [2]. It seems that until the early 1990s, there were lots of projects focused on building "systems", i.e. big pieces of software which are in themselves practically useful; then there is a sharp shift, and academic research focuses on "theory" (in PL research, e.g. a type safety proof for small core calculi, or a particular algorithm for program analysis).

People have always pursued "small" ideas. But I think the fact that we stopped writing "big" systems is a real change. (I guess the reason is that off-the-shelf operating systems and programming languages gradually got better, until it became impossible to "compete" with them. Cf. the Lua language, which got written basically because "western" languages were not easily available at the time.) My impression is that there is a lot less diversity of ideas now than there used to be, because everyone is incrementally improving the same set of OSs/languages.

[1] http://herpolhode.com/rob/utah2000.pdf [2] https://www.dreamsongs.com/Files/Incommensurability.pdf (see the section starting on page 10)


I'm not sure it's quite so clear cut; don't forget about http://vpri.org/ for example. Or the OLPC (One Laptop per Child) with assorted projects. Or the work on various unikernels on top of Xen (like MirageOS). Or Minix3. Or LivelyKernel (http://www.lively-kernel.org/).

Or perhaps even Dragonfly BSD.

I'm not necessarily disagreeing with you; some of the links above might even support your point. I'm just not sure we've "stopped" with big, complete systems -- but the field as a whole has gotten much bigger, and there's only so much hype to go around...


>all about minute incremental improvements

...how much of that is due to the fact that CS is now much more of a mature field compared to 44 years ago? It seems like the fields of synthetic biology, biological and chemical computers, and self-replicating nanobots (hey, von Neumann again) would be more open to grand-scale ideas, and there are lots of exciting opportunities that haven't been explored.

http://en.wikipedia.org/wiki/Synthetic_biology

http://en.wikipedia.org/wiki/Biocomputer

http://phys.org/news/2014-01-slime-molds.html

http://en.wikipedia.org/wiki/Chemical_computer

http://en.wikipedia.org/wiki/Self-replicating_machine#von_Ne...


Lisp was defined in the 50s and implemented in the 60s.

https://en.wikipedia.org/wiki/Lisp_(programming_language)


True, however many other Lisps were developed in the 1970s (e.g. Maclisp), including Scheme, which came in 1975.


Not disagreeing with you necessarily, but the 70s were hardly an economic boom time in the USA either.


And von Neumann lived mostly through war and the financial crisis of the '30s. I honestly think it's more a question of (lack of) will/confidence on the part of academia. I bet if he had really good ideas about new languages and really wanted to dig deep on that, he would be able to do so without starving.

Maryam Mirzakhani (this year's female Fields Medalist) commented that she's quite a slow thinker, and left a clear impression that it's more a matter of asking important questions and actually trying to solve them with perseverance. It's just far more secure to chase the low-hanging fruit.

http://www.simonsfoundation.org/quanta/20140812-a-tenacious-...

" Another notable and enviable trait of von Neumann's was his mathematical courage. If, in the middle of a search for a counterexample, an infinite series came up, with a lot of exponentials that had quadratic exponents, many mathematicians would start with a clean sheet of paper and look for another counterexample. Not Johnny! When that happened to him, he cheerfully said: "Oh, yes, a thetafunction...", and plowed ahead with the mountainous computations. He wasn't afraid of anything."

That's an inspiration!


My understanding is that during the war, the American government was throwing money around liberally, trying to do everything it could to outwit and outdo the enemy. This is part of what caused the post-war economic boom: money being distributed and more people getting a chance to prosper.


My understanding is that after the war, the rest of the world needed to buy stuff, but the U.S. was the only major country with an intact manufacturing base. So that's why the U.S. made 30.5% of the world's manufactured goods exports in 1948, compared to 15.3% in 1938. See table XXIII on page 52 of:

http://unstats.un.org/unsd/trade/imts/Historical%20data%2019...


Makes one wonder what the NSA have been doing since the 80s. They're supposed to be the biggest employer of mathematicians, and since the 90s they've had their very own "war budget". Maybe in 2050 we'll say -- oh wow, look at all the great stuff that came out of the Islamophobia-fuelled war on the Middle East that the US spearheaded. Shame about the future ruined for half a billion people, but boy did they come up with some crazy stuff at the NSA!


> Makes one wonder what the NSA have been doing since the 80s. They're supposed to be the biggest employer of mathematicians, and since the 90s they've had their very own "war budget".

I wonder. Does anyone have recent sources indicating the NSA really might be the biggest employer of mathematicians? I'd also point to the Snowden and other leaks and attacks like Stuxnet - so far, everything revealed has been fairly humdrum in the sense that they are more or less what you'd expect if you threw a few billion dollars at known vulnerabilities in the Internet and current OSes. Portmapping entire countries' computers may be impressive in some respects, but not in the sense of beyond-cutting-edge cryptography/mathematics.


I'd say the recent revelation/allegation, from the recent Wired interview with Snowden, that the NSA accidentally took down Syria's Internet connection, and the general "vibe" of mismanagement that I get from Snowden's and Binney's accounts, don't have to imply that the NSA doesn't also do interesting work beyond their practical attacks on global infrastructure. But if they actually do interesting stuff (say, working on using multiple satellites to photograph the same area in order to extrapolate to higher resolution images than what is possible from a single lens due to atmospheric effects -- or perhaps quantum or DNA/RNA computing, etc.), it is clear that whatever they might be doing seems to be ridiculously highly classified, just like the sibling comment's example with RSA and GCHQ.


While the NSA has a huge budget, it's not infinite, and what gets one promoted and what analysts are boasting about in their slides gives you a definite idea of what the organization values. Do any of the leaks sound like there might be huge contingents of mathematicians off in a side corridor making breakthroughs? Not really.


We're more likely to say "what a huge burden the NSA was!". Major crypto breakthroughs created there (in either breaking something or creating something) would be kept secret for years, so it just bogs down innovation. GCHQ infamously invented public key crypto, but no one was allowed to show it to outsiders for comment, so it just died there until it was reinvented a few years later.


OTOH, the times when Lisp (1958), Smalltalk (1969), and ML (1973 is what Wikipedia has for "first appeared in", but unlike Lisp and Smalltalk I can't find information on definition vs. release) were first defined all precede the 1973 recession (except the last, which is on the cusp), and the preceding period actually was an economic boom time.


Just move to open source. There you are judged by your peers rather than by numerical results.



