You're preaching to the choir. Looks like NEC funded this.
NVidia seems to be the preferred hardware for institutions/big companies. I'm not sure whether this is because NVidia's architecture is better suited to supercomputers or because they're simply better at marketing to those kinds of customers.
NVidia funds a lot of academics in my space, and I've found academia to be very anti-open source for those reasons, which amuses me greatly.
Case in point: Matlab. Why is this taught in a world with Python/Numpy/Matplotlib?
Matlab seems like inertia/culture to me: it's the longtime de facto standard in engineering. Since it's what everyone uses, it has packages for everything, and papers often come with prototype Matlab implementations. Roughly the cultural position R holds in statistics. Matlab's hold on engineering is also bolstered by its widespread use in industry: students want to learn it because it's what their future employers use, and professors/research scientists like to use it because it's what their industrial collaborators use.
In my area of CS (artificial intelligence) it seems considerably less popular. I don't really remember how to use it, since the last time I used it seriously was in some engineering (but not CS) courses in undergrad.
In addition, MathWorks has so far managed not to screw up too badly and is keeping Matlab up to date. (They are definitely quite nice as an employer.)
Could be; we don't use Matlab much in my own research area, so a recent shift could have flown under my radar. When I've occasionally had contact with engineers in industry, though, Matlab still seemed to be everywhere. The two most recent examples were someone doing DSP and someone doing mechanical engineering, and both had all their stuff built on top of Matlab+Simulink.
Matlab gives you control over numerical precision, and many algorithms produce their best results when run on that platform.
In the world of electrical engineering, Matlab can do things that other packages can't.
From personal experience: I spent many hours looking at the results of an atan2 function in C++ and in Matlab, trying to get them to agree. After a day of work I managed it by precisely controlling the rounding modes and using my own atan2 function. It was not fun, and I would rather pay somebody $1k to take care of it for me.
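For flavor, here's a minimal sketch of the kind of agreement check this involves, written in Python for brevity. naive_atan2 is an illustrative stand-in for a hand-rolled implementation, not the actual function described above, and the rounding-mode control itself (C's fesetround and friends) isn't shown:

    import math  # needs Python 3.9+ for math.ulp

    def naive_atan2(y: float, x: float) -> float:
        """Textbook atan2 built from atan, with the usual quadrant fix-up."""
        if x > 0:
            return math.atan(y / x)
        if x < 0:
            return math.atan(y / x) + math.copysign(math.pi, y)
        return math.copysign(math.pi / 2, y)  # on the x == 0 axis

    def ulps_apart(a: float, b: float) -> float:
        """Approximate distance between a and b in units in the last place."""
        return abs(a - b) / math.ulp(max(abs(a), abs(b)))

    # Scan a grid of inputs and report the worst disagreement with the
    # platform's atan2 (standing in here for the C++/Matlab pair above).
    worst = max(
        ulps_apart(math.atan2(i / 7, j / 7), naive_atan2(i / 7, j / 7))
        for i in range(-50, 51) for j in range(-50, 51) if (i, j) != (0, 0)
    )
    print(f"worst disagreement: {worst:.1f} ULPs")

Even a textbook-correct reimplementation typically lands a few ULPs away from the platform's libm, which is why bit-exact agreement between two environments is a day of work rather than a diff.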
I'm saying there is a systematic advantage to using proprietary technologies in academic research (companies have money, so you can write a grant and they will pay you $). Case in point: look at apps coming out of academia and you'll see a lot of Windows Phone. That's because Microsoft gives away a ton of free phones (I have one on my desk at this moment) and Azure time.
Ok, that tracks better than it being something nVidia in particular did.
I don't see a serious problem with the scenario you describe, though. You're not really describing a hostile scenario, just an affinity for commercial software.
There is a bit of a problem of course; I find a lot of papers that describe how to do things with commercial technology that isn't in the budget. That hasn't been insurmountable for me in any way, but maybe others have had more serious problems with it.
The free alternatives are not so good for beginners. One of MATLAB's main strengths is the embedded editor + REPL, whereas the Python/Numpy/Matplotlib stack has just too many moving parts. MATLAB's environment can be emulated with an IPython notebook or Emacs, but I don't believe that is easy enough for beginners.
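To make "moving parts" concrete, here's a rough sketch of what a one-line Matlab plot becomes in the Python stack; nothing beyond stock Numpy/Matplotlib idioms is assumed:

    # The Matlab one-liner `plot(0:0.1:10, sin(0:0.1:10))`, redone with the
    # Python stack: two separate packages plus an explicit show() step.
    import numpy as np               # arrays; built into Matlab
    import matplotlib.pyplot as plt  # plotting; built into Matlab

    x = np.arange(0, 10.1, 0.1)      # like Matlab's 0:0.1:10
    plt.plot(x, np.sin(x))
    plt.show()                       # Matlab figures just appear; here you ask

None of it is hard, but each piece is a separate thing a beginner has to discover and install.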
Are you aware of Sage [1]? It's a Python-based, batteries-included, integrated maths system. It is actually more popular than Matlab around the lab here. Incidentally, we don't get much funding from corporations (though yes, we have licenses for Matlab, Maple and Mathematica for everyone; somehow Sage is still more popular).

[1] https://www.sagemath.org/