
I think it is now fairly generally appreciated that interior point methods constitute perhaps the most powerful class of tractable algorithms there is, yet remarkably they are hardly ever mentioned in undergrad CS algorithms classes. Usually there is a passing mention of linear programming to establish an auxiliary result like max-flow/min-cut duality, but that's it.


Yeah, but an undergrad class in algorithms is in most cases supposed to just give you a deeper view of traditional CS algorithms. There are quite a few topics which are barely skimmed (approximation algorithms, randomized algorithms). I believe that is OK: as long as most of the students are trained to think algorithmically, that should be fine. After all, most advanced stuff becomes much easier once you have the basic fundamentals clear.


All true, but the idea that a good fraction of the combinatorial algorithms undergrads see can be replaced (albeit less efficiently) by linear programming is important. It's not clear how the classical CS curriculum helps one appreciate this either. And after all, LP used to be considered an almost elementary topic before it mysteriously dropped out of the curriculum (perhaps something to do with linear algebra no longer being compulsory).
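
To make this concrete, here is a toy sketch (my own example, assuming scipy is available) of max flow posed directly as an LP; the LP relaxation happens to be exact for this problem:

    # Max flow on a tiny made-up graph, solved as a plain linear program.
    from scipy.optimize import linprog

    # Edges: 0: s->a (cap 3), 1: s->b (cap 2), 2: a->b (cap 1),
    #        3: a->t (cap 2), 4: b->t (cap 3)
    caps = [3, 2, 1, 2, 3]

    # Maximize flow into t (edges 3 and 4), i.e. minimize its negation.
    c = [0, 0, 0, -1, -1]

    # Flow conservation at the interior nodes a and b.
    A_eq = [
        [1, 0, -1, -1, 0],  # node a: flow in on s->a = flow out on a->b and a->t
        [0, 1, 1, 0, -1],   # node b: flow in on s->b and a->b = flow out on b->t
    ]
    b_eq = [0, 0]

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, u) for u in caps])
    print(-res.fun)  # 5.0: the max flow, matching the min cut {a->t, b->t}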


OR/CS is not the ultimate combination. If you want to do interesting work in this space you ideally need a background in all of optimization (ILP, convex NLP, stochastic optimization), control theory, certain areas of economics (general equilibrium, game theory, mechanism design, the concepts behind applied finance), statistics/econometrics (out-of-sample performance, hypothesis testing, causality, dealing with non-random samples) and probability (mainly stochastic processes).

OR itself contains a large number of applications that combine many of the above, e.g. network revenue management, but someone who has taken grad courses from the OR department alone would genuinely struggle to do anything significantly new or interesting.

People from computer science departments have also been gradually moving into these areas; witness the growth of machine learning, algorithmic game theory, etc.


I actually agree with you there. Since OR is so multi-disciplinary, it is paramount to have broad knowledge of all the fields it touches. Although I specialised in OR in my master's, my bachelor's was actually CS & Economics/Business, which has been of great value to me.

In terms of CS moving into machine learning and artificial intelligence, the focus tends to be on applications in the consumer sphere, e.g. analysing big data to understand consumers and make recommendations to them à la Amazon/Netflix, or image recognition for self-driving cars.

But these are mere sub-domains of OR. In business, what I think has the highest value and yet remains unexploited is the optimization branch of OR: the likes of production optimization, supply chain optimization, inventory optimization, facility location optimization, and my favorite, vehicle routing optimization.


Just for fun - "stochastic optimization" as in here: www2.isye.gatech.edu/~ashapiro/download.php?Down=book

is in fact identical to "empirical risk minimization" in Vapnik's statistical learning theory. As soon as you are not 100% sure about any of the numbers in your routing model, you find yourself in the same setting as that studied in "machine learning"!
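
In symbols (my notation, not anyone's book): the sample average approximation of a stochastic program and empirical risk minimization are the same construction,

    \min_{x \in X} \; \mathbb{E}_{\xi}[F(x, \xi)]
        \;\approx\; \min_{x \in X} \; \frac{1}{N} \sum_{i=1}^{N} F(x, \xi_i),

which is exactly ERM, \min_{f} \frac{1}{N} \sum_{i=1}^{N} L(f(z_i), y_i), with the decision x playing the role of the hypothesis f and the cost F playing the role of the loss L.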


Thanks for the link.


Do you know a reference that presents state estimation from this point of view? I painstakingly arrived at this myself, following somewhat sketchy remarks in Gilbert Strang's "Computational Science and Engineering".

The classical exposition is a perfect example of conflating the model and the algorithm, but it just refuses to die.
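
For what it's worth, the point of view I mean is roughly this (standard linear-Gaussian setup, my notation): the state estimate is the minimizer of a single least-squares objective,

    \min_{x_0, \dots, x_T} \; \sum_{k=0}^{T-1} \|x_{k+1} - A x_k\|_{Q^{-1}}^2
        \;+\; \sum_{k=0}^{T} \|y_k - C x_k\|_{R^{-1}}^2,

and the familiar Kalman recursions are just one efficient algorithm (a block-tridiagonal solver) for that model.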


All good advice - operator theory in various guises is ubiquitous in applications, but it is certainly not something many people discern even through grad school in CS.

Probably not much point trying to convince people on HN though - having perhaps suffered through an abysmal 'calculus' sequence, they are not very receptive to the message that we have barely tapped the practical consequences of 50-year-old mathematics.

You should perhaps do a standalone blog - even on Google Plus!


Also, to save much future grief, the undergraduate degree should be in maths.


If you are interested in learning this topic, do not go directly to the Koller/Friedman book - it certainly contains a lot of material, but its presentation is not well integrated (just look at the topic dependency graph at the beginning).

A much more cohesive introduction would be Michael Jordan's book draft that has been floating around for nearly a decade now. You can find some of the older versions online, e.g. http://www.cs.cmu.edu/~lebanon/pub/book

Go directly to chapter 5 to see how the language of PGMs can help to clarify a lot of standard material in stats and ML.


And for a nice overview of how factor graphs, when considered generally, really can capture arbitrary dependencies, I would check out "Extending factor graphs so as to unify directed and undirected graphical models": http://uai.sis.pitt.edu/papers/03/p257-frey.pdf


Was this (Jordan's) book ever published? If so, what is it called? He seems to have a couple of other books on graphical models, but the contents of those books don't line up with the one at the link above.


This book does not exist as a published volume, at least not in the way it is presented in this draft. This is why it tends to circulate as a draft, rather than as (say) an Amazon link.


Thank you. I spent some time looking for it. If anyone who has an up-to-date draft wants to send me a copy, my email is in my profile. Thanks in advance :).


As usual, an excellent summary. Economic models based on low-level data (essentially better "instrumentation": capturing bank transactions, some contracts, individual spending, etc.) might be quite useful, at least for short-term prediction. Perhaps the old ideas of "optimal control" can be realised to some extent.


Indeed, the best courses on SEE by quite a wide margin.


I took those via HCP (which is like SEE but for 3x the usual grad student rate) and he's a great teacher. He's not one to let his smarts get in the way of teaching, which is not true of all the professors at Stanford.


Certainly nothing beats a good book, but often a good lecturer can give some intuition or motivations that are not customarily written down. I think a great example of this would be the material on convex optimization released by Stephen Boyd (http://www.stanford.edu/class/ee364a/videos.html), also from Stanford - certainly his style is not as precise as e.g. Bertsekas's, but it is probably much better at helping students get a sense of how to actively work with those ideas.


"Certainly nothing beats a good book, but often a good lecturer can give some intuition or motivations that are not customarily written down."

You are correct on both points. Early in my learning, I found both to be important. And, for a beginning student, both can be crucial: without the second, it is far too easy for a student to get into some not very good material, or into some good material but still get lost.

Eventually I got away from your second point.

Still, there is a version of your second point that lasts: For research, seminars and conferences are good, fast ways to keep up, see the forest for the trees, pick new research directions, etc.

When my company is successful and I retire, I will return to mathematical physics and, maybe, attend research seminars in, say, Boston.


Is Boyd still talking about convexity?

Convexity is an important concept, and the world is just awash in rock solid treatments.

So, for some positive integer n, consider n-dimensional Euclidean space. Suppose C is a closed, convex subset and a point x in the space is not in C. Then there exists a closed half space that covers C and omits x. The boundary of this closed half space can share a point with the boundary of C, so that the plane of the half space is 'supporting' for C. So, this is a 'separation' result. Yes, it generalizes past finite dimensional Euclidean spaces! Yes, it's a darned useful theorem! E.g., we can ask for the point in C closest to x -- that point has to exist since C is closed. Then the plane through this point and perpendicular to the line to x is supporting for C. So, we get a nice projection result and can use it for optimization and best approximation.
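
A quick numerical sketch of that projection/separation picture (my own toy example, with C taken to be the closed unit ball in R^2 and numpy assumed available):

    import numpy as np

    x = np.array([3.0, 4.0])   # a point not in C
    p = x / np.linalg.norm(x)  # the point of C closest to x (projection onto the ball)
    n = x - p                  # normal of the plane through p, along the line to x

    print(n @ x > n @ p)       # True: x lies strictly on the far side of the plane
    # Every c in C satisfies n @ c <= n @ p; spot-check on random points of the ball:
    cs = np.random.randn(1000, 2)
    cs /= np.maximum(1.0, np.linalg.norm(cs, axis=1, keepdims=True))
    print(np.all(cs @ n <= n @ p + 1e-9))  # True: the plane supports C at p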

Okay, my point: This result is one of the most important about convexity but is standard in 'analysis'. I first saw the result in the 'advanced calculus' book, Fleming, 'Functions of Several Variables'.

Bigger point: Convexity pops up frequently in analysis. E.g., there is Jensen's inequality. With it one can easily knock off a nice list of otherwise difficult to prove inequalities. E.g., in the L^p spaces, we can use Jensen's inequality to get Hölder's and Minkowski's inequalities.
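
For concreteness (the standard statement, not tied to any particular book): for a convex \varphi and an integrable random variable X, Jensen's inequality reads

    \varphi(\mathbb{E}[X]) \;\le\; \mathbb{E}[\varphi(X)],

and already taking \varphi = -\log with uniform weights gives AM-GM: \sqrt[n]{a_1 \cdots a_n} \le (a_1 + \cdots + a_n)/n.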

When we start with optimization, we should start with convex sets and also convex functions. E.g., there is a nice list of theorems of the alternative -- separation results for cones and polyhedra -- that can be used to establish some otherwise tricky results about duality in linear programming and are also key to the Kuhn-Tucker conditions in non-linear programming. Of course, the feasible region in linear programming is a convex set.

In non-linear programming maximization, the dual problem is to minimize a convex function, and that can be the core of the powerful technique of Lagrangian relaxation.
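
To spell that out (standard material, in my notation): for a primal \max_x f(x) subject to g(x) \le 0, the dual function

    q(\lambda) \;=\; \sup_x \left[ f(x) - \lambda^{\top} g(x) \right], \qquad \lambda \ge 0,

is a pointwise supremum of functions affine in \lambda, hence convex -- so minimizing q over \lambda \ge 0 is a convex problem no matter how nasty f and g are.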

When we have a norm on a vector space, the set of all points at distance at most 1 from the origin -- the unit ball -- is convex.

In facility location we can be minimizing a convex function, and doing so by constructing a sequence of planes supporting the epigraph of the convex function.
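
That cutting-plane idea in miniature (a made-up one-dimensional example, assuming scipy; real facility location models are of course much larger):

    # Minimize a convex f by minimizing the max of the supporting planes found so far.
    from scipy.optimize import linprog

    f = lambda x: (x - 2.0) ** 2    # toy convex objective
    df = lambda x: 2.0 * (x - 2.0)  # its derivative (gives the supporting plane)

    pts = [-10.0, 10.0]             # initial query points
    for _ in range(25):
        # Variables (x, t): minimize t subject to t >= f(p) + f'(p)(x - p) per cut p
        A_ub = [[df(p), -1.0] for p in pts]
        b_ub = [df(p) * p - f(p) for p in pts]
        res = linprog([0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
                      bounds=[(-10.0, 10.0), (None, None)])
        pts.append(res.x[0])        # query f where the current lower model is minimal

    print(pts[-1])                  # close to 2.0, the true minimizer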

Alas, I never heard a lecture from Boyd!

There are plenty of rock solid, highly polished books where the role of convexity is made clear!


Absolutely, it is possible to get a good high-level picture of a field without taking any particular class. Moreover, things like "convex optimization" can be done in the footnotes of a serious maths course. But on the other hand, until there are again 300+ person functional analysis classes like in the old Soviet Union, this is perhaps the best there is.


Yup, Kolmogorov and Fomin -- NICE book. I never learned from it, but I believe that eventually I got a copy, looked through it, and liked it.

If Boyd has some nice engineering applications, terrific!


These materials were actually released in late 2008 - the three new courses will follow a completely different arrangement.

