The biggest problem with math is the language. If it were a programming language, we would call it crufty and obscurantist.
The language and notation really need to be rebooted and cleaned up. Math with a sane notation would be significantly easier to learn.
The other -- and closely related -- problem is with how math is taught. It is taught procedure-first, not language-first and concept-first. It is impossible to understand math without being able to read the notation and translate it into relevant concepts. Doing the mechanics is secondary (and often done by computers these days).
You know, I don't usually do this, but in this case I'll make an exception.
> The biggest problem with math is the language.
Citation needed.
> Math with a sane notation would be significantly
> easier to learn.
Citation needed.
> Doing the mechanics is secondary
Citation needed.
OK, so I speak from a different perspective from yours, because I have a PhD in math. I also do a lot of outreach, enrichment, and enhancement, dealing with reasonably bright (but very rarely genius level) kids, and in my direct personal experience I have found that
(a) the notation is not a problem, and that
(b) doing the mechanics opens the door to internalising the structures and enables the understanding of the ideas.
In my experience, trying to grasp the ideas and concepts without working through what seem to be tedious, mechanical, and apparently unnecessary processes results in flailing about and an inability to anchor those concepts in experience or intuition. It's the mechanics and the processes that let you internalise the patterns and start to build your own structures, into which the ideas and concepts can then be fitted, making a coherent whole.
So forgive me if I regard your pronouncements with a degree of suspicion and scepticism.
Spivak and Sussman have made excellent cases regarding the problem of notation in mathematics. I have long held much the same opinions as the grandparent, so I was delighted to see such luminaries in agreement. Now, mathematics is compact, but that is not the problem. The problem is ambiguity. You can, with time, learn to pack and unpack the extremely dense notation, and the upfront costs are worth it, but the ambiguity (not to mention the differing conventions across branches) is an inexcusable mess. Calculus, for example, is replete with abuse of variable binding, which is just cruel to the beginner. Quoting: http://mitpress.mit.edu/sicm/book-Z-H-5.html#footnote_Temp_4
"
1 In his book on mathematical pedagogy [17], Hans Freudenthal argues that the reliance on ambiguous, unstated notational conventions in such expressions as f(x) and df(x)/dx makes mathematics, and especially introductory calculus, extremely confusing for beginning students; and he enjoins mathematics educators to use more formal modern notation.
2 In his beautiful book Calculus on Manifolds [40], Michael Spivak uses functional notation. On p. 44 he discusses some of the problems with classical notation. We excerpt a particularly juicy passage:
The mere statement of [the chain rule] in classical notation requires the introduction of irrelevant letters. The usual evaluation for D1(fo(g,h)) runs as follows:... This equation is often written simply...
Note that f means something different on the two sides of the equation!
3 This is presented here without explanation, to give the flavor of the notation. The text gives a full explanation.
4 ``It is necessary to use the apparatus of partial derivatives, in which even the notation is ambiguous.'' V.I. Arnold, Mathematical Methods of Classical Mechanics [5], Section 47, p. 258. See also the footnote on that page.
"
In my experience, notation would not make a student's life much easier at the level Terry Tao writes at on his blog; he's not writing college-level math for engineers (i.e. calculus, linear algebra, basic Fourier analysis, and a few other topics) but graduate-level math for mathematicians and people interested in pure and/or applied mathematics. Notation is not the problem when you're having a hard time studying functional analysis, algebraic topology, or advanced probability (using the Lebesgue integral), or trying to understand the proof of the Prime Number Theorem.
The actual problem is that notation is a matter of personal preference. Physicists love bra-ket notation; I find it very confusing, maybe because I'm not a physicist. I think any time someone comes up with new notation for settled things, they just make matters worse.
Partial differential equations are a perfect example of how these things work: mathematicians, physicists, and engineers generally use different notations, at least in my experience, because every couple of years someone comes up with a new notation to "simplify" everything for his own field of study. But an introductory note of a page or two in any book is enough to explain the differences between the notations.
The grandparent's argument is valid for some math below graduate level and for some confusing bits in advanced mathematics, but it's generally not a problem for anything above that. I actually think notation is a problem for students who are not that interested in mathematics; the high schoolers and engineering students I've seen are generally confused about why some things are written the way they are, without any justification. This is a problem with teaching methods, and if you just hand these students a new arbitrary notation, I believe the problem will persist.
I also think that trying to reboot the entire mathematical notation to fit areas such as aerospace control theory, algorithmic complexity theory, abstract algebra, biostatistics, and thermodynamics, among other fields, would probably end in failure, like creating a universal language such as Esperanto that would never be used.
Right, and that's why the focus is on introductory classes. As I noted, yes, a sufficiently motivated individual will get used to the notation in time, but that doesn't mean things are okay.
These things that seem minor to the expert actually make a big difference before chunking is achieved, and they can hinder all but the most motivated. If you are taxing short-term memory with unhygienically bound variables, then no, it is not just a matter of who is interested. If you are not pointing out the difference between higher-order functions and regular functions, nor separating the notion of a function from its application, then you are creating unnecessary representational couplings that cause a lot of friction. These things have real cognitive and physiological costs. The design should streamline thought for expert and novice alike; it should not be arbitrary. And in the absence of anything better, we cannot say that notation is not a problem at high levels. Sure, learning is no longer the problem, but what of adroit mental manipulation? I tend to agree with Alfred North Whitehead, who said:
"By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race.".
We can't lament the lack of scientists and engineers on the one hand and not try to reduce uptake friction on the other. There's a real problem with math education if the experts are not trying to relate to the ones who are struggling.
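The function-versus-application distinction is easy to make concrete in code (a toy sketch of my own; the names `derivative`, `fprime`, and `dsin` are hypothetical):

```python
import math

def derivative(f, h=1e-6):
    """A higher-order function: takes a function f and returns a new
    function approximating f', rather than returning a number."""
    def fprime(x):
        # central-difference approximation of the derivative at x
        return (f(x + h) - f(x - h)) / (2 * h)
    return fprime

dsin = derivative(math.sin)  # dsin is itself a function -- no application yet
print(dsin(0.0))             # applying it: approximately cos(0) = 1
```

Classical notation writes f(x) for both the function and its value at x, which is exactly the blurring being complained about; in code the two are syntactically distinct.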
By the way, bra-ket is a wonderful notation in my book, and I'm not a physicist; it is an elegant way of writing sparse vectors.
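For instance (my own illustration): in a space with basis states |0⟩, |1⟩, …, a vector with only two nonzero components can be written as

```latex
|\psi\rangle = \alpha\,|3\rangle + \beta\,|7\rangle
```

naming just the components that matter, where column-vector notation would force you to write out every zero, even in a very high-dimensional (or infinite-dimensional) space.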
This is a big aside, sorry about that, but I've been wondering about this. People on HN often say "citation needed" when they're talking about a thesis or opinion rather than a source that should be quoted.
Are you sure it's a citation that is needed, or just more elaboration/justification? I was under the impression that a citation had more to do with referencing a source (especially facts, but sometimes theories).
For example, if I said "30% of elbonese immigrants are drug carriers", and there appeared to be no evidence of this, I'd say a citation is needed.
If I say "elbonese immigration is a bad thing", it would be reasonable to ask for a justification - that's a bold statement and you can't just make it and move on. But I'm not really claiming a fact here, this is clearly an opinion or thesis. I could cite a source, but really what you're asking for is a better justification of a controversial statement.
I'd agree that a statement like "the biggest problem with math is the language" probably warrants more discussion or defense. But I dunno, "citation needed" seems odd to me. Maybe I have this wrong.
Otherwise yes, strictly speaking you would only ask for a citation as evidence of some claim. But even then, people would still use it informally as a stand-in for "I disagree," "prove it," "says who?," and so on.
It was never my intention to start a meme like this. I started [citation needed] because specifically Wikipedia was (and I guess still is) replete with assertions that had no evidence. I got increasingly annoyed at half-baked statements that could not be easily removed, either because they sounded plausible or because someone would object.
So I decided to create the {{fact}} tag, which would highlight the phrase with the questionable content. That way, the article could keep the material, but it would be easier to (a) signal to the reader that the statement's veracity was in doubt until some sort of citation was provided, or (b) for someone (hopefully the original author!) to provide a citation for the statement.
It was, to put it mildly, wildly successful. It appears to have been my greatest contribution to Wikipedia, and, it appears, to wider society.
Not sure how this makes me feel. I was hoping for something more significant, but I suppose that if I have contributed something worthwhile then this might as well be it!
Knuth actually had some similar things to say about mathematical notation. Coming from programming and making my way into math, I can vouch for the fact that mathematical notation can be at times annoyingly inconsistent and vague. That being said, I don't think it would really help anybody learn fundamental concepts any better.
I agree. This isn't a [citation needed] moment anyway, because these are mostly subjective opinions about the structure. While the core symbolic system is effective in many ways, I hardly imagine it's ideal. It evolved over the years out of mathematical discourse, and is very organic and piecemeal. There are ambiguities and inconsistencies, unnecessary redundancies, and linguistic inefficiencies. I doubt that anyone steeped in symbolic math thinks the current system is perfectly ideal, even if they find it generally effective.
I find that linear algebra is particularly bad about having too many ways to represent objects and operations, many of which overlap with algebraic symbols and lead to confusion. It also doesn't help that there are forty billion different ways to represent the derivative of a function, many of which overlap ambiguously with other operations and symbols. I remember a single semester in which multivariate calculus used f' to mean a derivative, modern physics used f' to mean the position of f after a time interval, and linear algebra used f' to mean a matrix transformation of f. There's also no easy way to represent the antiderivative of f other than with F, which makes it look like f is a vector/scalar and F is a matrix. Add to that the fact that many professors have their own notational preferences, and it's a royal mess.
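For a taste of that redundancy, here is a partial tally of notations all denoting (essentially) the derivative of f, collected from common usage rather than any single source:

```latex
f'(x)\ \text{(Lagrange)}, \quad
\frac{df}{dx}\ \text{(Leibniz)}, \quad
\dot{f}\ \text{(Newton, time derivatives)}, \quad
Df,\ D_1 f\ \text{(operator style)}, \quad
\partial_x f
```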
Have you ever worked with software projects running into several million lines? How would it be if all you were allowed was Notepad to handle such a project?
Would it be possible for you to keep a cache of all the keywords, APIs, documentation, their interplay, and their heuristics in your brain and still continue to modify and extend the codebase?
If the answer is yes, then I presume you must be a super genius with enormous brainpower and memory, able to deal with so many symbols, their purposes, and their interplay.
I find math to be similar. Becoming fluent in math forced me to learn many formulas and keep them in my head. Those fundamentally acted as abstractions, shortening equations and representing them more concisely. Understanding others' work forced me to go many steps down to see what their symbols represent, and how.
In fact, many friends of mine hated math because they needed to keep so many things in their heads: formulas, theorems, numbers, multiplication tables. Where is the space left for concepts?
I think Notepad is the wrong metaphor. Try BASIC. With programming languages we are all working on (to some approximation) Turing machines, so a single language would be easy -- but we have many languages for different problem domains and target audiences.
I'm sorry to reply with merely an anecdote, but I hate the notation used in maths. I was trying to read a book on algorithms some years back, and it took me forever to work out that a diamond-looking dot can also mean multiplication.
Go back and look at Newton's and Leibniz's notation. Math notation, proof structure, and simplicity are continually getting cleaned up. The thing is, math is HARD.
I read that Leibniz had studied Chinese and that it influenced some of the math notation he introduced, like using df/dx to express the derivative of f with respect to x as a kind of ratio, whereas Newton just put a dot over the quantity to denote the same thing. I think Leibniz also came up with the elongated S (from the Latin summa) for representing integrals as a type of summation.
And Newton's dot notation could only represent differentiation with respect to time. Obviously what he was focused on, sure, but the more general notation was obviously also needed.
Frankly, what annoys me is how mathematicians grab natural language words to represent their concepts, and often pick the worst possible of such words to use. For example: "Imaginary" numbers are no more imaginary than the "real" numbers are. They're just as well-defined, just as practically useful, and at least as interesting. But that idiotic name hangs on them, a relic from a more ignorant era, and scares away who knows how many students.
The term imaginary was not chosen to be descriptive:
> At the time, such numbers were poorly understood and regarded by some as fictitious or useless, much as zero and the negative numbers once were. Many other mathematicians were slow to adopt the use of imaginary numbers, including René Descartes, who wrote about them in his La Géométrie, where the term imaginary was used and meant to be derogatory. -- http://en.wikipedia.org/wiki/Imaginary_number
df/dx is actually completely obvious. It tells you exactly what's going on, since df is the change in f and dx is the change in x, for very small changes. It's a lot more precise than simply writing f'. Newton wrote a dot over f, and a lot of physicists still do that.
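Spelled out (this is just the standard definition from a first calculus course), the Leibniz notation is shorthand for a limit of ratios of actual changes:

```latex
\frac{df}{dx} = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}
```

which is why treating df/dx as a quotient of small changes, while informal, so rarely leads you astray.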
I agree that math formulas are hard to memorize because the language is terrible. It comes from an age when things were handwritten and movements of the hand were to be conserved as much as possible.
I used to have a bit of trouble with math, but I found out, somewhat counter-intuitively, that the key, at least for me, is rote memorization of formulas. When I was having trouble, I always tried to understand how to derive rules from proofs, but this became impractical in higher mathematics as the level of abstraction got to be too much. I found it easier to simply memorize algebraic formulas, derivatives, integrals, properties of infinite series, etc. with flash cards or repetitive exercises and then apply them as if they were simply given. Maybe I'll never be a great mathematician, but all I really want is to be able to read CS papers and understand analogue electronics.
"movements of the hand were to be conserved as much as possible."
Citation needed. I seriously doubt writing speed was the limiting factor for any mathematician (and that by at least an order of magnitude, but that is guessing). For some, paper might have been expensive, but even then, I doubt it forced them to write succinctly.
Your method probably is a good one for doing passive maths, and it will probably help with doing real math, too. Even the memorization of things like multiplication tables helps there. For example, it may help you spot that all numbers in some sequence are prime, or divisible by 17, or whatever, and that, in turn, can lead to a proof. (I remember reading about a famous mathematician numerically approximating some integral, immediately seeing "hey, but that looks like X" (with X being something like log(27) or the sine of the square root of three), and from there getting the inspiration to algebraically solve a whole class of problems, but I cannot find a link for that.)