Does one have to be a genius to do maths? (terrytao.wordpress.com)
113 points by caustic on Aug 11, 2012 | 56 comments



This is a fantastic blog post and it needed to come from someone as accomplished as Tao.

The cult of genius can be very caustic to young minds, especially in mathematics. I used to do some rather stupid things out of a combined sense of pressure from family, teachers, and peers. I would compare myself unfairly to historical luminaries as a yardstick of what I should be accomplishing at what age. I worked incredibly hard, but on intractable problems and not on reasonable pieces of research for even a precocious mathematician. My grades suffered because I thought I was going to solve some open conjecture instead of learn the tools bit by bit like virtually every other successful mathematician had done before me.

Depression can set in when you discover that your 20th birthday has passed and you are not Evariste Galois. I know it sounds stupid when it is phrased like that but human psychology is full of improbable behavior designed around avoiding cognitive dissonance. We're funny meatbots.


I think to me it might've been more reassuring if it came from someone other than Tao. I tend to put Tao more in the Galois-like depressing category, given how much he had already accomplished by the time he was 20 (he was promoted to full professor at age 24). There's no amount of hard work that can replicate his trajectory unless you go back and start it at age 8, and even then it's unlikely.


_delirium: I highly recommend you read this short biography of Scottish scientist James Croll, who developed the modern theory of Ice Ages -- with little formal education, he decided to become a scientist while working as a janitor at the age of 38: http://www.guildtownandwolfhill.org.uk/assets/files/pdf/Jame...

Edit: submitted the link as http://news.ycombinator.com/item?id=4370924 because Croll's story can be very inspirational for the many entrepreneurs on HN who're coping with the challenges of building a business from scratch.


Why is an exception like Croll, who is literally one in thousands (how many scientists make their breakthrough in their late 30s/40s while doing menial labor? Now, how many cranks do that...), of any interest?

Exceptions are exceptional, hence the name.


gwern: I find his story of dogged persistence an inspiration.


This rings painfully true.


Every word of advice in this blog post by Terry Tao applies verbatim to many other fields -- including entrepreneurship. Here are two key paragraphs from his post, with just a few words searched & replaced so the text refers to "entrepreneurs" instead of "mathematicians:"

Even if one dismisses the notion of genius, it is still the case that at any given point in time, some entrepreneurs are faster, more experienced, more knowledgeable, more efficient, more careful, or more creative than others. This does not imply, though, that only the “best” entrepreneurs should start companies; this is the common error of mistaking absolute advantage for comparative advantage. The number of interesting business opportunities and problems to work on is vast – far more than can be covered in detail just by the “best” entrepreneurs, and sometimes the set of tools or ideas that you have will find something that other good entrepreneurs have overlooked, especially given that even the greatest entrepreneurs still have weaknesses in some aspects of business. As long as you have education, interest, and a reasonable amount of talent, there will be some market opportunity where you can make a solid and useful contribution. It might not be the most glamorous idea, but actually this tends to be a healthy thing; in many cases the mundane nuts-and-bolts ideas turn out to actually be more important than fancy ones. Also, it is necessary to “cut one’s teeth” on the non-glamorous parts of a field before one really has any chance at all to tackle hard problems; take a look at the early efforts of any of today’s great entrepreneurs to see what I mean by this.

In some cases, an abundance of raw talent may end up (somewhat perversely) to actually be harmful for one’s long-term professional development; if success comes too easily, for instance, one may not put as much energy into working hard, asking dumb questions, or increasing one’s range, and thus may eventually cause one’s skills to stagnate. Also, if one is accustomed to easy success, one may not develop the patience necessary to deal with truly difficult challenges. Talent is important, of course; but how one develops and nurtures it is even more so.


The biggest problem with math is the language. If it were a programming language, we would call it crufty and obscurantist.

The language and notation really need to be rebooted and cleaned up. Math with a sane notation would be significantly easier to learn.

The other -- and closely related -- problem is with how math is taught. It is taught procedure-first, not language-first and concept-first. It is impossible to understand math without being able to read the notation and translate it into relevant concepts. Doing the mechanics is secondary (and often done by computers these days).


You know, I don't usually do this, but in this case I'll make an exception.

  > The biggest problem with math is the language.
Citation needed.

  > Math with a sane notation would be significantly
  > easier to learn.
Citation needed.

  > Doing the mechanics is secondary
Citation needed.

OK, so I speak from a different perspective from yours, because I have a PhD in math. I also do a lot of outreach, enrichment, and enhancement, dealing with reasonably bright (but very rarely genius level) kids, and in my direct personal experience I have found that

(a) the notation is not a problem, and that

(b) doing the mechanics opens the door to internalising the structures and enables the understanding of the ideas.

In my experience, trying to grasp the ideas and concepts without working on what seem to be tedious, mechanical and apparently unnecessary processes results in a flailing about, an inability to anchor those concepts in experience or intuition. It's the mechanics and the processes that let you internalise the patterns and start to build your own structures, into which the ideas and concepts can then be fitted, making a coherent structure.

So forgive me if I regard your pronouncements with a degree of suspicion and scepticism.


Spivak and Sussman have made excellent cases about the problem of notation in mathematics. I have long held much the same opinions as the grandparent, so I was delighted to see such luminaries in agreement. Now, mathematics is compact, but that is not the problem. The problem is ambiguity. You can with time learn to pack and unpack the extremely dense notation, and the upfront costs are worth it, but the ambiguity (not to mention differing conventions across branches) is an inexcusable mess. Calculus, for example, is replete with abuse of variable binding, which is just cruel to the beginner. Quoting: http://mitpress.mit.edu/sicm/book-Z-H-5.html#footnote_Temp_4

" 1 In his book on mathematical pedagogy [17], Hans Freudenthal argues that the reliance on ambiguous, unstated notational conventions in such expressions as f(x) and df(x)/dx makes mathematics, and especially introductory calculus, extremely confusing for beginning students; and he enjoins mathematics educators to use more formal modern notation.

2 In his beautiful book Calculus on Manifolds [40], Michael Spivak uses functional notation. On p. 44 he discusses some of the problems with classical notation. We excerpt a particularly juicy passage:

The mere statement of [the chain rule] in classical notation requires the introduction of irrelevant letters. The usual evaluation for D1(f∘(g,h)) runs as follows:... This equation is often written simply...

Note that f means something different on the two sides of the equation!

3 This is presented here without explanation, to give the flavor of the notation. The text gives a full explanation.

4 ``It is necessary to use the apparatus of partial derivatives, in which even the notation is ambiguous.'' V.I. Arnold, Mathematical Methods of Classical Mechanics [5], Section 47, p. 258. See also the footnote on that page. "
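
To make the excerpt concrete, here is the contrast I take Spivak to be drawing (my own reconstruction in LaTeX, not a quote from the book). In the functional statement, every symbol denotes exactly one thing:

    D_1\bigl(f \circ (g,h)\bigr)
      = \bigl(D_1 f \circ (g,h)\bigr)\cdot D_1 g
      + \bigl(D_2 f \circ (g,h)\bigr)\cdot D_1 h

    % Classical form, with auxiliary variables u = g(x,y), v = h(x,y):
    \frac{\partial f}{\partial x}
      = \frac{\partial f}{\partial u}\frac{\partial u}{\partial x}
      + \frac{\partial f}{\partial v}\frac{\partial v}{\partial x}

In the classical form, the f on the left is the composite f(g(x,y),h(x,y)), while the f on the right is the original function of (u,v): "f means something different on the two sides of the equation."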


In my experience, notation would not make the life of a student much easier at the level Terry Tao writes about on his blog. He's not writing college-level math for engineers (i.e. Calculus, LinAlg, basic Fourier Analysis and some other topics) but graduate-level math for mathematicians and people interested in pure and/or applied mathematics. Notation is not the problem when you're having a hard time studying Functional Analysis, Algebraic Topology, Advanced Probability (using the Lebesgue integral) or trying to understand the proof of the Prime Number Theorem.

The actual problem is that notation is a matter of personal preference: physicists love bra-ket notation, while I think it is very confusing, maybe because I'm not a physicist. Any time someone comes up with new notation for settled things, they just make matters worse.

Partial Differential Equations is a perfect example of how these things work. There are generally different notations in use by mathematicians, physicists and engineers, at least in my experience, because every couple of years someone comes up with a new notation to "simplify" everything for his field of study. But an introductory note of a page or two in any book is enough to explain the differences between the notations.

The grandparent's argument is valid for some math below graduate level and for some confusing bits of advanced mathematics, but it's generally not a problem for anything above that. I actually think notation is a problem for students who are not that interested in mathematics: the high schoolers and engineering students I have seen are generally confused about why some things are written the way they are, without any justification. This is a problem with teaching methods, and if you just give these students a new arbitrary notation, I believe the problem will persist.

I also think that trying to reboot the entire mathematical notation to fit areas such as aerospace control theory, algorithmic complexity theory, abstract algebra, biostatistics and thermodynamics, among other fields, would probably end in failure, like creating a universal language such as Esperanto that would never be used.


Right, that's why the focus is on introductory classes. As I noted, yes, a sufficiently motivated individual will get used to the notation in time, but that doesn't mean things are okay.

These things that seem minor to the expert actually make a big difference before chunking is achieved, and they can hinder all but the most motivated. If you are taxing short-term memory by using unhygienically bound variables, then no, it is not just a matter of who is interested. If you are not pointing out the difference between higher-order functions and regular functions, nor separating the notion of a function from its application (see the sketch at the end of this comment), then you are creating unnecessary representational couplings that cause a lot of friction. These things have real cognitive and physiological costs. The design should streamline thought for expert and novice alike; it should not be arbitrary. And in the absence of anything better, we cannot say that notation is not a problem at high levels either. Sure, learning is no longer the problem, but what of adroit mental manipulation? I tend to agree with Alfred North Whitehead, who said

"By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race.".

We can't lament the lack of scientists and engineers on the one hand and not try to reduce uptake friction on the other. There's a real problem with math education if the experts are not trying to relate to the ones who are struggling.

By the way, bra-ket is a wonderful notation in my book, and I'm not a physicist; it is an elegant way of writing sparse vectors.
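
Here is what I mean by separating a function from its application, as a minimal Python sketch (my own illustration):

    # D is a higher-order function: it takes a function and returns a
    # new function (a numerical derivative), so f and f(x) never blur.
    def D(f, dx=1e-6):
        def df(x):
            return (f(x + dx) - f(x - dx)) / (2 * dx)
        return df

    square = lambda x: x * x
    dsquare = D(square)    # map function to function first...
    print(dsquare(3.0))    # ...apply only afterwards: prints ~6.0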


This is a big aside, sorry about that, but I've been wondering about this. People on HN often say "citation needed" when they're talking about a thesis or opinion rather than a source that should be quoted.

Are you sure it's a citation that is needed, or just more elaboration/justification? I was under the impression that a citation had more to do with referencing a source (especially facts, but sometimes theories).

For example, if I said "30% of Elbonese immigrants are drug carriers", and there appears to be no evidence of this, I'd say a citation is needed.

If I say "elbonese immigration is a bad thing", it would be reasonable to ask for a justification - that's a bold statement and you can't just make it and move on. But I'm not really claiming a fact here, this is clearly an opinion or thesis. I could cite a source, but really what you're asking for is a better justification of a controversial statement.

I'd agree that a statement like "the biggest problem with math is the language" probably warrants more discussion or defense. But I dunno, "citation needed" seems odd to me. Maybe I have this wrong.


"Citation needed" is a meme that derives from Wikipedia:

http://knowyourmeme.com/memes/citation-needed

Otherwise yes, strictly speaking you would only ask for a citation as evidence of some claim. But even then, people would still use it informally as a stand-in for "I disagree," "prove it," "says who?," and so on.


It was never my intention to start a meme like this. I started [citation needed] because specifically Wikipedia was (and I guess still is) replete with assertions that had no evidence. I got increasingly annoyed at half-baked statements that could not be easily removed, either because they sounded plausible or because someone would object.

So I decided to create the {{fact}} tag, which would highlight the phrase that had the questionable content. That way, the article could keep the material, but it would be easier to a. signal to the reader that the statement's veracity was in doubt until some sort of citation was provided, or b. someone (hopefully the original author!) would provide a citation for the statement.

It was, to put it mildly, wildly successful. It appears to have been my greatest contribution to Wikipedia and, it seems, to wider society.

Not sure how this makes me feel. I was hoping for something more significant, but I suppose that if I have contributed something worthwhile then this might as well be it!


Knuth actually had some similar things to say about mathematical notation. Coming from programming and making my way into math, I can vouch for the fact that mathematical notation can be at times annoyingly inconsistent and vague. That being said, I don't think it would really help anybody learn fundamental concepts any better.


I agree. This isn't a [citation needed] moment, anyway, because it's mostly our subjective opinions on the structure. While the core symbolic system is effective in many ways, I hardly imagine it's ideal. It evolved over the years out of the mathematical discourse, and is very organic and piecemeal. There are ambiguities and inconsistencies, unnecessary redundancies and linguistic inefficiencies. I hardly imagine that anyone commonly steeped in symbolic math thinks that the current system is perfectly ideal, even if they find it generally effective.

I find that linear algebra is particularly bad with having too many ways to represent objects and operations, along with many that overlap with algebraic symbols and lead to confusion. It also doesn't help that there are forty billion different ways to represent the derivative of a function, many of which overlap ambiguously with other operations and symbols. I remember having a single semester where multivariate calculus used f' to mean a derivative, modern physics used f' to mean the position of f after a time interval, and linear algebra used f' to mean a matrix transformation of f. There's also no easy way to represent the antiderivative of f other than with F, which looks like f is a vector/scalar and F is a matrix. Add to that that many professors have their own notational preferences, and it's a royal mess.
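
For reference, these are the main derivative notations that tend to collide in a single semester (a quick summary of common conventions, not an exhaustive list):

    f'(x)            % Lagrange (the overloaded prime)
    \dot{f}          % Newton (usually reserved for time derivatives)
    \frac{df}{dx}    % Leibniz
    D f,\ D_x f      % Euler / operator notation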


Have you ever worked on a software project running into several million lines? How would it be if all you were allowed was Notepad to handle such a project?

Would it be possible for you to keep a cache of all the keywords, APIs, documentation, their interplay and their heuristics in your brain, and still continue to modify and extend the codebase?

If the answer to this question is yes, then I presume you must be a super genius with enormous brainpower and memory, to deal with so many symbols, their purposes and their interplay.

I find math to be similar. Working fluently with math forced me to learn many formulas and keep them in my brain. Those fundamentally acted as abstractions to shorten equations and represent them more concisely. Understanding others' work forced me to go many steps down to see what concepts they were trying to represent with their symbols, and how.

In fact, many friends of mine hated math because they needed to keep so many things in their brains: formulas, theorems, numbers, multiplication tables. Where is the space left for concepts?

We are in serious need of IDEs for math.


I think Notepad is the wrong metaphor; try BASIC. With programming languages we are all working on (to some approximation) Turing machines, so a single language would be easy -- but we have many languages for different problem domains and target audiences.


I'm sorry to reply with merely an anecdote, but I hate the notation used in maths. I was trying to read a book on algorithms some years back, and it took me forever to work out that a diamondy-looking dot can also mean multiplication.

How does one even search this stuff on Google?


Go back and look at Newton's and Leibniz's notation. Math notation, proof structure, and simplicity are continually getting cleaned up. The thing is, math is HARD.


I read that Leibniz had studied Chinese and that it influenced some of the math notation he introduced, like using df/dx to express the derivative of f with respect to x as a kind of ratio, whereas Newton just put f' to denote the same thing. I think Leibniz also came up with the elongated S for representing integrals as a type of summation.


No, Newton put a dot over y to represent a derivative. The f' notation came from Lagrange.


And Newton's dot notation could only represent differentiation with respect to time. Obviously what he was focused on, sure, but the more general notation was obviously also needed.

Frankly, what annoys me is how mathematicians grab natural language words to represent their concepts, and often pick the worst possible of such words to use. For example: "Imaginary" numbers are no more imaginary than the "real" numbers are. They're just as well-defined, just as practically useful, and at least as interesting. But that idiotic name hangs on them, a relic from a more ignorant era, and scares away who knows how many students.


The term imaginary was not chosen to be descriptive:

> At the time, such numbers were poorly understood and regarded by some as fictitious or useless, much as zero and the negative numbers once were. Many other mathematicians were slow to adopt the use of imaginary numbers, including René Descartes, who wrote about them in his La Géométrie, where the term imaginary was used and meant to be derogatory. -- http://en.wikipedia.org/wiki/Imaginary_number


df/dx is actually completely obvious. It tells you exactly what's going on, since df is the change of f and dx is the change of x, for very small changes. It's a lot more precise than simply writing f'. Newton wrote a dot over f and a lot of physicists still do that.
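
A quick worked example of why the ratio reading is convenient, treating dx informally as a small increment:

    \frac{d(x^2)}{dx}
      = \frac{(x+dx)^2 - x^2}{dx}
      = \frac{2x\,dx + dx^2}{dx}
      = 2x + dx \longrightarrow 2x \quad\text{as } dx \to 0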


I agree that math formulas are hard to memorize because the language is terrible. It's from an age when things were handwritten and movements of the hand were to be conserved as much as possible.

I used to have a bit of trouble with math, but I found out, somewhat counter-intuitively, that the key, at least for me, is rote memorization of formulas. When I was having trouble, I always tried to understand how to derive rules from proofs, but this became impractical in higher mathematics as the level of abstraction grew too great. I found it easier to simply memorize algebraic formulas, derivatives, integrals, properties of infinite series, etc. with flash cards or repetitive exercises, and then apply them as if they were simply given. Maybe I'll never be a great mathematician, but all I really want is to be able to read CS papers and understand analogue electronics.


"movements of the hand were to be conserved as much as possible."

Citation needed. I seriously doubt writing speed was the limiting factor for any mathematician (by at least an order of magnitude, but that is a guess). For some, paper might have been expensive, but even then, I doubt it forced them to write succinctly.

Your method probably is a good one for doing passive maths, and it will probably help with doing real math, too. Even the memorization of such things as multiplication tables helps there. For example, it may help you spot that all numbers in some sequence are prime, divisible by 17, or whatever, and that, in turn, can lead to a proof. (I remember reading about a famous mathematician numerically approximating some integral, immediately seeing "hey, but that looks like X" (with X being something like log(27) or the sine of the square root of three or so), and from there getting the inspiration to algebraically solve a whole class of problems, but I cannot find a link for that.)


I agree 100% with this. Math will be a lot easier to learn with better tools.


I think the dragonbox app [1] is a pretty solid empirical test of that thesis.

[1] http://dragonboxapp.com/



Personally, I find the language to be brilliant.


I started taking an interest in machine learning and AI about a year and a half ago. I don't consider myself any kind of genius (although I'm reasonably intelligent), and I was terrible at math in school--to the point where I'd come to the conclusion that I simply "wasn't good at math".

After a good deal of reading, trial and error, and banging my head against the wall, I've managed to get myself to pretty much the cutting edge of ML research as it applies to neural networks. There's quite a bit of math involved, and it would have been easy for me to write it off as "too hard" in the beginning. However, I'm glad I stuck with it because I'm actually using it for some pretty neat applications.

My point being, if you have an interest in something that seems like you have to be a genius to be good at it, don't let that stop you because it probably isn't true.


I myself am quite interested in ML research. Do you have any resources that you found useful in learning not only ML, but also the associated math?


Yeah, definitely.

If you're just getting started, I highly recommend Andrew Ng's online ML class. I had started reading up before this was available, but it really tied a lot of basics together that I was confused about.

From there, read papers. For me, my primary interest is in neural networks. Geoffrey Hinton and Yoshua Bengio have two very good groups that both have contributed a great deal to research in this area, and their websites provide lots of good stuff.

After spending a bit of time on a survey of the field, try to come up with a practical goal as quickly as you can: I want to use ML to do X. Then, try to do that. When you get stuck, get back to reading until you find the answer. Rinse and repeat. The math naturally falls into this--you'll get stuck on things that you can't fix without a decent understanding of the math. So figure out how to formulate the question you're really asking, hit google, and read. Then try again. Rinse and repeat.

If you're interested in neural networks, I can also recommend the deep learning tutorials associated with Theano, a library for compiling Python code down to CUDA code to get speed increases on certain operations; it will let you train your models about 10-40x faster on a GPU than on a CPU.
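
To give a flavor of the workflow (a minimal sketch against Theano's symbolic API as I remember it; exact calls may differ between versions):

    import theano
    import theano.tensor as T

    x = T.dscalar('x')                 # declare a symbolic scalar
    y = x ** 2 + 3 * x                 # build an expression graph
    dy = T.grad(y, x)                  # symbolic differentiation

    f = theano.function([x], [y, dy])  # compile for CPU, or GPU via CUDA
    print(f(2.0))                      # -> [array(10.0), array(7.0)]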

For me, putting things into practice has helped me make the biggest leaps in understanding, but of course I wouldn't have been able to do that at all without getting a basic grasp of the mechanisms involved. So it's a bit of push and pull between practice and learning, like anything worth doing.


I'm not sure you can find many places that cover both in great detail, as most machine learning material assumes some knowledge of the specific math, but you could always write down the terms you don't understand in any ML material and look them up on Khan Academy. For Andrew Ng's classes (on Coursera), for example, you can pretty much find all the required topics (matrices, probability, etc.) on Khan Academy.


My dad is a math professor at a big state school with a strong engineering program. I was hanging out with him and some of his college classmates, many of whom are also math/science/engineering professors. One of them told me, "If everyone worked as hard as your dad, anyone could be a math professor."

That stuck with me, not as a statement of fact, but a testament to the power of hard work to shape outcomes, even in a field considered to be dominated by genius.


Reminds me of this anecdote from Richard Hamming's "You and Your Research":

Now for the matter of drive. You observe that most great scientists have tremendous drive. I worked for ten years with John Tukey at Bell Labs. He had tremendous drive. One day about three or four years after I joined, I discovered that John Tukey was slightly younger than I was. John was a genius and I clearly was not. Well I went storming into Bode's office and said, "How can anybody my age know as much as John Tukey does?" He leaned back in his chair, put his hands behind his head, grinned slightly, and said, "You would be surprised Hamming, how much you would know if you worked as hard as he did that many years." I simply slunk out of the office!


I love this essay but would find it more believable if it did not come from one of the people who best exemplifies genius in mathematics.

Sure, hard work may be the way that he experiences himself. But read http://www.davidsongifted.org/db/Articles_id_10116.aspx for an account of his childhood, written when he was 10. Teaching yourself to read and do math before most children can use complete sentences requires something more than pure effort. My son is well above average, but there is absolutely no way that it would be possible to get him to work hard enough to compare with average high school seniors on the SATs before he was 10, let alone scoring near the top. (For those who took the SATs in the last 20 years, the scale used in the 1980s was much tougher than it is now. 700+ would have been easily in the top 1% on the test.)

All of that said, he would not have his current success without constantly working hard. And it is possible to succeed without being at Terry's level of genius. But he's the worst possible example to use for saying that what appears to be genius is just hard work. Because sometimes what appears to be genius really is genius if you dig in.


Math is particularly tough on the ego. I remember one of my advisors said: "I try to decide if something is true. I work extremely hard to do so. Then when I am done, it seems clear that it was true all along, and the only problem was that I didn't know it."


Most mathematical genius in my life (in others and in myself) has been the direct result of DOING. Doing maths at that stage was the result of joy. Joy was often the result of a feeling of newness, and a feeling of newness can be the result of a great number of things. Another source of joy in mathematics can be emulation of one's parents.

In an interview in BBC Music Magazine British violinist Nicola Benedetti says quite bluntly "If you sound like rubbish at age 13, you quit."

It's lazy to say that this initial "sounding good" is "genius" and leave it at that. To come back to classical music: where is the mathematics equivalent of El Sistema[0]?

[0]: http://en.wikipedia.org/wiki/El_Sistema


At the risk of going backwards, quoting an artist to a mathematical community about math, I found what Martha Graham said about dance applies to any creative endeavor:

"Nobody cares if you can’t dance well. Just get up and dance. Great dancers are not great because of their technique, they are great because of their passion."

To anyone who doesn't know, Martha Graham was to dance what Picasso was to visual art.

http://en.wikipedia.org/wiki/Martha_Graham http://joshuaspodek.com/master-speaks-creative-expression


Written by a guy who was attending university level mathematics courses at the age of nine. Perhaps the sentiment is better expressed as being a genius is not sufficient. Hard work is still required (but so is being a genius).


Certainly hard work is required, but equally certainly, being a genius is not necessary. I'm a mathematician, and I'm no genius.


You are not? How does one tell?


I've spent time with Tim Gowers, John Conway, Ron Graham, and others who are, without question, geniuses.

I'm not in their league.


Right. So you don't need to be a one-in-a-million mind like Gowers and Tao. You just need to be one in a thousand.

Still not "anyone can bang it out".


The minute or so starting at 5m40s in this video is apropos: http://video.google.com/videoplay?docid=5935911405946587342

(You may be interested in the rest of the video as well. It's a fascinating show.)


I have to agree. Being a genius is very important if you want to be a mathematician. Tao is an excellent teacher, but I still don't know that someone as smart as he is can really appreciate this. His perspective is nicely idealistic, but it is highly unrealistic for normal people. For example, he writes, "Professional mathematics is not a sport ... mathematics needs all the good people it can get." But if you want to be a professional mathematician, then you do need to play the sport, because you aren't going to get a job anywhere, let alone at UCLA, otherwise. It is a very competitive field.


    I have to agree. Being a genius is very important
    if you want to be a mathematician.
How do you know? What makes you say this? I know many research mathematicians who are not geniuses (and a few who are) so I'm keen to know exactly how you have come to this conclusion.


I imagine he would say being a genius (which he clearly is) is neither necessary nor sufficient for working in math.


This interesting article submitted here on HN is one I have often recommended to other readers, so I'm glad to see it on HN's main page. It's particularly interesting to read the comments here, mostly largely agreeing with Tao, as I am currently at Epsilon Camp,

http://epsiloncamp.org/

the most advanced mathematics summer program for YOUNG learners in North America, and the parents of the campers here are all pondering the issue of their children's mathematical development. Plainly, at any given age, some young people are more advanced in their mathematical development than many of their age mates, but it remains to be seen how steadily and consistently the most advanced young learners can develop if optimal practices are applied to their education.

Based on this published writing by Tao and various writings by other mathematicians, the Epsilon Camp program provides FAQ pages for parents,

http://epsiloncamp.org/FAQ.php

who in many cases are not themselves mathematicians, to serve as food for thought as precocious young mathematics learners grow up. One of the issues for many of the parents, who come from various parts of the United States, is simply finding a flexible local school. Another issue, which the meetings of the parents at the camp have helped to handle, is sustaining friendships among these most advanced young mathematics learners as they disperse around the country at the end of the summer program. On the whole, the parents participating in the program have great buy-in to Tao's idea that whatever initial dose of "talent" or "native ability" a child starts with, careful and intentional guidance of the child's whole-child development is still very important for the child to have the best enjoyment of advanced study of mathematics and the best chance of making a new contribution to human knowledge as an adult, in whatever domain the child chooses.

By the way, everyone on Hacker News might enjoy looking at Tao's comments on a blog post that reached the Hacker News page yesterday,

http://rjlipton.wordpress.com/2012/08/09/a-new-way-to-solve-...

in which we can see Tao thinking out loud in blog comments about what the blog post really means. This kind of careful, step-by-step thinking is something that every mathematician needs to develop sooner or later.


Mathematics studied alone can get a bit difficult. In my opinion, mathematical concepts are better understood through applications such as physics, especially mechanics.


This is definitely a per-person sort of thing. I was a maths major who took several physics courses, and I tended to find that they muddied my mathematical understanding as often as they helped it, whereas I found algebra and number theory (my foci, admittedly) easy enough to understand on their own that the classes were just this side of trivial.


Physicists often take shortcuts based on physical intuition, e.g. symmetry arguments.

I remember having trouble understanding equilibrium charge distributions when studying E&M -- laws such as "at electrostatic equilibrium, all charge is on the surface of the conductor" -- the questions of whether such a static equilibrium exists, is unique, or is always reached from an arbitrary initial configuration weren't really addressed. (The best I could come up with is that any movement of charge will eventually die out due to friction, but this was more of a vague intuition than a satisfying explanation.)
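
For what it's worth, the standard static argument behind that law is short; it just sidesteps the existence and uniqueness questions above:

    \mathbf{E} = 0 \ \text{inside a conductor at equilibrium}
    \;\Rightarrow\; \nabla\!\cdot\!\mathbf{E} = \rho/\varepsilon_0 = 0
    \;\Rightarrow\; \rho = 0 \ \text{in the interior,}

so any net charge must sit on the surface.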

Anyway, I guess if you have the physics gene, you just have a strong intuition that tells you the answers to questions like these. I didn't have it.

Physics is better for people who like to trust their intuition. Math is more programming-like in that the people who do well tend to be hard-nosed about details and corner cases.


there's nothing wrong with being a math genius. there's nothing wrong with not being a math genius.



