Browsing through Calculus Made Easy, I found it curious that the text begins with “On Different Degrees of Smallness” and the limit is mentioned qualitatively but not used formally.
It reminds me of the preface to Elementary Calculus [1][2], an excellent text that I discovered years after learning the subject using the limit:
> The calculus was originally developed using the intuitive concept of an infinitesimal, or an infinitely small number. But for the past one hundred years infinitesimals have been banished from the calculus course for reasons of mathematical rigor. Students have had to learn the subject without the original intuition. This calculus book is based on the work of Abraham Robinson, who in 1960 found a way to make infinitesimals rigorous. While the traditional course begins with the difficult limit concept, this course begins with the more easily understood infinitesimals.
> Browsing through Calculus Made Easy, I found it curious that the text begins with “On Different Degrees of Smallness” and the limit is mentioned qualitatively but not used formally.
You might want to read the chapter of Calculus Made Easy entitled Epilogue and Apologue ;)
Excerpt:
>Thirdly, among the dreadful things they will say about “So Easy” is this: that there is an utter failure on the part of the author to demonstrate with rigid and satisfactory completeness the validity of sundry methods which he has presented in simple fashion, and has even dared to use in solving problems! But why should he not? You don’t forbid the use of a watch to every person who does not know how to make one? You don’t object to the musician playing on a violin that he has not himself constructed. You don’t teach the rules of syntax to children until they have already become fluent in the use of speech. It would be equally absurd to require general rigid demonstrations to be expounded to beginners in the calculus.
You're not wrong, of course, but neither is Silvanus P. Thompson.
You might be interested in the book A Primer of Infinitesimal Analysis by John L. Bell, which is another approach to infinitesimals that uses intuitionistic logic.
When I took my first calculus class, in the 1980s, we spent the first few weeks of the course just computing limits. No reason was given, and no context, just techniques.
I remember having to compute the limit of 2x+1 as x goes to 3 by writing it as the limit of 2x + the limit of 1, and then saying the first limit is 2 times the limit of x, and so on. Utterly boring, especially when the teacher couldn't even answer us when we asked "if we already know we'll get 7 by setting x to 3, why do we have to write all this?"
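Written out, the rote computation being described is just repeated application of the limit laws, something like:

```latex
\lim_{x \to 3} (2x + 1)
  = \lim_{x \to 3} 2x + \lim_{x \to 3} 1  % sum law
  = 2 \lim_{x \to 3} x + 1                % constant-multiple law
  = 2 \cdot 3 + 1 = 7                     % since \lim_{x \to 3} x = 3
```

Every step is justified, but for a polynomial the answer was always going to be "plug in 3".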
The whole thing almost put me off math.
There's a time and place to introduce properties of limits, but the first month of calculus 1 isn't it.
I've been relearning some math that I forgot so that I can learn general relativity (just for fun), and this was what I worked through to brush up my calculus. Really great book, and far superior to anything else I've seen. There seem to be two approaches to calculus: dumbed down and mechanical (pretty much any modern non-math-major text) or abstract and proof-based (something like Spivak). The first group of books takes pages and pages to get to the point and provides no understanding of why calculus works. The second kind is just too hard for a normal person who doesn't already know the subject pretty well and who is trying to teach themselves. Calculus Made Easy fits right in the sweet spot. While it isn't entirely rigorous, it does justify everything clearly enough that you can see that the manipulations you're learning are valid. At the same time, it always remains focused on calculations, so you don't get lost in abstractions.
His videos are good for giving intuition (I also liked the linear algebra ones), but it's really easy to fool yourself into thinking you understand things better than you do, so you really need to work a lot of problems too.
This is a good book, but the first time I read it, I was puzzled. There's all this stuff about "infinitely small pieces" like every other calculus resource. It wasn't until I had practice using the derivative rules and the chain rule that calculus clicked for me.
I think calculus should be taught like "here's a set of tools to help you rewrite your equations to make them output the slope of their graph".
Then, afterwards, you can get into what makes it all tick. Sort of like teaching a student to write "hello world" before diving into the low level mechanics of a compiler.
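To make that "set of tools to rewrite your equations" idea concrete, here's a toy symbolic differentiator (the tuple representation and function names are mine, purely for illustration) that applies the sum, product, and power rules as literal rewrite rules:

```python
# Toy rewrite-rule differentiator. Expressions are nested tuples:
# ('x',), ('const', c), ('add', e1, e2), ('mul', e1, e2), ('pow', e, n)
def d(expr):
    """Rewrite an expression into its derivative with respect to x."""
    kind = expr[0]
    if kind == 'x':
        return ('const', 1)
    if kind == 'const':
        return ('const', 0)
    if kind == 'add':                      # sum rule
        return ('add', d(expr[1]), d(expr[2]))
    if kind == 'mul':                      # product rule
        return ('add', ('mul', d(expr[1]), expr[2]),
                       ('mul', expr[1], d(expr[2])))
    if kind == 'pow':                      # power rule + chain rule
        base, n = expr[1], expr[2]
        return ('mul', ('mul', ('const', n), ('pow', base, n - 1)), d(base))
    raise ValueError(kind)

def ev(expr, x):
    """Evaluate an expression at a given x."""
    kind = expr[0]
    if kind == 'x': return x
    if kind == 'const': return expr[1]
    if kind == 'add': return ev(expr[1], x) + ev(expr[2], x)
    if kind == 'mul': return ev(expr[1], x) * ev(expr[2], x)
    if kind == 'pow': return ev(expr[1], x) ** expr[2]

# f(x) = x^2 + 3x, so f'(x) = 2x + 3 and f'(2) = 7
f = ('add', ('pow', ('x',), 2), ('mul', ('const', 3), ('x',)))
print(ev(d(f), 2))  # 7
```

The student never needs to know *why* the product rule works to use this; that justification can come later, just as in the "hello world before compilers" analogy.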
> I think calculus should be taught like "here's a set of tools to help you rewrite your equations to make them output the slope of their graph". Then, afterwards, you can get into what makes it all tick.
I feel the exact opposite way. The method you're describing is how I was taught calculus in high school, that is, with a strong focus on practicing mechanical manipulation of formulas rather than building deep understanding. And I just disconnected from that: I got no mental reward from procedurally finding the derivative or the integral of a function for the 100th time. Lack of motivation meant lack of focus, which in turn made my performance drop, and at the time I assumed I probably just wasn't as good at mathematics as I thought I was. That lasted until I took mathematics classes in college, where many of my courses used a more first-principles-based approach, and I performed well again.
> Sort of like teaching a student to write "hello world" before diving into the low level mechanics of a compiler.
I'm not writing this comment to say that you're wrong; the method you describe may very well work well for you, and many others with you. Rather, I'm using the opportunity to spread an opinion that I feel quite strongly about: the best way to learn a topic is completely student-dependent and there is no such thing as the one true way in which a topic should be taught. The most powerful motivator is always intrinsic motivation, and different minds are motivated by different challenges. Unfortunately, that is hard to put into practice in a batch-processing-based education system.
An intermediate approach I've been taking with my own son is to use calculus and initially not bother with either the manipulation OR the formalism. You have something moving, have the position and want the speed. Or vice-versa. You use it in a bunch of places to build out vocabulary and fluency, and build up both the mechanics and the formalism later.
And we derive a few formulas along the way. Integrals of x and derivatives of x² (or x^2, depending on your programming language) pop up all over the place.
I don’t know how old your son is, but a high-level and amusingly written book is How to Ace Calculus: The Streetwise Guide. I know it sounds a bit gimmicky, but even after years of calculus classes I found it helpful for the higher-level view. In engineering school I always found there to be a stark difference between the classes taught by engineers and those taught by mathematicians. It always made more sense to me to have a concrete example of why I might use something before diving in than to be handed all this abstraction and be expected to generalize it to other problems.
I appreciate your perspective, especially around motivation, and I think you're right that students learn differently.
I do wonder though -- for heavily skill-based endeavors (say swimming, sushi-cheffing, curling, etc.), the best way to learn is to do. First-principles/mental-model approaches tend not to work so well in these domains, at least not in isolation.
The best approach is probably a blend: learn by doing and understanding first principles.
A difference, though, is that those domains are almost entirely about doing: you don’t have to understand buoyancy or the biology of the tongue to swim or cook. Math is almost the opposite. These days the mechanics of math are nearly entirely automated: there’s no real reason to manually work out answers except insofar as it helps you understand the principles. The main reason to study math now is so that you understand what your computer is doing, so you can instruct it more efficiently. So first principles in math are significantly more important: someone who can do the mechanics of calculus or long division but doesn’t understand what they’re doing is just doing slowly and unreliably what a computer can do instantaneously.
You don’t necessarily have to start from the bottom, though; you can drill down and get to first principles instead of starting without context at low level abstractions. Both of the angles you two are talking about (providing context and working through the mechanical applications vs explaining fundamental principles that makes things work) seem addressable in the same course if you start with understandable problems and drill down until you get to the principles that allow you to solve those problems.
I think it would be quite difficult to do in a reasonable amount of time, but using history as a course outline would be a cool way to explain the motivations/context behind the principles being discussed. That might better illustrate how math research works by explaining the gaps/interesting questions that existed prior to certain ideas and how those ideas evolved and led to other gaps/questions.
The type of math courses I’ve gotten a lot out of already kind of did that, they just usually used more of a condensed, logical narrative than a historically based one.
I see what you're saying, but I would say there's a strong "doing" component in math.
It's still really important to work through problems mechanically (often with only partial understanding) if only to become familiar with patterns. Whether this exploration is done via a CAS or by hand, acquiring this mechanical familiarity is important if one is to go deep. Math is often not merely read but done.
Take something like automatic differentiation (AD) for instance. It's just the chain-rule in principle right?
One could try to study the principles of AD in textbooks/journal pubs and then try to implement it in code. Chances are one won't get it right the first few times, because anyone who's ever tried to implement anything from numerical-computation papers knows that published papers regularly omit important implementation details (unless they also publish code). There's so much heuristic/unintuitive behavior [1] in the world of numerical computation that first principles alone are rarely enough to get you a decent implementation in code.
On the other hand, if one mechanically works through AD derivations for a bunch of functions and observes the patterns that emerge (without necessarily grasping all the principles at first), one begins to notice things, like the difficulties posed by corner cases such as non-smooth/discontinuous functions, functions that aren't n-times differentiable, etc. A CAS could be used for this exploration (manually doesn't necessarily mean literally by hand), but it's so important to actually try stuff out and really grasp how mathematical objects work in practice. Once this contextual understanding is acquired, going back to AD journal pubs and reading about the principles is likely to make way more sense because one has seen the behavior "in the field".
I do think pure mechanical repetition is often underrated as an autodidactic mechanism (again, I don't mean literally by hand, but rather "doing" instead of just "reading and thinking").
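To make the AD point concrete, here's a minimal forward-mode sketch using dual numbers (an illustrative toy, not any published implementation): each number carries its derivative along, and the product rule does the rest. It really is "just the chain rule", yet writing it out forces you to confront the mechanics.

```python
# Minimal forward-mode AD via dual numbers (toy sketch, not a library API).
class Dual:
    """A value paired with its derivative; arithmetic propagates both."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot

print(derivative(lambda x: x * x * x, 2.0))  # d/dx x^3 at x=2 -> 12.0
```

Even this toy surfaces the corner cases mentioned above: there is no obvious `dot` to return for `abs` at zero, which is exactly the kind of detail papers tend to gloss over.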
[1] Non-numerical-computation folks are often surprised to hear this because they think everything's super deterministic -- it's not. Tiny perturbations in a matrix can change things wildly. Take naive Gaussian elimination: the ordering of the rows of equivalent linear systems can vastly affect solution efficiency and numerical stability, but one can't appreciate that unless one understands how GE works mechanically and has worked through examples.
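The row-ordering point in the footnote fits in a few lines. This toy 2×2 solver (a hypothetical helper for illustration, not production code) goes badly wrong on a perfectly solvable system unless the rows are reordered first, i.e. partial pivoting:

```python
# Sketch: why row order matters in naive Gaussian elimination.
def solve_2x2(a, b, pivot):
    """Solve a 2x2 linear system by elimination; optionally swap rows first."""
    A = [row[:] for row in a]
    rhs = b[:]
    if pivot and abs(A[1][0]) > abs(A[0][0]):
        A[0], A[1] = A[1], A[0]          # partial pivoting: put the larger
        rhs[0], rhs[1] = rhs[1], rhs[0]  # leading entry on the diagonal
    m = A[1][0] / A[0][0]                # elimination multiplier
    A[1][1] -= m * A[0][1]
    rhs[1] -= m * rhs[0]
    x2 = rhs[1] / A[1][1]                # back-substitution
    x1 = (rhs[0] - A[0][1] * x2) / A[0][0]
    return x1, x2

# True solution is approximately (1, 1), but the tiny pivot 1e-20
# destroys it in floating point unless we swap the rows.
A = [[1e-20, 1.0], [1.0, 1.0]]
b = [1.0, 2.0]
print(solve_2x2(A, b, pivot=False))  # (0.0, 1.0) -- x1 is wildly wrong
print(solve_2x2(A, b, pivot=True))   # (1.0, 1.0) -- correct
```

Without the swap, the multiplier 1/1e-20 is so huge that the original entries are swamped by roundoff; this is the kind of behavior you only internalize by working examples.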
Yes, I think doing both in parallel, or alternating between periods of theoretical study and periods of applied learning (projects for instance) could work very well. I do, by the way, fully agree that just first principles based learning doesn't cut it either. Practice is necessary, I just didn't feel I learned from practicing things which I didn't understand at all.
The low level "what makes it all tick" of calculus is known as analysis, set theory and the topology of the real line, and is the subject of upper division college math classes. The stuff about "infinitely small pieces" is a mnemonic made to help you remember and intuit the tools of calculus.
You know, I never understood this. Limits (how a function behaves as dx approaches zero) are much easier (for me) to understand than infinitesimals (an idea which obviously can't even exist on the real line), and it's much harder to end up in contradiction.
Yet, the idea of infinitesimals is very popular in applications. For example, in undergrad physics courses, when deriving formulas (essentially, doing math!) they keep talking about a "small area" or a "small volume" (where a given function is assumed to stay constant).
When you're actually computing things, such as numerically approximating an integral, you do so by breaking it into small pieces and adding them together. Infinitesimals are the limit of that process as the smallness goes to zero. I think the key problem here is that students use the symbolic rules but never actually do numerical computations of this kind.
I was first exposed to calculus by a couple of visual analogies for the meaning of differentiation and integration. Keeping those firmly in mind, the more formal parts (algebraic manipulations, limits, etc.) come more easily.
I love how people discover old, hidden gems like this. Despite what we know about how people learn and modern tools like video resources, every once in a while, someone unearths some decades-old resource that explains things far better than what's available now.
It reminds me of when I was learning antenna theory and propagation. I was looking for something that started at step one. The best videos I found on YouTube were a series created in the 1950s by the Royal Canadian Air Force.
I was a voracious reader as a kid and came across this book when I was 12, and the prologue sounded particularly amusing to me:
--
"Considering how many fools can calculate, it is surprising that it should be thought either a difficult or a tedious task for any other fool to learn how to master the same tricks.
Some calculus-tricks are quite easy. Some are enormously difficult. The fools who write the textbooks of advanced mathematics — and they are mostly clever fools — seldom take the trouble to show you how easy the easy calculations are. On the contrary, they seem to desire to impress you with their tremendous cleverness by going about it in the most difficult way.
Being myself a remarkably stupid fellow, I have had to unteach myself the difficulties, and now beg to present to my fellow fools the parts that are not hard. Master these thoroughly, and the rest will follow. What one fool can do, another can."
--
It made me believe I could really learn calculus by myself at age 12. After reading the first few chapters, nope. Turns out calculus is built on a foundation of algebra and geometry which I did not have until I was 16. Once I did have the foundation though, calculus became easy and mostly mechanical. But to a 12 year old, this book overpromised and underdelivered.
I also found the exposition a little wordy and tedious, like it was written to explain calculus to an English major. It's the sort of book that one appreciates in retrospect after knowing the subject, but not while learning it. I discovered the most effective way for learning math is actually not by reading but to mechanically work through problems to cultivate intuitions and to develop a pattern matching schema. In doing so, one gains confidence that one can actually solve problems. Once this confidence is achieved, going back and delving into the underlying principles becomes so much more contextualized and rewarding.
Learning math by reading books like this is like learning to ride a bike by reading about it. You try to understand all the principles, but when you need to deploy them, you find yourself unable to execute. Much better to do it the other way around.
EDIT: but I want to soften that by saying that I appreciate not everyone learns this way. It's just that I've seen too many struggle with math even though they've read the textbook countless times... when a simple change in strategy would yield much better results.
This book is great. Although anytime I see it, my mind jumps to the meme about "If the people who wrote computer programming books wrote math textbooks."
I remember entering high school and picking up this book from the library in 9th grade. It was honestly not my favorite. It being called “Calculus Made Easy” had me feeling frustrated. I went to put it back and there was a book next to it called Calculus and Pizza. Within the introduction I got the intuitive feel for differentials and limits. I was able to play with the difference formula and take a much more enthusiastic high level trip through both differential and integral calculus. A big bonus was Calculus and Pizza allowed me to check my algebra in the back of the book and forward those questions to my algebra teacher. That was a few years before I was able to officially take calculus, so I kept re-reading chapters because they were fun.
A book with an easy approach really needs to make the chapters super easy to restart and spiral through concepts. Stories and characters didn’t hurt to have too though.
Linear Algebra Done Right is, without a doubt, one of my favorites in this category. It tends to be taught as an intermediate or graduate level approach, but things clicked for me with Axler's explanations way better than about anything else I've used.
Sadly the last edition turned their beautiful LaTeX typesetting into something really distracting.
Aside from Axler, I wish someone would pick up Halmos's text and render it in a more modern way. It's a really good one, a bit more advanced, but the text is so cramped it's unpleasant to follow.
Funnily enough, I'm working through this book right now. The writing amuses me to no end, from the anachronism of calling it "the calculus" to the chain rule being referred to as a "dodge".
It goes well with 3blue1brown's calculus series on YouTube, which gives more of a visual intuition for the concepts.
Me too: it took until 6 months after I started differential calculus (AS-level Pure 1 (UK)) before I grokked what the "d" in dy/dx meant (just like this person: https://physics.stackexchange.com/questions/65724/difference... ) - before then I was treating the differentiation operator as a magic, irreducible, symbol and only getting so far in the course because I was remembering and applying the mechanics of differentiation (i.e. by-rote) instead of really understanding what I was doing.
For anyone interested in the history of calculus, there is an excellent paper:
Grabiner, J. (1983). Who Gave You the Epsilon? Cauchy and the Origins of Rigorous Calculus. The American Mathematical Monthly, 90(3), 185-194. doi:10.2307/2975545
If this book appeals to you then you should see 3Blue1Brown's videos on YouTube. The way they show the squares with the 'little bits' added in Calculus Made Easy is essentially repeated in 3Blue1Brown's series, with graphics.
Beautiful book, and incredibly lucid. Gave me a useful intuition on infinitesimals that never quite came out in the various standard calc texts. Second to this imo is Serge Lang’s A Short Introduction to Calculus.
Honest question: did calculus change enough in a hundred years to make this outdated? In other words, can I use this to refresh my calc knowledge in 2020?
I read the first few chapters when this book was last mentioned here and the only thing that stood out to me as "outdated" was the use of pre-decimalization British currency in examples - I think that most modern students will not naturally know how many pence are in a shilling and so forth.
Do you mean using SI instead of Imperial units? What difference would it make? For instance Exercises IX (Chapter XI) says:
(4) A piece of string 30 inches long has its two ends joined together and is stretched by 3 pegs so as to form a triangle. What is the largest triangular area that can be enclosed by the string?
Just replace inches with your linear measure of choice, mm, rods, poles, chains, stadia, light years, or anything else; it makes no difference to the problem. Thompson could have said 'units of length' instead of 'inches'; but the whole purpose of the book is to build on easy notions that Thompson believed everyone could manage and perhaps even be familiar with already, so he used imperial units that were familiar to every handyman of the time instead of metric which would be familiar to far fewer and hence distracting.
Also don't forget that there were several distinct 'metric' systems in the past before most of the world settled on SI (MKS, CGS, etc.)
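As an aside, the string exercise itself works in any unit: here's a brute-force numerical check using Heron's formula (grid resolution is an arbitrary choice of mine). The maximum turns out to be the equilateral triangle with side 10, area (√3/4)·10² ≈ 43.30.

```python
# Brute-force the 30-unit-string triangle exercise with Heron's formula.
from math import sqrt

def heron(a, b, c):
    """Area of a triangle with side lengths a, b, c (0 if degenerate)."""
    s = (a + b + c) / 2.0
    val = s * (s - a) * (s - b) * (s - c)
    return sqrt(val) if val > 0 else 0.0

best = 0.0
steps = 300
for i in range(1, steps):
    for j in range(1, steps):
        a = 30.0 * i / steps
        b = 30.0 * j / steps
        c = 30.0 - a - b          # the third side uses up the rest of the string
        if c > 0:
            best = max(best, heron(a, b, c))

print(best)                  # ~43.30, attained at a = b = c = 10
print(sqrt(3) / 4 * 10**2)   # the exact maximum for the equilateral triangle
```

Of course, the calculus solution in the book gets there without the 90,000 trial triangles.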
It really is. I was shown this text in high school and it's the thing that got Calculus to click for me. I've recommended it dozens of times over the years and I've heard the same sentiment echoed nearly every time.
I love the exercises in Godement's Algebra (1968), e.g. p111:
15. With the development of purely peaceful space research, the President-Director-General of the Societe Anonyme pour l'Exploitation Financiere de la Physique Purement Theorique, Commander of the Legion of Honneur, had been able to conclude some profitable contracts and has bought a country estate of 200 hectares in Basse-Normandie at 8,000 francs per hectare. Inspired by this example, a plumber employed by the company, earning 800 francs a month, decided to invest a tenth of his wages in savings bonds at 4% interest per annum. How many years will he have to work before he is in a position to buy an estate of 200 hectares in Basse-Normandie and live out the rest of his life in peace and comfort? (You should take account of compound interest, but ignore the effects of possible devaluations of the currency.)
16. According to Le Monde of 21st July 1954, the annual expenditure on the war in Indo-China is given by the following table (the unit is 1,000 million old francs):
...
Use this example to verify the fundamental properties of addition (associativity and commutativity).
17. In November 1954, there were in Algeria 1,230,000 Europeans and 8,300,000 native-born Algerians. At the same date, the University of Algiers had 4,548 European and 557 Algerian students. Calculate the ratio between the chances of a European and those of an Algerian of receiving higher education.
Nice socially-conscious exercise. For possible comparison, I was interested to listen to an interview on KPFA progressive radio with the alleged "math is racist" woman, and unfortunately she did disappoint. One of her first examples was essentially: the one-drop rule involved counting, so counting can be racist. (Every culture counts to one.) On the plus side, in a YouTube video to math teachers she was more about pepping-up teachers, I thought.
[1]: https://www.math.wisc.edu/~keisler/calc.html
[2]: https://en.wikipedia.org/wiki/Elementary_Calculus:_An_Infini...