I write software that runs on various jet engines. There is a lot of software that goes onto a modern commercial engine, but the general theme of the software I focus on is the modeling of the underlying engine dynamics using sensor data. It's very math- and physics-based. A thorough knowledge of linear algebra, signal processing, regressions, and clustering/neural network algorithms is essential, among other things.
It's not the kind of environment where typical startup aphorisms apply ("fail fast, fail often", "don't be afraid to pivot", etc). So every month or so when another "programmers don't need to know math" article comes out, usually written by a web programmer, I have an impulse to represent the other side of the divide, but I usually find so many misconceptions and poor assumptions in the original article that I conclude it's too much work. So I am glad that the author has done the job for me here.
What I have observed, again anecdotally, is that web programmers are generally more vocal about programming than systems programmers and scientific programmers. It makes sense in the historical context of the internet, but I don't think that this bias is properly accounted for when people on Twitter/HN/etc discuss "programming" and what it requires.
I don't think it should be controversial that programming is founded upon the study of formal languages, mathematics in particular. So even if you don't need math to do your programming work on a day-to-day basis, it's because a lot of very smart people have solved some very difficult math and language problems over the decades so that you have the luxury of ignoring the mathematics your code relies on. This is all ok. But the implicit hostility towards mathematics that a lot of these articles demonstrate really makes me concerned about the influence it will have on the next generation of programmers. Moore's Law has allowed a certain level of indifference to mathematics in the past few decades, since you could always throw a newer processor and more memory at a problem rather than solving it via a better algorithm. But that situation won't last forever.
Personally, I have the impression that many web programmers, especially those who have never worked with anything other than the web, are ignorant of the existence of other branches of programming.
In many respects, programming for the web is simpler. For example, the quality requirements for microwave oven firmware are much higher: you can't easily deploy a hotfix to production after shipping 100k units. I have a feeling that most of the complexity in web programming is incidental. It's a result of how poorly the web is designed as a platform. Of course it's very successful commercially, but from a technical standpoint it isn't good.
john_b, could you give some advice on how to get into the aerospace industry? It always seemed very challenging and I'd love to have a chance to work there.
I've made numerous rebuttals (many right here on HN) to claims that software development is not engineering, citing aerospace examples to the contrary. Often the actual task of programming is as simple (or even simpler) than web programming, but there is a wealth of process that surrounds it, to, as you suggested, make sure that what goes out the door is as correct as possible the first time.
Having also done some web and mobile development on the side, I see software development across a spectrum: ranging from throwaway hacks to cheesy mobile entertainment software to quality mobile entertainment software ... etc ... to embedded medical software and flight control software and such.
While all of these activities are software development, it would be erroneous to classify them all as the same thing. Best practices in one region of the spectrum may well be ludicrous if tried in another region.
How to get into aerospace? Apply for a job? Anyone with a computer science degree or equivalent experience [though the degree may be "required", depending on the company] would qualify for an entry-level position in most aerospace fields that I am familiar with. Another approach, if you're more web savvy, might be to get a job in an aerospace IT department doing internal web development and such, and then transfer out into some other project later.
> I write software that runs on various jet engines. There is a lot of software that goes onto a modern commercial engine, but the general theme of the software I focus on is the modeling of the underlying engine dynamics using sensor data. It's very math- and physics-based. A thorough knowledge of linear algebra, signal processing, regressions, and clustering/neural network algorithms is essential, among other things.
> I've made numerous rebuttals (many right here on HN) to claims that software development is not engineering, citing aerospace examples to the contrary.
I don't get this pattern of argument. If the question is "is programming math" or "is programming engineering," it is not an answer to point out that some programming tasks require engineering or math skills. Doesn't this simply establish that certain problem domains require those skills while others don't? The same pattern of argument could be used, I would think, to "demonstrate" that writing is math or engineering: after all, if you're writing a textbook on engineering you sure need to know some math and engineering. But surely nobody would accept that writing, therefore, IS either of those things. Or pick a social science: knowledge of math sure does help if you want to perform a quantitative sociological study, but I don't think that this makes sociology math.
One might make a somewhat stronger claim about programming: math and engineering skills will make anyone a better programmer. I think this is probably true, to some extent, and this is what is at issue in the articles' debate over the usefulness of mathematical measures of complexity. But this also does not establish that programming "is" either a branch of math or engineering.
Then again, I really have no idea what turns on any of these distinctions or, therefore, why we would be debating them for any reason beyond philosophical curiosity. Isn't it enough to say "no, programming isn't math [or engineering], but like many other disciplines, programming bears some conceptual similarities to math, and a grasp of mathematics [or engineering] will make you a better programmer"?
Well, somewhat notoriously, there has been a rolling debate for better than a century now over whether "higher" sciences simply are complex subdisciplines of "lower" sciences. E.g., are all biologists ultimately just physicists? How about psychologists? Economists? The proper meaning of "are" itself in that sentence also, rightly, is the subject of significant dispute. But the analogy isn't the best, since I think that philosophers enjoy that question a lot more than actual scientists do.
A more interesting example may be law and economics. There has been a concerted movement by certain economists/legal theorists to show either that law is simply applied (or misapplied) economics, or that it should be. (Here's a taste: http://www.law.berkeley.edu/faculty/cooterr/PDFpapers/stratc...) Again, though, the analogy is imperfect. I don't think that many would claim that law "is" economics, but rather that legal rules are or should be explicable in strictly economic terms. Though maybe this would satisfy some definitions of "is."
Pharmaceutical research and clinical trials are another field in which software development is more like proper engineering. I worked in a niche that was not very math-heavy, but intensely process-heavy (out of necessity). First you design a system that can never fail. Then you assume it will fail and design another system to make up for it. Repeat over and over until you reach the desired level of safety.
I think the dividing line between whether an industry practices programming or engineering is the question "Is it possible people might die as a direct consequence of system failure?" Eventually, someone will die and folks will search for answers. When the investigators/plaintiffs come around, you must be able to produce documentation explaining why you thought your system was robust enough to rely on for a safety-critical task.
Edit: Perhaps a good rule of thumb to distinguish between programming and engineering is this: are you productively working full-time and producing more than 20-50 lines of high-level language code per day? If yes, you are programming. If no, you are engineering.
> I think the dividing line between whether an industry practices programming or engineering is the question "Is it possible people might die as a direct consequence of system failure?"
Commercial aviation products are developed according to the requirements of DO-178B (or increasingly, DO-178C) which defines several levels of criticality at which a software component could exist. The most severe indicates that software failure could result in loss of life. The least severe indicates that a software failure wouldn't interrupt anything notable (often used for in-flight entertainment, for example). As you go up the chain, the amount and stringency of process increases.
> Edit: Perhaps a good rule of thumb to distinguish between programming and engineering is this: are you productively working full-time and producing more than 20-50 lines of high-level language code per day? If yes, you are programming. If no, you are engineering.
I like this. As a web developer I am often expected to write closer to an order of magnitude more lines per day than that. But maybe our high level languages aren't high level enough and I'm mostly writing bloat.
I can't believe that in 2014, web development is still getting this mindless, indiscriminate bashing. The kind of web development you might have seen, or done, may have been simpler, but please don't generalize it to everything. Every field has its low ends and its high ends.
I'm, generally speaking, a web developer. I work in bioinformatics, and I can assure you that what I do is not "simpler" than most other programming. Please get over yourself.
Your microwave example actually undermines the point you're trying to make. If microwaves are indeed so much more technically advanced, why then is it so hard to upgrade them? I know it's a stupid question, but so is the point that brought it up.
I've spent 15 years having to justify the worth and quality of my work to non-web developers, a lot of whom I wouldn't even trust with the most basic programming tasks. That included some of my teachers at University (!), who not only undermined me all the way through my dissertation, but actually proceeded to steal my work and use it as a teaching base for years after. This was after they admitted that they could not, in fact, fully understand my work, mostly because they disregarded it for the longest time.
I'm pretty damn proud of being a web developer, and even though I do or have done other kinds of programming, I'll stick to web development, if only to prove to people like you that we, too, are programmers.
> To me it sounds like you have a bioinfo library that you've integrated with a website.
??? This is the hostility the above poster commented about; why did you assume that they were using unsafe / uninventive practices? From what? How did you read their development of a complex piece of software as "you made glue code", without external information, or, perhaps, internal bias?
> If your bioinfo code is actually mixed in with your web code then this is exactly what real programmers talk about when we trash web programmers.
Why would you frame a hypothetical situation, unrelated to their comment, as evidence of your beliefs? How does this make sense?
It's pretty clear people use "web programmer" as a synonym for "poor programmer" or "lesser programmer".
Oh, you work on hard problems and complex software? I wouldn't call that web programming!
I promise there are enough hard problems and complexities in the "web" for any programmer in existence to choke on. HTTP/CSS is just a small part of the "web"; it doesn't speak to the complexities of the system behind the curtain.
Well, he has a point. From your description, it seems tenuous to classify what you do as 'web programming'. So why don't you describe how your bioinfo work relates to the web part, because all I can think of is that you have software for which you simply have a web frontend.
From his current job's description (Clinical Genomics Database Web Developer) [1,2]:
"The Clinical Genomics Database/Web Developer performs a variety of duties in the development of interactive websites, databases and interactive tools for the interpretation of genetic assays performed in a clinical setting. The Clinical Genomics Database/Web Developer also participates in the development of interactive tools for interpretation of genetic and genomic assays in support of developing high-throughput, clinically accredited pipelines to aid in identifying known variants in hereditary cancers, and in assisting in research towards the development of personalized treatments for other cancers.
Duties/Accountabilities:
1. Development of interactive websites to allow for sample submission, clinical report retrieval, and analysis interpretation.
2. Development of databases for tracking of patient samples, and preparation of reports for Clinical staff and physicians.
3. Development of interactive tools to allow for interpretation of results from genetic and genomic assays.
4. Quality-assurance testing for websites and databases.
5. Development and assessment of bioinformatic tools for detection of genetic variants implicated in oncogenesis."
Quite intense. Far from a mere "presentation layer", it seems in this case the work involves developing actual clinical research tools --which just happen to be web-based.
> Personally I have the impression that many web programmers, especially those who has never worked with anything other than the web, are ignorant of the existence of other branches of programming.
I see this again and again in the articles that show up here. Occasionally, mobile apps are acknowledged (although they are usually mentioned when the author is trying to make a point about trivial social media apps).
> I have a feeling that most of complexity in web programming is incidental. It's a result of how poorly the web is designed as a platform. Of course it's very successful commercially but from a technical standpoint it isn't good.
It seems that most of the technical challenges that people run into in this field can be divided into two categories:
* Needing to accomplish something difficult by developing, implementing, or otherwise employing novel techniques, algorithms, data structures, automation, etc.
* Needing to hack around poorly designed or obsolete software, such as CSS hacks to get older versions of IE to render a page element correctly.
First, a disclaimer: the aviation industry is very large and surprisingly diverse. Different practices exist in different parts of it (engines, airframers, rotary-wing aircraft, etc.), so it's hard to give general advice. Everything below is based on my personal experience with only the engine subset of the industry.
At the moment it seems like a pretty good time to get into the industry. Hiring is very cyclical, but the companies I follow are generally hiring now. If you know somebody at the company, or even a contractor with the company, that would help of course. With the larger companies, contractors often do a lot of the actual technical work while the primary company manages various projects and contractors, with some of the more ambitious research-y projects conducted in-house (often by M.S. or PhD holders).
Aerospace companies did not begin as software companies, so even though many of their competitive advantages today depend on the quality of their software, a lot of people at these companies--especially older ones who pursued a management path--will not be able to assess your specific technical capabilities (nor will they try). If you have any code on Github, you can mention it as a plus, but some of the people you talk to won't know what a Github is. Some people will understand how desperately their company needs competent software developers, while others will be more interested in how you solve problems in a general sense, communicate & interact with groups, and function in an environment with complex processes and high standards.
You will generally be expected to have a B.S. degree in some form of engineering, though physics and mathematics majors are also hired in smaller numbers, depending on the company. A graduate degree doesn't hurt either. At my company and the companies I work with, there is currently a bias towards more traditional engineering degrees: mechanical, electrical, obviously aerospace; but CS majors are hired as well. The latter are needed most urgently in my opinion, as the former group often ends up writing software that doesn't really require their skills or background, but which could be better written by someone with a CS background who had a strong interest in the application area.
You make a good point, and it worries me somewhat. It manifests itself not just in discussions about math, but also in discussions about fast runtimes, about algorithms, about data structures. There is a lot of vocal self-reinforcement of the ideas that speed matters little, algorithms matter little, data structures matter little.
The situation reminds me of an aphorism; I am sure there are equivalents of this story in other cultures. The story behind the aphorism goes like this:
A worldly-wise frog visits his friend, a frog who lives in a well. The frog of the well wants to know how big the world is, and proceeds to ask by jumping across ever-larger fractions of the well's diameter, asking "is it this big?" About the point where he jumps from the sides to the center of the well, he proclaims: "Now, if you say the world is bigger than this, you are just bullshitting me."
I see this a lot in CRUD programmers: just because they haven't found a way to exploit some algorithm, they think these things are universally useless. Not only that, influenced by these oft-repeated lines, they do not even try to find a use that would make their code more efficient. Some are more radical and propose that these topics should not be taught in CS courses. Redis is proof that CRUD can benefit from all of this. Similarly, I have also seen many compute-oriented programmers think CRUD is trivial. In that case I would just slide the keyboard towards them and ask them to write something that scales, is performant in terms of speed and latency, and is robust and frugal with hardware resources.
> web programmers are generally more vocal about programming than systems programmers and scientific programmers
I'm not sure you would find systems programmers doing much more math than web programmers. Sure, they'll focus on optimization and a few algorithms, but much of their job involves plumbing bits from point A to point B.
There are programming jobs that involve math, but programming itself is not math. It is more like plumbing and architecture. You are right in that much of the math is already done for you, in well encapsulated components that you might want to know about but often can forget. But saying "programming is math" is like saying "biology is physics"; it might be true at some level, but it is not often very useful.
> "But saying "programming is math" is like saying "biology is physics"; it might be true at some level, but it is not often very useful."
I agree and actually thought about making this analogy. I think some people like to debate these things for philosophical reasons, and this kind of hierarchy of fields is of interest to such people. Others seem to approach it from a more pragmatic and anecdotal perspective and often take issue with the classification of their field as, e.g. "math-based" or "language-based", as though it reflects on their talents (required or actual) as an individual.
The reaction of the latter group reminds me of pg's essay:
As programmers, we use lots of techniques and wear many hats. We sometimes do engineering, design, math; sometimes we are detectives (especially when debugging or coming up with requirements); sometimes we are writers and communicators.
That is why I react to "programming is math" so negatively; it belittles the other tasks that I have to do much more often than math.
The issue to me is more that our education system is so shitty that these authors don't even know what math is, or that they're doing it every work day. We combine abstract symbols and attempt to reason logically about how that will affect what our programs do. That's math. It's not separable from programming. The only other option is that these authors are proud of the fact that they engage in some weird cargo-cult attempt at programming.
That is a definition of 'math' that is so broad as to be utterly useless. Other things that are 'math' by this standard include baking cookies from a recipe and writing poetry.
I'm unsure of why so many people have such an urgent need for programming to be math, but generations of people have written software, some of it quite substantial, who have no mathematical education at all. Redefining what math is to get these works under the canopy reeks of No True Scotsman and is pointless.
There's plenty of math in CSCI, and I will state unequivocally that any particular writer of software would only be improved by understanding more math than she currently does. In fact, I believe that about anyone, anywhere, in any field. But it doesn't mean that you're doing math while writing your Django app.
The fact that you do it by writing software is incidental. Obviously mathematical skill is necessary to write software that models and controls complex physical/mechanical systems. No one is saying it isn't. That's a strawman.
Many problem domains, however, are not mathematical in nature. I don't need to know anything about calculus, have accurate hand-computation skills, remember probability formulas, etc. to write a database application, for example.
But then obviously I would need these things to write and validate a financial modeling/risk assessment system.
I (and many others) would argue that when math becomes relevant to programming, it is because math is relevant to the problem domain in which you are working. Programming, by itself, has little to do with math (or, more precisely, skill in computation-based math classes and exams is not a proxy for programming skill.)
There is actually more to it. There are aspects of programming that are no different from procedures in mathematics even if the application area of the code is not 'mathematical'. As has been said in the comments [0], the process of debugging code is really not that different from solving/proving those Euclidean geometry exercises/riders we had in school. Now, the person debugging the code may not be explicitly conscious of this, but if the person is any good, he/she is using the same procedure: assume axioms; make hypothesis; prove or disprove using observations, deduction/induction; repeat till you have reached your goal.
>he/she is using the same procedure: assume axioms; make hypothesis; prove or disprove using observations, deduction/induction; repeat till you have reached your goal
Absolutely! This problem-solving process is very similar to what we were asked to do in physics and chemistry on a nightly/weekly basis. I've found chemistry problem sets to be very similar to programming.
However, what you describe is not math, not as it is taught anywhere between first grade and Calc III. Being "good at math" is exactly equal to being good at manipulating symbols (first numbers, then variables) according to a memorized set of rules and procedures. The higher you go, the weirder the rules get. Symbol manipulation and the memorization of probability and geometry formulas are also the entirety of the "skills" represented by standardized tests of mathematics.
Programming is about coming up with the rules and procedures by which the computer should manipulate symbols. It's really completely different.
Excellent mathematical skills would make me a good compiler. Not a writer of compilers, but a compiler. Of course, that might be useful if I wanted to work on one (I'd be able to check its work by hand), but in practice, I'm sure that sort of work is done billions of times faster and with much greater accuracy by computers than by humans anyway.
> not as it is taught anywhere between first grade and Calc III
This is the problem when most programmers talk about math. Everyone seems to think that math after calculus is just a more abstract version of calculus, with more complicated symbols and more complicated rules for manipulating those symbols.
I once talked to an engineering student who, after finding out I'm a mathematician, boasted to me that he took both differential equations classes, so he was going to tell me some cool things about math. I was sitting with my friend who is getting her PhD in dynamical systems; the irony was rich and totally lost on him.
Anyone who says "being good at math is exactly equal to being good at manipulating symbols" is completely ignorant. I suggest you update your beliefs about mathematics, because right now you're the butt of the irony.
If, by arguing that math skills are important to programming, you mean "skills in post-Calculus math" then say so. If you say "math," people will continue to believe that you mean math in the way that they experienced it, i.e. math ending at or shortly after calculus.
Lots of mathematicians freely admit that they're terrible at computation. Which is precisely my point - the two represent different skills. Being good at symbol manipulation (i.e. "math" as represented by your high school transcript, SAT, ACT, and college transcript if not a math major) is not a determinant of your aptitude as a programmer.
Is it useful to study math beyond calculus? Of course! I know (abstractly) that things like the typed lambda calculus and set theory exist and are the foundation for much of what we do. However, it is ignorant to talk about these things like they're related to your aptitude for pre-Analysis math.
But it isn't rational to use performance in lower math classes to screen (computer science undergrad applicants|interns|programmers), or to encourage people to choose or not choose programming as a career path based on their performance in mathematics classes.
Your comment made it very unclear whether you believed what you were saying about mathematics or not. Rereading, it still seems like you think conjecture and proof is more like physics than math. And you claim that math is irrelevant to compiler design. It's not "abstractly" relevant, it's directly relevant. Parsing is formal language recognition. Register allocation is literally graph coloring. These things are not third-generation applications of mathematics. You seem to think that mathematics makes you good at computing, when in truth most mathematicians are bad at computing things by hand.
The root misunderstanding might be that most people wouldn't know math if it hit them in the head, but changing my definition of math to fit their misconceptions is certainly not going to fix the problem. I would even prefer it if people thought, "math? yeah I have no idea what goes on in that subject" to what you describe. Because distinguishing between "math" and "post-Calculus math" (the latter of which is almost all of math) won't help anyone.
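The "register allocation is literally graph coloring" point above can be made concrete with a toy sketch (all names invented; real allocators add coalescing, spill costs, and live-range splitting):

```python
# Toy sketch of register allocation as graph coloring (names invented).
# Nodes are virtual registers; an edge means the two values are live at
# the same time and so cannot share a physical register.

def color_interference_graph(edges, num_registers):
    """Map each node to a register index, or None if it must spill."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    assignment = {}
    # Greedy heuristic: color the most-constrained (highest-degree) nodes first.
    for node in sorted(graph, key=lambda n: -len(graph[n])):
        taken = {assignment[m] for m in graph[node] if m in assignment}
        free = [r for r in range(num_registers) if r not in taken]
        assignment[node] = free[0] if free else None  # None => spill to memory
    return assignment

# "b" interferes with both "a" and "c", so it gets a register of its own.
regs = color_interference_graph([("a", "b"), ("b", "c")], num_registers=2)
```

The greedy heuristic here is a stand-in; the point is only that the core model is a classical combinatorial problem, whatever heuristic a production compiler layers on top.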
You are basically claiming that everything is math, which is true but useless. Programming is more like chemistry or biology on the purity axis...not "like them", but at a similar level.
Writing parsers, which I do a lot, requires very little parsing theory...ya, they make for good academic papers, but nothing beats good ole simple recursive descent in flexibility for error handling. And for register allocation, you might use - gasp - an algorithm, but that doesn't dominate your work - writing a compiler involves some math (among other tasks), but is not mathematical activity.
Mathematicians can be bad or good at programming, just like musicians can be...there is no strong correlation to their aptitude based on their previous training.
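For readers who haven't written one, the appeal of hand-rolled recursive descent mentioned above is easy to show. A toy expression parser (illustrative only, not from any production compiler):

```python
import re

# Toy recursive descent parser for arithmetic (illustrative only):
#   expr   -> term (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUMBER | '(' expr ')'

def tokenize(src):
    return re.findall(r"\d+|[()+\-*/]", src)

class Parser:
    def __init__(self, tokens):
        self.toks, self.pos = tokens, 0

    def peek(self):
        return self.toks[self.pos] if self.pos < len(self.toks) else None

    def eat(self, expected=None):
        tok = self.peek()
        if expected is not None and tok != expected:
            # Precise, local error messages are a big part of RDP's appeal.
            raise SyntaxError(f"expected {expected!r}, got {tok!r}")
        self.pos += 1
        return tok

    def expr(self):
        val = self.term()
        while self.peek() in ("+", "-"):
            val = val + self.term() if self.eat() == "+" else val - self.term()
        return val

    def term(self):
        val = self.factor()
        while self.peek() in ("*", "/"):
            val = val * self.factor() if self.eat() == "*" else val // self.factor()
        return val

    def factor(self):
        if self.peek() == "(":
            self.eat("(")
            val = self.expr()
            self.eat(")")
            return val
        return int(self.eat())

result = Parser(tokenize("2*(3+4)-5")).expr()
```

Each grammar rule is an ordinary function, so error handling and recovery are just ordinary control flow, which is much of why working compiler writers favor this style.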
> The root misunderstanding might be that most people wouldn't know math if it hit them in the head, but changing my definition of math to fit their misconceptions is certainly not going to fix the problem.
Math can be defined so broadly as to basically be a useless word. Is sociology math? Is chemistry math? Is physics math? Is engineering math? Is implementing an algorithm math? Meh, if so, then whatever, we haven't made any progress.
> Writing parsers, which I do a lot, requires very little parsing theory...
I am currently writing an Earley parser. When I'm done, you will indeed need little math to use it. However, I had to grasp several non-trivial mathematical insights to write this damn tool (most notably graph search). And I'm not even done: currently, my parser only handles grammars like this:
A -> B C D
A -> B 'x' E
B ->
etc.
I want to handle the full Backus Naur Form, however, so I will need to compile BNF grammars down to a set of production rules. This will involve things very close to lambda lifting, which is rooted in lambda calculus.
Math is the main activity in writing this parser. The code is merely a formalization of such math.
I believe the result will be worthwhile: the error handling should be just as good as manual recursive descent parsing (RDP). Parsing code using my framework should be about a tenth as long as with RDP. It will not be limited to LL grammars, unlike RDP. And you will still need very little math to write a parser, just like RDP. But that's because I will have swept the math under the rug.
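For readers unfamiliar with the algorithm under discussion, a bare-bones Earley recognizer fits in a few dozen lines (a sketch, not the commenter's actual tool; it recognizes input but does not build parse trees):

```python
# Bare-bones Earley recognizer (sketch). A grammar maps each nonterminal
# to a list of right-hand sides; each right-hand side is a tuple of
# symbols. Symbols that are not grammar keys are terminals.

def earley_recognize(grammar, start, tokens):
    """Return True if `tokens` can be derived from `start`."""
    n = len(tokens)
    # chart[i] holds items (head, body, dot, origin)
    chart = [set() for _ in range(n + 1)]
    for body in grammar[start]:
        chart[0].add((start, body, 0, 0))
    for i in range(n + 1):
        added = True
        while added:  # run predict/complete to a fixpoint
            added = False
            for head, body, dot, origin in list(chart[i]):
                if dot < len(body) and body[dot] in grammar:
                    # Predict: expand the nonterminal after the dot.
                    for rhs in grammar[body[dot]]:
                        if (body[dot], rhs, 0, i) not in chart[i]:
                            chart[i].add((body[dot], rhs, 0, i))
                            added = True
                elif dot == len(body):
                    # Complete: advance items waiting on `head` at `origin`.
                    for h2, b2, d2, o2 in list(chart[origin]):
                        if d2 < len(b2) and b2[d2] == head:
                            if (h2, b2, d2 + 1, o2) not in chart[i]:
                                chart[i].add((h2, b2, d2 + 1, o2))
                                added = True
        if i < n:
            # Scan: consume the next terminal.
            for head, body, dot, origin in chart[i]:
                if dot < len(body) and body[dot] == tokens[i]:
                    chart[i + 1].add((head, body, dot + 1, origin))
    return any(h == start and d == len(b) and o == 0
               for h, b, d, o in chart[n])

# Grammar in the spirit of the example above: A -> B 'x' ; B -> 'y' | (empty)
grammar = {"A": [("B", "x")], "B": [("y",), ()]}
```

The predict/complete fixpoint also handles nullable rules like `B ->`, one of the subtleties that makes a naive single-pass version subtly wrong.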
---
I don't know what kind of compilers you write, but I'm surprised to hear you say math is not the main activity. We're talking about semantic-preserving transformations here, how could it not be math?
Or, you already know all the math, and applying it doesn't feel like "math" any more.
I do Bret Victor style interactive programming environments. I've developed a set of tricks over the years and they are all quite simple. It really is programming in the classic sense and not so much Greek symbols on the whiteboard.
Also, what I do is not very well understood, so there is no good theory for it yet and a lot of open questions to investigate. So it's more experiment-and-measure vs. find a proof that will tell you for sure the right thing to do.
> I've developed a set of tricks over the years and they are all quite simple.
Actually, so is Earley parsing. The more I study this little piece of CS, the more I see how simple this really is. This is why it feels so much like math to me: hard at the beginning, then something "clicks" and everything becomes simpler.
Your "set of tricks" are probably similar. Knowing nothing about them, I'd bet their simplicity is rooted in some deep, abstract, yet simple math, just waiting to be formalized:
> what I do is not very well understood, so there is no good theory for it yet and a lot of open questions to investigate.
And how do you plan to further your understanding, or find good theories? It can't be just psychology and cognitive science. I'm sure there will be some math involved, including proofs.
My set of tricks is more like: trace dependencies for work as it is done, put work on dirty list when dependency changes, redo work, + some tricks to optimize previous steps. It is really hard to interpret that as math, especially if the word "math" is to remain meaningful and useful. It is all math at some level, but so is everything.
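The trick list above (trace dependencies as work is done, dirty-list on change, redo) can be sketched in a few lines. All names here are invented, and cell-to-cell dependencies are omitted for brevity:

```python
# Illustrative sketch of the scheme described above (all names invented):
# trace which inputs each piece of work reads while it runs, put dependent
# work on a dirty list when an input changes, then redo the dirty work.

class Engine:
    def __init__(self):
        self.inputs = {}   # input name -> raw value
        self.rules = {}    # cell name -> function that computes the cell
        self.values = {}   # cell name -> cached result
        self.deps = {}     # input name -> set of cells that read it
        self.dirty = set()

    def set_input(self, name, value):
        if self.inputs.get(name) != value:
            self.inputs[name] = value
            self.dirty |= self.deps.get(name, set())  # dependents must redo

    def define(self, name, fn):
        self.rules[name] = fn
        self.dirty.add(name)  # new work starts dirty

    def read(self, cell, input_name):
        # Dependency is traced at the moment the work actually reads it.
        self.deps.setdefault(input_name, set()).add(cell)
        return self.inputs[input_name]

    def update(self):
        for cell in list(self.dirty):  # redo everything on the dirty list
            self.values[cell] = self.rules[cell](self)
        self.dirty.clear()

eng = Engine()
eng.set_input("a", 2)
eng.set_input("b", 3)
eng.define("sum", lambda e: e.read("sum", "a") + e.read("sum", "b"))
eng.update()            # computes 5, tracing deps on "a" and "b"
eng.set_input("a", 10)  # dirties only "sum"
eng.update()            # recomputes
```

Whether one calls the underlying change-propagation structure "math" or "just programming" is, of course, exactly the argument in this thread.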
Thank you. This is what I was trying to say, but better articulated.
I would say that math done by mathematicians would be more indicative of skill, but most people who are/are trying to be programmers haven't really tried that, so it's impossible to assess them based on it.
If you still write parsers as recursive descent parsers by hand, then this is because NOT ENOUGH MATH IS APPLIED IN PRACTICE (by programmers who think they don't need it).
No, it's because incremental performance and error recovery are more important than raw performance and expressiveness. If you think Dr. Odersky doesn't know enough math... not to mention most production compilers out there, written by the best professionals in the field. Reality is a harsh mistress.
My parsers, by the way, do things your parsers could only dream of.
Yep. Well, see my managed time paper; basically we use no fancy algorithmic tricks and it all works out fine. There are 2 ways to do incremental parsing: the hard way based on some fancy delta algorithm, and an easy way based on simple memoization and conservative recomputation.
Academics especially often overthink these problems when the simple solution works fine, and performance-wise you'd have to mess up pretty badly before parsing becomes a noticeable bottleneck.
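To make "simple memoization and conservative recomputation" concrete, here is one way the idea can be read (my own illustrative sketch; the managed time paper's actual mechanism may well differ): cache parse results per line of input and invalidate only entries whose text changed, so at worst you reparse too much, never too little:

```python
class IncrementalParser:
    """Per-line memoization: reparse only lines whose text changed."""
    def __init__(self, parse_line):
        self.parse_line = parse_line  # any function from line text to a result
        self.cache = {}               # line text -> parse result

    def parse(self, lines):
        results = []
        for line in lines:
            if line not in self.cache:                 # conservative: any
                self.cache[line] = self.parse_line(line)  # textual change reparses
            results.append(self.cache[line])
        return results
```

No delta algorithm, no fancy data structure; an edit to one line reuses the cached results for every other line.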
> I would even prefer it if people thought, "math? yeah I have no idea what goes on in that subject" to what you describe. Because distinguishing between "math" and "post-Calculus math" (the latter of which is almost all of math) won't help anyone.
Why not?
Most educated Americans did "math" for a minimum of 13 years of their lives. It would be immensely useful if people who control gates like employment and admission understood that the math relevant to computer science and the math they did/can measure in candidates are different things.
The fact that most mathematicians are bad at computing things by hand is the key takeaway. Hand computation skill doesn't mean you should be a mathematician or programmer. Lack of hand computation skill doesn't mean you shouldn't.
> However, what you describe is not math, [not as it is taught anywhere between first grade and Calc III]
I think this is where we disagree then. That is pretty much what Euclidean geometry is, and I would indeed file that in the 'math' cabinet. Euclidean geometry has very few axioms, and all of the theorems are just consequences of those axioms and of choosing and applying truth-preserving operations on them (in other words, reasoning, or, as you called it, 'problem solving').
Say you are tasked with deciding whether the angle bisectors of any triangle all meet at a point inside the triangle (in programming you may be asked to find out whether this piece of code will ever crash). You seek out theorems that you think would be useful and then try to prove or disprove them. It might turn out that the theorem wasn't useful after all, so you try to prove another theorem that you now think will be more useful. In debugging that is pretty much exactly what you do; the same goes for ensuring properties of code: the code will not crash, the pointer will not be null, etc.
I was not schooled in the US but I would guess it is not that different here.
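The bisector question is also a nice case where a quick numeric experiment (not a proof!) builds confidence before you go hunting for theorems. Below, a sketch using one hypothetical triangle: the internal bisector at a vertex points along the sum of the unit vectors toward the other two vertices, and intersecting two different pairs of bisectors gives the same point (the incenter):

```python
import math

def bisector_dir(apex, p, q):
    # sum of unit vectors from apex toward p and q points along the bisector
    def unit(v):
        n = math.hypot(v[0], v[1])
        return (v[0] / n, v[1] / n)
    u = unit((p[0] - apex[0], p[1] - apex[1]))
    w = unit((q[0] - apex[0], q[1] - apex[1]))
    return (u[0] + w[0], u[1] + w[1])

def line_intersect(p, d, q, e):
    # solve p + t*d = q + s*e for t, return the intersection point
    det = d[0] * (-e[1]) - d[1] * (-e[0])
    t = ((q[0] - p[0]) * (-e[1]) - (q[1] - p[1]) * (-e[0])) / det
    return (p[0] + t * d[0], p[1] + t * d[1])

A, B, C = (0.0, 0.0), (4.0, 0.0), (1.0, 3.0)  # an arbitrary test triangle
i1 = line_intersect(A, bisector_dir(A, B, C), B, bisector_dir(B, A, C))
i2 = line_intersect(B, bisector_dir(B, A, C), C, bisector_dir(C, A, B))
```

Within floating-point error, `i1` and `i2` coincide, which is exactly the kind of evidence that tells you a proof is worth looking for.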
> Programming is about coming up with the rules and procedures by which the computer should manipulate symbols.
How do you think mathematicians come up with a set of axioms and how to operate on them? Think about how mathematicians came to use imaginary numbers; it is quintessentially the same process, i.e. "coming up with the rules and procedures ...[to]... manipulate symbols". Is it really as fundamentally different as you make it out to be? I am hesitant to say that programming and math are one and the same, but I would claim that their methods are the same: the processes in programming, no matter what application you are programming, are indeed at least a subset of the fundamental processes of mathematics.
Furthermore what you are talking about is one aspect of programming, the synthesis part of it. The other is debugging or the deductive aspect of it. It is no less of a part of programming, and again the methods are indeed the same. Whether you call them theorems or not, whether you write them with symbols on paper or not, when you are debugging you are indeed manipulating symbolic objects, and proving or disproving theorems based on rules, exactly like in math, example: "if my assumption about initial conditions and the function foo() is correct then 'a' ought to be 42. If it is not, either my reasoning is wrong or the function implementation or the initial condition is wrong. Ok so it was not 42..." and you keep going like this manipulating your conjectures and observations using rules of mathematics, more precisely logic with '&', '|', 'for all', 'there exists'.
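The debugging syllogism above can literally be written down as executable assertions, so that a failure tells you which premise of the argument was false. A hypothetical reconstruction of the commenter's `foo()`/42 example (the function body and values are invented for illustration):

```python
def foo(x):
    # the implementation under suspicion
    return x * 2

initial_condition = 21  # the assumed initial condition

# "if my assumptions about the initial condition and foo() are correct,
#  then 'a' ought to be 42"
a = foo(initial_condition)
assert a == 42, "either the reasoning, foo(), or the initial condition is wrong"
```

If the assertion fires, you have disproved the conjunction of your premises and go back to weaken or replace one of them, exactly as with a failed lemma.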
Consider writing tests and choosing which tests to write; it's the same process. Consider drawing conclusions from the tests, or using the type system to encode properties you desire in the code; again, it's fundamentally the same deal. One may be aware of it, or one may be doing it implicitly without being aware of it, but regardless, it's still the same process.
I would argue the match with math is better than the match with sciences like Chemistry or Physics, because there the rules have been set by nature. In programming you choose the rules and try to do something interesting with those, much like in mathematics. For actual computers you do have to co-opt nature into executing those rules.
=============
EDIT: @superuser2 replying here as I dislike heavily nested threads with cramped widths.
> I have never been asked to do anything like this in a math class.
I upvoted your comment and now I understand more of where you are coming from, and it seems that there is a difference in the way math is taught in schools where you are from [I am guessing US]. We would typically do this stuff in grade 7 and it is taught in school, so anyone with a school education would be aware of this (quality of instruction varies of course, in fact varies wildly).
> Geometry is an exception, as you state. However, geometry is one year of many, and was extremely easy for me. I'm talking about Algebra, Algebra II/Trig, and Calculus.
I think this sheds more light. For me at least the most difficult homework and tests were in geometry, and also the most gratifying. Most of my schoolmates would agree. I think we had geometry for 3 years (I cannot remember exactly), though very lightweight compared to Russian schools. I would however encourage you to think about solving simultaneous linear equations: it is again deduction at work, the only difference being that its scope is so narrow and we know the procedure so well that we can do it by rote if we want to. We also had coordinate geometry, which was about proving the same things but with algebra, though we had that much later, in grade 11.
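To see the deduction hiding inside the rote procedure, take a hypothetical 2×2 system. Each elimination step is a truth-preserving rewrite, and the closed form (Cramer's rule) is just the whole chain of rewrites done once and for all:

```python
# Solve  2x + 3y = 8,  x - y = -1  by elimination:
#   from the second equation: x = y - 1            (truth-preserving rewrite)
#   substitute into the first: 2(y - 1) + 3y = 8  ->  5y = 10  ->  y = 2
#   back-substitute:           x = 2 - 1 = 1

def solve_2x2(a1, b1, c1, a2, b2, c2):
    # a1*x + b1*y = c1 ; a2*x + b2*y = c2  (Cramer's rule)
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

Knowing the procedure by rote doesn't make it less deductive; it just means the deduction has been compiled down.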
BTW post high-school 'analysis' is different though, it is not devoid of such logical reasoning but its focus is different.
..and thanks to you I now know a little bit more about the methods of Chemistry.
> and proving or disproving theorems based on rules, exactly like in math, example: "if my assumption about initial conditions and the function implementation is correct then 'a' ought to be 42. If it is not, either my reasoning is false or the function implementation or the initial condition is wrong. Ok so it was not 42..."
Chemistry is like this. "Cadmium would be consistent with the valence electrons required for the bond to work. If the mystery element is Cadmium, then the mass would be xxx, but it's yyy, so we can rule that out..." Of course I've forgotten the specifics, but the method is just like it is in programming.
I have never been asked to do anything like this in a math class. Equations are sometimes called theorems, but that's the extent to which this sort of analytical thinking ever factored into a math class up through Calculus. (There was a little of this in geometry, and a really fun two weeks in propositional logic, but geometry was still mostly formula-driven: numbers of vertices and such.) I gather that's the sort of thing Analysis is about, but I'm not talking about math for math majors; I'm talking about math as most people experience it.
You memorize some formulas. You're given some formulas. You recognize the information you're given and apply the procedure you were taught to convert it to the information you're asked for. That's it. (Geometry is an exception, as you state. However, geometry is one year of many, and was extremely easy for me. I'm talking about Algebra, Algebra II/Trig, and Calculus.)
Memorizing and drilling steps is not how you become good at debugging - thinking is. Holding the program in your head, tracing it, asking yourself how you would design it and then figuring out where the problems in the logic are. This is nothing close to recognizing the situation as belonging to a particular category and then blindly cranking the computation for that category of exercise to reach the answer.
> I have never been asked to do anything like this in a math class.
This is a tragic comment. It is also true. American children waste their time and their brains in American math classes for about 8 years each. They learn to multiply and maybe solve a quadratic equation. "Equations are sometimes called theorems," and that's it. And it's not getting better; as a culture, we don't understand or respect math so we don't support it.
Modern mathematicians don't memorize or drill steps. We don't just find a bigger number each day by adding one. We hold our problems, trace them, ask ourselves how the proof should be designed and figure out where the problems in the logic are. If you've got 2n points in a line, how many ways can you draw curves from point to point such that you match them all up and none of the curves intersect? What about if they're in a circle -- is the answer different? How do you characterize a straight line if you're on a non-Euclidean surface? What's the right definition of straight -- should straight mean "direction of derivative is in direction of line" or "shortest distance" or what? And if we're talking about points on a grid, how should distance be measured anyway? Do we have the right definitions to solve our problem? Distance from streetcorner to streetcorner by cab in Manhattan is different than distance as the crow flies is different than "distance" through time and space; how are the concepts consistent, then?
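(For what it's worth, the 2n-points puzzle has a classic answer: non-crossing matchings, whether the points sit on a line or on a circle, are counted by the Catalan numbers, so the two answers agree. The recurrence that counts them is short enough to check by machine:)

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def noncrossing_matchings(n):
    """Number of non-crossing perfect matchings of 2n points: Catalan(n)."""
    if n == 0:
        return 1
    # the curve leaving the first point encloses 2k points, splitting the
    # remaining points into two independent smaller instances
    return sum(noncrossing_matchings(k) * noncrossing_matchings(n - 1 - k)
               for k in range(n))
```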
Formulas are just shorthand for relationships. We don't teach students the relationships or the thinking in the US. That's why my Romanian advisor would ask our graduate algebra class, "You should have learned this in high school, no? No... Hmph." American school math is like prison, a holding cell until release that merely increases the likelihood you'll do poorly in the future.
In general I agree. But I can't help but feel that while math is essential to your job, it might not be essential to programming in general. It's similar to if someone said that having medical knowledge is essential for programming because he was working in medical imaging software.
So I agree with your conclusion (math==good) but not necessarily with your evidence.
I didn't (and wouldn't) claim that programming "is" math in any literal sense. The foundations of software engineering are heavily derived from math, but a degree in software engineering isn't necessary for all forms of programming.
I only claimed that programmers rely on the math skills of other programmers, even if they are not aware of it. Somebody wrote the compression, search, and encryption tools that even ordinary computer users use every day. Game developers often rely on 3rd party game engines to do the heavy vector mathematics that they'd rather not reinvent.
So some programming is heavily based on math, and it is often exposed only as a library or service of some sort. As long as it sits quietly in the background doing its tasks, it won't be noticed much. Declaring that programmers don't need to know anything about math is, to me, equivalent to saying that abstractions can be counted on not to leak.
I don't think this is limited to programming. I've noticed that most engineers forget their college math and engineering theory within a few years of graduating. Many are adamant that math isn't necessary for an engineering career, and many have heard this message from older engineers.
Maybe there's a similar divide, where some engineering activities use more math & theory than others. In moments of cynicism or frustration, I label the two groups "engineers" and "designers," and complain to colleagues that a remarkable amount of design work is done by trial and error.
Is it the right way or the wrong way? I don't know. I'm biased by my preference and enjoyment of math & theory, and I volunteer for tasks that use those skills.
I think we should just do away with the term 'programmer'. It doesn't really mean much anymore. A 'programmer' could be anything from someone deep into the algorithmic side of things to someone who spends most of their time writing event hooks in a UI. I don't mean to imply any level of 'who's better' in that statement, just that they are very different concerns, often with little in common other than 'writing code'. Making any generalization about programmers is bound to be wrong.
A "programmer" is already somewhat the same as a "builder" is. A programmer programs, just like a builder builds.
A programmer can have a lot of different hats. A hacker (white or black), computer scientist, developer, etc, all do different jobs. You wouldn't put a hacker into a job of medical or financial responsibility, where a minor bug can be fatal or extremely expensive.
A builder can have loads of different hats as well. A carpenter doesn't do the same thing as an iron worker, a plumber or an electrician.
The carpenter probably could do a lot of the same things, with about the same error frequency as a hacker in a financial job would have. That error frequency would also decline as he got better at wiring like an electrician.
That's my view of what the "programmer" word means.
I'm curious about your experience with TDD and other hip programming approaches, because I've long been wondering if "web programmers are generally more vocal" has a lot to do with their popularity.
(Disclaimer: I love automated tests, particularly integration tests, and use them wherever it seems practical. But I cannot imagine developing the software I develop with a "unit tests first" strategy...)
My experience, while nowhere close to being representative of the aviation industry at large, has been that large industrial companies are somewhat schizophrenic in this regard.
I often find myself appreciating, for example, the care that somebody put into their automated build tools so that my unexpected corner case (which I originally expected to break things) actually worked the first time without a hiccup. Yet later in the same day I will be cursing the awful, terribly broken two decade old version control system that the same company is paying enormous sums of money to license.
TDD is one of the approaches that I advocate for often, though not the extreme versions that some people advocate for. Some people get it, while others think it's more efficient to wait until the last few days of a waterfall(ish) process before running comprehensive tests (because "testing costs money"). The panic that ensues when they discover a major problem days before a big deadline is never very pretty.
I'd kill for a TDD framework in the tools I use. While you label many things as "hip programming", my industry is several years behind the fast-paced methods of more popular platforms like the web.
For the record, TDD was taught in my university education course as a pretty core testing methodology. The reasoning behind it is sound as it really helps you program to contract and forces the programmer/engineer/manager/whatever to make critical thinking decisions ahead of time. Working "On the fly" isn't always practical.
Hard to get over just how far behind automation is. The big SCADA platform we use is 'disruptive' (i.e. cheaper and better than the entrenched players), yet they think we should be happy that the brand new version finally supports half-assed, internal version control.
I worked at a company with both a big web application and a bunch of hardware and firmware development. We showed the hardware and firmware developers our integration tests using cucumber, and after about a year they were test driving big swaths of their process using cucumber. (And they were better at it than we were.) My point being: with some work it can be possible to integrate the "hip" tools into your process, and sometimes it even makes sense and works ok!