I am a guy who works with complex DSP algorithms all the time. I know how to think things through in arrays with actual numbers. I can't say how often I deciphered some obscure mathematical hieroglyphs only to realize: "Oh, that's what that means? I've been doing this for a while now; they could have described it with a sentence or two."
I get why mathematical notation looks like it does, but it is bad at a lot of things. It would be as if I tried to teach people a non-trivial piece of code by showing them the zeros and ones that make up the binary and naming all the important variables with ancient Babylonian letters. Sure, after a while the tiny fraction of them who weren't scared away by this might become fluent in a mad system like that, but if my goal is to make people understand the code, why not remove the stumbling blocks instead of adding more?
I think mathematicians, much like the people who weren't scared away by the zeros, ones, and Babylonian letters, have become fluent in a system that they now defend, because they are the initiated magicians who understand it.
As a programmer I am aware that there is a difference between what a piece of code expresses and the syntax/representation it uses. To me, mathematicians are like programmers who learned only Brainfuck and refuse to consider trying out more appropriate notations for teaching purposes, because that's what they had to learn historically.
> I think mathematicians, much like the people who weren't scared away by the zeros, ones, and Babylonian letters, have become fluent in a system that they now defend, because they are the initiated magicians who understand it.
No, there is no elitism behind it. Mathematical notation exists because it is the most compact form that is also unambiguous.
There are plenty of plain-English explanations of all of mathematics; those are used to understand the concepts, but to represent them, you use mathematical notation.
> Mathematical notation exists because it is the most compact form that is also unambiguous.
Citation needed. My experience so far trying to read pretty much anything written by a mathematician involves googling a LOT of symbols, all of which have many, many meanings depending on context.
If any of that were actually true, postfix/reverse Polish notation would be more popular. Math has its own language, with all the same problems every language has.
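To make the RPN point concrete: postfix notation really does eliminate one kind of ambiguity, since it needs no precedence rules or parentheses. A minimal sketch of an evaluator (the `eval_rpn` name and the four-operator table here are just illustrative):

```python
def eval_rpn(expr: str) -> float:
    """Evaluate a whitespace-separated reverse Polish notation expression.

    "3 4 5 * +" has exactly one reading, whereas infix "3 + 4 * 5"
    needs precedence rules (or parentheses) to disambiguate.
    """
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    stack = []
    for token in expr.split():
        if token in ops:
            b = stack.pop()  # operands come off in reverse order
            a = stack.pop()
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

print(eval_rpn("3 4 5 * +"))  # 23.0
print(eval_rpn("3 4 + 5 *"))  # 35.0
```

Despite this unambiguity, RPN never displaced infix outside of HP calculators and stack languages like Forth, which suggests compactness and lack of ambiguity aren't the only forces shaping which notation wins.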