Once you get to the advanced levels of any field, terminology being "accessible" doesn't really matter, but being precise does.
Areas like philosophy and law actually suffer in my opinion when they overload common words with uncommon meanings, or descend into weird disambiguations that depend on suffixes.
For example, in philosophy there's "contractarianism" and "contractualism", and trying to remember which is the general term and which refers to a specific theory drives me nuts. (If "contractualism" were just known as "Scanlon's theory" it would be a lot easier.)
Naming things after their creator is actually super-helpful: it's really easy to disambiguate, it helps situate things historically, and once you're at that level there often isn't a single unique word or phrase that can encapsulate the idea anyway without being confused with something else.
I disagree with you; in Computer Science we have things like: "Quick Sort", "Merge Sort", "Map", "Hashtable", "LRU", etc., etc.
They are much more descriptive and easy to remember, even though the algorithms themselves can be complex. Even the name "Boolean" could be changed to "Conditional" and be even more readable. Also, Dijkstra's algorithm can be generalized to "Shortest Path Algorithm" (there can be more than one).
Math, and physics to some degree, have become self-referential to the point that they start to read like esoteric black-magic books to beginners...
While CS was born out of math, and unfortunately has adopted some of the same esoteric symbolism, I hope Computer Science doesn't follow that path in the long term; otherwise it will become divorced from day-to-day real-life applications.
Let me give you a clear example:
Now, imagine if we called doubly linked lists "Darombolo lists", after whoever invented them (I made up that name). "Doubly linked list" is very easy to visualize and remember; "Giacomo Darombolo list" is not, and just adds to the pile of things you must cram/memorize to get work done.
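The descriptive name maps straight onto the structure; a minimal sketch of the node type (field names illustrative):

```python
class Node:
    """A node in a doubly linked list: each node links in two directions."""
    def __init__(self, value):
        self.value = value
        self.prev = None  # link to the previous node ("doubly" = two links)
        self.next = None  # link to the next node
```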
I personally don't like "cramming" useless trivia in order to work in my field. I hope Computer Science divorces from Math, and takes its own path to more logical naming of things and less useless symbols used in it.
It is like the whole field suffers because of the authors' narcissism, this insistence that things must be named after them.
Tim Sort, Hamming codes, Huffman coding, RSA keys, LZW encoding, Duff's device, Bloom filters, Carmack's Reverse, awk, Linux, git. We have a lot of things named after those who discovered, invented, or popularized a structure or technique. Certainly nowhere near as commonly as mathematics does, I will agree.
In CS, no doubt, we often end up on the other end, where a single term means different things in different contexts and beginners may get confused at our reuse of terminology. Often the reuse gives some metaphorical understanding to the newcomer, even if it largely leads them astray in the details.
Bloom filters prove OP’s point though. The first few times I heard the term, I wondered how a Photoshop filter to blur things could possibly apply to the problem. Maybe if it was called an exclusion filter it would be less jargony, I don’t know; naming things is hard.
At least having a word like "filter" in it narrows down the choices, even if it doesn't make it unique. If instead it were "Bloom's construction" or "Bloom's algorithm", or "Tim's procedure", we'd be at a total loss to even guess what it was about, which is what happens with a lot of math, starting from "Pythagoras' theorem". Does anyone instantly recall "Apollonius' theorem", the "Ackermann function", "Euler's function"? If "Fermat's last theorem" or "Goldbach's conjecture" weren't crazy famous I wouldn't have a clue. The request to at least give us a "Fourier transform", if not "frequency spectrum", is not unreasonable.
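For anyone who only knows "bloom" from graphics, what the structure actually does is tiny. A rough sketch, salting Python's built-in hash as a stand-in for k independent hash functions (a real implementation would size the bit array and choose hash functions far more carefully):

```python
class BloomFilter:
    """Probabilistic set: answers 'maybe present' or 'definitely absent'."""
    def __init__(self, num_bits=1024, num_hashes=4):
        self.bits = [False] * num_bits
        self.num_hashes = num_hashes

    def _positions(self, item):
        # Derive k bit positions by salting Python's built-in hash.
        return [hash((salt, item)) % len(self.bits)
                for salt in range(self.num_hashes)]

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # False positives are possible; false negatives are not.
        return all(self.bits[pos] for pos in self._positions(item))
```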
I've lamented this for a long time, but on the other side, I doubt mathematicians would ever get sufficient recognition if their names weren't immortalized thus, since they can't get patents on their works. They totally deserve recognition. Would you even remember Leonhard Euler if his work were named factually? Most of us, I guess, have no idea who came up with sin/cos/exp/log etc. I'm glad for the names of these functions, but lament the loss of knowledge about the one (or many) who discovered them.
Longer names are a candidate, along the lines of "Einstein's theory of general relativity" or "Euler's relative prime counting function", but they too will likely, depending on familiarity, collapse over time.
Do they get recognition? I'm an atmospheric scientist, and though I've used the equations many, many times, I have no idea who Navier-Stokes was. Or maybe they were two people? Whatever. Presumably they invented the equations and were scientists or mathematicians or something. If real recognition only comes from inside the field, everyone else just has a pedagogically unuseful name to deal with.
But could you propose a better name for the math terms you mentioned? Fermat's last theorem, for example, is famous because of its history, not its significance, and I don't think any other name would be better. Pythagoras' theorem: what short and meaningful name could it have? The only option I can think of is "a squared plus b squared equals c squared", which is hardly a good name :)
The alternative "Euclidean distance" is already half way there and is better since we at least know it's about "distance". At this point, offering any alternative will feel alien and unfamiliar, but "Linear distance" works for me if I feel the need to push Euclid out as well.
edit: If I want to talk about distance in a curved space, we already have a well named "Geodesic distance".
That sounds like a different theorem. While it coincides with sums of squares of distances in the Euclidean setting, in the case of a sphere or other manifold it is decidedly about triangles, not so much distances.
I was telling a friend about Bloom filters and he'd never looked into them, because he'd assumed from the name that they were some kind of screen space shader.
I'm a non-native English speaker, and I struggle to remember that Bloom filters are named after a person. I have at times, when coming across the term, ended up wondering how they're supposed to relate to whatever "bloom" the name refers to.
I don't think it really "proves the point", though, except for the point that sometimes names are confusing. So maybe it would be nicer if someone had happened to consider whether a specific naming might be confusing, but that's not the same as "names that aren't directly descriptive are automatically bad".
The point being made is that Photoshop manipulates matrices, while filters are circles of translucent material you put on your camera; and in fact many of those photographic filters don't primarily filter light but rather distort it in some desired fashion. "Transformations" would be much more accurate.
PageRank is a funny one: it's named after one of its creators, but it also describes what the algorithm does. Kinda like Baker's chocolate (named after Walter Baker, but popular for baking). I wonder if there are other examples.
PageRank is probably one of the more clever names out there. I didn't realize at first that it was named after the founder, as it's a good description of how it works.
rank (adj):
1. (of vegetation) growing too thickly and coarsely.
2. having a foul or offensive smell; very unpleasant.
3. (especially of something bad or deficient) complete and utter (used for emphasis).
>In software development, Linus's law is the assertion that "given enough eyeballs, all bugs are shallow".
>The law was formulated by Eric S. Raymond in his essay and book The Cathedral and the Bazaar (1999), and was named in honor of Linus Torvalds. [...]
>Validity
>In Facts and Fallacies about Software Engineering, Robert Glass refers to the law as a "mantra" of the open source movement, but calls it a fallacy due to the lack of supporting evidence and because research has indicated that the rate at which additional bugs are uncovered does not scale linearly with the number of reviewers; rather, there is a small maximum number of useful reviewers, between two and four, and additional reviewers above this number uncover bugs at a much lower rate.[4] While closed-source practitioners also promote stringent, independent code analysis during a software project's development, they focus on in-depth review by a few and not primarily the number of "eyeballs".[5]
>The persistence of the Heartbleed security bug in a critical piece of code for two years has been considered as a refutation of Raymond's dictum.[6][7][8][9] Larry Seltzer suspects that the availability of source code may cause some developers and researchers to perform less extensive tests than they would with closed source software, making it easier for bugs to remain.[9] In 2015, the Linux Foundation's executive director Jim Zemlin argued that the complexity of modern software has increased to such levels that specific resource allocation is desirable to improve its security. Regarding some of 2014's largest global open source software vulnerabilities, he says, "In these cases, the eyeballs weren't really looking".[8] Large scale experiments or peer-reviewed surveys to test how well the mantra holds in practice have not been performed.
>Empirical support for the validity of Linus's law [10] was obtained by comparing popular and unpopular projects of the same organisation. Organizations like Google and Facebook are known for their quality standards. Popular projects are projects within the top 5% by number of stars (7,481 stars or more). Bug identification was measured using the corrective commit probability, the ratio of commits detected as fixing bugs. The analysis showed that the popular projects had a higher bug-fixing ratio (e.g., Google's popular projects had a 27% higher bug-fix rate than Google's less popular projects). Since it is unlikely that Google lowered its quality standards in its most popular projects, this is an indication of increased bug-detection efficiency in popular projects.
> Organizations like Google and Facebook are known for their quality standards.
Citation needed, especially for Facebook. The parts I saw are full of vile hacks, committed by an endless stream of new clueless devs (young developers are of course better, by the dictum of Mark Zuckerberg). I'm curious how his code looks, or if he ever wrote anything substantial.
Common enough (there's a diversity of pronunciation and spelling of the name), though for any practical purpose "edit distance" probably works better. (And Hamming distance would probably be better called XOR distance or something.)
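Indeed, for bit strings "XOR distance" would name the entire computation; a one-line sketch:

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions where a and b differ: popcount of XOR."""
    return (a ^ b).bit_count()  # Python 3.10+; use bin(a ^ b).count("1") on older versions
```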
Funny coincidence, there was a comment on HN the other day when someone called Euclidean distance "bird distance", and everyone agreed it was a great coinage. I think "Manhattan distance" is similarly evocative, and no less precise than any alternative.
Very notably things like the Shannon-Hartley Theorem.
Gaussian noise.
The list in physics could go on for quite a long time. Terms like "Gaussian noise" are used as shorthand for something that would need an entire paragraph if described verbosely, as if the reader did not understand the fundamentals of the concept.
It would be harder to search for, and to a lay person it still doesn’t explain what it is. Even worse, given the ambiguity of the word ‘normal’ it could easily cause misapprehensions. Ask the person on the street what normal noise is and they’re probably going to answer in decibels.
What does the "normal" in normal (Gaussian) noise have to do with a vector that points orthogonal to a surface? (Not entirely rhetorical; it'd be interesting if there were a relation there.)
To some extent that reflects that in programming we encounter a mixture of naming origins between computer science (where naming follows an academic tradition that shares its heritage with Mathematics) and software engineering (where naming follows a tradition closer to that of other engineering tooling, where naming is more like branding - think 'Duck tape' or 'Allen wrench').
Often the closest thing we do to the whole 'Thurston maps are Thurston-equivalent to polynomials, unless they have Thurston obstructions' thing is that we qualify our statements about programming entities by the programming language or platform to which the statement applies. So you might reasonably have a statement like 'A JSON value consists of an array of JSON values, a map of string keys to JSON values, or a primitive value', which sounds just as self-referential as the Thurston example, but because it's called a JSON value, not a Crockford value, it sounds less conceited.
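That self-reference can even be written down directly as a recursive type; a sketch in Python annotations (the alias name is mine):

```python
from typing import Union

# A JSON value is an array of JSON values, a map of string keys to
# JSON values, or a primitive value -- the definition refers to itself.
JSONValue = Union[
    list["JSONValue"],
    dict[str, "JSONValue"],
    str, int, float, bool, None,
]
```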
I guess there's no commerce in mathematical theorems. But the use of his name as a generic (not just for products sold by the company which bought the trademark) seems similar in that it's a community decision -- we could have all decided to say "hex wrench" instead.
The absolute worst is something I can't remember the real name of right now: I thought it was "generic programming" but that doesn't seem to be it. The name doesn't connote the meaning in the slightest, and even connoted something other than what it was. It was just storing a matrix or array of past return values of a function to reuse. It feels like I'm having a Mandela moment.
Are you thinking of dynamic programming? It's the most poorly named thing in CS, in my opinion. The name tells you absolutely nothing about what it is, and IIRC the name "dynamic" was chosen because it's difficult to use as a pejorative.
Memoization? Or dynamic programming? The latter is a funny story:
> ... Thus, I thought dynamic programming was a good name. It was something not even a Congressman could object to. So I used it as an umbrella for my activities.
These are good examples but would all be better and more accessible with descriptive names. I doubt anyone would allow a pull request in which a developer named the variables after team-members, so why do we condone it when it comes to R&D?
Just because it's a common practice does not mean it's good.
> in Computer Science we have things like: "Quick Sort", "Merge Sort", "Map", "Hashtable", "LRU", etc...
Those are trivial concepts though. The article brings up "perfectoid spaces", as if to suggest their superiority to "Scholze spaces", yet neither name gives any clue as to what they are:
>In mathematics, perfectoid spaces are adic spaces of special kind, which occur in the study of problems of "mixed characteristic", such as local fields of characteristic zero which have residue fields of characteristic prime p.
>A perfectoid field is a complete topological field K whose topology is induced by a nondiscrete valuation of rank 1, such that the Frobenius endomorphism Φ is surjective on K°/p where K° denotes the ring of power-bounded elements. [1]
Oh, okay. See, the problem is that modern mathematical structures are built on top of centuries of prior work. Computer science, on the other hand, is still in its infancy as a field.
> I hope Computer Science divorces from Math, and takes its own path to more logical naming of things and less useless symbols used in it.
That's silly. Computer science is a subfield of math. All computer scientists working in research are mathematicians by training. You won't get anywhere at all if you try to enter the field without a mathematical background.
Well, CS is more like an extension of a very tiny corner of maths. And practically speaking, most computer scientists and 99% of programmers are terrible at math, to the point that they don’t even genuinely understand basic undergraduate concepts from linear algebra.
Working computer scientists (academics) are definitely not terrible at math. There are a great deal of complicated proofs involved in areas such as complexity theory, algorithms, and data structures. Their jobs are not altogether different from those of mathematicians.
The tight performance bounds for Union-Find data structures are ridiculously complicated to prove given how simple the algorithm is, even in a mathematical sense.
There isn't a lot of theory-building in algorithms, compared to the more traditional fields of maths, but the combinatorics are formidable.
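For contrast, here is roughly the entire algorithm whose near-inverse-Ackermann bounds are so hard to prove; a sketch with path compression and union by rank:

```python
class UnionFind:
    """Disjoint sets with amortized near-constant time per operation
    (inverse Ackermann) -- the bound that is famously hard to prove."""
    def __init__(self, n: int):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x: int) -> int:
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x: int, y: int) -> None:
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:  # union by rank
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
```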
Computer science is not a subfield of math. Theoretical computer science is, but there is a lot more to CS than just TCS, for example: operating systems, programming languages, human computer interaction, etc.
I agree on your first point though, I think the concepts named after people are usually so abstract that the hard part is to really understand the concepts, not to remember the name.
On a more serious note... most people confuse software engineering with computer science.
Computer science is a branch of mathematics, software engineering might not be.
Oh, and by the way, there's a bunch of mathematics and logic (again, a branch of mathematics) in language recognition and compilers...
Edit: consider that many prominent computer scientists were/are mathematicians: think of Donald Knuth or Claude Shannon, for example. They laid the ground for other stuff to happen.
Dijkstra wrote extensively on how mathematical illiteracy among working programmers is why software is generally of such poor quality.
He grew up in a world where he could only beg an hour a week of compute time off the Americans so his community put a remarkably high emphasis on software quality and clear semantics.
And that’s really what advanced mathematics is all about: clearly communicating otherwise intractably complex ideas by letting the symbols do the work.
> And that’s really what advanced mathematics is all about: clearly communicating otherwise intractably complex ideas by letting the symbols do the work.
I agree, and I think you nailed the point perfectly.
I think there's truth to it though, even at a high level: you end up in situations where you have logic that leaves you thinking "OK, this works... but it doesn't actually make sense", where something's hard to understand or debug because it's gotten tangled up into a functional but illegible mess.
All of the important ideas in all of the fields that you mentioned have an absolute foundation in math. Just because we are able to work without using math everyday does not reduce the relevance of math to programming at all.
Programming is math. I’m not sure why this bothers people so much. When you cross a bridge and it’s able to keep standing, you don’t feel gratitude to math? When you write any program, you should feel the same gratitude.
Is baseball also a subfield of math? The problem of swinging the bat can be reduced to physiology, which can be reduced to physics, and then to math. (Sorry for the snarkiness -- reductio ad absurdum just seems like the easiest way to argue this point.)
Baseball's development had nothing to do with math.
Engineering is not a subfield of math either, it merely uses math as a tool. As a field, engineering evolved in parallel with math, only borrowing mathematical methods when their suitable applications were discovered.
Computer Science is a subfield of math because it was developed by mathematicians as a direct descendent of algebra and the study of algorithms, which date back to the ancient Babylonian and Greek methods for division, computing the GCD of two numbers, finding square roots, etc.
It seems kind of strange to me to draw the lines based on anthropology. If there were an alternate universe in which a philosopher with no mathematical training invented Turing machines and so forth, would you consider CS a subfield of math in one universe but not the other?
Your classification seems as reasonable as any, but the lines seem fairly arbitrary to me.
> for example: operating systems, programming languages, human computer interaction, etc.
Those topics might seem to have nothing to do with math but all of their components have mathematical underpinnings. Algorithms, data structures, complexity theory, and even the physics of end-to-end latency, colour perception, etc.
There probably are some people out there, working in these fields with only a high school math background, but I'd imagine they're exceedingly rare. Anyone who's completed a CS degree has done their fair share of math.
Everyone who completed a physics degree has done at least as much math as a CS degree holder, is physics a subfield of math?
(I think there is a discussion to be had about making the distinction "CS" vs "theoretical CS" as the GP comment does, or if it should be "CS" vs some other term ("computing"? feels a bit general))
Physics is not a subfield of math. It has entirely non-mathematical origins (see Aristotelian physics [1]).
Computer science was developed by mathematicians as a study of algorithms, procedures for computing, and methods of abstraction. In the words of Hal Abelson, computer science is not a science and it's not really about computers in the same sense that geometry is not about surveying instruments [2].
There is the Curry Howard Lambek Correspondence (or should I say: the Types, Logic, Cartesian Closed Category Correspondence). Curry Howard in particular, says the act of providing a term for a type is the same as providing the proof to a theorem (modulo a few details). Note that this isn't the same as saying writing a concrete computer program and proving a theorem in a type theory are the same type of activity.
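A toy illustration of the terms-as-proofs reading, treating the pair type as conjunction and the function arrow as implication (Python's annotations only gesture at this; a real treatment needs an actual type theory):

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

# Under Curry-Howard, a total term of this type is a proof of
# "A and B implies A" (conjunction elimination).
def and_elim_left(pair: tuple[A, B]) -> A:
    return pair[0]

# Composition inhabits "(A -> B) -> (B -> C) -> (A -> C)":
# implication is transitive.
def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    return lambda a: g(f(a))
```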
Numerical methods and algorithms are fields of math as old as geometry, especially if we focus on the Babylonian or Chinese styles.
Hermann Grassmann sought to formalize arithmetic, not wishing to take its laws as granted. In doing this, he also connected recursion, induction, and the natural numbers (he would have known of recursion from its early application in the theory of combinatorics). Peano, Dedekind, Frege, Zermelo, and many others would also work on the foundations and axiomatization of mathematics and deduction. Computing began as a side-effect of attempts to formalize just how far such an approach could be taken. The Turing machine arose to tackle Hilbert's Entscheidungsproblem, the lambda calculus as an approach to the foundations of mathematics. Functional programming languages were originally part of tools meant to study formal mathematical objects, while logic programming sought to apply ideas from formal logic and the axiomatization of mathematics to automatically search for programs.
Dedekind said: "In speaking of arithmetic (algebra, analysis) as a part of logic I mean to imply that I consider the number-concept entirely independent of notions or intuitions of space and time, that I consider it an immediate result from the laws of thought."
What we find is computing reaches right to the foundations of mathematics. Whenever we try to systemize thought, we end up with ideas which seeming inevitably also lead to the foundation of computation.
It is like saying Chemistry is a subfield of math because at some point chemists use numbers to describe things (mass and such)...
It is not. Applied Computer Science has as much in common with math as Chemistry does.
I view theoretical computer science as mostly self-masturbatory, to the point that it is very, very divorced from real-life applications and benefits us very little.
Also, the market has spoken as well. Someone with a CS degree and 5 years of experience can command a higher salary than someone who took 5 years to get a PhD in CS. A PhD is not seen as valuable, mostly because it is not seen as beneficial, and it is very divorced from the reality of applied computer engineering.
I wouldn’t use money as a measure of how useful something is. Useful for getting a job sure, but to society not so much.
The computer science that helps big companies get more control is the most useful by this metric.
Oh look big data and AI are popular. Programming language theory to help create less buggy programs is less so.
You wouldn't have a compiler without someone having developed formal language theory. Or, at least, probably not one built on a solid theoretical foundation that actually happens to be helpful.
You wouldn't have complexity analysis of algorithms, with which most of us don't need to directly involve ourselves, but you do apply its results when choosing an algorithm based on the knowledge that was originally obtained through that analysis. Or if you're not choosing your algorithms, at the very least someone who chose them for your platform did.
You probably wouldn't have lossless data compression (and an understanding of it) at its present level without someone having done mathematically-based work on things like arithmetic coding [1] and range encoding [2]. Again, you probably don't write that code yourself (I haven't), but it's there.
The list goes on.
A PhD degree isn't really a good investment in terms of just salary in almost any field that I can think of. That just means work that's closer to (and directly applicable for) direct revenue streams tends to pay better than work that's further away from them. That doesn't directly mean work that's further away from revenue streams is less valuable down the road; it just means there's less certainty about its ability to help generate revenue, and that there are more steps, more interim work and a greater financial risk involved. While most businesses don't, and shouldn't, bother, that doesn't mean they might not benefit at some point if someone else does it. "The market has spoken" is a shortsighted way of looking at these kinds of things.
Sure, there are areas of theoretical computer science that are more similar to pure maths in terms of abstraction and applicability, and which are pretty much a pure intellectual pursuit. They are very far from engineering or applications. But theoretical work in CS is broader than that, and some of it underlies much of what we have in the practical world.
It's also true that most of software development and engineering work don't really require involvement with much of the theory, partially because someone else is already doing that work within the platform, and partially because most business software is actually theoretically more or less trivial.
Still doesn't mean the theoretical side is useless, because not all software is trivial.
Now this is most certainly wrong. Example (search "history research"):
"Research in history involves developing an understanding of the past through the examination and interpretation of evidence. Evidence may exist in the form of texts, physical remains of historic sites, recorded data, pictures, maps, artifacts, and so on."
Taking a very obvious example, you can't do much in the way of historical research without at least trying to establish which things happened before which other things. It's a serious problem in ancient history.
I'm pretty sure modern historians use a lot of mathematical tools: statistics, information science, digital archives, computer imaging, etc. It's very hard to search for this, though, because all of the results concern the history of mathematics. You have to examine the tools and methods used.
Pretty much every field in the social sciences and humanities requires their undergrads to take at least one course in statistics. Sure, these students may complain about it but they need to be trained to not make common statistical errors in their publications. Unfortunately, they still do, which highlights the importance of mathematical education even in these fields.
> in Computer Science we have things like: "Quick Sort", "Merge Sort", "Map", "Hashtable", "LRU", etc... etc...
And in quicksort, we have Hoare’s and Lomuto’s partition schemes. Not that “quicksort” is actually particularly descriptive.
We also have Timsort.
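For reference, the two schemes are genuinely different procedures. A sketch of quicksort with Lomuto's partition (Hoare's variant scans inward from both ends and typically does fewer swaps):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort using the Lomuto partition scheme."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = lomuto_partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)

def lomuto_partition(a, lo, hi):
    # Lomuto: pivot is the last element; single left-to-right scan.
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot in its final position
    return i
```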
> Even the name "Boolean", could be changed to "Conditional"... and be even more readable
Booleans aren't conditionals; conditionals in crisp binary (or, as it is commonly known, "Boolean") logic operate on booleans. (Conditionals in fuzzy or nonbinary crisp logics do not.)
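The distinction in two lines: a boolean is a value; a conditional is control flow that consumes one.

```python
temperature = 96.5
is_ready = temperature > 95.0  # a boolean: just a value (True here)

if is_ready:                   # a conditional: control flow consuming that boolean
    print("ready")
```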
CS needs to move much farther toward math, not away from it. Naming things well is always a worthy goal, but just like DDD tells us, it's a fool's errand in a global namespace. The distinctions between certain concepts are so subtle that it's arrogant to think that just choosing a better word is all it takes to make them clear.
Unfortunately, we rely on idioms and made-up terms for lots of complicated concepts, but I don’t believe that narcissism is to blame. I believe the magnitude of the number of concepts we need to know about overall is gargantuan, so much so that words would get overloaded if we tried to describe everything accurately. Which would be more confusing than it is now.
Also, you're assuming a minimum context of knowledge when you say something like "Shortest Path" is a better name for Dijkstra's algorithm. What if you don't know what a graph is, or know what a graph is but don't know what a path is? How is "Shortest Path" any less opaque? There's no lowest common denominator of knowledge, so having agreed-upon terms in a given domain is the only way to remain precise.
terms like "quick sort" and "map" are actively harmful because what they mean is VERY ambiguous, and in some cases becomes wrong over time. "quick sort" is no longer the quickest sort algorithm by any standard, it happens to just be quicker than some of the algorithms that came before it. "map" tells you nothing about the properties of the data structure other than that it 'maps' keys to values, but even the nuances of that could vary (can you have duplicate keys? duplicate values? do the keys have to be non-null?)
"Boolean" and "conditional" are semantically very different, the difference is significant.
The "there can be more than one" is specifically why it's important to call the algorithm Dijkstra. It's possible to look it up in Google or in a textbook and immediately find the algorithm in question. Generic terms don't have that property.
Calling this narcissism does everyone a significant disservice.
When other people want to discuss that work, they need to call it something, and some natural solutions are "an algorithm proposed by Dijkstra" and, subsequently, "Dijkstra's Algorithm", as you can see in this contemporary paper: https://www.ams.org/journals/qam/1970-27-04/S0033-569X-1970-...
As that second paper shows, you can’t really call it “the Shortest Path Algorithm” because others were developing other approaches for similar problems. “Shortest Path, Assuming All Edge Weights are Non-Negative and You Can Afford to Search Blindly Without a Heuristic, Algorithm” doesn’t really roll off the tongue either.
This is true, concepts aren’t usually named by the person inventing them. But it’s mind boggling that someone could consider it unfair to give the giants that created mathematics and laid the groundwork for physics and modern technology their due.
a) It'd need to be called that by Dijkstra himself to be even arguably narcissistic and
b) The qualifier "Dijkstra's" is important because there are other algorithms for finding a shortest path (Bellman-Ford, Floyd-Warshall, A*), with different trade-offs (Bellman-Ford is slower, but can handle negative weights; Floyd-Warshall gets you all pairs and may be better when the graph is dense). Accordingly, I think the grandparent's suggestion of purely descriptive names isn't feasible.
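To make the qualifier concrete, a minimal sketch of Dijkstra's algorithm in Python; the greedy heap pop is only correct because edge weights are assumed non-negative, which is exactly the assumption Bellman-Ford drops:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths. graph: node -> list of (neighbor, weight).
    Assumes non-negative weights (use Bellman-Ford otherwise)."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale entry; a shorter path to u was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# e.g. dijkstra({"a": [("b", 1), ("c", 4)], "b": [("c", 1)], "c": []}, "a")
# returns {"a": 0, "b": 1, "c": 2}
```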
At this point in time it is probably a better idea to simply list both Dijkstra and Bellman-Ford on a Wikipedia page for optimal pathfinding algorithms.
I'm not sure there's a huge difference between fields: CS also has AVL trees and the RSA algorithm (both named after inventors), red-black trees (mnemonic, maybe, but not descriptive), and B-trees (Boeing? Balance? Bayer? No one knows, not even the creators; see 16:10 here: https://vimeo.com/73481096)
The mathematical objects mentioned in the article are way more abstract than something like a sorting algorithm. There aren't any words in our vocabulary which would concisely help to describe these concepts.
The OP did point out that accessibility becomes less important in advanced topics. The things you've mentioned are largely elementary computer science concepts, so they're both more amenable to descriptive naming and more in need of it.
Consider coming up with a descriptive name for LLL basis reduction (where "LLL" is the initials of its three authors, Lenstra, Lenstra, and Lovász) or some other advanced algorithm.
Someone more educated in CS might be able to give better examples.
Very few concepts or techniques, whether in mathematics, CS or elsewhere, are named after people by themselves.
It's rather that others start calling the concepts by the person's name when discussing the concept or algorithm, after it has been introduced by that person in an article or elsewhere.
It would be good not to accuse others of narcissism when there's actually no narcissism involved.
A digression, but I don't think this is actually the case.
I readily acknowledge that "literally" is often used when a sentence is figurative; that's not the same thing. For "literally" to be used to mean "figuratively", the utterer would have to be worried that, but for the presence of "literally", the sentence might be understood to be literal.
I contend that the role "literally" usually plays is that of an intensifier. I believe it plays that role through ordinary application of hyperbole: the utterance "X is literally Y" is usually meant as "X is very Y; X is so Y it is almost as if it were literally Y; but of course you understand that it was not, in fact, literally Y - we're all reasonable people here."
In much the same way, when someone says "You left me waiting for days" and it's been a handful of minutes, we don't say "'days' sometimes means 'minutes'" - we say that people exaggerate.
I recognize that I'm disagreeing with at least one dictionary; I believe they got it wrong.
And I won't claim that there is literally no single person who in fact uses "literally" to mean "figuratively" - but I have never encountered such an example and I believe it to be rare enough that we can consider it an error, even in a descriptivist treatment.
I think that's because, when science started to get serious, Latin happened to be the lingua franca of those in Europe who could afford to be part of it. Equivalent deal for why we use Arabic numerals.
I wouldn’t be surprised if I turn out to be wrong, but I was taught the causal chain was:
1. The Romans spoke Latin
2. Catholic church based in Rome, did everything in Latin
3. Between tithes and indulgences, church got rich and powerful
4. The rich and powerful keep learning Latin to keep up to date with news from the other rich and powerful
Latin was used because it was the language you used to write important stuff; it was with the creation of France and Germany that other languages were upgraded in social status.
I think people have a misconception that mathematicians get together with pomp and ceremony and someone pounds a gavel and declares, "By order of the secret council of mathematicians, such-and-such theorem is hereby dubbed 'Davis's Theorem'", or something.
Rather, what really happens is that mathematicians are a community, and they refer to things in whatever way is convenient. Davis's colleague refers to such-and-such theorem as "Davis's Theorem" not because of some committee on naming, but rather because they were there at the conference where Davis announced the theorem, and everyone at said conference excitedly talked about "Davis's Theorem" for the whole rest of the conference because it was so exciting.
>These two similarly-named Hamming and Hanning (more properly referred to as Hann) window functions both have a sinusoidal shape. The difference between them is that the Hanning window touches zero at both ends, removing any discontinuity. The Hamming window stops just shy of zero, meaning that the signal will still have a slight discontinuity.
The Hamming Window is named after Richard Hamming.
But the Hanning Window is named after Julius von Hann, and lots of people just throw in an extra "ing" to make them sound alike, but its excruciatingly correct name is Hann Window.
But it seems fitting that they're almost but not quite alike, and so is their spelling. Maybe for symmetry there should be a Halling Window that stops just below zero.
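For reference, the standard textbook definitions make the difference explicit (length-N window, n = 0, ..., N-1; 0.54/0.46 are the commonly used roundings of Hamming's optimal coefficients):

$$w_{\text{Hann}}[n] = 0.5 - 0.5\cos\left(\frac{2\pi n}{N-1}\right), \qquad w_{\text{Hann}}[0] = 0$$

$$w_{\text{Hamming}}[n] = 0.54 - 0.46\cos\left(\frac{2\pi n}{N-1}\right), \qquad w_{\text{Hamming}}[0] = 0.08$$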
I love how they named the inverse spectrum the cepstrum, which uses quefrency, saphe, alanysis, and liftering, instead of frequency, phase, analysis and filtering. It should not be confused with the earlier concept of the kepstrum, of course! ;)
>References to the Bogert paper, in a bibliography, are often edited incorrectly. The terms "quefrency", "alanysis", "cepstrum" and "saphe" were invented by the authors by rearranging some letters in frequency, analysis, spectrum and phase. The new invented terms are defined by analogies to the older terms.
>Thus: The name cepstrum was derived by reversing the first four letters of "spectrum". Operations on cepstra are labelled quefrency analysis (aka quefrency alanysis[1]), liftering, or cepstral analysis. It may be pronounced in the two ways given, the second having the advantage of avoiding confusion with "kepstrum", which also exists (see below). [...]
>The kepstrum, which stands for "Kolmogorov-equation power-series time response", is similar to the cepstrum and has the same relation to it as expected value has to statistical average, i.e. cepstrum is the empirically measured quantity, while kepstrum is the theoretical quantity. It was in use before the cepstrum.[12][13]
Completely agree. And when using common terms, people also make all kinds of assumptions from them, and that comes with bias, before they even get a grasp of the concept the term describes in the given field.
Going to pile on to the agreement here -- for example, much confusion can be had when discussing things like "intentionality" with laypeople. Philosophy is riddled with regular words that take non-regular meanings. It might make philosophers feel smart, but it's a detriment to the field imo.
Much, though not all, of this is due to philosophers discussing ideas from the past three thousand years, originally in a multitude of languages. How should one discuss a "Platonic Idea" without confusing it with the everyday sense of "idea"?
People's names can be confused too. A few weeks ago I referred to the Legendre symbol as "Lagrange symbol" by mistake when talking to a colleague. Such things are inevitable however you name things; some words happen to have small Levenshulme-distances...
Particularly if the names are hard to pronounce or remember because of the cultural and language difference. Few French-speaking people would confuse Legendre ('the son-in-law') and Lagrange ('the barn').
Lagrange vs. Legendre is kinda really easy to mix up, since you have Legendre polynomials and Lagrange polynomials and both are important in numerical analysis...
To a certain extent both sides are right. What it really comes down to is communication. The real question is: who should the language be optimized for?
One side feels the intended audience is of the same field and sufficiently sophisticated to understand the somewhat obscure naming. Others don't understand it because it simply isn't for them.
The other side may come from other tangential domains with their own unique language. They don’t understand why those specialists use such obscure language meanwhile they do the same in their own field.
You see the same in any large organization. Seemingly random acronyms get created as lazy shorthand that conflicts with other orgs' understanding. It's of course of no consequence to those in the "in crowd", but it hampers communication between groups. Considering communication is one of the ever-present hurdles between groups, it seems reasonable to me to optimize communication between groups rather than within groups.
>Areas like philosophy and law actually suffer in my opinion when they overload common words
It's more that people take material from our fields and misuse them in casual contexts.
>If "contractualism" were just known as "Scanlon's theory" it would be a lot easier.)
It would also be wrong as it isn't his theory, he's just a philosopher with a recent in-vogue formulation of it. The source of the theory in modern western philosophy is Rousseau. There's no issue with discussing "Scanlon's theories"; but that term refers to his theories, not contractualism at large.
Social contract theory is known as contractarianism. [1] (And the source was first Plato, but is best known through Hobbes. Rousseau was then the next best-known iteration after Hobbes.)
But "contractualism" is generally used to refer to T.M. Scanlon's theory specifically. [2]
The preambles of the articles that you're citing do not agree with your position, nor do a number of other definitions found easily online, nor does the academic publication record (you'll easily find 500+ articles on Contractualism penned before Scanlon).
See: "There is no necessity for a contractarian about political theory to be a contractarian about moral theory, although most contemporary contractarians are both. It has been more recently recognized that there are two distinct strains of social contract thought, which now typically go by the names contractarianism and contractualism."
[...]
"Contractarianism argues that we each are motivated to accept morality “first because we are vulnerable to the depredations of others, and second because we can all benefit from cooperation with others” (Narveson 1988, 148). Contractualism, which stems from the Kantian line of social contract thought, holds that rationality requires that we respect persons, which in turn requires that moral principles be such that they can be justified to each person. Thus, individuals are not taken to be motivated by self-interest but rather by a commitment to publicly justify the standards of morality to which each will be held. Where Gauthier, Narveson, or economist James Buchanan are the paradigm Hobbesian contractarians, Rawls or Thomas Scanlon would be the paradigm Kantian contractualists. The rest of this entry will specifically pertain to the contractarian strain wherever the two diverge."
In other words, this isn't about Scanlon the person, it's about two different schools of thought regarding people and their relationship with society. To dumb them down significantly, one's about the selfish, desperate flight from the state of nature, the other is about the crafting of a persuasive encompassing rationality of co-operation.
----
Also, just as an aside, while this difference might seem like it's a small nitpick, it's actually one of the fundamental theoretical divides between continental European and American/British legal systems. So yes, there's a reason why the terms are distinct.
Could not possibly disagree more. "Contractualism" is far easier for most people to remember than the name "Scanlon". Proper names do NOT easily translate to concepts for the majority of people. If you can't properly define your concept without the use of a proper name, you aren't trying hard enough.
I think this trend needs to die, and luckily it seems to be falling out of favor.
Similar to naming companies after the founder. Tesla Motors could easily have been "Musk Motors" if it was started in the era of Ford and Lockheed.
Using suffixes and syntactic variations to distinguish concepts also does not translate, in the literal sense of translating to other languages, especially those that have different rules for suffixes. "Scanlon" or whatever other name is much easier to carry into any other language than random unusual words.
> Areas like philosophy and law actually suffer in my opinion when they overload common words with uncommon meanings,
This is definitely true. It doesn't matter as much for a technical person learning the field, but at a certain point it becomes absolutely impossible to communicate with regular people not steeped in the terminology (e.g. object, property, event, part, substance, sort, kind, type for analytic ontology).
Absolutely. When defining new areas, the most important things are not bringing along the baggage of irrelevant context, and being distinct from other things: anything that's unique and visibly distinct (e.g. not like Q123512). Using common words would be the worst, as we already have ideas about them, but the analogies break down fast and get in the way.
> terminology being "accessible" doesn't really matter, but being precise does.
There is also an argument that for experts names are even more accessible than the alternative - if a descriptive name is used but is wrong, then it becomes more confusing. Math already has a problem with overloading common words with special meanings.
> Many of these entities have been given simple and ambiguous names such as Euler's function, Euler's equation, and Euler's formula.
> In an effort to avoid naming everything after Euler, some discoveries and theorems are attributed to the first person to have proved them after Euler.
Absolutely agree. I'd go so far as to say this is one of the real strengths of mathematical naming conventions.
You are absolutely on point with the contrast to Law; I have degrees in both subjects, and this has also been my experience.
Jargon exists for a reason. The deeper the field, the better the jargon needs to be to allow proper communication. The fact that we can communicate complex mathematical ideas in language at all is a minor miracle.
Simply being able to look up the luminary's name, years of life, and location helps a lot with placing an idea in the evolution of our knowledge. If I hear of a theory made by someone in England in the 1700s-1800s, I already know this theory probably sprang out of the Enlightenment and I should be looking at contemporary works for context on the thinking at that time.
Because I don't give a shit about who allegedly discovered something first. I do care about what it means/does. Maybe it's a language problem, because some languages make it easier to join words into a new word describing the new thing.
edit: Otherwise it's just protocol overhead, line noise, gibberish to me.
Any name given to an object is necessarily going to be an incomplete description of said object. It's a tradeoff between the length of the description and its precision. If you want to know what the object does, you can look up its definition.
I ran into something like this with Perrow's 'Normal Accidents' - he defines common words with specific meanings, when he could have just as easily used two common words with no ambiguity.
If you use three words, you're in real danger of forming acronyms, but I think two words are a sweet spot.
> Once you get to the advanced levels of any field, terminology being "accessible" doesn't really matter, but being precise does.
If you really want to build knowledge barriers, then yes.
This might have been true decades ago when advanced academic concepts really might have been relevant for a small group of experts - but today, advanced math is the foundation of huge industries - and not everyone working with it can be assumed to have a formal education in the field: Often, you want to make use of an algorithm and simply need to know the concepts necessary to understand that algorithm.
Also, being precise doesn't have to mean being opaque - a name should give an uninformed reader at least enough information to roughly categorize the concept: Even "Timsort" is better than "Tim's algorithm", because this at least gives me a hint that I deal with a sorting algorithm.
I strongly agree that precision matters, but in no way does naming a concept after a person help with precision. Almost entirely orthogonal. What does help with precision is typed languages. You cannot use a term until you've precisely defined it in terms previously defined, which can be back-traced automatically by a computer to ones and zeroes. It must be a language both parseable by a human and a machine. At that point you have precision. That doesn't preclude using names as identifiers, but makes that largely irrelevant.
I strongly disagree that accessibility doesn't really matter. It always matters. Maybe not during draft time, but in the long run. There are strong economic incentives to make things less accessible for non-experts, but in no way does that help advance the field, or force experts to reduce elements to their most basics, it is just rent seeking.
I'm with you on the "Hard disagree", but I'd say something like: one of the great rewards to being a Mathematician is that things are named after you, and you get that legacy. God knows the pay ain't great.
> Once you get to the advanced levels of any field, terminology being "accessible" doesn't really matter, but being precise does.
And in what multiverse is "Calabi-Yau manifold" a precise terminology? Literally all that can be gleaned from that is that it's a manifold and it's the invention of some mathematicians (or maybe one mathematician with a hyphenated surname).
> If "contractualism" were just known as "Scanlon's theory" it would be a lot easier.
My disagreement would make diamonds look soft in comparison.
"Contractualism" at least conveys it might have something to do with contracts. "Scanlon's theory" tells me absolutely nothing about what it might be. That is worse by every objective measure.
But why do you have to be able to glean meaning from the phrase alone? And how often does that really happen in other situations, when last names aren’t involved?
The term “polymorphism” is a good example. It isn’t named after someone, but could anyone without a background in computer science have any clue what that term actually refers to? Sure, they could examine the root words and try to figure it out, but would they be any closer to the actual meaning of the word?
I don’t think it makes sense to make field-specific jargon accessible to the masses. Instead I think it makes more sense to make it easily researchable and distinct from more commonplace words.
> But why do you have to be able to glean meaning from the phrase alone?
Because it helps me wrap my head around what it is, what it does, and why I should care about it.
And by "me", I more importantly mean a rhetorical "me", i.e. a random layperson who happens to be a politician or someone else with disproportionate power over things like scientific endeavors. More on that in a sec...
> Sure, they could examine the root words [of "polymorphism"] and try to figure it out, but would they be any closer to the actual meaning of the word?
I mean, a little bit, yes. "Poly" = "many", "morph" = "form", "ism" = some kind of state of being, and from that someone could figure out that if something exhibits "polymorphism" it means it has many forms (and this does indeed provide at least some intuitive understanding of e.g. how a function can have many different implementations under the same name, and that implementation being decided by the form of its arguments).
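And the root-word reading does line up with real code. A sketch with Python's functools.singledispatch: one name, many forms, selected by the argument's type:

```python
from functools import singledispatch

@singledispatch
def describe(x):
    return f"something: {x!r}"  # fallback form

@describe.register
def _(x: int):
    return f"an integer: {x}"

@describe.register
def _(x: list):
    return f"a list of {len(x)} items"

# One name, many forms ("poly" + "morph"), picked by argument type:
print(describe(42))       # an integer: 42
print(describe([1, 2]))   # a list of 2 items
```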
> I don’t think it makes sense to make field-specific jargon accessible to the masses. Instead I think it makes more sense to make it easily researchable and distinct from more commonplace words.
And this is why the masses write off science and "them nerds telling us how the world works" as useless, and in turn why our planet is dying and humanity's decline into stupidity is accelerating. It's precisely why so few people trust science: because they don't understand it, because every effort seems to have been made to make it entirely opaque to anyone without decades of academic background that the vast majority of people cannot afford (schoolwork doesn't put food on the table).
Maybe - just maybe - we could instead try to remove barriers to entry rather than making STEM some elitist Kool Kids Klub that deems laypeople unworthy because they don't have the time or energy to memorize the names of a bunch of dead white men. Maybe then we can live in the world we all want: one where science and scientists are taken seriously, and where we don't wait until it's already too late before we even start thinking about addressing a self-induced extinction event.
I've always thought there was a nice breakdown here in math: name things after what they are if that can be stated concisely; name them after discoverers as a shorthand for here-there-be-dragons.
Is that the breakdown? Euclid's Algorithm, Pythagoras' Theorem, Fermat's Little Theorem are all elementary, but the example upthread of a "perfectoid field" which is a "complete topological field K whose topology is induced by a nondiscrete valuation of rank 1..." is much more abstruse.
In addition, names have another huge accessibility advantage: they are easy to translate. Using pedantic suffixes for precise meanings sometimes means that some languages run out of variants.
The thing is, there is a trade-off between making math (at any level) accessible to outsiders, and making things easier for the heavy practitioners.
When you name things after their inventors, you lose a lot of accessibility to outsiders (which includes mathematicians working outside the field). I could see the trade-off for clarity (for most people) not being big enough to offset the loss in accessibility. The easier a field is to get into, the quicker it will grow. Moreover, it also makes the field a lot more fruitful to work in.
Yet another example: International relations theory overloads "Liberalism", "Realism" and "Idealism". It's too easy to project meaning for already overloaded words like these.
Long articles and opinion pieces are written by people who saw these words somewhere, and it never occurred to them that they should check what the words mean in this context.