
When one uses an existing word, it carries a lot of epistemic and ontological baggage, thereby causing trouble for those who are new to the domain. That’s why scientists opt for linguistic reform: inventing a new word during a conceptual or scientific revolution. That’s how the word “oxygen” replaced the old “phlogiston”.


When defining statistical entropy in his Cambridge lecture course on Coding and Cryptography, Thomas Korner adds the following footnote:

"It is unwise for the beginner and may or may not be fruitless for the expert to seek a link with entropy in physics."

https://www.dpmms.cam.ac.uk/~twk/Shan.pdf


Does Landauer's principle not already create a very strong link between the two? https://en.wikipedia.org/wiki/Landauer%27s_principle

It even predicts a direct correspondence between the bit and units of Joules per Kelvin, with the entropy produced in the erasure of one bit being k ln 2 ≈ 9.6e-24 J/K.

Unless I'm making some mistake, you can even use computer-science principles combined with Landauer's limit to resolve the Maxwell's demon paradox.
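The arithmetic behind that figure is easy to check. A minimal sketch (assuming the CODATA value of the Boltzmann constant and an illustrative room temperature of 300 K, neither of which is specified in the comment above):

```python
import math

# Boltzmann constant (J/K), 2019 CODATA exact value
k_B = 1.380649e-23

# Landauer: erasing one bit produces at least k_B * ln(2) of entropy
entropy_per_bit = k_B * math.log(2)   # ~9.57e-24 J/K

# Corresponding minimum heat dissipated at an assumed T = 300 K
T = 300.0
energy_per_bit = entropy_per_bit * T  # ~2.87e-21 J

print(f"entropy per erased bit: {entropy_per_bit:.2e} J/K")
print(f"minimum heat at {T:.0f} K: {energy_per_bit:.2e} J")
```

This reproduces the 9.6e-24 J/K figure quoted above (rounded from ~9.57e-24 J/K).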


The link was established quite early and it wasn’t fruitless:

https://bayes.wustl.edu/etj/articles/theory.1.pdf


My least favorite example of this is statistical "significance", which gives the impression of importance. I can't think of a word that has caused more trouble in science than that one.



