todd8's comments | Hacker News

When I went to college, practically every student arrived with a slide rule. I bought mine when I was 14 years old. (I still have it, a Pickett model N4-ES: https://www.sliderule.ca/pickett.htm) We all learned to use them in high school and were expected to use them in our science and engineering classes.

There were mechanical adding machines (see for example https://www.burroughsinfo.com/portable-adders.html) but these were only practical for adding and subtracting and were heavy, bulky machines.

In engineering and the sciences, multiplication, division, trigonometric functions, and exponential functions were necessary. In the late 1960s, there were four alternatives for computing these operations: books containing printed tables of values (good for 5 or 6 decimal places of precision at best); desktop scientific calculators like the Wang 360, which were expensive and not very common (I remember using one only once, at MIT); "real" computers (running FORTRAN programs on punched cards, or perhaps APL); or the lowly slide rule.

Slide rules were everywhere that scientists and engineers roamed in the late 1960s. They had only three parts: a pair of fixed rulers, a sliding ruler that slid between the two, and a cursor, a thin precise line in a window that could be used to line up positions on the rulers. The rulers were inscribed not with evenly spaced marks (as measuring rulers are) but with marks starting at 1 (not zero) and going up to 10, spaced logarithmically.

Just as two yard sticks can be lined up to measure 5 feet, the rulers of a slide rule can be lined up to calculate the sum of two logarithms, and adding logarithms performs multiplication.
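In code, the trick is just that log10(a) + log10(b) = log10(a*b). Here's a minimal Python sketch of the idea (the function name is mine, purely illustrative):

    import math

    def slide_rule_multiply(a, b):
        # Lining up the rulers adds lengths proportional to log10(a)
        # and log10(b); the combined length marks log10(a * b), which
        # we "read off" by exponentiating.
        combined_length = math.log10(a) + math.log10(b)
        return 10 ** combined_length

    print(slide_rule_multiply(2, 3))  # ~6.0, up to floating-point error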

A typical slide rule had dozens of scales, with spacing corresponding to the trig functions, exponentials, hyperbolic trig functions, logs of logs, etc. A slide rule could perform almost any function needed for basic science and engineering. There were two limitations: slide rules couldn't calculate sums or differences, and they were only accurate to perhaps 3 digits of precision.

The limitation on precision was due to the difficulty of reading the scale accurately; the scales were only around a foot long. To get another digit of precision you'd need a slide rule with fine markings ten times as long as our portable slide rules. The MIT museum had examples of just such devices. Typically, the scale would be marked on the outside of a cylinder in a helical fashion. Such devices could then achieve more than four digits of precision. See https://en.wikipedia.org/wiki/Fuller_calculator for an example of a cylindrical slide rule.
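A rough back-of-the-envelope sketch in Python shows why scale length translates to digits of precision (the 0.25 mm reading resolution is my assumption, not a measured figure):

    import math

    def relative_error(scale_mm, read_mm=0.25):
        # Position on a log scale is p = scale_mm * log10(x), so a reading
        # error of read_mm gives dx/x = ln(10) * read_mm / scale_mm.
        return math.log(10) * read_mm / scale_mm

    for scale in (250, 2500):  # ~10-inch pocket rule vs. a 10x helical scale
        print(f"{scale} mm scale: relative error ~{relative_error(scale):.5f}")
    # ~0.00230 (about 3 digits) vs. ~0.00023 (about 4 digits)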

In 1971, some of my wealthier fellow students bought the Bowmar Brain. It was the first portable electronic calculator I ever saw and cost $240. In today's dollars that would be roughly $1900. All the device could do was add, subtract, multiply, and divide. One would still need a slide rule for trig, square roots, logs, etc. A few years later, HP and others came out with handheld scientific calculators and I retired my slide rule, which now adorns my home office in its original leather case (with the belt loop to carry the slide rule at your fingertips).

Coincidentally, I'm vacationing right now, and in the lobby there is a case containing an original Thacher Calculator on display (https://commons.wikimedia.org/wiki/File:The_Thacher_Calculat...), another variation of the cylindrical slide rule, invented around 120 years ago.


Time for hexadecimal.


Rhino horns are not made of ivory.


Yes... sorry, I was thinking of some related topics in this area, not just about rhinos. The elephant trade has a similar issue, which is why my mind was in that space.


Humans too can be committed to beliefs that are not true. I have a friend who believes in and regularly consults her "clairvoyant". I wonder if our AI assistants in the future will be vulnerable to suspicions or popular fantasies about the world, people, or even other AIs they interact with.


Isn’t commitment to beliefs that aren’t true part of the value of intelligence? Like right now multiple billion dollar companies are being built on different theories of the future of AI. They can’t all be true.


There have been decades of work on file name completion put into Emacs; nothing I've seen comes close. Currently I'm using the add-on packages vertico, orderless, marginalia, consult, embark, and corfu to handle completion. These packages all work together to produce a crazy good setup, but in the past I've used helm, ivy, icicles, and vanilla Emacs. Every one of these open source completion frameworks works great.

Of course, I’ll never get back the time spent fiddling with my 1600 line Emacs configuration file.


Oh man, I've tried using Ivy, but it's terrible. I tried typing '~/.c/r/r' to quickly get to my RetroArch config file, and I immediately got stuck in '~/.cache'. With Vertico it Just Works (R). Icomplete (whose vertical mode is part of Emacs now) can also do it, but I prefer Vertico's UX as the way Return and 'C-j' work in Icomplete is opposite to what I naturally expect and rebinding them doesn't work that well.


This isn’t obvious to me. It doesn’t have to simulate itself in real time. Still, it may not be able to simulate itself simulating itself; there may be a proof of this using a diagonalization argument.

Consider programs that are quines: programs able to output the exact source code of the program. See [1].
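For a concrete taste, here is one classic minimal quine in Python (a standard formulation; many variants exist):

    # A program whose output is exactly its own source code.
    s = 's = %r\nprint(s %% s)'
    print(s % s)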

And there are abstract computers that can produce any computable output. These are called Universal Turing Machines; see [2]. UTMs can be specified with a remarkably small number of internal states; see [3].

I’m not saying you are wrong, but computability is full of unintuitive results, and the answer may be more subtle than what is revealed by “just thinking about it”.

[1] https://en.m.wikipedia.org/wiki/Quine_(computing)

[2] https://en.m.wikipedia.org/wiki/Universal_Turing_machine

[3] https://en.m.wikipedia.org/wiki/Wolfram%27s_2-state_3-symbol...


It needs to be able to store a complete representation of different versions of its own state, in its own state.


Is there a reason for me to pick up Fleet if I already use JetBrains other IDEs?


My regular IDE is IDEA with a bunch of language plugins, so I think I have a reasonable idea where you're coming from.

The idea is to have a tool that can be either a lightweight text editor when you want that or a full-on IDE when you want that, and potentially use a remote machine (which could just be another machine on your desk, or could be a container on a server) for the heavy-lifting parts.

I like the premise, but the public beta (or whatever they're calling it) is a little rough to use daily, for me.


I've taught hundreds of people programming. For one year, at my first real job, I was tasked with teaching the programmers within a big company Pascal. All of them were already professional programmers using Fortran, COBOL, or assembly language. The company had a commitment to modernize their software development practices and one facet of this was to introduce a more modern language--Pascal at that time (circa 1976).

I enjoyed that year and, for the most part, my "students" were easy to teach, and Pascal held up well. They were all programmers already, so I didn't have to go slowly through the basics, and introducing ideas like recursion (absent in Fortran, COBOL, and assembly language of the period) kept the classes new and interesting to the students.

Unfortunately, standard Pascal isn't really a good choice for a first programming language today, even though modern Pascals have removed many of early Pascal's limitations (adding packages, batteries-included libraries, etc.).

So, how can we decide if a language is a viable first language? Employing the wisdom of crowds, here are the top 30 languages in the 2023 IEEE Spectrum rankings:

1-10: Python, Java, C++, C, JavaScript, C#, SQL, Go, TypeScript, HTML.

11-20: R, Shell, PHP, Ruby, SAS, Swift, Dart, Rust, Kotlin, Matlab.

21-30: Scala, Assembly, Perl, Visual Basic, Objective-C, Lua, Fortran, Verilog, Groovy, Julia.

I contend that we should limit our choice of first languages to some language in the top 30. This rules out many languages that I consider important, but we should stick with a language that is likely to benefit the student more than others as a first language. That means not teaching Lisp or Racket or Ada or OCaml or Prolog, or even D for that matter, as a first language.

To narrow our top 30 down to a handful of alternatives, I'm going to use my own subjective difficulty rating and remove C++ and Rust and Scala from the list because they will be too hard for beginners.

Furthermore, let's remove languages that aren't general purpose; this knocks out SQL, HTML, R, Shell, PHP, SAS, Matlab, Assembly, Visual Basic, and Verilog.

Some languages require more complex build or execution environments; I would remove Java, Dart, Kotlin, and Groovy from our list for this reason.

Finally, there are some languages that are not as easy to use outside of specific hardware or operating systems; this rules out C#, Swift, and Objective-C.

For good measure, I'm removing Perl from the list (a bit too irregular) and Fortran (not widely used outside of some important areas).

Our condensed list now looks like this:

Possible first languages: Python, JavaScript, Go, TypeScript, Ruby, and Julia.

To narrow it down even a bit further, we can observe that one should learn JavaScript before learning TypeScript. (Is this opinion shared by HN?)

Ruby and Python seem like close cousins with Ruby being a bit prettier and Python being much much more popular.

Julia is a bit specialized and a bit harder than the others so now we are left with just three: Python, JavaScript, and Go.


I think a first programming language needs to fit the person's motivation. If they want to do statistics, data analysis, and the like, then surely Python, R, and similar languages will be a better starting point than Bash.

Also, the teacher having a very good command of the language, as well as of all the tools for the environment they'll be using (*nix container, Windows, Mac, etc.), makes a big difference. Things like setting up VS Code to debug something are daunting for a beginner with no idea how to set it up. Language choice isn't that big of a deal, but having a teacher who knows how to guide the student and teach them everything in that language is (the article shows how the author, with a good command of D and its tools, taught effectively while he struggled with JS and Lua).


Limiting the choice to the top 30 programming languages based on one particular ranking can be biased and misleading. In the January 2024 TIOBE index, D ranks 21st, just outside the top 20, while Julia, one of your six finalists, is outside the top 30 at rank 33 [1]. D also supports scripting and a REPL like Julia does, but D is much faster at them, and fast feedback is very important for new learners.

For a proper pedagogical approach we should refer to the experts and educators. There is a paper from Monash University that identifies the major sins of introductory programming languages that make them unsuitable for new learners, including Pascal, which you used during your teaching days. I have provided a link to the paper in my other comment.

[1] TIOBE Index for January 2024:

https://www.tiobe.com/tiobe-index/


I would say of your four contenders, you eliminated Julia too quickly; it's perfect.

For one reason: it's a dynamic language with types. It's good to have types in a first language; they're too important to neglect. But with Julia, you can just introduce them later.

Also, 1-based indexing is easier on anyone who hasn't already gotten used to 0-based indexing, and it's the only language in your last paragraph where zero isn't false.

But it depends more on why they want to learn than anything else. If they want to make web apps, for some reason, obviously JavaScript, and if they like games, it should be Lua. Data and numerics stuff, AI? Julia obviously ;)


There are various reasons to rule out JS. It's a scripting language for browser engines; it's about as general-purpose as GLSL or ActionScript. Node grabbed it and forced it into a general-purpose suit it wasn't made for, but that's not a good reason to learn it as a first language. It's also simply a poorly designed language.

On the other hand, the reason given for excluding Java is seriously reaching. The JVM isn't "complex" from the student's point of view, and hello world (and other exercises) can be compiled with a simple javac invocation.


I think you tossed VB for the wrong reason. It's definitely a general-purpose language, but it's mostly limited to one platform, so that's handled a few lines lower and would still have led to it being tossed out. I agree with JS over TS for first learning because of the complete lack of a toolchain, though later on you probably want to move to TS. Agreed on Python as well. Go requires a lot of other knowledge and has a far more advanced ecosystem already, so I'd have ruled that one out too.


What happened to C? It's the #5 language, and you never ruled it out, but it's missing from the final list. It seems like a reasonable first language, with simple operations and not much magic happening behind the scenes.


On point, but C# is now widely available on any OS under .NET (Core) instead of the .NET Framework. It should be included in the shortlist, in my opinion.


It's available but not in common use, and where it is used, it's mostly used as a portability option (though it's very possible that's no longer the dominant use; I just haven't seen it used any other way outside of the Microsoft ecosystem).


The main deployment platform for back-end workloads using .NET is Linux. There are GUI frameworks that support it (Avalonia and Uno), and file and socket I/O support is first-class. The CLI tooling is platform-agnostic, and paid and free tools for developing in C# are available under Linux and macOS. With some effort it can also be run on FreeBSD (not to mention interesting but niche projects like BFlat, which can target UEFI directly).

For example, getting the SDK on Fedora is just

    sudo dnf install dotnet-sdk-8.0
It is as cross-platform as it gets, and far easier to manage than installing a JVM implementation and then dealing with a version switcher, or the fact that Java needs Gradle to build projects, versus the trivial dotnet new console; dotnet run/build/publish.


In my experience, the only places pushing for C#/.NET on Linux are places that were already C#/.NET shops before .NET Core started to support it.

It's still fairly popular, but that popularity is waning as languages with more modern design principles gain momentum without relying on a heavy framework like .NET. This could make it a poorer choice for somebody just learning, as there may be fewer opportunities for junior devs by the time they graduate.

Even Microsoft has started transitioning to Rust in some cases. I'd hesitate to recommend a language to somebody getting started if that language is under publicized risk of being replaced by its maintainer.

I would add that a purely OOP language probably isn't ideal for a first language either. Being able to start teaching with just functions is quicker than having to describe objects first.

An OOP-centric language is definitely appropriate for a second language. But if we are talking "intro", the faster somebody can type hello world while still having more programming concepts in the file than just the print statement, the better. For that last reason, I'd also exclude Python.

I always recommend talking about goals first then recommending a path.

If they know web dev is their future, JS.

If they know IOT is their future, C.

If they know games, probably still C, as C++ is still likely the path for a while.


What? None of this makes any sense, even if we go back a few years.


We are talking about a first language, not about people who are looking for a job right now. What about that doesn't make sense? These are people who are likely months to years away from getting a job.


> Furthermore, let's remove languages that aren't general purpose; this knocks out SQL, HTML, R, Shell, PHP, SAS, Matlab, Assembly, Visual Basic, and Verilog.

Just a nitpick, but modern PHP is definitely a general-purpose language, and quite a decent one. Assembly is also inherently general-purpose, and might have a much higher learning curve, but its foundational nature also makes it an important step in a layered approach to learning (a la Nand2Tetris).


I'd throw Python out too, as it's just far too slow*, which would set wrong expectations for learners. While it can be sped up by writing everything in C and calling it from Python, you've already thrown C out, so that's not an option. The remaining languages are unfortunate choices though: JS is infamously the language designed in 10 days, and Go is a Google project, so I wouldn't be surprised if programming in it awakens Deep Ones to feed on your sanity.

* <https://github.com/attractivechaos/plb2>


Why would execution speed be particularly relevant? Python is slow, but that hasn't stopped many useful applications from being written in it, and proper use of libraries written in C (e.g., numpy) mitigates the issue.

From a learning perspective, a slower language could actually be beneficial in the sense that it's impractical to brute force a solution to a problem in a way that you can get away with in C or similar.


Too slow for... what?

For beginners, whose main concern is implementing basic data structures/algorithms, Python is fine.

I still use Python for daily scripting (sorry, I don't get along with Bash) and it runs pretty fast.


> I'd throw Python out too, as it's just far too slow*, which would set wrong expectations for learners. While it can be sped up by writing everything in C and calling it from Python, you've already thrown C out, so that's not an option.

I'm not sure I follow you. Are you saying using e.g. numpy is not okay because it's implemented in C? How does that make any sense? The CPython interpreter itself is implemented in C, so everything interpreted by CPython is just as much "in C" as something else like numpy.
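A quick illustration of the point (timings are machine-dependent, so treat the numbers as illustrative only):

    import time
    import numpy as np

    xs = list(range(10_000_000))
    arr = np.arange(10_000_000)

    t0 = time.perf_counter()
    total = sum(xs)        # pure-Python iteration over boxed ints
    t1 = time.perf_counter()
    total_np = arr.sum()   # one call that runs in compiled C code
    t2 = time.perf_counter()

    print(f"pure Python: {t1 - t0:.3f}s, numpy: {t2 - t1:.3f}s")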


I missed why you tossed Lua from your list.

I can see why people think the toolchain may be hard for adults, but if the focus is on young learners, you can get Lua with the batteries included.


As a first language, I would teach Python or JavaScript, with Java, C#, C, and C++ as strong contenders depending on what they study, and possibly R or MATLAB if there is a very specific and good reason to do so.


These must be the masses of the Mars rovers; the weights would be measured in newtons (or pounds in the USA) and would differ between Mars and Earth.
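A quick worked example in Python (the 899 kg figure is roughly Curiosity's mass; the gravity values are standard references):

    mass_kg = 899          # roughly a Curiosity-class rover
    g_earth = 9.81         # m/s^2 at Earth's surface
    g_mars = 3.71          # m/s^2 at Mars's surface

    print(f"weight on Earth: {mass_kg * g_earth:.0f} N")  # ~8819 N
    print(f"weight on Mars:  {mass_kg * g_mars:.0f} N")   # ~3335 N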


In grad school, many years ago, I would plug the drain and let the tub fill while I showered. I would pull the plug and let the tub drain once the water had cooled. This way my bathroom would warm up a bit. I suppose this might have saved a few cents.


These days you can get passive ‘waste water heat recovery’ systems that sit beneath a bath or shower tray - warming your inbound cold water with your outflow, so you use less new hot water (or have to heat it less).
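For a rough feel of the savings, here's a back-of-the-envelope estimate in Python; the flow rate, temperatures, and 40% effectiveness are all illustrative assumptions, not manufacturer figures:

    # Energy recovered per shower by a passive drain-water heat exchanger.
    litres = 9 * 10               # 9 L/min shower running for 10 minutes
    drain_temp_c = 35.0           # water going down the drain
    mains_temp_c = 10.0           # incoming cold water
    effectiveness = 0.4           # assumed fraction of heat transferred

    recovered_kj = litres * 4.186 * effectiveness * (drain_temp_c - mains_temp_c)
    print(f"recovered ~{recovered_kj / 3600:.2f} kWh per shower")  # ~1.05 kWh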


I'd be curious to know how these systems manage with dissolved soap, partially dissolved skin cells, hair, etc.

