There is a FOMO, especially among younger developers, who want to use the latest and newest and upgrade all frameworks the moment they pop up on Microsoft's website.
Older developers (like me) have the opposite problem: I have been fighting for months not to upgrade from .NET 4.8 to .NET 8 because of compatibility with our current deployment chains, etc. In the end I had to admit that using .NET 8 for everything is going to work too, and is going to give us access to better tools and new tech, albeit with some teething problems.
There is no such community push, but like anywhere else, you'll see folks get excited about new toys and then try to force them in just to try them. It's no worse for C# than for anything else.
> Like engineers in that community look down on not using the latest features?
Yes. Engineers use the latest features heavily to demonstrate that their skills are current.
One of the worst such features is "var". Some tools even flag your code for not using var where you could. Inappropriate use of "var" makes code harder to read (even MS documentation says this), and makes code slightly less reliable too.
You are right, SomeMethod().Fire() has the problem too. But typically you write:
Employee e = SomeMethod();
e.Fire();
This does NOT have the same problem as var. When you use var, you reduce the opportunities for the compiler to catch your bug. So the best practice is to state the type explicitly when possible. It makes the code more readable too.
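A minimal C# sketch of the failure mode being described, with hypothetical Employee/Missile types riffing on the Fire() example (none of these names come from the original discussion):

```csharp
using System;

// Hypothetical types for illustration: both happen to expose a Fire() method.
class Employee { public void Fire() => Console.WriteLine("Employee terminated"); }
class Missile  { public void Fire() => Console.WriteLine("Missile launched"); }

class Demo
{
    // Imagine this originally returned Employee and was later changed to Missile.
    static Missile SomeMethod() => new Missile();

    static void Main()
    {
        var x = SomeMethod(); // still compiles: the compiler now infers Missile
        x.Fire();             // prints "Missile launched", probably not what you meant

        // With an explicit type, the change is caught at compile time instead:
        // Employee e = SomeMethod(); // error CS0029: cannot convert Missile to Employee
        // e.Fire();
    }
}
```

With `var`, the return-type change silently flows through and the wrong `Fire()` still compiles; with the explicit `Employee` declaration, the assignment itself fails to build.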
Microsoft's recommendation is reasonable. I don't think var should be used as much as Rider recommends, but var is perfectly fine in itself. The problem you illustrated occurs independently of var. You assert that one "typically" writes code a certain way, but I've seen plenty of both. Further, sometimes you need var: sometimes the target type can't be spelled out, as with anonymous types.
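A quick sketch of the anonymous-type case mentioned above, where var is the only option because the projected type has no name you could write (the data here is made up):

```csharp
using System;
using System.Linq;

class AnonDemo
{
    static void Main()
    {
        // The Select projection creates an anonymous type; there is no type name
        // to spell out, so var is required for the local.
        var rows = new[] { "alice", "bob" }
            .Select((name, i) => new { Index = i, Name = name });

        foreach (var row in rows)
            Console.WriteLine($"{row.Index}: {row.Name}");
    }
}
```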
Sorry for being unclear. Var is perfectly fine in many cases. Sometimes there are better tools. Screwdrivers are fine tools but they are not appropriate for driving nails.
from your sibling comment:
> "var" also makes your code less reliable as seen in this example
I disagree with this too. I think your example is a classic case of preprocessor directives making it difficult to know what your code is doing, because the build system conditionally changes your code. Var or not, you can't even know what's being compiled just by looking at the file, and that's a larger problem than the use of var.
Use of the preprocessor in the example is incidental. The problem with var is that you're not completely stating your intent; you're leaving it to the compiler to infer it. This can lead to trouble in some cases.
I have seen code that sorts numbers as strings, because nowhere in the code was the intent stated that the array is supposed to hold numbers. As a result, if the programmer forgot to convert strings to integers at input time, the bug is never caught.
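A minimal sketch of that failure mode (the data is made up): the inferred type is string[], so the sort is lexicographic, and nothing forces the strings-to-numbers conversion to happen.

```csharp
using System;
using System.Linq;

class SortDemo
{
    static void Main()
    {
        // Values read from input arrive as strings; var happily infers string[],
        // and nothing in the code states that these are supposed to be numbers.
        var values = new[] { "10", "9", "2" };
        Array.Sort(values);                    // lexicographic: "10", "2", "9"
        Console.WriteLine(string.Join(",", values));

        // Declaring the intended type forces the conversion to happen somewhere:
        int[] numbers = values.Select(int.Parse).ToArray();
        Array.Sort(numbers);                   // numeric: 2, 9, 10
        Console.WriteLine(string.Join(",", numbers));
    }
}
```

If the programmer forgets the `int.Parse` step, the explicitly typed version fails to compile, while the `var` version silently ships the string sort.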
Funny. I don’t need to work hard for that at all. It comes naturally to me. It obviously has its pros and cons, but to me, a shiny new tech has to prove itself first in order to deserve my attention.
There are, occasionally, the “wow, I got to have that” moments and those are great, but rare.
I stopped using C# 10+ years ago, but I remember wanting to update the framework and use the latest thing (LINQ to Objects) because it made code a lot more concise.
I can see someone defending Ruby as concise (the ability to chain different transformations) and simple to write in (low verbosity). Lisp? Define a class and do some map/filter/reduce. Lisp is verbose, sometimes even worse than Java 21 these days.
Just put the equivalent code side by side and look. You would be surprised at how much verbosity you choose to ignore.
Lisp done idiomatically is terse, incredibly so. It's been a while since I've worked in Lisp, but I never used a class, nor did anyone I worked with.
I purposely type everything when coding[0], which makes me painfully aware of all the verbosity and boilerplate. I last wrote Java about 12 years ago (first wrote Java 25 years ago), and it was among the worst at every point in time; I have very little desire to see what it evolved into, but if you can link to a terse, modern Java codebase I will take a look.
Chaining à la Ruby does not make code terse; good selection of primitives and good idioms do. The APL family, including J & K, is at the extreme, and nothing else comes remotely close.
[0] need to get on the AI assistant bandwagon .. haven't yet
So good to see this. I have seen so many developers use the latest C# features heavily, to avoid looking weak.