destructionator's comments | Hacker News

> D2 re-write

No such thing happened. D has always been built on the same codebase, and the labels "D1" and "D2" are just arbitrary points on a mostly linear evolution (in fact, the tags D 1.0 and D 2.0 came only six months apart; 1.0 was just meant to be a long-term support branch, not a different language). It was the addition of `const` that broke most code, around release 2.6, but if you update that code, old and new compilers generally both work.

I'd say where D failed was its insistence on chasing every half-baked trend that someone comments about on Hacker News. Seriously, look at this very thread: Walter is replying to thing after thing saying "D has this too!!", never mind whether it's actually valuable irl or not.


I've actually tried to remove some features from D, but there's always someone who built a store on it.


That actually wasn't its intended use; that's a side effect. The original intended use came from Effective C++ by Scott Meyers: "Prefer non-member non-friend functions to member functions." It was meant to make non-member functions as syntactically appealing as member functions.
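A minimal sketch of that, assuming the feature being discussed here is UFCS (uniform function call syntax); the names are made up for illustration:

    import std.stdio;

    struct Point { double x, y; }

    // A free function, deliberately not a member of Point.
    double lengthSquared(Point p) { return p.x * p.x + p.y * p.y; }

    void main()
    {
        auto p = Point(3, 4);
        // UFCS lets the non-member function be called with member syntax:
        writeln(p.lengthSquared()); // same as lengthSquared(p); prints 25
    }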


The D parts of the compiler were released under the GPL almost from the beginning, in 2002. By 2004, a full open source compiler - what we now call gdc, officially part of gcc - was released using this GPL code. D was pretty popular in those years.


Just a personal anecdote: Walter Bright's Digital Mars C++ compiler also had contracts (D started life almost literally as recycled code from Mr. Bright's other compilers - he wrote a native Java compiler, a JavaScript 1.3 stdlib, and a C++ compiler with a bunch of extensions... smash those together and you have the early D releases!).

Anyway, I used the DM C++ compiler originally because it was the only one I could download onto the high school computers without filling out a form, and pimply-faced youth me saw "DESIGN BY CONTRACT" at the top of the website and got kinda excited, thinking it was a way to make some easy money coding online.

Imagine my disappointment when I saw it was just in/out/invariant/assert features. (I'm pretty sure D had just come out when I saw that, but I saw `import` instead of `#include` and dismissed it as a weenie language. Came back a couple years later and cursed my younger self for being a fool! lol)


The in/out features come into their own when inheritance is in play, i.e. for member functions of classes and interfaces. See https://dlang.org/spec/function.html#in_out_inheritance
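A minimal sketch of the rule the spec describes (hypothetical class names): an override's `in` contracts are OR'd with the base's, while all `out` contracts must hold.

    class Base
    {
        int f(int x)
        in { assert(x > 0); }
        out (r) { assert(r >= 0); }
        do { return x; }
    }

    class Derived : Base
    {
        // The override may loosen the precondition: it is enough that either
        // this `in` contract or Base's passes. Both `out` contracts must hold.
        override int f(int x)
        in { assert(x != 0); }
        out (r) { assert(r < 1000); }
        do { return x < 0 ? -x : x; }
    }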

`import` is so cool we extended it to be able to import .c files! The D compiler internally translates them to D so they can be used. When this was initially proposed, the reaction was "what's that good for?" It turned out to be incredibly useful and a huge time saver.
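A minimal sketch of how that looks (hypothetical file names, assuming square.c is handed to the compiler or sits on the import path):

    // square.c -- plain C, untouched:
    //     int square(int x) { return x * x; }

    // app.d:
    import square;  // the D compiler translates square.c to D behind the scenes

    void main()
    {
        assert(square(4) == 16);
    }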

The concept is sort of like C++ being a superset of C and so being able to incorporate C code, except unlike C++, the C syntax can be left behind. After all, don't we get tired of:

    struct Tag { ... } Tag;

?


> struct Tag { ... } Tag;

What's the problem with the syntax? If you don't intend to use the type elsewhere, don't give it a tag; if you do, you have to give it a name. (Assuming it's the duplicate Tag that annoys you.)


Which would you prefer:

    struct Tag { ... }
or:

    typedef struct Tag { ... } Tag;
?

It's just simpler and easier to write code in D than in C/C++. For another example, in C/C++:

    int foo();
    int bar() { return foo(); }
    int foo() { return 3; }
The D equivalent:

    int bar() { return foo(); }
    int foo() { return 3; }


If the user of Tag is supposed to know the internal details:

    struct Tag { ... };
if the code needs the complete type, but the user shouldn't care about the internals:

    typedef struct { ... } Tag;
if it can be opaque (what I would default to):

    typedef struct Tag Tag;  /* the definition lives hidden in the .c file */


I also think that's a good feature for separating specification from implementation; I like being forced to declare first what I want to implement. Funnily enough, in your specific example you wouldn't even need the declaration, thanks to C's old implicit function declarations. (But of course it's a bad idea to rely on that "feature".)


My C++ compiler also implemented contracts back in the 90s: https://www.digitalmars.com/ctg/contract.html

Modern C++ is slowly adopting D features, many of which came from extensions I added to my C++ compiler.


lol "11%" i guess sounds a lot bigger than "two cents".


I work from home with a young child. I think if I put a sign up, it'd just prompt more questions from her lol


Use the ASCII bell character "\a" and turn off the "visual bell" (or whatever the option is called) in the terminal - I hate those things - so you can actually hear it beep and find joy.
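For example, a minimal D sketch that rings the bell when a long task finishes (assuming the terminal's audible bell is actually enabled):

    import std.stdio;

    void main()
    {
        // ... long-running work here ...
        write("\a");    // ASCII bell (0x07) - the terminal beeps
        stdout.flush(); // emit it right away instead of waiting on buffering
    }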


What I find offensive about this post is using "neo"vim and neomutt. What, were the original AgentSmithMutt and TrinityVim not good enough? ...Is it because they're old?

Well anyway, I mostly agree with this post. Another similar thing I hate to see is stuff like "Really? In 2024?" - I'm just like, that's the laziest critique of something ever.


neo means new


So does "modern". So the same author who hates the term "modern" uses neo - synonymous with modern for all practical purposes - vim and mutt. Why is that? Might those reasons also apply to other "modern" things?


If you're forking software to introduce change… it is factually newer.


The recent push to Wayland in 2024 is an interesting choice, given how productive and usable X11 is.


I'm very glad people made the push! My configuration has been much nicer based on Wayland for the last 4 years or so than it was on X. Screen tearing and limited refresh rates on mixed-Hz setups are now a thing of the past :)


By that measure we'd still be using DOS these days (which was also productive and usable... and indeed the initial backlash against this newfangled "Windows 95" thing kept going for a while, not very dissimilar to the X11 vs Wayland debates)


I mean you're free to fork and continue developing X11; right now there is nobody with both the capability and the desire to do so.

I'd wager that once I get hardware made in 2024, Wayland may work well for me (though in its defense it does work fine on my one machine with an Intel integrated GPU), but for now none of my (very old) discrete GPUs work reliably with Wayland: 2 GPUs and 3 drivers (nvidia vs nouveau for my old GeForce, and "radeon" (not amdgpu) for my old AMD card) produce 3 symptoms:

1. Crashes immediately on login

2. Black Screen

3. Kind-of sort-of works, but sometimes the screen just freezes for no reason, and switching VTs sometimes fixes it, sometimes not.


> I mean you're free to fork and continue developing X11; right now there is nobody with both the capability and the desire to do so.

OpenBSD Xenocara


Last I heard, Xenocara was downstream of X11 rather than being a hard fork?


The era of a single machine is over. We need remote rendering for services on datacenter fleets without GPUs, so X11 is more often replaced by JavaScript in a browser (which can use the user's local GPU) than by Wayland.


Have fun yelling at that cloud for the rest of time.


X11 is deprecated. It has no active maintainers and barely even qualifies for "maintenance mode" status; the push to remove Xorg can be justified by the security issues alone.

Strictly speaking, Linux is "productive and usable" with nothing but a terminal multiplexer and a shell to work with. With expectations as high as they are in 2024, I don't think former Windows or Mac users will feel at home in an X11 session. Switching away from bazaar-style software development is a prerequisite for the Year of the Linux Desktop.


I really do like Gnome and Wayland. I use them every day. That being said,

Bazaar-style software development is the sole advantage the free desktop has over macOS and Windows.


Cathedral-style development doesn't necessarily mean closed source; it instead reflects the less modular nature of Wayland relative to X11. There aren't multiple desktops all using the same display server; instead, each desktop implements its own compositor around a common spec. Plug-and-play software has fewer and more restrictive interfaces to rely on. Modern desktop Linux is decidedly pared back, which is a good thing when you consider how scarily open Linux is in the right hands.

"sole advantage" isn't correct either - there's a plethora of reasons to use Linux. In the enterprise, people pay companies money to keep their Linux away from bazaar-level patches and randomly packaged repos. More casually, a lot of people don't use desktop Linux for a particularly advanced purpose and just treat it like a Mac/Windows/Chrome machine with fewer advertisements. Some people do very much get a lot of value out of the bazaar-side of Linux, but the comparison between the two styles wouldn't exist at all if Linux didn't entertain both philosophies.


> It only takes a little electricity to power this process, which can raise the refrigerant’s temperature by many degrees Celsius.

And the same electricity can raise the temperature by even more degrees Fahrenheit!


Heat in °F, chill in °C, et voilà! Free energy.


Unit arbitrage. I love it.


I'd buy that for a dollar!


Using a temperature scale built around water to measure air temperature. I mean, I can use it, but the range of Fahrenheit is more useful.

What we really need is a combination of the two: something that measures air temperature and water content, because 68°F at 5% humidity is a lot different than the same temp at 40%.


> Several accounts of how he originally defined his scale exist, but the original paper suggests the lower defining point, 0 °F, was established as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride (a salt). The other limit established was his best estimate of the average human body temperature, originally set at 90 °F, then 96 °F (about 2.6 °F less than the modern value due to a later redefinition of the scale).

Nothing beats scientific accuracy and thoroughness, right? So it then actually ended up being tied to water as well:

> For much of the 20th century, the Fahrenheit scale was defined by two fixed points with a 180 °F separation: the temperature at which pure water freezes was defined as 32 °F and the boiling point of water was defined to be 212 °F

(from https://en.wikipedia.org/wiki/Fahrenheit)


So that's why one unit Celsius is roughly 2 units Fahrenheit!

For some reason I never noticed there are exactly 180 degrees between freezing and boiling points on the Fahrenheit scale. 100°C is a nice "round" number, and 180°F divides evenly into a lot of smaller numbers.
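Spelling out the arithmetic from those two fixed points:

    \Delta T_{\mathrm{F}} = \frac{212 - 32}{100 - 0}\,\Delta T_{\mathrm{C}} = 1.8\,\Delta T_{\mathrm{C}} \approx 2\,\Delta T_{\mathrm{C}}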


> What we really need is a combination of the two. Something that measures air temperature and water content because 68F at 5% humidity is a lot different than the same temp at 40%

The "feels like" apparent temperature accounts for things like humidity and windchill[1].

Many weather apps provide the "feels like" temp, including my app: https://uw.leftium.com

I was going to drop the "feels like" reading in my new weather app (I just didn't notice a major difference), but maybe I'll keep it...

[1]: https://www.wikiwand.com/en/Apparent_temperature
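A minimal sketch of one common approximation, Steadman's apparent-temperature formula as used by the Australian Bureau of Meteorology (just one variant, and not necessarily what the app or the link above uses):

    import std.math : exp;
    import std.stdio : writefln;

    // Apparent temperature (Steadman / Australian BoM variant):
    //   AT = Ta + 0.33*e - 0.70*ws - 4.00
    // where Ta is the air temperature in degrees C, ws is wind speed in m/s, and
    // e is the water vapour pressure in hPa derived from the relative humidity.
    double apparentTemperature(double airTempC, double relHumidityPct, double windSpeedMps)
    {
        double e = relHumidityPct / 100.0
            * 6.105 * exp(17.27 * airTempC / (237.7 + airTempC));
        return airTempC + 0.33 * e - 0.70 * windSpeedMps - 4.00;
    }

    void main()
    {
        // 20 C (68 F) at 5% vs 40% humidity, light 1 m/s breeze:
        writefln("%.1f", apparentTemperature(20, 5, 1));
        writefln("%.1f", apparentTemperature(20, 40, 1));
    }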


Wet bulb temperatures account for evaporation, but I don't know if weather stations report wet bulb or dry bulb.

