> Windows is for grandmas, like Macs used to be in the 90s. So not only does the desktop no longer matter, no one who cares about computers uses Microsoft's anyway.
This sounds like a very ignorant thing to state. Everywhere I've worked, the best programmers used Windows and Linux desktops. Why? My best guess is that they were too busy coding to get caught up in the Apple hype. Just like he talks about being ignorant of what was going on in the Microsoft world, many coders I worked with didn't know or care what was going on in the Apple world. There was one group I did notice flocking to Apple OSes: wannabe developers, the type who talk a lot about new tech but never wrote code that got deployed to production.
As a Windows/Linux developer, my reason is that both Windows (at least historically) and Linux offer far more opportunities for customisation than macOS, and are far less about hiding things and dumbing things down for the computer-illiterate, or locking the user out in the name of security.
I agree with your observation about those who prefer Apple products: it seems to be more of a fashion statement.
MacOS has been a buggy mess for years. From 2008 to 2014, I used to have three screens in front of me: Fedora (local dev server), Windows (Office and business apps), and OS X (iOS dev).
I had by far the most problems with OS X. Windows has been the “just works” OS for me since Windows 7. I’ve had zero bugs and zero crashes. I know that’s not universal, but I’ve spent thousands of hours with various Linux distros, MacOS, and Windows, so it’s also not exactly anecdotal.
The opinion part of that matters a lot less than the silly elitism of “good developers choose ____” OS. I like to be able to customize every aspect of my OS, so I preferred Linux for a long time and now typically use Windows (because it’s customizable enough and easier to use).
Well, I've just left a company where I'd been using Mac OS for 6 years. IT recommended against Linux because the drivers for the Mac hardware are supposedly buggy and slow... so I had to go on with Mac OS.
I've never had so many problems with a computer since Windows 95, and it was consistent across two different model MacBooks:

- Being unable to hold on to the correct resolution on an external display.
- Problems with peripherals (e.g. not detecting a mouse until after a reboot, or until taking out/putting in the Apple mouse's batteries).
- Every OS upgrade breaking something or requiring re-installs (Xcode, etc.).
- Grey screens of death.
- Terminal crashing(!).
- The mic stopping, needing you to kill the audio driver to get it working again (that hit many people; hilarious in a meeting).
- Insane memory usage: on a 4 GB laptop it's simply impossible to work because it's swapping pretty much half the time, compared to Windows 7, which is slow but serviceable on 2 GB with a similar workload (the 16 GB laptop was better).
- VPN mysteriously dying (works after a reboot).
- The GPU driver hanging and causing a reboot ("relaunching WindowServer").
- Slow boot time compared to a Windows laptop from around 2014.

And many more. For some of the memory and scheduler issues I went looking for in-depth explanations, thinking "is it really possible to f-up Unix so badly? You need some sort of special talent."
And almost every time I'd go online and there are tons of people having the same problem for years with no resolution, just like Windows 95.
When I started using it, I'd say "lol just like Windows" on every glitch, nowadays when something wrong/slow happens on Windows I say "lol like a Mac"
Oh, also - needing 3rd party software for simple things like setting a high-enough mouse sensitivity, window maximization that is not idiotic on multiple monitors, window snapping, or a system-wide equalizer.
Really, I found the only good thing about Mac OS is the terminal (that is, until the Terminal app started crashing after some update, taking all my tabs with it - a known bug for months, still not fixed). But then on Windows I set up ConEmu + MinGW64 for an only slightly inferior experience, in about an hour including research.
The fact that you're talking about Terminal and not iTerm makes this whole story very suspect, or at least puts you into a category of "people who never really tried to get the most out of their MacOS experience".
I don't really care much for bells and whistles around the command line, as long as the command line is good (which it is, because it's Unix - I just wish they'd stop making changes so it won't crash).
Yeah, I never really tried to get much out of my Mac OS "experience" because I just want the OS to let me find stuff and launch stuff. I found most UI features (not the UI itself) outside of actual applications (e.g. the IDE) plain annoying, just like the same crap in Windows 10. Spotlight is on par with Windows 10 search. Finder is worse, but more or less the same.
However, I do want my monitor to stop blinking from 2560 to 1920 for no reason, my terminal to stop crashing, and the OS to stop saying "hey, I can't hear you, kill coreaudiod" in meetings. And come to think of it, it would be nice to open files from non-open dialogs (I don't know why the context menu in dialogs can't just be a normal context menu). Also, can I not have files created after the date the app was opened show up in the "No Date" section at the bottom of its open/save/etc. dialogs when grouped by date? That's really annoying when you want to attach the latest file.
IMHO iTerm isn't all that great. I have problems with common keyboard shortcuts that work in every other *nix terminal emulator (e.g. jumping forward over a word). The response seems to be "just redefine these keyboard shortcuts yourself!"
What exactly does iTerm do that makes you regard it better than Terminal?
Input broadcast, for one. Insane customizability for another -- I have scripts that, in 1 click, can SSH me to all hosts for a service (dozens). That's super valuable.
The fact that you're talking about SSHing into dozen hosts and not e.g. pssh makes this whole story very suspect, or at least puts you into a category of "people who never really tried to get the most out of their terminal experience" :P
Definitely true, but I'm not on here defending the merits of the terminal!
I'll check out pssh. The cool thing about iTerm input broadcasting is you can turn it on and off rapidly, so you can start out broadcasting, stop for a sec to adjust one host, then resume. pssh looks like you'd have to establish a separate connection.
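For comparison, a minimal pssh run looks something like this (the hostnames are made up, and the command is guarded so it only runs if pssh is actually installed):

```shell
# hosts.txt: one target host per line (web01..web03 are hypothetical names)
printf '%s\n' web01 web02 web03 > hosts.txt

# Run the same command on every host in parallel; -h names the host file,
# -i prints each host's output inline as it finishes. As noted above, each
# host does get its own separate SSH connection.
if command -v pssh >/dev/null 2>&1; then
  pssh -h hosts.txt -i 'uptime'
fi
```

So pssh is great for one-shot fan-out, but it doesn't give you iTerm's interactive toggle-broadcast-mid-session workflow.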
Not being alone and being correct are entirely unrelated.
Also, an OS having bugs and an OS being "terrible" are also entirely unrelated.
Should I just link the bug trackers for each of the desktop environments available for Linux? Should I link the Linux kernel bug tracker (not that either such thing is actually fair)?
You're flamebaiting more OS wars, and honestly I'm pretty sure that's explicitly disallowed on HN, per the rules.
Like, really - do you not realize how biased you sound? Or rather - how _few_ people will agree with you, making the "statement of fact" pure flamebait?
BTW, please note that on Windows I can run various Linux variants right from the Windows app store. So having a *nix shell is no longer a point in Mac OS's favor. In fact, because we use Ubuntu in prod and I run Ubuntu on Windows, that point might now favor Windows for many users.
I still prefer using a real *nix instead of the Windows Subsystem for Linux, mostly because WSL's file system integration is still bad and Windows still doesn't come with a decent terminal emulator.
Linux, the kernel, is one of mankind's greatest achievements, I do believe this. The Linux desktop experience, however, is a living nightmare. Your computer itself becomes your second hobby, in addition to whatever you do on your computer. For a professional who's at work to produce novel software, that creates an additional time-sink they don't have the luxury of indulging.
This isn't even really a debate for developers, and pretending it is just baits the idiotic "Great OS Wars" debates that have zero rational actors. "What's the best OS (for x)" has been a flamebait-level question for probably 25 years.
Besides, I use "all" 3 OSes (MacOS, Linux, Windows) for various things, why does this have to be a, "YOU CAN ONLY PICK ONE!" conversation? I could happily develop on Linux or Windows, I am (along with most in the industry, based on my conversations with people over the years) just happiest on MacOS.
"just happiest on MacOS" is far from what you wrote earlier, "MacOS is, by far the best generic developer OS". This points to a problem I notice, hyperbole and exaggeration when discussing software development tools and technology.
Because it is the best generic developer OS (it's *nix + good desktop environment, no Linux distro can even come close to the latter), I am also just happiest with it.