The key point in this Twitter rant is that the SWEs who understood the Mac (and NeXTSTEP, for that matter) have aged out. Most everyone has retired or moved on to other opportunities. There are very few of them left at Apple. And they're often overruled by younger managers.
Apple is hiring tons of fresh new SWEs who have no understanding of, nor desire to understand, "The Mac Way".
I'm not saying this is right or wrong; the world needs to move on. But it's a punch to my nostalgic gut.
If you're hanging onto the Mac out of some belief that Apple will re-find their way WRT macOS, don't. That train left the station years ago. It's past time to move on.
I am the "analysis software coordinator" for a nuclear physics experiment (MUSE). The software framework we use was originally developed by me, mainly on a MacBook Pro, targeting both macOS and Linux. We have to onboard new students regularly, and it's quite a software stack to compile (Geant4, ROOT, helper libs, our framework).

5-10 years ago, I was happy if a new student was on Mac. Slap Homebrew on it, install the dependencies, install the software. Pretty straightforward. But now it gets more and more brittle. New Xcode? Better start downloading new versions and recompiling everything, and pray that it works.

Now I am actually happy if somebody brings a Windows laptop. Slap on WSL, install Ubuntu, it just works. The number of students we had who already used Linux on their laptop I can count on one hand with no fingers :(, but our new postdoc directly ran into an OpenGL problem on AMD which makes the GPU restart... So from that perspective, Windows+WSL, with all its warts, comes closest to usable.
On a tangent -- you might want to look into Guix (at least just the package manager, if not the whole system/distribution); it seems quite popular for similar use cases. See, e.g., https://hpc.guix.info/about/
Also, with regard to Linux, what stops you from making that the default that students onboard to? I would imagine that something like this becomes quite easy if you have a culture where group members help onboard the new ones into the ways of doing things.
They typically have one laptop, and a) they are not happy if I tell them that they need to switch to Linux (and no Word, fewer games...), b) it's often a cheaper laptop with questionable Linux support.
It's not like we can buy them a work laptop. At least not for each undergrad.
Docker? After attempting to install dependencies and build GIS tools originating in the '90s, I am so thankful for people who have published Docker images of obscure software.
The students have to develop software using the framework, not just use the final program. I've always found Docker rather painful as a dev environment. Maybe I have to revisit it.
Used Ubuntu for almost a decade now. Wouldn't call myself a tinkerer with respect to the OS; the defaults are totally fine. Drivers are rarely an issue, in my experience. Maybe things have become better since you last tried?
I installed Ubuntu on my Acer laptop a few years ago. Fans didn't work after waking from sleep, so it'd overheat. Wifi was finicky. I didn't even try Bluetooth. Couldn't run Adobe Flash and thus couldn't watch the World Cup on the only legit website to watch in the US (Fox Sports).
Yeah, "not the OS's fault" etc. But Windows did all of that fine.
Windows did all of that fine because the manufacturer made the machine with the intent of running Windows and shipped it with drivers for its particular hardware. A System76 machine would be the same with Linux, apart from the Flash thing, so you aren't really saying anything meaningful about any operating system.
I'm saying it's painful for the user, who doesn't care about these details.
But speaking of laptops made for Linux, a lot of modern Lenovo ThinkPads have unpatched issues with Bluetooth audio, and my coworkers are constantly fighting with that. So even us techies have an easier time with Macs.
Ubuntu / Linux definitely has come a long way. I used to compile kernels, write X11 config files, now it just works. Timed perfectly with my lack of time / willingness to tinker for hours to get stuff working.
I've somehow found Bluetooth and sound in general to be the Achilles' heel of Ubuntu. On both installations I have/had in my family, either the sound stopped working overnight or Bluetooth no longer worked after an update.
Linux has had an arguably FAR superior UX to Windows since Vista.
Mac and Windows both have some great "power user" features - but I think for the average person, Ubuntu is far superior for usability.
I remember 10 years ago when my friends would use my Linux laptop and expect it to be impossible and couldn't believe how easy it was to use and figure out.
Maybe for a few rare laptops that have perfect driver support, it's on par. But this isn't a common case.
The main problem with Linux is on laptops - the power management is simply not in the same league as macOS.
> Linux has had an arguably FAR superior UX to Windows since Vista.
Has had like 20 different maybe-superior UXes, often 2-3 for a single distro... which in a way is worse than one bad one that you get to know well. Like, I logged into my Linux machine at work one day, and the entire UI was suddenly different, ???
I finally dumped Mac completely for Ubuntu LTS on a Lenovo X1 Carbon about 12 months ago. My experience is that I'm now tinkering with things less on Linux than I used to on macOS/OS X.
I have an external Thunderbolt dock that actually works. I'm not on the Xcode treadmill forcing me to update an OS because Apple dropped support. Updates are on my schedule instead of Apple's. I can use an actual standard 3D API (Vulkan) rather than the undocumented pile of crap that is Metal. My Bluetooth speakers are more reliable than they were on macOS. I can set my monitor resolutions without buying a dumbass Mac app. My printing system never gets confused, forcing me to reinstall all my printers. I can go on and on.
Is it flawless? No. About once a week, my cursor response goes to absolute shit for about 90 seconds for no obvious reason. Occasionally, I hit one of the Wayland corner cases. Lots of software still doesn't run on Linux--so I have to keep a Windows box floating around, but I had to do that even with macOS. I've had 2 hangs over the last 12 months.
But, overall, while I can list grievances about my Macs endlessly, I have to actually think hard about what has genuinely pissed me off about Ubuntu.
I'm not going to deny anyone's experiences because when that's done to me I just get pissed off. In addition, I made a point to buy somewhat more expensive hardware that was a decent match to Ubuntu (better quality laptop, i5 for better thermal performance, max RAM and SSD). I'm just providing a counterpoint that for some of us Linux is a better environment than Windows or macOS.
> Desktop Linux GUI is still painful and really only suitable for those who love to tinker.
For what it's worth, this is only true if you buy a device with poor Linux support, which is kind of like buying a Hackintosh and complaining about how much tinkering OS X requires. If you buy a good Linux box and put Fedora or Pop!_OS on there, I think you'll be productive instantly.
I’m actually trying to convince my current employer to let us use Linux laptops instead of windows ones (with Linux vms for all development).
My comment was in reply to someone saying Linux just works without needing to tinker, and as much as I’d prefer to be using a Linux based laptop, those are 3 things I have spent countless hours tinkering with to try and get working without success.
I assume that there are reasonable PC laptops out there which also have problems with “power management, screen sharing […] and certain Bluetooth headsets” …on Windows. If you were to encounter such a laptop, would you also consider Windows to “not work” with these three things?
I don’t encounter such laptops except when they run Linux.
Other OSes have their own litany of issues, however the topic under discussion was how well Linux just worked without tinkering, and these things are still pain points for Linux that I don’t encounter on other platforms.
This is not to imply that Linux is bad out that you shouldn’t use it, just that it is still not tinker free.
> Other OSes have their own litany of issues, however the topic under discussion was how well Linux just worked without tinkering, and these things are still pain points for Linux that I don’t encounter on other platforms.
My point is that yes you do. You just immediately attribute the problem to the other parts, not to Windows. Myriads of Windows users everywhere have all sorts of problems with all sorts of hardware, and everyone just accepts it as some sort of problem that they’re having. But a problem where Linux is involved somewhere? Then it’s suddenly Linux which is prone to problems.
I get battery life similar to Windows on my X1. I regularly share my screen in work-related meetings, and it works for me without any issues on GNOME with Wayland.
I no longer use a Linux laptop for work but as recently as a year ago, plugging an external monitor/keyboard in to my X14 wouldn’t wake it up (required manually opening and closing the lid), I could use the headphones on AirPods but not the mic, and screen sharing on teams (and a bunch of other video conferencing software) would only work on X11 and not Wayland.
This was running the latest Ubuntu at the time (20, and then 21).
> I could use the headphones on AirPods but not the mic
This is still the case. I was just trying to get this to work. You can't use the mic on a Bluetooth headset. There's supposedly a workaround that requires you to install and run a smartphone software stack on your laptop, but I could never get it to work.
> I no longer use a Linux laptop for work but as recently as a year ago, plugging an external monitor/keyboard in to my X14 wouldn’t wake it up (required manually opening and closing the lid)
Is this supposed to work on a Mac? Does it need configuring? Is it something I need to enable? I always had to remove and reinsert the power cord to wake a sleeping docked MacBook.
With my WH-1000XM4, macOS sometimes forgets it has an audio device when the Mac wakes from sleep, resulting in a need to manually disconnect and reconnect from the Bluetooth menu.
My Samson G-Track Pro USB microphone does not get recognised on M1 Macs. It worked fine on a previous Intel Mac.
Not the commenter but macOS regularly spams people with:
- Ads to try Apple Music free for a month
- Ads to try News+
- Ads to try Fitness+
- Prompts to try Safari (if it's not your default browser)
- Prompts to try/log into iCloud if you're not logged into iCloud
- Prompts to try the latest macOS release if you aren't on it
As an aside, you do not want to even log into the Mac App Store with a personal account on a work computer.
Logging into just the Mac App Store effectively logs you into your iCloud account on that machine, and sensitive personal data is then strewn across various files under ~/Library. The iCloud System Preferences pane will claim you are not logged into iCloud, but ~/Library will show you the truth.
I see them on iOS any time I go to settings or take a photo or open Apple Music. You might not consider them to be ads, though. “Your iCloud storage is full…” and so on.
Not sure if OSX is like that, though, as I haven’t used it much in the last few years.
Which popups are these? The only one I ever received was when I hit the limit on my iCloud account. Certainly not anywhere near as bad as all the adverts in the Start menu on my Windows work machine.
The point is not that paying $4 a month rent isn’t useful to some people. It’s that another group of people want to own a computer that doesn’t beg for rent at all.
The old macOS provided that; the new Mac, not so much.
The old macOS didn’t provide any cloud storage option, and the new one does, for a fee. That seems like progress, not begging for rent or a reason to be nostalgic.
I agree with you in principle. I just take exception to the claim that this is somehow worse than Windows where you have pervasive 3rd party ads (including telemetry) and no way to opt out.
Frankly, I am rather pessimistic with the industry as a whole. Problems never seem to stay solved. There is constant churn, constant disruption of workflow and UI patterns. Constant change for its own sake and no reflection on what is really important.
I haven't seen an ad in Windows so far.
OneDrive is MS's cloud and backs up your files for free.
The Mac has full-on telemetry, where it logs every program you run every time you run it, and may stop you from running it.
Apple is building the biggest ad network in the world, and it needs a captive audience for it. The Mac will be a small fish there, but it will be locked in and down just like the iPhone or iPad.
It seems like after a certain level of complexity product design is a very difficult thing to do.
My guess is that few people have the skill to really get product refinement just right over the long term. So you end up with either stagnation, where people are afraid to improve things, or random unnecessary changes that make stuff worse.
I've rarely found myself getting all that tied up in solving driver issues in Linux, especially after the first week of ownership of a new machine. Fighting annoying in-OS ads is, again, really a first-week-of-ownership sort of thing.
What I find painful about Linux is that some things will always be kind of broken. For me it was Bluetooth: I used it all the time, and it did work, but I had to fight it into submission fucking daily. Windows? My go-to example of what a shitshow Windows is, by far, is the incoherent settings menus; and generally, whereas Android, iOS, macOS, Linux, and all the other OSes people actually use follow a lot of similar conventions, Microsoft is SUPER SPECIAL and of course must do everything in its own unique way.
Jobs was against dogma. It’s a changing world and the perspective and input of the young is valuable.
What exactly are you moving on to? Ubuntu? Windows?
Behind macOS is the most valuable company in the world adding fixes, features and security patches. There isn’t any other computing solution today that can deliver what Apple does.
I moved on to Windows 10 and FreeBSD after many years on the Mac (I refused to upgrade my Macs beyond Mojave), but this is an interim solution for me since neither Windows 10 nor existing FOSS desktops match even Mac OS X Tiger when it comes to the user experience.
My long-term solution is building my own desktop environment inspired by the classic Macintosh Human Interface Guidelines, but it also takes ideas from OpenDoc, Smalltalk environments, the Symbolics Genera Lisp machine OS, and Plan 9 (itself influenced by Project Oberon when it comes to the GUI). The goal is an environment that encourages flexible, programmable tools built from smaller components that can be arranged to create complete software solutions, kind of like how a painter arranges colors and tools. Unfortunately I'm still early in the design phase and I have no code to share, and this is also a side project since I work full-time as a researcher and thus can't devote my full time to it, but I hope that by 2024 I'll have an early version working and released.
I am inspired by the vision of personal computing pioneers like Alan Kay, and I hate to see personal computing being transformed from user empowerment to platforms for selling services and serving ads to a captive audience.
I only spent a few years in Mac-land when I was working on an iPhone game, but having grown up on the Amiga and been blown away by Symbolics Genera, I seek out and support such visions of personal computing.
It's like we're going from 'bicycles for the mind' to 'mass transit for the mind', where more people than ever are going places, but the bus doesn't go exactly where you want at the time you want to go there, the seats aren't adjustable, the other passengers are sometimes annoying or even dangerous, and nothing is under your control or at your convenience.
I'm also beginning to see that those of us who were lucky enough to experience the individual empowerment granted by the original vision of personal computing must work to keep it going, else we end up with the kind of world that personal computing was reacting against.
That sounds amazing! Can you build it on top of Linux so I can run it on Asahi Linux on a MacBook? I refuse to use a laptop that isn’t running an Apple M-series chip now. Intel/AMD have a long, long way to go to catch up and they don’t seem to have any interest in doing so.
I've also moved on... But unless someone out there gets their hands on a CPU comparable with M2, what would be the point of not upgrading back to a Mac? It's not like Windows is a significantly better experience (gee, the menu item search is such an obvious and necessary thing, is that covered by a valid patent?!)
I can't see Linux (or another GNU-desktop-nixlike) becoming a full-time macOS replacement for me because no DE works the way I'd like it to. You've got KDE (Windows-like), GNOME (what if iPadOS were adapted into a desktop OS), XFCE (middle ground between Windows and classic Mac OS), Cinnamon (classic Windows-like), and Pantheon (what if iPadOS looked like OS X Mavericks). None is an obvious choice for someone who prefers a Mac-like setup. RavynOS/Airyx/etc. has been posted here a few times but is a long way from being usable.
I've considered just building a true Mac-like DE myself, but the process of getting a garden-variety traditional floating window manager for Wayland set up makes my head spin, even with things like wlroots existing. It doesn't feel the least bit accessible to someone used to building consumer-facing native desktop/mobile apps.
How come macOS magically just happened to work the way you liked it to? Answer: It didn’t, you just adjusted yourself to use macOS, but now you’ve ossified and think everything not what you grew up with is unnatural.
Last time I used Linux, GNOME was nearly perfect apart from the huge toolbars they insisted on to satisfy touch users who don't exist. Hopefully someone sees the light in the next decade.
My concern with a classic Mac OS interface (or even a classic Windows interface) is how newer features get mapped onto those sorts of DEs. We're used to so many features (and little apps like the Windows snipping tool), and to features we might normally reach through a given OS's search bar (this feature seems to have come from OS X and moved elsewhere; I'm even forgetting what Apple called it…). Would there even be a way to make an equivalent? Maybe I'm showing my age (or maybe lack thereof; I only know the "bicycles for the mind" quote from watching a Nightline segment featuring Steve Jobs on YouTube, or I saw it somewhere while reading about him and/or the early PC era)…
Maybe someone should make a Linux distro with a DE preconfigured to work "the Mac way", kind of like Linux Mint is preconfigured to be "the Windows way".
That's the problem with the current state of their OS: it attempts to do too many things without regard to simplicity. I could name a few things macOS could do without.
Remember 2000-2005 Microsoft that tried to lock you into Internet Explorer and was deeply into this thing called "trusted computing"?
That's what you're missing.
You're required to sign in for updates so they can track you. No software patches without monitoring, I guess.
No way to remove iTunes or stop Safari from trying to take over again and again. Even if you don't want music to open that way.
Steve Ballmer would be proud.
Remember 2005-8 when you tried Ubuntu for the first time and were wowed with Compiz Fusion compared to the drab Windows XP/Vista/etc window management?
That's the feeling you get every time you step away from the macOS Finder and the atrocious Mac window manager.
You shouldn't need a 3rd party plugin to manage window placement. It shouldn't be difficult to navigate to a directory via a path or run a program or have the file browser automatically resize its icon view. But that's Mac.
> Also you're required to sign in for updates so they can track you. No software patches without monitoring.
This hasn't been my experience. macOS still incessantly bothers you about OS updates even if you decline to sign into an Apple account during first boot, and you can download major release updaters independently of the Mac App Store (e.g. using the script from OSX-KVM).
> You shouldn't need a 3rd party plugin to manage window placement.
If it's a feature you want, and there are multiple ways of handling it, why not go with a third-party app with a window placement scheme that works for you? I do.
> It shouldn't be difficult to navigate to a directory via a path or run a program
It’s not. I bet I can run any program on my Mac in 6 keystrokes or less. I easily navigate via paths.
> have the file browser automatically resize its icon view.
Personally, this sounds like an extreme edge-case preference to me. I would hate this behavior.
nothing. you're probably saving a lot of money, something objectively true.
fwiw safari is the no 1 worst browser i have ever used, as a developer. it's hot garbage. not the renderer, the browser itself and its dev tools.
plus you need an apple account and a 10 trillion gb xcode install just to port your chrome extension to safari, what a joke.
and every couple of years apple decides to drop backwards compatibility for X number of products/services you've been happily using because they need to sell new machines or something.
Seriously, how hard is it to implement look-behind regexes? If you have even one in your code, the page will crash in Safari.
https://caniuse.com/js-regexp-lookbehind
e-v-e-r-y-t-h-i-n-g. but hey I’m jealous because you have so many pleasurable experiences to come. I suggest jumping in and buying the new iPhone 14. Go all in get the Pro Max. You’ll get a massive return on the investment.
I’m a tech junkie, I’ve had it all. Doesn’t get better than Apple
My answer was everything - you miss the entire computing experience if you use Windows or Linux on the desktop today, or Android on mobile. In 2022, if you aren't using Apple, you're in the red. You might as well not have bothered. Windows is worse than nothing. Android is worse than nothing. Linux is awesome for servers, but the desktop experience isn't quite there.
I've used all the platforms. Windows and Android are incompetent, poorly designed, serve the user LAST, poorly updated, insecure, spying on you, and annoying AF to use. Worse than nothing. Worse than no computer or phone at all.
Just trying to help, really. Using the wrong products has done a great deal of damage to me. Using Windows instead of a *nix back in the day blocked my progress. Android shares a number of Windows-like traits: being a commercial imitation of an Apple product, for instance. Or serving the user last. Or just being crap software full of bugs.
If you can't listen, your punishment is to keep using the imitation product. Harsh. But it is what it is.
For the record, I'm very familiar with most of the OSes you mentioned: Windows, Linux, macOS, and Android (on a variety of devices). And I think the only fair statement is that they each have unique strengths and weaknesses. You can worship macOS to high heavens, but it is also lacking in many ways, and owning a MacBook comes with numerous downsides (as well as upsides) compared to a Windows laptop.
The only OS I don't have much experience with is iOS. I've stayed with Android the whole time - largely with Samsung. A high end Samsung phone is an absolute powerhouse, while also looking pretty sleek, if I may say so. I personally think that iOS and iPhones are quite limiting in certain ways. But I would never knock on someone for using iOS. The OS and the phone look very nice and trendy, the whole experience is very well optimized and, at least in the U.S., you'll get certain social perks by having one. Even for older folks, it could be a great choice, because their grandkids will be able to fix most problems that could arise in their phones.
Let me revisit a prior point about iPhones: they're not very customizable. On my Samsung, I have a number of customizations: a custom launcher ("Nova"), which enables me to tweak things like icon sizes, density, and many other things (my favorite feature is using a custom icon set). I could easily keep naming countless other aspects that Android allows me to customize, even if I only customize a few of them.

The default experience of a Samsung phone is quite solid, other than maybe having a few too many Samsung-specific apps pre-installed (though at least I never buy through a carrier, so I don't have to worry about any of that bloat). But it's very trivial to nerf them by disabling them or even simply uninstalling them using "pm uninstall" through ADB. For me, it takes maybe 20 minutes total to apply the various tweaks I like to a brand-new phone. I have them all written down neatly and they haven't changed much for a while, so applying those customizations is no bother for me at all.

But for others? It would be an unimaginable punishment. Many people prefer to buy an iPhone not only for all the benefits I listed earlier, but also because it is more restrictive. Because of this, you don't have to deal with any of the 'pain' of deciding what should be customized and whether you should do so in the first place. But would I judge anyone for taking either route: 1) a Samsung that you customize to a reasonable extent, without rooting or unlocking the bootloader, vs. 2) an iPhone that is effectively at stock settings, other than the apps you've downloaded? No, I absolutely would not. Which brings me to my last point..
I think we live in amazing times in terms of technology (aside from the obvious issues related to privacy and social media, which are discussed at great lengths here and elsewhere). We have many choices in terms of hardware and software and our devices can do things we could hardly fathom a few decades ago. I rather arbitrarily chose to focus on smart phones in my rant, but I could have easily discussed laptops, desktops, servers, or some other type of device to make the same point. So I say, "Live and let live."
I agree with everything you say. I cling to macOS while trying to migrate back to Linux. To Pop!_OS or some other suitable Linux, but for the past couple years it’s been Pop.
Your assertion about “The Mac Way” is simply false. The VP in charge of macOS is an ex-NeXT person; a lot of ex-NeXT and Apple people still work on it.
macOS user experience continues to be driven from the UI design group and executives, as it always has been.
Feel free to criticize the design direction, but understand that it’s fully top-down intentional, not something that just happens because individual software engineering positions are suddenly filled with non-Mac people. That’s not how it works.
I think Apple has already sailed the seas of no return. Another company that puts users first will eventually arise and take its place. Life is a cycle of highs and lows. Thank you for coming to my Ted Talk.
Microsoft and Apple, the duopoly who control nearly 100% of the consumer desktop OS market, were both founded in the 1970s.
In comparison, Google was a relative "newcomer" in the late 1990s. Google and Apple control nearly 100% of the consumer mobile OS market.
There are no challengers. 3 companies own all of personal computing.
Oh yeah, these companies also collectively control the web browsers: Chrome, Safari, Edge. And they control the search engines: Google, Bing. (Google leveraged its search monopoly to achieve a mobile monopoly, which is how it crowded into Microsoft and Apple's world.)
> Another company that puts users first will eventually arise and take its place.
I doubt it, new companies spring up like mushrooms when fields gets disruptively innovated, but big companies tend to stay otherwise.
You can see it here: lots of shifts in the biggest companies in the first half, while the second half is dominated by Apple, Microsoft, Google, Alibaba, Facebook and Amazon.
Or ask yourself: when was the last time a new American car brand was created? Tesla, thanks to the electric car disruption, but other than Tesla it was a very long time ago. The same thing will happen in software once disruptive events get less common.
I was taking apart my Dyson today to clean it out. I was thinking that it was elegant and useful. I sort of like the company because it doesn’t attempt to get in the way and just provides a nice experience for a tad higher price.
Then I thought: vacuum cleaners have existed for a while now. There's the Whirlwind, Bissell, Kirby, Electrolux, Hoover, Oreck, Dyson, and most recently iRobot. Each generation had its brand.
We’re living through cycles and computers with operating systems are no different. With all the Chromebooks floating around in schools today, I bet you’ll have a generation of kids who will do everything on a web browser in the cloud and won’t complain about not having an exposed filesystem.
This is funny because I just went down a vacuum research rabbit hole and Dysons are despised by many techs and aficionados. Bagless vacuums in general are seen as a downgrade, and Dyson was the biggest hand in pushing their popularity.
My favorite is one vacuum tech who said Dyson’s biggest innovation was having a clear chamber that showed you the moving dust. Not much utility, but great product marketing.
But because this is HN, I look forward to a grizzled vacuum industry veteran coming by to put me in my place.
Microwave ovens are simple to design, and that makes them not comparable; almost all the cost comes from running the factory and buying inputs.
A consumer-level OS, on the other hand, is extremely expensive to develop and basically free to copy, so the advantage of scale is extreme; you won't see that level of competition there.
80% of the people working at Apple now were hired after Steve Jobs's death. Which means that most of the old Mac and NeXTSTEP folks are likely all retired.
Those who first put out the Apple HIG, along with other software UX experts, left as well, after Tim Cook fired Scott Forstall and named Jony Ive CDO.
Most of the NeXT folks I worked with left many years ago. Perhaps generous stock options contributed to the exodus. Or perhaps once OS X was clicking along with a new Window Server and graphic sub-system, replete with an Objective-C-based Cocoa API, they felt like the work was complete.
There are nonetheless many of the old guard still there — presumably navigating the SwiftUI transition.
And Craig Federighi is, in my opinion, a very capable captain to be steering the OS ship for Apple.
I have been hearing this for almost 2 decades now, as long as I've been following the ‘community’. Meanwhile, in my real life, my parents, wife, kids, and friends are all happier than they have ever been with their latest MacBooks and macOS.
Things like iCloud photo syncing, universal control, and airdrop feel boring to us, but they make the computing experience feel pretty much like living in the future for them.
Your parents, kids et cetera aren't power users though. Macs used to be for professionals, but Apple started seriously chasing the consumer demographic with the original iMac.
> Macs used to be for professionals, but Apple started seriously chasing the consumer demographic with the original iMac.
The original iMac was released in 1998. What's happening today has nothing to do with 1998 other than it was the iMac that is credited with saving Apple.
Professionals certainly do use Macs in all kinds of industries [1].
You incorrectly assume that all power users wish to modify their operating system and hardware, and in doing so exclude professionals who prefer to use the tool as it was sold to them. There are many reasons why professionals with the capability to modify software and hardware behaviors might choose not to; conflating “power users” and “professionals” disregards that distinction, and by doing so ignores why Macs are successful among those professionals who are not, as you label it, “power users”.
I said nothing about modifying anything, if anybody is assuming things, it is you.
Case in point: my 70 year old aunt and 8 year old neighbour can change the RAM and storage on their computers, that doesn't make them power users.
An Academy Award-winning Hollywood director I know bought his Windows-based computer pre-made, doesn't modify anything on it, and just uses DaVinci Resolve, but requires the absolute pinnacle of performance. He's a power user.
You’re absolutely right: the term “power user” isn’t well-defined enough for us to use in this discussion without considering what it means. What definition are you using in your comment above? I don’t have any strong preference, and will accede to yours.
The typical one of having advanced skills and advanced needs. The thing that Macs used to cater to.
Software-wise, both Logic and Final Cut are now a bad joke, especially by modern standards. That's just one of dozens upon dozens of reasons why Apple isn't the go-to anymore.
As a kid I was definitely a power user of everything I cared about. Lego, reading, playing and inventing (physical) games etc. I was really busy and I couldn't understand all those kids that went to sleep after lunch. A waste of time.
I don't know how this maps to using a computer but I think I wouldn't have appreciated a GUI that slowed me down.
There was one point where it seemed like everybody in Hollywood, music, and the visual arts were using Macs. Logic and Final Cut used to be serious programs.
I've always been curious why. Was Final Cut the Excel of the arts? Perhaps it was integrated platform color control? I don't recall PPC being dramatically faster, or supporting much more memory.
Trying to compete with Adobe in this space seems pretty futile. Pro users are deeply entrenched in their workflows. The R&D costs of innovation are off the charts and the market is extremely tiny. The same problem occurred with Autodesk.
That opinion is very outdated. DaVinci Resolve is the #1 post-production tool in Hollywood, and gives real-time collaboration via LAN and WAN. It even works on Linux, Windows, and macOS. Linux is huge in Hollywood these days. Blackmagic cut ahead with their Cinema Camera range of hardware.
Cubase/Nuendo and Pro Tools likewise dominate as a combo for composers and sound designers.
The Intel transition really made it the platform for everyone, with all the cool intuitive apps like the Apple office suite, plus Office and Adobe availability, and the Unix shell and its underpinnings hidden away.
Now it seems that the power user stuff takes a much further backseat to the toy stuff.
"Better" is highly dubious. Less user-serviceable, forced to endure lead times at Apple Stores to procure parts. Death of macOS Server. Shall I continue?
Unix played a major role in the success of the Mac in two key markets. First, the Mac was the cool box for young developers, since it provided a Unix dev environment and thus such nice things as Python, Ruby, reasonable C++, Git, etc., when these often worked poorly on Windows.
Second, while engineering software moved from Unix to Windows in the early 2000s, scientific software often did not. It was, however, available on Macs. The result is that Macs took over university science departments.
Now both of these are irrelevant because of virtualization, WSL, and better Windows support. But it was once a really big deal. The Mac was a Unix environment that also had Word and PowerPoint.
Only when one equates "developers === UNIX", which is why so many still don't get Apple developer's culture, even with two decades of OS X.
C++ on the Mac was already a great development experience with Toolbox and Metrowerks PowerPlant.
Copland was going to be fully based on C++.
If you look close enough to NeXT's history and how Steve Jobs thought about UNIX beards, NeXT only supported UNIX as means to compete against Sun on the workstation market.
His reluctant appearance at USENIX is well documented.
Everything was supposed to be coded with Objective-C frameworks, down to the device drivers.
If anything those "developers" are now finding out that they should have been giving money to Linux OEMs instead.
While Microsoft discovered that that group doesn't really care about Linux per se, only about having POSIX toys around. And given that Linux compatibility nowadays matters more than POSIX (even other UNIXes have Linux compatibility layers), they decided to reboot SUA as WSL.
I’m sure mac was a wonderful dev environment for developing Mac GUI apps.
But was it a wonderful dev environment for developing high performance numerical code? Or web servers? Or robotics? Or...
Most developers don’t write GUI apps.
That matches my experience with recipe websites lately. 4 cores for the ads, 3 for the author’s life story, 6 cores for the autoplay video showing an unrelated dish being cooked, 2 for the partner grocery delivery links, 3 for performance analytics, and of course 1% of a core to display the actual recipe. I have actually seen iPad Safari crash on trying to read a recipe.
It does indeed make you wonder, considering the top composers are on Windows running Cubase/Nuendo, almost everybody in game dev is on Windows or Linux, and most of Hollywood's VFX teams are on Linux or Windows, integrating with DaVinci Resolve.
Most C-suite execs don't even seem to use a Mac in 2022. All seem to be about the iPad Pro, Surface Pro, or ThinkPad.
I also don't think a $2000 laptop is making anybody look "sharp" in 2022 when financing is readily available for anything Apple. What car you drive or what watch you have says more, but there's so many counterfeit watches out there that it's also fairly meaningless. Even cars, easily financed or leased these days.
Yet Apple released a $6000 Mac Pro (that, when fully specced, came out to ~$60,000, IIRC), a $5000 monitor that competed with monitors used almost exclusively by Hollywood production houses, and an optional $1000 stand for said monitor.
You're absolutely right. It just seems that Apple's market in Hollywood could begin to dwindle because of all the alternatives. I guess they can sell an Apple silicon Mac Pro on speed, but what about the ease of customization that Linux offers these companies? I think I've read that a lot of these production companies develop their own Linux distros, and probably software and extensions (for the likes of DaVinci Resolve). Is the raw speed advantage of their chips going to outweigh all the investment these companies have made in their previous solutions? I feel like a variation of this question floated around in the time after the announcement of the 2019 Mac Pro as well.
Anyways, I’m not really in disagreement with you, I think, just writing some thoughts out.
Seems like you're joking, but I know far too many people who have bought a garbage $300 Windows laptop, absolutely hated it, and then bought a $2000 MacBook Pro.
They always mention how much nicer the Mac is than their old Windows laptop but never mention that they bought a bargain-bin Windows laptop and a modern flagship Mac.
Even if I try to advise them to just get a Windows laptop in the $800 range or so, they pretty much always go with a Mac because it feels like the safe purchase.
That's a lot like people saying Android is slow and laggy when they bought the cheapest one they could find at Target and compared it to a thousand-dollar iPhone.
My current work laptop is twice the price of my personal M1 Air. There is no comparison; the Apple wins on all but one front: I can't install uBlock into Safari.
For a lot of people, having a macbook pro is a fashion statement. Sort of like owning a luxury car to be stuck in traffic most of the time.
After all, is the base model of the 13in MBP actually a 'pro' device? $1300 gets you a paltry 256GB of storage and 8GB of RAM - now they've "upgraded" from more than 2 ports, though.
If you can't replace the basics like battery, display, keyboard, RAM, CPU, SSD, it isn't "professional", IMO. ThinkPad really set the standard for that. Used to be able to do that on MacBooks.
Likewise, if you need to take it to an Apple Store and wait 1-2 weeks for parts to come in, again, it isn't fit for purpose versus repairing it in-house, or RMAing it fast.
Professionals are not only big businesses. Most Mac-Users I know are small creators, freelancers or independent. By which I mean writers, designers, artists and the like who don't need to connect with the infrastructure of their team/company, and can just deliver/release the finished product.
Nothing against you and your family, but I am disappointed when for example, the Netflix app shows 2 rows of 6 titles when probably 100 could fit on the 3024-by-1964 native resolution display. Heck even the top 10 list scrolls to the right and only shows 1 through 4.
Annoyingly, nearly every video streaming service uses this horizontal-scrolling skeuomorphism, presumably modeled on video tape rental shelves. Give me vertical scrolling, or give me death!
My opinion on this is that it's purposefully slower to browse through titles so you can't ever create a full mental map of all available titles. That way it feels like you could still find something new if you just tried harder.
> They also put the same content in different categories with sometimes different thumbnails.
And I thought I was going crazy! And seems like it does work, at least on me. From the frontpage, the Netflix catalog always feels like twice its actual size — and it only occurred to me now how they do it.
Totally agree! I've been using Macs since the '80s and every version of Windows since 3.1. I use the Linux desktop all day at work.
I've been reading these 'Apple has lost its mojo' opinion pieces for decades but I'm really not seeing it. Sure there are specific issues but compared to the competition it's still night and day.
This guy has no idea what he's talking about. One language, one GUI framework, and one API paradigm - unified across all contextual devices - is exactly what Apple should do.
SwiftUI is not it by any means, but to say that what SwiftUI tries to achieve is a bad aim is completely wrong.
The aim of UIKit on all Apple devices is 100% correct; it allows developers to go deep into the API forest to create mystical interactions for their customers, friends, and users, in every device context. The Mac Catalyst team, merging AppKit and UIKit, should have been the one to win the battle, just like in the days of the competing iPhone prototypes. The fact that Apple is letting both live and fostering the wrong team is its only black eye in this story.
So no, the Mac is flourishing, you’re just not following the way.
It's not the mystical interactions or feel of widgets, layout, and function that are the problem, it's bad UX paradigms, declining consistency between apps and across versions (a Mac hallmark), buggy software on an aggressive annual schedule, holes and rot in developer documentation, constant treadmill of Xcode and device updates, and general lack of attention to detail in Mac apps.
I've been a Mac user and dev for twenty years. Music.app is probably the most egregious example and gets a lot of flak for good reason. It's so bad, when I'm working all day on my Mac, I listen to music on my iPhone instead. It's utter trash, designed with the foremost goal to push Apple Music sales and subscriptions, with everything else propping that up.
There are still so many magical features — Safari Cloud Tabs, using your iPad as a second display with automatic keyboard & mouse switching, attention to Focus and Notification management, a very solid iCloud sync service for so many core data apps like Contacts, Notes, and Messages, and so much more.
But the whole platform just feels like it's rotting, and the generational gap alluded to in the post is spot on. It's not necessarily an age thing, but it feels like a split between those who "get it" and know what the Mac was and could be, and those who are making decisions and setting deadlines, largely driven by company-wide synergistic goals. It could be both. But right now, the latter is winning.
Messages sync has been giving me problems for months. I'm receiving SMS messages on my MBP that are not being sent to my phone for some reason. Then, when they show up in macOS, they're labeled as iMessages even though they're green bubbles from Android users. This is definitely a degradation of the Apple experience for me. I just don't get it.
Sure. I don't keep a running list of this (and I feel like I've seen some, but can't find them now) so it's a little off the cuff...
- Scrollbars not appearing until you attempt scrolling, so no indication that there is more content to scroll to. If you mouse-wheel scroll, then try to move over to click and drag, sometimes the scrollbar disappears before you can grab it. Yes, you can change this default, but... it's the default. Good examples: Dock & Menu Bar System Preferences sidebar, Spotlight results, Finder windows in list view.
- Having to hover over the more inner portion of a notification to get its close button to appear.
- Unintuitive Finder toolbar buttons for changing types of view, grouping, and 'More...'. Years on I still click the wrong ones routinely.
- Try searching your own music instead of Apple Music in the Music app. And then do it again. You are fighting the system to access your own content.
Here's an example of something that to date still is adhered to pretty well: having dialogs with OK/Cancel/possibly other interactions be ordered such that OK is to the right and the default, non-destructive action is the one that is both visually highlighted and responds to the Return key. And Cancel responds to the Esc key. If you use another system where you literally have to pause for even a split-second and figure out which button is tied to which key action and whether you are going to lose some work — or read the informative text to parse it out — it drives you batty. Have a browse over https://developer.apple.com/design/human-interface-guideline... to see the sorts of things I'm talking about.
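As an aside, SwiftUI nowadays encodes this convention directly. A minimal sketch (the view and strings are made up, assuming macOS 12-era APIs):

    import SwiftUI

    // The button marked .defaultAction is visually highlighted and answers
    // the Return key; the .cancel role answers Esc. Hypothetical example.
    struct SaveChangesDemo: View {
        @State private var showingAlert = true

        var body: some View {
            Text("Document")
                .alert("Save changes before closing?", isPresented: $showingAlert) {
                    Button("Save") { /* save and close */ }
                        .keyboardShortcut(.defaultAction) // Return, highlighted
                    Button("Cancel", role: .cancel) { }   // Esc
                }
        }
    }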
There are countless little interactions now that, if you've only used other computing systems, just feel like the things you need to do to accomplish tasks. Often they are "intuitive" to pro computer users because those users have gotten used to thinking how the computer wants them to think. But, if you've ever used a Mac thing done well, each use is a delight. Sometimes you can almost reflexively use a new interface and be delighted that the right thing just happened. If you then experience the sorts of degradations that have been happening over recent years, it's a double punch: one, a tedious annoyance and two, a reminder that it used to be — and can be — different.
I don't get it. If I want to see whether/how much I can scroll ... I try it. I appreciate only having to see the information when I actually need to see it instead of it distracting me otherwise.
> Having to hover over the more inner portion of a notification to get its close button to appear
This reminds me of a much worse thing regarding notifications. When you install an app, you often get a notification telling you something like "This app will send you notifications". For some reason these notifications don't go away automatically, so okay, you swipe them away manually. But for some reason they keep re-appearing.
After over a year of using macOS I found out a hidden button saying "allow".
A hidden button. Asking you whether you agree to something that wasn't a question.
Part of the scroll bar thing is an admittedly subtle old interaction cue, which is that the scroll bars are sized proportionally to how many screens of data there are. So in just glancing at a list of ten files, you can see that there are approximately 20-25 files total, with no interactions.
Sure. I may have worded it a bit ambiguously, but I mean that too.
I don't want to be confronted with information I don't need at a moment, so I prefer this information that I need maybe 1% of the time to be behind an explicit interaction.
I don't think he was arguing against unification; he was just arguing that in practice, the unification Apple settled upon was one that made macOS worse and thus represents their continual disinvestment in the platform (because, as you note, "SwiftUI is not it by any means"). This comes after years of neglect for the Mac platform; OP says Apple was "distracted" by iOS, but IMO that is unfair; instead I would say they were focused on iOS, probably correctly, but with a downside cost for Mac users. Macs are no longer the primary breadwinners for Apple, so they get a smaller percentage of focus and resourcing.
It sure seems like they could've done a better job at the unification attempts though, at least from the videos of the new Settings app floating around the internet. How can you break toggles that badly? It seems like it should be very hard to make a toggle button that occupies the regular amount of screen real estate, but has a click target that is 1px. That shouldn't happen by accident!
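For illustration only, here's a hypothetical SwiftUI sketch (emphatically not Apple's actual Settings code) of how that class of bug can arise: if a hand-rolled control attaches its tap gesture to a tiny overlay instead of the whole row, it renders at full size but only a sliver is clickable.

    import SwiftUI

    // Hypothetical sketch: a hand-rolled toggle whose gesture lands on a
    // 1-point-wide overlay, so the visible control is mostly unclickable.
    struct SliverToggle: View {
        @State private var isOn = false

        var body: some View {
            HStack {
                Text("Enable the thing")
                Spacer()
                Capsule()
                    .fill(isOn ? Color.green : Color.gray)
                    .frame(width: 40, height: 22)
            }
            .overlay(alignment: .trailing) {
                Color.clear
                    .frame(width: 1)               // the bug: a 1pt hit target
                    .contentShape(Rectangle())
                    .onTapGesture { isOn.toggle() }
            }
            // The fix: put the gesture and .contentShape(Rectangle()) on the
            // whole row, or just use the built-in Toggle.
        }
    }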
Oh yes, that is so badly done on iOS! I thought I was the only person bothered by copy and paste on the iPhone.
Even Apple’s own Messages app won’t let you copy single words. I end up having to paste the entire message into the Notes app (having fought its UI to get a blank page first).
How about text entry on iOS in general? It seems almost designed to cause typos and autocorrect mistakes. Whole websites are devoted to laughing at users who make them.
I hardly think Jobs would have thought that was a cool glass of water but who really knows.
No, it's absolutely unacceptable to merge touchscreens and keyboards/mice. These should be separate frameworks where you couldn't reuse any GUI code whatsoever, not a single line. You must design everything from scratch for phones and for computers because of how different the interaction paradigms are.
Putting myself in Apple's shoes: I just don't see the financial incentive to invest a whole lot in the native GUI story for the Mac in 2022
Regular people mostly use iOS. Many don't have traditional computers at all. That was originally the Mac's core market.
It still has a strong professional market - but many of the serious professional tools (Adobe, Blender, IDEA, etc.) use their own custom in-house GUI frameworks, or maybe something like Qt
And then most of the rest of the space is covered by web apps and/or Electron
Personally I can think of... three third-party native Mac GUIs that I use on my work computer, and two are background apps whose only GUI is a settings window you can open
The Mac is just... not really a GUI platform the way it used to be. Desktop OSes in general aren't. It's a bummer, but it feels like the new reality
People just are not building, crafting, and using desktop apps specifically for the Mac anymore.
That ship has sailed. They either use their own frameworks and build for all 3 platforms, or use Electron. For a lot of daily things, people use iOS.
I'm thinking about which apps I actually use on my Mac the most.
It’s Chrome, VSCode, Terminal.
I still use Preview a lot though… I hope the Swift pushers won’t touch that. If they screw up Preview, I don’t know what I’ll do. Preview is amazing honestly.
I had a contract a while back that meant I had to work on desktop Linux for the first time in years.
The experience on my XPS13 is not as good as macOS[1]. It’s not far off though …and FOSS is always improving.
They have the edge right now but if Apple keeps neglecting macOS, degrade it to be more like iOS[2] or keep adding rent seeking nagware, Linux will be “good enough” for me.
1. Seriously, why can't I have subtitles off by default as a preference in GNOME Videos again?
2. Why is there an unused dictation toggle on my iOS keyboard but no undo key? Why can’t I copy and paste in every iOS app? Why is the spell checker so hard to trigger? etc, etc
Can’t answer the rest of it, but copy and paste works anywhere there’s a standard text field unless the developer has gone to the trouble of disabling it. There are ways to make it work in non-standard text fields as well. You should send bug reports to the developers of the apps where it’s not working, same as you would if it didn’t work in macOS apps.
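For what it's worth, "gone to the trouble of disabling it" usually looks something like this; a minimal UIKit sketch with a made-up class name:

    import UIKit

    // A text field that refuses the standard edit-menu actions, so Copy
    // and Paste never appear. Hypothetical example.
    class NoClipboardTextField: UITextField {
        override func canPerformAction(_ action: Selector,
                                       withSender sender: Any?) -> Bool {
            if action == #selector(UIResponderStandardEditActions.copy(_:)) ||
               action == #selector(UIResponderStandardEditActions.paste(_:)) {
                return false
            }
            return super.canPerformAction(action, withSender: sender)
        }
    }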
Preview is amazing. And now I'm worried the Swift pushers will screw it up. More likely they'll release a new "better" Preview that's half as fast, buggy, and missing most features.
Yeah... I could totally forgive them for sleeping on the third-party GUI frameworks, but I can't forgive them for neglecting or botching first-party apps/core parts of the OS. And those don't even make business sense to de-prioritize. I don't really know what's happening there but it's not good; the Apple Music app for Mac is one of the buggiest, slowest Mac apps I've ever used. It's distressing that this is the treatment their flagship services are getting.
It's small compared to iOS, sure, but I think it's still an important part of Apple's overall portfolio. It's currently what all of the iOS developers need to use. It's what Apple's own people use for their work.
I guess the difference of opinion may turn on the phrase "invest a whole lot". I have no idea what kind of investment Apple is making in Mac native GUI frameworks. It's obviously non-zero and is likely what most would deem non-trivial, but I'd argue that it's worth a decent size investment because it is still a potential differentiator for a $40B part of their business.
I will also note that there are some excellent professional native Mac apps, even beyond Apple's own. Things like Pixelmator and the Affinity suite, for example. I'm definitely running a combination of native Mac and Electron apps right now, but I still appreciate the native apps when they're available for the job at hand.
This. Ableton still works great on Mac, so do all the plugins I use. Logic Pro is still excellent, and I assume final cut is too. I often find these rants a bit surreal.
There is really no reason to keep adding stuff to OS X/macOS. It's done. It does everything it needs to do. It has been done for something like a decade. It should be in maintenance mode, with bugs being fixed as they show up and nothing really changing.
And yet every year there's a new version. Mostly it's a bunch of new features on the bundled programs that I don't ever touch. And a new skin on the GUI. Sometimes it breaks something that was working just fine; I personally quit staying up to date when I balked at a new OS whose main feature was "broke all 32-bit apps forever". I finally upgraded to the latest version in the past few months so I can deal with getting stuff working on the M2 Air I'll be getting soon. Preparing for that has made me replace two perfectly-functional drawing tablets because Wacom's newest drivers won't support them and the ones that do predate the M1.
Honestly I doubt I'm gonna be upgrading that M1 from whatever OS it comes with until the latest version of Illustrator stops working on it, or Apple stops doing bugfix releases. Or if I have to replace it because it gets broken. I just do not need this endless cycle of change for its own sake, done by people whose idea of what a computer "should" be is increasingly shaped by growing up with an iPhone in their hand instead of an 8-bit computer on their desk.
> I doubt I'm gonna be upgrading that M1 from whatever OS it comes with until the latest version of Illustrator stops working on it, or Apple stops doing bugfix releases
Unfortunately security updates are more important than ever and patching old OS versions is not a high priority for any OS vendor. Even Ubuntu LTS only gets critical updates for something like 5 years.
The tragic reality is too many thousands of people's jobs at Apple depend on delivering new features for it ever to go into maintenance mode.
Yeah, I completely agree -- macOS is finished and should be in long-term maintenance mode for the next 25 years. We should turn our attention to other problems.
I got my dad a Mac about 10 years ago and never got into the experience. I mean, I got it for him, used it for a couple of days to show him where everything was, delivered it, and then came back to my trusted ThinkPad with i3.
About a year and a half ago, I got a MacBook from my employer as a replacement for a T480, which was awful because of thermal throttling. After a couple of months, I got the M1 MacBook Air as my personal laptop. I guess the experience with the software I commonly use got better, but I feel like the biggest reason is that the experience with other laptops just became worse.
Yeah… I mean I love the “don’t have to think about anything” experience of all Apple products. That’s the reason why I use their stuff.
But at least in the last couple of years before the M1, the number of things that were just plain annoying… it really made me look for an alternative. And really… there isn’t one. Windows is so annoying to use, and even the Linux fanatics I work with manage to run into trouble like once a month. And yeah, it’s fixable trouble. But I really don’t want to deal with that shit. I want a useful trackpad, a usable browser and a usable mail program. I want my laptop to reliably suspend and wake up. I want, if there has to be a kernel crash, to get all my windows back in the state I left them. That really doesn’t sound like too much. And yet it is.
And the worst part: Apple knows it. There is now quite a number of subpar experiences (looking at you Apple Music App) that they can just get away with.
This is my perspective as well. Apple products are basically out of the box usable for me (short of a couple of helper apps on MacOS) and frankly just do not crash or cause me any downtime.
But…I am admittedly well past my “OS tinkering” days. I finally realized how much unproductive time I spent trying to make whatever OS I was using “productive” and visually “correct” and realized it wasn’t net positive. So I just decided the default options were good enough, short of my couple of must-have helper apps (Rectangle and Hiddenbar).
I just started developing in Swift/SwiftUI with a few app ideas in mind as a 37 year old with quite a bit of XP developing web services and *nix based utilities.
I _really_ like the overall experience of Swift and the Mac ecosystem a lot. Maybe there is a lot of truth in the headline, but other than Xcode not being as excellent as PyCharm or VSCode, I'm having a lot of fun and getting some users along the way.
> other than Xcode not being as excellent as PyCharm or VSCode
I can't help pointing out the irony that PyCharm uses custom (non-native) rendering and VSCode is an Electron app, while Xcode is the only native Mac app of the bunch
Obviously there's a lot more that goes into a piece of software, but the irony stands.
I don't have a ton of Xcode experience, but from what I've touched, while it does feel good to use in terms of a "native" experience, as an IDE it feels lacking. Even basic things like the text editing experience feels limited compared to other IDEs (e.g. why is there still no auto-formatting built in? why can't I duplicate a line without having to use pseudo-Emacs movements?)
Interesting generalization about a lot of mature tech businesses, borrowed from TFA (edits mine):
What the future of <product> looks like as <company>'s employee base ages out and that care & attention is lost forever? Fresh new hires, who’ve never known a reason to care about the <product>, trying to rewrite key portions with unforgiving yearly timelines and shaky foundations...
I'm facing a bitter conflict with Mac App Review. I feel so poorly about this that I can't sleep, and my mind is in turmoil. After 37 years writing Mac software, I'm wondering if there is anything left of the Mac and of the Apple spirit that made such a difference in my life. I hate to give up my current work (5 years and 13,000 lines of code), but in a sense it would be liberating to be free of Apple.
I've had very similar feelings with a different ecosystem. The reality is that you don't have to be a Mac developer. You can jump ship, try something else and even come back later if it didn't work out. I tied my identity as a developer to a technology. When the technology got worse, it made me feel worse.
Trying something else felt like a break-up, and I haven't attained the same reputation I had in the ecosystem I'd spent 15 years in; but this was 6 years ago, and I'm glad I moved on and am not looking back. It _will_ be liberating; it's just a soulless corporation at this point.
Reach out if you want to chat. This really resonated with me!
I have been a Mac and iOS developer for 7 years now and am moving away. I feel like Apple treats developers really poorly, from the App Review process to forced updates of your hardware/OS/Xcode.
cc101 I was in that exact situation a few years ago! I feel you! I'm still writing Mac software, but now spending more and more time learning about different technologies with an eye on getting out one day.
The Mac experience peaked some years ago in many ways. Now that the hardware is back on track we can at least enjoy that plus some good MacOS features, and ignore or tweak the not-so-great stuff.
They have certainly lost a lot of quality in the OS and 1st party apps, but the overall experience is still more positive than negative imo. My mbp14 is the most (or maybe second most) enjoyable of all my 8-9 machines since the G4 days.
And that's coming from a software conservative :) I hate when an update changes stuff without obvious improvement.
I feel like I may be the only developer that prefers Windows to macOS. I really, really just do not get the "Mac Experience". It feels so clunky and counterintuitive.
- Why does the alt tab equivalent cycle through applications first, and I need another shortcut to cycle through the windows of that application? When I have an IDE with 5x instances, hosting different projects, it's a pain switching back to the project I want from another application
- I attach a non Mac external monitor to a new MacBook Pro from work and the text is super blurry. Much googling later and I need to use something called Better Display to create a virtual display, and project that to my monitor. It's so confusing and sometimes breaks when I wake up from sleep, and I need to configure it again.
- Why are there 2 modes of "full-size" windows? Regular style "fill the window", then another weird one that sort of takes over the whole screen
- Why does finder sometimes open in a sub folder as the "top level", then it's a total pain to move up to where you want.
- Why are 3rd party applications distributed as Disk Images? Then some as package installers? It's weird and nonsensical.
Mac hardware feels great, but I just cannot understand why it's the preferred developer platform.
> I attach a non Mac external monitor to a new MacBook Pro from work and the text is super blurry. Much googling later and I need to use something called Better Display to create a virtual display, and project that to my monitor
That’s not the correct solution. The correct solution is for the monitor manufacturer to stop reporting an EDID that identifies their monitor as a TV (and thereby forces the OS to default to YUV rather than SRGB.)
macOS here is just following the EDID spec to the letter. (Linux does too, as it happens.) Windows is the one “doing it wrong” (i.e. not to spec), by falling back to SRGB whenever it’s unsure, rather than falling back to YUV. This results in a lot of TVs being unhappy (no signal) when plugged into Windows PCs, but working fine under macOS / Linux.
The personal solution is to not buy monitors from manufacturers that fail to put EDID chips in their HDMI signal path. The set of manufacturers that do this is well-documented.
I walked into an Apple Store and asked, “What monitor/TV should I buy such that my AppleTV remote can control the volume and be the only remote I need?” The geniuses shrugged. I asked, “What brand are these displays in the store that are connected to the demonstration AppleTVs?” They shrugged. We turned them around so I could see the backs of the monitors and I wrote down the model numbers.
There's probably a list somewhere on the internet but I've never seen one. When I went monitor shopping six months ago, I used the Macrumors forum heavily. There are long threads on the forum for monitors which didn't work well because of edid or other factors. That's how I ended up with an Asus ProArt 4k instead of the Dell I initially wanted. Everyone complained about how MacOS handled the Dell but no one had a bad word for the Asus.
Though it was a little frustrating to me that I had to monitor shop in the first place. I bought an MBA to start some iOS projects. I wanted to use my still-great 1080p IPS Dell screen that I had used for years with Windows and Linux. But MacOS looked absolutely horrid on it. I know MacOS prefers higher-resolution screens, but I was surprised it couldn't manage to handle a ridiculously common resolution.
All the Dell monitors I've ever tried have worked just fine. Including the lower resolution models. On those, you don't get the Retina crisp display, but it still works.
Of course, I avoid HDMI like the plague. I only use DisplayPort unless I have no other option, and then I don't hold out much hope that the display maker did anything right.
This seems like bad advice. A little bit of Googling for solutions to this found this thread[1] which identified (old) Dell monitors as being one of the problems.
That's basically the current state of computers. Some vendors care more than others. I'm sure there are ASUS monitors out there that don't work correctly with ASUS laptops, but if you buy an Apple monitor it probably works with an Apple laptop from the same vintage. That's just how things are. The market seems to prioritize "new" over "good". Doesn't mean you can't do some research to get "new" and "good" though.
All usability comes down to what you're used to. I can't use Mac OS because I don't use Mac OS. Sit me down in front of a Mac and it is practically useless to me, nothing works right, same thing with iPhones. They don't "just work" for me.
This is not to say I couldn't learn, if forced to. That's what happened when I started using windows, Linux, and Android after all.
For general usability you need agreement on one way of doing something. Cars are this, you can drive pretty much any car because of the standard way the interface is put together.
My main issue with people responding to any criticism of Apple is that Apple fans (especially hardcore) can't accept that the Mac OS/iPhone interface isn't perfect and doesn't work for everyone.
Mostly agree. IME it really depends on the person. And Microsoft has valued backward compatibility and pragmatic solutions almost to a fault at times. I'd say that's been changing since Windows 8. Still, as a DOS-then-Windows native I appreciate their approach.
Jobs's insistence on a one-button mouse for decades, then a one-button phone, bordered on the absurd as devices grew in capability. Yet it did serve a portion of the market well enough. Even if it kept advanced features virtually impossible to discover without being taught. Doubtful my boomer parents will ever learn swipe gestures, yet my 4yo is doing them now.
No—Windows takes a strictly worse approach. macOS [and Linux] give you “some [not to spec] monitors have very slightly blurry text”; but Windows gives you “some TVs just plain don’t work at all.”
Note how I didn’t say that the TVs are not-to-spec; that’s because the EDID spec allows TVs to just not report an EDID (TVs are low-margin, and that’s one fewer chip on the signal path), where the video source then must assume some standard default modes (e.g. 1080p@60 YUV 4:2:2.) But monitors, meanwhile, must report an EDID. Why?
Because all this only applies over HDMI; and HDMI is a protocol for talking to TVs first-and-foremost, which is why the EDID spec assumes YUV as the fallback.
Monitors generally have DisplayPort inputs; and if you plug Windows or macOS into a monitor over DisplayPort, it’ll always go SRGB, because YUV isn’t a video mode the DisplayPort protocol can even carry.
So when a monitor has an HDMI input, it’s essentially doing “TV compatibility” over that input—and it needs to report its capabilities with an EDID, lest the video source think it’s a TV and send base TV spec signals to it.
Which all makes the situation even more ridiculous: there are TVs that Windows can’t talk to over HDMI, despite the HDMI connection implying, and the EDID spec explicitly stating, that Windows should make the fallback/safety assumption that it’s talking to a TV.
Also keep in mind that this is a prisoner’s dilemma: the only reason that monitor manufacturers are able to “get away with” being nonconformant to the EDID spec is because of Windows’ leniency. Windows is the “defector” here. If Windows stopped allowing it, then nobody would be allowing it, and so the monitor manufacturers would have to shape up.
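If you're curious what your own display claims to be, the EDID block is easy to dump and inspect yourself. A minimal sketch on Linux, assuming the edid-decode utility is installed and that the connector shows up under the usual sysfs path (connector names vary by GPU and port):

    # list the connectors the GPU exposes (names vary by driver)
    ls /sys/class/drm/
    # decode what the display on a given connector reports; a TV-style
    # EDID will advertise CEA/YUV video modes in its extension block
    edid-decode /sys/class/drm/card0-HDMI-A-1/edid

If a monitor reports nothing at all there, you've likely found one of the manufacturers the parent comment is talking about.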
> That’s not the correct solution. The correct solution is for the monitor manufacturer to stop reporting an EDID that identifies their monitor as a TV (and thereby forces the OS to default to YUV rather than SRGB.)
Is this an EDID issue? OP didn't specify what the "non Mac monitor" is, but I'd expect it to be a lower DPI one. I get the blurry text on an Apple Thunderbolt Display, too.
I think this is related to the removal of sub-pixel antialiasing in Mac OS a few versions back.
I’m using a Windows machine for work and I hate it. Explorer and Outlook change their application icons depending on what you are doing. I can never find them when I alt-tab.
I can never find anything in word or excel or any other application that use windows style menus. On Mac you at least get a real menu.
I never got used to macOS’ alt-tab when I switched more than a decade ago, but now that I’m back on windows I find that their behavior works even worse. So maybe that’s a “don’t ever try to learn another alt-tab behavior than what you already do now”
> Why are 3rd party applications distributed as Disk Images? Then some as package installers? It's weird and nonsensical.
As if the installation experience on Windows is consistent. Some things come as a zipped exe, some as an msi, others as a zip file that extracts to an exe which runs an install program.
> Mac hardware feels great, but I just cannot understand why it's the preferred developer platform.
Isn’t that because it comes with a Unix shell? I’m using wsl and only like 1 or 2 gui programs can talk with my Linux files in a reasonable fashion.
> I can never find anything in word or excel or any other application that use windows style menus. On Mac you at least get a real menu.
I’ll never understand why it’s so common for Windows and Linux GTK apps to go so far out of their way to avoid using a tried and true menubar. Hamburger menus are probably the most common substitute, and they’re strictly worse in every way, being basically junk drawers in software form.
That’s one of the things I love about the global menubar on macOS: because it’s owned by the system, third party devs can’t try to delete it out of some misguided pursuit of minimalism. Several cross platform apps that lack menubars on Windows and Linux have proper menus on macOS as a result.
And... it's always in the same place. And... all the standard common things are always in the exact same place with the exact same shortcuts.
I've been using Linux (KDE Plasma) as my primary system for 5 years now, and it's practically impossible to build up a system-wide muscle memory. On Linux and Windows menus are all over the place. Things in the menus are all over the place. Shortcuts to them are all over the place. It's crazy-making.
However, I've resisted going back to a Mac because it's getting perpetually worse year after year too. The "golden era" to me that I wish had been refined and iterated incrementally on is probably somewhere around Mac OS 10.6
I personally prefer Windows style menu bars where the menu is at the top of each window, instead of at the top of the screen. I don't want to constantly be dragging my mouse all the way to the top and then back to my window.
It is kind of funny that we either use macOS or Windows/WSL because our target deployment environment is Linux. We can't use Linux because something eventually goes wrong and people with deadlines don't have time to fiddle with their dev machine to get it working again.
> I never got used to macOS’ alt-tab when I switched more than a decade ago, but now that I’m back on windows I find that their behavior works even worse. So maybe that’s a “don’t ever try to learn another alt-tab behavior than what you already do now”
I think the one big benefit I notice of OSX-style program management vs Windows-style is: with OSX, you can have a program open, but without any windows. Whereas with Windows, if you close all of a program's windows, that typically kills the program.
Really, this mostly shows up when I want to create a new incognito browser window when I don't have any browser windows currently open. On OSX, I just tab to the browser and open up a new window. On Windows, I have to start a new regular window and then open a new incognito window. It's pretty annoying.
> Isn’t that because it comes with a Unix shell? I’m using wsl and only like 1 or 2 gui programs can talk with my Linux files in a reasonable fashion.
Can you elaborate? I've been using WSL1 as a tool to operate on files on my Windows file system and haven't had any issues. (As an aside, I'm actually very pleased with this approach.) Are your issues specifically with GUI programs running through WSL? I'm just checking to make sure I don't get locked into a workflow that causes me problems down the road.
Don't worry I'm European and 90% of developers I know think exactly like you (and me). It's just that the HN crowd is typically North American, and people tend to like what they have grown with.
I'm also European, and I prefer Linux over Windows. I was using Windows and Linux in parallel for some time, but got really upset when a Microsoft update changed system settings, which was the final nail in the Microsoft coffin for me. At my last two jobs I have been using Linux as my main driver and sometimes Windows via Citrix. Now at my new job I will get a Macbook, which shouldn't be a huge difference from Linux since I already try to install most tools I need for work via Linux-Brew.
That all being said, for me a non Windows desktop at work has become a requirement. If an employer would force me to use Windows I would most probably ask for much more money or more likely leave for another company.
In my experience the Windows-Linux split is around 50-50, but I've only seen exactly two people with Apple devices in my entire life here. Western Europe.
I live in Europe and 95% of the engineers I've worked with here in major companies use a Mac. A few use Linux, absolutely no one I know (300+ developers I've interacted/paired with in the past 7 years) had a Windows machine. Not a single one.
Sony Mobile is definitely a "modern tech company" and they are Ubuntu first. At the same time a company like Volvo or ABB is these days probably 60% software development and at most 40% mechanical engineering.
Industry/hard engineering companies here do tend to go Linux on their SWE machines, I've interacted with multiple SWEs from Volvo and Ericsson on some of my employers' projects (again, including pairing sessions) and none of them were Windows users. Mostly Linux and Mac.
Software companies do skew extremely into Mac territory, I really don't recall anyone ever using a Windows machine and very few run their ThinkPads with Linux, the vast majority have a Mac as a daily driver.
Outside iOS projects, and our designers, I have seldom seen people use Macs.
In fact our Java folks get MacBooks as an option, and eventually many of them end up migrating to ThinkPads with Red Hat/Windows when they start dealing with complex workloads.
On the .NET teams everyone is on Windows ThinkPads.
All the European developers I worked with in Berlin used Macs. I can understand using Linux, but using Windows as a web developer just doesn't make sense because all of the tutorials and stack overflow discussions and forum discussions will assume you're using a Unix shell.
This is such an odd extrapolation. You can see tons of Macs or Linux in use in Europe, at conventions, in lots of dev communities, in design studios.
I suspect you’re in a self selected bubble. Which you sort of allude to by saying “developers you know”, but I don’t see how that extrapolates to the continent as a whole.
I'm not one to defend the "mac experience" but some of the stuff you mention I actually really like, namely the separate key-combo for switching between app windows and the two modes of full screen (although I never use full-full screen anymore).
Finder is definitely not great. It's improved over the years which is kind of sad as it's not a whole lot better. You still can't cmd+x a file to move it to another location. I'm not sure when it happened (probably a while ago) but you can actually rename files in open dialogs now! I didn't notice when it happened because I gave up trying many years ago.
Anyway, I like macos as it gives me a linux-esque command-line experience out of the box without having to dive into managing a Linux install (although that's probably not as big a deal as I think it is), because the hardware is nice, and otherwise because it's just what I'm used to ¯\_(ツ)_/¯
For non-developer reasons, Logic, even though not the best, is so insanely cheap for what you get, and as a light gamer, Minecraft runs decently well on my Mac Mini.
You can cut and paste files in Finder, it’s just a decision that’s made on the “paste” side of the process instead of the “cut” side.
Copy as usual with Command-C, then at your destination move the file with Command-Option-V. This allows you to change your mind about whether you want the file to be duplicated or moved without having to first navigate back to the source and perform an additional action.
Oh cool! I never knew that. How long has that been a thing?
In a way I actually like this better, as in Windows I used to cut and every so often I'd blow out my clipboard somehow (at least, I have memories of that... it was like 20 years ago).
Based on an archive.org snapshot[0] of an Apple support page[1], the move shortcut has been in place since at least 2015.
And yeah, the Mac way makes more sense to me too. It's a lot less likely for Command-Option-V to be fat-fingered than both Ctrl-X and Ctrl-V.
Shortcuts like this are all over the place in macOS, and have been for decades. They're usually discoverable by glancing through menus while holding down various modifier keys.
Yeah, I know a lot of the shortcuts, though 2015 is literally 20 years after I started yearning for some kind of cut-and-paste in the Finder. I'd given up hope long before then and started living on the command line more and more.
FYI for anyone who doesn't know: that other shortcut is Command+` to cycle between windows of one application.
It's totally nondiscoverable (I read about it here), but once I learned that, it felt amazingly intuitive to separate the concerns of application-switching and window-switching. And Command+` works consistently on MacOS, unlike on Windows where every app reimplements its own multiple-document interface and so Ctrl-Tab hardly ever works.
Separate management of applications and windows is a powerful and underrated tool in my opinion. Once you become accustomed to it, the Windows style of a big vat of random windows feels like a chaotic mess in comparison.
I've been using a Mac now for about a year, and I still don't know if I can move windows around in 'Mission Control' or whatever it is where I see all my open windows.
Like, when I swipe three fingers up and see all the open windows, my current window flies to a random part of the screen and all the other windows appear. I can't move these windows, much less get them to stay in an area to find easily later. Is there something I'm missing? I am learning to like other parts of the interface, but this just baffles me, especially when I have 20 browser windows open.
Or is there an app that helps with this? I find there are tons of app and tweaks that, as a new Mac person, I have no clue about. I don't even know what I don't know!
Yeah, agree here. It makes sense to have a shortcut to only cycle over the windows of the same app. In the example where you have 5 windows in an editor... how nice that you can cycle over only those instead of also the other 20 windows you have open.
If you're looking for another not-very-discoverable shortcut that's quite useful... press Command-Tab until you get to the application you want, then without releasing it press up (or down). You'll get the exposé view for the selected application's windows. While that's open you can arrow between them to select the one you want or type to select by window name.
I'm with you, I'm a dev who prefers Windows over Mac. I'm convinced that most Mac users don't use multiple monitors and they're completely fine with the really poor window management. It boggles my mind; I am objectively less productive with a single monitor than I am with multiple.
Which is not a stock macOS app. Which is what the point was. Stock macOS sucks with window and monitor management. I run three screens and it's aggravating sometimes.
It seems like if you discount third-party applications, you should also discount the addition of peripherals like additional monitors, as they're not the stock experience either.
This seems to come up with every argument about operating systems. People say it sucks if you have to use third-party applications, but I vehemently disagree. The fact that you CAN use third-party applications, the fact that they have sufficient access to make the experience better, the fact that the community exists to make them... I think that's a major pro in favor of an OS.
> Which is not a stock macOS app. Which is what the point was.
This seems like a strange standard to me. I can’t think of an OS across any and all I have had to use over a multi-decade career that didn’t benefit from some third-party app to improve some “stock” feature of the OS.
If you go back to the earliest days of the Mac there have always been third-party apps to improve the functionality of your system. It's what's been expected of users. There used to be a time when third-party Windows-style Start Menus and bars were all the rage. I know it's a cultural, not technical, difference, but unless your work restricts what you can install you will be much happier going with the flow. Many window-snapping utilities exist, and many others help with multiple-monitor setups.
With the great attention to detail of macOS third-party developers, you can be sure those apps will give you a seamless experience.
And Windows file explorer copying was inconsistent and buggy as hell for nearly a decade, but installing TeraCopy fixed that issue. As long as an OS is plug-in friendly, stock issues that can be easily remedied don't bother me.
If you're using multiple monitors and you're not using the rectangle app... that's kinda on you. I switched to Mac after using windows for decades and in less than a week I solved my windows management issues with this extension.
> If you're using multiple monitors and you're not using the rectangle app... that's kinda on you.
This attitude strikes me as snobbery. People aren't born knowing the best 3rd party alternative to their OS's shortcomings, and some have no control over what apps they can run.
Windows 10 has pretty good built in window management. IMO better than MacOS. I say this as a Mac and Rectangle user. (Though only Mac because of Safari exclusively working on it. I prefer Linux.)
It's also the attitude that's endemic in a lot of Mac forums, so good luck finding out answers when everyone rolls their eyes because you should already know.
> I'm convinced that most Mac users don't use multiple monitors and they're completely fine with the really poor window management. It boggles my mind; I am objectively less productive with a single monitor than I am with multiple.
I'm a long-time Mac user with a 25" 21:9 ultra-wide display that's the size of two regular office displays. A single ultrawide display is way superior to a couple of displays, both performance-wise for your workstation and config-wise.
Nothing is blurry to me. To manage windows with keyboard or mouse I use moom[0].
I've worked on Windows for 2 years at a large corp, and the blurriness of fonts I experienced with mid-range 27" monitors was unprecedented. Having invested considerable time to make the problem go away, I gave up.
Even modern Ubuntu does a better job at font rendering with displays made by the same manufacturer.
Longtime user of a 2-3 display setup here, and I would strongly disagree. If anything Windows’ support of multiple displays is bad… last I knew it couldn’t even set per-display or per-virtual-desktop wallpapers, and its virtual desktop support (which is essential for a multi monitor setup in my eyes) is nowhere near as mature as that of macOS or practically any Linux DE.
The key is that with macOS, you don’t really manage windows. Just let windows be where they will, like papers on a desk, and then manage desktops. While I’m working I’m constantly flipping between desktops on both my primary and secondary displays, with the primary display being set up with desktops of primary windows and the secondary display being set up with desktops of secondary and tertiary windows. The ability to mix and match sets between displays is powerful and way more natural to me than cobbling together some kind of overcomplicated window snapping setup or something like that.
> I am objectively less productive with a single monitor than I am with multiple.
I've tried multi-monitor setups before with both Windows and OS X, and I just don't like them. I moved to a 43" 4K TV, and I like it much better, especially with a window manager like Rectangle. If I need extra screen real estate, OS X has pretty good support for virtual desktops, but I haven't felt the need yet.
One of these days, I might feel cramped by my setup, but I suspect that I'd rather go up to 50" than get multiple monitors. Especially if I could go 8K by then.
I've found Mac window management to be completely un-usable... Until I install Magnet from the app store for $3. Then sanity prevails. And with much weeping and gnashing of teeth, I've managed to get the right combination of dongles to get my 32" and 49" monitors to work at the same time. It was worth the pain to be on a Unix machine, at the end of the day.
Windows window management doesn't really scale well beyond 3 monitors either. I use AHK with great results, and I imagine Mac has a similar non-native program. That being said, I've never even tried to migrate to Mac or Linux because I'm most productive with 8 monitors and the support for that number sounds abysmal on anything but Windows.
“Why are 3rd party applications distributed as Disk Images? Then some as package installers? It's weird and nonsensical.”
How many different bespoke ways of installing windows applications are there again? Is it even countable? Is installshield beloved by anyone untainted by just dealing with it for decades?
I'm with you here. I've been using a Mac at my new job since November. I fucking hate it. I have burned so much time on stuff not working, and it's not just me - others seem to just accept and forget it. One great example of this is the recent bout of zero days: forty five minutes for a minor release update? How on earth does anyone think that's acceptable?
I'm no longer a fan of Windows, having moved along to Linux on my personal machine, but both Windows and Linux are in a completely different universe to MacOS when it comes to bare minimum competency.
Can you outline some other problems you have lost time to besides a software update? I agree that they’re pretty long too, but if that’s your only complaint it’s a bit moot since every OS provides a mechanism for installing updates during off hours.
Notwithstanding that you don't leave a zero day unpatched til after hours, some other time sinks:
* M1 compatibility issues.
* Problems surrounding containers. Yes, I have used Rancher Desktop, yes I have used Colima. Neither is a first-class citizen. (As-per usual with Mac dev tools).
* Dual monitor.
* External keyboards, keyboard shortcuts, and customizations.
- lack of degradation of performance due to bloated registry, fragmented disks (probably out of date but still hurts my soul to think about)
- a control panel that doesn’t suck
- several things due to its unix heritage which are far better then windows imo (case sensitive file system, bash/zsh terminal)
- lack of concern regarding bloatware installed by vendors
- don’t have to install crap like .NET C++ extensions and other nonsense just to run something
- a more stable, less crashy experience, this is anecdotal but true to me
- reliable warranty and support policies, Apple absolutely kicks ass here
I just want to get shit done with minimal auxiliary stuff involved. And that’s why I use iOS and macOS. Every aspect of it was designed thoughtfully, and not slapped together. I stopped liking Windows at version 8 and have never looked back.
I prefer macOS over Windows, but I think a lot of your points are... outdated, or just outright myths to begin with.
I can't find any (non-anecdotal) evidence that a "bloated" registry impacts performance at all on a modern system. If you have benchmarks or something, I would love to see them! (not being a jerk, I really would). If you think macOS applications cleanly uninstall when you drag them to the trash, that definitely is not the case. Check out ~/Library and see the ghosts of applications past.
Disk fragmentation isn't an issue on SSDs. Anybody still running Windows on a spinning drive has worse problems than fragmentation.
The settings situation does suck, that's absolutely true.
APFS and NTFS both allow you to choose case sensitivity depending on your preference, so I'd say they're on equal footing there. PowerShell isn't that bad, but you can still use bash with WSL or cygwin if you really want to.
I've always clean installed Windows, so I don't know about vendor bloatware. The bloatiest thing I can think of are shortcuts in the start menu to install ridiculous things like Candy Crush, but those are just shortcut ads, not actually installed things. Distasteful, sure, but easy enough to wipe out with a script.
Anecdotally, I've had way more kernel panics in macOS than I have blue screens of death in Windows 10. The T2 fiasco alone has caused me more crashes than I can count.
Anyway, I think in 2022, operating system choice is more of a personal preference thing than a technical one. They're all equally bad, just in different ways.
IMO powershell is amazing, I even use it as my shell on Linux when I have to use a Linux machine. It’s just “different” so a lot of people don’t bother to learn it or expect it to be bash.
> Anyway, I think in 2022, operating system choice is more of a personal preference thing than a technical one. They're all equally bad, just in different ways.
I like most of your post since I agree with a lot of it, but I'd like to try to convince you that it's never been better to be a computer user. OSes are _mostly_ stable, have a clear design goal for users, have multiple well-supported shell environments for power users, extremely strong cross-system software (i.e., you can use Firefox et al. on basically anything), plugging in hardware very likely will "just work" for most hardware, the list goes on.
Most importantly, UI/UX conventions are pretty well established regardless of the OS you pick. For example, while I'm rarely in an actual UI on the Linux boxes I work with, more or less everything works the way I expect based on Mac/Windows. Hitting the CMD/Meta key brings up a system search utility (this is partially happenstance from MacOS, as I believe it's the default hotkey for Spotlight... but I use Quicksilver on ctrl+space so I might be mistaken). Even the mobile space between major Android releases and iOS is very consistent, Android having a slight blemish in the regional minor releases that lack major Android features.
The point is that I think a lot of established UI/UX defaults for OSes are mature enough that edge cases or test cases are the rough edges. There are elements from each OS I miss, elements they share which are equally frustrating (truly, I wish MS and Apple would document the registry/defaults command respectively in full. Both let you tweak the system to an amazing degree to make a perfectly customizable experience, but you basically have to hope someone else has posted it on stackoverflow/reddit/whatever if you want to discover it).
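For the macOS half of that wish, you can at least do the discovery yourself from the terminal. A rough sketch (com.apple.finder is just an example domain; any app's bundle identifier works):

    # list every preference domain the current user has
    defaults domains
    # dump one app's stored settings to see what's tweakable
    defaults read com.apple.finder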
I grew up starting with Mac OS6 and didn't touch a Windows computer until well into my teens or use one daily until I was 20, and the adaptation period was literally a day or two. My friend was windows/linux for his entire life, saw me using my Mac for browsing tabletop PDFs when we were in university, and was intrigued by the way the UI/UX worked and has been a convert ever since, learning the UI in a few days and only a few questions.
It's really never been better to be a user, and I'm kind of amazed that OS wars continue in 2022. Computers are great. Software/Tech Companies are another story...
> I can't find any (non-anecdotal) evidence that a "bloated" registry impacts performance at all on a modern system. If you have benchmarks or something, I would love to see them! (not being a jerk, I really would).
I have to comment on this because I agree it's a non-issue, but the registry cleaners that exist and people's self-invented solutions cause far more problems than they solve. Which is a pretty easy bar to clear, as typically they solve 0 problems directly.
> lack of degradation of performance due to bloated registry,
Most of the performance degradation on Windows comes from all the bloatware used to spy on you (remember 'Compatibility Telemetry'?). In this regard, I find MacOS even worse (whateverd is using 100% CPU again), and it's even harder to get rid of all the spyware. I think this is one of the few ways Linux beats the competition.
The UNIX heritage thing is eh. MacOS does a very cursory job of pretending to be UNIX, with usually just the very basics working. In contrast, WSL2 is mostly equivalent to working on Linux.
The filesystem stuff is weird. Sometimes MacOS is case sensitive, sometimes not, sometimes it does the weird file locking thing Windows does, that I hate, sometimes not.
I think each OS is crappy in its own way, and I definitely wouldn't put MacOS over Windows.
For what it is worth, I've not had a modern windows install crash on me in years. I think I had a blue screen of death perhaps once with windows 7 (I think?) and was gobsmacked to see it had a qr code and a sad smiley face! How unexpected!
On a Mac I have had the beach ball of death many times, even recently on M1 Macs.
By default it is not case sensitive; it is a counterintuitive case-remembering system. You can't have File.txt and file.txt in the same directory, but a file created as FiLe.tXt keeps that exact capitalization.
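A quick way to see the case-remembering behavior for yourself, assuming a default (case-insensitive) APFS volume:

    touch File.txt
    touch file.txt   # refers to the same file, no error
    ls               # prints File.txt -- the original capitalization sticks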
I have to use a Mac for work, after never really touching the Apple ecosystem before. I was amazed that every other dev I talked with had a list of tiny apps that you need to install in order to have a nice experience. On the latest versions of Windows, things just work (or I'm very good at fixing my annoyances).
I regularly use both Windows and MacOS, though I started with Windows long before MacOS... and I just find MacOS so unintuitive, ugly and clunky!
Also, it regularly crashes when plugging/unplugging an external monitor. Actually, it crashes every now and then for no discernible reason too - feels like Windows back in the day when BSOD was common!
And not MacOS' fault, but I absolutely loathe my MBP's touchpad! Even on the highest sensitivity setting, presses take too much force. And it just feels... weird. Nothing I can put my finger on, but using it still feels unnatural somehow. Plus, it regularly registers phantom touches, which is incredibly irritating.
If it’s randomly crashing that could be a hardware issue. I’ve owned numerous MacBooks and Mac minis over the years, nothing should ever be crashing randomly, especially not macOS.
I’ve got an LG 28” 4K on a 2019 MBP. I’ve never experienced a crash but the monitor will sometimes flicker (lose video for a second; sometimes it will also auto-lock). Just once or twice a day, so no biggie.
It took me a while to configure my monitor correctly, and the color is still off. I got my wife the 24” 4K that Apple recommends and it has been much better for her (as a designer, she wouldn’t have put up with wrong colors).
2 is something I haven’t experienced. I guess maybe you’re using low-resolution monitors? I find text on Windows to be either incredibly blurry or painfully sharp.
3 is something where I think the Mac mindset is to not really use full screen windows. I personally always have part of my desktop visible and most Mac users I’ve seen in the wild seem to do the same. Probably a consequence of the drag and drop mindset (which Windows is painful to use due to it doing the opposite of MacOS when it comes to drag and drop)
Enable the file path setting in your finder options. Super easy to navigate up then.
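If you'd rather do it from the terminal, the same toggle is (as far as I know) exposed via defaults:

    # show the path bar at the bottom of every Finder window
    defaults write com.apple.finder ShowPathbar -bool true
    killall Finder   # relaunch Finder so the change takes effect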
Disk images are nice in that usually they contain a drag and drop application install process, while windows often has some messy installer that leaves a trail of shit in its wake that’s sometimes impossible to fully remove when you’re trying to uninstall the program. But… I think a zip file would do all a disk image could and make more sense to people. Sometimes I do encounter zips, though.
Of course, it's all preference, I'm not making a case to say it's objectively bad, just I find it inferior.
The monitor is a ultrawide LG that works perfectly on windows. A Google for "blurry text external monitor MacBook Pro" turns up many people with the same issue (edit: it's a 1440p monitor)
> I'm not making a case to say it's objectively bad, just I find it inferior.
That's good, because usually in this kind of debate there's an undercurrent of "I've been using X for decades (and so know it inside out). When I try Y, anything that differs from X is objectively wrong"
I have an LG ultrawide with native res 3840x1600. Text is very blurry at that resolution though, so I use SwitchRes X to turn it down to the highest HiDPI resolution and it looks great.
I agree that it shouldn't be this way though, I'd have preferred to set the native res and be done with it.
I agree. In my experience, Windows "just works".
And when it comes to working on graphics-intensive applications, and using commercial software packages it's a shoo-in.
For all the hate MS gets, windows 11 is pretty much perfect. Never had any issues with it whatsoever. Even teams runs well on it, not so much on Mac from what I hear.
"I can't share my screen" "Teams has crashed"
I would be furious if my 2k EUR Mac couldn't run a chat app.
Yeah… you lost all credibility when bringing up teams as a benchmark. It’s a piece of utter steaming junk that I would barely call software. Zoom and slack work just fine because they weren’t built by imbeciles.
Zoom just sits around devouring CPU while it's not even on a call.
If I ever walk in and my MacBook sounds like a jet trying to take off and is hot enough to cook dinner on, I know it's because Slack wants to update. I have no idea what it's doing to show me that "needs to install a helper" dialog.
Not a Mac developer here, but I have spent significant time on both platforms and have run Kubuntu/Linux for 4 years now.
It probably depends on what you are developing, but if you are doing backend stuff, it will very likely end up running on a Linux box. This means having your platform, terminal and so on closer to Linux is useful (e.g. you have bash).
Windows has WSL now, so this would be a way as well.
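On a reasonably current Windows 10 or 11 install, that whole setup is one command from an elevated prompt (Ubuntu is the default distro, but you can name one explicitly):

    wsl --install -d Ubuntu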
From all UIs I like KDE Neon the best. It is truly a good experience.
I think it’s mostly linked to what you are used to.
I never Alt+Tab on macOS because I am used to using Exposé, and the Windows equivalent frustrates me to no end with its extremely slow animation; same thing with virtual desktops, which feel very clunky on Windows but work fine on macOS.
Huh, Mac is the one platform where I avoid virtual desktops, because the experience is so clunky: no keyboard commands to move items between spaces or jump to spaces, and the weirdness with fullscreen windows being their own space but then returning to a previous space when fullscreen is exited.
> no keyboard commands to move items between spaces
Not natively but Rectangle.app is a FOSS app that has them.
> jump to spaces
Control+(left arrow|right arrow) does it for me. It doesn't hop across desktops but it gets me where I want quickly without moving my hand off the keyboard.
> fullscreen windows being their own space but then returning to a previous space when fullscreen is exited
I'm not sure what you'd consider superior to this. If I have a window on Desktop 2 and make it fullscreen, I usually want it to go back to Desktop 2 when I exit it.
>Control+(left arrow|right arrow) does it for me. It doesn't hop across desktops but it gets me where I want quickly without moving my hand off the keyboard.
You can create desktop-specific shortcuts in Keyboard → Shortcuts → Mission Control → Switch to Desktop N
Couldn't agree more. The UI in MacOS is actively anti-user in my opinion. E.g. menus that are not attached to the window, a task bar that hides the open windows, maximise that doesn't, "natural scrolling" that is unnatural if you use a mouse, no proper window snapping and layouting, etc.
The only thing going for MacOS is the linux-like terminal. Now that Windows has that too (WSL) I see no advantage.
However, the hardware is nice. Way nicer than any Windows Dell or Asus or whatever I've had.
I like Macs a lot but agree in a couple of ways: Full-screen mode wasn't done well, and they never fixed it. Disk images don't make sense for distributing software cause they take longer to open for no reason, though the usual Windows way of having an exe installer is even worse.
> Why does the alt tab equivalent cycle through applications first, and I need another shortcut to cycle through the windows of that application?
I guess I’m the opposite: a single command+tab switches between apps, then command+~ cycles through the windows in the app, which is very intuitive to me. Cycling through dozens of windows on Windows is laborious and confusing in my experience.
And just to make sure your brain always has to concentrate when switching (beyond the fact that you now need two sets of shortcuts instead of one): Cmd+Tab cycles from most-recently-used (what you would expect) but Cmd+~ blindly switches to the next window in a cycle (not what anyone expects or wants).
Because it’s natively UNIX-based and thus well-supported by the tooling.
Sure, Windows has WSL and it’s great and it’s gaining developers but IMHO it isn’t quite there yet.
And it’s really important to acknowledge that there are a metric crapload of real world devs on Windows. I wouldn’t be surprised if the vast majority of programmers were on Windows.
I'm primarily a Linux developer, but am now forced on to an M1 mac for work, and it's hot garbage. They force their own half-assed unsupported ways of doing things and break the standards that work on both Windows and Linux.
The state of gaming is actually worse than on Linux.
I prefer all my apps fullscreen and it’s so easy to swipe left or right to change desktops or between fullscreen apps - a life saver when you’re on a single screen (I actually use it with multiple screens too). Windows is only now catching up in 11.
> Why are 3rd party applications distributed as Disk Images? Then some as package installers? It's weird and nonsensical.
Maybe you use a lot of paid software?
I just use homebrew for everything, and it just works(tm) 98% of the time. Which is way better than my experience with Windows. I tried using a variety of Ninite, Chocolatey, etc., and none of it seems particularly mature.
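For anyone who hasn't tried it: Homebrew covers both command-line tools and GUI apps, which is how it sidesteps the .dmg/.pkg mess discussed upthread. A few representative commands:

    brew install wget             # a CLI tool
    brew install --cask firefox   # a GUI app that would otherwise ship as a .dmg
    brew upgrade                  # update everything in one go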
> Mac hardware feels great, but I just cannot understand why it's the preferred developer platform.
In my experience, Windows is developer hostile if you aren't using Visual Studio. Almost everything is deployed to Linux, and OS X is close enough that most stuff just works, but the drivers and UI are also completely dependable.
It's infuriating. Comes across as patronising, like Apple assuming that the concept of a disk representing a tree of folders and files, with me currently viewing a specific position in that tree, is too complex for me to understand.
> Why does the alt tab equivalent cycle through applications first, and I need another shortcut to cycle through the windows of that application? When I have an IDE with 5x instances, hosting different projects, it's a pain switching back to the project I want from another application.
I find your feelings really interesting here. I grew up with Windows too, but realizing that on Mac OS X was an "a-ha! brilliant!" moment for me. I really loved that it was available on Pop!_OS.
> Mac hardware feels great, but I just cannot understand why it's the preferred developer platform.
Depends what the word developer means....
In the context of the Apple ecosystem it's quite good; in the context of Windows or game console development, not great; for embedded development it depends on the luck of hardware vendor support for macOS; and in the context of certified UNIX development, it gets the job done.
Other than the taskbar from Win95-XP, I find everything that MacOS does more convenient and easier to use. The Disk Images are especially well done, considering each is a less limiting form of read-only ISO disk image. Also, Finder is better in most respects than base Windows Explorer, with the exception of sorting directories before files and multi-select through the GUI.
1. Really nice trackpad. It tracks smoothly. Really well integrated trackpad gestures. It makes a difference.
2. It is literally UNIX
3. Native-app developers that care (eroding year-by-year, but software companies like Panic, Omni, BareBones make first-class software that belongs on the platform). (Windows certainly has more apps, and probably more high quality apps by volume, but Mac apps tend a bit more toward high-quality-density in a more limited application pool)
4. Drivers. I swear, I get laptops from manufacturers with bad Windows installs, glitchy drivers, poor configuration, etc. I also hate cleaning crapware off of new machines and fresh installing OSes.
5. Lack of options. Few people explicitly value this, and sometimes, the Mac way is dumb, but since it's often the Mac way or the highway, it takes some cognitive overhead out of computing.
Overall, I find the Mac less of a hassle. Most things are pretty good, and you mostly can't fiddle with them too much, so you get the strength of that on the whole, not in one particular dazzling thing (except the trackpad). The garbage window management and multi-tasking are indisputable, though.
I’ve used dos, windows, linux, and mac. I appreciate the fact that macs are not very tweakable.
I won’t speak for others but for me endless customization of everything supposedly for optimization was just another form of procrastination.
I also appreciate that it mostly either works or it doesn’t. Either it’s broken and I need to swap it for a refurb, or it works. Not much staying up late reading obscure Stack Overflow posts (in the old days web forums, in the old old days newsgroups) and trying this one more thing to fix the problem du jour.
The design choices to be different really grind my gears. For example the window close buttons. There's no reason for a western user group to put action buttons on the top left of the window, but Mac did it, to be different.
This is the least relatable comment I have ever read on Hacker News. When I switched to a Mac from a Windows machine it was like switching from living in a van to living in a luxury condo. I will never go back.
Because it's easy to manage packages using brew; Windows has... winget, which is all but abandoned.
For the longest time there was no such thing as WSL so Mac had a serious advantage for us who migrated between Linux and Mac.
And as you've mentioned I've had so many quality control issues with Windows computers including Dell and Asus that eventually I just said fuck it I'd rather spend 3 to 4 grand on a machine that I know will work flawlessly.
Hear hear for Homebrew, which in my mind is single-handedly responsible for getting Apple to stop shipping old versions of scripting languages. Soon, with Ventura, we can link the Brew version of Python, and Ruby, and everything.
Apple no longer ships scripting languages with macOS Ventura. They had said, back with Catalina, that future versions of macOS wouldn’t have them. I guess future meant this one.
You have to install them yourself, just like Java was removed a while ago. Honestly it’s great news, since Apple has always shipped very old versions, and you always had to be careful when linking against the system versions. Ruby in particular is batshit insane for managing multiple versions, at least for me.
I absolutely believe that the ascendancy of Homebrew is single-handedly responsible for the removal of those languages. Homebrew is magnificent, and with its recent change of the default install location to the correct Unix location, it’s a nearly perfect package manager for macOS.
> Why does the alt tab equivalent cycle through applications first
I've always preferred this to Windows' approach; as soon as I have more than a dozen open windows I find it very difficult to find what I need on Windows, whereas with macOS there's an extra layer of hierarchy to speed up the search. They really could add it as an option though; you shouldn't need a third-party app for this.
> I attach a non Mac external monitor to a new MacBook Pro from work and the text is super blurry
No idea what the virtual display thing is, and the issue may be the EDID thing that another commenter mentioned, but macOS' handling of "non-retina" screen resolution is absolute ass since they removed subpixel antialiasing a few versions back. You can still disable the gross "font smoothing" via a terminal command which helps a little, but since subpixel AA was removed, text on "low resolution" displays -- i.e. sub 150ppi -- looks like shit compared to Windows.
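For reference, the command usually passed around for this is the global font smoothing default; as far as I know you need to log out and back in for it to apply everywhere:

    # turn off macOS's font "smoothing" (which these days mostly just fattens glyphs)
    defaults -currentHost write -g AppleFontSmoothing -int 0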
Combined with the rather primitive way macOS handles UI scaling, the almost total lack of available > 4K displays for > 21.5" screen sizes to suit said scaling, and the fact that a good number of their most popular computers still ship with a non-integer scaling mode by default, getting an optimal image from a Mac is far more difficult than it should be. I know it's all good enough that most people don't care, but this is supposed to be the company where "good enough" isn't enough.
> Why are there 2 modes of "full-size" windows? Regular style "fill the window", then another weird one that sort of takes over the whole screen
The "take-over" one was added during the peak of the iPhone and iPad exploding in popularity, when Apple was copying many iOS features back to the Mac. I don't begrudge it being there (my wife uses it almost exclusively for some reason), but I do wish I could switch the default behavior. I strongly dislike the notion that windows should be full-screen unless sharing with another app -- I prefer windows to be the size of their content, which is what the green button on macOS used to do (and still does, if you Option-click it). Windows makes this annoyingly difficult, though the new expanded snapping options in Windows 11 are very nice.
> Why does finder sometimes open in a sub folder as the "top level", then it's a total pain to move up to where you want
I've always found Finder to be the worst part of the Mac. Who knows why it does what it does. What determines the default window size? Why is it always far too small? We may never know.
> Why are 3rd party applications distributed as Disk Images? Then some as package installers? It's weird and nonsensical.
.app bundles are actually directories with metadata to make macOS treat them like files, so they can't be shared as-is. Sometimes you get them in plain zips, but I think the .dmg thing is a holdover from classic Mac OS. .pkg installers are usually used when the app needs to install extra stuff outside its bundle -- just like how sometimes Windows programs are an .exe and sometimes they're an .msi.
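You can see the bundle-is-a-directory thing from the terminal with any installed app, e.g.:

    # an .app is really just a directory tree with a well-known layout
    ls /Applications/Safari.app/Contents
    # typically: Info.plist  MacOS/  Resources/  ...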
I love macOS as a platform, but I completely agree with everything the person is saying in the linked Twitter thread. No-one at Apple seems to be equipped to steward the platform anymore, so the software decays even while the hardware is going through a renaissance.
Unfortunately, the investment that is being put into it is largely to give application parity with new features they add to iOS. Things are superficially changed every year (often for the worse), but the actual base OS doesn't seem to be getting much in the way of features or improvements. Why do I need a third-party app to get reasonable window management? Why do I need two third-party apps to make a mouse usable (one to make the side buttons work, another to fix the scroll momentum and have a separate scroll direction to the trackpad)? Why do I need a third-party app to get a calendar in the menu bar? Why do I still get yelled at for yanking an idle USB stick without ejecting it first?
That turned into far more of a rant than I intended.
I feel like I may be the only developer who prefers Windows to macOS.
No, I'm also using Windows, having many years of experience with all of the mainstream OSes.
Mac hardware feels great
Does it?? Maybe, if you cover the price with your hand and close the browser tabs showing same-priced PC configurations. Also, their aluminium edges are pretty sharp; I can remember a few times when I scraped the skin around my fingernails on the display or keyboard edges.
Started with Windows XP. I quickly moved to Ubuntu in maybe 2006, after reading a book about free software, which I'd known nothing about until then.
I've been using Linux at home and Windows at work ever since.
I've never thought the apple premium was worth the cost especially when I could use decade old hardware (cheap) perfectly fine with modern operating systems.
All this being said, every time I'm given a Mac to use it's a bad experience. I find it very hard to see how people find it an easy operating system to use.
I'm aware this is just my experience, but every time Macs are mentioned all you ever hear is how easy they are to use. When I'm using one I have to Google how to do simple stuff like copy and paste.
Actually, I got this mixed up with taking a screenshot, which kinda proves my point, because I had to Google to find this out. Also, no one can say Cmd+Shift+5 is intuitive for taking a screenshot; PrtScn is much easier.
Of course it is, that's been my point throughout. Every argument here is subjective. It's all based on experience and anecdote. Most people who use Macs seem to think their experience is somehow an objective truth; it's not.
I use a Linux desktop and have no issues whatsoever and never have to tinker with it, nothings ever broken; I experience none of the usual tropes people roll out as reasons they don't use Linux on desktop instead of Mac OS. And similarly when using Mac OS I experience none of the "amazing" user experience touted as the reasons everybody should use it.
My experience is no less valid than anyone else's, and also no more objective.
EDIT:
Unless of course you're talking about the keyboard combo for taking a screenshot? In which case I'd hazard that pressing one button named "Print Screen" is objectively more intuitive than three buttons with no obvious connection to the job at hand.
I believe that Apple is strategically choosing to coast on their desktop software while they work to harmonize their mobile, desktop, and entertainment software stacks.
Why? Because based on rumors, patents, and the decisions they've made and the actions they've taken (or not taken), it appears that Apple believes that in the not-so-distant future the concept of having a separate mobile device and a computing device will be irrelevant. Consumers will have a single device, and that device will be capable of some form of altered-reality interaction, or at least some impressive context-switching behaviour.
It's already the case that an entire generation has been raised with its primary computing experience taking place not on general computing devices but on some variant of mobile or tablet computing. It's not too much of a stretch to consider that they will be most comfortable with a device that behaves with some familiarity to that experience, but which can offer much greater utility in a pinch. And it will be mobile, and it will require a better input experience than mashing your fingers on a small screen that splits its available space between the important information and large input elements.
> the concept of having a separate mobile device and a computing device will be irrelevant.
Well ... yeah. People have been predicting and experimenting with this for more than a decade.
But when you do this, some parts of each platform have to suffer. It's a compromise. User interfaces have to change, designs have to consolidate, performance or features take a hit.
So the challenge becomes how do you sell this? Seamlessly moving from desktop to mobile sounds nice but is it actually practical? Do we actually want it?
Let me use a single device but via different hardware depending on context.
Give me a phone I can plug into 2 monitors and a keyboard and interact with as a desktop, or slot into the back of a laptop case for portable computing, or put into my pocket.
Actually sounds pretty good, but how feasible is it for a phone-sized device to be sufficiently high-spec to run, e.g., fully featured development tools? I mean, you need 32 GB of RAM, an 8-core processor and a 512 GB SSD these days just to run the basics (maybe less if using Linux); if you're doing modern game development or CAD work, probably twice that. But something you could dock your phone into as needed, supplying the extra resources for a seamless experience, might be feasible?
I think if/when we get to the point where people with browser based workloads can just sit down at a desk and have the monitors and keyboard automatically connect to their phone, more hardware intensive software will quickly move to the cloud.
You’ll be doing game development and CAD stuff with your phone acting as a thin client.
Yeah, I've tried doing cloud-based development. It's a no from me. I'd think that's further off being universally workable than the phone-docking idea. Though if you've already decided to get fully in bed with a particular cloud provider and tie yourself to their proprietary protocols, I can see the advantage.
Yeah I think that's a matter of software, plus optimization for a particular use case. Looking at Geforce Now/Stadia/Xcloud/Luna, the underlying technology exists, it's just a (not so small) matter of updating RDP's video stack to be seamless.
I don't have an issue with unix but I still rely on being able to work in scenarios where I don't have a reliable/fast-enough internet connection to be able to run multiple hi-res GUI desktops remotely. If we get to the point super-low-latency network connectivity is as available as electricity, then sure, it's feasible. But even 30ms latency can make GUIs feel clunky to work with. I know some devs seem to tolerate it though.
Some level of "convergence" works on linux phones (e.g. Pinephone, Librem 5), with the same "mobile" apps playing nice with the rest of the linux desktop/userspace. I think it will be the future of consumer electronics.
That seems exactly the opposite direction they've always gone, though. Microsoft tried to have a completely unified PC/tablet experience, where their desktop OS became almost completely tablet-focused in Windows 8. Whereas Apple has always made a strong distinction between e.g. the iPad and the Mac, refusing to build a convertible Mac or any Mac with a touch screen…
I hate to remind you this, but Windows 8 was 10 years ago.
At that time, the iPad was only taking baby steps as the blown-up iPod Touch it was; its selling feature was reading The New York Times. In contrast, OS X was at the beloved 10.8 Mountain Lion — the version considered by many the pinnacle of Mac-assed macOS, and the definition of what Apple's desktop experience should represent.
So yes, in 2012, what you’re saying about “strong distinctions” was completely true. At that time, “computer” was synonymous with “Mac” in the sphere of Apple and its users.
Meanwhile, look at macOS now. Its last design overhaul, 2 years ago, had one goal: make all control elements bigger, blobbier and simpler.
Last year’s updates were almost exclusively feature parity with iOS and iPadOS, and/or better collaboration of Mac with the other devices.
This year, the lineup of new macOS features consists of:
• better collaboration between Mac and iOS/iPadOS
• ports of two applications (Clock & Weather) from iOS/iPadOS
• redesign of an integral part of the OS, to make it look like iOS/iPadOS
• port of a window management tool which was very clearly designed for the iPad first and only added to Mac as an afterthought.
To say that the writing is on the wall would be an understatement. Yes, Steve Jobs made a clear distinction when introducing the iPad, but from the day he disappeared, Apple has been working steadily to remove that distinction.
You're missing the forest for the trees - it's not one device, it's one user interaction layer. Drag and drop from real life to AR, to Illustrator, to a printed document should follow one framework.
We've entered a realm where context dictates how users pursue their computing aims. The paradigms for user interaction change if you're on a bus, at a desk, or walking in the park. A unified abstraction for achieving user intent may be one device, but a continuous ecosystem certainly requires one language and one API paradigm. That said, there are good ways of going about it (Mac Catalyst) and bad (SwiftUI) - and I say this in light of whatever VR/AR is coming.
I know a guy working on this for AR, and I thought he had zero shot on earth, but seeing Apple go for SwiftUI greatly increases his chances.
I'm skeptical of the "one device" future dream when so much of Apple's revenue comes from selling multiple devices in different form factors. Why would Apple want to sell you one device when they could sell you an iPhone, an iPad, AND a MacBook?
They will still sell all three, along with many more devices. The brain, the most expensive part for Apple to make, will be the iPhone. I'm sure they will still make standalone devices as well, but the pinnacle will be a singular compute device that works seamlessly with a wide range of "dumb" docking devices. We have seen previous iterations in things like the Asus PadFone and Motorola lap docks.
Apple's iPad and laptop revenue combined is only around 15%, vs. 50+% for the iPhone.
They can sell docking stations that support both wireless and physical docking, as in: they are usable when paired wirelessly, but for maximum performance you slip the phone into the docking slot.
The docking devices (tablets, laptops, home centers, desktops, gaming systems, digital media equipment, etc.) can be sold at great profit margins and at various spec levels (resolutions, coprocessors, input sensors, etc.).
> Consumers will have a single device, and that device will be capable of some form of altered reality interaction, or at least some impressive context switching behaviour.
... plus for real work they will have Microsoft Windows 14, so they will be covered.
The Mac has had several record quarters since Apple Silicon Macs started shipping nearly two years ago. Let me repeat: a nearly 40-year-old computer has never sold better than it does now.
The Mac generates more revenue than the iPad; that's been the case for quite a while now.
Apple's latest quarterly report shows for the first 9 months of the fiscal year, Mac revenue is about $6 billion ahead of the iPad [1].
I get it; there are all kinds of different UI/UX paradigms that didn't exist back in the day and some of them are out of place.
But I've been using the Mac long enough to remember how fragile System 7 was, and how INITs could crash your machine because everything lived in the same address space. On a 25 MHz 68030-based Mac with 2 MB of RAM.
The Mac is in much better shape now than it was a few years ago, when Apple couldn't ship stuff because they were beholden to Intel's processor roadmap, which was often late.
I don't mean to echo something we hear at each Apple presentation but these are the best Macs they've ever made. Old timers will remember the multiple times the Mac (and Apple) were doomed. I don't expect that to be the case; I guess we'll revisit when the Apple Silicon Mac Pro ships later this year or early 2023.
I think both Windows and macOS are in weird places. Linux is Linux, and I don't see it suddenly becoming a better option for the masses any time soon, so I don't know what the solution is.
macOS could be great with some small tweaks to the Finder and a modernization of how windows, full screen, and external displays function. They just need a comprehensive rethink here, because it is clunky.
Windows is actually, for maybe the first time in decades, not that clunky anymore—yay! But Microsoft is still Microsoft and bugs are plentiful. For every bug they kill, they implement two more. And not just small edge cases: normal everyday functionality just won't work for multiple releases. Outlook crashes when I pin an email? The Xbox app won't download game updates? And then there are all of the house ads, but those can usually be hidden away.
Linux, well, is Linux. I get it, you have it setup perfectly, for you. And it is super stable, for you. And it has come a long way in usability, for you. It just isn’t a suitable alternative for your average user.
I think both Apple and Microsoft are taking their operating systems for granted, because they are not the revenue drivers they once were. Instead they are the platform that all their revenue drivers hang on to.
> Linux, well, is Linux. I get it, you have it setup perfectly, for you. And it is super stable, for you. And it has come a long way in usability, for you. It just isn’t a suitable alternative for your average user.
Funny, I feel the same way about both Windows and macOS. Although both have made massive steps backwards lately.
I don't think so, but I am running the Pro version so I can't speak with authority. Windows does do some weird stuff particularly after larger updates, but I haven't run into anything quite as egregious as that.
> Fresh new hires, who’ve never known a reason to care about the Mac, trying to rewrite key portions of the OS
Apple Support engineers sometimes don't understand the basics. I noticed my 15" 2880x1800 screen showed a resolution of 1440x900 after the 12.5 upgrade. I raised a ticket and it was patched through to Apple US, who ran a bunch of tests remotely & were confused about the what & the why. I'd never bothered looking at resolution before, so this half-resolution thing bothered me.
I dug out an exact replica config from the office running 12.0.1 & voila, it reported 1440x900 too. Then it dawned on me: the Retina screen allocates four hardware pixels to each 'screen' pixel, so the effective canvas size is 1440x900 even though the panel remains 2880x1800 at 220 ppi.
The Apple engineer in the bug report is still struggling to understand this. I have no words, except to tell him this is a feature, not a bug. I genuinely think the software devs within Apple are very fragmented & don't really understand the customer-facing issues entirely.
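For what it's worth, anyone can confirm the two numbers from code. A minimal AppKit sketch -- it has to run inside a GUI session, otherwise NSScreen.main is nil, and the example values assume the default Retina scaled resolution:

    import AppKit

    // The frame is measured in points (the logical canvas); the backing
    // scale factor says how many hardware pixels back each point.
    if let screen = NSScreen.main {
        let pts = screen.frame.size
        let scale = screen.backingScaleFactor  // 2.0 on Retina panels
        print("canvas: \(Int(pts.width))x\(Int(pts.height)) points")                 // e.g. 1440x900
        print("panel:  \(Int(pts.width * scale))x\(Int(pts.height * scale)) pixels") // e.g. 2880x1800
    }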
The future of Apple is half iOS apps and half Electron apps. It's a shame, because they used to be very good at native apps. I understand the economics of it, but it's still a shame. Worse may be better, but it still feels bad.
Half of this rant is “things were better in my day and the younger generation can’t do what the older generation did”.
No, the space has moved on from catering to you. That’s different than them not caring.
This entire rant is so egocentric that it doesn't really add anything of value to the public discourse. That, in my experience, is typical of the author. Steve Troughton-Smith has some of the worst takes on Apple. A lot of his understanding of the technology is just outright wrong.
His subjective opinions are valid, but he states them as facts, and that rarely holds up on closer inspection.
r/unixporn is full of horribly impractical and poor UX made by bored people who don't really use their computers for serious work. Just tinkerers. The same crowd who post elaborate and artistic BuJo spreads with to-do entries like "Post new photos of BuJo to Reddit" and "Get some new pens for BuJo". Meta crap.
If you take the doors off of a small city car to make it quicker to get in and out of at the expense of keeping thieves and the weather out, is that good UX?
I use Linux, and my whole family drives classic cars and kit cars. My father ran a garage at one point.
My point is that what's on /r/unixporn is predominantly gimmicky nonsense, with precious few useful ideas. I'm not blind to the difference between style and substance.
I have only been using macOS for a few years now but it’s only been getting better for me. The new control panel thing is great, the M1 is mind blowing and everything mostly just works.
Have been using Linux for 10 years now and while I still respect the work that has gone in, you have to be very forgiving of all the stuff that just doesn’t work right.
As a former moderator of r/unixporn (I went by u/fps_co1ncidence) and a longtime ricer of all types of Linux WMs/DEs, I can most confidently assure you that there is nothing futuristic about what goes on there, just a lot of pointless tinkering and dotfile copy-pasting.
The macOS experience is exponentially more polished and Apple will have to screw up phenomenally even from their current position to feel any threat from the FOSS community.
The future, hilarious. The stuff on unixporn is exactly what I was into when I was 15 years old. I had the sickest fluxbox setup, transparent terminal, all kinds of neat widgets, completely customized Irssi themes. It owned.
Fundamentally, I believe that Apple's biggest fans fall into two different camps. In one camp, you have people who effectively want macOS on an iPad with a touch-centric UI which is made obvious by their assertion that a unified everything across all devices is the correct path forward. The other camp wants a traditional desktop operating system on their Macintosh computers.
Personally, the only reason I enjoy Macintosh machines at all is the UNIX platform with a distraction-free environment. That has been dying, and I have therefore gone back to using Slackware with TDE. I don't want 50,000 notifications. I don't want my computer to be a dumb terminal for network services. I want my UNIX environment to be a first-class feature and not an afterthought. The Apple Silicon chips are the sexiest thing to happen in technology since the introduction of the first Threadripper, but they aren't worth the trade-off for me personally. Being forced back into the Apple world, with what is in my opinion a distracting environment, just makes me less productive.
Back in the 90s, when I was doing graphic design work, you had to use a Mac. Windows wasn't up to the task: its PostScript support was bad and service bureaus wouldn't guarantee output. I recall watching three IT guys struggle to get a scanner working right on a PC. Probably a lot of younger folks don't remember just how crappy Windows used to be. There's not as much of a gap now.
But then, I think the Mac has been in decline since OS X 10.6 at least. It's just more closed, more bloated, and less reliable since. Now I prefer to use Linux at home, though I have a Mac for work.
I think Apple is just not interested anymore in selling general computing. It's not where the big dollars are.
It's not that general computing isn't worth doing right. Many people still need to create deep content, rather than just consume it or, at best, churn out some mindless TikTok nothingness.
And it illustrates the problem with opinionated design. If the designer's opinions don't correspond to your own, it's not useful anymore.
Most Mac users are also iPhone users. Most iPhone users are not Mac users. To grow Mac users you need a consistent UI experience, not concepts alien to iPhone-only users. Unifying the interface is evolutionary; iPhone equivalence is the outcome.
After all, the App Store makes money from iPhone and iPad apps not Mac so Mac needs to evolve in the slipstream of iOS investment.
I haven’t run Windows since 2014. I got into the Apple ecosystem in 2018-2019 (first Mac, then first iPhone). By 2021, I was completely disenchanted and bought a Framework DIY (now running Ubuntu/GNOME). Honestly, the only thing I truly miss when not using a Mac is easy copy-paste in the terminal. In every other way, GNU/Linux is a superior experience.
The Elementary OS terminal has a dual-function copy/paste: if text is selected, Ctrl+C copies it; if not, it passes Ctrl+C through to the terminal. It's a reasonable solution.
For me it's the overall Command-key semantics that I miss when using Linux. I like being able to use readline shortcuts in any text field (Ctrl+A, Ctrl+E, etc.). macOS really nails it there, and no desktop environment on Linux even attempts to replicate the feel.
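The machinery behind this is the Cocoa text system: every standard text field maps keystrokes to named editing selectors, and you can even extend the map yourself with a ~/Library/KeyBindings/DefaultKeyBinding.dict. A small example from memory -- the selectors are the stock NSResponder ones, and Ctrl+A/Ctrl+E already ship by default, so this just adds Home/End line movement:

    {
        /* Home and End move within the line, like Ctrl+A / Ctrl+E */
        "\UF729" = "moveToBeginningOfLine:";
        "\UF72B" = "moveToEndOfLine:";
    }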
The other thing I miss is how VS Code's text-navigation shortcuts work. The modifier combinations for selecting text, navigating (with or without a selection), moving and copying selections, and how all that interacts with multiple cursors are totally different on Windows and Linux. I'm not sure it's even possible to replicate without the Command key, given how the Super key is treated on Linux.
It's all very depressing and makes me feel held hostage on macOS because of my muscle memory. I don't like how Apple treats me.
This whole rant touches on the true topic at the heart of it all. The software developer, a.k.a. the programmer, used to be the key person behind everything, just like the bricklayer or the road builder. But now that the roads have (mostly) been built, or so it seems, which professions are in the spotlight, and, even more importantly, which are in the shadow?
The erosion of the Mac experience began with Sublime Text. It's a perfectly cromulent editor, but it marked the inflection point where the Mac was becoming an important business target for efforts outside the ecosystem, and where developers who hadn't been on Macs by choice were being issued them by their employers. It was the first prominent third-party app on the platform that was both non-native and enthusiastically adopted. Formerly native apps embraced more cross-platform junk (Adobe famously moved a lot of its suite toward Flex, based on Flash). And by the point Macs were becoming anything like commonplace, everything was a web app.
Not that I disagree Apple’s UI frameworks have degraded the experience. But the whole platform was a wasteland before they even had a chance to degrade anything.
The thing is, though, that even though Sublime Text doesn't adhere to all Mac UI conventions, it does respect the most important ones, and its appearance can be brought further in line with custom themes. Additionally, it has always performed excellently, which makes up for shortcomings in overall "nativeness".
This cannot be said for much if not most cross-platform software that runs on macOS — much of it is strictly lowest common denominator when it comes to UI conventions, system integration, etc., and more often than not it is slow and bloated.
I would say the erosion started in earnest with Atom. People flocked to it because it was free (even though Sublime kinda was too, being nagware) despite its horrendous performance and its lack of Mac-likeness, setting a precedent for the acceptance of generic UI, sluggishness, and high resource consumption.
I agree. Interesting point. I was one of those who switched, and, looking back, I didn’t give any thought to how Sublime was built vs. how Textmate was built. A testament to the performance and level of system respect from Sublime, but it did popularize the viability of the approach and a lot of less careful software followed.
The only thing really differentiating ST from "fully native" apps (like TextMate) is that more of the UI is custom. It runs native code (ObjC/C++), uses AppKit and native dialogs, and generally follows platform conventions.
I recently upgraded macOS and now TextEdit occasionally crashes. And pasted text has bizarre spacing, even in plain-text mode. Are they eating their own dog food?
That's my biggest gripe with macOS: the yearly release schedule. Give us two or three years on each major release - at least.
Take the labour saved and do something about bugs that affect system stability (the well-documented finder memory leak, for example), particularly problematic apps (looking at you, Music.app) and the woeful developer documentation.
Yes. Yes yes yes. The desktop is there for work, and having a stable interface + a stable, bug-excised system makes everyone's lives better by making work smoother and more predictable. Apple makes great hardware; it deserves software to match.
A couple of observations. The first is that the vast majority of concerns are coming from the independent iOS/Mac developer scene. Some of these concerns are legit. There are unquestionably problems with the quality of some of Apple's software. The new SwiftUI-based Settings app in Ventura is not yet stable, and when Apple's entire raison d'etre has been "it just works", it's understandable that people get upset when things just don't work. If I were an independent Mac developer, with my only source of revenue depending on Apple's goodwill and development APIs, I can see being upset.
On the other hand, it seems to me that the Apple community, especially the independent Mac community, has become increasingly a toxic monoculture. Just like the polarization affecting other parts of the world, the community has become vitriolic, isolated and extremist at times. Anything not done in line with the one true Steve Jobs way, in the way the community wanted - the Apple way - has been labeled a disaster and "problematic".
In particular I think of the crusade/jihad against 1Password 8 when 1Password went Electron (in part to address many of the same problems). It got to the point where people were posting "evidence" of how bad the Electron-based 1Password 8 was by posting screenshots of a misidentified previous version (which was worse). The Twitter abuse heaped on developers just working to make a great product was a bad look.
The Apple podcasting community talks about Apple the way Trump fans talk about Democrats or Democrats talk about Republicans. It often bleeds into non-Apple content as well. I've consistently found that the angriest people on Twitter are Apple partisans.
I finally unsubscribed from Accidental Tech Podcast, after listening since almost the first episode, just because of how negative everything is. It's not universal - I found another Apple-centric podcast to replace it - but it is pervasive.
To be fair, a lot of that is just what modern media does. For example, Linus Tech Tips' TechLinked channel has had the following headlines for its videos over the last 90 days:
1) Apple Betrayed Us All
2) Apple has no Shame
3) Apple got SERVED
4) Apple calls this PRO?
5) It's over Apple, the EU won!
6) Something is wrong with the M2 Apple Air
7) Apple will never escape this.
so I am inclined to treat a lot of this as just what people do to get listens and clicks.
I think the right perspective is somewhere between the two extremes. I recently wrote an app in Swift and SwiftUI, and it was amazing how little effort it took to get a good (but not great) app and interface. Much faster than Flutter and far more visually fluid (animations, look and feel, etc.). I can see its limitations, though, and if all I knew were Objective-C and AppKit, I could completely see why a developer would be skeptical of the approach. But the promise is most definitely there for a brighter future. In the meantime, my personal PoV is that things like Slack (which is Electron-based), 1Password 8 and Adobe Creative Suite show that you can be both pluralist and great. Attacking them for not being pure isn't helping anything.
Long term, I have hope. Most of the criticisms of Apple's software are echoes of similar complaints about the horribleness of Apple's laptops a few years ago. They turned that around, and I think Apple is honestly working to fix their software in the same manner. Apple needed to get off of Objective-C (who knows C anymore, to say nothing of Objective-C?) and AppKit (30-year-old technology at this point). Like the Intel-to-ARM transition, a lot of work is going on behind the scenes to make the transition work.
Apple really wants to be the BMW of the computing world - and Tim Cook's entire strategy seems to be BMW's nickel-and-dime strategy. If Apple really wants that, they need things to just work.
Apple should readopt the old Mac OS X HIG principles and make them mandatory. The UX is so broken that at moments I get Windows Vista flashbacks.
But hey, I am an old user. I should move away and say: yes Apple, you know better than me. :)
Like, really horrible as a desktop macOS feature IMHO.
I feel like Apple are trying to shoehorn bad iPad features into macOS to unify them. Instead of making iPadOS more like macOS and therefore more useful generally.
I don't know - something is screwed up in Apple's strategic thinking.
They admitted as much, though, with their Back to the Mac push a few years back.
Shrug.
I've been using Macs since the mid-80s, and I'm very conflicted.
The experience isn't the same anymore, for me. But then the computing world isn't the same, and my needs aren't the same.
I used to customise and feel a personal connection to my Mac. Now they are interchangeable and disposable. Like servers, they are cattle not pets.
The new M1 MacBooks are amazing though, and we've never had it better in many ways.
>But then the computing world isn't the same, and my needs aren't the same.
Strongly disagree, the needs are the same, the interfaces and information flow are different.
This is not a reason to abandon basic UX and UI principles.
On the desktop, the paradigm is still a mouse (pointer device) and keyboard. Accessibility and Usability requirements are the same.
I've had more pleasure in recent years from a customized KDE environment than from the chaotic changes in macOS.
Unifying UX for desktop computers with UX for touch devices is an absurd proposition.
> Why do they find fault in the minor details of the best computing experience ever delivered?
The whole point is that the Macintosh Operating System was all about caring about details. Criticism like this comes from a place of love, not from a place of hate.
I love my Mac. I love the experience. I love using it. However, when the "dismiss" button on a notification hides itself as I hover over it, and I regularly have to move my mouse off the invisible control and back onto it to work around the bug, it's grating.
Why? I love my socks -- I just wish they would stop coming with holes in them. They're great socks and other than the holes I love wearing them. Other sock makers make socks that aren't as comfy but don't have holes. Why can't these just not have holes?