I am a huge fan of Apple's products and never understood the fuss when the new MacBook Pro keyboard attracted bad press, reading about it on my old, robust pre-2015-era MacBook.
Until now. I got the new one at my job three months back. And here I am, still struggling with, and super annoyed by, the missed keystrokes (the keys are so thin they don't press properly), the almost non-existent Enter key (seriously, who messes with the Enter key!), and the useless Touch Bar. I look like a klutz when I have to show code or artefacts to someone because I am always mistyping or closing windows.
Additionally, the touchbar led me to one heart-stopping evening of infinite restarts[1].
I am terribly disappointed that they released such a shoddy product, especially since it's used as a workhorse by thousands of developers world-wide.
Against my better judgement, I bought a 2017 MacBook Pro about 6 months ago to make it easier to work with a client.
I sold it this week at a big loss.
Between the terrible keyboard, the useless Touch Bar, the dongle hell, and the general hardware flakiness, this was the worst laptop I've ever owned, Apple or otherwise (and I've owned almost every Apple laptop generation since the PowerBook G3).
Literally praying that they fix this crap in the next update. My MBP is getting on in years, and if they don't do something I'm going to move to Linux for my next machine.
Amen. I'm using a company-issued "new" MBP and oh, do I loathe it! I have been repeatedly one click away from buying a Carbon X1 for little more than half what an MBP would cost. Seriously, Apple... don't underestimate the domino effect: my home looks like an Apple Store, but once I begin "fiddling around" to make something else work and it eventually does, there's no more incentive to remain inside your Gilded Enclosure.
Maybe this is a market opportunity for somebody to build a laptop with high end components (monitor, keyboard, touchpad) running a super-duper-well-integrated linux.
That sounds very like my Dell XPS 13 (on which, in fairness, I run Windows).
But - I have to say I preferred the hardware on my old Sony Vaio Pro 13. The Dell keyboard is OK but the action isn't great, the battery life isn't great, and I really miss the extra thinness from having a carbon shell instead of aluminium.
The perfect laptop will exist somewhere, eventually...!
I've heard that, but I'm skeptical: does Dell maintain the integration, or do they just install Ubuntu on it? If I have a problem with, say, the touchpad integration, who do I go to? The Ubuntu forums, or Dell support?
I am in the same boat. Received one at work and can not stand the keyboard. I requested an older model, but they are no longer providing them. It might seem dramatic, but I spend my days typing and I need my keyboard to not get in my way. I have resorted to bringing a USB keyboard into work with me.
I have to know if this story is gaining traction because of the Joe Rogan podcast. Did you see his hour long rant on abandoning Apple because of this yesterday? Or is everyone simultaneously reaching this same conclusion?
He was very much a kool-aid drinker, but also a writer, so the keyboard is apparently a huge deal to him. I've basically never owned an Apple product, so I'm just watching this from an outsider's perspective.
People have been complaining about the shallowness and lack of travel of the new keyboard since it debuted on the new MacBook, but it has really become an issue now that professionals are forced into it on their MacBook Pros.
So this has been going on for some time, I see. It just seemed like a coincidence that he was railing on it for an hour and then I see this today. His shows get like 5 million views, so it seems probable he could steer a conversation.
To be fair, when you're actually in work, do you not find a less cramped keyboard, proper magic mouse (or magic trackpad) and dual monitors make you more productive? I know they do me.
> To be fair, when you're actually in work, do you not find a less cramped keyboard, proper magic mouse (or magic trackpad) and dual monitors make you more productive? I know they do me.
I like Apple's trackpads, but the functions that they enable are merely nice-to-have features. If you spend your day typing, any noticeable drop in keyboard reliability is going to outweigh any other input-device productivity gain. Keyboard reliability is a critical showstopper feature.
Same boat here as well. Three of my keys continuously either don't register or register multiple times. Took it in to an Apple store and they fixed one (for a short amount of time) and quoted $800 in repairs (and 5-7 day turnaround) for the other two (to "replace the laptop frame"). Replacing a keyboard (or even just keys) should not be this hard.
Does anyone have any insight into why Apple's QA took a turn for the worse over the past couple years?
Buggy iOS/macOS releases and now the hardware issues... I always equated Apple with quality and consistency (along with price), but now I can't really see how the price is justified with the issues they have been having.
They no longer have Steve Jobs doing quality control? He was the type of person to play with something for five minutes and say "yeah... no," or to eagerly take something he liked and abuse the hell out of it for 24 hours, then come back with a laundry list of improvements.
He was a jerk and he cared deeply about user experience.
In his absence you have what, exactly? Tim Cook lacks vision: if you want your mountain moved to Pluto he'll have it there by Tuesday, but he'll never stop and ask "why?" Jony Ive descended from above to bless us with the word "chamfer" while carpenters and machinists worldwide rolled their eyes in unison; his focus is purely on aesthetics.
It's more like Steve Jobs was such an uncompromising asshole that he would rather throw a substandard product in the garbage than put out something he considered lesser. He didn't have vision, nothing he "made" was revolutionary at all, and he didn't make anything after the NeXT. Steve Jobs was amazing at two things: getting in on the ground floor (the iPhone came out so quickly after the first all-touchscreen smartphone that no one even remembers the original) and making sure that what they did put out either worked or didn't see the light of day. I remember the early Android competitors (notably Motorola's Droid) suffering from touchscreens that failed to track accurately (https://www.wired.com/2010/03/touchscreens-smartphones/) compared to the near-perfect performance of the iPhone; it's the sort of thing Steve Jobs would have insisted on.
>and making sure that what they did put out either worked or didn't see the light of day
Regardless of Jobs's issues, this was the best and most important part.
Your toaster should work! If the user has to push the lever twice (only to have the toast overcooked), you should either work on your mistakes and release when you are finished, or throw it in the garbage.
My memory of the time was seeing a bunch of cool Microsoft Surface demos, thinking "someone is gonna make a killing putting this on a phone", and the iPhone coming out a few months later. I'll go so far as to say that I think it was the first touch screen phone that was actually manufactured and sold.
The LG Prada phone was six months earlier than the iPhone demo and was all-touch with a capacitive screen. LG claimed Apple ripped off their design, but I think it's more a case of hardware evolving to the point where this became possible and both companies implementing it, with LG releasing a less ambitious product sooner.
I see now that we've been talking about "touch screens" in this thread, but I've been thinking of that as meaning the multi-touch screens we're all so familiar with now (and of which the Microsoft Surface was the first demo I saw). It looks like the Prada had a capacitive screen, but not multi-touch. Maybe multi-touch was the thing, which is why the Prada was forgotten? Maybe not, maybe the iPhone just won by marketing and deals with carriers. Beats me.
In any case, I'm quibbling, I didn't know about the Prada before you mentioned it, which makes your point. Thanks for the pointer!
> He didn't have vision, nothing he "made" was revolutionary at all, and he didn't make anything after the NeXT.
Maybe vision is the wrong word. You're right that he wasn't imagining the future most of the time. What he did have was a degree of objectivity and restraint. He wasn't blinded by the "wow" factor of a new technology and could objectively weigh its merits. Whereas most tech companies try to cram the latest and greatest technology into their products to have an impressive bullet list to show around, he held back until he felt the technology was ready.
I had a full-touchscreen, color, internet-enabled smartphone with an icon grid of third-party apps in 2001.
Samsung SPH-i300
I had multiple web browsers, email, irc client, telnet and ssh clients, and an open marketplace of thousands of 3rd party apps for every oddball purpose, like I had a resistor color decoder, netmask calculator, etc.
2001.
iphone came out in 2007, and had no 3rd party apps.
I'm with popsiclepete. I had a long string of "smart" devices before the iPhone and they were, universally, garbage. Even the ones with great hardware (looking at the Treo 600 series here especially) had awful software and bad battery life.
I bought the very first color Palm Pilot and was first in on a lot of tech. All this stuff was really, really clunky to use and just generally bad. I couldn't hand it to someone else and expect them to figure out how to use it without being shown.
Also, styluses were terrible except for corner-case inputs. The fact that there's essentially only one mainstream brand making a main-line phone with them goes to show this was true.
I agree it was a flub that the iPhone didn't launch with apps, but I honestly think people needed a year to get used to the interface concept, and a lot of complete garbage would have come out of people trying to get first-mover advantage on the market (and it DID, even a year later: absolute shovelware).
People really shouldn't be romanticizing Pocket PC, or even Palm OS. The first truly good mobile OS Palm put out was webOS, and it took Microsoft until Windows Phone 7... both after the iPhone.
I also was a "power user" with some Nokia/Symbian bullshit that had "thousands of apps" and it was absolute garbage compared to my first iPhone. For a normal person.
Let's stop romanticizing the cell phone market pre-Apple/Android - there's a reason the rest of them folded and effectively died or went into obscurity within a few years.
Steve Jobs was responsible for the original Macintosh, the device that constantly overheated and had the same repairability problems then as these MacBook Pros have now.
I know Jobs is Silicon Valley Jesus, but he did not walk on water.
I am still confounded that people continue to say Tim Cook has no vision, as if he's simply some sort of bean counter with green shades who just lucked into becoming the CEO of one of the world's largest companies.
Tim Cook's vision is much larger than computers, if you haven't been paying attention. From what I can observe (ever smaller devices, lifestyle devices, the focus on privacy, their take on the cloud), Apple is looking to position itself to exist intimately in our lives. The only way to get there is to have the same level of trust that you do with family or even a lawyer.
And yet Apple is eroding trust by pushing buggy software and poor quality hardware. Maybe his focus is just elsewhere but Apple is who they are today because of an attention to their products that other companies lack.
Maybe he's trying to position Apple as a privacy minded trustworthy company when it comes to your data but he's ignoring or entrusting the crown jewels to someone else while he does it. If people stop buying Apple products because they're unreliable and support sucks then it won't matter.
What's the vision exactly? The focus on privacy? What exactly are they doing about that? Their "take on the cloud" is what? 'Do it badly'?
> Apple is looking to position itself to exist intimately in our lives. The only way to get there is to have the same level of trust that you do with family or even a lawyer.
Bleh. That is just awful marketing speak. Why would anyone trust them to "exist intimately" in their life if they can't even produce reliable computers on a regular basis?
My iPhone 5 (ha) is fine, but I'm hanging onto it for as long as I can because I expect to be disappointed by upgrading.
People express that Tim Cook has no vision because he's failed to deliver like Steve Jobs did (in a way they care about).
Absolutely nothing! I just don't care for Jony Ive; he's pretentious and likes to demonstrate his superiority by using domain-specific terms in casual conversation to impress the layman. When the iPhone 4 (or thereabouts) came out, Ive would talk about how exquisite the matte chamfered aluminum edges were, as if it were an engineering miracle and he had single-handedly invented the chamfer mill.
Don't get me wrong, I think his passion is wonderful and I enjoy listening to him talk about physical design in much the same way I enjoyed watching the movie Helvetica. But he's still a pretentious twat, and my impression is that he doesn't care about design beyond it superficially meeting his immediate needs. Similar to how an artist might build an installation for an exhibition with no concern for it surviving beyond that, I don't think Jony Ive designs products with longevity in mind.
You can explain Apple’s problems with a much simpler albeit banal reason: its sheer size. It has grown massive.
They took over the old Sun campus in Sunnyvale while the UFO was under construction, and shortly thereafter the word was that those teams would be remaining there long term as the HQ was already overbooked. Apple also bought up a lot of space in San Jose a couple of years ago.
Apple pioneered small and fast teams of veterans. But it doesn’t scale well to this size. It’s a challenge to effectively coordinate hundreds (thousands?) of teams building various integrated hardware, software, and service components.
Nothing screams "Apple" as much as a perfectly designed, elegant, beautiful device (campus) that is underpowered and unsatisfactory on launch day, requiring a bunch of external attachments to make it useful outside of the demo use case.
I used to be an Apple fanboy when they were the only option in town for a usable UNIX laptop. Which, quite honestly, hasn't been the case for a long time.
I love my Dell XPS Developer Edition (9350). All stock Intel hardware, everything works (except the trackpad touch detection, which can be flaky sometimes), and Manjaro Linux absolutely flies on it.
The build quality of the laptop is outstanding. The only thing that I'm not in love with is the webcam placement, but not a deal-breaker.
It cost me $999 2 years ago and feels as snappy as my 2015 top-of-the-line Macbook Pro that I use at work.
I'm not saying it's better than the Apple/OS X combo in every way, but it's a great overall machine and Linux doesn't suck on laptops anymore. There's no need to put up with Apple selling you 2016 hardware at 2018 prices and an OS that isn't that special anymore.
Each one of these concerns is making me cling to my early MacBook Air. I'm terrified to upgrade. In fact, a colleague recently purchased a MacBook Pro and I steered them clear of the Touch Bar.
The 11-inch MBA is the best laptop for me. If they just upgraded the processor to the latest generation and the screen to Retina, I'd upgrade to that in a heartbeat.
I like those too and just bought a second one (2015 model) for the girlfriend. It seems a shame that they stopped making them. I guess maybe it was too much competition for the more expensive, usb2-less Macbook.
Clinging to my 11" MBA (2014) as well. 4-core, perfect size, indestructible. Totally agree that if they would make any modest improvements just to keep it current (proc, mem, wifi, etc.) I would refresh every year or two. I don't want the newer MacBooks with half the processor, no magsafe, and bad keyboard.
The (later) MacBook Air keyboard has similar problems (to the ones described in the article). My previous MBA had keys that stopped working after a year. And you cannot easily replace the keys without breaking them. :(
So true. Previously, in meetings, we could silently browse or reply to emails. But now the keyboard is so loud that we have to apologize to the speaker!
Then decline the meeting. Unless you're a very junior developer or a brand new employee (and probably don't have the necessary experience or context, respectively, to determine whether or not you need to pay attention) you probably have the right to decline meeting invites.
Even at desks in the office, I'm not a fan of people bringing in super clacky mechanical keyboards. I have my mechanical keyboards at home and they are great, but I'm not going to subject my coworkers to that amount of noise. If people want to use a mechanical keyboard at work, they really should do everything they can to silence the clackiness.
Apple products are overrated. Their selling points are security and ease of use, but we all know that a computer cannot protect you from breaches. As for ease of use, most devices nowadays are pretty intuitive.
Why would anyone buy a low performance laptop for thousands of dollars, additional dongles and external keyboards when you can get a lightweight MSI laptop that's far superior in every single way for so much less?
Earlier this year, I returned my brand-new MacBook Pro and asked for my old 2016 MacBook Pro back. I'm a touch typist and I just don't get enough feedback from the new keyboard to be sure I pressed a key. It actually slowed down my typing. And the lack of an Esc key and function keys means that I just can't develop on it anymore. I'm still happily using my 2.5-year-old MacBook Pro.
My penultimate laptop was an HP Mini 101; I only replaced it on a whim, and still use it from time to time. The keys have been worn smooth and concave, and most are missing their printed text. When I last opened it up to add an SSD, a surprising amount of sand came pouring out; having been taken to many beaches, tossed in bags, and generally used outdoors, it had accumulated a huge amount of extra material.
I now use a ThinkPad X140e; it's got a rubberized shock-protected shell and a reinforced body. The keys are in a similar state, the body is cracked and taped together from wear, and it's fallen down several flights of stairs and from other great heights. (Thanks, kids!) Too many liquids have been spilled on it.
The combined cost of both these machines and the replacement batteries, new RAM, and larger disks that I've added to them is less than that of any Mac, and they've taken a hell of a beating.
I wish more people would. That's the best way to get any issues fixed, whether by filing bug reports, writing up a workaround on a wiki or Stackoverflow to help others, or fixing the bug.
It's important for us to own our own tools, and keep the right to do with them as we wish. We expect this for our servers, but ignore it for laptops and desktops. If developers continue like this, then users will be corralled further into walled gardens of pay-only or data-leaking software.
HN is outraged when farmers aren't allowed to repair their John Deere tractors, yet the majority choose even more restricted OSs for work and personal use.
(This comment written in Firefox on KDE (Kubuntu) with a 4K screen, all working very nicely.)
And why do you think that a lot of people who are painfully aware of all the issues you mention still refuse to use Linux on a laptop while happily using it on servers?
There is a ridiculous repetitiveness about this debate. For me and many others, Linux on a laptop has always been unstable, insecure, hot, and noisy. Many specific issues can be fixed if you put in the effort, but on almost every dist-upgrade something essential breaks and you're back to square one.
And the response to these issues has always been a mixture of disbelief, denial, accusations of incompetence and pointers to ridiculously convoluted and brittle workarounds. There are bound to be people on every message board claiming to have run Linux for years if not decades without any of the issues I'm seeing on a daily basis and have been seeing for decades on scores of different laptops.
So where do we stand on this? Will we ever get out of this unproductive loop of claims and counter claims while an oligopoly of corporations builds ever higher walls around more and more restrictive gardens?
I'm running linux on all my machines, including a Macbook Air and it works very well. Unstable, insecure, hot and noisy would not be words I would use to describe my setup.
The only noisy machine here is parked in another room and given its size and processing power it is noisy but that goes with the territory. Apple doesn't make anything that is even remotely equivalent.
Which means I can run the same software on all my machines which is another benefit.
And yes, I've been doing this for decades.
As for 'dist-upgrade', I don't normally do that, I go for long term stable releases and upgrade when the hardware gets replaced, which is once every few years.
> There are bound to be people on every message board claiming to have run Linux for years if not decades without any of the issues I'm seeing on a daily basis and have been seeing for decades on scores of different laptops.
Unless you think they are lying, it's an interesting observation. It could just be that you are incredibly unlucky (a few samples in the global population will have problems, just like a few will never experience any, even on Windows). It would seem to me that the bulk of the distribution is in the relatively problem-free area (my mom has been on Linux since the early 2000s and never had serious issues; of course, she doesn't use off-distro repos, doesn't compile her own kernel, and isn't running either Sid or Rawhide), considering most people who run Linux are tech-savvy enough to get themselves in trouble. I myself had a couple of desktops without sound in the late 90s, but since I switched mostly to laptops without any fancy components (such as multiple GPUs, smart-card readers, fingerprint scanners, stereo cameras...) I haven't seen a machine that doesn't run Ubuntu flawlessly.
I really believe that what it will take is an actual business that makes actual money selling a laptop with good hardware and an integrated linux install that they maintain. Instead of getting into a flamefest on the ubuntu message board where you have a completely different set of hardware and drivers than the people you're debating, you file a support ticket that the audio on your IntegratedLinuxLaptop 5 seems to have stopped working, and someone at this company figures out what's wrong and patches the exact right driver or system software to fix it. If that doesn't sound like a viable business, well, that's why this is still a problem.
Hell, the second article on Phoronix at this very moment is "GNOME Will No Longer Crash If Attaching A Monitor While The System Is Suspended", and if you scroll back further, repeated promises that Linux power management is getting better.
OS X and Windows also have broken usability, but at least their users don't run around telling people "it works fine for me! Just make sure you use a LTS distro with 5+ year old packages and without any hardware made since 2005"
> Linux on a laptop has always been unstable, insecure, hot and noisy.
Wait -- insecure? How? You mean full-disk encryption? I am typing this from a ThinkPad with full-disk encryption running SELinux-enabled Fedora. Sure, there are problems with Linux on laptops, but even if you don't have SELinux enabled, security has never been an issue (or no worse than Windows and OSX).
What I was specifically referring to there is that Ubuntu's NetworkManager (in combination with OpenVPN) had (or still has?) a DNS leak right out of the box for years.
I think you are underestimating the importance of fixing glaring security bugs ASAP when they become public knowledge instead of denying or ignoring them for years.
Also, a DNS leak is potentially much more dangerous for a minority of people than any locally exploitable bug.
Exactly! Personally I've been running Debian on my laptop for about 3 years now, but come on! You can't deny that it is a pain in the ass to use, especially if something breaks and the only way to fix it is to sift through many forums online to find the right commands to solve even a simple problem.
Also, why is it so painfully slow to copy documents to a thumb drive using Linux as compared to Windows?
This. I gave up after several rounds of dist-upgrades broke, then fixed, then re-broke things. If you’re making fundamental architectural changes on a regular (monthly) basis, it’s pretty clear you’re not interested in being taken seriously in the business user world.
The problem is that "Linux" is ambiguous. A comment below is agreeing and they use Debian. I've heard of Ubuntu being flakey. I used Fedora and Arch for years with zero issues (this is shocking to people who prefer LTS systems). CentOS is a trainwreck.
Take any OS, twist the wrong knobs, and you're looking forward to a long night.
Mind that the priorities of the user are different. Most people, including programmers, aren't concerned with using all the utilities of Linux land. Programmers have eyes too, and likewise enjoy aesthetics.
For programming, I'd use Linux over MacOS any day of the week, but for music, MacOS is the clear winner.
It boils down to what kind of trade-offs we are willing to make.
I might have been one of those Linux users who didn't have many serious issues for decades. Tried OSX for a while (2013-14), but didn't like how OSX gets in the way, though Mac hardware seemed decent. I would rather put up with an unusable fingerprint reader under Linux (ThinkPad X1 Carbon) than with OSX idiosyncrasies. There are other issues with the current ThinkPad (screen mirroring doesn't work reliably), but that is not a deal breaker for me.
> For me and many others, Linux on a laptop has always been unstable, insecure, hot and noisy.
I've successfully converted many people with similar claims over to Linux, and here's what I've found: most of them were trying Linux on a subpar, non-standard laptop (like an old MacBook or a slow Celeron) to lower their investment in case they didn't like it, which is fine, but then comparing the experience to their high-end MBP running macOS (try installing macOS on the same set of low-end machines most people try to put Linux on, and see how that goes). All it took was for them to get good, solid hardware (like a recent Intel stack, or a Ryzen with a GPU compatible with the AMDGPU driver), and all was good. Not saying that is your case, but it is shockingly common, which is unfair to Linux, I'd say.
I'd agree with that; I run stock Ubuntu on a ThinkPad T460p. I have only one issue, which is that display scaling is not fractional: it's in set values, which doesn't play well with resolutions above 1080p. That's a GNOME issue though, and it doesn't stop the screen from working perfectly when the resolution is set to 1080p instead of 2560x1440 or whatever it is.
This is trivial to fix, actually. Ubuntu 16.04 had "fractional" scaling by default, but in reality all it did for scales between 1x and 2x was use 1x with larger fonts. 18.04 lost that with the GNOME transition, but you can just change the font scaling manually. Less convenient and polished, but still there in the Tweaks tool. Firefox also includes a scaling option in its configs. With those two settings my T460s with the 2560x1440 screen looks great. It does suck that the GNOME transition in the new LTS has been so bad. For all its quirks, the Unity-on-Xorg experience had been very stable and polished for years.
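If it helps anyone hunting for the knob, here's a rough sketch of what I mean, assuming GNOME 3 on an 18.04-era install with the gsettings CLI available (the 1.4 factor is just an example value, not a recommendation):

    #!/usr/bin/env python3
    """Sketch: bump GNOME's font scaling as a stand-in for fractional display scaling."""
    import subprocess

    SCALE = "1.4"  # 1.0 = no scaling; tweak to taste for your panel

    # Equivalent to the Fonts -> Scaling Factor slider in gnome-tweaks.
    subprocess.run(
        ["gsettings", "set", "org.gnome.desktop.interface",
         "text-scaling-factor", SCALE],
        check=True,
    )

    # Firefox keeps its own knob: set layout.css.devPixelsPerPx in
    # about:config to a similar value so web content matches the desktop.
    print(f"Set GNOME text-scaling-factor to {SCALE}; "
          "adjust layout.css.devPixelsPerPx in Firefox to match.")
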
This, right here, is an excellent case-in-point illustration of the reason Linux isn't widely used on laptops: even on expensive, high end hardware with excellent Linux support, and the most popular distribution, very basic functionality like screen resolution not only doesn't "just work" out of the box -- getting it to work properly at all requires serendipitously stumbling across a random forum post somewhere that directs you to the needle-in-the-haystack magic config file tweak that makes it work properly.
I'm running Linux (Manjaro KDE) on my ThinkPad T470. It works marvelously: scaling works without issue, as do adding screens, updating, and hibernate/sleep. It all just works.
I currently haven't rebooted my laptop for 45 days; I updated it in the middle (while still being able to work) and it still works without issue.
Meanwhile my Windows desktop keeps forcing updates/reboots every week, interrupting my work.
Also interesting to mention: Windows always has my laptop's fans on; on Linux they only spin up under heavy load.
I bought the laptop especially for the Linux support in the ThinkPad series, and I'm very happy with it.
Googling for the issue would get you the source where I got it from, so it's not as bad as you claim. And up to 16.04, the last few years of Ubuntu LTS releases have been smooth sailing in my experience. In 18.04 they did a transition to GNOME 3 as the default, and that transition is still showing issues. Unity was actually quite polished and functional.
But let's not blow this out of proportion. It's not like Apple has not had plenty of QA issues with OSX lately. But I agree Linux desktop QA could use some more resources. Unfortunately it seems the Ubuntu desktop/mobile push is mostly over and they're now focusing on server/container where Linux has been great for a long time already. And since volunteers always prefer writing new shiny stuff than spending time doing QA the Linux desktop will probably never be extremely polished. I do find it much better than Windows and comparable or better than OSX in actual functionality for us technical types but your mileage may vary.
FWIW I do the same as suggested and it's great: font scaling set to 1.4 on a 14" 2560x1440 T470p looks nice. The only very slight snag is that font scaling doesn't scale the window decorations, but a few themes do work (Numix window decorations).
I believe it applies to all screens. I've only been using external screens to project stuff fullscreen so I haven't checked what happens if I use my 1920x1200 screen as a work screen. If it's like in Unity you get the larger fonts but it's still quite usable. Having the font scaling be per-screen would be ideal though.
What's the battery life like compared to Windows? I have the T460p running Windows 10 with the extended battery and I've never run out of power in the middle of a workday. It would be painful to give that up, but at the same time I'm interested in running Linux as my main OS.
This. I've had Linux on the desktop since 2002, and unstable, insecure, hot, and noisy are the polar opposite of what I've experienced on the whole. Sure, from time to time a tool would have issues and generate heat by pegging the CPU, or there would be some vulnerability making it insecure. I doubt it's any more insecure than Windows or OSX, and I certainly feel like it's more secure.
Unstable? I really only run LTS releases (though prior to Ubuntu I ran Debian stable), and I rarely run into anything unstable except for certain glitches, which are often graphics-related and can be worked around with different approaches, sometimes by resorting to a different window manager.
How did I do it? I ran business-level laptops, not high-end consumer laptops (those are junk): I mean three generations of the Dell D series, three generations of the Dell E series, HP ZBooks, etc.
I should also point out that I mostly kept to distro packages. If I broke down and installed something else, I kept it out of /usr and in my home directory or /usr/local to isolate it, or I used a PPA from a trustworthy source (just because it's a PPA doesn't mean it's written by a competent person). When things did break on upgrades, etc., I did not blame Linux, I blamed the package that caused it.
This is exactly why Linux isn't more widely used: most business and professional users regard "switching to a different window manager in order to work around repeated graphics glitches" as a dealbreaker-level problem in a tool they use for their work.
It's supposed to be an appliance that gets out of the way and enables higher level work, not a fascinating engineering project. The tool should _just work_, all the time. You shouldn't have to know or care what a window manager even is. That's what MacOS got right.
Now, if only Apple could go back to having the best hardware, too.. :/
I like how you take one particular anecdote and use it as representative; I have had pretty bad graphics glitches on a 2017 MBP at work, unless I disable graphics switching, which kills the battery.
Conclusion: macOS suffers from severe graphics glitches and has terrible battery life. /s
I have personally used Linux on the desktop and the server for many years. I know many other engineers who have also done this, and we've compared notes extensively on the topic. Sadly, this anecdote is not an anomaly, but a fully typical example of the broad experience.
It is, if anything, a rather too mild example of the general class.
FWIW, my MacBook (not my primary machine) cost me 400€ second-hand and I spent another 100€ upgrading it to 8GB of RAM and an SSD. It's a ten-year-old machine which runs like a dream.
I challenge anyone to find a better computing solution for 500€.
You could install Linux on a 500€ laptop but you wouldn't have the keyboard or screen quality of the Macbook, nor would have access to the Apple ecosystem. A lot of programs for Mac are just really well made and nice to use.
> You could install Linux on a 500€ laptop but you wouldn't have the keyboard or screen quality of the Macbook, nor would have access to the Apple ecosystem.
I used a 2009 MacBook pro for close to 3 years.
The following are my opinions, they are not valid for everyone but for some of us they are very valid:
The keyboard had Ctrl in a different spot than every other keyboard I have ever spent significant time with. (Disclaimer: some other laptops come configured this way, but I remap it in the BIOS if it is my machine.)
The keyboard lacked Home, End, Page Up, and Page Down keys. Instead it had extra arrow keys whose purpose neither of the two resident Apple fans in my office could explain to me.
Basic things like selecting a word using the keyboard would take one of three key combos depending on the app. I think sometimes it was Ctrl-Shift-arrow, sometimes Alt-Shift-arrow, and sometimes Fn-Shift-arrow. The resident Mac fan explained it was because of an ongoing transition between Quartz and Cocoa or something.
The application menus would appear on one screen only, often far away from the application it belonged to.
So, while I wish more people would use Macs (because 1. lots of people like them, 2. it forces application developers to think cross-platform, which benefits me as a Linux user, and 3. it increases competition), I also wish people would understand that Macs are not the best choice for everyone.
>Basic things like selecting a word using the keyboard would take one of three key combos depending on which app. I think sometimes it was ctrl-shift-arrow, sometimes alt-shift-arrow and sometimes fn-shift-arrow.
Whereas all Linux GUI applications follow a completely consistent set of keyboard bindings...
Yes! Exactly! It's the only environment where I can rely on all text entry working with emacs keybindings, though to be fair I have to poke a setting to get that.
Oh wait, you were being sarcastic. Well, at least you were wrong and learned something I guess.
(No seriously, you're wrong here. Linux desktops solved the uniform keybinding problem in a cross-desktop way like a decade ago. You just don't like it because they're different, not because they're inconsistent.)
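(For the curious, the setting I have to poke is, as far as I know, GTK's key theme; a minimal sketch of flipping it on a GNOME-based desktop where the gsettings CLI is available:)

    #!/usr/bin/env python3
    """Sketch: enable Emacs-style keybindings in GTK text fields."""
    import subprocess

    # "Emacs" makes C-a/C-e/C-k etc. work in most GTK entry widgets;
    # set it back to "Default" to undo.
    subprocess.run(
        ["gsettings", "set", "org.gnome.desktop.interface",
         "gtk-key-theme", "Emacs"],
        check=True,
    )
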
Sorry, I took your sarcasm to imply that linux desktop keybindings were inconsistent. If that's not what you mean (I mean, reading it again, I'm really pretty sure that's what you meant), then I apologize.
You're still forgetting that I mentioned text selection shortcuts. They've been fairly consistent across 20 years of Windows and every major Linux desktop environment.
The text selection shortcuts are completely consistent on modern OS X, in my experience. At least, I can't find an app where shift+alt+right_arrow doesn't select a word.
I've also yet to find an instance in macOS where Emacs-style text navigation shortcuts didn't Just Work™ automatically. One of the few things about macOS that I actually like relative to the average Unix/Linux desktop.
They have been for 30 years, in NeXTSTEP and non-NeXTSTEP Mac OS. I think Larry Tesler of Apple (long ago now) was part of that. Also, TextFields in the NeXT and now Apple codebase know various Emacs key bindings by default.
I guess you already read that part, but for everyone else: yes, let's embrace OS diversity.
I'm not against Macs. In fact I say: if possible, give Macs to everyone at work who prefers them.
Linux is not perfect. My current Ubuntu has been particularly bad. (But that might be my fault, as I got to the current state through unofficial upgrade paths.)
I even grew an appreciation for Microsoft, partly because they changed a lot and partly because I learned a lot (about ABI stability, large-scale software engineering, the importance of documentation, etc.)
So let's advertise our OSes, but let's not pretend Mac or Linux is the best. Not mentioning Windows here since they haven't annoyed me for a while : )
On Linux, ^A in a terminal (and in Emacs) behaves as God intended it to, but in a GUI it's usually "select all". It's really awful (although it kind of compensates for that with the select-then-middle-click dance).
> It's important for us to own our own tools, and keep the right to do with them as we wish
At no point in my life have I had any investment in linux, it's not mine. I use windows at work, I develop using .Net-y languages in visual studio.
And I get you're speaking for a different subset of developers on hn. The real ones, who believe fervently in open source etc etc. But you say "us" and I don't think you should be comfortable representing an entire cross section of the internet so blithely.
> The real ones, who believe fervently in open source etc etc
The people who really care about owning their tools, privacy etc. usually care more about the whole principle behind free/libre software, rather than merely open-source. But I think the OP was making a point that it is important for all of us to care about this, not just people who already do, because in the end, it impacts all.
From a game theory perspective, "not using Linux" is actually the game's equilibrium. If you have to choose between a proprietary OS (Windows, macOS) and a FOSS one (GNU/Linux, BSD, ...), you'd choose the former in order to increase your own utility. Even if people said they were going to start using Linux from now on, that wouldn't hold, since at least some would be willing to "unilaterally deviate" from that decision and use Windows/macOS to gain a boost over the others.
If you want people to start using FOSS, you'd have to offer them something they can't have on non-FOSS alternatives.
Indeed. You are free to modify it to run those tools, or create alternate tools that do run (maybe not so much after Oracle v Google). But ain't nobody got time for that.
I don't want to spend my time making a toaster - I just want toast.
Also this is why FOSS software continually reinvents the wheel, badly. Want some BS program? There are a million of them. Want an actual tool that requires deep talent and domain knowledge? Outside of compilers, if it's FOSS, it is almost certainly garbage.
Unless the OP has been using the same laptop for years. If you ever try connecting some new device to your machine, you're in for some configuring, at least with Mint/Debian. Linux is still worth it for the productivity gains, but sometimes you spend hours cursing your luck. A short list off the top of my head:
Plugging in certain Android phones, USB wifi, monitors not being detected, monitors being forgotten, GRUB conflicts after Windows updates, graphics card drivers messed up after apt upgrade shenanigans.
This stuff happens on all my machines, whether dual boot or pure linux. It really makes you feel helpless when your machine randomly stops working properly.
I don't have those problems, but then I don't do anything with phones over USB other than charge them.
> usb wifi
It's built in (as is 4g, although I now tether on my phone)
> monitors not detected, monitors being forgotten
I've used an external monitor rarely, and I don't recall any issues. I've seen people using external monitors on their Macs and Windows laptops; they often seem to have major problems.
> grub conflicts after windows updates
So a windows bug then. As I don't run windows that's not really a problem.
> graphic cards drivers in messed up after apt upgrade shenanigans
I run an LTS version of ubuntu, went from 8.04 to 12.04 to 16.04 -- took the opportunity to replace my SSD for something a little larger. Next time will be 20.04, but at that stage I think the laptop will really be due for a replacement.
Just get a Vega 64 AMD and see how you fare. Oh and no AMD wattman for you either.
I use Linux, but it does have limitations, especially when it comes to compatible hardware (mostly due to the manufacturer releasing no/bad drivers) and gaming.
Well, I consider myself fairly technical and I've been trying various Linuxes since 2003, and every single time there were some issues that I can't imagine a non-technical person dealing with.
No matter the distro, no matter the desktop environment, no matter the hardware.
Graphics drivers, the sound system, Xorg config, multiple monitors, wifi, you name it.
So excuse me if I take your comment about "no hardware problem since 2005" with a grain of salt.
I've had tons of problems on the occasions I've built a Windows machine -- it took me 12 hours to get a Windows 7 machine running CasparCG successfully in 2015.
I wish I could just use Linux, but any program that is likely to need to use my Nvidia graphics card just needs to be on Windows.
I don't want my laptop to be constantly using a hot GPU just for displaying the screen, and I need to have EITHER the GPU or the integrated graphics selected for use; swapping requires a restart.
Windows has the capability to swap between the integrated graphics for simple display tasks, and GPU for more intensive tasks.
Linux has Bumblebee, but it hasn't been actively developed in years. I know the main cause is Nvidia's attitude, but it is still disappointing.
99.9% great. Text looks wonderful, almost everything has hi-resolution icons in the correct size. It's not easy to find things to complain about.
My workflow hasn't really changed since upgrading — I still prefer to focus on one or two windows (often with the "Always on Top" button) and just see more code/webpage in that window.
Under KDE's display settings, I have the scale set to 2.0.
Problems:
- Minor UI issues. Some UI elements in Amarok don't quite line up correctly. The loading dialog of LibreOffice is at 50% size. GIMP's toolbox buttons seem small. The default steps of shrinking/enlarging text in Konsole aren't great.
- As far as I know, the HDR capability of the monitor isn't supported in Linux.
- About two years ago when I last tried, mixing resolutions with multiple screens didn't work correctly.
If you use a desktop, and can afford to upgrade all its screens to 4K together, I recommend doing so tomorrow. Investigate further if you'd be mixing resolutions; see if you can borrow one.
Referring to [1], I have a UHD-1 screen, which is much rarer than I thought. Just 1.2% of Steam users. The more-common WQHD screens could be OK without any scaling.
I bought a new laptop recently. I was very close to paying well over £1k on an XPS 13 with an 8th gen quad core i5, 16GB of RAM, and a 512GB SSD.
At some point, I came to my senses and bought a used Thinkpad X230 for just £80. I spent £100 on upgrading the RAM and SSD to match those of the XPS I mentioned (I got a rather lucky deal on the SSD). I'm currently planning on spending approximately another £100 on upgrading the display to a 13.3" 1080p IPS panel (which will require some fairly extensive hardware mods).
So, for about a quarter of what I was going to spend on the XPS, I have a machine with roughly comparable specs (Significantly weaker CPU, admittedly), but a much more rugged design with upgradable and moddable hardware.
I think the XPS 13 series is reasonably priced for what you get, and I think you would not have regretted it if you had gone with that option.
Your changes sound like too much work for a lot of people, but it's great that you found something fitting your needs at a significantly smaller price point.
That 1k is a week or two of dev work depending on where you live. If you've spent more than a week on procuring that laptop, you haven't actually saved any money.
How's Linux battery management on laptops? This and driver compatibility are usually my biggest concerns. I remember 2 of the 3 Windows laptops I used had trouble sleeping/waking up when the lid was shut.
Linux has no trouble with hibernate/suspend, at least on my machine.
It's just that I prefer to start from a clean state each day and don't want to be bothered with yesterday's clutter.
I also save downloads and small 'run once' experiments to /tmp, so booting the notebook is a good way to clean everything up.
To add another bullet to the reasons for reboot listed in the siblings, I prefer to poweroff in the evening because my disk is encrypted and I don't want the key lingering around in RAM when I don't need it. /me puts on more tinfoil
What's wrong with hibernating and suspending? It's been two months since I last rebooted my Linux laptop; I typically only do it when upgrading the kernel. I agree it would be nice to keep session state after a reboot, but it's a very minor annoyance. Some window managers, such as i3, do at least let one persist window configurations.
Clearing user state is one of the reasons I prefer reboots: it helps keep my desktop clean and tidy. I cannot imagine the mess that would arise if my session lived for months. I use shutdown as a shortcut for "close all applications I don't use right now".
Linux 4.17 is apparently a big leap forward for laptop power management. I run it on a desktop, so I can't speak to it directly, but I've heard a number of folks singing its praises.
It depends on the model. For my XPS 15 with the bigger battery I can get up to 10 hours working on Python dev and checking out docs, but only when disabling the discrete Nvidia card, which is relatively straightforward[0] to do after having suffered through setting up Bumblebee.
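For reference, a rough sketch of how the switch is commonly done, assuming an Ubuntu-style install where the nvidia-prime package provides prime-select (the write-up at [0] may use a different method):

    #!/usr/bin/env python3
    """Sketch: switch a hybrid-graphics laptop to the integrated GPU."""
    import subprocess

    # Show which GPU is currently selected ("intel" or "nvidia").
    current = subprocess.run(
        ["prime-select", "query"], capture_output=True, text=True, check=True
    ).stdout.strip()
    print(f"Currently selected GPU: {current}")

    if current != "intel":
        # Switching needs root; a logout or reboot is usually required afterwards.
        subprocess.run(["sudo", "prime-select", "intel"], check=True)
        print("Switched to the integrated GPU; log out or reboot to apply.")
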
Glad I'm not the only one having an awesome experience with Linux on the XPS 15 (a 9560 running Fedora in my case). Some of the stuff I've come across on the internet seems to imply that it's impossible to run Linux on this thing and I've had no issues at all!
I was just thinking about this today, as I needed to conserve battery power on my XPS 15 for an airplane trip. Thanks for reminding me to check out Bumblebee.
I think it depends on the model. I run arch on a Thinkpad and the battery life is about what I expect given the battery size. CPU voltage and fan is modulated properly. No lid or sleep issues whatsoever.
It's been great; just stay away from the new ThinkPad X1 Carbon 6th generation, where Linux support is poor, with lots of things that need fixing.
But the 5th generation had a smooth Linux experience, and I bet the same goes for most other laptops. I'm hoping the 6th gen will be fixed in 6-12 months with newer kernels, etc., but at the moment it's quite broken.
Stock Debian Stretch on my Lenovo works very well. Battery time estimation is dead on, suspend is very low power (days). You can set the lid shut behavior through the gnome admin tools.
I have found it to be remarkable over the last few years. I keep coming back to Debian although I've used OpenSuse Tumbleweed in recent times, and some other flavours before that.
With Debian, I've used a very old Dell Latitude (circa 2010, Core2Duo, DDR2, i3wm or XFCE) and newer Dell Inspiron (2017, i5 7th Gen, DDR4, full HD touchscreen display, KDE/XFCE+i3wm) and never had trouble with either sleep/wakeup or with wifi. Touch isn't as great as Windows, so it detects only "left click", not right click, zoom, etc. but I never use touch so it's something I can live with.
Incidentally, I installed Linux because the Windows 10 that came preinstalled had some hardware issue and the fan was always on, so I was getting around 1 hour of battery life. Switching to Tumbleweed (Linux 4.x at that time) gave me amazing battery life, around 5-6 hours. And when Debian 9 (kernel 4.x) got released, I quickly moved to it, and still get around 4-5 hours if I'm doing text editing, browsing and little bit of compiling.
In my experience running Linux on two laptops, it tends to underestimate the battery duration, but the battery lasts longer than with the original OS.
The design where the display is the _least_ fragile part of the laptop is brilliant: it wins over shaving extra ounces or millimeters. I wish more laptops were built this way, and I suspect the only reason they aren't is that it doesn't look as "hi-tech" as some of Lenovo's other laptops.
Believe me, I wish I COULD run linux on my main laptop for work. Unfortunately, I write a webapp that is run 50% by mobile Safari users. When a bug happens, I need to be able to plug into Safari's dev tools. Other than that, I often need decent Adobe products to work with images. As much as Gimp and Inkscape have improved over the past 10 - 20 years, they still are obtuse at best. For simply writing code, Vim looks the same pretty much everywhere.
Battery life on Linux is awful, and if you want the latest version of something you end up raising hell to get everything to work; meanwhile on a Mac you just do a brew install.
I'm a software engineer, I used to love to customize my computer and make it different, now I just want it to work and not having to waste hours and hours fixing it.
Take the previous design, you know the one from late 2012 to 2015.
If you must (sigh), take out the SD card, HDMI, USB A, MagSafe and replace with USB-C. I don't know who suggested this, but whatever.
Don't add a Touch Bar. Don't add an oversized touchpad. Leave the keyboard alone. The layout of that model is amazing.
I purchased a late 2012 in Jan '13. It's been my daily driver for 5 years now, without a single issue and it's an amazing piece of equipment. It allows me to just get on with my work, rather than focusing on the issues with the OS (looking at you Windows 10).
Want to make the product much better? Upgrade:
- the screen to 4k
- the memory to 32GB
- the SSD speed and give me options of 1TB and 2TB.
- the CPU to the latest i7 and more cores the better.
Listen, you can even charge $5k for the privilege. I don't care at this point. My needs for it just working and allowing me to work without the issues of running Windows 10 are the only things that matter to me.
-------
Ok, I understand. We are negotiating here.
- $6k...
- $7k...
- $8k...
- $9k...
- $10k... ?
Sure, I'll pay $10k for a PRO laptop. One that lasts 5+ years. I'm happy to depreciate it over a period of time, because for me it's not an ACCESSORY. It's a workhorse!
I really hope someone at apple listens. You guys had an awesome laptop. Then you made it crappy.
--------
I'm literally praying to "Lord Jobs" on the other side, that I don't run into any issues for at least another 3 years and Mojave is my last OS it seems. I'm STILL on El Capitan!
+1 :)
I use the exact same machine and I simply don't want to upgrade it with the current options.
I don't even need anything higher resolution (Retina is great), honestly. In fact, I don't even need a more performant (or more cores on my) processor (my quad-core i7 works great with everything), or a better GPU - all I need is:
* Give us back those last Wh and bump the battery back up to 99 (right under the FAA limit)
* Improve the thermals (if this means making the machine thicker, then sure). In summer (with aircon set to 26 C) I can't plug my laptop in to charge and use Xcode without it overheating and throttling. The same issue affected my old 2012 rMBP, so it's not a new issue.
3. Gradually make product worse without losing market share.
4. Make product good again with giant price increase.
5. PROFIT!
Apple is stuck on step 3. Either they don't know that step 4 exists, or they think they can squeeze more juice out of step 3, or they don't have the technical ability to do step 4.
If they bump it up (which would be a good idea) they'll likely go 3x (4320 x 2700 on the 15"), not 4K.
> - the memory to 32GB
That's the one I don't see happening. Many of Apple's tradeoffs are about preserving battery life in the face of design stupidity; dropping LPDDR is very unlikely.
> - the SSD speed and give me options of 1TB and 2TB.
Both options exist (on the 15") and AFAIK Apple has pretty fast SSDs, the main issue with them is that they're soldered not that they're slow.
They won't support 32GB of RAM because it might decrease battery life and requires them to re-engineer the logic board, but they're totally cool with a fragile-ass keyboard GLUED to a battery! Really, there's no way to call this anything other than insanity.
I remember the media used to talk about the 'halo effect'. Well, I'm someone who has owned almost every MBP between 2002 and 2012 and every iPhone until the 7 Plus... Right now I doubt they'll make a real pro laptop, so I'm likely going to switch to a Dell XPS. /end frustration
Apple (MBP15 late 2013, iMac 27 late 2009) user at home, Dell XPS 15 9550 (2015) user at work here.
While I totally agree with you about the new MBP, please just know that the XPS 15 isn't without its problems either! It must just be that Windows users are so used to lower-quality hardware/software that such things are accepted...
I really like my XPS 15 as it is fast and as compact as a MBP, but:
- The Microsoft Surface Bluetooth mouse and keyboard stop working after an idle period of maybe 10 minutes; I have to disable/enable Bluetooth to get them working again. I reverted to an RF-dongle keyboard and use the Surface Mouse with the included USB cable.
- I had a huge number of BSOD early on. It's still not perfect, but almost fine now.
- The webcam is in the worst possible position. It films your fingers and chest rather than your head!
- The microphone is even worse; no one will hear you talking.
- A headset with a microphone works fine, but after unplugging it I don't have any sound anymore. Sleep/wake fixes it...
- I obviously do miss some of the Mac apps like OmniFocus, Pixelmator, xScope and others while working in Windows.
There is an XPS Developer Edition with Linux instead of Windows. I'm personally stuck with Windows 7; I will never update to Windows 10, and OSX does not seem to be a good choice nowadays. That means Linux will have to carry me through the next few years, hopefully.
Personally I like to just get the Windows version and load Linux with GRUB as a second boot option. The few times I need to do something on Windows (photo editing, some games) I just reboot.
I end up doing almost the same except that I ditch the Windows install. The hardware selection is much better and, unless you intentionally sabotage yourself, they work every bit as well as laptops that come with factory-supported Linux installs.
I did this once, but it won't save you from forced updates keeping your laptop awake (and hot, and at risk of corruption if it happens when you have to fly/drive somewhere and you weren't expecting an update). Plus, Windows once took issue with the state of the disk I was sharing between it and Linux (NTFS format) and ran a chkdsk, which deleted most of the files from it. That was the last time I ran any Microsoft code outside of a VM (or my employer's kit).
OSX is still a fine choice. I've been using OSX since 2003 and have only noticed substantial improvements over time, like Spotlight, AirDrop, FaceTime, and Quick Look, not regressions. I'm really sick of the constant doomsaying about OSX and Apple in general.
> Microsoft Surface Bluetooth mouse and keyboard don't work anymore after an idle period of maybe 10 minutes
Go into device manager and disable their power saving settings. I had disconnection issues once or twice a week with my MS Designer Mouse and this fixed it.
I searched the web, updated Dell drivers, and found some Reddit thread where they explained cryptic settings in Device Manager... None of that helped. I gave up when I found yet another blog post where a guy even replaced the internal Bluetooth card of his XPS 15!
In the hope that this fixes it once and for all: Thank you so much!
Whenever I get a BSOD, I miss working on my older MacBook Pro. I got a BSOD this morning as I was reading this, on a brand-new high-end HP ZBook laptop no less. It's the third BSOD since I got it a few weeks ago, and it turns out the culprit was HP's bundled bloatware, HP Velocity, which is supposed to improve network performance on laptops.
> - The webcam is in the worst possible position. It films your fingers and chest rather than your head!
As someone who's going bald, this is a feature rather than a bug.
Of course, I'd like 4 cameras on the corners and software that allows me to shift a virtual camera to the center of whatever video conference tool I'm using.
BTW, that would be a brilliant new feature for Apple laptops of the future. That and chroma/motion key built in the camera video processing pipelines or, perhaps, the full set of iPhone image processing options.
- So the Bluetooth issue may be fixed with some stupidly hidden Windows setting in Device Manager.
- I forgot to mention that 3 of our XPS 9550s had their trackpads (and batteries) replaced after the batteries swelled and pushed the trackpads out of the case. Luckily this happened a few weeks before the end of the 2-year warranty.
Technically no, not exactly. It's Intel's fault. Their damn architectures won't support LPDDR4 (the low-power variant) because they've become complacent. Also, the lack of PCIe lanes forced Apple's hand on the USB-C-only design, although it's such a daft choice they should've just gone with two Thunderbolt ports and kept the other "legacy" ports available. But no, they had to brag about dual 4K external monitors, so it had to be 4 USB-C ports and nothing else.
I'm seriously worried that the platform I've been so comfortable with for the past 10 years will fall into disrepair because of ill-advised management that chose to chase some other "innovation"...
There's a simple solution to the lack of LPDDR4: end the "thinner and lighter" insanity and provide normal DDR4 with a decent battery capacity by making the laptop 2mm thicker and 0.5 pounds heavier.
I'd cut Apple some slack on the RAM front because Intel was supposed to ship LPDDR4 support a couple of years ago, after years of being fairly good about hitting targets.
The larger point about recognizing that we’ve plateaued on size & weight benefits is important, though. As long as we need keyboards there’s not much point in taking on all of the other compromises.
Ironically, it was Steve Jobs' pep talk to new executives that went something like: "The difference between a janitor and an executive is that VPs are not allowed to have excuses." So trying to share the blame with Intel is just silly.
Every single one of Apple's competitors delivers laptops with 32+ GB of RAM, while MacBooks are stuck at 16GB, and there's no one to blame for it but Apple.
> Every single one of Apple's competitors delivers laptops with 32+ GB of RAM, while MacBooks are stuck at 16GB, and there's no one to blame for it but Apple.
This is only partially correct: those are separate models which are bigger and have worse power and heat characteristics. Every laptop which is similar to Apple's shares the 16GB limit for the same reason; the difference is that Apple has decided not to offer that second tier.
Dell's XPS is comparable in size and performance to the MBP and has 32GB of RAM available. Does the standard DDR4 they use affect battery life? Possibly, but that's a trade-off I and others would take for our $4000 laptops not to grind to a halt when we spin up a few Docker containers.
It's possible by hibernating faster, which is the strategy Lenovo used in my T460p with 32 GB of DDR4. The downside is that leaving it with a closed lid for ten minutes makes waking it up a 5-to-10-second affair instead of instant.
Meh, there's a mobile standard optimized for that use case... not using it because Intel can't get their act together would be asinine. I'd be shocked if there weren't Ryzen prototypes running in Cupertino, with daily performance test results CC'd to Intel account managers. I'd do just that...
Part of the reason for Apple betting big on Intel was power efficiency. But with Ryzen going to 7nm before Intel makes it to 10nm reliably, I have to think that a Ryzen with an integrated GPU starts looking damned attractive.
I don't think PCIe lanes and low-power RAM are directly related, but it is just another sign of Intel attempting to control the market via feature segmentation.
The CPU only has 16 PCIe lanes, which is flat-out ridiculous when 4 of them are used by NVMe and 8 (really it should be 16) can be used by the GPU. That leaves only 4 additional PCIe lanes for any other devices, including USB-C and Thunderbolt.
Simply put, if there were any competition in the market, that CPU would look much different: low-power RAM and 8 more PCIe lanes, at least.
I think I read a comment here on HN some time ago about the lack of >16GB of RAM on laptops. The reason was that LPDDR3 maxes out at 16GB, and DDR4, which allows a much higher amount of RAM, lacks the low-power states while sleeping. So you would pretty much sacrifice sleep battery life in order to get more RAM.
I've been very happy with my Surface Book, FWIW - as beautiful as it is powerful, though with a price to match. (But there are plenty of good options these days - I've heard good things about the XPS too, though personally I didn't like the feel of the keyboard).
They could, in theory, have a "power saving" mode that tries to consolidate as much as it can into as few memory modules as possible (I'm currently using about 39% of memory, with the rest free or cache). Virtual addresses would remain untouched and only physical addresses would change after a move, and then it would be possible to power down the memory modules that aren't being used. (PCIe-attached storage is slower than memory, but fast and power-saving don't go together that well.)
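Purely as an illustration of that idea (a toy sketch with made-up numbers; the module size and count are hypothetical, and no real OS exposes or migrates physical memory this simply):
~~~
# Toy model: greedily pack the memory in use onto as few DIMMs as possible,
# then count how many DIMMs could, in principle, be powered down during sleep.
MODULE_SIZE_GB = 8      # hypothetical: four 8 GB modules = 32 GB total
NUM_MODULES = 4

def modules_needed(used_gb: float) -> int:
    """Minimum number of modules that must stay powered to hold used_gb."""
    full, partial = divmod(used_gb, MODULE_SIZE_GB)
    return int(full) + (1 if partial else 0)

used_gb = 0.39 * MODULE_SIZE_GB * NUM_MODULES   # ~39% in use, as in the comment
keep_on = modules_needed(used_gb)
print(f"{used_gb:.1f} GB in use -> keep {keep_on} of {NUM_MODULES} modules "
      f"powered, power down {NUM_MODULES - keep_on}")
~~~
At ~39% usage that prints "power down 2", which is where the hypothetical savings would come from; the hard (and entirely hand-waved) part is the OS actually migrating pages so the physical layout ends up looking like that.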
And replace it with what, function keys? Keys that are thicker than the screen that replaced them, even though that screen has far more functionality in a smaller footprint?
Big fan of the XPS 13. I bought the developer Linux edition first, which had a 1080p (boo!) screen and matte finish (yeah!). It had an issue where the trackpad would just intermittently stop working or the cursor would just zoom around the screen, making it unusable. Dell sent out an engineer THE NEXT DAY TO MY HOUSE with a replacement trackpad. Insane customer support. I ended up swapping it for a Windows version for the same price but with the higher res screen (but gloss) as the HDMI out didn't work on the Linux version. I think this was software rather than hardware.
I couldn't agree more. The new MBPs are basically lemons. Mine absolutely is: it became unusable and was sent out for repairs after only 1 year (defects include, but are definitely not limited to, this ultra-brittle keyboard). I will definitely not buy another one of these awful machines.
Whatever was the point of making this thing razor thin? What bizarre, mythical user did they have in mind who just couldn't stand a few millimetres of extra 'bulk'? None of this makes any sense, except as a case of design-driven engineering run amok.
I thought the RAM thing would decrease battery life because it requires a different Intel chipset, as shown by contemporary laptops from other manufacturers with the same CPUs and chipsets topping out at 16GB as well?
I could be misremembering, and it frustrates me either way, but I was fairly certain that Intel also bore some of the blame for this?
I think the reason is LPDDR3, which only goes up to 16GB with Intel's chipsets; they would need to use desktop-class RAM to go above that, which uses a lot more power.
You are correct. I bought an Intel NUC with an i7 in it and that had the 16GB limit. Furthermore, it only had two cores and hyperthreading, whereas I was 'used to' an i7 having four proper cores and HT.
I wanted a desktop that I could leave on without it taking lots of power (CO2 matters) or needing a wind-tunnel-grade fan to keep it cool (noise matters). So it had some variant of what Apple used. In my opinion this CPU was a dud: I had to upgrade the BIOS numerous times, and it wasn't any quicker than my elderly i5-powered laptop. Plus it wasn't as silent as hoped.
The subsequent products with the useless keyboard went on to use iterations of this lemon of a CPU.
My Intel NUC is actually sitting unused in a drawer. I had lost a couple of keys on the elderly i5 laptop, so I bought a new backlit keyboard for it on eBay; it came from China in an incredible 3 days and cost ~£50 including shipping. This is the keyboard I am using right now. It actually looks super cool: the laptop was originally all silver, the replacement keyboard and surround are black, and the design works because the trackpad is still silver. It looks like a high-end German product thanks to that colourway combo.
I find it hard to believe that any decent chipset made in at least the last decade cannot support 32GB of RAM. I had a reasonably cheap Samsung laptop from 2012 that already did. If Apple is not doing it, it's not for lack of choices in the market.
Of course, but it's a trade-off. These low-power, low-profile Intel CPUs can support more memory, but only with DDR4, which is desktop-class RAM with more than double the power requirements in use and four to ten times the power usage in sleep mode.
The logic board would also have to be redesigned and would be bigger for the new chipset, putting pressure on battery space. You see Microsoft making the same choice as Apple with the new Surface Laptop.
Fortunately Intel are launching new mobile chips with support for fast, low power LPDDR3E memory over 16GB this year.
Maybe, if their MacBook Pros were just a little bit thicker (say, the thickness of the 2013-2015 models), they'd be able to fit a big enough battery for the extra power draw to not matter too much?
"Considering that a 76 watt-hour battery is used in the 15 inch machines, they could have made the battery 30% bigger to hit the ceiling imposed by the Federal Aviation Administration, and they still wouldn’t have had the same battery life as they do now by using LPDDR memory." [1]
I mostly agree with you. But really, who needs more than 16GB of RAM on their laptop?! I think the need for more RAM is correlated with the need for a higher CPU clock, a more powerful graphics card, and a better motherboard, all of which result in a huge decrease in battery life (which is supposed to be a laptop's main strength over a heavy, power-hungry desktop PC).
It's the reason I always end up looking at gaming laptops. I don't really need a GPU at all; it's nice but not necessary. However, yeah, cores & RAM, that's what I need.
I was about to buy a new MBP this year but held back because I already have 16GB of RAM and want more, actually need more. I often have the problem that my current MBP runs out of memory, and when that happens it costs me 30 to 60 minutes to get back to the point where I was when it ran out. So yeah, there are some of us who need more RAM.
> But really, who needs more than 16GB of RAM on their laptop?!
I need it on whatever machine is my development machine. Clojure REPLs are memory intensive. I'm sitting at 7GB in use right now, mostly thanks to Chrome, even without anything major running.
I could probably squeeze under 16GB if I was aggressive about shutting down server applications (e.g. Apache and MySQL) I'm not using at that moment and only starting them up when I need them, but that's just adding more chores to my workflow.
Having more than 16GB of RAM just means I never need to worry about it. It's one less headache to deal with.
As a developer, I need a machine that's powerful enough to develop on; being light and having battery life is a nice-to-have but not essential. If that means I have to buy a bulky gaming laptop rather than a sleek portable one, that's what I'll do. At the moment I can just about manage with 16GB, but it's definitely the limiting factor on my current machine (surface book 2, which is capable of some pretty good gaming).
For me, large datasets used for analytics. I don’t want to use a desktop, I want to do this stuff in a coffee shop, and in my hotel. The difference between in-memory and disk based operations is, understandably, very significant.
As (mostly) a designer, even I'm pushing RAM usage at times. Some GPU wouldn't hurt either.
yet I don't feel like switching away from Mac is a realistic option for me. There are a ton of apps on which I depend. Can't exactly stick them in a docker image.
The reason I'm sticking with the MacBook is that, despite these issues, they are still way ahead of any competition. I'm running a mid-2014 Retina. It is still way better than anything Dell, Lenovo, Asus, etc. can get me. 4 years later, the battery still gets me a solid 5-6 hours despite heavy usage.
But here's the thing: Apple is really improving. It is just not meeting the expectations of its demanding users (and rightly so, since we are paying a premium for Apple hardware).
I remember getting sick of macOS Sierra and deciding to roll back to Mavericks. Holy crap, only then did I realize how much more polished Sierra is, even with all its issues. External screen resolutions were screwed up, I could no longer answer calls through my Mac, Notes no longer synced, and a dozen other things I had just taken for granted broke.
These advances are probably building technical debt. But they are also what attaches people to the Apple ecosystem.
I get my airpods connected to my iPhone easily and without issues. Then I quickly switch the input to my Mac. Then back to my iPhone. Good luck getting that to work correctly on Windows software and other hardware.
In the same boat as you, and I've just migrated all my activities to Windows on a 64GB, 8th-gen Core i7 desktop. It's a real booster for my day-to-day work.
I love and miss the aesthetics of macOS and Magic Mouse swipes. However, let's face it: Apple alone cannot keep up the pace. It just cannot compete with hordes of companies from all over the world who each do their part of the job really, really well. Gigabyte, ASUS, Samsung, Intel, Crucial, NVIDIA, AMD, and Dell, to name a few, plus many other small and big suppliers, versus the Apple Goliath. The result of that battle is predetermined by the forces of nature. Apple is gonna lose in the long run.
We all lose in the long run. Those companies all existed in 2005, but Apple was streets ahead of them for the average person. Apple pushed the industry forward. Remember what the typical phone was like before the iPhone? Or the typical MP3 player before the iPod? Sure, the iPod may have had less space than a Nomad and no WiFi, but it was what people actually wanted.
The question is, is Apple really going backwards, or is it a case of "I don't like change"?
The longer you use something, the more you dislike anything different. Is USB-C really a step back from MagSafe? Is removing the function keys really a step back? How about removing 3.5mm jacks from a phone?
I've used /etc/network/interfaces to configure my network for 2 decades. Netplan is thus stupid and terrible. Except if I look at it objectively netplan is better. It deprecates those nice comfy hacks and configs I don't even think about, so of course I don't like it.
My local hi-fi shop tells me that the Bose QC35s jumped off the shelves like hotcakes compared with the QC25s, because they were wireless and rechargeable. I don't like them because I like 3.5mm. But as the man said:
Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. Anything invented after you’re thirty-five is against the natural order of things.
Apple is really going backwards, no doubt about that. It still has a lot of momentum, but it's been mostly downhill since Steve Jobs' unfortunate departure.
People wouldn't be so up in arms about MagSafe vs. USB-C if they had a choice, either by both ports being on the laptop or by being able to choose one or the other when the laptop is ordered.
You can classify MacBook users into two groups: people who hook up their laptops to lots of stuff, and people who don't. The people who hook up their laptop to lots of stuff are the kind of people who get to work and hook up their laptop to power + keyboard + mouse + headphones + maybe Ethernet + maybe other peripherals, at a desk where all this stuff is static. For those people, USB-C is a major improvement over MagSafe. The odds that your power cable is going to be tripped over with proper cable management at your desk are vanishingly small, and the convenience gained by only plugging in one cable is enormous.
On the other hand you have the people who bring their laptops and power adapters with them to Starbucks or the university library or their co-founder's apartment, and are constantly looking for a power socket. You know, the ones in this thread complaining about the keyboard (because they never use an external keyboard) and won't buy anything else anyway (because they love the trackpad and never use an external mouse). For those people, USB-C is a major regression from MagSafe because even if Apple provided a USB-C dock in the box for free, for them it would be collecting dust in the bottom of a closet somewhere.
But this is typical for Apple, whose corporate ethos seems to consider giving the customer a choice to be a bad thing. Which means that the question isn't whether USB-C is a regression from MagSafe or not, it's whether you belong to Apple's core market segment and fit into Apple's core product strategy, which holds that the word "regression" is simply not part of the vocabulary used in conversations about Apple's products. Improvements are all doubleplusgood changes, simple as that.
> Which means that the question isn't whether USB-C is a regression from MagSafe or not, it's whether you belong to Apple's core market segment and fit into Apple's core product strategy, which holds that the word "regression" is simply not part of the vocabulary used in conversations about Apple's products
Yes, this.
It's odd though: the Apple ecosystem has "always worked" if you stuck with it, whereas Linux and Windows have needed various amounts of firtling (customising your workspace, switching, etc.).
People who plug in peripherals seem to run against what I understood Apple's view to be. I guess they want those of us who just have a laptop and don't connect anything up to be using iPads instead (perhaps that explains their choice to downgrade the keyboard: you either dock into a fixed workstation with a decent keyboard, or you use a touchscreen). Eventually I wouldn't be surprised if the keyboard is completely removed from Apple laptops. And we'll all be shocked.
I can't really answer at the macro level, but I really don't think this is just a knee-jerk reaction to 'change'. Just personally:
* I don't miss the CD-ROM at all - even at the time. I think that was a totally valid design change
* I love the QC35s and BT headphones in general; the convenience is undeniable
* I even like a bunch of SoundCloud mumble rap, which is supposed to be anathema to my generation
* All that said, I can't stand the new Apple keyboard; it completely ruins typing, which is sort of a fundamental part of using a computer
I guess I'm saying it's possible to appreciate some change, while not appreciating other would-be innovations. I really have a hard time seeing how this keyboard is in any way a positive change - there isn't really much benefit to having a bad keyboard. It's all negatives, unless saving a literal couple of millimeters in machine thickness is somehow super desirable.
I never had a MagSafe laptop whacked off a table, and I've had them since their introduction. It has happened at least once a year with my USB-C MacBook Pros. What a silly own goal.
For MagSafe, yes. But overall I like the move to USB-C. I plug half as many things into my computer each time I sit at the desk as I used to. Once these docks come down a bit more in price, I'll plug a single cable into my computer at the desk.
> Is removing the function keys really a step back?
Not sure. Right now I'm indifferent.
> How about removing 3.5mm jacks from a phone?
Definitely a tempest in a teapot.
As an aside, the few people I know with 2017 MBPs all like the keyboard. None of us have had issues so I'm sure that makes a difference. Personally, I like the key travel and how quickly I can type on the new keyboard.
I started to get cramp in my little finger using a Magic Mouse; eventually I switched to an old PS/2 MS mouse (the classic that they have just reintroduced) that I had stockpiled years ago.
> [Apple quietly announced][1] that they were extending the warranty on their flagship laptop’s keyboard by four years.
No, they didn't. If you read the very source the article cites for this, you find,
> The program covers eligible MacBook and MacBook Pro models for 4 years after the first retail sale of the unit.
The original warranty was 1 year[2]; this is a three year extension.
"Extending by 4 years" is just too good to be true. Although, I had no idea the warranty on MBPs was so bad; a well built machine should trivially last a year. And now that I Google around, this doesn't seem to really be unusual on Apple's part. My last laptop had a 4 year warranty, and it lasted about that long. But my current laptop only had a 1 year standard, and I had forgotten that I paid $80 to extend it for 3 years. Apple seems to offer an extended warranty ("AppleCare+") but wants $270 for it.
Worth noting that in some jurisdictions (e.g. Australia), consumer protection law extends beyond manufacturer warranties by establishing consumer guarantees, e.g. a guarantee that products will be of acceptable quality, and that they will be reasonably fit for any disclosed or represented purpose.[0]
This often extends beyond the period of the express manufacturer warranty.
On a semi-related note, Apple has recently been fined AU$9m for "making false or misleading representations to customers with faulty iPhones and iPads about their rights under the Australian Consumer Law", by representing to customers "that they were no longer eligible for a remedy if their device had been repaired by a third party".[1]
Doesn't Apple often snub these laws? Have they changed their behaviour? If I remember correctly, in the EU, up until just one or two years ago they were refusing to service products older than one year under warranty if you didn't have AppleCare, even though the law mandates a 2-year warranty period.
>> even though the law mandates a 2-year warranty period.
The law mandates no such thing, and this is an extremely common misunderstanding of how EU consumer protection works.
EU consumer protection law gives you protection against manufacturing defects for up to 6 years after purchase (it's 2 years only on specific items; in general, it's 6 years).
The absolutely key term here is manufacturing defects: if you have a fault with your laptop, fridge, TV, whatever, within the first 6 months after buying it, then this law assumes the defect existed at the time of manufacture and the manufacturer has to fix it. But after the initial 6 months, the burden of proving that the defect is due to faulty manufacturing is on the customer.
So yes, you can absolutely bring your broken laptop to Apple 2 years after buying it, without buying extra AppleCare, and ask them to fix it, but it's up to you to prove that it's broken because of a manufacturing issue. That's why Apple "snubs" it: in the vast majority of cases it's very hard for the consumer to prove that the laptop is broken because of an error in manufacturing and not because the part has worn out.
Having said that, there are some countries in the EU (for example Poland) which require that all electrical devices be covered by a full 2-year manufacturer warranty, and indeed all Apple hardware sold in Poland comes with a 2-year warranty as standard. Which highlights another issue: EU laws are not as homogeneous as an outsider might think. For example, in the UK, if your product is replaced under warranty, the replacement is only covered for the duration of the original warranty. But in Poland, the law states that if an item is replaced under warranty, then the replacement has to be covered for a "fresh" duration, so a full replacement resets the time counter on the warranty.
How I wish I lived in that world. Here in the US, my wife recently had a nightmare situation where the insurance company insisted they didn't cover a certain cost because they didn't know what it was, the hospital insisted they had never assessed the cost and so couldn't explain it to the insurance company, and yet somehow everyone seemed to agree that the never-assessed cost nonetheless absolutely had to be paid.
Isn't Apple often the seller as well as the manufacturer? In any case, I've never had to interact with a manufacturer, it was always taken care of by the seller (within the first 2 years).
The UK has a nice catch all for such things in addition to the EU laws;
From the 'Sale and Supply of Goods Act 1994'
~~~
For the purposes of this Act, goods are of satisfactory quality if they meet the standard that a reasonable person would regard as satisfactory, taking account of any description of the goods, the price (if relevant) and all the other relevant circumstances.
For the purposes of this Act, the quality of goods includes their state and condition and the following (among others) are in appropriate cases aspects of the quality of goods—
(a) fitness for all the purposes for which goods of the kind in question are commonly supplied,
(b) appearance and finish,
(c) freedom from minor defects,
(d) safety, and
(e) durability.
~~~
I don't think a reasonable person would regard a £2000 laptop being unusable after 2 years of regular careful use as being 'Fit for purpose'.
> The Consumer Rights Act 2015 became law on 1 October 2015, replacing three major pieces of consumer legislation - the Sale of Goods Act, Unfair Terms in Consumer Contracts Regulations, and the Supply of Goods and Services Act. It was introduced to simplify, strengthen and modernise the law, giving you clearer shopping rights.
Maybe they snub it sometimes but I was able to have a 2 year old iPhone replaced under this rule (the camera padding had drifted across close to the front lens).
I believe the rule here in Ireland is that defects that were present as a result of the initial engineering are covered for more years, but not wear and tear or anything that would fail due to use specifically.
FWIW AppleCare for this price of machine has been the same duration and roughly the same price since forever, even before MBPs.
What they have quietly changed is that now you can only buy it in the first 60 days instead of the first year, which was a big surprise to me when buying a new MBP. It had been the first year for a very, very long time (since inception, IIRC). Of course I learned of this change after I'd already had my machine for more than 60 days. :/
There are a few ways to get it slightly cheaper, like the F&F discount. You also used to be able to get it cheaper from a third-party store like B&H, but it seems they've cut back on this recently.
The link from SyneRyder’s comment indicates that you can still buy AppleCare anytime in the first year. The only thing subject to the 60-day limit is AppleCare+ (which includes accidental damage).
> As for the Mac, customers who have had their Macs for longer than 60 days but less than a year are not eligible for AppleCare+ but are still able to purchase a standard AppleCare Protection Plan, MacRumors has learned. Apple is only offering AppleCare+ for Mac on its website, so customers will need to call in to Apple Support to make the standard AppleCare purchase. Standard Mac AppleCare is priced at $149 to $349, depending on the machine.
Looks like there are a few but not many on eBay [1]. I wonder if one can still activate the legacy AppleCare (non +) on newer models purchased after they discontinued it.
Exactly. As I said before, they should raise the price a little and offer a 2-year standard warranty on all products, iPhone and Mac. It is not like they are replacing those panels or parts for free under warranty; you still have to pay for it, just a little cheaper. And 2 years is part of the law in the EU, UK, and Australia anyway.
And like you said, many notebooks now come with a 2-3 year warranty as standard. Apple?
It is rather unfortunate that we don't have anyone to challenge them in either the PC or the smartphone space.
Apple's biggest UX failures always stem from putting form before function. There is a point of thinness/lightness beyond which consumers just don't care, and it was passed long ago.
There was great progress made with the unibody MacBook Pros and the MacBook Airs, but these latest models are so thin that it's actually annoying and leads to issues like poor battery life, lack of repairability, and malfunctioning keyboards. Isn't better usability actually more of a luxury than simple aesthetics?
Based on the timing, this seems like a train set in motion by Steve Jobs, but he passed away, and now nobody wants to recognize that it's way off course and needs a correction.
On one hand I'm inclined to agree with you: thin is not always better. On the other hand, I'm holding my 3-pound MacBook Pro comfortably and easily. It's light enough that I don't have to think about slipping it into my day bag (something I did think about with the previous iteration) and thin enough to easily grip with one hand and walk around. When I was traveling, there wasn't even a question about whether I would bring my laptop. It's just too easy not to bring.
Sure, there are a bunch of flaws in the design and I hate that they removed the SD card reader, but it's probably the closest I've seen to the platonic ideal of a laptop: something that you can always carry on you, that you can pop open at short notice and quickly slip back into your bag when no longer needed.
Weight is not the same as thinness. The X1 Carbon is substantially lighter than the MacBook Pro, but with a much better keyboard and much more repairable.
You wouldn't notice if it were 5 lbs either; the human body is much more capable than that unless we're talking about small children. People routinely carry much bigger and heavier things.
Perhaps if it were still a netbook/MacBook Air-class product it would make sense, but we also have iPhones and iPads now, which actually seem to be getting bigger, so there's no reason why the most powerful mobile computing device must be so small.
I own a 2012 15" MacBook Pro and work gave me a 2016. The difference is about 1.5 pounds, going from 5.6 pounds to 4 pounds. The difference is big enough that I've slowly started using my work laptop for most of my personal needs too, even though I initially wanted to keep my work laptop clean of unnecessary personal data.
I use the 2012 at home and the 2017 at work. I think I actually prefer the heavier one, but maybe that's because I also respect my 2012 more after growing to hate my 2017, with its 5 stuck keys and Return-key dead zone, so I see this supposed flaw as welcome power and sturdiness rather than unwanted weight.
Plus the lightness is offset by having to carry a dongle, and by them only shipping the bulky power cord (not the short cord) with the 2017, which I found an insultingly stingy gesture for a nearly $3K machine.
Yep, I think I would feel the same as you do if my work laptop was a 2017. Work gave me an old laptop, and I was happy with that because I don't have to deal with the touch bar or USB C.
I also share your sense of value about the machines. The 2012 is the last MacBook pro I've paid for myself.
EDIT: I just checked and my work laptop is a mid 2015. It's still much lighter than my personal MBP, but I think this was the last generation before USB c charging and the elimination of ports.
Possibly not, but I did notice carrying my personal laptop around versus my work laptop. It made a fairly big difference in fatigue whether I was going to the cafe or to another country.
You drop more weight just going to the bathroom in the morning. I find it extremely hard to believe that a few pounds makes a difference to any physically capable adult.
It does. If the laptop is light enough, you can handle it (grab it, put it back on the table, etc.) comfortably with one hand, even while lying back away from it. It becomes almost as comfortable to quickly grab as a tablet or a phone.
Laptops are not designed to be handled with one hand. It's a nice benefit but it definitely should not be the target, especially at the cost of better usability, which is the entire issue here.
The difference a laptop I can comfortably carry in a backpack or messenger bag when cycling makes is huge: it goes from something I took only when I was specifically planning to use it (and worried about back pain even then) to something I essentially have on me at all times now.
It's not about physical capability. Obviously people aren't getting exhausted moving around 5lb laptops instead of 3lb ones. It's about comfort and ease of manipulation.
Why is giving up a better keyboard and battery worth the few ounces? How is that more comfortable overall? Do you hold it with one hand more often than you type on it?
The iOS UI is similarly crippled. I got my parents an iPad as they were moving countries and needed a small portable device that needed no maintenance. To Apple's credit, the iPad is basically impossible to screw up software-wise.
But even the initial setup UI places form over function. Basically there is no way to tell what is a button and sometimes you have to click the word at the top to move on and sometimes the word on the bottom. It's completely bonkers.
I always hit the wrong keys; even after a year I'll end up hitting ~/` instead of the Escape key, and I routinely hit "mute" instead of volume down (or I'll hit volume up instead).
My problem with USB-C on MBP is that, at least 6 months ago, I couldn't find a good, fully compatible port expander.
In particular, my 2017 13" MBP gets pretty confused sometimes when talking to two external monitors connected via my Elegato expander. Even just one monitor sometimes gets confused after sleeping.
It was annoying enough that now I typically use just one external monitor. And when I need two, I conmect the second directly to my MBP's 2nd USB-C port, not to my expander.
I'm a big fan of USB-C—I try to avoid purchasing any devices without it. But I use the non-USB-C ports on my ThinkPad T480—HDMI, USB-A, Ethernet—on a regular basis. It also has a hot-swappable battery, superb typing experience, and is easily serviced.
And it weighs about the same as a MBP (right in between the 13" and 15" MBP, which makes sense since it's a 14" laptop).
As someone who spends his day shuffling between conference rooms before mercifully being given time to work, I am grateful for every ounce shaved off of the device I work on. No sacrifice to battery life, and it's on my company's dime if it fails.
Why is shuffling between conference rooms so arduous?
When did people become so weak as to not be able to carry a few pounds (or even ounces) between rooms? That's not a good thing and arguing about such little weight is indicative of much bigger health problems than any laptop design issues.
I routinely walk home from work—3 miles, which takes about an hour. Even with a high-quality shoulder strap, I can notice a half-pound difference in laptop weight.
My T480 is lighter than a 15" MBP, and is user-serviceable. It's also one of the less svelte models in the ThinkPad line—if I wanted to sacrifice a bit of repairability (nothing as bad as Apple) for some weight savings, I could have gone with a T480s or X1 Carbon.
Realistically, I just can't really see Apple fixing this by introducing a new keyboard design that's any thicker. As much as I would love to hear "We've heard your feedback on the keyboards and we've made some drastic changes to address them in our new Macbook Pro", it just doesn't seem like their style. What do others think they're going to do for their next release of laptops?
Honestly, it's hard to get off the Mac ecosystem though. I have a MacBook Pro and am starting to look at Linux laptops. The options don't seem great. Older Apple laptops are pretty well built, so hopefully mine will last a couple more years...
As someone who'd been in the Mac ecosystem and paid the price in time and expense of leaving for a PC a few years back, I'm starting to lean back toward the Mac side. The amount of fiddling and headaches I'm dealing with on Windows is just a huge waste of time. Had to reinstall all my programs after a recent Windows update ruined the OS.
I have a MacBook Air from 2013. It's gone through all the OS upgrades with no reinstalls. I even upgraded the SSD from 128GB to 512GB with a flawless restore from Time Capsule. Can't even imagine this happening on Windows...
I recently had a hard drive failure on the primary HD in my Mac Pro (desktop, the old shape). The machine wouldn't boot into the OS at all.
I bought a new HD and ran the recovery tool, which gave me an option to clone from another drive. I figured I might as well give it a try even though the old HD seemed completely dead.
After a couple of hours the machine rebooted using the new HD. The amazing part is not only were my files perfectly intact, but the exact same applications I had open when the old HD died opened up automatically on the new HD.
Time Capsule is interesting: it's been consistent since its release in 2008 and has worked flawlessly for me. Compare that to Windows backup, which changes to a new, incompatible product with every new version of Windows and is so riddled with flaws it's unusable.
I'm on the whole pretty happy with Windows, but I've never understood how they can consistently fail to deliver a working, easy-to-use backup solution out of the box, for decades. Mere incompetence shouldn't be able to produce such terrible implementations; it's almost like they're actively trying to fuck it up.
this is what's keeping me from switching. every time I try spending a day on Windows I remember just how much macOS "just works", even if it has warts all over the place.
See, I've always used both and I just don't understand this whole "I have to spend a day on Windows making X work". I haven't experienced this in at least a decade. Windows 10 "just works" with every device and software I've thrown at it.
That's the thing about anecdotes: one person's experience is the complete opposite of another's. I got a laptop with Windows 10 on it and immediately experienced the "100% disk utilization problem", which makes it basically unusable. I Googled around and found hundreds of posts about the problem and many dozens of suggested fixes, of which zero worked (for me). So it's sitting on a shelf until I get around to installing Ubuntu on it, which has always just worked for me (which is the opposite of some other people's experiences).
It's also very dependent on what you want to use your computer for. As a frontend dev I want a mix of Adobe CS and a bunch of primarily POSIX-compatible tools, and for that Windows is seriously suboptimal.
But even beyond that, the sheer number of popups and shit I need to close when starting from a clean install of Windows makes me want to punch the computer, and naturally that is a very subjective experience. And stuff like apps stealing focus in general: if a Mac app makes itself the key window in a non-standard way I immediately consider uninstalling it; on Windows that just seems to be what all apps do when opening new windows.
> As a frontend dev I want a mix of Adobe CS and a bunch of primarily POSIX-compatible tools, and for that Windows is seriously suboptimal.
Not sure why it's suboptimal; WSL is amazing. I am primarily a frontend dev as well and do most of my building/start-up/testing inside a terminal. It's great, and Adobe works great as well, with the added bonus of the better Windows dropdowns (macOS dropdowns drive me nuts, but on Windows you can give one focus and scroll through all the options with the keyboard).
> But even beyond that, the sheer number of popups and shit I need to close when starting from a clean install of Windows makes me want to punch the computer, and naturally that is a very subjective experience.
When I first started up I had like 2 notifications to use Edge and something else and that was it. When I enrolled in the Windows beta I received a few more for requests for feedback, which it warned me about ahead of time. Not sure why you had so many pop-ups and I've had basically none.
It's always so weird to me how my experience with Windows machines differs from those who prefer Macs. I own and use both and they both seem pretty great to me. The only big thing is I make sure to only buy Windows hardware from Microsoft to get their "genuine experience" because HP and everyone else likes to ruin it with their bullshit.
Not to counter your point, but I had that new, remarkably stupid ":( Oops, something went wrong" error pop up the morning I had to give a presentation. I have Windows 10 Home, and it forced the updates, failed after an hour, and fucked up the boot partition, removing GRUB. So just before the presentation I had to set up GRUB again and boot into Linux, because Windows was still stuck on updates. Thankfully my slides were on Google Drive.
On my office laptop though, Win 10 Pro has been pretty stable.
To get back to the main point, I usually have to spend a day with both Windows and Linux to install everything right from VLC to compilers. IMO setting up needs to be done on every machine, independently of the OS.
It has been the opposite for me since about Windows 7.
Work MacBook: I have to restart it to get the USB ports to work, and the screen flickers when you plug in an external monitor, like X Windows on a Unix terminal from '99 with a bad conf file... Come on.
They're all software issues. I think all the MBPs in our office do this stuff. For instance, the external monitor and such work fine; it's just very inelegant how it does the switchover.
Even though I work with Unix servers every day, I would still struggle with a Linux machine. I tried it in the past, and getting things to work wasn't a problem at all.
But
1) I have to admit that I like all this full-aluminium fuckery (shame on me)
2) macOS is, for me, more "batteries included" than any Unix flavor. Yes, there are often alternatives on Linux, but I really like to get my shit done quickly and easily with tools like Preview, for example.
3) Better tools on macOS: there is some great stuff out there that helps me in every way. Alfred, MailMate, MoneyMoney, Postico, Royal TSX, Paw, ... good-looking, functional tools. And I can still use all the great command-line tools from the Linux world (vim/emacs, bash/zsh, compiler tools, ranger for file navigation, ...).
In regards to tools, you can usually find a distro that is "batteries included"; try KDE as your DE. There are great tools like Okular, Dolphin... Insomnia is a pretty-looking Paw alternative, etc. However, I think the advantage with Linux is that you can often build it how you want, without the junk that sits there consuming CPU cycles without you even knowing what it does. Moreover, with distros like Arch Linux, you get practically all the available tools, always up to date; you never have to reinstall; updates take a minute as opposed to half an hour; and you can manage everything (not just brew packages) with an awesome package manager.
I really don't want to say "macOS > Unix" for any of the reasons I mentioned. I also had Arch Linux as the main OS on my desktop, and also tried a newer KDE flavor.
But for me personally it was never as good as macOS. Maybe I wasn't trying hard enough, or I don't want to lose the money already spent on macOS-only software. Dunno. Anyway, I just hope that I can get a great MacBook again ;)
(What I missed the most on macOS was always a fast, keyboard-focused window manager like 2bwm. In the end I was able to recreate this behavior with Hammerspoon and some custom Lua scripts in combination with Alfred. macOS just looks a bit bloated sometimes, with all the rounded edges, drop shadows, etc.)
Mounting a seldom-modified and highly sensitive and critical filesystem r/w by default, when the existence of buggy implementations is known, is a deeply irresponsible decision. Blaming the rest of the world for the fallout from their crappy design decisions is a pattern of behaviour among some systemd developers.
Fortunately you don't have to use systemd. I'm happily running Void Linux.
The people who relate the tale as if it were a systemd thing or something that the systemd people did are ill-informed. It was entirely a kernel thing. This was even stated outright at the time by the creator of that particular kernel mechanism.
Well, frankly, I don't really care who won the bickering contest on this one or any other. Simply put, I don't have time for all the things I need or want to do, let alone to "tighten all the screws" on my machine every single day.
I remember a guy at uni who'd come to the study room, update his various spyware and malware signatures, and let the Windows machine purge itself for half an hour while he had coffee and cigarettes.
For several years now, my biggest hassle has been plugging in a TM backup and choosing "restore from backup" when I buy a new Mac.
Drivers, power management, tweaking this-and-that... nope, my life demands those minutes back. I’m actively worried when I see Apple fuck up and potentially ruin this status quo because “innovation”.
System76 (the small Linux-only hardware manufacturer) just refreshed their Oryx Pro last month; it looks pretty capable and, from afar, looks to have really good build quality. https://system76.com/laptops/oryx
I haven't found many/any reviews of it since the refresh -- does anyone on HN have one who can attest?
I have a new Oryx Pro. I'm not an expert on laptop build quality but to me the Oryx Pro looks and feels great. It is in fact a Clevo P955ER, which I know I could get for cheaper, but personally I want to support System 76's selling and supporting Linux laptops. I'd actually be interested to see if a straight-up Clevo P955ER with Pop!_OS installed is any more or less stable than an Oryx Pro.
Anyways here are my thoughts:
Good things:
- Keyboard is pleasant to use (for me)
- Trackpad feels good (not as good as Apple's trackpads but much better than most non-Apple ones)
- Desktop-level performance
- No Linux jankiness -- everything has just worked for me so far on Pop!_OS
- Battery life is not as terrible as I thought it'd be on Intel graphics (~5 hours depending on what you're doing. I haven't tested this too thoroughly)
Complaints:
- Need to reboot to switch between Nvidia and Intel graphics
- Fans are always audible on Nvidia graphics no matter what you're doing
- Nvidia graphics are required in order to use any external displays
- Battery life is horrible on Nvidia graphics, so you basically have to switch graphics then shut down before unplugging from an external monitor if you plan on using it on-the-go
No _need_ to reboot. Restarting the Xorg/Wayland server, maybe. You might be able to fool it into thinking you just hot-plugged the GPU, at least as far as the screen layout is concerned.
Remember: the only reason you need to reboot linux is to get a new kernel, or because the kernel FUBAR'd and corrupted data.
I hear you, and I've restarted the display server plenty of times on my computers instead of restarting them, especially on my main desktop where I have lots more stuff running independent of the display (daemons, etc). But, I feel like particularly on a laptop, if you're restarting the display server, it's not much different than just restarting the whole machine. Certainly that's the case for the large majority of users. Anyway, I think the point is that it's a hassle, which it is, not that you literally "need to reboot".
Would it be possible to start a second display server like we used to, one on ctrl-alt-F8 and one on ctrl-alt-F7? So when you want to do some gaming you start up your gaming workspace and switch to it, and when you want to turn off the nvidia graphics you destroy the gaming workspace, but your document workspace that was using the intel graphics is still there? That feels like it could be somewhat usable.
Should be. Windows actually doesn't crash the desktop when the driver balks, so AFAIK you can run a Windows VM on a Linux host and switch the PCIe GPU (yes, i915 counts from Haswell onwards) between host and VM via KVM/QEMU/IOMMU, with the VM not caring much but the host crashing Xorg.
I prefer to keep things up and running, e.g. daemon-based software, file transfers, mounts (including LUKS password unlocking), music players, etc. Even SSH sessions.
I am still looking for a way to restart Xorg without having to restart the software on it, but so far I have not found a solution.
I switched to a System76 Lemur last year after 5-6 years of MacBook Pros. The biggest adjustment for me was the battery life: it's not terrible, but I remember never having to think about charging my Mac. Aside from the flimsy hardware I can't complain about anything else; I certainly don't miss macOS and I'd take Linux over it any day. The price is a factor as well: I got a maxed-out system for half the price, and in another year I could buy a brand-new System76 and still have spent only what a MacBook Pro costs. Having said all this, I do miss the MacBook hardware quite a lot.
Note no mention of battery life. Look at the internals and you'll see why - the battery is absolutely tiny, and the logic board, disk, and fans are comparatively massive. The parts are definitely not power-friendly either. Seems very optimized for staying plugged in, with only a quick unplug to go into a meeting. My guess is it'll do ~3hrs w that 55Wh battery at most. This competes with gaming laptops and not MBPs, though it does look nice.
I think System76 is using Clevo/Sager chassis still. You can buy these setups from a place like xotic and load up linux yourself for a bit less than System76 stuff costs.
Yeah, I don't get system 76's draw. If they were doing something meaningful with their systems like developing coreboot for them, that would make it a selling point at least. They have recently started using me_cleaner, which is a good start I guess. Other than that, I think they are developing their own distro, which seems like an utter waste.
I have a System76 laptop from a few years back (not an Oryx Pro), and while it's still my main home computer, it's definitely a lot flimsier and heavier than any MacBook. Battery life was never great, and by now it's shot far enough that I only use it while plugged in. That being said, it's remained a reliable system and I wouldn't mind upgrading to a newer model.
If it were only the battery, I'd replace it, but getting a new system will also scratch the itch to upgrade hardware specs. There's also some piece rattling around inside the case which must have broken off early on... (to my point about fragility compared to macbooks)
Consider (used) ThinkPads. There are places that refurbish them, and also sell devices that have just come off lease because the owner bailed out after only a few months. They are frequently available for half price, in good-as-new condition, with some warranty still on them, which you hardly need on the T and X series (not the Carbon though). They need one screwdriver for full disassembly.
It's the number pad that's in the wrong place. It should not be on the laptop.
I've got an HP with a number pad because no 15" model comes without one. I never use it (the number pad), but I have to shift the whole machine to the right to have the touchpad and the "real" keyboard in front of me, so most of the screen sits to the right of my eyes. Great design!
Agreed about the requirements, but do you also shift the laptop right or bend your arms and shoulders?
I'd sell the numberpad as an external device. People that need it will buy it and place it to the right of the laptop (or maybe to the left for left handers, who knows?)
There is another design problem: it could shield the ports on the side where it is placed.
Another solution: engineer the screen so that it can be shifted 1/3 to the left when the laptop is open. That will align it with the spacebar and the touchpad.
About the XPS, I didn't buy it back in 2014 because the then current model had some thermal problem and because I like 3 physical buttons on the touchpad. HP ZBooks do have them.
The only thing I'm considering is a ThinkPad X1 Carbon. But the last time I tried putting Linux on a laptop (an Asus ZenBook), I had a few issues submitted to the Linux kernel and still didn't have the laptop working perfectly; within a week I returned it and bought a MacBook Pro. Might be time for another go.
I believe ThinkPads are famously easy to make work with Linux. I had one in college (a decade ago) and an X1 Carbon when I was at Google (several years ago), and both of them were excellent (though the latter obviously would've had any kinks ironed out by the Goobuntu team).
My last couple laptops had better hardware (for my needs) than ThinkPads or Macbooks, but the occasional compatibility issue is enough that I'm thinking about switching back to Thinkpads.
Consider its brothers, the 3-digit X and T models, which, especially in the form of early lease returns, offer excellent value for the price. And due to the stagnation in CPU performance, anything from Haswell onwards should be well within usable.
Just get a Dell XPS developer edition. It is what an MBP is supposed to be but actually delivers for roughly the same price or less depending on configuration.
Oddly enough, it has gone completely away on my 2017 model. I can only guess a firmware upgrade fixed it, which is probably unlikely. Or getting jostled around in my backpack got it aligned?
Ubuntu has worked perfectly on my last two ThinkPads, an X220 and an X1 Carbon (4th gen). That's almost 7 years now. I think the only real caveat is that the fingerprint reader isn't supported. I can't speak for other laptop brands, but just get a ThinkPad (or a Dell XPS).
I recommend the X1C. I had a first-edition. It was wonderful, and lasted me for five years. Last year I upgraded to X1C5. It is far lighter still, the keyboard is good, it is just a joy to interact with. (debian stable)
I'm about to get a new laptop. I've had an MBP for 4 years now.
I just ordered the Razer Blade 15.
Dell's XPS 15 is also a very good choice, although the Blade has slightly more recent hardware.
Both are upgradable. You can add RAM or an SSD if you want.
Both run Linux just fine.
Yeah, quiet, but it goes to 100°C constantly.
Once I get the Razer I'll check what the temps/loudness are like there.
But with the MBP, I turn the fans up all the time, to avoid burning my fingers when I type :/
The difference is that the antenna & bending issues were grossly exaggerated by clickbait reporting and lawyers angling for a class action win. As the wikipedia articles note, neither affected a significant number of users – the iPhone 4's antenna was still more sensitive than the 3GS it replaced – whereas the keyboard issues affect a lot of people. It's not the urban legend-style “my third cousin has a friend who held his phone hard enough to drop a call” reports which characterized those earlier flaps but real named people who are directly affected and getting hardware replacements because Apple techs agreed with the problem.
Short of fitting a keyboard condom, or iPad (no doubt called TouchBoard) where the keyboard used to be, what can they do? Make the battery thinner so they can get sane key travel back within the case without increasing overall thickness?
So I expect to see a couple more releases of ultra thin keyboards with various fix mechanisms before they accept that rules of physics and dust apply to them too. :)
> what can they do? Make the battery thinner so they can get sane key travel back within the case without increasing overall thickness?
Go back to previous thickness; it wasn't that bad, and thin for the most powerful portable they sell isn't necessarily a good thing. They could even couple it with advancements in battery tech to finally put 32GB on their laptop.
I don’t see why they can’t have Air for customers that want thinness and Pro for customers that want portable workstations. Is there a reason, other than “marketing”?
Yes please. Honestly the new format is such a clusterfuck of unwanted features, poor hardware implementation, and removal of wanted features that it makes you wonder how it got out the door.
Hanging onto my circa 2013 MBP until they come up with a more appealing offer, or going elsewhere.
A "keyboard condom" isn't a joke or insane. It would be a membrane layer between the operating mechanism of the keyswitches and the keycaps. They haven't done it because it adds thickness to the design.
I personally suspect one day they'll marry the touch bar and the taptic engine to make a keyboard surface. That way they could have emoji keyboards and such. And they would only have to manufacture one top case, no matter what country they were shipped to.
Maybe if it worked with the Apple Pencil and the taptic feedback was good enough I could see being curious about it. It’d probably just be a good way to incite a riot though.
Historically they just keep replacing old broken hardware with new/reworked broken hardware. Reworking includes reheating GPUs (!), or putting rubber shims(!) over vulnerable BGA chips with cracked solder joints.
The thickness of the keyboard isn't the issue -- the issue is its repairability. I've heard various accounts that when a key gets stuck, Apple has to replace the whole bottom half of the laptop.
As for making the keyboard thicker, I doubt it. Once Apple makes a decision that dramatically alters their product in a way that users complain about, they generally don't backtrack. (floppy drive, CD drive, and Ethernet removal. Headphone jack removal. Losing Mac Pro traditional tower form factor. Flat design in iOS. etc etc etc)
I honestly don't understand all these quality sacrifices just to make things thinner. It makes the product actually feel somewhat cheap and brittle.
The same thing confuses me with smartphones. Everyone keeps making these bacteria and finger print magnets thinner... the minute you drop it, it's shattered or unusable... and so then you're buying a case to protect it from scratches and shatter... so what's the point of making it thinner in the first place? Gimmicks IMO just to sell things year over year without adding real value or actual features.
Especially in cases when it doesn't need to be thinner. I just tried out an Apple Magic Keyboard, which is a razor blade for some reason, and had wrist pain within a few hours of using it. Everybody's different, but this keyboard is trash. It is designed solely to look pretty in marketing photos.
I'm not a keyboard snob. I don't need Cherry MX Blues to be happy. I thought the Apple Wireless Keyboard (with the AA batteries) was probably a step in the wrong direction, but still fine to type on. But this is quite literally painfully bad. And there was no reason for it: it's not a portable product, it doesn't have heat sinks, screens, fast processors, etc. It's just a keyboard; it would be fine if it were 2mm thicker. Actually, it would be better.
I am currently using this horrible keyboard. Hate this thing. Before, I'd been using a mechanical keyboard, but coworkers complained that it was too loud. I like loud keyboards. The sound is so pleasant and satisfying I want it to be even more intense. I want to feel vibration every time I hit a key. And when I hit enter I want there to be a minor earthquake.
This Apple trash keyboard doesn't have Shift+Insert. The alt is next to ctrl and its place is taken by cmd. I had to change my bindings for i3. Caps lock isn't solid, and instead of caps + f I oftentimes type simply f, because caps is released too soon (caps is rebound to ctrl, and caps + f means select the suggested autocompletion). Enter is small.
Actually, I will apply for a new keyboard this very day, right now. It will be a regular mashed-potatoes keyboard, but it's still going to be better than this.
The Microsoft Sculpt Ergonomic Keyboard does not have loud switches, but feels much better than the Magic Keyboard. Only the function keys are not great.
Microsoft has released consistently good hardware for over 20 years now. There have been a few duds over the years but by and large their input devices have been stellar.
The ergonomic/natural/arc keyboards are, by far, the best products Microsoft ever sold. I do love the Apple trackpads, however. There is no better way to work on a Mac.
As for my Linuxing, I have a Unicomp "Battleship" (PC122 keyboard) that's loud as hell, but has the best feel since the beam-spring keyboards God added to the IBM 3270s when He made them on the sixth day.
I'm still rockin' the original Microsoft Natural on my workstation. My current motherboard has a PS2 keyboard connector, but I have a USB-PS2 adapter if/when the day comes to upgrade.
I originally bought three of them, and two are still working well.
I also recently bought one of the new Microsoft keyboards, and the feel on it is still pretty good.
Trackball is way better than any trackpad for me. I have three Microsoft Ergonomic keyboards. Two of the Surface ones and one of their cheaper ones which is about $20 at a BestBuy.
The thing with the trackpad on Macs is the gesture thing. Swiping desktops, zooming out on all windows, the "3D" click... The fluid way it allows me to work is priceless.
+1 for the sculpt ergonomic, the older black version (with external numpad) and newer surface version are both great.
The newer Surface version is mostly better because it has an Fn key instead of a physical switch to go between media keys and F1, F2, etc., plus real home/insert/end/arrow keys. I do wish they had left the number pad separate though, as it increases the distance to my mouse/trackpad and I preferred being able to just get rid of it, since I personally never use it. But many people also like to have it...
Newer version is also bluetooth which is great on mac, but less great on windows/linux as you don't get bootloader support.
+1, the sculpt keyboard and even mouse are my daily drivers at both desks and very well priced.
After a quick key remap on macOS and a bit of muscle-memory adjustment on finger placement that was well worth it, my typing speed hasn't suffered and my hands are happy. I do wish MS made Mac drivers. No Mac keyboard can hold a candle to this kbd.
I tried my much more expensive Kinesis Freestyle with all the trimmings - it was nice but too slow to type on because of the keys.
The dream keyboard would be a MS sculpt cut in half like a kinesis freestyle 3, maybe with apple keys that work, and wireless halves.
Counter-point, I haven't had a single Apple keyboard fail on me yet (although I do not own the new Macbook, so I don't have the dreadful one) and my Matias broke after just 1 year.
Matias talks a good game, but we bought 30 of their keyboards and they didn’t last very long. We had much better luck with Logitech. Also, we are batting a 1000 on broken butterfly keyboards. It is really disappointing and frankly the feel of the things is awful.
Matias? Keys just stopped working. We cleaned them (we have people who are trained techs) and they still just stopped working. Two of the keyboards stopped working entirely. It was weird. Plus the build quality felt really cheap.
Using a custom-order WASD keyboard on the workstation and butterfly v2 on an MB. Can't say one is worse than the other. The worst thing about Apple's keyboard is the botched-up layout, rather than any haptics problem.
(Also, it's a huge improvement over the old-style far-apart, mushy blocky Apple keys).
>Especially in cases when it doesn't need to be thinner. I just tried out an Apple Magic Keyboard, which is a razor blade for some reason, and had wrist pain within a few hours of using it. Everybody's different, but this keyboard is trash. It is designed solely to look pretty in marketing photos.
If you're talking about the keyboard on the new MacBook Pros, yes, it's trash.
If you're talking about the standalone keyboard actually called "Magic Keyboard", then I disagree, it's perfectly done, and a pleasure to type on.
And I own several mechanical keyboards (and started as far back as using Sun's own keyboards on SPARC workstations).
That keyboard was good. Its replacement is worse, but only moderately worse, so it's gotten less attention than the train wreck that is the MacBook and MacBook Pro keyboards.
I would assume he is talking about the Magic Keyboard 2.0, which is the newer version of that one. It is widely considered to be an excellent keyboard, and many people (rightly, in my opinion) don't understand why they just don't put that keyboard in their laptops. I type on the MK2.0 and love it, but keyboards are a personal thing and there are many thin-style ones that I just can't type on, and don't get me started on "mechanical" keyboards, awful, loud, dreadful RSI-inducing things (for me).
I have borrowed one from a colleague, because I like to swap keyboards every now and then for variation, and the old Apple Wireless Keyboards were OK. I also had wrist pain within hours with the Magic Keyboard. This is a huge difference compared to the unassuming Microsoft Sculpt Ergonomic Keyboard, which is not only cheaper but really feels nice in comparison. It also has genuinely useful features such as reverse tilt, a magnetic battery door, a split layout, and wide spacebars. Plus it has full-size arrow keys (though I don't use them much, because vim/evil).
Short-travel keyboards are, by some studies, less likely to cause RSI than typewriter-style longer-travel keyboards. At the very least, there is reason to question why emulating the key travel of a mechanical typewriter linkage makes sense versus short travel, or even no-travel haptic feedback.
I think Apple's keyboards, when they work, work really well. And you can get them without number pads so your hand isn't traversing a wasteland of wasted keys to get to the mouse.
>I honestly don't understand all these quality sacrifices just to make things thinner. It makes the product actually feel somewhat cheap and brittle.
It's easy: every time people had the chance to buy thinner or thicker products, they flocked to the thinner ones.
The complaints are random outliers around the internet, but Apple's actual sales numbers (record year after record year, and reduced sales for thicker older designs) speak for themselves.
Not just some "vacuous" "non-technie" users either (as the stereotype says).
Can you guess who said the following words about his MacBook Air for example?
Quote:
"I’m have to admit being a bit baffled by how nobody else seems to have done what Apple did with the Macbook Air – even several years after the first release, the other notebook vendors continue to push those ugly and clunky things. Yes, there are vendors that have tried to emulate it, but usually pretty badly. I don’t think I’m unusual in preferring my laptop to be thin and light.
Btw, even when it comes to Apple, it’s really just the Air that I think is special. The other apple laptops may be good-looking, but they are still the same old clunky hardware, just in a pretty dress.
I’m personally just hoping that I’m ahead of the curve in my strict requirement for “small and silent”. It’s not just laptops, btw – Intel sometimes gives me pre-release hardware, and the people inside Intel I work with have learnt that being whisper-quiet is one of my primary requirements for desktops too. I am sometimes surprised at what leaf-blowers some people seem to put up with under their desks.
I want my office to be quiet. The loudest thing in the room – by far – should be the occasional purring of the cat. And when I travel, I want to travel light. A notebook that weighs more than a kilo is simply not a good thing (yeah, I’m using the smaller 11″ macbook air, and I think weight could still be improved on, but at least it’s very close to the magical 1kg limit)."
"Not just some "vacuous" "non-technie" users either (as the stereotype says)."
Actually, Linus comes off to me as particularly non-techie. He obviously cares about it enough to get the work done but couldn't care about it more than that. Not even the OS (outside of the kernel) is of any interest to him (dissing debian because he thinks the installer is too cumbersome etc.).
Compare the MacBook Air with the lenovo x-series at the time and it's quite hard to see what the air actually brought to the market except for first in class non-replaceable batteries and few external ports.
You've always been able to get silent computers, but Linus doesn't have the interest to research them.
>Actually, Linus comes off to me as particularly non-techie. He obviously cares about it enough to get the work done but couldn't care about it more than that.
Non tinkerer is not the same as non-techie.
In fact, I'd call tinkerers the non-techies par excellence. They don't do anything technologically productive (much less write their own kernel); they just play with tech toys.
You are confusing coincidence with causation; there is no evidence that Apple's record sales numbers (which seem to be at an end anyhow) have much to do with making things thinner to the point of fragility.
a) "clearly being an asshole makes me steve jobs right"
I never claimed any causative chain of that sort. I said that thinner products continued to sell in droves, not that making something thin will necessarily make it sell.
b) "or at least it cant hurt! right?"
That's closer to what I said, which is the continuously thinner products never hurt Apple's sales.
And to answer your question, yes, obviously it can't hurt in someone becoming like Steve Jobs -- since it didn't hurt Steve Jobs himself. In fact, if anything, it could have been necessary to Jobs' success, and it goes along with his obsession, attention to detail, and crazy pushing of his employees.
Not really. It's somewhat established that there are more psychopaths as CEOs than in the general population, suggesting it's a good trait to help one "make it".
> It's easy: every time people had the chance to buy thinner or thicker products, they flocked to the thinner ones.
You'll need to back up that statement with factual sources, because clearly not all Apple decisions are backed by market research, since not all of their products are that successful.
>You'll need to back up that statement with factual sources, because clearly not all Apple decisions are backed by market research, since not all of their products are that successful.
What marketing studies? I'm talking about marketing results. Their ever thinner laptops remained at the top of best selling laptops in their price class -- selling several times more than the closest competitor ever since Jobs came back to Apple in 1999.
> What marketing studies? I'm talking about marketing results. Their ever thinner laptops remained at the top of best selling laptops in their price class -- selling several times more than the closest competitor ever since Jobs came back to Apple in 1999.
We're talking about phones here, not laptops, and correlation is not causation; you didn't demonstrate what you state as a fact.
You seem confused. I'm not trying to establish a causal relationship between thin and sales. I'm stating (it's a fact) that people have flocked to thin products.
We're also not "talking about phones here". The post is about the MacBook Pro keyboard. That's a laptop. You first brought up phones, when you mentioned "bigger screens".
Besides, even in phones, the ever thinning models kept selling very well, whether they had bigger screens or not.
I like how John Siracusa explains the concept of "naked robotic core"
To paraphrase, the phone is the computing unit, and the user customizes it to fit their usage.
People who have never dropped a phone in their life can use it as is. People who work in construction can put on hardcore shells that survive multiple drops to the floor. Those who like furry cases can go wild, and those who want to build accessories around the phone have a smaller target to do so.
Seeing it this way, having the phone thin enough that you can put on the case you want without the whole package being bulky is a feature.
To touch on the laptops, a few grams shaved off won't matter and don't justify a huge sacrifice. But the size and weight drop we saw from the 15" Lombard laptops to today's 15" models didn't happen all at once, and keeping the goal of making each machine thinner than the previous one can add up to huge progress in the long term.
I am not a fan of the current keyboards, but I can't throw stones at them for trying to make things thinner and lighter. I was super happy to see the lighter 2013~2015 models, and I don't want them to just settle and say "we're good enough for the next 20 years, we don't care anymore".
First of all, I disagree that people _want_ to put furry cases on their phones. Too many people have walked around for too long with giant cracks on their screens that they've realised it's better to slap on a case, even with the ugliness that goes with it. Having some minion drawing on the back of the case is just a minor positive.
More importantly though, I cannot stand how glib Apple is with their devices. Surely the engineers and managers in charge _must_ realise the responsibility on their shoulders. Are they OK with many more people contracting carpal tunnel or RSI because of the choices they made? What about the environmental costs of needing to replace perfectly functioning components like speakers just to replace a keyboard? Just for the sake of A E S T H E T I C S?
I don't mind the whole notebook case being unserviceable from an iFixit perspective. I don't think anyone does. But then it must be executed so close to perfection that it never needs repairing. That means, assuming no water spillage and normal conditions, less than 0.1% of units needing repair in a 4-year timeframe. That is what I think is good enough for the thinner, more integrated trade-off.
But that is an ideal they never achieved, and they have been let loose for years: USB-C over-power issues, logic boards burning out, keyboard failures, power supplies not working, SSD failures, memory failures, GPU failures, display panel problems. The amount of crap we have been putting up with is remarkable, considering these are some of the most expensive laptops on the market, and we are really paying the premium just to use macOS.
> Everyone keeps making these bacteria and finger print magnets thinner
I fear it might be worse than that; I have a Moto C, which is nice enough as a cheap phone. But they've made the case out of the slipperiest finish I could imagine; I have a silicone case just so I can trust myself to hold it.
Now, if that's the cheapest finish they could make to get the price bracket they want, fine. But I can't shake the feeling that some designer chose that finish to make it _feel_ thinner. Just like a rough heavy weight paper makes something _feel_ valuable.
If Apple released a new phone or laptop this September and specifically marketed it as being better built and longer lasting (and therefore thicker and heavier), I'd buy it. Am I the only one?
I'd buy a fatter, easier to repair, MacBook Pro (like the old model they still sell), but only if I could order it with 32 GB of memory.
The thing is Macs used to last a very long time. I kept my 2008 MacBook for about 8 years and it was still a fine laptop until the last day. Today, my work requires a little bit more memory and 16 GB doesn't quite cut it anymore.
This is probably why I'm getting a hefty Thinkpad E for a fraction of the price of the laptop Apple can't build yet because thinner parts don't exist.
It's not like the change in thickness is a single variable from one generation of a phone to the next. I'd wager there are other variables that make people purchase new phones rather than 1mm shaved off.
If sales increase when it is thinner, it stands to reason that customers want thinner devices.
Just because you disagree with them doesn't mean you aren't wrong to say there is zero disadvantage. The disadvantage is not giving customers what they want.
> If sales increase when it is thinner, it stands to reason that customers want thinner devices.
This isn't something that can be A/B tested when there's only one (thinner) offering. People "need" to upgrade their devices and will ultimately buy and put up with the newer model even it means trading a decent keyboard, potentially better battery life, or a headphone jack.
This is the usual excuse to justify questionable design decisions. It ignores which information customers use to make their decisions and how we have whole business sectors that do nothing else than control and shape this information. (PR, marketing and advertising)
By the same logic, if a crappy movie (by reviews) nevertheless earns a good profit on the first weekends due to an intense marketing campaign and flashy, misleading trailers, that would magically make it a good movie, because obviously "customers wanted that".
Correlation is not causation. If the only new products in the Apple ecosystem are thinner, have a non-removable battery, no headphone jack, no SD card slot, and no MagSafe power, then it is easy to reason that all the people who bought new Apple products wanted those features. The problem is, they never had the choice.
I'm pretty sure marketers and designers really do know that. How light a device feels in the hand is pretty much the very first thing someone notices when picking up a new device, and sets the stage for the rest of their expectations.
We also know that in most cases (of course not every case) first impressions are a much bigger influence on the final decision to purchase than later analysis and consideration, regardless of what people think about their own decision-making process. Pretty much every company that has ever sold anything cares about this stuff, and many of them have plenty of resources, so this has been studied to death for decades. A lot of super smart people have spent a huge amount of time devising very clever ways to test this stuff.
> We also know that [...] first impressions are a much bigger influence on the final decision to purchase than later analysis and consideration
But this is the whole point of the GP's criticism: The whole design of the device is driven by optimizing purchase, not actually by making it usable, useful or durable.
Of course we can know that. Basically all manufacturers save apple have a very long tail of non-flagship phones which nobody really pays much attention to, but which absolutely can answer questions like that. They have dozens of models, mostly basically identical save for minor changes.
"This is design anorexia: making a product slimmer and slimmer at the cost of usefulness, functionality, serviceability, and the environment." (emph. mine)
Spot on. My sister's best friend is currently hospitalized for anorexia, so this resonates particularly well with me. Luckily, she seems to be doing much better from what I've heard. I hope I can say the same thing about Apple soon.
> and so then you're buying a case to protect it from scratches and shatter... so what's the point of making it thinner in the first place?
I agree, also in terms of material. I'd love to buy a plastic phone with some bezel that doesn't shatter easily. The Nokia Lumias a couple years back (e.g. the Lumia 800[0]) were good-looking and super robust.
But manufacturers obviously don't have much incentive to make hardware more durable, unless customers explicitly look out for that. And currently, many people tend to buy (high-end) phones rather for prestige reasons than actual usefulness.
I (still) own a Lumia 830 from back then. It's incredible what my phone endured (drops from a car roof, down a few stairs, even a short dive in a kitchen sink), and I don't really know why the thing is so sturdy since it truly looks good. I recently bought a spare display for about $40 and a battery for $20, just so I can replace them if anything should happen in the future.
> and I don't really know why the thing is so sturdy since it truly looks good.
One factor is that the cover glass doesn't go all the way out to the edges and corners. This leaves some "crumple zone", as phones, by the laws of probability, most often land on edges.
What do the corners of your phone look like? I bet they're dented, at least that used to be the case for me.
> The Nokia Lumias a couple years back (e.g. the Lumia 800[0]) were good-looking and super robust.
Yes, I have a Lumia 640 here and I really like it from a hardware point of view. It's a shame that it only runs Windows Mobile, I wish so much that I was able to just install Android on it.
Samsung's Galaxy S range of phones have been getting slightly thicker every generation since the S6. The S6 was 6.8mm thick, the S9 is 8.5mm, with small but progressive increases along the way.
The S9 still feels like it could break easily due to the all-glass finish, but I think that has much more to do with avoiding plastic while still enabling wireless charging.
Up until the 2016 MacBook Pro they managed this trade off well. But this wasn’t a product that was ready IMO, and it wasn’t just the keyboard. The hardware isn’t there yet to support a pro machine this thin. The battery life is worse than in 2015, the CPU performance has serious issues due to thermal constraints, and 32GB of RAM that a huge number of pros want can’t be done because what wiggle room they had for battery life they wasted on frivolous bullshit. I’m hopeful that they get this turned around soon, but aside from the keyboard they basically just have to wait for Intel to fix their problems for them.
> I honestly don't understand all these quality sacrifices just to make things thinner. It makes the product actually feel somewhat cheap and brittle.
I never understood this. I read often in reviews about how the weight of something makes it feel more expensive and "premium". For something you're carrying around all the time, you want it to be light and manufacturing something light is usually going to be more expensive.
Do I want it to be light? Sure, I wouldn't want to carry a brick of tungsten with me, but something the weight of two or three iPhones isn't much of an encumbrance even to people much physically weaker than me.
If they're making the phone thicker solely to make it not shatter, what you're doing is just building in a factory phone case, which you then can't easily remove or replace. So why not just let people buy their own cases that suit their needs?
This is what you see from a consumer perspective. But from a company perspective, going thinner has way more benefits. Apple invested a lot in R&D to make things thinner, but when it comes to manufacturing products at scale, Apple is actually making more and more money year after year by building a thinner version of a model. They're saving money on materials while increasing or maintaining their selling price. It's a win for them.
"I honestly don't understand all these quality sacrifices just to make things thinner. It makes the product actually feel somewhat cheap and brittle."
Here's a hypothesis:
For many years, before the iPhone, Apple's computer business was built around keeping its loyal customer base happy. That base would buy new Macs by default, but would stop doing so if the long-term experience of using them went down too far. So Apple focused on long-term quality.
Now, their computer business is built around seducing iPhone users, most of whom are now using Windows computers, into moving to the Mac world. The vast majority of such users base their decisions on little more than what looks best in the showroom. For that showroom seduction to occur, Macs are better off being the hottest-looking computers there. Whether or not their keyboards work for much more than two weeks after purchase is far less important. (14 days being the period of the Apple return policy.)
Another factor: there is little reason for annual or bi-annual replacement of fully functioning computers any more. I'm still happy with my mid-2012 MBP. In the old days, Intel chips got faster according to Moore's law, and new software took advantage of that speed, so you really had to replace hardware at a fairly rapid pace to avoid everything getting slower and slower year by year. But the speed of my 2012 MBP is just fine. The situation is night-and-day different from what it used to be.
So, once Apple achieves that initial showroom seduction, even if the user doesn't end up being particularly happy with their machine, they probably aren't going to buy another one for 5 or so years anyway. So customer dissatisfaction isn't going to be felt for that much time.
Moreover, from what I'm seeing, customer experience in the Windows world isn't consistently better, so word of mouth isn't particularly likely to pull the average user away from the Apple ecosystem and the way everything works together, and away from years of their emails and documents being stored in the Apple cloud, which they'd either lose access to or have to go through the pain of moving elsewhere.
Overall, if making ever-more money is Apple's only real goal, focusing on thinness over all else is arguably a rational choice, as long as thinness makes Macs look sexier in the showroom.
It might be a racket, like the software/hardware market. Intel and OS/Software manufacturers are in cahoots to constantly design apps that require beefier hardware so that people have to constantly upgrade.
The same might go for cell phone manufacturers. They intentionally leave a market for accessories so that their partners can make more money and to ensure support/staying quiet.
I don't really believe that software developers intentionally write bloated code. It's just that most dev shops only fix performance issues when they become apparent on current-gen hardware.
As hardware gets better and better, the bar for "ok, this is too slow, let's go back to the drawing board and optimize it" gets higher and higher.
Skype on my 4-year old phone is laughably slow, but I bet it'd be a lot faster if the devs did not have the latest and greatest in hardware.
Women will buy thinner phones. Fat phone makes you look fat. A delicate slim phone feels better! (kind of joking, but who knows)
Some people, not just women, might think that thinner means more advanced. After all, it's always been somewhat the case in general - remember the old 5kg monitors? Now you have flat LCDs.
I, however, am uncomfortable with thin laptop. I don't feel it's safe to put it in the backpack with other things. After all, what if I have to run? Will it endure the jolting? What if somebody bumps into me from behind in a crowded place? Probably it will be fine, but it is still somewhat unnerving.
But manufacturers want to make stuff that breaks. If it doesn't, then you sell little. It can't be really bad, because then you'll get a bad reputation, but if it's too good then you'll be outcompeted. I think it's possible to make a laptop that most likely will never break and will remain upgradeable for the next 50 years. But it would require cooperation from many hardware manufacturers, because every part would have to be made with quality in mind instead of profit and scalability.
We will not cooperate because we don't really need high quality laptops. What we have is good enough.
Gimmicks are easy to add. Some of the most annoying are touchscreens. Why on earth would you need a touchscreen on a laptop? Touching screens makes you look like an idiot. Remember those silly stock images of high tech where people touch holograms? I never liked them. It's like trying to sell electronics to apes. You don't explain, you show with gestures how you can interact with this thingy.
Actually annoying is not the right word. They make me feel uncomfortably superior. I don't know why.
I wish apple would just make a line of laptops aimed at professionals. They could give us back a real keyboard, give us back USB ports, give us back swappable batteries, and give us back SD card readers (although this could be microsd at this point).
They could even call it "macbook professional" or something like that.
/s
There is just absolutely no reason for these changes. The lack of USB ports on this laptop is something that I absolutely have not gotten used to. While I've been able to make do with the keyboard, I dug my old 2013 MBP out today, and instead of feeling bulky and old, it just felt good. If the screen hadn't become so badly delaminated, I'd probably switch back to it full time and just eat the ~$2300 loss on this thing (or maybe make it a desktop?).
> sorry, i want neither of these in my laptop. i want it to be lightweight, small, solid and durable body without moving parts or too many holes.
The Thinkpad T480s weighs about the same as the current MBP, has more useful ports (I find having HDMI, Ethernet, and older USB handy on a device I use on my work trips), a good keyboard, replaceable hardware and whatnot. It is bigger, but Lenovo makes smaller Thinkpads too. Subjectively, I find it more robust than my MBP at work, but that's not the point.
The point is, the sacrifices Apple makes on their current Macbook Pros are simply design choices, not engineering tradeoffs.
i'm not defending macbooks here unconditionally. of course keyboard is worse than previous generation. i hope apple fixes it in the next. but otherwise, macbooks are hands down best hardware design of a laptop on the market. touchpad alone makes me not even consider any of the lenovo stuff. visual, tactile and functional design matter more to me than having 32g ram. not having more holes is preferable to me than having more holes. abandoning everything for 4 usbc ports is painful at first, but a great decision that will move the industry as a whole forward.
of course this is all subjective, use whatever works best for you.
Ethernet sure, but the rest of that stuff has been completely replaced.
This is what I mean: make a laptop for people who need stuff like SD card readers and USB, and make another for people who don't. The way I see the breakdown is the "pro" or "air" lines.
It's used in micro controllers, drones, some cameras etc. the whole point of my post is that apple should make a line of laptops that caters to people like artists, photographers, engineers, etc. They used to, and it was called the MacBook Pro.
A perfect example of this: I was recently at EDC, a massive music festival in Nevada, and one of our lighting displays had its controller stop working. I had my laptop with me, though, so I installed the control software on it. The comical part was that I had to plug in first a dongle to get USB, and then ANOTHER dongle to get Ethernet (controllers like this talk over Art-Net).
It was just really annoying. It felt like I was using a fashion device for work.
All will, the underlying technology is the same and the 'micro' just refers to the plastic shell; the adapter is just a spacer with electrical contacts.
I was one of those annoying Apple evangelists for years. My 2010 MacBook Pro was my baby and I loved it. Eventually it got to the point where I had to replace it (after 7 years!) and when I went to look around at the new MacBook Pro range, I was so disgusted by all the problems that I ended up getting a Dell XPS 13.
I consider my original MBP and the following models to be some of the best laptops you could buy at the time. If I could buy a 2015 MBP with a modern processor I would do it in a heartbeat.
Meanwhile I wouldn't recommend the current range of Apple laptops to anyone.
Just curious: What's different about the processors in the current MBPs vs those in the 2015s in terms of how it affects your work - or whatever else you're using your MBP for?
Having interfaced with Apple in a different way (apart from owning a new MBP and an iPhone X) I'd say the issue is, and continues to be, growing pains.
They're certainly losing the lead in software engineer talent (giving it away to whom, I'm not sure). But the single most horrible experience I've had with Apple in the last year was trying to get software published on the App Store.
That's a department that's so large that it's become a classic bureaucracy -- middle managers run amok, pettiness, lack of professionalism, etc.
I was trying to publish a piece of software using their less extortionate subscription options and I was told by a young woman manager that I would have to have feature X in order to qualify. I added feature X in a couple days and got back to her. She was surprised (disappointed?) and then told me I'd have to have feature X + feature Y in order to qualify.
I have a couple of contacts from the "good old days" and sent a very angry email. Middle manager woman disappeared, her boss called me and apologized within a few minutes.
If I didn't have that contact I probably wouldn't have gotten past her.
What has happened to Apple that their dev teams work against devs, compile times (this is 2018 remember) are now counted in minutes for small projects, my MBP crashes when plugging in an external monitor, etc?
They've grown too fast. My contact told me that their App Store team is now interfacing with over 2 million developers. How do you grow to accommodate that?
Now, what is the answer to all of this? Fix the marketing driven culture. Thinner isn't better. The number of apps on the app store isn't a meaningful metric. New languages are cool only if you can actually pull them off. Etc.
The App Store and code signing have been fairly bureaucratic to deal with from day one. It's actually improved compared to the initial years, with turnaround times measured in a day or two vs. a week or two. I remember doing tricks such as resubmitting the same binary and having it pass. It really is a dice roll.
The compile-time problem comes from Swift and from promoting it too early, when it was too immature. If you write your code in Objective-C, your fast compile times will come back. Swift compile times have been improving as time goes on.
They also don't pay as well compared to google or facebook. But even then I think it's more a management thing, since they determine the priorities in software dev.
>Swift compile times have been improving as time goes on.
Swift compile times will never be close to ObjC or C compile times, no matter how much engineering goes into it.
This is because of how the language is designed: it requires a type checker which solves an NP-hard constraint-solving problem.
There is a whole bunch of low-hanging fruit in the Swift compiler as far as compile speed goes.
What Swift compiler devs complain about most is how operator overloading causes some sort of O(n^k) or worse check, because things like the '+' operator have dozens of implementations.
If you're really worried about type inference slowing you down, you can write out all of your types as if you didn't have type inference. You could even do it in an automated fashion, as if your source code were some sort of cache.
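For what it's worth, here is a contrived Swift sketch of the kind of expression that has historically stressed the type checker, next to an annotated version of the same thing (the numbers and names are made up purely for illustration):

    // Mixing literal kinds multiplies the overload combinations the
    // constraint solver has to consider; on older Swift compilers an
    // expression like this could take a long time to type-check, or fail
    // with an "expression was too complex" error.
    let slowTotal = 1 + 2.0 + 3 + 4.5 + 6 + 7.25 + 8 + 9.5 + 10 + 11.75

    // Writing the types out (and splitting the expression) prunes the search:
    let a: Double = 1 + 2.0 + 3 + 4.5 + 6
    let b: Double = 7.25 + 8 + 9.5 + 10 + 11.75
    let annotatedTotal: Double = a + b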
> I'd say the issue is, and continues to be, growing pains.
> They're certainly losing the lead in software engineer talent (giving it away to whom, I'm not sure).
I keep on interviewing recent Comp Sci graduates who have a 3.75 or a 4.0 and who can't tell me how to implement cycle detection -- not even to the extent of writing a pseudocode function signature or some kind of concretely implementable design. Many of the same grads try to tell me that a null pointer in a C structure uses no memory, and other nonsense like that. You know what I think? I think the CS grad student population no longer knows those things, so they are producing undergrads who know even less.
> my MBP crashes when plugging in an external monitor, etc?
When my Macbook Pro is "locked" it flashes an image of the desktop screen just before switching to the login prompt.
Those things aren't taught in CS programs, as the majority of companies don't need those skills. Most want application development, such as web or iOS app developers.
However Computer Engineering programs teach those concepts as the focus is on low power micro controllers.
> Those things aren't taught in CS programs, as the majority of companies don't need those skills. Most want application development, such as web or iOS app developers.
This sort of thing is very relevant for someone writing an application or a server process. Programmers who can write their own compiler are much better C and C++ programmers, because they have relevant background. Programmers who understand the low level stuff can write faster code when it's important, and they know better where to look when optimizing. Since when has our "field" become so flubby that we're now eschewing the notion of background knowledge?
Are you telling me that we are churning out Comp Sci grads who couldn't write their own low level libraries or compilers? Tech people should have at least a working knowledge of how their own tools work, to the extent that if civilization fell, they could have a good chance of recreating primitive versions of those things.
> However Computer Engineering programs teach those concepts as the focus is on low power micro controllers.
Also relevant to high performance code. Also relevant to game engines. Also relevant to interfacing servers with legacy code. Also relevant to technology like WASM. Whoever decided to relegate stuff like that to Computer Engineering seems guilty of the same ignorance I see in these recent students.
I'm just going to throw in my experience here as someone who never went to college but currently writes software for a living.
I do mainly frontend web development with EmberJS, and occasionally work on our backend which is also JS, and I've been doing that for a little over 2 years now.
I never went to college and so a lot of the stuff you guys have been talking about in this thread goes right over my head. I've never written a compiler, the last time I wrote any C++ was high school, and I would so easily fail a lot of these interviews if those were the questions being asked. With all of that said, I think I do a good job at what I do without all of that knowledge. The industry is increasingly heading towards web/app dev in a lot of positions as other people mentioned, and I think it's very elitist to judge people for not knowing everything you do, even if you think it's important. The fact that this industry is becoming so open to so many people is amazing. Me being able to find a good job without a college degree just because of my knowledge of computers is what I love about tech. I think mindsets like yours are what help drive people away from it because they think they need a ton of knowledge to get an entry-level job, and that's just simply not true.
I don't want to sound like I'm accusing you of being malicious, I just wanted to share my point of view as someone who is relatively new to the industry and never went to college and doesn't have the knowledge that you are suggesting is very relevant. Maybe it is relevant and I just haven't figured that out yet, but from where I'm sitting that feels like something that could be taught instead of a hard and fast rule for hiring.
> I never went to college and so a lot of the stuff you guys have been talking about in this thread goes right over my head...I think it's very elitist to judge people for not knowing everything you do, even if you think it's important.
Fair enough. However, if someone did go to college, they should at least know what they know, and know what they don't know. If someone is applying to a job with a 3.75 GPA where they might be doing some C++ and they go into an interview and try to tell you that a null pointer takes up no data, they haven't been well served by their education. They should at least know what they don't know, and not waste everybody's time.
However, you should know that these things are important. There are levels of knowledge deeper than being just a user of something.
> I think mindsets like yours are what help drive people away from it because they think they need a ton of knowledge to get an entry-level job, and that's just simply not true.
So a generalist Comp Sci degree just needs to shrink into Web Development because of your feelings? Look, Web Development is a fine job, but it's not the same as a generalist field of knowledge like Comp Sci. Should mechanics expect that a Physics degree only be limited to their knowledge because of their feelings? They're applying Physics, after all. (Warning: don't you go and denigrate mechanics! That would be elitist.)
The very fact that you can have a job in tech without a Comp Sci degree isn't a justification for the dumbing down of Comp Sci. It shows that it happened needlessly!
> Maybe it is relevant and I just haven't figured that out yet, but from where I'm sitting that feels like something that could be taught instead of a hard and fast rule for hiring.
Let's say you discovered an interviewee thought that a 404 meant the request never made it to the server. Let's say they also got a 4.0 GPA at some Web Development coding academy with a great reputation. Wouldn't you at least be scratching your head?
These are things that used to be taught in a Computer Science degree. Now they aren't taught, and companies are going to have to teach new graduate hires this stuff that people used to take multiple semesters to learn? It also used to be that Freshmen in college were expected to know how to conjugate verbs and compose grammatically correct sentences. Now TAs (I used to be one) are expected to teach these things to Freshmen. How is this not a decline in standards?
I think you're arguing that without good compiler authors, a typical application developer would be inherently unproductive. Which is true — compilers are necessary tools, and it's necessary that they be good.
But what I was suggesting is that one does not need to be capable of being both a compiler author and an application developer (if, for the sake of discussion, we avoid any semantic arguments and treat these as generally different things) to be of good value. I don't know how to, say, write a proper lexer, or write any assembly worth any salt at all, but I can write what I consider to be good, reliable application code at a reasonable level of productivity.
>People don't tolerate compiler bugs very well. I'd say there's quite a bit of overlap.
I don't see how your first point is related to the second. The goal for most webapps is to get something that works most of the time. Most software engineers simply don't need to worry about their third nine, much less their fourth
My school's primary languages were C and C++, and a smattering of java because it was just getting popular. Certainly the low level understanding that comes from writing a lot of code in those languages is helpful.
But the imperative/procedural mindset that it drills into you leads to some really terrible application code, and it takes a lot of exposure to higher level languages to break out of that mindset. It took me years. Switching to ruby was like starting from scratch.
By all means hire a C++ programmer to write your web app. They'll be able to debug your performance issues ricky tick. But also be prepared for some heinous procedural js/ruby/php/clojure/elixir/whatever.
> By all means hire a C++ programmer to write your web app.
No one is advocating that anyone write web apps with C++. There are other kinds of servers. The complaint is that the once generalist value of a Comp Sci degree is now dumbed down, and grads are missing background knowledge they once had.
> But also be prepared for some heinous procedural js/ruby/php/clojure/elixir/whatever.
I think you are making a few assumptions that don't hold anymore based on your own CS education (I'm obviously making an assumption there...). I got my CS degree in 98 and there was a strong emphasis on C, C++, algorithms, and systems programming at the time because that's where the jobs were. We only had a cursory overview of other programming languages and paradigms, and no assembly - there were no jobs there. Scripting languages were for unwashed systems administrators, and no real programmer would touch them. Functional programming was a weird little academic thing with no future. OO was "if it is a noun make it a class".
There was a fairly good chance you would end up needing to write your own data structures, algorithms, sockets code and come up with a network protocol. You would run compilers, linkers, etc. Basically systems programming lined up with the job market.
Naturally, after that I thought that was the "proper" way to teach CS. It worked for me. I got jobs doing things I had learned.
20 years later, I literally haven't run a compiler in years. I use libraries for data structures. I don't need to worry about allocating memory, billion dollar industries run on scripting languages. People are passing functions to functions that return functions like that's how it's always been.
I guess my point is "generalist" education needs to evolve with the industry. That means spending less time on low level details so you can spend more time on the tools, techniques and concepts used today. It isn't a "dumbing down" - it is changing the mix. You can only do so much in 4 years. What was "generalist" 20 years ago is "specialist" today, and it should be.
It isn't a "dumbing down" - it is changing the mix.
When it's leaving out background information, it's dumbing down. Programmers should at least know the basics of how indirection works. Why is it that so many interviewees with gold-plated GPAs would tell me null pointers used up no memory? Do they have the foggiest idea what happens when they add a member in a Python/Ruby program and how that differs from adding a pointer to a struct?
There's a difference between having the background information and treating everything as if it's hazy magic. It's excusable for the buyer of a car to treat the product they've bought as a magic black box. It's inexcusable when a "mechanic" or "engineer" is incapable of doing anything but treating things like magic black boxes.
> Scripting languages were for unwashed systems administrators, and no real programmer would touch them.
But all of the smarter people in my program knew two or more of them.
> no real programmer would touch them. Functional programming was a weird little academic thing with no future. OO was "if it is a noun make it a class".
I worked for a company that had to fight against those prejudices and low levels of knowledge to sell licenses. We sold licenses to Fortune 500 companies so they could run billion-dollar businesses on a "scripting language." You know what prepared me for working there? A generalist Comp Sci education!
> 20 years later, I literally haven't run a compiler in years. I use libraries for data structures. I don't need to worry about allocating memory, billion dollar industries run on scripting languages.
But you are a savvy user of those libraries because you have the background knowledge. You don't usually need to worry about allocating memory, but you know what the gotchas in extreme corner cases are. And if you had to have a custom library written in C++ for your dynamic language application, you'd know how to spec that out and hire for that while looking out for the details. I had at least a foggy idea past the buzzword level when I graduated. How about the kids who are graduating nowadays?
> I got my CS degree in 98
In 98 I was in grad school.
> You can only do so much in 4 years. What was "generalist" 20 years ago is "specialist" today, and it should be.
Here is what I see in way too many recent grads with a 3.75 GPA. They don't know any of the background, past a handwavy level. They have misconceptions that are outright wrong. Many of them seem to spend 4-5 years doing nothing but using libraries and gluing stuff together. Hell, we learned that stuff too -- but we learned a bunch of other stuff at the same time, plus we learned what we didn't know and what to do about it. Then again, there was a contingent who only cared about learning X-Windows, because there were lots of coding jobs in X-Windows. Aren't the people who only learn particular technologies that are in the job market the moral equivalent?
Comp Sci is dumbing down to the level of consumers of magic tech. I know engineers and physicists who would have some idea of how to begin to recreate the tech they use if civilization would fall. I think a lot of Comp Sci graduates, if they wound up with nothing but machines running machine code, would qualify for Golgafrinchan Ark Fleet Ship B.
Deeply nested if/else logic, very long functions. Imperative logic that could be better written by higher level constructs like function composition, complicating code with micro-performance hacks, etc. In general inflexible code.
Of course, not all of this is because I learned C first. A lot of it was simply due to being a new programmer. But this kind of code is more prevalent in general in the C world. Just browse some open-source C projects.
I believe a lot of this stems from the "systems programming" mindset that goes along with learning C and C++. The requirements are very precise and well known, and don't change often. There is often a fairly precise "right answer" for how to do something where the "right answer" is some combination of performance metrics. Compilers are like that, file systems are like that, tcp/ip stacks are like that, etc. The programming boundaries tend to be very bright.
The "systems programming" mindset is a liability when writing business apps where a sales person can blow up every assumption and design decision and boundry in one day. The "right answer" is not clear, and not easily measured. The "right answer" has more to do with writing code that is flexible and easy to change. That is hard to measure and a totally different way of thinking.
But what you are citing here isn't a problem with learning C and C++. It's a failing of a generalist education. You might have known enough to avoid the gotchas of concurrency, but just out of school, you didn't know what you didn't know about business application development.
> Since when has our "field" become so flubby that we're now eschewing the notion of background knowledge?
I think the issue is more that the skillset follows the money. If you are a top web developer who can work in adtech/fintech/e-commerce and contract then you will make far more money doing that than you ever would in pure systems engineering. It's not so much that background knowledge is fluff, moreso that their focus is probably a lot more scattered than it used to be, and their background knowledge is perhaps in other places.
>Programmers who can write their own compiler are much better C and C++ programmers, because they have relevant background.
Sure, sure. But how many companies care about that sort of thing? C is rather my best language, and as far as I can tell, that helps me out as a sysadmin, but I need to be good at some ECMAScript-based framework if I want a higher-paying Software Engineering job, at least outside of the embedded space (and I'm not that good at C. Also, most of the embedded types I know don't get paid SWE-level salaries.)
I knew CS graduates who loved parsing theory and could implement some basic C compilers, and I knew some who didn't know what a SCSI controller was. You get out of your education what you put into it.
I had someone in an interview in Berlin ask me to write a garbage collector. In CSC 202, our professor talked about using reference counts. Reference counts can leak if you have two objects referencing each other with no path to either, but what I didn't realize is that Java hasn't used reference counts in a really long time. It does a search from each root (typically a thread) down the object tree; and it breaks things into young/old (eden space and .. something else) so long-lived objects don't get searched as often.
I learned all of that during the interview lunch break when I looked it up on my phone. One of my good friends wrote a compile-time GC for Go and did his PhD dissertation on it, and he probably would have gotten this question right. But if it's not in your field, well, the space of possible problems is pretty fucking large.
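As a small aside, Swift's ARC is plain reference counting, so the cycle leak described above is easy to show in a few lines. This is just an illustrative sketch with made-up class names, not anything from the interview:

    final class Parent {
        var child: Child?
        deinit { print("Parent freed") }
    }

    final class Child {
        // `weak` breaks the reference cycle. Make this a strong reference and
        // neither deinit ever runs: each object keeps the other's count above
        // zero even though nothing else can reach them -- exactly the case a
        // tracing collector that marks from the roots would still reclaim.
        weak var parent: Parent?
        deinit { print("Child freed") }
    }

    func makePair() {
        let parent = Parent()
        let child = Child()
        parent.child = child
        child.parent = parent
    }

    makePair()   // both "freed" lines print, because the back-reference is weak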
Cycle detection? Man I could probably tell you back when I studied minimum spanning trees and wrote this thing to implement Dijkstra's shortest path:
Off the top of my head, I'd hope each node had a unique identifier and I guess I'd mark them/store the keys in a hash table. I'd move breadth first and error out if I discovered the same hash/unique key .. which of course would give me a hash the size of the tree. Unless there's a way to mark the data structure, you now have a second structure the size of your key space.
I'm sure there are better solutions, but I wouldn't expect a senior programmer to know them off the top of their head, unless you're hiring really specifically for a position writing routing algorithms or looking for a senior airport transit architect.
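For what it's worth, the mark-the-visited-nodes idea described above comes out to roughly the following in Swift. This is only a rough sketch with made-up names, assuming a directed graph given as an adjacency map:

    // Returns true if any node in `graph` (node id -> neighbor ids) is part
    // of a cycle. `onPath` holds the ids on the current DFS path; meeting one
    // of them again is a back edge, i.e. a cycle. `finished` marks nodes
    // already proven cycle-free so they aren't walked twice.
    func containsCycle(in graph: [Int: [Int]]) -> Bool {
        var onPath = Set<Int>()
        var finished = Set<Int>()

        func dfs(_ node: Int) -> Bool {
            if onPath.contains(node) { return true }
            if finished.contains(node) { return false }
            onPath.insert(node)
            for next in graph[node] ?? [] {
                if dfs(next) { return true }
            }
            onPath.remove(node)
            finished.insert(node)
            return false
        }

        return graph.keys.contains { dfs($0) }
    }

    // containsCycle(in: [1: [2], 2: [3], 3: [1]])  -> true
    // containsCycle(in: [1: [2], 2: [3], 3: []])   -> false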
If you come up with something else (like tagging nodes), you get a strike for inefficiency.
It looks simple enough to feel smug about once you know it, yet there's near-zero chance the interviewers could have come up with this algorithm independently without prior knowledge.
When people say they expect someone to implement cycle detection, they usually mean Floyd's algorithm.
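For a linked list, Floyd's is the tortoise-and-hare trick: two pointers, one stepping once and one stepping twice; if they ever meet, the list loops. A minimal Go sketch (illustrative types only, not anyone's production code):

    package main

    import "fmt"

    type listNode struct {
        next *listNode
    }

    // hasCycle returns true if following next pointers from head ever loops.
    // Constant extra space: no visited set needed.
    func hasCycle(head *listNode) bool {
        slow, fast := head, head
        for fast != nil && fast.next != nil {
            slow = slow.next
            fast = fast.next.next
            if slow == fast {
                return true
            }
        }
        return false
    }

    func main() {
        a, b, c := &listNode{}, &listNode{}, &listNode{}
        a.next, b.next, c.next = b, c, a // c points back to a
        fmt.Println(hasCycle(a)) // true
    }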
Not necessarily. Sometimes the dataset only has on the order of 10k nodes, and you just want to warn users if they create a cycle, or keep particular routines from going into an infinite loop.
If you come up with something else (like tagging nodes), you get a strike for inefficiency.
A few days ago, I implemented cycle detection in an event notification system where the graph size is relatively small, in just 4 lines of code, which should be immediately understandable by any competent programmer. That you would mandate Floyd's algorithm in all cases gives you a strike for pedantic design without regard to cost/benefit.
I just set a "visited" bit in the data structure while the code walks it. There's a cycle if the bit is already set. It's the same algorithm as yours, but a lot cheaper.
Perhaps cheaper in space, but you will have to do an initial pass over the entire structure to zero your visited bits first, so it may not be cheaper in time.
The GC already needs to sweep through the whole allocated memory to find the unreachable objects to be freed. Flipping a bit while doing this isn't a big deal.
Since the first pass quits when it finds the first marked node, the zeroing pass can traverse the structure, erasing marked bits, and stop at the first 0. The entire structure doesn't have to be visited.
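Concretely, the bit-in-the-node variant being discussed looks something like this (Go sketch; names made up, and as with the hash-set version above, any revisit is treated as a cycle, and the bits need clearing before the structure is walked again):

    package main

    import "fmt"

    type flagNode struct {
        visited bool
        next    []*flagNode
    }

    // hasCycle marks each node as it walks; hitting an already-marked node
    // means the walk came back around. No side table needed.
    func hasCycle(n *flagNode) bool {
        if n == nil {
            return false
        }
        if n.visited {
            return true
        }
        n.visited = true
        for _, child := range n.next {
            if hasCycle(child) {
                return true
            }
        }
        return false
    }

    func main() {
        a, b := &flagNode{}, &flagNode{}
        a.next, b.next = []*flagNode{b}, []*flagNode{a} // two-node cycle
        fmt.Println(hasCycle(a)) // true
    }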
I had someone in an interview in Berlin ask me to write a garbage collector. In CSC 202, our professor talked about using reference counts.
Jeez. I think my profs covered ref counting as an introduction, then also went on to cover mark/sweep and generational collectors. We also covered compaction, copying and bump allocation. I don't think it took that long. If I were running a shop that focused on, say Java, I would want to know if my new hires knew background information relevant to optimizing code running on the JVM.
Cycle detection? Man I could probably tell you back when I studied minimum spanning trees and wrote this thing to implement Dijkstra's shortest path:
Someone should have given you breadth-first search and depth-first search, then run you through how those are components of other algorithms. You should have been left with those as a "toolbox," such that you automatically spend a second thinking, "what would happen to that graph if I did DFS or BFS on it?" That kind of toolbox is powerful and gives you all sorts of useful insights. You are not the only one around here to say, "Man I could probably tell you back when..." What you should be thinking now is that you were not well served by some of your teachers. Fortunately, this sort of thing is easily rectifiable.
Off the top of my head, I'd hope each node had a unique identifier and I guess I'd mark them/store the keys in a hash table.
Well good on you. You just did better than most of my last 6 interviewees.
Unless there's a way to mark the data structure, you now have a second structure the size of your key space.
This is the DFS/BFS part right here. Is the 2nd part of your statement likely to be true, and how often would it be true? Good call on the marking. (EDIT: Just thought of it: Bloom filter.)
I'm sure there are better solutions, but I wouldn't expect a senior program to know them off the top of their head
Cripes! Freshmen used to be able to do this stuff!
unless you're hiring really specifically for a position writing routing algorithms or looking for a senior airport transit architect.
How about you don't want programmers who will end up debugging an infinite loop induced by a data cycle every single time?
EDIT: I actually just wrote cycle detection for my side project in golang the day before yesterday. It took me 4 additional lines of code. If you have a shop that uses gob or some kind of object serialization, this may well be very relevant!
I had someone ask me the cycle detection question once, and I didn't care for how they phrased it. Specifically, should I find it immediately upon entering the first cycle (at a higher memory/time cost), or should I just eventually detect it (e.g. tortoise/hare)? It was on me to clarify, but as a newb to the industry, I felt like I should just know which was expected.
And either way, if I were to ask this question, I would spend a lot of time helping the person along the way to make sure they were able to make the logical leaps that made sense to them, not to me.
Here's how I asked the question. I would present the data for a 2 node cycle and ask what would happen to this routine. Here's a response I would get far too often: A conditional clause detecting a 2 node cycle. Then I would present a 3 node cycle, then ask how to detect an n node cycle, period.
Lots of 3.5+ GPA grads can't make that logical leap!
You're trying really hard to find a connection between implementing a GC and debugging memory leaks... and failing.
The whole freaking point of a GC like say Java's is that an average programmer can use it without having to understand how exactly it's implemented.
Of course it won't hurt to know that, but it's not at all mandatory knowledge.
One just has to know which situations the GC can't cope with and avoid them. For Java there's at least one dedicated open source tool for finding leaks; it nicely explains what one needs to know.
One just has to know which situations the GC can't cope with and avoid them.
Unfortunately, many programmers believe that since Java uses garbage collection, you do not have to think about GC and ownership at all.
Oracle had to replace the fast implementation of substring that just returns a slice of a String (O(1) time) by a copying implementation (O(n)), because too many programmers do not know the basics of ownership/garbage collection and would accidentally hold on to larger strings.
Seeing the implementation details of reference counting, mark-sweep collection, and perhaps a generational collector once, makes you more aware of memory and ownership issues, even if you forget the nitty gritty details later.
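Go happens to have the same gotcha with string slicing, which makes for an easy illustration (sketch only; the Java internals differed, but the ownership problem is the same):

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        big := strings.Repeat("x", 1<<20) // ~1 MB backing array

        // Slicing shares big's backing array, so holding this tiny prefix
        // keeps the whole ~1 MB alive for as long as the prefix is reachable.
        leakyPrefix := big[:8]

        // An explicit copy detaches from the backing array, so big can be
        // collected once nothing else refers to it.
        // (strings.Clone does the same thing in Go 1.18+.)
        safePrefix := string([]byte(big[:8]))

        fmt.Println(len(leakyPrefix), len(safePrefix)) // 8 8
    }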
You're trying really hard to find a connection between implementing a GC and debugging memory leaks... and failing.
I spent years at a vendor for a Virtual Machine. That you would compose such a sentence shows that you are ignorant of some aspects of optimization. You don't even know what you don't know, and projected that ignorance on another.
The whole freaking point of a GC like say Java's is that an average programmer can use it without having to understand how exactly it's implemented.
One of my company's most frequent consulting tasks was helping clients optimize to maximize throughput for the generational GC. That you jumped to the conclusion that I was talking about memory leaks is pretty damning.
I got this just two weeks back, on a fully up-to-date 2015 MBP. I was shocked that this could happen; I could even click around on the desktop for a few seconds before it locked up, and I had to hit Esc for the login screen to appear. I didn't even realise it was locked the first time it happened. A very visible reminder about software quality at Apple.
I think this is because the Mac doesn’t lock at all by default. The actual behavior seems to be “prompt for password after screen saver is interrupted” so that flash is the time it takes the OS to realize the screensaver just turned off and it’s time to ask for the password on the lock screen.
When I leave my Mac I always do a ctrl-shift-eject/power to force the lock immediately.
Saving a screenshot and immediately showing it is a UI trick to make waking up/unlocking look faster; it gives the user something to look at while the apps are actually restoring to a running state. They just didn’t think this one through or test it adequately.
This sounds like the kind of applicant who would complain that fizzbuzz has no real world application. Despite the attitude in that article, there are real applications of cycle detection.
It sounds like the kind of applicant who understands the problem far better than the interviewer. So go on then: once you have detected it, what algorithm would you use to repair it? Wouldn’t you consider its existence to be a bug in the way the list was constructed?
It depends too much on the context to give a general answer what to do with it. It can easily not even be an "error," like in the first example on this link where you're simply testing the strength of a PRNG. Another would be if you're writing something to represent reals in decimal and you want to see where your number loops, like 1/7 = 0.(142857).
Examples where it's not desired, and what to do: Detecting infinite redirects in browsers and stopping the loop. Detecting thread deadlock and terminating the process. Detecting looping references in an Excel spreadsheet and showing #ERROR in the cells instead of letting the process hang forever.
My MacBook does the same. I was messing around earlier this week taking a video of it as I thought you could capture private info from it and while I couldn’t do it from my iPhone with a couple of tries I’m fairly certain you could with a higher quality camera. It’s a ridiculous security vulnerability.
> When my Macbook Pro is "locked" it flashes an image of the desktop screen just before switching to the login prompt.
It's the same thing on Android when you are switching users. How something like this can get past QA is crazy to me. That is certainly not the way it's supposed to work.
Finally someone else who noticed this! I took video and it is indeed realtime. There has to be a way to exploit this obvious oversight. This is with filevault on as well.
Most charitable explanation is that OP is just adding some color to their characterizations. Least charitable explanation is that, subconsciously or not, OP perceives young "females" in positions of authority as being unfit for their positions and likes to add plausibly deniable fuel to that flame.
Duuuuuude my work macbook freezes at least 3 times a week when connecting up, I thought this was just my issue, I feel like we're part of a brotherhood now.
Serious question: Why didn't you mention the genders of the people from "the good old days" but had to say over and over again that you were dealing with a "middle manager woman"?
I'm confused what the point of this question is. Would you have found it more satisfying if the parent had said "I had a couple of [male] contacts from the good ol' days"?
People do it when subconsciously or consciously trying to paint a picture with stereotypes. So if someone wanted to illustrate they were talking about a testosterone filled ego driven 20 something, they might include the term young male manager in their descriptor.
Is it right or wrong? Probably not great. I don't do it. But I get how it happens. Probably not nefarious, just a little tonedeaf.
True -- though when I think about it, I imagine them using a more specific male adjective like "bro" or "dude." "Young male manager" would have triggered a "why are they mentioning male?" and yet "female" slipped under my radar until another comment called it out.
I wonder if that's because you have a stronger stereotype for male than for female in your mind (not to suggest that you use it, just that you have one or are aware of what others might have as their stereotype). I don't have strong generalizations either way, but especially not for the term female, which is why I never understood when people got uppity about it to begin with. Now that I understand some people have a negative stereotype attached to their usage of the word, I understand why people get annoyed. But it is a bummer, as it's all so personal and subjective that no matter how careful or sincere you are, you could tread on someone's toes and convey the wrong message, as I suspect the parent comment has done.
I don't know, but if that's the intended question then it'd make sense to ask that (which someone else already did, so I recommend upvoting if you're interested).
The question whose point I don't get is the one I responded to. It seems absurd to ask "why did you do X but not Y?" when doing Y would've only left you more dissatisfied.
The subtext, whether it's intentional or not, seems to be that the gender of manager that he (yes, assuming OP's gender) spoke to is relevant to the negativity of the experience. The message can be condensed down to "female + new Apple = bad, male + old Apple = good"
What I'm baffled about is why the comparison was made against the contact's gender. How would involving the contact's gender have helped? Or if that's not the question that was intended, then why not ask the actual question that was intended?
I don't know, but what I'm asking is: if your real question is "why mention X's gender?" then why instead ask "why mention X's gender but not mention Y's gender?"?
I think the point is to make the difference more stark; merely saying "why include the manager's gender" doesn't show the inconsistency between calling out the gender in the case of the employer the writer is unhappy with but withholding it in the case of the employee the writer is happy with.
> I think the point is to make the difference more stark; merely saying "why include the manager's gender" doesn't show the inconsistency
But that's the thing -- is the GP's problem really the difference/inconsistency? Is that really what they're trying to highlight? If it is, then I agree, but that's what I was trying to clarify, because it seems more likely to me that making it more "consistent" could've left them unhappier.
I might not have been clear; GP's (GGGP's?) question was rhetorical; their point was to make a point, not to actually suggest a course of action that they preferred.
To put it another way: suppose I ask someone "why did you forget my birthday but not his?" It's not that I want them to forget his, too (though that is one literal interpretation). It's to suggest that the answer isn't merely "I forget birthdays in general."
That would be weird in this situation since "I mention every gender in general" would be quite a bizarre response (and behavior)... and arguably even more fuel for subsequently accusing the speaker of sexism, if that's the intention.
Your comment seems to go against multiple HN guidelines, particularly those about assuming good faith, not posting shallow and dismissive comments, and avoiding ideological wars.
Sexism is one of the biggest tech issues of the past several years. I don’t think asking someone to consider why they used a specific word, even in “good faith”, is a problem.
Gimme a break. Their comment is clearly not shallow, nor dismissive and is raising a legitimate, albeit fairly mundane, point in my opinion. And I’m sure we can all try to hold ourselves to a high enough standard to not initiate an “ideological debate war” over that sort of comment.
It’s not so much the half keys that makes it obnoxious, but having the left and right keys be full height, and not having a gap to orient my fingers in the keys means I’m often pressing “shift” with my middle finger thinking it’s “up”.
I would be fine with the half-height up and down keys if they were between half-height left and right keys. For me it’s the lack of tactile feedback telling me where the arrow group is that sucks big-time.
Put back? They've never had full sized up and down arrow keys on the MacBooks. What I miss are the half-sized left and right keys - the empty space above them made it easy to position my hand over the arrow keys without looking.
I'm with you here — no generation of MB/MBP has ever had full-sized up/down arrow keys [1][2]. I guess perhaps people mean how the left/right arrow keys went from being half-sized to full-sized? I'd like to have my half-sized left/right keys back...
Good point, just took a look at the picture of a 2014 mbp which I had previously. Something about the older up/down arrows was much better, and I can’t put my finger on it (literally, since I do not have the 2014 anymore).
Interesting, I actually like the half arrow key thing, I often use the arrow key with my pinkie, makes it easy to switch between up and down. To each their own I guess.
I'm curious what you use them for? I've always preferred WASD for directions (too many games in my youth!), and always saw the arrow keys as a waste of space, too far away to use regularly.
Classic case of why business managers shouldn't be directing the design of a product.
Business managers see 0.2mm shaved off the thickness of the MacBook Pro as 'justifying' its premium cost over other laptops, while a developer/pro user of the laptop would increase the thickness to gain battery life and maybe mechanical/mesh hybrid keys... I would be more than happy to pay for a pro version of the current MacBook Pro.
EDIT----
Instead of Apple doing crappy warranty repairs to make up for their crappy design, they should give us mechanical/mesh hybrid keys and shoot the person who thought 0.2mm was worth such a shit keyboard....
It can be both, and often is. Honestly, I'd be surprised if Apple's continued push for thinness comes from just a single team. Design pushes for thinness, because hey, it looks pretty damned impressive. Engineering eventually figures out a way to make it work after (I presume) screaming a bit. Which is impressive.
Marketing and the business side love it because it's a concrete metric they can point to in order to justify it to consumers and it doesn't require any specialized knowledge for consumers to understand. Thickness is simple. Everyone can understand it, and some people can even use a ruler to measure it themselves.
And consumers continue to choose to reward them by buying thinner devices. After all, if we can't have flying cars, personal jetpacks, and live like The Jetsons right now, we damned sure can have thin devices as a consolation prize. Consumers would probably buy these devices even if they weren't thinner than the generation they replace, but it all gets tied together. Not to mention the pesky problem that the biggest problems (lack of repairability, keyboard switches that are deathly allergic to dust, etc.) aren't really noticeable until something goes wrong. And then, it's "my MacBook is broken" and not "my Macbook is broken and the ultimate cause is a zealous focus on thinner and thinner devices."
Maybe at some point that was true, but it's not true anymore. Unless you're getting a gaming laptop, pretty much every ultrabook is only mildly thicker than the MacBook Air or Pro, and some of them are thinner.
Very unlikely. Ive is the successor to Jobs on the design front - what he and his team want to build chassis-wise, they're likely to build. He's got a great deal of political capital in the organization because he's been such a hit maker.
My semi-informed understanding of how things work at Apple wrt hardware is:
- Industrial design makes cool concepts, showing what's possible in hardware
- Eng & UX makes cool features, showing what's possible in software
- Product marketing determines mix of features that make for a compelling release
- Marcom figures out how to pitch it
Obviously there's lots of back and forth as a particular concept of what makes a compelling release is refined and roadmapped, and many other teams are involved. That said, typically product marketing is driving what gets released.
As to the importance of product marketing itself, it's one of Apple's greatest strengths, because the vast majority of electronics/computer companies have a hard time figuring out a) what different segments of the market might want, and b) what is the intersection of possible AND useful with technology.
An easy way to determine if someone knows what they're talking about wrt Apple is if they complain about a monolithic 'marketing' bogeyman, because it shows they don't understand the nature of how products are built at Apple, nor how products succeed in the market.
That said, they occasionally miss. The keyboard has been generally well-received (tho a bit polarizing) EXCEPT for the obvious quality disaster. Not sure where the breakdown was there. And the touchbar has been very polarizing among the pro segment - the choice to bundle it with the high-end machines for the developer market segment was a miss likely entirely on product marketing.
Considering it reinforces a commonly held engineering ideology that only engineers understand or are capable of solving a problem or goal, and that every other job function in a corporation is clueless compared to engineering, I’d say the response is fair. Apple is quite successful despite organizing in a way that baffles engineers, and will spend a bit recovering from this, then everyone will forget about it within a few Christmases. That’s not even advocacy, that’s just, like, exactly what will happen. It’s dust in the path of an asteroid at this point, with well-documented prior art. All of us with the keyboard get a bum deal, but events are probably proceeding exactly as engineered, if we are all honest with ourselves.
The correct following step is “what can I learn from that,” including, yes, many pitfalls and questionable decisions to learn, rather than “I know exactly what’s wrong with that success, and it’s business leaders.”
I realize the irony of saying this here, as many are at least amenable to said ideology. It also doesn’t strike me as an intentional color, just subtle thought basis underlying the sentiment.
I don't know. A lot of people are searching and holding out for non-Mac alternatives, because Apple ruined the calculus of "$200/yr premium in exchange for a device engineered to satisfy needs and comforts you didn't even realize you'd love", and replaced it with "No way I'm paying a fortune for a physically painful and unreliable keyboard that's missing one of the critical keys just because some industrial designer wants to feel special."
None whatsoever. Given Apple's very well known proclivity to elevate their design/product teams, it's probably more likely to have come from someone on that side of the fence anyway. But it's pretty popular to deride non-technical people (especially those in management) as know-nothings, so it's an easy reach.
This is absolutely not the way Apple works. "Business" people don't make decisions like this at Apple, they get made by the Industrial Design team, backed by the Product Design (mechanical engineering) team. Business people market the products, price the products, procure all of the components and make sure they are put together in the millions, but they absolutely do not set the design direction of the company.
idk - do consumers really notice if a laptop is 30 grams lighter, or do they notice whether it gets them through a full day of work and won’t hurt the joints in their fingers after a marathon of typing?
Apple makes the best laptops, no doubt, but this laptop sucks, and the direction they are going means it's just going to get worse.
Generally speaking, by the time consumers notice (or not notice, as the case may be), they've already purchased the machine. So it's largely irrelevant.
This is what happens when you do user testing and the users you primarily test are only internal folks.
Apple, next time: take your prototype, throw a handful of fine sand on the keyboard and shake... if it can't be fixed by customers (via canister air or removing/cleaning the keys), then rework.
P.S., I am typing this now on this absolute crap keyboard, with multiple letters sticky and repeating.
I have a 2012 MacBook Pro at home and just recently got the newest one at work. While I don't love the Touch Bar, I don't hate it like most people. I don't use any `esc` heavy apps like vim so that's a pretty big factor. The most annoying thing is honestly the lack of volume buttons.
When I use my old MacBook Pro at home the keys feel slow and spongey. The new ones are fast and crisp. I guess I've been lucky enough to not have any of them fail on me.
I concur, although we seem to be in the minority. I love the feel of the new keyboard and can type faster on it than my old mbp keyboard.
Having said that, mine has developed the issue of sticking keys; for some reason it's the g key that repeats for me. I'm happy they've acknowledged this issue, and I'll be getting mine fixed under warranty.
Does it drive sales? Couldn’t some other stat be used to pad their presentations? Any thickness less than an inch is fine with me. I’m not concerned with whether my laptop fits in an envelope.
I don't know why so many people are under the impression that they do thinness for thinness' sake. They do thinness for lightness' sake.
The lighter your laptop (or tablet, or phone), the more you'll choose to not leave it behind but rather to just lug it around with you, and so the more places you'll have it and the more it'll be there to aid your productivity in random situations. [Also, for phones and tablets specifically, the longer you'll be willing to hold it up to your face to stare at it before putting it away due to the "gorilla arm" feeling.]
Sometimes I actually carry my MacBook around in my backpack when I'm just downtown for a meet-up and have no plans to do any work. I get it out for the same reasons you might pull out an external Bluetooth keyboard for a smartphone—e.g., if you want to type a long response to an email, or need to type a snippet of something that's awful to type on a phone keyboard, like code. Except that this Bluetooth keyboard happens to have its own computer attached to it.
(I don't use it at home, though; at home, I use a Hackintosh with a real keyboard. Which happens to be an Apple Magic Keyboard 2 with the exact same butterfly key-mech in it that the MacBook and rMBP have. But Magic Keyboard 2s aren't getting gummed up left and right, because the butterfly key-mech itself actually works fine when it's given adequate travel height. It makes a huge difference; you wouldn't even think it's the same key-mech!)
> The lighter your laptop (or tablet, or phone), the more you'll choose to not leave it behind but rather to just lug it around with you, and so the more places you'll have it and the more it'll be there to aid your productivity in random situations.
I didn't know how true this was until upgrading from an old Lenovo T400 to a newer Lenovo Carbon X1. I'll regularly carry the X1 a few blocks to a library, park, or coffee shop. The X1 and my work laptop - a Dell Latitude - are about as thick and heavy as the T400 alone, so it's not even a question that I take both to a location when working remotely. Now I need a bag with 2 laptop slots.
> [Also, for phones and tablets specifically, the longer you'll be willing to hold it up to your face to stare at it before putting it away due to the "gorilla arm" feeling.]
Conversely, the Samsung Tab A has enough magnets in the back to stick to a fridge - this remarkably makes it feel heavier than the X1, especially when working on magnetic tables and surfaces. The Tab A causes the "gorilla arm" feeling almost immediately, making it basically worthless to me. To add insult to this injury, when it's stuck to my fridge, the screen refuses to turn on when rotated to landscape. So it can't effectively play full-screen Youtube while stuck to the fridge, negating the last use case I found for it. I think this is due to a magnetic screen-off sensor that all Android devices seem to have on the back, but don't know why this would only turn off landscape and not portrait.
I'm jealous sometimes of my girlfriend's Samsung ultrabook for this same reason. I think it's Samsung's answer to the LG Gram and wow, it's light.
If you want a really light machine, don't you have to ditch the aluminum? My 2017 MacBook Pro gets heavy, esp once you add the charger. I take it everywhere.
They're both similar metrics, and both just vanity now.
There are lower bounds of acceptability and luxury that were hit years ago, with much more important things to work on now like battery life and connectivity. Nobody cares about shaving another few millimeters or grams off the design if it means a frustrating experience overall.
The Magic Keyboard actually doesn't use the same butterfly key mechanism: it's a scissor switch, albeit one that attempts to emulate some of the feel of the butterfly key keyboards (but with more travel).
I think it's just an easy thing for them to market. All other things being equal, people love "thinner and lighter." The problem is things aren't always otherwise equal.
I bought a 12" MacBook when it came out because it seemed like it was a big advance over the previous Mac Air. I also bought the first generation Mac Air. So, obviously, you and I don't have similar criteria.
MacBook Pro users and MacBook users generally don't overlap. It's fine if you want a super-light, super-thin ultramobile, but it makes little sense to compromise the Pro's usability in the name of thinness. I've been a Mac user for the last 7 years, and for the first time I'm getting a non-Mac computer because the MacBook Pro is no longer attractive. It's a shame, because I have invested a significant amount of time and money in my current setup.
Are you sure about that? When I work at my desk I use a dock and some monitors, and all the heavy lifting is on a remote server. I suppose there could be a selection effect for workers whose employers force them to work solely on their laptop, but is that common?
If you look at derefr's comment, he appears to work like I do.
This is NOT a big issue for consumers. Apple is great at keeping customers happy, their 4 year warranty extension for replacing the keyboard shows this.
This IS a business issue. It has a quantifiable effect on profit margins for Apple's MacBook line, and I would look at it as an investor and see if it affects Apple's valuation in the short and long run.
The Note 7 is a fantastic comparison - the complete recall did not have to happen but did because of design choices from Samsung engineers. It could have been a $200 million issue instead of a $5 billion one.
Similarly, these keyboards did not have to be that expensive to repair or be so prone to damage, but being that way has a snowball effect on repair/maintenance/warranty.
It's also a design decision that they're kind of stuck with. They are not going to go back to their original keyboards (unless they release a MacBook Pro Classic, which I would personally be ecstatic about) - and so the increased repair costs cutting into their margins are a challenge for them to solve going forward (especially since the brittleness comes from the thin design).
Not mentioned in this article is how many people also dislike the feel of the keyboard. Add that to risking an out of commission work machine and it’s a big deal for consumers.
Thanks for mentioning this. I much prefer the older style keyboard.
I used to split my work time coding at my desk and at a "comfortable" location (couch, chair, etc), but the thought of using the new MBP keyboard has me working almost exclusively at my desk with external keyboard.
> Not mentioned in this article is how many people also dislike the feel of the keyboard
Do you have any statistics on that? Anecdotally, almost everyone I've spoken to prefers the new keyboard over the old one, ignoring the key stuck issue.
I didn't have a problem with the MacBook Pro 2017 keyboard regarding dust. My major problem is that it is noisy. Using it in a library is a huge hassle. It totally disturbs everyone around me.
Warranty and free repairs do not mean that it doesn't take time and effort to actually get done. Consumers would very much prefer to not have problems than to constantly get them fixed, even if free.
They’d get a lot less cynicism and a lot more kudos if they didn’t outright deny the problem exists until PR is so against them that they have no choice. They did it with the classic iPhone 4 antenna issue, they did it with the battery throttling, they did it with the keyboards. It’s a pattern by now.
I'm not sure how much a longer warranty will help with keeping customers happy if they have to replace the keyboard every year and be out of service for a week or so while it happens. Heck, I was annoyed enough when my iPhone's battery started swelling and I had to spend two hours at the Apple Store to work through the replacement process. If I had to go without it entirely for a week, I'd have been nuclear angry and the fact that the repair was free would not have been too comforting.
The horrible user experience that is the Mac Keyboard is so upsetting to me (granted, I'm a keyboard snob).
The original Apple ][ keyboard was one of the best I'd ever used. The original 1984 Mac keyboard was great. The kb's in the last few years (the Jony Ive era?) have been so unsatisfying. They work, mostly, but are hardly joyful.
I guess when Jony makes his solid chunk of depleted uranium that doesn't actually do anything, he'll be satisfied.
Glad to see him mention the Samsung debacle as well. When my Note 4 finally gave in to the eMMC problem earlier this year, I upgraded to a new-old-stock LG V20 instead of the latest greatness, because it was the last flagship made with a user replaceable battery. Over my Note 4's 3 1/2 year tenure, I replaced the battery annually. Each time, the battery life performance was restored to what I remembered from when the phone was new. I don't know what I'm going to do when the V20 dies, but it'll be a sad day indeed. I just have such a hard time believing that I'm the only one who wants a rugged (plastic or metal), repairable phone (the V20 can be *screwed* apart ... no glue here) with a user-replaceable battery.
"User replaceable" is kind of a stretch given the amount of disassembly/reassembly required to change the battery. I don't think it can be considered user replaceable is if requires something 90%+ of users aren't comfortable doing vs phones where the battery pops in and out.
It would be awesome if Apple would release a MacBook Pro Classic: a 2015 MBP with updated hardware, similar to how the iPod Classic stuck around for a while after the iPod Touch was released.
I picked up my repaired late-2016 15” mbp this morning from warranty repair (replacement). The keyboard feels 100% better. Apple replaced my top case, logic board, and display system for free, out of warranty. I was very concerned about this computer’s longevity; I’d even ordered a Razer Cherry MX keyboard to use when at my desk, to take some workload off Apple’s questionable design decision, but the replacement keyboard feels much better. I want to be clear: my desire is not for repairability, like iFixit wants, but for durability and for Apple to stand behind it. The MacBook Pro is a fabulous computer, and the keyboard and touchbar were mistakes, but I think they have fixed 1 of the 2.
When I first got my new computer back in late 2016, the keys had a distinctively inconsistent “clack” that would sorta come and go, and it felt inconsistent, maybe a subtly different amount of pressure required for different keys or in different parts of the key. It’s hard for me to pinpoint it, but I’d say that the replacement keyboard feels more consistent across the board. I’m all of 12 hours in on it, but I’m very hopeful. If anything, there is definitely an improvement to the finger-feel with this new keyboard.
How long did the repair take? I talked to a guy at the Apple store who said they’d likely do it even if the issue is intermittent, but I don’t love the idea of not having my machine for several weeks.
It took 4 business days from drop off to pick up, but fortunately I got in just before the extended warranty was announced. I imagine now it will take longer, but that’s speculation.
Reasonable speculation, based on increased lead times when they offered cheaper battery replacements on iPhones. Although, few people would wait long to get a broken key fixed!
Dropped my MacBook 12 with an intermittent space bar off at the mall Apple Store on a Friday Afternoon. Got the email saying it was ready for pickup on Tuesday.
I'm not completely clear from the phrasing — Was the entire warranty repair free or only the things listed in the third sentence (free extras after paying for the repair)?
Ha, you got downvoted, but I admit that I sincerely want this computer to be fixed, having invested deeply in the Apple ecosystem. No question there is bias, but my experience also covers Acer and HP computers at work, which have real shortcomings, incomparable to my mbp. For what it’s worth to your snark, I used a Linux system as a pinch hitter while mine was in the shop, and very much enjoyed Linux but for the crappy machine. A comparable i7, SSD, long-battery-life, retina-quality laptop is always going to be $$$, so go with what you know, right? I’ve been tempted by the Surface Book 2, but since it costs about the same for comparable systems, go with what you know!
This is the frustrating thing. PC build quality has been such shit for so long that when Apple stumbles there's no real replacement. I'm frustrated enough with OS X that I'm ready to give Linux an honest try, but the hardware story is so bad that I feel a bit trapped.
My 2015 MBP is affected by an earlier production issue where interactions with the keyboard when opening / closing the lid damages the coating of the display (just lookup “Staingate” to see pics).
Maybe the thinner keys were ironically meant as a fix for this production issue in order to save them money on screen replacements.
I'm picturing one of those immensely obnoxious Apple videos where Ive is talking about some crap about frivolous keyboard superlatives, with some CGI fly-by of a keyboard... uh.. such crap.
I recently was fixing a relative's early gen Macbook Pro which had the same keyboard as the Powerbook. The keyboard felt soooo nice. I really miss that keyboard.
I obviously don't matter, but I was spending several thousand dollars every two years on Apple hardware (laptops, and before that a Mac Pro desktop before it became a small cylinder that I can't upgrade). For one person the total amount I've spent probably approaches 8-10K in the last 10 years or so.
But not since the new MacBook Pro laptops. As a heavy vim user, the touchbar is an immediate dealbreaker. As a lover of mechanical keyboards (cherry MX brown both at home and at work), the keyboard is a dealbreaker. As a taker-of-photos and Lightroom/Photoshop user, the USB-C-only and dongle requirements is the final straw (although probably more tolerable than the other two problems).
There are other people like me (especially on HN). And now the fact that Apple doesn't seem to care about people like myself, and that they're happy to not take my money, really makes me wonder if they've gotten too arrogant for their britches. There really is a sense of Apple waging war against its users (as the HN parody site n-gate likes to say). I feel like I'm being asked to adapt to the above annoyances - and for what, an operating system that makes it a pain to stop processes that auto-update (Adobe), that has ancient versions of GNU utilities, that has numerous competing / overlapping package managers (brew and so forth)? Sorry, no.
Today I have an MS SurfaceBook (not without its problems but tolerable) and a custom build that's lightning-fast thanks to a Ryzen processor and M2 storage (a US$1600 build that Apple would probably sell for $3500). For Unix I use MobaXTerm (Putty/PuttyCM/X/Cygwin/etc. all glued together) and cloud instances [edit: and CentOS VMs for work and personal projects; I don't have enough free time in my life for Linux on the desktop's high maintenance].
I for one am holding off upgrading to see what Apple comes up with for their next Macbook. There are things that I want to upgrade to with the current model, like a brighter screen (for working outside), 802.11ac and a fresh battery, but that's about it. I don't want to risk spending thousands of euros only to gain frustration day in day out.
I wrote about my experience while looking for a personal laptop in 2013: https://ashishb.net/tech/the-weird-state-of-laptop-industry/
Little has changed since then. There is still a considerable gap, and no one is building a developer laptop. What surprises me is the amount of energy Google is pouring into Chromebooks; they could have made an excellent (GCP-integrated?) developer laptop instead. From backend engineers to front-end engineers to mobile developers, it could have become the de facto developer machine - provided the Wi-Fi, battery, and sleep/wake-up handling were as good as a MacBook's.
Google is working on making Chromebooks more developer-friendly. Some of them support running full desktop Linux apps now, but it's still in the experimental stages and I wouldn't recommend buying one for that specific feature until it's more mature.
That said, I've been doing light webdev work on a Chromebook using Crouton (to run desktop Linux alongside ChromeOS, with seamless switching) and aside from difficulties with the MicroSD slot and apt-get on Ubuntu it's been quite nice. Getting solid, first-party Linux desktop app support would make Chromebooks a serious contender in the "cheap dev laptop for light work" space, and I think Google is working toward that.
Obviously, the hardware is going to preclude you from doing any serious heavy lifting, but I'm pretty excited to see what they come up with in another year or two. The battery life on these things is fantastic, plus some of them can also run Android apps.
> Google is working on making Chromebooks more developer-friendly. Some of them support running full desktop Linux apps now, but it's still in the experimental stages and I wouldn't recommend buying one for that specific feature until it's more mature.
Yep, I'm running Arch. Battery is great. On my work laptop (the 6th gen) I easily get >7 hours if I'm not doing things in the browser too much. Yes, I know there are a million factors, and my general use case isn't the same as everyone else's. Let's put it this way: the way I use my previous Mac and my new ThinkPad, the ThinkPad lasts around 1.5x as long (sometimes more).
In terms of WiFi, I haven't noticed it being worse than my Mac, but I also don't think I've ever dug deep enough into this kind of thing on either platform to really say. Every time I've spent any energy looking into WiFi, it's been on Linux.
That title is pure clickbait, but I agree with the article overall: nobody asked for these keyboards. They should have invested in thinner batteries, or relocated some components in the laptops, or made thinner mechanical keyboards (Razer has made good progress on that).
Scott Forstall was fired for less serious issues, but Apple has continued to roll on without firing any other high level executives (directly connected to technology) in the last few years. Someone needs to take responsibility for the overall sad state of the Mac, all the software quality issues on its platforms, and the hardware quality issues.
To outsiders, Apple seems like an opaque and uncaring company, and for good reasons. What Apple needs is to spend time on self-reflection from top to bottom and radical thinking to change "the way things are". Until that happens, I wouldn't expect groundbreaking changes. "Think different" is what it badly needs. The sooner, the better.
Unpopular opinion on HN: I prefer the new keyboard to the old one. Sure it feels a bit weird the first week, but after 8 months typing with it, I can tell you I won’t go back to the old one.
And yes, it breaks too easily. And the touchbar is useless. But I like the keyboard.
I'm with you man. I walked into an Apple Store and tried the keyboard and said 'no go' but later I said I'd give it a try and it is just fine. Macbook for two years now.
I understand this keyboard is brittle, but I absolutely love the feeling of it: I have a hard time typing on my old MacBook Air after using my newer MacBook Pro for more than one year now.
If I could change one thing though, I would get rid of that stupid touch bar....
I have a MacBook 2012 with the same crappy butterfly keyboard that I barely use as I can't stand it.
I wanted to practice soldering and I picked up a TiBook from a junker, and somehow by just stripping it down, and cleaning it up I got it working. So I put it together, and I hate to say it, but for a 2000-2001 design I love it more than any new laptop I have.
The keys have enough travel to feel like a keyboard, the monitor is nice and big, and it feels like such a nice machine. Granted, with only 512MB of RAM and a 500MHz G4 it's not going to be anything useful today, but it's such a damned shame that it feels so ergonomically nice.
I really like the keyboard DESIGN. Initially, I despised the keystroke depth, because my hands hurt after typing on it. But once I stopped typing so hard, I found my speed increased, and I didn't have any pain.
That being said... I still despise the keyboard for how fragile it is. The slightest mote getting under the keys is enough to throw it off, and I've had so many problems with keys that stop working on the machines around the office. And don't get me started on the dang touchbar.
All in all, it's not worth the problems. I'm lucky being in IT, as I can put off "upgrading" my main machine for a while.
While I do like Apple and their integrated hardware/software platform (see my problems with the Dell XPS 15 reported in another comment), this also shows the problems with a monopoly producer of macOS hardware.
I still run a 2009 iMac and a 2013 MBP. The MBP is still fine but I'd prefer to replace the iMac with a desktop with external screen. But there is no hardware worth its price!
The Mac Pro is great, but too expensive for hardware from 2013. The Mac mini has the same problem, but even worse, the existing hardware isn't fast.
This wouldn't happen if multiple companies were in competition to build great Mac hardware.
Not a biggie. They made one design error, but who did not?
And they managed it perfectly - free repairs, nothing to whine about. Just don't eat cookies above the keyboard and don't take it to the beach; treat your tools with respect.
I love iFixit. I used them extensively when I used to run a cell phone repair shop. I don't read them much any more, but they stick to their principles and provide amazing insight.
Since the iPhone 3GS, one of Apple's priorities has been to make their devices less repairable. They have made conscious decisions that actually make the device cost more, such as introducing pentalobe fasteners for their iPhone 4 partway into its manufacturing run.
Now that Windows 10 has a built-in Linux subsystem, what advantage do Macs have any more other than signaling?
I'd gladly pay apple 300 dollars for a copy of OSX I could run on generic PC hardware.
macOS is, for me, largely just easier; for most of the things I want to do, it 'just works' - in essence, I'm paying up front for less effort later - at least that's how I see it anyhow.
Having used the Linux subsystem in Windows, I don't think I could do any serious work in it, outside of say developing a Linux console application; even basic functionality like ping doesn't work (in the version I have). It's great for lightweight testing, and for having a unix shell that you can use unix tools in (sed, awk, etc), but it's not the same thing as a full unix.
While I'm sure that usability of Linux as a desktop is better than it was a decade ago, at a glance it doesn't appear to be better enough to replicate the experience I have in macOS.
In the end, I just hope Apple releases better hardware before I need to upgrade.
The whole analysis loses a lot of its weight if one considers that since around 2010, companies have actively focused on reducing(!) repairability. For the producer it's a desired feature. And the market doesn't seem to care that much. Yes, people complain, but then they still go and buy a new $1k smartphone every year or two. At least in a professional environment they do the same with their $1.5k MacBook Pros. Hell, some people buy a new one because they don't like the stickers on top of their old one anymore.
Indeed. I needed to replace the battery on a 2014 rMBP and I was quoted $350 because that meant replacing the whole top panel: battery, keyboard, and trackpad.
The solution to this, while keeping the same form factor, is to create a MacBook Pro that is waterproof, so that the keyboard can be washed out if any dust has accumulated below the keys.
The real problem is they have no competition, so they can do whatever they want. See the person describing their Dell XPS problems somewhere in this discussion.
All the various stupid decisions and quality problems add up slowly, but I'd hate for my 2012 mbpro to die right now. I might reluctantly go for something from Dell or Lenovo with Linux, but I'm sure I'll get a thousand paper cuts from things I used to take for granted but don't work any more.
In my org, within 1 year, we have found 4 faulty Macbook Pro laptop keyboards (2016 15 inch Model). Cost of replacement - $750 per keyboard. Just ridiculous.
> In the meantime, let’s give some other companies a shot. Dell and HP have gorgeous, reliable, repairable flagship laptops that are getting rave reviews. Right now, I think they’ve done more to earn your business than Apple has.
But when these laptops have an error or break, will their manufacturers extend an extra 4 years to the warranty? Probably not.
And that’s why I keep coming back to Apple hardware. The longevity of it is unparalleled.
The entire Apple universe is bizarre. I've not used an Apple product except for the first gen iPhone but I did love it when I did.
Since then, Apple has been behind the curve on every thing. I want to go back to using Apple since I really like their stance on privacy and security but feature wise they're usually 2-3 years behind a Samsung or a Microsoft. And now, this...
In terms of their computers, there has absolutely been a decline in quality, but in terms of their other product lines (and you specifically mention phones), I don't really think this is correct.
I mean in terms of phones, the iPhone X is pretty much top of the line -- are there other phones that exceed it in a meaningful capacity? Bezel-free design, face recognition, fake-DSLR-like camera effects, etc.
There's plenty of phones on par with the iPhone that blow it away in the cost department. Android hardware has had good screens, good cameras, good specs, and plenty of screen unlock gimmicks (face recognition unlock was introduced ~3 years ago if my memory is correct), and all of that usually well before an iPhone ever did; the iPhone's only compelling feature is that the stock software is kept pretty slim (but even that is slowly slipping away as Apple loses focus).
What Android hardware severely lacks is manufacturers that aren't bound and determined to destroy their products with shitware. It doesn't help that the platform isn't obnoxiously marketed as a social/economic statement, either.
The iPhone’s CPU is literally years ahead of comparable Android devices, with today’s flagship Android devices comparable to an iPhone 6s in terms of performance. One or two other devices may have a camera that matches Apple in terms of a spec here and there, but iPhone is still the gold standard. The fact you call Samsung and others’ garbage face unlocking from 3 years ago gimmicks might be the most accurate thing in your post, but those never worked reliably and in no way compare to Face ID.
And then you finish by noting that most Android OEMs require you to use their shit version of the OS. Well then, real convincing argument when the most critical part of any computer is garbage. This place is insane...
The Pixel 2 camera absolutely destroys the iPhone X. I have both and I've done a lot of side by side comparisons and my results jibe with all the online reviews that pick the Pixel. Apple is way behind here.
And that's on the back camera. On the front facing camera the quality gap is even bigger.
I thought the "face recognition" on Android 3 years ago was 2D and easily fooled with a likeness printed on a piece of paper? Not exactly equivalent, like a lot of "specs" that don't stand up to deeper inspection.
> What Android hardware severely lacks is manufacturers that aren't bound and determined to destroy their products with shitware.
This, plus lack of/severely delayed software updates in the face of critical security bugs.
> and is likely what is contributing to their lackluster sales numbers at that price point.
I think the price tag alone has a significant impact on iPhone X's sales, notwithstanding the fact that the older iPhones (especially the 6s) are holding up really well if you don't want or need a better camera.
> Since then, Apple has been behind the curve on every thing.
There are many examples to the contrary. Apple's mobile chips are still way ahead of the competition. AirPods are the gold standard for that product category. FaceID had a substantial lead on the Android world for about a year.
I don't think that's fair to say. Their hardware is consistently top of the line. They make mistakes like this, but they've been doing it since _forever_ once they get cocky (remember the cube? the puck?). But usually they get back on track after a while.
To me, other than raw specs (like, say, screen density), they're still winning more often than not.
(And I say this as someone who avoids Apple like the plague).
I've got a 2012 MBP which doesn't have much in the specs department, so it never gets used. I use my early 2015 MBP from work for my work and personal stuff, and I check MacRumors regularly, praying that Apple will have come to their senses and made a proper MBP again.
When I leave this job I'll have the choice between using my 2012 MBP or buying a 2015 model off of eBay, it's a sad state of affairs, and every power user that uses a Mac laptop is in the same boat.
The other thing is that they really haven't provided a compelling reason to upgrade. Sure, I would like a faster CPU with more cores. But what I really need is more RAM and they don't offer that. My 2013 MacBook Pro has 16GB, just like the 2018 MacBook Pro. I would love to buy a model just like this with more RAM and an integrated LTE modem. A faster CPU would be gravy. I would even be willing to put up with no longer having a real escape key.
I was given a newish MacBook at work, and the touch bar really is grating IMHO. I haven't encountered build quality issues, but little things like not being able to easily hit Esc in Vim or F5 in Firefox really annoy me. I will probably get a Linux-friendly laptop from Dell for my personal use when my old one finally keels over.
Apple Designed rather than Engineered their downfall. I never liked how engineering served form more than function in things as simple as thin power cords that always fail on me. One exception is the 13" unibody. I almost prefer they did it knowingly for accessory sales because it seems more rational.
What to buy instead? My 2015 MBP has been a wonderful machine to work on, and there's no way I'm getting one of these little gray turds. So, I'll be using Ubuntu for my daily driver, which is fine. But that last generation MBP is so nice, what other hardware compares?
That is just insane. We just can't trust apple et al to do good designs. It saddens me that common sense and goodwill isn't valued to the point that they are actively working against it.
I was absolutely ok with repairability of 2013 pro models. But what they did with new keyboard design is awful. I write this now from my Dell XPS 13, not from my MacBook Pro different models of which I used for > 5 years.
Writing from my 2017 MacBook Pro. Apart from the keys being louder than before and my nagging fear of an errant crumb or cat hair I'm happy, typing is crisp and maybe faster for me than on any other keyboard. I move my hands less. In comparison my regular Mac keys now feel loose and smooshy, almost as if they'd been produced by the Haribo Gummy Bear factory. (Not that this is a horrible feeling. I miss gummy bears; can't they make them out of xylitol or something?)
Have you had issues with OS X? My fiancée's laptop needs to be rebooted every time it goes to sleep, and both hers and my own older (2015) MacBook Pro want to boot from the recovery partition every time. It's a pain in the ass and a terrible degradation of experience.
That sounds awful, but no, I haven't had that kind of problem before. Sounds like you might want to think of reinstalling/upgrading the OS? I've never heard of that issue though, sorry.
maltitol gummy bears do exist. Search for "sugar free gummy bear". Eat in moderation at first - some people have a very intense bowel reaction to them.
I always carry a mechanical keyboard (wasd, without keypad) and a mouse alongside my notebook. I'm always amazed by the people who can work on notebook keyboards and use a trackpad.
Am I the only person on the planet that likes the shallow keys? Although it took a little getting used to, I can definitely type faster on the newer keyboard compared to the older one.
Considering how many times, according to the media, Apple has been working on its own downfall, I wonder if they actually meant "upfall" and my entire life has been a lie.
MacBooks are a joke. If you ignore the CPU upgrades, how exactly are the new MacBook Pros any better than the 2015 one? I would argue they are actually worse in many regards.
I bought my 2017 MacBook Pro from an Apple Store in Tokyo back in March. Would it still be worth buying Apple Care given the new warranty I'll have on the keyboard?
I'm still using a late 2013 MBPr; this machine is a tank. It has taken so many beatings, and literally nothing has broken in 5 years; it runs as if it were its first day of use.
- Apple as a company does not have a single point of failure.
- Arguably the iPhone could be called that; if they stopped selling MacBooks next year it still wouldn’t be their “downfall” (not that I like this state of affairs as I like MacBooks).
I don’t really hate the butterfly keys either. I’ve been using it for over a year now on the pro, and I don’t know if it’s the second iteration, but it hasn’t failed me yet. I will agree that the first iteration was awful, my “new MacBook” keyboard broke because I made the mistake of taking it to the beach.
Apple losing its image of quality engineering will affect their entire product range, with consumers less willing to pay a premium and second-guessing engineering achievements, keeping the side effects in mind.
If the next iPhone is 20 grams lighter but I can't make phone calls after 6 months, will I buy one over an equally powerful Android that's $300 less?
I think the story of Coach (now Tapestry, ticker TPR) should be the ghost story they tell at every Apple corporate retreat. What do they say, "years to build and seconds to destroy"?
What happened to Coach? I seem to have missed that news, and from what I saw in NYC they are currently moving into extremely expensive real estate so I thought they were doing rather well?
It's generally believed that they diluted their brand by overexpansion into outlet channels. From 2012-2015 their stock got cut by more than half ($80 down to $30) while the broader market was up ~25%.
I would summarize what happened as they stopped being "cool". The (subjectively) not so attractive Coach bags covered with "C" branding were multiple hundreds of dollars and a luxury good. When everyone was wearing one and you could pick up a cheap last season one at the outlet mall, the cachet was gone.
The value of the brand is when you can charge more for something of identical quality, or that people ask for your product by name. Apple's brand is (should be?) golden to them and they should treasure it. It's not based on cosmetics but quality, too.
tl;dr - Coach is still around and recovering and making billions of dollars.
Thanks for the explanation. I think I only came across the Coach brand after they expanded aggressively internationally, which I presume came after the "cheapening" of the brand and is no doubt a big part of their turnaround success.
The article mentions it at the end, by "downfall" they don't mean Apple is going to fold. They mean that Apple making such a difficult to repair laptop is now going to cost them a ton of money because the "keyboard" repair requires replacing the entire top half of the machine due to the design. The article argues that if the design had been easier to repair they would be able to do this otherwise minor repair much more cheaply.
I think that's a fair claim, even if the title is a bit overblown. Apple is paying the price for what is probably too big of a compromise in the design of the laptops.
Actually, if Apple were to stop selling MacBooks next year, it could be their downfall.
How do you think the iPhone & iPad ecosystem is sustained? Through developers, most of whom are using MacBooks.
They'd have to port Xcode to Windows, which means they'd lose the lock-in they have now. And if the iPhone flops for 2-3 years, the cost of jumping ship isn't that high anymore.
From what I can tell, the newer iteration of the keyboard fails just as often as the old one. My sample size isn't huge (mid-high single digits on both iterations), but the result is consistent.
It's great that the keyboard hasn't failed you, but you have to know there are other people with the opposite experience. When a product fails in a systematic way, it's up to the company to remedy it (Galaxy S?, Toyota, J&J ...). One could argue Apple could have done more here.
Hum, I've been ticked off since they cancelled the 17" MBP, and I'm more than a little annoyed that I lost the physical Esc and function keys (yes, I used them - in Emacs).
It's bad enough that I'm now spending a non-trivial amount of time on non-Apple laptops, but so many apps have no acceptable alternative on Linux, sadly (I buy the Linux version whenever I can).
> In the meantime, let’s give some other companies a shot. Dell and HP have gorgeous, reliable, repairable flagship laptops that are getting rave reviews. Right now, I think they’ve done more to earn your business than Apple has.
...that don’t run macOS. Not a chance. Do hardware people not realize that running windows or linux (chromebooks excepted) is a usability or malware nightmare (or both)?
Mine is still going strong. Second Apple product I ever bought (first was an iPod many, many years back), and since then I've been a bit of a convert! The battery only lasts about 2-3 hours now, so I replaced it with a new model last year. It did have issues with one of the keys, but it seems to have come right. I much prefer the 4 USB-C ports over the array of various ports on the 2012 rMBP, especially because I can plug the charger into either side, but I do miss magsafe a little.
Overly dramatic; Apple is doing fine. But they are definitely behind in appeasing the pro user segment with decent hardware and not doing nearly enough to fix that. I replaced my 2012 model (essentially the last model that had Steve Jobs' fingers all over it) late last year with the 15" model. So far so good with the keyboard but not looking forward to having to cash in my overpriced apple care and lose it for a week or more while Apple does whatever it needs to do to fix a problem that shouldn't need fixing.
What I want and what Apple sold me are miles apart at this point and they came very close to losing my business. Technically, I could be up and running using Linux and not lose a lot of time having to switch tools. Essentially everything I care about should just work. I do backend stuff. In the end I chickened out and had to replace my old laptop in a hurry and went for an easy upgrade to not lose time dealing with a different OS, configuring shit that shouldn't need configuring, shopping around, etc. That's what it means to be a pro, I just don't have time for that stuff. I don't even mind the steep price. I'm totally OK paying 4K for a laptop that I use every day for 3-5 years. Not a problem. Earns itself back within weeks. I bill more per week.
But I do want quality and performance. I'm getting neither with the latest model. It's fine but nothing special. Mind you, I have the tricked-out version with the extra GPU memory, faster CPU, etc. But it's 16GB, just like the one it replaced. It has a slightly faster CPU, but still 4 cores, just like the one it replaced. I guess the SSD is slightly faster, but I don't really notice much difference. The retina screen is nice, but I would have preferred a non-glossy one. I clocked my builds before and after and they're about 30% faster on the new machine. That's appalling for a state-of-the-art laptop replacing one that was half a decade old.
On top of that the keyboard absolutely sucks. It's loud, flimsy, lacks keys I need, the arrow keys are tiny, etc. A terrible job of form-over-function design. The touch bar is a nice gimmick, but I'm not getting a lot of value out of it. Definitely not worth losing the function keys over; I don't think I would miss it.
If I were to shop right now and had more time to do so, I'd be looking to get one of those nice new CPUs with 6 or more cores (yes, I actually use them), 32 GB of RAM, or even 64 GB. Decent cooling so the laptop doesn't throttle itself every time I need some performance. A keyboard that doesn't suck. And less dependence on dongles would also be awesome, though not super critical for me. If that means a 0.5 cm thicker laptop: great. Not a problem.
On a positive note, I know several people who have abandoned Apple. They seem quite happy with their Windows/Linux laptops, but they seem to be replacing them fairly quickly as well. My observation from the issues they're having is that competitors still have some catching up to do. You get issues with trackpads on Linux, webcams in weird positions (Dell), build quality issues, etc.
All that means in my book that all Apple needs to do is slightly up their game on the quality front and they'll be fine.
Did you read the rest of the article? It's very well explained, and makes some really good points. Of course they're biased towards one factor above all else: repairability. It's right there in the name of the company! But calling this a troll post isn't justified.
> In the meantime, let’s give some other companies a shot. Dell and HP have gorgeous, reliable, repairable flagship laptops that are getting rave reviews. Right now, I think they’ve done more to earn your business than Apple has.
The laptop mentioned in the "rave review" link is an elitebook 840 g5. I have been using the 840 g1 as my primary laptop for the last 4 years and it really is a sturdy device. The hardware is flawless, I haven't had a single issue with it. However, do yourself a favor and wipe it clean the moment you buy it, and install a fresh copy of windows. HP's added software is literally poison.
I'd have absolutely no problem using PC hardware. I'd love to be able to pick my components and build a better desktop for half of the price - in fact I already have such a machine for gaming purposes. However unless I can put macOS on it that hardware is useless to me.
As a web developer I want a unix environment and I'm not sure if Windows' subsystem for Linux is going to cut it.
However I also use a lot of audio software like Ableton Live as well as countless VST/AU plugins, which renders any native Linux flavor useless.
macOS is the only option and until that changes I'll continue to begrudgingly wait for Apple's infrequent and late desktop hardware updates that feel unnecessarily expensive.
I agree that Linux doesn't cut it for audio, but that situation seems to be improving - I was surprised to see Bitwig, Renoise & Pianoteq all have Linux versions, and the JUCE SDK that many VST plugins use also supports Linux.
And with macOS deprecating OpenGL, which many VST plugins currently rely on for their interface, I'm concerned about the future of music on the Mac & how long I'll be able to use those older VSTs.
I can confirm that Windows' WSL and bash setup is moving along very quickly. I moved to Windows about a month back and have only one use case for which I need to boot Ubuntu: Docker with symlinked folders.
Apart from that use case, I've missed nothing. There's some hacking I'm attempting to do to get VS Code to use the underlying bash environment for its code linting tools, but that's more a preparation for the inevitable moment when some linting tool or package depends purely on a Linux/Unix environment.
Twoish years ago I moved from Linux full time to Windows at home and whatever the Apple OS is called at work.
The Apple one was a pain. Command-line tools are almost but not quite right (zcat in particular just doesn't work), and I found the user experience pretty annoying at first, but passable after that. I still don't really understand how "installing software" is supposed to go, and there was a lot of struggle against Homebrew and other software packaging.
Windows had WSL when I got my laptop. Real Linux command-line, nice. Filesystem performance was crap, and they didn't have postgres at first, but it was probably a better dev environment than my Mac, with one exception: no Docker in Windows 10 home. Explorer started getting a bit slow and unstable after a while, but I was on preview releases.
A couple of months ago I quit my job, and yesterday I formatted my home laptop to install Linux again (decided to put my hobby project onto a Docker-heavy workflow) and oh my God it's lovely. Installing packages is fast again, software compatibility goes without saying, and Unity was surprisingly pleasant (though I did drop it for i3 because I'm a masochist.) Hard to say how much of that experience is "Linux is nice" and how much is "I'm a Linux person" though.
Linux doesn't cut it for me after I've tried a BSD. It isn't as bad as Windows, but BSDs have a certain kind of elegance. And because the things you can do on them are limited (browsing and coding) and there aren't as many packages you are much more focused.
The issue with Apple's unix environment is that it comes with BSD tools. A lot of functionality that you'd expect to work doesn't end up working, which can be frustrating. You have to go through some pretty annoying steps [1], which are more involved than the steps required to get a Linux environment on Windows [2].
The fact that MacOS/Darwin has BSD utilities instead of GNU is a common gotcha. Not a showstopper, though: `brew install coreutils` will get you the tools you're used to!
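For anyone hitting this for the first time, a minimal sketch of that workaround (assuming Homebrew is already installed; the extra formulas and the file.txt example are just illustrative). The GNU versions get a `g` prefix unless you put the `gnubin` directory on your PATH:

    # GNU userland alongside the built-in BSD tools
    brew install coreutils gnu-sed findutils

    # GNU tools are exposed with a "g" prefix by default
    gls --group-directories-first
    gsed -i 's/foo/bar/' file.txt    # BSD sed would need: sed -i '' 's/foo/bar/'

    # optional: put the unprefixed GNU names first on your PATH
    export PATH="$(brew --prefix)/opt/coreutils/libexec/gnubin:$PATH"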
Yeah, updates are why I really dislike Windows. The rest I can live with. I prefer Ubuntu + i3 or Mac OS X, but if I have to use Windows, it's mostly doable apart from the updates. Be it updates to the OS or to Visual Studio (which is the reason I use Windows: Xamarin), there's random breakage, it takes far too long, and it happens far too often.
Well... about 3 months ago my Windows 10 machine got stuck updating. Basically it just kept endlessly installing the same update, consuming a huge amount of resources.
At first I figured I'd just let it do its thing, so I waited and waited and kept it running overnight. After a few days I noticed it was the exact same update it kept installing over and over.
I searched online, and many people were having exactly the same problem. At that time there wasn't a solution yet. So in the end, I just gave up...
- I'm a developer btw, so I have at least some technical knowledge :)
(can't remember the exact KBxxxxx number unfortunately.)
Have you explored the multitouch gestures for all this? I work on a 40" 4K monitor and I can't live without OSX's window management exactly for that reason. Windows wants you to fullscreen everything, while OSX has nice little touches like making the windows subtly magnetic or letting you resize symmetrically by holding alt.
Not to mention OSX is the only OS where you can drag the file icon out of a window's title bar to do stuff with it (e.g. upload a file you have open), or right-click the title to see the folder structure it resides in.
All these little affordances exactly where they need to be, and invisible when you don't need them. In my experience OSX is designed with a degree of consistency unheard of elsewhere, and that's why it gets accused of being unusable: people keep looking for the crutches you need on other platforms instead of just interacting directly.
That's actually quite subjective. I've had it on the desktop on and off for decades and it's constantly been a source of unhappiness. Linux had been the solid rock for me, and MacOS was good for a while.
The things I mentioned are what constitute the superior UX that you get with Windows.
Also, the majority of programmers and the vast majority of desktop users are on Windows. So, outside of the tiny HN/SV bubble the market disagrees with you.
Sure, where pre-installed apps are "adware" and debug logging is "spyware". In which case you should never use Android, iOS, a Mac, Ubuntu or any website.
My work Dell does not hold a candle to my MBP in build quality or reliability. Good thing we bought the Dell service whatever-it's-called that lets us send back broken ones to be replaced.
An off-center trackpad is not an issue; it's just a matter of getting used to it. Buttons are a great addition. I don't understand why having buttons is an issue. You can still tap if you want to, but with buttons, things like dragging become much easier.
I'd like a laptop with Cherry MX keys. Also, in the future a screen won't be needed if we use VR, so some kind of dock for motion controllers could be a thing.
I wonder if most Apple users, and the ones they tested the keyboard with, simply don't use the machines very much. Their core demographic update their social media profiles and sit in coffee shops, but how many of them write software? The people here are certainly important for Apple's business, but they were overlooked when Apple was designing a machine for appearance rather than functionality.
Have three of these Macs; apart from the occasional stuck key, which releases when you tap the back of the Mac, they've been solid.
I couldn't go back to a normal keyboard on a laptop now. The reduced travel on the butterfly keyboards takes a while to get used to, but it's amazing; once you're used to it, you just don't want to mash keys through more than 1 mm of travel to type a letter anymore.
1. https://discussions.apple.com/thread/8189417