MacBook Pro 14-inch and MacBook Pro 16-inch (apple.com)
1657 points by 0xedb on Oct 18, 2021 | 2059 comments





HDMI, SD card, and MagSafe: things people on the internet, including but not limited to HN, said would never come back because the future is USB-C.

Now I just want to know if the new keyboard has more key travel, back to the likes of the 2015 MacBook Pro.

In case anyone wants to know the thickness difference.

MacBook Pro 13" 2015 - 1.8 cm

MacBook Pro 13" 2016 - 1.49 / 1.55 cm

MacBook Pro 14" / 16" 2021 - 1.55 / 1.66 cm

So basically even the new 16" is still thinner than the 2015-era MacBook Pro, which I think the vast majority of people were happy with.

Edit: Both 14" and 16" have 254 PPI, up from ~220. Apple tends to stick with the same PPI for a very long time, so this is interesting. 3456-by-2234 and 3024-by-1964 are roughly a 14:9 ratio, so somewhere in between the old 16:10 and the 3:2 that is the current trend for Lenovo and the Surface Laptop.


> Both 14" and 16" have 254 PPI, up from ~220

Which also means they finally support more screen real estate at native 2x! Since 2016 the 13" MacBooks retained their 2560x1600 panels but shipped by default in a "looks like 1440x900" mode, which renders a 2880x1800 frame and does a non-integer scale down to the panel's native resolution, trading sharpness for extra space.

I hate the slight fuzzy look so I run mine at native 2x (e.g. "looks like 1280x800"), but it makes things pretty cramped.

These new screens can run native 2x at the "looks like 1440x900" res (or rather, "looks like 1512x982"). Very welcome improvement.
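
For the curious, here is the arithmetic in code form. A minimal sketch (Python) using the panel and mode numbers from this thread; it illustrates the scheme, not Apple's actual pipeline:

  # macOS "looks like" scaled modes: render at 2x the logical size,
  # then resample the frame to the panel's native resolution.
  panel = (2560, 1600)          # 13" MBP panel, native
  looks_like = (1440, 900)      # default scaled mode since 2016
  render = (looks_like[0] * 2, looks_like[1] * 2)  # 2880x1800 backing store
  print(render, panel[0] / render[0])  # scale ~0.889: non-integer, so slightly soft

  # New 14" panel: 3024x1964. "Looks like 1512x982" at 2x is exactly
  # 3024x1964, i.e. native 2x -- no resampling step at all.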


I have always been utterly baffled by Apple’s approach to high DPI. They ship some of the best panels around, and then they kneecap them with the most idiotic scaling scheme imaginable: one that guarantees it’s worse than a low-resolution panel for tasks that want precision, and uglier on things like single-pixel lines (which crop up a lot in e.g. web content), because pixel precision is impossible. Microsoft managed to do it right (though mixed DPI took a couple of attempts), despite having no real power over developers. Meanwhile, Apple, who are in a position to sternly tell their developers to do it right or their software will break (and do so far more frequently than Microsoft), … caved and did it terribly? And do they even support real mixed DPI, or do they just fake it by rendering at the higher scale and downsampling to the other?


This doesn't gel with my experience at all. Yes, past MacBooks didn't necessarily use integer scaling (though with this new display, the default setting - and what most people will use - will be integer scaling, as far as I understand), and non-integer scaling is not ideal. But on my last two MacBooks at least, the screen is visually plenty sharp, and looks way, way better than the regular-DPI external monitors I still have (whether the external monitor is connected to the Mac or a Windows PC). And above all, my experience with the Mac is that high DPI has actually worked from the beginning, whereas on Windows I still have some apps that break if the scaling is not set to 100%!

Even right at the start, old apps would look a bit blurry but would still at least be in the correct proportions on macOS, whereas with the Windows scaling a lot of stuff just wouldn't work!


I have no direct personal experience on high-DPI macOS. For Windows, I’ve only come across a couple of apps that claimed to support high DPI but severely mangled it, and they both got fixed before long; and a few others that had minor problems, e.g. opening windows at unscaled sizes so that you had to resize them, or rendering small icons. And to be sure there are various DPI-unaware apps, probably far more than on macOS (linked to why Apple was in the perfect position to implement and impose proper fractional scaling), but that’s not a serious problem either.


That's the issue with Windows (and Linux): high-DPI displays are still rare (I'd say <10%), hence a lot of developers don't support them well in their applications. On the Mac, however, high DPI is the default. All the native macOS apps I have used work fine.


Microsoft's DPI scaling is terrible. So much stuff is ugly and/or broken. (My favorite example is DPI scaling bugs that used to exist in the DPI control panel itself: https://i.imgur.com/zi80IhG.png) Mixed DPI is a huge pain because windows change size when you drag them around. WPF was supposed to fix all this, way back when that was the future. Apple's approach is much simpler and generally works better though it certainly has its downsides too.


Your example is from Windows 8, released in 2012?


Yeah it's my favorite due to the irony. But DPI scaling bugs assuredly continue to exist both in Windows and in applications. On macOS such issues are much more rare.


On most of my machines I switch to the integer scale mode, but realistically, the effect on sharpness is not too far from antialiasing in practice. I am not sure how they do compositing of media on macOS though. Theoretically they can render them on a distinct layer that corresponds to the output display pixels and directly composite that layer to avoid an additional resampling.

Windows on the other hand, does the theoretically correct thing but messes up the ecosystem (unsupported apps, multiple monitors)


They composite everything at the nearest higher integer scale, with a correspondingly large framebuffer (just take a screenshot of the entire screen and check the dimensions). Then, during scan-out, it is scaled down by the output encoder.

This method does not need the GPU at all and also saves memory bandwidth; there is no need to save the scaled-down framebuffer and read it out again.
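
The screenshot check is easy to script. A small sketch assuming macOS, the stock screencapture tool, and the Pillow library:

  # Grab a full-screen screenshot and print its pixel dimensions.
  # In a scaled mode they match the integer-scaled framebuffer,
  # not the panel's native resolution.
  import subprocess
  from PIL import Image  # pip install Pillow

  subprocess.run(["screencapture", "-x", "/tmp/shot.png"], check=True)
  print(Image.open("/tmp/shot.png").size)
  # e.g. (2880, 1800) in "looks like 1440x900" mode,
  # although the 13" panel itself is only 2560x1600.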


Mixed DPI still doesn't work reliably for me on Windows. Lately mine has been confused with dragging windows between monitors, they'll get stuck on the border when dragging even though I can move my cursor across fine when not dragging anything.

It could be unrelated to the mixed DPI scaling, but that's what nearly all my display issues on Windows have been related to so that's my assumption.


Windows occasionally "stick" a bit when dragging between monitors even with the same scaling on both screens. There are a bunch of keyboard shortcuts that are very reliable though. Windows Key + Shift + Left/Right instantly moves windows between displays; Up maximizes; Down un-maximizes and then minimizes. Windows Key + Left/Right (so without Shift) also docks windows to the left or right of the same screen, and this feature is what makes windows also "stick" when dragging them from one screen to another.

I also recommend you check your display settings. If your monitors aren't the same vertical size, then your overall desktop isn't a rectangle, it'll be some other polygon, and that means there are corners between the screens that your mouse can get stuck in.


Personally, I couldn't care less if the UI elements themselves are fuzzy when using non-integer scaling. I love high-DPI screens mainly for reading text, and as far as I can tell non-integer scaling doesn't seem to make it fuzzy? Either way, it's way less "fuzzy" than standard-DPI displays where you can actually make out pixels very easily.


IME non-integer scaling on a Mac does make text fuzzy.

Doubling the DPI of the display makes the fuzziness less noticeable of course, but someone commented on this site that the fuzziness bothers him even on a retina display, so he distinctly prefers Windows's text rendering over Mac text rendering on the same hiDPI monitor.

As noted in other comments on this page, many Windows apps are blurry when the OS is set to a scaling factor other than 1.0, but Chrome, Firefox, the new Edge browser, Emacs, VSCode and most of the apps (e.g. Settings) that ship with Windows do not have that problem.

None of the apps except VSCode that I have opened on my Linux install over the previous 3 months are fuzzy when the OS is set to a non-integer scaling factor either, though in that case I have had to learn to adapt to certain minor bugs in Chrome and Emacs.


> which renders a 2880x1800 frame and does a non-integer scale down to the panel's native resolution, trading sharpness for extra space.

This isn't how it really works. It doesn't render 2880x1800 frame and scale down, it will just render at the resolution given for the size of the widgets desired. Any modern GPU can do that, the only time down scaling is needed is if you have a higher resolution bitmap that you want rendered at a lower resolution (or you can scale up, anyways, modern applications should not be rendering fixed sized bitmaps to ever need to do that).

Edit: never mind, that is how it actually works. I never realized MacOS never got true resolution independence like WPF.


No, he's right: it renders to a 2x resolution off screen with hidpi assets (2x resolution icons, double-size text, etc.) and then downsamples back to your screen resolution - 2560x1600 for 13-inch, 2880x1800 for 15-inch, 3072x1920 for 16-inch MBPs.


macOS does in fact do this kind of non-integer bitmap scaling. It's got nothing to do with what the GPU is capable of; Apple's UI framework can't do layout or text rendering at anything but "1x", "2x", or "3x" densities.

This is also why the first Catalyst apps looked really blurry and bad on non-Retina displays; Apple applied a non-integer scale to the final rendered frames to work around UIKit not being able to draw the right size widgets.


Gnome actually always renders a full framebuffer at an integer scale and then downscales on the GPU with bilinear scaling.

4K output, 1.5× scale means 6K framebuffer scaled to 4K.

That's sadly how it works at least on Gnome, and Gnome says they've copied it 1:1 from MacOS.


To my knowledge, Windows is the only OS which renders the widgets at fractional scales and doesn't scale the output, but it only does this for apps which implement DPI awareness correctly. Some apps advertise to the OS that they support DPI awareness, but then render controls at the wrong size. Others are GDI-based and don't support DPI awareness at all; those are rendered at 1x and scaled fractionally, which looks like a blurry mess.
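
Roughly, the arithmetic a DPI-aware app does per monitor looks like this. A simplified sketch (Python, using 96 dpi as the Windows 100% baseline); real apps do this via the Win32 DPI APIs:

  # Per-monitor-aware apps query each monitor's DPI and scale their own
  # layout metrics; unaware GDI apps get rendered at 96 dpi and
  # bitmap-stretched instead, hence the blur.
  def scaled(px_at_96dpi, monitor_dpi):
      return round(px_at_96dpi * monitor_dpi / 96)

  print(scaled(13, 120))  # 13px at 125% (120 dpi) -> 16px, drawn sharp
  print(scaled(13, 144))  # 13px at 150% (144 dpi) -> 20px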


Just made a comment about this recently, but a few years ago Microsoft added an enhanced GDI scaling mode. It applies the scaling during the GDI calls so it can render elements sharply. Obviously this won't magically make low-res images look better, and it can't do anything about UI elements not rendered through GDI, but it's much better than just scaling the whole window up after it's rendered.


AKA the "Apple/Android/CSS was right" mode.

Personally I think they should abandon/maintenance-mode the "manually scale the pixels to fit the screen" mode and make the "virtual pixels" mode capable of supporting DPI-aware apps.


DPI awareness is generally supposed to be the domain of the lower levels of your GUI library, so that just about everything else above that level just works, and only a very few pieces in a very few apps need to concern themselves with scaling matters. The “System (enhanced)” scaling mode in question here is really just about doing that: patching GDI (the most common lowest level) to be as DPI-aware as possible if the app doesn’t have something else taking care of it.

Nothing about Windows’ approach to DPI awareness needs to be abandoned. Your app code isn’t supposed to be DPI-aware, you’re just supposed to use a GUI library that is.


Android and KDE use this method as well, which is very awesome as it provides the best performance and the best quality.


Interestingly, early versions of Mac OS X supported this as a hidden feature. It never really worked properly.


It has never worked properly on windows either, but that hasn't prevented microsoft from shipping it. In fact, for a long time they didn't support per-monitor DPI awareness in the OS, so people with mixed DPI multi monitor setups had applications render at the wrong size or blurry when they moved them across. This was fixed in one of the windows 10 feature updates, and after that it was a patient wait for software to implement support for it. I believe MS Office 2019 was the version where it finally got correct support for it.


When the MMO Star Wars: The Old Republic launched, I tried the beta and kept having issues where it would launch to a black screen, wait, and then crash.

Someone, somewhere, figured out that the game only runs properly if you have the text scaling at 100%; I had it at 150% because I was playing on a TV and the screen was unreadable at 100%.

I have to wonder what Microsoft did with text scaling that it broke a DirectX game.


Wow, I suddenly understand why some of my Windows apps' UIs look so blurry.


Apple and GNOME do this to avoid rounding errors when working in scalable layouts defined as screen-width percentages. Web browsers have accumulated all sorts of tricks and hacks to ensure that, say, floating four 25%-width columns always adds up to the same width as a 100% column. This isn't a particular failing of CSS or the web, though CSS floats make it enough of a problem that browsers actually have to correct for it when users scale pages.

Exposing a 1x design to non-integer scale factors fundamentally runs the risk of breaking it - you will get odd pixel gaps or misalignments which would not occur at integer scales, because the virtual pixel coordinates no longer line up with actual screen pixels.
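
A toy illustration of why only integer scales are safe here (Python; hypothetical numbers, the point is where the edges land):

  # Column edges at logical x = 0, 250, 500, 750, 1000, mapped to
  # device pixels at an integer and a fractional scale factor.
  edges = [0, 250, 500, 750, 1000]
  for scale in (2.0, 1.25):
      print(scale, [e * scale for e in edges])
  # 2.0  -> [0, 500, 1000, 1500, 2000]: every edge lands on a device pixel
  # 1.25 -> [0, 312.5, 625.0, 937.5, 1250.0]: two edges fall between
  #         pixels and must be snapped (misaligned) or split (blurry)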


Qt has done that for many years, so has Android, so has Windows.

Only macOS and Gnome don't, because they care more about avoiding single-pixel alignment errors than about the massive performance issues and reduced sharpness their solutions cause.


And Qt after all those years does not work correctly in HiDPI. Not even on Windows.

Unlike macOS and Gnome, which do.


For downvoters - exhibit A: https://imgur.com/a/Ad0ZkmX

Problems like this are common, it is not app specific. If you call this a superior solution, then you haven't seen a really working one.


Looks like that application explicitly uses (device) pixel sizes. That issue is common, but still application specific.


In that case, it is still the problem of the approach that Qt has chosen.

With macOS or Gnome, even apps from the '90s that were never updated will have the proper scale. They might be blurry (macOS upscales using nearest neighbor so it is pixelated; Gnome uses bilinear, so it is blurry instead), but they will be the correct size.

The Qt approach works only for apps that do the special Qt dance.


Gnome does NOT work correctly in HiDPI.

I’ve got 3 systems with 1.5x scale, and Qt handles them perfectly fine (even with per-monitor separate scaling).

While Gnome absolutely does not. Either I get absolutely unreadable tiny 1x, or I get absolutely unusable 2x with no space for anything.

(That said, Qt on Windows handles per-screen scale factors very weirdly, especially with applications using pixel values)


The principal difference between how Mutter/Gnome and Plasma/Qt handle hidpi is that for Mutter/Gnome it works for old apps without any modifications, while the Plasma/Qt approach works only for apps that were modified to use the new APIs.

Yes, the old app on Mutter will be upscaled, but it will be the right scale on each monitor in a mixed-DPI environment. Yes, even that old WordPerfect from 1998. While on Plasma, non-Qt apps will have a hard time.

Also, you won't be able to do everything perfectly fine at 150%. How are you going to do a 1-pixel-wide line? Exact cursor positions? You are not going to. With hw downscaling, yes, you pay the price of a larger framebuffer and you get good approximations, but you also avoid a great deal of corner cases that most of the software users run will never solve.


> Also, you won't be able to do everything perfectly fine at 150%. How are you going to do a 1-pixel-wide line? Exact cursor positions? You are not going to. With hw downscaling, yes, you pay the price of a larger framebuffer and you get good approximations, but you also avoid a great deal of corner cases that most of the software users run will never solve.

I care more about the performance loss of rendering at excessive resolution than about the 1-pixel imperfections.

> that for Mutter/Gnome it works for old apps without any modifications, while the Plasma/Qt approach works only for apps that were modified to use the new APIs.

You could do a hybrid approach: render old apps at 1x/2x and down/upscale them, and let updated apps render at the native resolution.

At least render fonts at native resolution and only up/downscale the rest of the widgets. Scaling fonts can NOT be done by post-processing or it WILL be wrong. (MacOS only scales widgets, images and fonts will always be rendered at native resolution)


> MacOS only scales widgets, images and fonts will always be rendered at native resolution

MacOS scales the entire framebuffer. You can verify it by taking a screenshot and examining it.


On my screen, 1x and 2x are both inadequate, but I'm using Gnome on Wayland at 1.25 scaling right now. Gnome can do fractional scaling. You just need to configure it with a gsettings command:

https://www.linuxuprising.com/2019/04/how-to-enable-hidpi-fr...

https://wiki.archlinux.org/title/HiDPI#Fractional_scaling

Or using gnome-tweaks. Then, in display settings, the new scaling configs should show up.

And, in Gnome it's very sharp, with zero blurring! It was somehow sharper than whatever KDE was doing when I tested it some months ago.


Eh, no. Gnome doesn’t actually do fractional scaling – as said earlier in the thread, it’ll render at the next integer resolution and scale down in GPU.

Which I don’t really have the performance for (I’m already using Budgie’s option to do exactly that under X11 right now already, it’s not exactly the most performant solution).

And even under Wayland, Gnome still renders at the next integer scale.

While KDE under X11 actually renders at the correct resolution, with proper scaling and perfect sharpness, which I really miss.


That is working as intended, and either you don't have a graphics card that is good enough to drive your monitor, or you're using CPU-rendered apps that will always be slow at higher resolutions. This is a problem with a lot of cheap hardware and older software. It worked for Apple because IIRC they've been porting all the low level drawing over to Metal. GNOME is a lot slower to do that unfortunately, you won't see everything being GPU accelerated until GTK4 becomes widely adopted.


The Metal (Apple) or OpenGL/Vulkan (Gtk) renderers are not that important for apps that were fine with software rendering for years. The final composition of the framebuffer is done in hardware anyway (in most cases, unless you have really old hardware with non-working OpenGL 3 or so).

The downscaling, when done properly, is done by the output encoder, not the GPU. It should be transparent for any graphics card that came with a DP or HDMI port. Maybe except the original RPi.


In my experience it becomes an issue if you do something like maximize the window on a 4K screen, GTK3 has noticeable slowdown there whereas GTK4 is faster.


That’s not what I, as a mostly android/web/qt dev, want though. Why should I get 50% less performance just to fix some tiny single-pixel issues on a screen where I can’t even see individual pixels?


I'm not sure how to answer that question, you don't have to do anything you don't want to do. GTK and its apps currently use integer coordinates, so in order to change that it needs to be changed to use floating point, and all the apps need to be changed to use it too. This is an API break and another thing that will take a long time to plumb through the whole stack and I doubt it will happen until at least GTK5. Maybe use KDE or Windows until then? Or get different hardware?

Edit: comment from a GTK developer here that goes more into detail: https://gitlab.gnome.org/GNOME/mutter/-/issues/478#note_7939...

I guess if you want to know why should things be this way, you could say it's because of technical limitations? Though that may not be a satisfying answer to you.


GTK2 used float coordinates and supported fractional rendering.

GTK3 breaking that is a bug, not a feature.

Especially text should NEVER be scaled after rendering.

And re: performance, that’s great that you’ve got hardware that can run 3D in-browser shader sites or games at 6K resolution at 60fps just fine; mine can’t. If it were running at 4K, as it’s supposed to, I’d get significantly more fps. Especially at a time when GPUs are rare and Gnome still decides I should give up 50% of my performance just so they save some work.


I think you might be thinking of Cairo when used with one of the print backends, not GTK. GTK2 did not use float coordinates or support fractional rendering. See for example here:

https://developer-old.gnome.org/gtk2/2.24/GtkWidget.html#gtk...

https://developer-old.gnome.org/gtk2/2.24/GtkWidget.html#gtk...

Those would need to be floats if you wanted to have a non-integer scale. Changing everything to use float coordinates is a large undertaking that touches the whole stack and will require a lot of work. It still hasn't been done yet as of GTK4. If you'd like to spend your time helping out then please do so. Please refer to the gitlab comment above for a description of some of the technical issues that you would have to solve. But it may be more cost effective for you to just get some new hardware with a better GPU, or use something else.


It is not 50% less performance; on hardware from the last 10+ years, it should not be noticeable, neither in performance nor in energy. If anything, it should be more performant and less buggy, because instead of complicating the software, you solve the problem with dedicated circuitry that is already part of the output encoder anyway.


Not noticeable is interesting.

What hardware from the past 10 years are you using that rendering 3D games in browser or shader websites at 6K vs 4K resolution makes no difference?

I’m already struggling to get enough fps at 4K, with Gnome rendering it at 6K and downscaling to 4K my performance is even worse. That’s using an RX 5700XT, a GPU that in the current absurd market retails over $1000.

And as Gnome ensured that neither GTK nor Wayland expose any non-integer scale factors, or support it, browsers on linux have no way to render at native resolutions and handle scaling themselves, which would improve my performance situation significantly.


If you have games, run them fullscreen. Fullscreen surfaces will get removed from composited desktop, scaled independently and can run at arbitrary resolutions.


How do I run https://www.shadertoy.com/ in native fullscreen?

How do I actually multitask while doing that?

At 1.5x even my Ryzen 9 3900X and RX 5700XT are stuttering to show many shadertoy examples at 4K, where the performance at 1x is more than acceptable.

Simply: Gnome (and Wayland) is broken, and this API must be changed.

EDIT:

For https://www.shadertoy.com/view/ss3SD8 the difference on my laptop for example is 1280x720 at 1.4fps (1.25x scale) or 850x450 at 17fps (1.0x scale).

That’s a massive difference, and enough to be the difference between usable and unusable.


"Gnome (and Wayland) is broken, and this API must be changed."

Again I would say no, that's wrong, it's just not currently meant to run on your hardware. If you know how to change this API then please help. AFAIK there is no one working on this currently. To solve the problem, it really needs someone who is committed to getting it working on the type of hardware that you have and who can champion that use case. If that's not you then you'll have to wait (indefinitely) until that person shows up, if they do at all.

Also, shadertoy is a really bad test case as those are more like demos, they aren't optimized and are just made to show how much you can do by computing things on the fly in a shader. A real program you'd see in production would make better use of GPU memory and wouldn't hammer the GPU cores so hard.


What’s so complicated about just copying Windows, Android or Qt 1:1?

I’ve got all my time already full with jobs, projects, and other stuff I’m working on, if I’d find time, I’d love to help extending wayland, deprecating GTK3 and GTK4, and modifying the software accordingly.

I’ve made custom patches for the same purpose in the past already, but I just don’t have the time.

Especially since with X11 it was at least somewhat possible to do it, but now with wayland it’s sadly enforced in the protocol to only use 8-bit sRGB and integer scale factors.

If I ever find the person responsible for that decision...


It's not hard to copy an API specification, but actually implementing the API and changing drawing in every app, toolkit and compositor to support floating point is what is going to be complicated and will take time.

Edit: The decision to use integer coordinates I believe was made long ago in GTK1 because that's what X11 used, and it just hasn't been changed since.

"Especially since with X11 it was at least somewhat possible to do it"

X11 never actually supported this and also only uses integers for screen and input coordinates so I'm not sure what you're referring to. I think you are thinking of the app itself doing its own DPI scaling which will also work in wayland, but it will have the same problems where the DPI could mismatch with other apps and with the rest of the desktop.

"I’d love to help extending wayland, deprecating GTK3 and GTK4"

Well GTK3 is already in maintenance mode, and apps are currently in the process of being ported to GTK4, so first they'll probably want to finish that and get some feedback before starting on GTK5... it will probably be at least a few years before anything related to this could ship.


GTK2 had an integer DPI variable. GTK3 now just has an integer scale variable.

GTK2 could actually render at 144dpi, or 108dpi, just fine. Everything was slightly misaligned, not everything scaled perfectly, but it worked.

X11 had an integer Xft.dpi variable. Wayland now just has an integer scale variable.

WHY was this downgrade made? If a protocol break could be made to remove this functionality, a protocol break to add it back should be just as justified.

Break everything, apparently removing functionality was a good enough reason to do it, re-adding it should be just as well.

Yes, I am extremely angry after having spent years yelling at the wayland-wg and gnome devs not to make these decisions, not to put these things into the protocol, and complaining that if an API break is necessary, it should be extensible for these use cases. I’ve been complaining for a decade now and all of the complaints got ignored and instead the mistakes I warned about were made.


That DPI value AFAIK only changed the text scale, not the scale of widgets. You are thinking of two different things, the GTK3 integer scale actually affects widgets. The Xft.dpi setting also originally only meant text scale but has apparently been overloaded to apply to widgets in Qt? I'm not sure, I haven't tried this recently and I don't use X11. But it's right there in the name: Xft is the X freetype font rendering.

I understand your frustration but from what I have seen, nothing has actually been removed. This is just a feature that was never implemented because nobody signed up to implement it. Wayland could be extended to support it eventually but somebody actually has to put in the work. I think the best bet for somebody working on this would be to get it fully working and stable in KDE first, since Qt apparently has the toolkit support for it, and then maybe it can be adapted to work for GTK.


> GTK2 could actually render at 144dpi, or 108dpi, just fine.

Your definition of "just fine" must differ from everyone else's. Most GTK2 apps shipped assets for 96 dpi and that's it. Just try running GIMP on a 192 dpi display, with "properly set dpi", for example. You will see for yourself that it was not just fine.

> WHY was this downgrade made?

Because it didn't work. That's why.

> If a protocol break could be made to remove this functionality

It wasn't a protocol break. It was apps either ignoring the facilities (in the better case) or being outright broken -> the desktop locking on constant values, where everyone was able to test their wares.

> protocol break to add it back should be just as justified.

Not going to happen.

> Break everything, apparently removing functionality was a good enough reason to do it, re-adding it should be just as well.

You can start. Show 'em.


> and changing drawing in every app

Exactly. Nobody is going to update everything that's already out there. For many things, there's not even source code, even if there were the will. And there's unbelievable dragging of the feet in the Linux community, so even that will is a big question.

These compositors have to play with the cards they were dealt, not the cards they wish they had.


Why was it possible to break the world to remove these things when switching from Gtk2 to Gtk3, but now it’s not possible to re-add them? Why was it possible to remove it when switching from X11 to wayland, but now it’s not possible to re-add it?

I’ve complained about this since before Gtk3 and Wayland were stable, constantly warning against making these decisions, and now that the bad decisions have been made, doing another protocol break to fix them is apparently out of the question?


Sadly I think you will have to accept some of those apps will just never have Windows-style DPI scaling, sorry. It's the same thing with various older apps on Windows that will just not get updated because they use Win32 API functions that are hardcoded to use integers. Maybe you could come up with some clever solution that tricks them into rendering at a different size? I'm really not sure, if you are an expert in this then please help.

Android had it from the start and developers were aware of it.

Windows and Qt were retrofitted and it shows. Both are buggy and unreliable.


> "Gnome (and Wayland) is broken, and this API must be changed."

> Again I would say no, that's wrong, it's just not currently meant to run on your hardware.

It's absurd that not running on common hardware is, somehow, not broken.


We can point out corner cases for both approaches till the cows come home.

Yes, both of them have advantages and disadvantages; engineering is about picking the right compromise. So yes, there will be cases where you would be better off with the other approach... for a while.

But the approach is not being picked just for today and for the current state of tech. For the same reason, you won't lock yourself into unnecessarily complicated software that is going to be permanently buggy and soon obsolete -- just like you would not appreciate being locked into Psion/Symbian-like memory management today, despite it making sense, solving a problem and being more effective years ago.

This is exactly the same case. That software is going to stay here for decades... your RX 5700XT is not.


I mean, it works, and the end result is a fractional scale. But, yeah, it has some overhead.

When I tried here some months ago, KDE with fractional scaling wasn't as sharp as Gnome's upscale-then-downscale (I'm not really sure why. Downscaling should involve some blurring). I really wanted to use KDE, but text rendering on KDE's fractional scaling appears to be blurry.


> Gnome doesn’t actually do fractional scaling

Gnome actually does fractional scaling. It doesn't do fractional rendering. Two different things; you can do one without the other.

As I wrote elsewhere, you cannot get perfect sharpness at fractional resolutions. How are you going to render a 1px-wide line exactly?


> As I wrote elsewhere, you cannot get perfect sharpness at fractional resolutions. How are you going to render a 1px-wide line exactly?

With downscaling, I get a blurry mess. With fractional rendering, I get a line that may be a tiny bit too wide or too thin, and may be one pixel off, but it’s going to be perfectly sharp and clear.

At least render fonts at native resolution and only up/downscale the rest of the widgets. Scaling fonts can NOT be done by post-processing or it WILL be wrong.


> At least render fonts at native resolution and only up/downscale the rest of the widgets. Scaling fonts can NOT be done by post-processing or it WILL be wrong.

They are fine, but you just cannot downscale to arbitrary sizes. Note that macOS doesn't do 125%, for example, because that's one of the worst cases - you have 5 pixels to do the job of 8.
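
The arithmetic behind that worst case, as a sketch (following the render-at-2x-then-downscale scheme described upthread):

  # At a 125% mode the compositor renders at 2x and the output stage
  # resamples by 1.25 / 2 = 0.625: every 8 rendered pixels get
  # squeezed into 5 device pixels.
  for looks_like_scale in (1.25, 1.5):
      resample = looks_like_scale / 2.0
      print(looks_like_scale, resample, 8 * resample)
  # 125% -> 0.625: 8 px -> 5 px (the "5 pixels to do the job of 8")
  # 150% -> 0.75:  8 px -> 6 px, a somewhat friendlier ratio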


If you do that, you throw out all the hinting and scaling code the font authors may have added.

Many fonts intentionally change weight slightly at small sizes; by scaling afterwards you break this functionality.

Also, by scaling afterwards in the compositor in sRGB space, you create issues with brightness, as the compositor (at least under Gnome) does not take gamma into account.

The amount of tradeoffs is extreme, compared to a few UI widgets getting slightly shifted around.

And I’ve already mentioned the performance issue in multiple other places in this thread.


Hinting is not really used nowadays; it made sense on low-res displays, but not in HiDPI. You are better off with the autohinter now.

Wrt weights, fonts are not defined in pixels, so adjusting for this is the easy part.

Yes, ignoring gamma is a problem, and I'm not sure whether anyone in Gnome/FreeType/HarfBuzz is working on this; probably not. But this is a problem for low DPI and integer-scaled hidpi too, not just for fractional scales.

Widgets slightly shifted also mean your mouse is going to be shifted, and quite possibly in a different direction. Now, that's going to be a problem that users can clearly reproduce.


>Gnome actually always

No, not always: when XFree86 and XWayland are out of the picture (i.e., when the app is talking directly to the Wayland protocol) there is no downscaling or upscaling at all.

So, for example on Fedora 34 (my daily driver) if you say

gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"

, then a new line (named "Scale") appears in the "Displays" pane of the Gnome "Settings" app. If you use that new line to set the scaling factor to 125% or 150% or 175%, then start google-chrome-stable or google-chrome-unstable with the flags -enable-features=UseOzonePlatform and -ozone-platform=wayland, then navigate to the page containing this comment, the "Y" in the upper left corner is a little blurry -- enough so that most people with non-hiDPI displays should be able to tell -- but everything else on the page, including the white box around the white "Y" and emphatically all the text is rendered directly at the correct resolution.

This is because on a modern OS, the fonts are stored as resolution-independent mathematical descriptions of curves (splines, to be exact) that can be rendered cleanly at any resolution. And rounded corners specified by browser CSS or in GTK 3 or GTK 4 (the versions of GTK capable of communicating directly in the Wayland protocol) are specified in a resolution-independent way. And if the aforementioned white "Y" were a SVG file instead of a PNG file, it wouldn't be blurry either (but don't bother changing it, dang, it is fine the way it is).

And all the Gnome apps, e.g., the aforementioned Settings app, e.g., Gnome Terminal, on my install are non-blurry at fractional scaling factors.


Agree- I have been largely disappointed by USB-C. It's been okay as a smaller USB-A replacement for flash drives and mice, but in general the lesson of USB-C is: just because it fits, doesn't mean it works.

My personal favorite annoyance is how chargers and cables are advertised with the wattage they support, but it's really the voltage and current that matters.

If you have a device that wants 20W as 9V/2.22A, your 30W charger may not support that specific combination and will charge much slower than a 20W charger that does.

Edit: Yes, I went to middle school and know that power is voltage times current. My point is: having an equal or higher wattage USB-C charger is not sufficient.


It's time to tell the somewhat unpopular story again (because USB-C supporters hate it).

Trying to help a lawyer out and trying to explain why the $5.00 USB-C cable he'd bought from Amazon wasn't delivering 4K video to his expensive monitor AND powering his laptop too.

Me: OK: so it's a USB-C cable, but it's not a high-data-rate USB-C cable.

Him: But, it's a USB-C cable.

Me: But no, not all USB-C cables are high-speed cables. And some of them can't do high speed and power delivery.

Him: But... it's a USB-C cable: it plugs into the port.

Me: Um... just because it plugs in doesn't mean it's going to work. You can have USB-C cables that are actually slower than the old USB ports.

Him: But... shouldn't it just work?

And so on. For... 15 more minutes? Maybe 30? I finally got him to buy a "proper" Belkin USB-C cable, which was bought from a company that should remain anonymous, but let's just say that a "refurbished" cable was shipped, which, surprise, surprise, didn't work, for ANYTHING. This basically sums up everything that is wrong with Tech thinking vs User Thinking.


Don't we have this problem with regular electrical cables, except instead of not working you blow a circuit breaker or just start an electrical fire? There is very little stopping me from going to a wall socket, hooking a 20-amp welder up to a 15-amp plug, and cranking it so the breaker trips, or worse.

edit: I wrote this but then realized that countries other than the US exist and handle electricity differently; my example only applies to the US


I have a USB-C monitor, so I just plug my macbook in with a single cable and I get power and video. How people aren't loving this I don't know. I use a display port dongle for two additional screens to make a triple display and I still have 2 USB-C ports available. I plug my keyboard/mouse into the monitor.

If you're not loving USB-C, you're just not doing it right.


It took me buying three different USB-C to HDMI cables to get my Surface hub to play nicely with my two monitors. Nothing in this stack was less than 6 months old.

That’s why I don’t love it. Love the promise, hate the result.


Have you noticed any increase in battery degradation from keeping it plugged in all the time? I like the idea of that setup, but that's one worry I've had.


What was the price premium you paid to get that single-cable experience? $1000? $2000? Between the hub and monitor?


My employer paid for everything. Zero dollars and I get to keep everything outside of the macbook when or if I decide to leave.


I love the idea of it but I don’t yet have a monitor that will do this for me.


> This basically sums up everything that is wrong with Tech thinking vs User Thinking.

This is a pretend dichotomy. It's not "people who think about tech vs people who think about people." It's "some parts are cheaper than others because they don't do as much."


Maybe a road analogy would have helped him

If you need to send 100 cars down a strip of road every 10 seconds, you can't use just any road. You'd need one with lots of lanes.

But that said, I'm not defending USB-C. It's true, I've had to do more research on cables than ever before in my life since it started becoming the new """standard"""


> This basically sums up everything that is wrong with Tech thinking vs User Thinking.

Are the cables actually accurately describing themselves? I'm more concerned that products are obfuscating what corners they cut than that a lawyer should "know" the nuances of the USB-C industrial complex.


My laptop has two USB C ports. You can charge it through one of them but not the other one. My other laptop has one USB C port. You can't charge it through that one. I have a USB cable for my old phone. You can only charge it slowly through that cable.

USB C is about as useful as an IDC10.


I've been pretty impressed with USB-C. I've managed to add a GPU to a NUC and train/infer with TensorFlow, no problems. Plug a computer into a monitor and power and keyboard/mouse at the same time, while also using the same cable to pass-through charge two Lightning devices...

I find it much harder to break a USB-C port than a micro-USB port on all the phones I've owned.


I've had headphones die because the connector came loose after around 500 insertion cycles.

Several companies in China make a "mag safe"-style connector for micro usb that is really nice. Some of them only provide power/ground. Others provide the full 4-wire spec.

I have a pair of Sony Headphones that annoyingly shut off the wireless when you charge them. So I didn't really care if it had the full four wire spec.

There's some out there for USB-C, but I wouldn't trust them for high-speed data transfer.


I grew up using a Walkman; I have decades of experience with 3.5mm headphone jacks breaking. Interestingly, it seems like nearly all devices with 3.5mm jacks are engineered a lot tougher now, because I haven't been able to break a jack in years.


Wisdom accumulates!


There's a dead comment under here which points out that 3.5mm is three connections around a large-ish cylinder, while USB-C is 24 connections in a tiny space.

It's easy to make tough, lasting connections when you have space for decent-sized leaves of sprung metal.

It's incredibly hard to make tough, lasting connections when each connection is on a 0.2mm pitch that is almost too small to see.

Bottom line - USB-C is an optimistically designed standard. It's almost impossible to make a super-affordable USB-C connector which will stay reliable for thousands of insertions. Especially when it's also used for up to 100W of power.

Plastics like Stanyl are supposed to be good for 10k cycles, but it's always tempting to cut corners and use something cheaper. And the physical design tends to collect dust and compact it inside the plug and the socket - which doesn't help.


> There's a dead comment under here which points out that 3.5mm is three connections around a large-ish cylinder, while USB-C is 24 connections in a tiny space.

That's not it really. It's the connector and solder strength.

The problem is that when you buy a piece of electronics with a micro-usb port, you don't know which connector type the product designer used, or if the solder joints are good, etc. Sometimes you won't know for maybe 100 or 200 insertion cycles, which usually places the device outside a manufacturer's warranty.

The solder joints on a right-angled jack experience a torque on insertion. That torque puts strain on the solder joints. Do it enough times and those solder joints will crack from the cycling. Or worse, the connector just rips the copper up off the fiberglass layer, since the strength of the copper's bond to the fiberglass is much, much less than the strength of the bond between solder and copper.

Here's an example of an SMD mount only micro female usb connector: https://tinkersphere.com/electronic-components/1817-female-m...

The only thing holding it to the circuit board itself are tabs which need to be soldered to copper pads on the board. (My failed headphones were of this type.)

Though it seems like most USB-C devices use through-hole legs for mounting (I haven't seen a surface-mount-only device yet). The solder flows through the copper-plated hole, bonding with the copper in the hole, the copper on the opposite side, and the leg itself, increasing the strength of the part.

https://www.mouser.com/Connectors/USB-Connectors/_/N-88hmf?P...


I have magnetic connectors for USB-C in both varieties. The circular one I normally use just has two pins and is great because I put my phone down near the cable and it usually leaps across and connects. The multipin flat connector supports PD and data, I'm not sure exactly what its limits are but it does what I want.

Both were cheap ebay purchases made because I'm willing to plug a $10 cable into a $600 phone despite the early USB-C cables occasionally frying devices.


All headphones and earphones I’ve tried will cut off everything else while they are being charged. Including Bose and Apple.

Yes, this is a PITA. If you know of a maker of headphones and earphones that doesn’t do this, please let me know.


I had (and lost, unfortunately) a pair of Sennheiser MM400 headphones... The charging jack was on the removable battery pack and you could charge it while using it (no effect at all on headphone operation).


> All headphones and earphones I’ve tried will cut off everything else while they are being charged. Including Bose and Apple.

This has not been my experience with my AirPods Max. They work over Bluetooth just fine while being charged, unless they are completely depleted (0% battery).

For reference, this is on firmware version 4A400 but this has been the case since I received them (months ago).


I’ve used my B&W PX over Bluetooth while charging over USB-C.

They also support playback input over USB-C or 3.5mm. It’s nice. Expensive, though. But not Apple Airpods Max expensive, at least!


I also have the B&W PX and can confirm this.

I don't love these headphones though. They clamp on my head way too hard for extended use.


Turtle Beach can charge while using. And play two inputs at once. The biggest downsides are how ugly they look and lack of NC.


Adding another one, my Sennheiser PXC 550 (both original and II) will charge and do bluetooth or wired audio.


I bought a used set of Bose QC (Quiet Comfort) and they came with two removable batteries and a charger, all packed nicely in a really small case.

I replaced the foam for $7 at some point but they still worked nicely last time I checked despite being 7 or so years old now.


Weird, I can charge and listen to my Sennheiser Momentum Wireless 3 at the same time.


Jabra 85h can charge over USB C and play over Bluetooth at the same time.


> Plug a computer into a monitor

… and get only 30Hz on 4k ;)


You can totally get 4K 60Hz, so long as the USB-C port supports DisplayPort 1.4a Alt-Mode.

(which raises one of the problems with USB-C - it's impossible to tell what features a given port supports)


USB is a meronymic morass.

Alt-mode was the "killer feature" for me: alt-mode MST with a bus-powered splitter to drive dual DP (or HDMI) QHD external screens from a single USB-C port. This makes a regular laptop a viable desktop replacement for me.

Support for multiple external displays fits the bill, though these have 1x HDMI + 3x TB4 ports to play with.


> much harder to break a USB-C port than a micro-USB port on all the phones I've owned.

In a phone context, yes. But in a laptop/notebook context, USB-C is not durable; stuff gets very loose.


Depends on the manufacturer. Lenovo x280 vs 2018 MBP. The Lenovo USB-C port is awful after 3 years of use, the port is loose and devices, including the woefully cheap charger, fall out. MBP, similar age (slightly older), and the ports still have a satisfying 'click' when plugged in. x280 cost more...


I find the port on the left here is decent, but the weird combined one is very bad:

https://www.onmsft.com/wp-content/uploads/2020/09/ThinkPad-T...


I don't know whether I over-use it, but the charger won't stay in of its own accord - same Lenovo brand charger works fine in the Mac. Pre lockdown, I commuted with both machines everyday. I've been disappointed with the build quality of the Lenovo laptop as a whole - the trackpad has the surface peeling off and the backlight is covered in bright spots which I can only assume happens because the back-panel doesn't have adequate protection.


What I personally hate is how cable types/speeds aren't usually advertised. I've got a dozen or so USB-C 2.0 cables that feel useless for anything but power, and only a couple of 3.0 cables that continually get lost and need replacing. The connector itself is fantastic - USB-A and USB-B have needed replacing for a long time (especially since 3.0) - but the implementation has been a disaster by any measure.


> Agree- I have been largely disappointed by USB-C. It's been okay as a smaller USB-A replacement for flash drives and mice, but in general the lesson of USB-C is: just because it fits, doesn't mean it works.

I think USB-C works fine (h/t Benson Leung) but the implementation is problematic. Partly because of the complexity of everything they've tried to shove in, also partly because USB-A still exists and is the biggest competitor USB-C needs to overcome.

Give it 5 more years and we'll look at USB-A like serial ports. Till then, USB-A is legion.


Given that we're already 5+ years into Apple's USB-C revolution, I'm not so confident that this is going to happen. I'm going to keep using my USB-A peripherals for a long time, and more importantly, mice, yubikeys, keyboards, and a whole lot of other things are still largely sold with USB-A. Some of those have replaceable cables... but it's still a lot of effort to move away from that.

Furthermore, ditching all of those USB-A accessories and cables that still work fine just feels wasteful. If the only thing you're throwing them away for is compatibility, and not some feature that's actually improving them, even more of a waste. It's just churn to push more tech company profits.


This is a gradual transition. Most electronics last somewhere around 5-8 years, so we're looking at another 5 years before we can truly ditch USB-A.

The fundamental mistake Apple made was continuing to ship USB-A cables in the box with products, and expecting USB-C adopters to pay up for USB-C cables. This was good for Apple's short-term margins, but terrible for the transition.

So, I blame Apple for keeping USB-A alive. They introduced the all-USB-C MacBook Pros in late 2016. You'd have expected that over the next year they'd have phased out USB-A, so that by 2017 or 2018 they wouldn't have shipped any USB-A cables or adapters in the box.

But nope. Airpods 2 (released in 2019) still ship with a USB-A cable in the box. It's the same with the iPhone 11 (2019), and the Magic accessories for MacBooks until 2021.

> If the only thing you're throwing them away for is compatibility, and not some feature that's actually improving them, even more of a waste.

Interestingly, this is also a good argument for why the EU shouldn't force iPhones to use USB-C. People have tons of Lightning cables that would become e-waste overnight, and all the demand for USB-C to USB-C charging cables would be profit for some company - Apple, Anker, or random Amazon companies.


> The fundamental mistake Apple made was continuing to ship USB-A cables in the box with products, and expecting USB-C adopters to pay up for USB-C cables

I remember hearing someone point out during that period that if you bought an iPhone and a MacBook, brand new, you couldn't plug the phone into the computer out of the box. Which seems like one of those things that would have been unacceptable to Jobs if he were still around.


USB-A also became so widespread that they became wall plugs in their own right, so at least those will stick around a lot longer unless people start hiring electricians to change their plugs.


USB-A is alright for charging small electronics. It's 5V and somewhere between 0.5A and 2.1A (1A is very common).

I have charging cables (with USB-A on the power side) in multiple places around the house, using the 5V/1A adapters that shipped with iPhones over the years. This works fine, though it is slower than the "fast charging" we've grown to expect these days.

The device ends of these cables are miniUSB (bicycle lights), microUSB (external battery pack), USB-C (my partner's noise-cancelling headphones), and many of them are Lightning (AirPods, AirPods Max, iPhones, iPads, Apple TV Remote).


I suspect USB-A isn't going away; it's far more robust than USB-C. What I suspect will happen is that devices that need smaller sockets will use USB-C and everything else will stick with USB-A. There is a reason I have one USB-C port on my motherboard and 5 USB-A ports.


Guess I'm the opposite. I hate the USB-A connector with a passion. I always have to try 2-3 times to plug a USB-A connector in. Now I keep only 2 USB-A chargers in my home for legacy devices and explicitly advise my wife not to buy new gadgets with USB-A connectors, be it mini or micro.


Here's a slightly helpful tip. USB-A connectors have a USB logo on them. Look at or feel for that logo on the connector and always have it face up when you plug it in. It should always work.

There's a few exceptions. If the USB-A jack is rotated sideways on the device, then "up" could be left or right. And if it's a cheap off-brand sweat shop connector, then the logo may be on the opposite side. A sign that it could break quickly too.


I've marked the top of all of my USB cables with contrasting paint.


At some point in the future, USB-A will be phased out. If only because new devices will need the better profiles USB-C provides or because enough devices will no longer support USB-A.

I'd say 3 more years will be the turning point. You must remember, before USB there was no universal port, so USB-A didn't have to replace its predecessor.


I'm disappointed in USB in general; we have standards, but it seems few follow them. For example, I recently bought my wife an Asus ROG Falchion keyboard. Really tiny one with Zhuyin input.

So the connection is USB-C, but if you just grab any USB-C cable and connect it to the keyboard, it won't charge. Even the USB-C <> USB-C cable that comes with the iPad Pro doesn't charge it. So far, of the ~20 cables I own that I've tried, the only cable that will charge it is the Asus-provided cable...


Same with my Momentum 3 headset. It charges over any USB-C cable "that can also carry data". If the cable is power-only (and surprisingly, many are power-only, including Apple's cables that come with the iPad/MacBook), it won't charge, even though I just need to charge my headset and (obviously) need no data transfer.

It's stupid, but it is unfortunately the way it is.


Oh! I wonder if it will work with my Wii cable then! I haven't tried that! It never occurred to me that most of my cables are power only.

Thanks!


Yup, check if it can also carry data. If it can, it's probably that. Cheers!


> Agree- I have been largely disappointed by USB-C.

Also: the cockamamy version/naming convention(s) of the latest USB revisions.


I'm honestly really happy with USB-C. My phone, laptop and tablet charge the same way. I bring one charger and two cables, so I can pack much lighter than before. Each device can serve as a power bank for the others. My monitor serves as a powered hub, so there's only one cable connecting everything to my laptop. I use the same charger for everything from my camping lantern to my computer. I don't have a big box of cables anymore, nor do I pack 5 different cables when I'm on the move. It's very convenient.

It only becomes a problem when I leave my bubble. I usually travel without my adapters, so I can't borrow someone's mouse, or plug something into their TV.

Then there's the issue with cables and chargers. I have thick cables and 100W chargers, so it's never a bother, but it's not a trivial thing for the average user to make sense of.


When USB was first being sold it was supposed to connect everything. Monitors, modems, network cards, peripherals.

USB has been letting everyone down since 1998.

I am floored at the ports. Floored. Apple LISTENED? Like, seriously, LISTENED to its customers? I thought HDMI would never come back.


> If you have a device that wants 20W as 9V/2.22A, your 30W charger may not support that specific combination and will charge much slower than a 20W charger that does.

Such a charger would not be specification-compliant. Don't blame the specs for what the manufacturers did with them.

Oh, and don't get me wrong: there are a lot of problems with USB-C, mostly the complete impossibility of knowing what the heck a USB-C port is actually capable of. Is it Thunderbolt capable? If not, is it DisplayPort alternate mode capable? Can you charge the device over it? If yes, what's the maximum wattage?

This will finally ease because Microsoft will require PCIe tunneling on all C ports for a device to be Windows 11 certified, so there won't really be different USB 4 ports -- except for charging wattage... https://docs.microsoft.com/en-us/windows-hardware/test/hlk/t...


A charger does not have to support any particular voltage/current combination. If a device requests a specific config (9V/2A) but the charger only provides 12V/1.5A with 5V/1.5A fallback, then the best the two can negotiate is 5V at 1.5A. Even if the charger provides 9V/1.5A the device might be programmed to only accept 9V/2A with 5V fallback. This would be USB-compliant, but still frustrating to the user.


First of all, I was answering "If you have a device that wants 20W as 9V/2.22A, your 30W charger may not support that specific combination" by saying that's not specs-compliant. https://i.imgur.com/rkIJuMr.png (I highlighted the relevant part.) These are the rules: https://i.imgur.com/b8q96tH.png

To summarize: since a 30W charger must provide everything a lower-wattage charger like a 20W one does, and 20W falls between 15 and 27, then yes, it must support 9V at 2.22A as well. If it doesn't, it's not compliant. Note the caption in the second image: "an example of an adapter with a rating at 50W. The adapter is required to support 20V at 2.5A, 15V at 3A, 9V at 3A and 5V at 3A."
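If it helps to see the tiering in one place, here's a quick sketch in Python of the rules as I read them (illustrative only; the normative text is the PD spec itself):

    # Required fixed-voltage profiles for a charger rated at pdp watts,
    # per the 5V/9V/15V/20V tiers described above (my reading, not gospel).
    # The 3A cap keeps this simple; >60W chargers offer more on the 20V rail.
    def required_profiles(pdp):
        tiers = [(5, 0), (9, 15), (15, 27), (20, 45)]  # (volts, watts that unlock it)
        return [(v, round(min(3.0, pdp / v), 2))       # capped at 3A, else PDP-limited
                for v, unlock in tiers if pdp > unlock]

    print(required_profiles(20))  # [(5, 3.0), (9, 2.22)] -> 9V/2.22A is mandatory
    print(required_profiles(50))  # [(5, 3.0), (9, 3.0), (15, 3.0), (20, 2.5)]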

> If a device requests a specific config (9V/2A) but the charger only provides 12V/1.5A with 5V/1.5A fallback

Then it's not compliant. 15-27W is to be delivered via 9V. 12V is optional.


You're almost correct here - if a device advertises a particular power rating, they need to provide the configurations you linked. But they are allowed to provide any arbitrary configuration beyond their marketable power rating.

So consider a charger that has a smallish inductor in its DC/DC supply, and a DC/DC chip with an integrated MOSFET limited to say 2.5A switching current. For thermal reasons, that charger can't safely provide more than say 1.7A regardless of voltage. Because it can't supply more than 1.7A at 9V it can't be listed as having a PDP of more than 15W. However, that charger is allowed to provide that same 1.7A at 20V if it wants to, and advertise that configuration to the device - as long as it doesn't list that as its rated power. This sort of charger would be functionally providing >30W, but would only be allowed to put 15W on its packaging/advertising materials.

From a user perspective, they see a charger happily providing 30W to one device, plug in another device, and get a lower output power. The lower power is exactly what it says on the packaging (which the user threw out ages ago and never looked at) but the user is still frustrated.


> However, that charger is allowed to provide that same 1.7A at 20V if it wants to,

Is it? Allowed by the standard where?


Looks like I was wrong on this - at least in the latest version of the PD2 standard it's not allowed to advertise any combination that exceeds the PDP. It's only in PD3 that devices are allowed to limit power in particular configurations due to thermal constraints even if they can technically deliver the same power in another configuration, but even that is limited only to >100W configs. So you're right about this - my example is definitely incorrect.


> chargers and cables are advertised with the wattage they support, but it's really the voltage and current that matters.

watts = volts * current


GP's point is that although chargers are labeled with wattage, they can't output just any combination of volts and amps that multiplies to the stated wattage. Can you take a normal 100W charger and ask it to output 100V at 1A? It can't.

The Apple 29W charger supports two configurations only: 14.5V × 2A, or 5.2V × 2.4A

The Apple 30W charger supports these four instead: 20V × 1.5A, 15V × 2A, 9V × 3A, 5V × 3A

Do you see a problem here?
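To make the problem concrete, here's a toy sketch in Python (not the real PD message flow, and the phone's profile here is made up): both sides list fixed (volts, amps) profiles and settle on the mutually supported one that delivers the most power.

    apple_29w = [(14.5, 2.0), (5.2, 2.4)]
    apple_30w = [(20.0, 1.5), (15.0, 2.0), (9.0, 3.0), (5.0, 3.0)]
    phone     = [(9.0, 3.0), (5.0, 3.0)]  # hypothetical device: wants 9V, falls back to 5V

    def negotiate(charger, device):
        usable = [(cv, min(ca, da)) for cv, ca in charger for dv, da in device
                  if abs(cv - dv) <= 0.25]  # treat 5.2V as 5V-class
        return max(usable, key=lambda p: p[0] * p[1], default=None)

    print(negotiate(apple_29w, phone))  # (5.2, 2.4) -> ~12.5W from the "29W" charger
    print(negotiate(apple_30w, phone))  # (9.0, 3.0) -> 27W from the "30W" charger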


> chargers and cables are advertised with the wattage they support, but it's really the voltage and current that matters.

Wattage is what you get when you multiply amps and volts. You just said the equivalent of "It's not about the ice cream, but about the ice and the cream".

Also, it arguably does have everything to do with volts - if you put too much voltage into a device, most likely you're going to damage the device, its surroundings, or yourself (i.e. in a house fire).


What he's saying is this:

USB-PD on USB-C negotiates the highest voltage your device and charger will support.

You can buy a 30W USB-PD charger that supports a smaller subset of voltages than a 20W USB-PD one, and in practice it will deliver less power to your device because the voltages they both support don't line up well.

What actually is supported by each tends not to be specified/disclosed-- or if it is, it takes a lot of digging to figure it out.


Yes and power negotiation does not protect against improper voltage whatsoever. This wasn't my point, despite me addressing negotiation in another comment.


> Yes and power negotiation does not protect against improper voltage whatsoever.

Oh? Everyone else seems to think the mechanisms are sufficient for safety.

> This wasn't my point, despite me addressing negotiation in another comment.

I still don't understand your point, unless it's a failed effort at pedantry.


If I put 120 volts through to my MacBook directly over the cable it will invariably damage it. Voltage negotiation doesn't matter here.

Also the original comment was unclear; they've edited to clarify. But of course, I'm the bad guy on HN for not being able to read minds.

E: I realize my GP comment was also unclear. I should have said "excessive" voltage, not improper I suppose. "Improper" in my head meant outside the rated voltage for either end/the cable. I'm not the one being a pedant here.


Voltage negotiation matters, because that's what keeps a charger from putting 120 volts into your MacBook.

Which is exactly what GP was saying in the very part that you quoted:

> chargers and cables are advertised with the wattage they support, but it's really the voltage and current that matters.

and what you agreed with in that very same comment:

> Also, it arguably does have everything to do with volts


What do you think USB power negotiation is for if not negotiating a mutually supported voltage? That’s pretty much why it exists.


USB-C power negotiation absolutely protects against improper voltage, assuming it is implemented correctly.


Out of curiosity: is there any instance where it isn't implemented correctly that actually damaged/fried some electronics? What I've seen is that, in the worst case, it doesn't charge or it negotiates a very low voltage and charges slowly.


Yes, I don’t have a search term handy, but there’s someone who maintains a website cataloging these.


Your own analogy can be used to explain how you do not fully understand the situation. 10 cups of cream and 1 cup of ice does not make ice cream, even though that technically is ice and cream. Just like a 10V 5A charger is not the same as a 50V 1A charger.


I think you've deliberately missed my point just to argue with me.


I'd say the same about your initial comment.


I don't think they did.

The important part with regards to power delivery isn't simply wattage. If a hypothetical charger can put out 96W at 1V@96A, it's never going to deliver even close to that amount of power to a device that expects 96W at 20.5V@4.7A.


Exactly. I was not just trying to be a jerk. Their analogy was actually a good one, they just used it incorrectly. Ice cream is only ice cream in a certain range of ice-to-cream ratios, just like a charger needs a "close enough" mix of voltage and amps for it to reach useful wattage (depending on the device).


I think what OP is saying is a cable might support 30W at 5V * 6A, but not 10V * 3A.


They're not interchangeable. USB-C has power negotiation, but that doesn't mean the devices support those voltages. You still need to understand the voltage ratings on both ends of a cable.


That's exactly what OP is saying - chargers are commonly advertised as being able to supply a particular wattage, but that particular wattage is only attainable if the device being charged supports the maximum voltage the charger is capable of delivering.

OP is complaining that that leaves the true wattage of a given device/charger pair unknowable from the charger's packaging alone without further information as to what voltages and at what currents it can supply on request from the device. It's certainly a valid frustration.


This makes more sense and isn't how I parsed the original comment.


That is precisely the point of the original comment that you responded to.


I found this comment that explains it well:

https://news.ycombinator.com/item?id=28911240


The reverse situation would be the case (cable supports 3A, but not 6A). For the range of voltages supported by USB-PD, the cable only cares about amperage. Copper wire doesn't care much if it's 5V, 50V, or 250V; it'll carry it the same. However, if the size of the wire isn't correct for the amperage, it'll overheat and potentially catch fire.

It’s the end device that cares about voltage and amperage both as the voltage steppers inside have to be ready to handle providing the right voltage to the chips and the amperage has to be right to handle the load.
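Rough numbers, with an assumed cable resistance (the 0.1 ohm figure is illustrative):

    # Heat dissipated in the cable is I^2 * R, so it scales with current, not voltage.
    R = 0.1  # ohms, assumed round-trip resistance of a ~1m cable
    for amps in (1.5, 3.0, 5.0):
        print(f"{amps} A -> {amps**2 * R:.2f} W of heat in the cable")
    # 1.5 A -> 0.23 W, 3.0 A -> 0.90 W, 5.0 A -> 2.50 W
    # Doubling the current quadruples the heating, which is why higher-current
    # cables need thicker wire (and why unmarked USB-C cables are capped at 3A).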


Glad they're back, but these are not Pro features and should not be limited to the Pro line (starting at $2,000). These are basic features that should be on consumer machines. I shouldn't have to pay 80% more just to get HDMI, SD, and other features that were on Apple laptops 6 years ago.


It looks like Apple is trending back to ports and I think that will continue in the future.

It looks like the 13" MacBook Pro will be phased out in the near future (over the next 6-12 months) and I'd guess that the next MacBook Air will become the replacement. There wasn't a lot of difference between the M1 MacBook Air and MacBook Pro before. I think Apple can nicely introduce a replacement MacBook Air with an M1 processor (or M1 Air) with an HDMI port and SD Card slot and it will be quite differentiated from the MacBook Pro with its M1 Pro/Max processors, miniLED displays, additional memory bandwidth, additional CPU cores, additional graphics capabilities, etc.

While this announcement doesn't introduce the product you're looking for, it does signal that Apple is going in the direction you're interested in and it shows that Apple can offer so much for pro users that it doesn't have to worry about a consumer machine cannibalizing their pro line. Apple can literally offer twice the number of performance cores to give pro users a huge incentive to pay $2,000+ instead of $1,000. Apple has miniLED displays with 1,000 nits of brightness (compared to 500 nits on the old MBP and 400 nits on the Air). With their own silicon, they're able to really differentiate the Air/Pro lineup (which didn't really happen last year) and offer compelling reasons to pay more without withholding basic features.

Yes, a MacBook Air with HDMI and SD Card slot doesn't exist yet. However, given how many bells and whistles Apple has given the MacBook Pro lineup, they can give a redesigned MacBook Air those ports and not have to worry that pro users won't buy their pro versions. I'm hopeful that a new MacBook Air will arrive in the Spring with the ports you're seeking.


> It looks like the 13" MacBook Pro will be phased out in the near future (over the next 6-12 months)

Knowing Apple, they will keep it around for a couple years just to have the price point with the Pro name. They love keeping around older models for price anchoring/upsell.


Apple does like to keep models at certain price points. Just look at the 2nd generation MacBook Air, which stayed with us, un-retina screen and all, for years after the retina MBP and MacBook arrived, to hit the $999 price point.

A 13" M1 MBP allows them to hit a $1299 price point and say "starting at $1299" when speaking about their pro laptop line.


I don't think the Air is thick enough for HDMI. The Air now is more like the old MacBook, the cheap fanless ultraportable, and I wouldn't be surprised if it stays like that; previously it was much less differentiated from the MacBook Pro lines.


You can still do mini- and micro-HDMI, but by that point you may as well do more USB-C.


Mini and micro HDMI are almost nonexistent in the wild. USB-C is the correct choice for the Airs.


The thin MacBook (2015-2019) is the most recent "MacBook", so it's not old but the latest.


What can we expect from the new Air? Will this model replace the current Air or will it be an additional more expensive offering?


Probably MagSafe 3, a notch with a 1080p camera, thinner screen bezels, an SD card reader, a faster SSD, LPDDR5 but still up to 16GB, maybe an HDMI port, and Thunderbolt 4.


> I shouldn’t have to pay 80% more just to get HDMI, SD, and other features

Pedantic nitpick, but a USB-C hub costs ~$40, or about 4% of the base price of a MacBook Air.


My girlfriend bought 100€ worth of adaptors for her Macbook Air and can never find them when she needs them.

For a desktop it doesn't matter, but for a portable device that you need to carry around every extra thing that you need to carry around is just something you are going to misplace.


I'm a hardcore Apple fan (from before the iPhone) and this hub situation is just pathetic; they should have put a USB-A port in there and called it a day.

You can still run into some VGA projectors, but most have HDMI now.



Woof. That thing is ugly


Really? Looks very cool to me. Matter of opinion, I guess :)


Yikes. My 2017 MBP is bent and it's always on a flat surface. I shudder to think about keeping it on something like those blocks :-)


I like that it also raises the keyboard at an angle. Remember typewriters? (Dangerous question these days, sorry. ;-)


This is a single device that I can plug MBP power adapter, mouse, and keyboard USB dongles into: https://www.amazon.com/gp/product/B07QXMNF1X/ref=ppx_yo_dt_b...


I'm not going to deny that having them built in is better, but I had a tiny case that fit my MBA power adapter, USB-C cable, pocket wifi, and hub. Since I always needed the power adapter and its cable, it wasn't hard to keep the hub with me.


[flagged]


Nasty much? We like jumping on to blame `lusers` here on HN, but this is a design problem. It's a fact of life people lose things, and Apple conveniently forgot about this to push the dongle ecosystem.


The magic that originally brought me into the Mac ecosystem, after a decade and a half of Windows and Linux, was that the work MacBook I received was the first time I had an actual portable computer. The battery life ("you mean it still has a useful amount of charge after 3 hours? WTF is this sorcery!?"), good-enough trackpad (which makes it like 5x better than the best trackpad I'd used before, all of which had me considering an external mouse a must-have for more than 5 minutes of work), and port selection meant I could pick up my laptop—just my laptop—and go, and be fine for most or all of the day, doing almost anything.

The dongle bullshit (and the USB-C "well yes it can do that but only if you have exactly the right cable, so you'd better bring a couple with you" thing) broke that simplicity, and put me back to having to make sure I had other crap with me, which was a shame.

Dragging their feet on moving the iPad and iPhone over to USB-C, so they could at least share dongles & charging cables with Mac laptops, was/is salt in the wound.


Try the Baseus USB-C Hub Adapter for MacBook Pro 2020/2019/2018/2017, a 9-in-1 USB Type-C hub dongle with 2 Thunderbolt 3 ports (40Gbps), 4K HDMI (60Hz on USB-C, 30Hz on HDMI), RJ45 Ethernet, a USB-C data port, 3x USB 3.0, and audio:

https://smile.amazon.com/gp/product/B0872V4LFP/

Unlike single-pigtail all-in-one dongles, this two-piece, no-pigtail thingy supports 2x TB3 or 2x USB-C/DisplayPort displays since it taps both sides of the laptop. (Incidentally this also lets you charge on the right to prevent potential issues with charging from the left.)


+1 for a helpful recommendation. Much appreciated.


Hah, go back and watch Steve Jobs making fun of computing devices that use a stylus. He agreed with you on the losing things angle.

> Who wants a stylus? You have to get them and put them away and lose them. Yuck! Nobody wants a stylus.

https://www.youtube.com/watch?v=ELZK-Pow6fs

> Handwriting is probably the slowest input method ever invented. ... We reimagined it and what we're doing is completely different than what they (Microsoft) did. ... And what we said at the very beginning is if you need a stylus, you've failed.

https://www.youtube.com/watch?v=w2xPt8txgGs

Apple now uses styluses on their iPad Pros.


This was for devices that only worked with a stylus, so if you lost the stylus you couldn't use most features of the device anymore.

Modern styluses like the Surface Tablet Pen or Apple Pencil are for drawing and note-taking, in addition to still being able to use fingers on the touchscreen.


This is revisionist history.

In the keynote, he never mentions "devices that require a stylus".

He says "who wants a stylus? Yuck!" and "if you see a stylus, someone did something wrong".

This spin into "well _obviously_ Steve meant X" is purely personality worship, not borne of his words.


> In the keynote, he never mentions "devices that require a stylus".

Back in 2007, if something came with a stylus, it was overwhelmingly the case that the stylus was required.


I think Jobs had a point about designs that require a stylus. There were many portable computers before the iPhone that required a stylus. iOS was designed with fingers in mind and then added a stylus for some kinds of work. This is similar to how the original Macintosh had no arrow keys and forced developers to design for the mouse. Later designs then added arrow keys.


Steve Jobs passed away 3666 days ago.


>people lose things, and Apple conveniently forgot about this to push the dongle ecosystem.

I think it's a bit unfair to say that Apple deliberately change interfaces in order to push their adapters.

They've got two different problems to address. Firstly, they want to update and be in control of the interfaces to their hardware (FireWire, Thunderbolt, USB-C). But at the same time their machines seem to last quite a long time, so at any point there's a pretty significant number of users stuck on older interfaces, and with hardware that uses older interfaces too (e.g. I have an older FireWire external HD that I now dongle to Thunderbolt; I have noooo idea how I'm going to get that to work when I get this new machine).

So in order to allow all that to work they need to support a multitude of interfaces because getting a new laptop shouldn't mean throwing everything else out.


>Firewire external hd that I now dongle to Thunderbolt, have noooo idea how I'm going to get that to work).

Simply add another dongle, the Apple “Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter” and you’re good to go.

I just tested mine before posting this, from a FW800 drive with the Apple FireWire-Thunderbolt adapter plugged in to the Apple Thunderbolt 2-3 adapter. Works on M1 Macs too.


> push the dongle ecosystem

Out of all the complaints about moving to USB-C, the idea that it is a conspiracy on Apple's part to sell dongles is the one that makes me roll my eyes the hardest. Say what you will about the USB-C-only MacBooks we've had in the last six years or so, but I'm fairly sure they were the first portable computers Apple ever shipped that had no proprietary connectors. No ADB, no MagSafe, no FireWire (not technically proprietary but largely DOA anywhere but Apple in the early 2000s).

I suspect if Apple had a magic wand (and/or even more of their money), they wouldn't have you buy dongles at all, they would have you buy new cables. Which is what I did years ago -- USB-C to Lightning, USB-C to mini-USB, USB-C to USB-A, even one USB-C to USB-B kicking around somewhere. And, because USB-C is not a proprietary standard, I don't think any of those cables are from Apple.

(Yes, I do have an Apple brand USB-C to USB-A dongle that I bought when I got my first USB-C laptop, because everyone was screaming how important they were. I almost never use it: if I find myself using it more than a few times for "dongle + cable", then I buy the correct cable.)


Apple isn't trying to push a dongle ecosystem. they're trying to push a "thunderbolt/usb-c for everything" ecosystem. In typical Apple fashion, they're willing to be pretty radical about it on the computers themselves, and let donglepocalypse be a side effect.


And at the same time, their non-pro machines come with only one spare (not for power) USB-C port. If they loaded up their machines with them across the board, this would be less of an issue.


You can use the power port to connect a dongle too. I have two dongles that have multiple interfaces, as well as a power passthrough so I could connect my power to either one of them. Though I don't use either because my usb-c monitor powers the MacBook.


Which one? Looking for a solid recommendation.


Not one by Verbatim, mine broke after a month. Granted, I was going for the cheapest one I could find just so I could get a HDMI port.


But you'd have to carry that around separately and not forget it, lose it, etc. OP is asking for those ports to be on the laptop itself.


You're being obtuse, not pedantic


Now you're being pedantic!


I have one of these, and it works fine except when it gets jostled and all my peripherals are disconnected.

Also it’s not powered (those are closer to $200), which means it can’t charge my iPhone or anything else.


FWIW I've had nothing but trouble with USB-C hubs. I use my most recent one exclusively as an Ethernet adapter.


For a MBP hub, the most reliable solution I found was "buy a hub with two USB-C ports."

I think between the higher price point & dual ports, they had enough to put a proper microcontroller in there, which seemed to make everything happier.


If I'm ever in the market for a USB-C hub again, I'll keep that in mind!


However, it does take away from Apple's "just works" marketing if you need to research compatible adapters and half of them just don't work properly for anything beyond 1080p60.


Yeah and it interferes with wifi (on MBP 13" from 2019).


Does it support 4K@60Hz?


HDMI, yes. USB-C ones are another story though.


Wouldn't be at all surprised if at least some of those ports make it back whenever they release the redesigned 13-inch MacBook Pro and MacBook Air. Probably time constraints made them release those with the old design. Just my guess.


Do you think there will actually be a redesigned 13" MacBook Pro? My assumption was that it was being replaced with the 14", but maybe I'm mistaken!


Can’t see an advantage for them in updating to a new 13” design. 14” is pretty much the same size physically, no? 1/3 inch difference. I guess they’re only keeping it now so they have a starter offering at a low price point.

To get the low price point in a year or two they can just have the low end 14” be the starting Pro machine. Only advantage to the 13 now is the touch bar, if you like that.


Personally I think they will keep the 13” MacBook Pro as the entry level one with a lower tier cpu compared to 14” and 16”. Their old line up had an entry level 13” while the 15” was more capable so that’s what I’m basing it on but my guess is as good as any.


Air will replace the 13” in all likelihood. The difference between the two were minimal to begin with. If the Touch Bar is going away, there’ll be virtually no difference at all.


I would guess we might see new 13" and Air devices next year


Hopefully they bring back a 12" model.


I would be surprised if there are lower-end MBP (smaller than the 14") but a 12" might be better differentiation than 13" vs 14".

I just want a pastel orange MBA though.


That would be amazing. Still miss my 12” PowerBook…


Don't their current 13" models have about the same footprint as the old 12" laptops from way back when?


12" models were about an inch narrower.

According to everymac.com, the 12" PowerBook G4 (2004) had an amazing 10.9"x8.6" footprint (and the 12" MacBook (2017) was 11.04"x7.74" - and 2.03 pounds.)


Built-in SD would allow for cheap permanent storage upgrades to the cheaper macbooks. Apple may be listening to what people want in laptops, but I don’t see them cutting into their own upsells by that much.


In a certain way, yes. A quick search on Amazon shows I can get a 1TB SD card for $150-300. However, the rated speeds are in the 100-200 MB/s range, rather than over 2GB/s for Apple's 2020 MacBook Air (never mind the faster speeds of their new MacBook Pros, which hit 7.4GB/s).

Yes, the $400 that Apple charges to upgrade from 256GB to 1TB is high compared to the $200 for an SD Card. At the same time, the storage is 10-20x faster. I think the storage is different enough that Apple doesn't have to worry too much. Sure, some people might try that strategy and walk around with a laptop with a card sticking out a bit and some third party will make a microSD adapter that doesn't stick out, but I think most users will probably opt to pay for the storage and Apple can ignore the small number that buy niche products.

I think between saving money on their Apple Silicon processors and being in a great place to have serious differentiation between the Air and Pro line, Apple doesn't have to worry.
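For a feel of the gap, rough copy times for 100GB of photos (ignoring filesystem overhead; the 150MB/s figure is a typical card from the range above):

    size_mb = 100 * 1000  # 100GB
    for name, mb_per_s in [("1TB SD card", 150), ("2020 Air SSD", 2000), ("new MBP SSD", 7400)]:
        print(f"{name}: ~{size_mb / mb_per_s / 60:.1f} min")
    # 1TB SD card: ~11.1 min, 2020 Air SSD: ~0.8 min, new MBP SSD: ~0.2 min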


Found a (new) 1TB USB-C flash drive that does up to 1,000 MB/s read and 900 MB/s write.

https://www.bhphotovideo.com/c/product/1655140-REG/kingston_...


Oddly I have a "half SD card" that I can use for extra storage in my old 15-inch work MacBook. It's like an SD card with a foot that fits perfectly (almost flush) when inserted into the SD slot. It has a lip so you can remove it easily.

I thought I'd have mountable extra storage... but ultimately it would sometimes unmount on sleep (you could unplug and replug...) and it was a little slow. If I wanted to use the SD card from my camera, I had to remove it. I replaced it with an external SSD, which is bulkier and corded...

this wasn't the exact thing, but it gives an idea what its like: https://www.amazon.com/Transcend-JetDrive-Storage-Expansion-...


Could be nice for a permanently mounted disk for Time Machine backups.


Why not consider a used previous model with ports, if you need them so much but are unwilling to pay extra for an overpowered machine?


I think these still stick to 16:10.

If you take 74px off the height for the notch/menubar area, you get 3024 x 1890 and 3456 x 2160 which are both 16:10.
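Quick check in Python:

    # Strip a 74px strip off the top and both panels come out exactly 16:10.
    for w, h in ((3024, 1964), (3456, 2234)):
        print(f"{w}x{h}: {h - w * 10 // 16} rows above 16:10")
    # 3024x1964: 74 rows above 16:10
    # 3456x2234: 74 rows above 16:10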


This sounds right to me. In full screen mode, the menu bar area will be replaced by an artificial black bezel that hides the notch. [1]

[1]: https://www.macrumors.com/2021/10/18/macos-hides-notch-on-ne...


What will happen if you mirror screen with a 16:10 monitor? Will it display black bars on the sides?


Keyboard travel was fixed in 2020. I have the last Intel MBP, spec'd to last, with a physical escape key and Touch ID, and I love it.


Keyboard reliability was fixed with the switch back to scissor switches, or what they call the Magic Keyboard. But the key travel distance is still not the 1.5mm used in the pre-butterfly era.


Having used all three of them, I'd say the fixed version is finally good enough. 2015 was one of the best laptop keyboards I've ever used, but the new one is 80% of the way there. It's nowhere near as bad as the disaster in between.


Reliability isn't as good for me as the 2015 MacBook Pro. My 2019 16" doesn't register presses that are near the edge of the keys like the 2015 model did. Perhaps it's related to the travel distance. It is my biggest issue with the 16".


This was the selling point for the butterfly keyboard when it was introduced, actually, that no matter where you hit on the key it would be equally responsive.


Anecdata FWIW: I upgraded directly from a 2014 (same keyboard as 2015 that everyone praises) to the first 16". I did notice a difference at first but adjusted within less than a week, never had any complaints after that.


My own anecdata: I owned a 2015. It now belongs to my son. I also own a 16" with a Magic Keyboard. I had a butterfly along the way that was a total nightmare.

Going from the butterfly to the Magic felt absolutely incredible. I now own an M1 MacBook Air (mainly for the better and quieter performance, and the lack of Touch Bar), and its keyboard feels about the same as the 16"'s, so pretty good.

After a long time, I was helping my son with stuff on the computer, and as soon as I typed a couple of words I was blown away by his keyboard. It's better in every respect than the newer ones.

Magic Keyboard ranges from good to very good. The 2015's is superb.


I’ll supplement my anecdata to say…

I gifted my other old 2013, which had the same pre-butterfly keyboard, and erased my data after I had already gotten used to the 2019. I did notice the difference in key travel then. But it didn't bother me when I went back to the 2019 after. I honestly couldn't say I'd prefer one over the other, besides current familiarity/muscle memory.


Key travel isn't back to what it used to be, but it's moving down everywhere so apple at least isn't far behind. I went from a T series thinkpad to an M1 air and the keyboard was a clear downgrade, but in practice I find it to be "good enough".

Apple:

0.7 mm - butterfly keyboard

1.0 mm - current Mac scissor keyboard

1.3 mm - old Mac scissor keyboard

Thinkpads:

1.8 mm - T series ThinkPads (aka the ones with the good keyboard)

1.5 mm - last year's X1 Carbon (the thin and light)

1.3 mm - this year's X1 Carbon


For added reference, the beloved 7-row ThinkPad keyboards found in laptops like the T410 and T420 have 2.5mm key travel.

Kailh Choc (low profile) mechanical switches have a total travel of 3mm.

Cherry MX switches (and most clones) have 4mm travel. Some clones are a little more or less.


I miss the 0.7mm travel. Less work for the fingers. Faster typing. Too bad they couldn't make it reliable.


I know it's completely subjective, but I'm currently on a work provided 2018 mac with the 0.7mm travel and I despise it. I have far, far more typing errors on this keyboard than I do on pretty much anything else.

I'm excited to try the new 1mm keyboard.


Same, I use a mechanical keyboard at my desk, and being forced to use the laptop keyboard would drive me insane. I spilt coffee on it and ended up upgrading to a 2020 Pro which (to me) is just barely bearable.


You spilt coffee on purpose for an upgrade? :)


I certainly don't miss the feeling of typing on concrete. I suppose a very light fingered typist might appreciate the reduced travel though.


Typing on concrete - that's exactly right. I got awful RSI from the butterfly keyboard on my 2018 15-inch MBP. I had used previous MacBooks for nearly a decade with no problems.


I actually thought the old butterfly keys were fine back when I was using them, but after getting an M1 2020 MBP, going back is HORRIBLE


Yes! Thought I was the only one. It was slick, good-looking, and typing was a breeze on those.

But it was super-problematic, and mine broke multiple times over the years. If they could have made the 0.7mm butterfly reliable, that would have been awesome.


I feel like we are either odd ducks or else in a silent majority. I love minimal key travel and find it a far superior typing experience.


I love my 2019’s keyboard. That’s the year they added the extra membrane to prevent dust getting in. It’s been flawless for me for the last two years.


I have the 16" MBP (late-2019), really annoyed with the heating issues. Any time I do anything intensive, like edit raw images in Lightroom, at some point temperature rises which makes some "kernel process" pretty much hog up all the CPU and the machine starts lagging severely. Looked up online and this is quite common.


It’s come up before but still not everyone knows: you may want to try plugging the power cable on the right side. It can drastically reduce this throttling.


Yea it's especially common when running external monitors. The conspiracy-minded would say Apple wanted to leave a reason for MBP buyers to upgrade to M1 laptops.


Using any external monitor requires turning on the discrete GPU. I'm sure that's a large part of the problem.


Why would that be required? The embedded GPU should be good enough for many things.


Poor drivers. The same reason why the discrete GPU pulls around 20W just for being on with an external screen, even if the screen is black. Apparently not an issue:

    * in clamshell mode;
    * on Windows via BootCamp.
According to some reports in the MacRumours forum [0] this is fixed in Monterey.

[0] https://forums.macrumors.com/threads/16-is-hot-noisy-with-an...

Edit: Not quite a fix. Apparently they introduced a Low Power Mode that helps bring wattage and temps down by essentially preemptively throttling CPU and GPU or something like that.


> by essentially preemptively throttling CPU and GPU

Not really a fix then. That may prevent hurting your machine but it’s still not running at optimal performance.


Indeed. I disable turbo boost on my 16". It hurts peak performance but runs (slightly) cooler and quieter, and sustains for longer. This reads all kinds of wrong.


I have the 2020 Intel one as well. I think the key travel still isn't what it was in 2015 or earlier, though. They are still too shallow for me. It's minor, but I do feel I bottom out too quickly on them. It feels like tapping on glass after a while.


It's insane to me that manufacturers still haven't figured this out. Reducing the travel by .1mm is extremely worth it if you can give a slightly squishy/springy bottoming out with a rubber gasket. It dramatically reduces typing fatigue.


It's kind of funny that while everything else has gotten better, laptop keyboards have gotten so bad that the one thing we try to avoid on desktop keyboards (I can never go back to that squishy plastic-dome feel after the crisp action of mechanical switches) would actually be an improvement.


If you need a rubber gasket at the bottom to avoid “fatigue”, that means you are smashing your fingers down too hard and can plausibly injure yourself with or without the gasket. This feature of typing style can be improved with practice. Try to use a light springy touch with only slightly more force than minimally necessary to actuate the keys. Try to keep the palms floating in the air (rather than resting on any surface) while actively typing.

More helpful in keyboard hardware would be greater key travel distance and a sharper tactile snap.

(Trying to solve keyboard injuries with rubber at the bottom of the keyswitch is similar to wearing heavily padded running shoes then smashing heels into the ground on every step, relying on the shoe padding to absorb some of the shock. The better way to avoid ankle/knee/hip injury is to get a less padded shoe and fix the gait to have gentler steps and use the tendons/muscles to absorb the shock instead of cartilage etc.)


Why should I spend time training myself when we could just design equipment to be more easily usable? Minimalist shoes fell out of fashion, and serious competitors mostly never used them.


Indeed, that argument feels very much like "you're holding it wrong". Technology should adapt to us, not the other way around. Especially human interface devices like keyboards.

Part of the issue with Apple's keyboards is that the activation point is so close to the bottom due to the short travel. Lenovo does this a lot better and it's really a joy to type on. Bottoming the keys out hard is instinctively reduced due to the earlier and clearer feedback.


I agree that technology should adapt to people’s anatomy rather than vice versa. The improvements in that direction in switch design are a longer travel distance, sharper tactile snap at the actuation point, where possible audio feedback, and possibly lower force required for actuation.

More importantly than keyswitch design, the keyboard is a pretty bad overall shape for human hands. Laptop keyboards could be improved dramatically with a different physical arrangement of keys. In particular the right half of the letter keys should be scooted an inch or more to the right, and arranged so the wrists can stay straight while typing, the right pinky should be given fewer keys to press, the thumbs and index fingers should have more keys, and in general finger reaching should be minimized.

Training to safely and efficiently use tools is also important, however. Untrained typists have injuries at much higher rate than trained typists, because they develop habits poorly aligned with human anatomy. Same story for musicians, etc. Many people would benefit from more effective handwriting training.

Telling people to sit with their backs straight instead of hunched over or slouching, keep their necks straight, keep their wrists straight while doing fine work with the fingers, (more generally avoid postures with high static strain on muscles/tendons,) change positions every once in a while to reduce/shift remaining static loads, get enough sleep and exercise, etc. is just generally good advice irrespective of equipment. RSI is no joke, and people who make heavy use of computers should make some effort not to cause themselves permanent injury.


Yes indeed, I've had RSI in varying degrees for decades, so I'm very careful with it. Though it's easy for bad habits to slip in.

But part of this is that I will feel very quickly when something is ergonomically wrong. The butterfly keyboard in particular but also Apple's super flat magic mouse.

On the other hand, it doesn't have to be something super fancy either. Even the basic Microsoft wired ambidextrous (so not officially "ergonomic") mouse serves me well.

The older Apple keyboard (Island style) also suits me. It's not all bad :)


Because the lack of feedback on the shallow depth is extremely off-putting to most people. Me included.

I can’t say I’ve ever gotten typing fatigue from any keyboard I’ve used even when being stuck typing for 10+ hours with very little downtime.


You should spend time training yourself because the design that “feels” softer is still actually putting a ton of strain on your joints, which often leads to debilitating injury.


It was fixed but not brought back to its former glory.


Having briefly had to use an old Pro from when they still had DVD drives in them (which I used way-back-when but hadn't touched in a long time), I can authoritatively say that the "former glory" is a solid two generations behind the very-flat newer keyboards. Even the one it replaced (2012?-2015?) wasn't as good as those were. Felt great to type on. I don't think nice-feeling keyboards are compatible with extreme thinness.


I own a 2011 Macbook Pro I still use daily. I have Apple laptops going all the way back to the PowerPC days and use current ones through work but that DVD equipped unibody generation has the very best keyboard. By far.


I have a 2011 MacBook Pro and a 2019 MacBook Pro, and the keyboard in the 2011 MacBook Pro still feels so good even after 10 years.


wow.....


I agree that they felt great to type on but at least for me typing speed was much slower. Going back to my 2010 (I think) MBP now, typing still feels great but it also feels like typing in molasses.


I have it too, but to say I love it would be a stretch. It really doesn't feel like a max-spec machine most of the time.


I thought the 2016 MacBook Pro was an enormous improvement over the 2015 models, because 3 lbs is the sweet spot for me where I no longer had a hard time deciding between an Air or Pro model. The 14" model is 3.5 lbs like the 2015 13".

Users' testimonials will probably make me happy that I waited for the second iteration of Apple silicon, but I do wish there was an option that had a combination of trade-offs like smaller battery, terrible speakers, or no magsafe to get it half a pound lighter.


We’ve had the M1s for quite a while now and they were already head and shoulders above the previous Intel MacBooks.


> Users' testimonials will probably make me happy that I waited for the second iteration of Apple silicon

Is there a reference? I had the first-gen (Air) as my personal daily driver and couldn't have been happier. I thought most users were pretty happy too.


Add my vote as a satisfied user. The M1 Air is the best personal laptop I've ever used. The battery life is just impeccable, it has plenty of power for what I need, and I imagine the longevity of this thing will be great because of the lack of fan and entry points for dust.


The thickness argument (i.e., excuse) came from Apple, not users. They did the same with the 3.5mm audio jack on phones, claiming they needed the space to make things thinner. Meanwhile, phones like the LG V35 were thinner with the same IP rating but still had an audio jack.


They didn't exactly say they needed the space "to make things thinner". They needed the space for other things like the new generation Taptic engine on iPhone 7 (which was the first phone without a headphone jack, and was also the same thickness as the 6 and 6s).


> They needed the space for other things like the new generation Taptic engine on iPhone 7

I mean, they didn't technically need the space - there was space for one guy to cram one into the bottom without removing the taptic thingy (although it did get shifted slightly) [1]

[1] super long video and not practical for an individual to do it, but shows that there was space and it could've been included. https://www.youtube.com/watch?v=utfbE3_uAMA


Exactly. There's no engineering reason, and if there was, it would be embarrassing. Apple is a marketing and advertising company at this point which makes hardware and software to support that mission. So they marketed that they wanted or needed to remove the jack because <reasons>, when the real reason was that they wanted to sell you overpriced bluetooth headphones and leave no option otherwise.


They added a more sensitive camera that year as well as a brighter display with more colors, found interference, moved the display circuitry and cabling to the other side of the phone to compensate, and then that interfered with the headphone jack instead. So they took the headphone jack out instead of compromising the quality of the audio output.

If the guy who put the jack back in did any actual signal quality testing while playing video to stress the display, I'd be appreciative to have this theory put to rest, but I don't think he did.


I'm not claiming they couldn't have made it work, but having the port so close to the edge (on the curved corner) would create issues for mass production.


I have never owned a Mac anything but the addition of an SD card slot is making this seem very attractive to me. Conversely the latest Surface Pro eliminated its little hidden microSD card reader, seems inexcusable for a device marketed as "Pro" and has really put a damper on my desire to upgrade.


Who wants to take bets on the next Surface Pro restoring the reader?


*restoration of the SD card slot.


A USB micro SD card reader is essentially pennies.


What cost does an sdcard slot impose? What is the benefit of removing it?

It is the size of a thumbnail, and most people won't even notice it if they aren't looking for it.

I want to know who was cheering for it to be removed.

And the few times I have needed it, it was so nice to have. Even if I only use it once in the lifetime of the product, it will be worth it.

I haven't had to change a tire in years, but I think it is prudent to carry around a spare tire.

Simply having access to a feature is itself a feature.


SD card isn't the problem, having a dedicated port for everything is. I really loved the USB-C-only approach:

Get whatever you need and plug it in. My camera has a CF card, my drone has a microSD card, so let's add those to the mix too. Also, I have never needed the MagSafe port and never used an HDMI port in my life. On the contrary, someone else might need HDMI or an SD slot every day. My point is that they can't make everyone happy, and paying (not just in price but also in looks/simplicity) for ports that will never be used isn't the best thing to do.

USB-C solved this exact problem and what they did on this version is a major step back IMO, including removing the Touch Bar which had great potential for improvement.


I used a Thinkpad for over a year without noticing the microsd slot on it. I think they are so inconspicuous that most people don't even notice them.

And what is the alternative? USBA <-> USBC <-> SDCard adapter <-> MicroSD to SD adapter

Is that convenience?

Even if they gave away usb sdcard adapters for free, it would still be trouble to go out and get one. Or carry one around with me everywhere. Just another thing to lose. Another thing dangling out of my computer putting stress on the ports.

If I'm completely honest, I think it is more emotional than practical. This craving for hyper-optimization.

And even Apple now seems to be admitting that it was not a great idea.

I can understand dongles for tablets, but this is supposed to be a laptop, a somewhat general purpose computer.

> Also I never need the MagSafe port or never used HDMI port in my life

I assure you you are in the minority on this. MagSafe, by the way, is amazing. It has saved my old MacBooks more times than I can count. I've already had to get my USB-C power port replaced after tripping on the cable.

Having access to the ports is itself a feature. Having the peace of mind of knowing that you can go to the office or classroom and be able to use the projector or a jumpdrive, without having to worry about bringing the right adapters is a huge plus for me and I think most people.


(Cross posting from my reply to sibling comment)

I'd completely agree with you if MagSafe was the only solution, but my USB-C cord has been kicked accidentally multiple times in the last few years. Every time, it just snapped out perfectly either from the brick or my MacBook. Nothing got harmed, so USB-C already solves the only problem MagSafe is solving, so why have it?


And about the HDMI, I understand many people find it a great convenience. What really should happen is that USB-C replaces HDMI, or at least that they become compatible in a way that they have the same form factor and can be used interchangeably. One port to rule them all would be great, and the best of both worlds.


I suspect MBP battery life didn't look so great considering the thinness, the new hungrier M1 processors, the dramatically increased number of pixels on the main display, and possibly the extra ports to a small degree. Perhaps dropping the second display is what helped them soften the hit a little. I rarely use the Touch Bar myself other than for adjusting volume and brightness, but in those regards it's been incredibly convenient.


Yeah, Mr. Ive went too far into "form over function" territory. While I love the thinness of my current 16" I also admit that I'd love to have more juice with a slightly thicker/heavier body.

The Touch Bar had a great promise, but they couldn't improve it enough to make it better (uhm, haptic feedback and faster response). Don't know how much it really contributed to battery drain though.


From my understanding, any display is an important contributor to battery drain, and the Touch Bar is pretty much an always-on (while you work) high-DPI display.

I dread returning to the old way of controlling screen brightness and volume, though.


> Also I never need the MagSafe port...in my life.

You only need to need it once. I know more than one person who broke their laptop by kicking the cord pre-magsafe.


I'd completely agree with you if MagSafe was the only solution, but my USB-C cord has been kicked accidentally multiple times in the last few years. Every time, it just snapped out perfectly either from the brick or my MacBook. Nothing got harmed, so USB-C already solves the only problem MagSafe is solving, so why have it?


You got lucky. It's not solving the problem. You just tripped at the right angle.

It's a little like saying that you didn't get hurt when you had a car accident without a seat belt, therefore you have no need for seatbelts.


What if they improved it, then? They could easily solve this at the cable level, where part of the cable near both ends snaps magnetically while still providing USB and PD functionality. Bringing back a dedicated port, when the device can perfectly well charge over its USB-C ports too, is a bit weird IMO.


An SD card reader is great, until you need a micro SD card reader and then, oh well. And advances will march on; ports on whatever I use will become outdated. Using an OG M1 Macbook Pro, it's not been a big problem. The dongle I have works great and has more useful ports than the Macbook it replaces.


I have a portable reader (SD/microSD) which I almost never need on the go, and a reader built into my dock (SD/microSD/CF). Another dust-and-small-object-collecting nook is actually a downside for me. The built-in SD reader in my old MBP stopped recognizing cards after a couple of years anyway (unless you push the card in at just the right angle; there are a few posts on Apple discussions about this issue).

I’m wondering how much bigger the battery could be without all those extra ports—a top-spec M1 Max laptop would probably rival the current king, the 13" M1 MBP.


I thought the same when I upgraded from my 2013 13 inch (with SD slot) to 2020 M1. Turns out it's a lot less convenient. As a camera user, it's so simple to just pull the SD card out of the camera and plug it into the MacBook. But now I just plug the camera in with the cable, even though I have the usb SD reader.


The dongle life is just so very shit. Is there anyone who prefers a dongle to a built in solution?


Not sure about prefer, but indifferent at this point. The few places I might have needed to connect to HDMI (like a conference room) have all permanently attached a usb-c connector at this point. Adding the sdcard back is ok, but is superfluous at my desk because I have a dock (single cable connect at my desk is very nice). I also already carry a small usb-c -> sdcard reader so I can bring photos directly into LR on my iPad Pro.

MagSafe looks like a great return at first, but having used the M1 MBA for a year I've learned that I almost never plug into power in random places anymore. It's a bit like the plug on the bottom of mouse doesn't matter when you do it so infrequently. My MBA ends up almost exclusively charged from the dock on my desk, so the big advantage of MagSafe isn't there anymore.

And for these other ports, we lost a usb-c port. Probably not that big of deal since I've been living with 2 for a year, but it is a tradeoff to consider.


It's pretty similar though - instead of SD card into computer, it's SD card into reader into computer. Have to do this for micro SD cards in some way/shape/form anyways.

This may be a, "everything is amazing and no one is happy" sort of situation. I think all my cameras connect wirelessly to my Mac and my iPhone anyways. I can copy/paste text/images/whatever across the two as well and that's just magic when it comes to casual content creation.


If using the laptop mostly in a stationary place, but occasionally taking it to meetings (or just around the house), the dongle life is okay - means just one cable to plug in when I come back. Of course it would still be better to have a few extra ports on the device itself, there's no getting around that.


And a lot of them break or I lose them. Built in hw seems more robust.


I am glad to see the return of MagSafe (and having two USB ports in addition to the power cord)... but that's about the only reason I really prefer the new 14" MacBook Pro to the much cheaper 13" (that extra "inch" is $800!)

The lower-cost machines used to get power and HDMI as separate ports in addition to two USBs... I wonder if future lower-priced machines will again, if this is a trend...

The extra ports are literally the only thing I want here that's not in the much cheaper 13" pro...


> that extra "inch" is $800!

This is hardly an accurate conclusion. To start with, the screen on the 13" has 4.1M pixels (2560x1600). The screen on the 14" has 5.9M pixels (3024x1964). Technically speaking you are buying about 45% more screen. The new screen is more dense, but the screen itself is pixels and you are getting ~1.8M more (about 45% more) pixels. On top of that there's the better quality miniLED design, the "Liquid Retina XDR". The new screen is also able to deliver 1000 nits of sustained brightness, with 1600 nits of peak brightness. That is compared to the 500 nits offered on the cheaper model. So again this is double the brightness (even more than that considering peaks). If that weren't enough, the 14" comes with adaptive refresh up to 120Hz. Again, this is double the 60Hz of the 13".

So that's a hell of a lot more than just 1" of screen.

Furthermore, you are comparing the base model of the MacBook Pro 14" to the base model 13", which aren't comparable. The 14" base model comes with 16GB of RAM and a 512GB SSD. If you compare the equivalently spec'd 13" model with 16GB RAM and a 512GB SSD, then you are looking at $1,699.

So for equally spec'd machines, it is $1,699 for the 13" and $1,999 for the 14". This means the price difference is a mere $300.

But even that isn't comparable, because the 14" comes with the M1 Pro chip compared to the M1. I won't rehash that, but it's a significant upgrade in CPU.

So you aren't paying $800 for 1" of extra screen. You are only paying $300 for the difference which offers a major screen upgrade, in addition to port upgrades, keyboard upgrades, and a major CPU upgrade.
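For anyone who wants to check the pixel arithmetic:

    old = 2560 * 1600  # 13" panel: 4,096,000 px
    new = 3024 * 1964  # 14" panel: 5,939,136 px
    print(new - old, new / old)  # 1,843,136 more px, ~1.45x (about 45% more)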


One minor note in case others aren't aware - the M1 to M1 Pro is surely a big improvement, but the base 14" does not have the same M1 Pro as the base 16" - it has a lower-binned version of the M1 Pro that only has 8 cores instead of 10, and has a 14 core GPU instead of 16. The arrangement of the cores and the GPU core count is still an upgrade over the 13" MBP with M1, but it's not the full M1 Pro with 10 cores and 16 GPU cores.


Apple should hire you to upsell folks who are considering the 13" -- you just converted me, too.


If you are considering the 13", also take a good look at the Air. The difference is $300 and you basically only lose an hour of battery life and the Touch Bar.

https://www.apple.com/mac/compare/?modelList=MacBook-Pro-14,...


*win a lack of touchbar :)


You could've just said that the 14" doesn't have the touch bar :D


Yep. I've been waiting for this announcement, and the touchbar was 100% the deciding factor for me being a continued customer. No matter what else they did, if they had insisted upon plaguing me with another touchbar, my next computer would not have been a mac. Now I can finally get an M1 system with a reasonable amount of RAM, yay.


Probably need a few more days to let all the information sink in. But generally speaking, Apple tends to simplify their product lineup over time. So I would not be surprised if there is a 14" M1 Pro with 8GB memory and 256GB storage starting at $1,499 in the future, once they've amortised some of the cost with the higher-priced models. As mentioned in the post below, if you know BOM costs and list out everything on the new 14/16" MBP, the price is actually quite competitive.

The more interesting question is what happens to the sub-$1299 price category. MacBook Air or MacBook? And how would it be designed?


Jobs' Apple tended to simplify the product lineup. Cook's Apple tends to rationalise product lines based on manufacturing lines, which is why you still see vestiges of older SKUs stick around—because the tooling cost for the motherboards, chassis, etc has already been amortised and it's cheap to keep it going.


> really prefer the new 14" macbook pro to the much cheaper 13" (that extra "inch" is $800!)

It's a $300 difference from the 13" with equivalent SSD & memory specs (never mind the different CPU and screen size).


Some basic math makes me believe the Monterey menu bar might be 74px tall.

    (3456 * 10/16) - 2234 = -74
    (3024 * 10/16) - 1964 = -74

Which means both models have a 16:10 ratio plus 74 extra pixels in height. Clearly they are adding room for the notch instead of clipping from the useful area...


It would still feel like it is clipped, regardless of the actual gain in pixels.


Sounds like you'd enjoy a pan & scan film :)


> Both 14" and 16" have 254 PPI, up from ~220.

That is a very welcome improvement! I like scaling my display a bit so I have more real estate, but then the fonts get a bit blurry. ~15% more linear resolution sounds great!


Bummer. I hope USB-C can still be used for charging, and that it will be the default charger. I love the fact that I can charge my phone, my laptop (Lenovo), and my wife's MacBook Air all with the same charger.

Bummer 2: now producers of TVs/monitors will have less drive to replace old HDMI with USB-C.

SD cards? Are they that popular nowadays?


You can still charge with the USB-C ports. The supplied power adapter is a USB-C charger with a detachable USB-C to Magsafe cord.


So it's now going to be possible to use MagSafe for non-Apple devices that charge over USB-C?


No. MagSafe is a different port. The MagSafe cable is MagSafe on one end and USB-C on the other, with the MagSafe end being the bit that pushes power out.


It's a USB-C brick with a USB-C to MagSafe cable. You can remove the MagSafe cable and use the brick with a regular C-to-C cable.


And the MagSafe cord retails for $50.

For a power cable.


I wonder if you can still use the old magsafe power cords to charge the new machines, even if at a lower wattage. Those charger cables aren't cheap, and I still have a few from old MBPs.


You can keep using them with a Magsafe-to-USB-C adapter, for example from Elecjet [1]. I have a couple of those and they have worked just fine with my 2017 MBP.

[1]: https://elecjet.com/products/anywatt-magsafe


I highly doubt it. MagSafe 1 and 2 look similar but have different physical dimensions, and MagSafe 3 looks to be the same story: the port looks significantly skinnier/shorter than 2 (about the same as the USB-C ports).


I'm guessing it's not physically compatible, since they made a point of calling it MagSafe 3. But we won't know until a machine lands in the wild.


It is shaped differently and they said it was updated to support more current.


Ya, for connecting to a hub it's awesome: just plug one cable in and you get power, monitor, keyboard, and mouse. Hope that still works.

Also, I like being able to charge on either side, it's annoying having the cable across when the plug is on the wrong side.

So if the Thunderbolt 4 ports support all this, then it's amazing, just best of all worlds. Otherwise a bit of a bummer.


Thunderbolt 4, while more expensive, supports all of this, and with a clearly marked cable. USB-C is a mess. It's worth the extra $20 to have a cable that can do both USB-C and Thunderbolt.


USB-C ports are more like adapters that let you plug in anything that isn't USB-C. There still does not exist a device for under 100 USD that will let you plug in the MBP's USB-C power, a USB-C (over Thunderbolt 3) monitor, and also a USB-C to Lightning cable. I hope you didn't have a wired keyboard, too.

There does exist a hub that has 2 USB-As and 2 USB-Cs for 50 USD, but it does not pass through video or power.


My guess is this change was brought about by all the poor-quality USB-C cables, hubs, and chargers out there. I bricked my MacBook Pro M1 by trying to power it from a supposedly compatible hub that wasn't negotiating the power output properly. I had to send it in for repair and have heard many other similar reports... guessing they decided to bring back MagSafe for fast charging to avoid this headache of cheap/improperly manufactured USB-C hardware.


The place I worked when the USB-C MacBooks came out had a hell of a time with video output. This monitor only works with a DisplayPort dongle, this one won't work with a new MacBook at all, this one works but only if you use one specific cable, this one works but glitches in ways it never did with HDMI, etc. There came to be a whole level of cult knowledge about which laptop + cable + monitor combos would work and which wouldn't.

It was a very dumb situation. If they really wanted to go to USB-C they should have had a transition period where they just replaced the thunderbolt and maybe the charging ports, until shit settled down—which it never really did, because the USB-C cable situation is insane, so here we are, with some ports added back.

It's 2021, almost 2022, and I still wish one of the ports they'd put back was USB-A. I'll probably feel the same way in 2025. Maybe by 2030, when almost none of these MBPs are still in use, I won't still need a USB-A port way more than I need a USB-C port.


I think that many people only decided that they didn't need these features because Apple removed them.

It reminds me of this video from awhile ago by The Onion:

Apple Introduces Revolutionary New Laptop With No Keyboard[1]

[1]: https://www.youtube.com/watch?v=9BnLbv6QYcA


Oh man, that's one of my favorite Onion videos. So many clever subtleties that make it a UX nightmare. Like how the first suggestion when you select T is the unicode TM symbol, the absurd sentence predictions that come up first when you type "the a", scrolling through an alphabetical list of every file on your hard drive.

I also imagine they were making a joke with the $2600 starting price, which is funny cause that's what I paid (albeit in Canadian dollars) for my base-model 14" MBP yesterday


They kept two USB-A ports on the 2020 M1 Mac mini. I’m grateful for that, but having them provided by a dock is fine for me.

It’s possible they’ll show up in upcoming machines like the iMac Pro and Mac Pro.


> My guess is this change was brought about by all the poor quality USB-C cables, hubs, and chargers out there.

I doubt it, you can still plug the usb-c >> magsafe3 cable into a crappy hub.

It’s almost assuredly to bring back the “my idiot pet or rambunctious child just ran through my power cord” breakaway support.


This is moot now, right? Because even though the MagSafe port is back, they are still shipping with a USB-C charger by default.


And you can still plug a sketchy USB-C charger into one of the 3 ports.


The announcement implies that they no longer attempt fast charging over USB-C, so the lower power requirement will improve reliability with 3rd-party hardware.


What? Where does it imply that?


These new computers can also be charged over USB-C.


> I hope USB-C can still be used for charging

They said it can.


SD cards are still widely used in professional cameras, and they're not going away anytime soon.


Yeah, I have one, but I've used it so rarely since I got a Pixel 4.

I was also looking at Wi-Fi-enabled SD cards, because I'm lazy and would like the images to sync themselves.

I thought pros used Wi-Fi SD cards already.


Wi-Fi SD cards are kinda cool, but I don't know any pro that uses them. The write speeds tend to be slow, which is noticeable even for still photography, never mind if you're recording 4K video on a prosumer drone or camera. Secondly, they're inside a device and have the antenna built in, so as well as draining battery it's slow to transfer files off. If you're shooting 3 MB JPEGs then it might be OK, but my 6-year-old DSLR is producing 30 MB files per exposure.


You can still use the Thunderbolt/USB-C ports (not sure if all three or just one) for charging. But I think only the new MagSafe 3 port gives you fast charging (50% battery in 30 minutes).


> SD cards? Are they that popular nowadays?

For photographers, absolutely.


> So basically even the new 16" is still thinner than the MacBook 2015 era

Oh wow. Apple's photos make it look so fat. I'm still using a 2013 15" MBP and am very happy with how thin it is.


Yeah, that's what I didn't get either. From the side view pictures, I actually estimated that it got thicker.


Great specs, but since I already have a 2020 M1 MacBook Pro, I'll hold off for a few more years. I'm hoping by then Apple will finally have an OLED (or equivalent) screen for their MacBooks.


That'd be micro-LED; still some years out for that.


"Now I just want to know if the new keyboard has more key travel distance back to the like of MacBook Pro 2015." Assuming the keyboards in these new Macbook Pro's are similar to the keyboards in the M1 MacBook Air then they will feel very similar to the 2015 keyboards. Indeed I think I found the M1 MacBook Air keyboard easier and faster to type on vs. my 2015 MacBook Air. The 2015 MBA keyboard definitely is stiffer/heavier. And the keys wobble more - maybe wobble isn't the right word but it's not as precise. Could be it's age at this point and the M1 MBA was brand new; would take more than the 2 weeks I used the M1 MBA to suss out the real differences. I returned the M1 MBA because I needed more RAM; I have a 16" M1 Max on the way. Come November 3rd I'll be able to compare them directly :)


> HDMI, SD Card, and MagSafe. Things people on the internet inclusive but not limited to HN said they will never come back because the future is USB-C.

They don't say it anywhere, but I hope the HDMI inclusion is for the 2.1 version's increased bandwidth (even over TB4), potentially allowing for single-cable 8K at 60Hz.


Citing the technical specs, this [1] says it's only HDMI 2.0. The (German) spec page [2] only says "4K at 60 Hz", so I guess that's right.

[1] https://www.macrumors.com/2021/10/18/macbook-pro-hdmi-2-0-po...

[2] https://www.apple.com/de/macbook-pro-14-and-16/specs/


Thanks, I missed those separate sections about what each individual port can do: that's a shame!


>Now I just want to know if the new keyboard has more key travel distance back to the like of MacBook Pro 2015.

If it's the same as the 2020, then no. It's better than the butterfly but not as good as the 2015.


I am very curious about this laptop-thickness topic. I understand that going from a 3 cm laptop to a 2 cm one is a noticeable difference and might have some practical effects. I can believe that a 3 cm laptop does not fit a bag well but a 2 cm one does.

But here we go from 1.8 cm to 1.55 cm; does this make any practical, visible difference?


No. It's a matter of being used to something. When used to 1.8 cm, you go "wow" at 1.55, but going in the reverse direction is a shrugging matter. For weight it's the opposite. When used to a light laptop, you find a slightly heavier laptop to weigh a thousand tons, but going from heavy to light is a shrugging matter.


It does. If you compare an old MacBook Pro 2015 and a MacBook Pro 2016+, the difference is quite noticeable.

But whether the trade-off from 1.8 cm to 1.55 cm is worth it is a matter of personal opinion. In a perfect world we would want a 1.55 cm laptop with the same heat-removal capability as the 1.8 cm one.


IIRC most people on HN lamented the passing of MagSafe. A lot of people did defend Thunderbolt though. Guess they weren't heavy users of that god-awful plug form factor (it always stopped providing a good connection after about 100 plug cycles).


> 3456-by-2234 or 3024-by-1964 is 14:9 Ratio

If we subtract 74 pixels from their heights, we get 3456 by 2160 and 3024 by 1890, both of which are exactly 16:10 (1.6). So I'm guessing the notch is 74 pixels high?


Is the increased thickness the reason for the much longer battery life?


It's part of it. ProMotion helps, by increasing efficiency for screen redraws. A new display architecture helps, due to more power efficiency. More thickness means room for a larger battery. And M1 helps by being vastly more power-efficient than Intel.


Yeah, as a Mac user, I could not believe it.

I sold my Pro (2015) and bought a Pro (2019) with 4 USB-C ports, even though I missed MagSafe.

Now they've changed back to the 2015 approach. I feel cheated by Apple :(


Man, it's the usual game, they gotta sell 'em widgets: do a model with port A, everyone buys dongles and accessories, then a few years later change to port B, and everyone's gotta buy new dongles and accessories... Rinse and repeat.


I was skeptical going from the 2011 MacBook Pro keyboard to the 2021 MacBook Air M1... but the new keyboard is a _JOY_!


These look positively insane. 120Hz HDR displays. Can be specced with up to 64 GB of RAM and a GPU that (apparently) matches a 3070. All the ports you could ever want and magsafe. I can't wait to get my hands on one.

The notch doesn't bother me because it's literally more room on the screen. Laptops with the camera below the screen tend to have an uncomfortable angle that looks up your nose, and the design suggests that they may be adding Face ID in a future iteration.


Where did you see the "matches the 3070" thing? When I was (admittedly) skimming the two articles, the performance metrics were all about "performance per watt", which doesn't mean anything for actual performance in a "how long will this scene take to render?" sort of way.


They only say 'pro laptop GPU' and everyone is assuming a mobile 3070.

But we don't know what Apple is comparing.


If this is coming from this article: https://www.apple.com/newsroom/2021/10/introducing-m1-pro-an...

In the picture titled "GPU performance vs. power", that is "relative performance" with lower power consumption, so it doesn't say anything about literal performance.

Then there's this:

> In addition, the GPU delivers performance comparable to a high-end GPU in a compact pro PC laptop while consuming up to 40 percent less power, and performance similar to that of the highest-end GPU in the largest PC laptops while using up to 100 watts less power.[2]

Where that 2 points to:

> ... Discrete PC laptop graphics performance data from testing Lenovo Legion 5 (82JW0012US). High-end discrete PC laptop graphics performance data from testing MSI GE76 Raider (11UH-053). PC compact pro laptop performance data from testing Razer Blade 15 Advanced (RZ09-0409CE53-R3U1). Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.

  82JW0012US on Amazon has a 3050Ti in it
  11UH-053 has a 3080 in it
  RZ09-0409CE53-R3U1 has a 3080 in it
So there's some data. I've still got some real questions about why they're talking "relative performance" in their graphs, though.

(edit: formatting)


It was written very small in grey text on the bottom right of the slide. I didn't notice it during the presentation but people posted screenshots on Twitter. The "Compact pro PC laptop" is a Razer Blade 15 Advanced (rz09-0409ce53) and the "High-end PC laptop" is an MSI GE76 Raider (11UH-053). You can see it if you pause at 22:46 [0]. Both of these have RTX 3080s, and in Apple's chart their chips are slightly slower but at much lower power. What we don't know is what benchmark they used for this comparison, since 375 "relative performance" isn't a defined metric.

[0] https://youtu.be/exM1uajp--A?t=1366


What does the vertical axis mean in those graphs? "Relative performance" seems made up.

Certainly there are tasks they can do to skew that one way or another.


Probably covering their butts in case whatever benchmark they used works differently on ARM vs x86, or if there are weird macOS differences. If there were a bunch of performance patches made to the Mac version of Shadow of the Tomb Raider that made it faster on macOS, that's not because of a faster chip (just an example, I don't know how they benchmark).


There was a link in the slides for the models but I missed it.


M1 Max GPU specs https://youtu.be/exM1uajp--A?t=1218

Mobile 3080 specs https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-max-q...

fewer teraflops, but higher gigatexel/gigapixel rates


Here is another Mobile 3080 from the same site: https://www.techpowerup.com/gpu-specs/geforce-rtx-3080-mobil...

That seems closer to the power consumption they had on their comparison chart.


I don’t think we know. The Apple website shows GPU benchmarks for pro apps for the new MBPs, where there is a big performance leap relative to the top GPU from the last model. But I haven’t seen comparisons on gaming against nvidia GPUs, which I think is what would be needed to make such a claim.


Part of the reason Nvidia's GPU drivers are so large, and why they've been so unwilling to open-source them, is that a good amount of Nvidia's secret sauce for gaming is in the software. They'll work with devs on optimisation before launch, which might disproportionately benefit their hardware; sometimes the drivers will fix games for devs, e.g. swapping out shaders for visually equivalent but better-performing ones, or the z-index issues Cyberpunk had if you didn't update to the "game ready" drivers. E.g. look at the lows for Tomb Raider or anything for Red Dead 2: https://www.youtube.com/watch?v=KIwIeobQSnQ

It's why when Intel moved from the GMA to Intel HD (i.e. "can just about render a desktop" to "can play indie games"), it took years before games even reliably worked on Intel GPUs, never mind performed. Of course once they did it pretty much killed off the budget GPU market.

Even at the high end consumer market, it's why you often find AMD GPUs outperforming cost equivalent nVidia cards for compute use but then falling flat in games.

Apple has the advantage that the big game engines like Unity/Unreal etc. have been targeting their hardware for mobile support for a while now, so they arguably have a bit of a head start relative to Intel's entry, but I have to imagine they have some catching up to do before they run high-end PC games comparably. They will probably run the entire iOS catalogue at 120Hz, though.


Yeah, they also claim to have "studio quality microphones". I think specs are great, but comparisons are not always honest.


They made a direct comparison to "the fastest laptop we could find". I think there was fine print that had the exact laptop they compared it to, but I noticed it too late to read. But presumably this means at least 3070 performance.



Wow, this is going to hopefully support VR experiences where you carry the laptop on your back or chest…


No need for that with the new Quest.


Yeah, but then you'd have to settle for a Quest.


The Quest 2 is a major upgrade.


Hardware-wise maybe, but it's still bogged down by Facebook. Supporting them by buying their products, when the window of opportunity for user-respecting alternatives is still wide open, is frankly unethical.


Fully immersive portable AR…


...which has "NVIDIA® GeForce RTX™ 3080 Laptop GPU 16GB GDDR6" for its GPU.


Yeah, but the things that aren't clear from the comparisons are how underclocked those GPUs may be, and what power limit Apple set them to before running their tests.


In at least one of the charts they had the M1 using 100 W less power than the top spec laptop. So they must have had that thing clocked pretty high.


> All the ports you could ever want

I agree with everything else you said and I think overall these machines will be awesome.

I would disagree with you on the ports, though; I think this is kind of a miss for Apple. They caved in to some of the loudest complaints from several years ago, which were already coming from a loud minority, and that minority is now smaller than ever.

And if they were set on changing the ports these certainly aren't all that you could ever want. A few USB-A ports as well as keeping at least the 4 USB-C ports from last year (instead of reducing to 3) would have been more useful for more people than the HDMI and especially the SD ports are.

(HDMI used to be useful for plugging into a projector for presentations; that use case is now nearly non-existent, since all meetings tend to happen over Zoom, WebEx, Teams, Hangouts, etc., even when in-person. HDMI used to be useful for monitors, but increasingly DisplayPort over USB-C is supported by everything except the lowest-end monitors. SD used to be important for stills and video footage, but increasingly cameras use CFexpress, CFast, output over HDMI to an Atomos-style recorder, or even microSD for small GoPro-style cameras, all of which will still need dongles. Devices like Raspberry Pis require microSD, not standard SD).

They also could have added the nifty ethernet-over-magsafe via the powerbrick that the iMac has and they didn't do that.

Edit: One additional thought - I'm seeing from the comments some reasonable situations where HDMI still comes in handy: cheap monitors, plugging into a TV to watch something, and I guess there are still lots of people physically plugging in at work for presentations. Fair enough, but in that case it's weird that this is a "Pro" feature. These are all very non-pro use cases, and you'll still need dongles around for anybody with a non-pro machine (like a MacBook Air).


> HDMI used to be useful for plugging into a projector for presentations; that use case is now nearly non-existent, since all meetings tend to happen over Zoom, WebEx, Teams, Hangouts, etc., even when in-person.

I'm not sure you have the information to make the claim that the use case for HDMI is nearly non-existent. We use HDMI all the time. Availability bias is a thing. I find that people drastically underestimate how much they don't know about a subject or use-case of a product.


HDMI is ubiquitous. Any current projector, TV, or monitor (or any produced in the last 6 years) will have an HDMI port.

For a static port set, I think the burden of proof lies on whoever wants to remove the HDMI port.

However, I think Apple ought to have considered a Framework-style approach to ports. Why have static ports when you can have dynamic ports?


> Why have static ports when you can have dynamic ports?

Because dynamic ports (basically internal dongles) reduces the amount of internal volume they have to work with. This would cause compromises in other areas like a thicker laptop, less battery life, noisier fan, etc.


> Why have static ports when you can have dynamic ports?

The numerous USB-C dongles are basically dynamic ports, albeit with a lot more inconvenience. But internally swappable ports remove a lot of internal space.


Yep. I'm in Australia, in a state that has been largely back in the office for most of the year, and I use the HDMI port on my Dell XPS. Although because of the MacBook users, every HDMI cable in the office has a USB-C dongle stuck on it, so I wouldn't be too inconvenienced by the lack of the port either.


I wish I could plug into HDMI all the time. I still get a lot of VGA projectors!


I strongly disagree. They didn't cave in to a minority, they gave Ive the boot and are finally putting back so many ports they know they should never have removed.

Why shouldn't they have been removed?

Because USB-C is awesome and _almost_ everywhere, but you need just ONE important occasion where you have to connect a screen, plug in an SD card, or attach an external (pen|hard) drive, maybe in front of customers, and you get infuriatingly frustrated and regret buying this stupid trap, which has half the ports of the one from 10 years ago but costs 3x as much. And that just lost them a customer.

The old one would have worked. Who's the idiot who came up with this contraption?

Dongles are NOT the solution. They get lost or forgotten, they're messy.

Reducing to 3 USB-C ports is perfectly fine as we regain the dedicated magsafe.


> which has half the ports of the one from 10 years ago, but costs 3x as much

I'm with you right up 'til that '3X as much' part - MacBook Pros have remained static or gone down in real prices over the last decade - not 3x higher.


Maybe they bought a Chromebook 10 years ago?


Point taken, thank you.


For all the times Apple seems to be willing to "stick to its guns" and ignore consumer complaining, this seems like a weird time to give in.

They basically drew a line in the sand on USB-C adoption, and it pretty much worked. I'd argue that an HDMI port and an SD card reader are less useful, and certainly far less versatile, than additional USB-C ports.

For all the dongle jokes of a few years ago, I don't really see a reason to go back to less versatile ports in laptops.


Every monitor and TV manufactured within the last decade, without exception, has an HDMI input, and monitors that support Thunderbolt are nearly double the cost. Even if money is no object, that HDMI port will prove to be useful.


It's weird, though, to sell a machine with a super-high-resolution 120Hz mini-LED screen and then optimize the ports for the cheap $400 monitor market.

You're right that the HDMI port might come in handy in random situations; better to include it than not. The SD port is the one I really don't get. And I still maintain that the continued lack of USB-A ports is a much bigger dongle problem than the lack of an HDMI port was.


They didn't optimize the ports for the cheap monitor market, at all. These machines support Apple's extremely expensive display and all the various 4K and 5K displays out there, admirably.

And they ALSO have the port that is, by far, the most needed and the most bitched about when absent for nearly all business users: HDMI. Far and away the best thing they could have added.

I personally will not use it often, but when I do, it will be invaluable, and for the users I support, it will be a tremendous increase in convenience and ease-of-use.

If you need a lot of USB-A ports, get a CalDigit dock and be done with it. If you need just 1, there are extraordinarily tiny adapters.

It's time to get over USB-A.


Which reminds me: I want more USB-C ports. CalDigit's latest Thunderbolt 4 dock finally added more (I believe the available chipsets previously maxed out at 2 ports). One of those, plus a 14" MacBook Pro with the M1 Max, and I think I'll be set for computers + connectivity for a very long while.


If I had one of these machines, I would not be using that port at my desk.

However, I would be using it in every conference room I use, multiple times daily. It’s the difference between plugging in the cable vs. finding the right dongle that’s security-cabled to the presenter cable, and connecting it in the middle.


SD is absolutely massive in Asia, on a level that I think cannot be gauged properly from the West. I reckon these ports have been added with an eye to the Asian market rather than California.


That's interesting. Are you talking about microSD in smartphones, or standard SD?

MicroSD support would actually have made more sense to me than standard SD. But from the looks of it, micro cards will still require an adapter of sorts.


SD and microSD nominally inhabit the same space, but microSD is effectively limited to storage whereas SD can be used for other things; it's just more flexible to provide an SD slot, if you can spare the lateral space. Among other things, microSDs are so tiny that it is often more practical to put them in an SD adapter whenever you need to handle them. That's why SD adapters are so widespread, and you'll likely find one in the package whenever you buy a consumer microSD.


Yeah, that's a bit weird actually. I looked at my country's most popular comparison site and pulled up the top-10 most-viewed memory cards. Of those 10: 9 microSD, 1 standard SD.

https://tweakers.net/geheugenkaarten/vergelijken/#filter:q1Y...


You can get an SD adapter for a microSD card for less than 5 bucks. You can't adapt the other way inside a slot.


One actually comes free with most off-the-shelf microSDs. And exactly that: SD as a physical format is more flexible; nowadays it can be used even for non-storage cards.


> SD is absolutely massive in Asia

Why is that?


Cheap Android phones with decent cameras and expandable storage (microSD).

Edit: I'm obviously oversimplifying, but most of the world doesn't have always-on data connections (e.g. rural areas), and manufacturers outside of Silicon Valley do optimise for people who may want to download media so they can view/listen when they're not connected. Nothing beats a cheap phone with a replaceable battery and storage.


I assume it took off instead of USB storage for reasons lost to time.


It's not some great mystery: microSD cards are tiny; USB storage isn't.


Because the cheap HDMI monitors might be the only ones available to you as your company calls people back into the office, now with the added beauty of hoteling.

/snark


SD cards are essential for anyone who does video or photography seriously - and that's a big part of the market this laptop sells to. I'm glad it's back.


Nope. I do them seriously. My Mavic 2 Pro has a microSD slot, and my Canon EOS 5D Mark IV has a CF slot (it also has an SD slot, which is slower and which I've never used).

SD card reader is a nice addition for a handful of pros, sure, but definitely not essential for everyone who does photo/video seriously.


Yeah, and my Blackmagic cameras have SD/CF/SSD, but SD is super common; most cameras have it. New UHS-II SD cards are pretty fast - https://www.cameramemoryspeed.com/sd-memory-card-faq/fastest...

MicroSD->SD is easy with a dummy adapter


My company had to dongle up EVERY meeting room once developers switched to MacBook Pros with USB-C. But we still have non-Mac users who use HDMI. To this day, every meeting has the potential to start sixty seconds late while we navigate through dongle hell…


The HDMI is for plugging into displays that are available to you when you are out of your office. Since they all have HDMI inputs this is a very sensible port.


Depending on the HDMI spec supported, an HDMI port can drive a pretty high-specced display. I am not sure where this cheap-monitor thing is coming from.


USB-A dongles make money, and I'm sure the removal of MagSafe cost them on AppleCare.


Anecdote, but... I rarely, but sometimes, need to read SD cards, for e.g. embedded systems, RPi, Nintendo Switch, etc. Not enough to go out and buy a brand new USB-C dongle which I would then lose somewhere in my house, but enough that it's annoying to not be able to use my primary laptop with it.

Don't get me wrong, I would still buy a MacBook with no SD card slot, but I would occasionally run into cases where I wish I had that dongle that I lost somewhere.


Well these are all microSD cards, so you would still need an adapter :)


The SD card slot is a weird duck.

There's a single use case which can't easily be replaced by a USB-C dongle: using it as makeshift additional "internal" storage.

SD cards work pretty well for that, for most read-heavy use cases. On older Macbooks, with undersized internal storage, I'd keep an SD card in the slot. I'd keep downloaded media there, installers, even infrequently used applications. Assuming one's using a high quality SD card, performance was "plenty good enough."

I wouldn't keep anything crucial on an SD card, but everything on my local storage is backed up to the cloud anyway, so not really an issue.


Take one of the "pro" markets—photographers. They sure have to deal with SD cards, and they often shoot tethered, which almost always means HDMI connection.


Tethering isn't done over HDMI. Invariably it's proprietary, Micro USB-B, or Micro-USB 3.1 to the camera, and a plain old USB-A to the machine. Wireless tethering is a thing too.


I imagine the Air and base-model Pro will still remain USB-C-only.


I really hope they bring MagSafe back to at least the Air in next year's rumored redesign and refreshed chip.


I was also hoping that they would somehow be able to shoehorn an ethernet port inside of the power adapter, like on the M1 iMac.

On my M1 MacBook Pro (which, throughout the pandemic, has rested on my desk in clamshell mode all day, like a desktop), I use a

  ethernet <=> ethernet to thunderbolt 2 <=> thunderbolt 2 to thunderbolt 3/usb-c

adapter chain, which takes up an entire USB-C port, since it doesn't work at all when I plug it into the USB-C pass-through on my other USB-C hub.

Unfortunately, there seems to be some kind of hardware bug in the Thunderbolt 2 ethernet adapter, so even when the machine is asleep or off, it runs hot.

It also seems weird that Apple doesn't manufacture a first-party ethernet <=> USB-C adapter; if I'm not mistaken, the only one on their website is made by Belkin and costs $30. Moreover, I don't think anyone even manufactures a 10 Gbps ethernet adapter for the M1.


There are Thunderbolt 3 10gbit adapters that work on M1. I use a QNAP QNA-T310G1S on an M1 Mini with no issues. It's SFP+, which I prefer, both because of existing hardware I already have and for lighter power consumption compared to 10gbit over RJ45.

QNA-T310G1T is the RJ45 model, but there are other makes available too.

(I bought the Mini at launch, and they've since released a SKU with built-in 10gbit Ethernet)


> (HDMI used to be useful for plugging into a projector for presentations; that use case is now nearly non-existent, since all meetings tend to happen over Zoom, WebEx, Teams, Hangouts, etc., even when in-person. HDMI used to be useful for monitors, but increasingly DisplayPort over USB-C is supported by everything except the lowest-end monitors.

> the cheap $400 monitor market.

Sounds like your employer is great and you're rich.


>Fair enough, but in that case it's weird that this is a "Pro" feature.

Doing presentations at work is a non-pro use case?


If the HDMI port supports 100% of whatever resolution/framerate is possible for modern top-end monitors, then I'd be happy to see the end of fighting with DisplayPort. Getting a MacBook Pro to work with modern monitors is a buggy mess, with loads of hassles [1].

I'd be happy to know that they made this choice deliberately to fix whatever issues were, I guess, plaguing the USB-C display connections.

[1] https://forums.macrumors.com/threads/some-users-having-exter...


I believe the 3070 uses a different die from the mobile 3070. The mobile and discrete parts have completely diverged so comparisons are meaningless. Laptop GPU performance is entirely a question of power and cooling.


In their charts, they showed it's about even (in performance, with much lower power consumption) with a laptop using an RTX 3080 (Mobile), which performs about as well as an RTX 3060 Ti / 3070 (Desktop). So that's pretty wild. More and more games are getting playable on macOS too, through native ports or emulation, so in theory you wouldn't need a gaming PC anymore. These numbers are for the M1 Max though, which is a bit more expensive.


Not sure where you’re getting that from? This [1] appears to be the machine they compared with (assuming the part number actually refers to a unique spec), which only has a 3050 Ti?

[1] https://psrefstuff.lenovo.com/Detail/Legion/Lenovo_Legion_5_...

EDIT: Nevermind, they mention other laptops in the other slides, some of which do have a 3080.


It's here: https://www.youtube.com/watch?v=exM1uajp--A

@23:15, the high-end laptop being an MSI GE76 Raider 11UH-053.


Fascinating. I have pulled the trigger on an M1 Max to replace my i9 MBP, and wasn’t hopeful for the GPU.

But now I am VERY curious how well it will do at gaming using CrossOver Mac. I'd been eyeing getting a 3080 laptop as a secondary for mobile gaming, but an M1 Max + CrossOver Mac setup might scratch that itch with only a single laptop. A fun experiment for next Tuesday.


A desktop RTX 3070 is 50-60% faster than a mobile RTX 3070. Which of course is understandable.


> Which of course is understandable

It really isn't. Nvidia should be using clearly differentiated model numbers to represent two completely different lines. They did this to intentionally confuse customers.


"They did this to intentionally confuse customers."

Same for Intel; there is a huge number of people who were asking me why their laptop with a dual-core i7 is slower than a desktop with an i3.


They even did so for a generation, with the 10xx series.

But their partners probably weren't happy that the highest-end laptop SKU was an xx60, so here we are again.


Yes the naming is very confusing but it is understandable that a desktop GPU can be faster than a laptop GPU.


They're both GA104, but the desktop version does have 15% more cores enabled and 30% higher clocks.


I have a bunch of questions about the notch:

- When you go "fullscreen", does the fullscreen window stop at the bottom of the notch, or like can you see the notch in maximized Netflix videos? In the keynote I thought I saw a maximized video whose height stopped just under the notch.

- So is the extra notch space used only for the menubar? What happens when you auto-hide the menubar (which I do)? Is that functionality disabled on this hardware?


The "fullscreen" screen area is the 16:10 that results from subtracting the notch height. The area left and right of the notch is strictly for the menu bar and will be blanked out on video or full-screen apps.

I guess that if you auto-hide the menu bar, a black strip will mask the area.


Note that it starts at $2,500, and that's probably with the lowest CPU and 16 GB of RAM. Probably gonna be $5K+ when specced to what you mention above.


The M1 Max with 64 GB RAM is $3,800, and the lowest M1 Pro with 16 GB RAM is $2,000.


A United States order for the 16-inch with 1 TB SSD and 64 GB RAM shows $3,900. Brutal.


I just paid $7,063.33. It should arrive in two months lol.


What specs get it to $7k?


64GB memory, 8TB SSD puts you at $6k, plus $500 in taxes. Add on 3 years of AppleCare coverage for $399 and you're basically there.
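
As a quick sanity check using just the figures above (the "$6k" is a round number, so the gap vs. the $7,063.33 order total is expected):

  # Rough reconstruction of the ~$7k total from the quoted figures.
  config, tax, applecare = 6000, 500, 399
  print(config + tax + applecare)  # 6899, in the ballpark of $7,063.33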


Gotcha, I completely glossed over the SSD options.

Since I replace my machine frequently enough for work, 32 GB and 1 TB is plenty for what I do.


All of them :P


Yep. Starts at $2500 CAD before the extra 13% tax :( and that's not even the spec I would want


The 1 TB default SSD, and having to spend $400(!) to get up to 2 TB, drives me nuts. I know the 1 TB is just a teaser, but that's still a crazy jump.


> Laptops with the camera below the screen tend to have an uncomfortable angle that look sup your nose, and the design suggests that they may be adding Face ID in a future iteration.

I agree it's probably designed with a future Face ID implementation in mind, but if this isn't the case, Dell have managed to move their camera back up above the screen on the latest XPS models. In my opinion, this is the best-looking option currently:

https://m.media-amazon.com/images/I/81FV+91am5L._AC_SL1500_....


Can the notch be "hidden" with a black toolbar?


Yep, it can be 'hidden' with the menu bar. They specifically mentioned and showed it in the keynote.


Probably - some of the marketing shots they showed in the event showed it being hidden with a black bar, but I couldn't find any that showed the toolbar itself being below the notch. At any rate, if it doesn't exist at launch, someone will make an app for it.


And if Apple doesn't provide that app, it will break after every update.


Also: Can the mouse pointer be "hidden" (moved) under the notch?


Yes. Tim said the notch looks great in dark mode.


Yes, it looks _really_ nice. Specced with the crazy graphics, 64 GB RAM, and 2 TB storage, it costs $4,000. Ouch! Is the Max CPU still fanless?

(I really wish there was a matte screen option at this price point; my old MacBook Pro was a very expensive mirror, and it puts a lot of strain on the eyes to try to concentrate on what's on the screen rather than behind it.)


None of it is fanless on these models. This is more of a workstation Core i9 replacement.


Only the Air is fanless, every other M1 * Mac has fans.


Tip: search online for "custom screen protector". Some companies will cut a screen protector to a custom size; just measure the length and width of the screen. For my 2019 16" MBP, I ordered a matte one and it's been glorious ever since.

For Dutch people: https://www.smartfolie.com/


I think it's about the same if compared with a Windows-based laptop with the same specs (GeForce RTX 3070 Mobile) and so on.


it costs $4,000. Ouch!

A Windows laptop from HP or Lenovo with similar specs/performance won't be cheaper.


Oh c'mon... I'm not going to argue that battery/build quality is better on the Apple, but that is some serious FUD.

Literally just picked up an Asus Zephyrus G15 15.6" with a 2560x1440 QHD 165 Hz display, AMD Ryzen 9 5900HS, 16 GB memory, NVIDIA GeForce RTX 3070, and 1 TB SSD for $1,850 at Best Buy, then picked up 2x16 GB DDR4 3200 MHz for $150 off Amazon to upgrade to 32 GB.

Weight is a little over 4 lbs (4.1 lbs).

Total cost: $2,000


The device doesn't even have the same specs: the $4,000 is for a MacBook with a 2 TB SSD and 64 GB of RAM; yours has a 1 TB SSD and 32 GB of RAM. That's half.

Other than that, yes, MacBooks are expensive. But so are HP EliteBooks or Lenovo ThinkPads.


You can get comparable ThinkPads at ~$2,800 if you use the secret store and/or wait for sales.


Can you elaborate on the "secret store" ?


There is a Lenovo storefront called "Perkopolis", to which you can find an entry key fairly easily, that has deeply discounted ThinkPads. So, for example, a 5800HS + RTX 3060 + 16 GB RAM + 512 GB SSD laptop is going to run you around $1,200 there vs. $2,000 without a discount.


Is this a US-only thing, or is it available elsewhere?


You are comparing a high-end MacBook to a mid-range Asus. Build quality costs money.


As someone who owns an Asus and a work-provided Dell XPS, and has extensively used a 2016 MBP... Yeah, no. Build quality differences are minimal, and at least on the Asus and to an extent the Dell not everything is glued/soldered on so it's maintainable. And it wasn't Dell or Asus who shipped faulty keyboards they refused to replace.


I own both an MBP (2019) and an Asus Zephyrus G14 (2020). While the latter is good value for the money (I use it for gaming), its chassis, screen, sound, touchpad, fingerprint sensor, and reliability are much inferior. If you want to compare the new MBP to PC laptops, at least compare apples to apples (ROG Strix, Legion 7, etc.). Otherwise you're just arguing that a high-end laptop isn't worth the money to you, which is fine but has nothing to do with Mac vs. PC.


Total cost: $2,000

What's your point? Of course you can get a laptop with lesser specs from a cheaper manufacturer for less, especially if you upgrade it yourself after the fact.

But if you price a ThinkPad P series or HP ZBook with 64 GB RAM, 2 TB storage, and similar GPU/CPU performance, you are looking at $4,000. I know because I have one.


A 2 TB SSD and 64 GB of RAM will cost around $800, so that will be $2,600.


Have you been to Lenovo's or HP's homepage and seen what they charge for their high end ZBooks and P series laptops with those specs? No one is saying that you cannot get a Windows laptop with similar specs for cheaper if you buy from a cheaper brand and upgrade it yourself. But if you want a top of the line high end Windows laptop from someone like Lenovo or HP with matching specs then those companies charge basically the same as Apple.


Yeah, I have. Lenovo has a really high list price, but it has a secret store and a lot of sales.

Anyways, that's the beauty of PCs: you don't have to pay the exorbitant prices of the OEM if you don't want to. A Lenovo P15 will allow you to swap the RAM and SSD however you want, and it takes 5 minutes and like 9 screws.


FWIW, a similarly spec'd 16" MBP (32GB, 1TB) is about $3,100, though it will likely outperform the Asus in various measures that may or may not matter to a particular owner.


Can confirm. Currently using a P52 with 64 GB RAM, a Xeon processor, and a 4K screen. It probably cost my company 5-7k after all the enterprise support fees and such. I'd much rather have a Mac; the thing is just way smaller and lighter.


The Pro is faster than a laptop 3050 Ti [1]. The Max is slightly slower than a laptop 3080 (looks like 8%) [2]. Relative performance is somewhat meaningless, and of course also depends on what features were being used. However, my M1 Air compared extremely favorably against similarly priced PC offerings, and I expect my 16" M1 Max will compare favorably with a 3070.

[1] https://live.arstechnica.com/apple-october-18-unleashed-even...

[2] https://live.arstechnica.com/apple-october-18-unleashed-even...


We don’t know what those tests are really measuring. Is it a game? Or some video production workflow? And what about features like ray tracing? Does the M1 Max support all the same technologies? Apple’s benchmarks seem purposely obfuscated.


Apple probably cares more about things like Adobe software, as that's what its customers mainly use.


>that (apparently) matches a 3070

I don't want to know the thermals on the laptop then.

There is a reason the 3070 comes with a minimum of 2 large fans, and that liquid cooling exists at all. How do you want to cool the same performance in a thin laptop? With passive airflow?

For reference, this is what a laptop with the mobile (!) version of a 3070 looks like:

https://gzhls.at/i/89/84/2618984-n1.jpg

It's also 1199 euros, just saying.


I noticed the feet on this laptop seem a lot more prominent than on previous iterations, and I'm sure the reason is to allow better airflow.


That would make a lot of sense. I've noticed that my 2015 MBP deals with heat a lot better if it's raised up off the surface it's sitting on.


They do seem more prominent.

It's not clear to me that's a cooling win, though? Unless there are actually some kind of vents on the bottom?



Passive airflow and a heatsink?


There's two fans so: not passive.


They claim the M1 uses 4x less power when running at full speed, so thermal output will also be 4x lower.


CPU is completely dwarfed by GPU here.


Given how M1 Macs perform, thermals are significantly better than any laptop with non-Apple stuff inside.

Nvidia/AMD just don't design with thermals in mind, and that's exactly why you need to attach multiple helicopters to every piece of their hardware.


The 3070 is Samsung 8nm vs. what I believe they said was TSMC 5nm here. Being an all-new architecture, perhaps it also has thermal design improvements.


Well, yeah, putting the camera below the screen seems borderline insane, but the notch definitely takes away beauty from Apple laptops, known for their incredible sense of design.


Personally I think the 2020 design looks bad because the top black bezel is thicker than the sides, while things like the iPad have it uniform around the whole display.


> All the ports you could ever want

Add 4xUSB-A and Ethernet please.


The notch bothers me a lot. I was ready to buy, and I hope all you guys enjoy it, but I'll be sitting the notch out. It was one thing on a phone; this is just dumb.


You are getting more screen, not less. Nothing dumb about that.

I can see someone having an aesthetic issue with it, as a distraction perhaps.

But having a fat bezel at the top, as all other bezels shrink toward zero, is also aesthetically non-optimal.


> I can see someone having an aesthetic issue with it, as a distraction perhaps.

The person who can't get any work done because of the notch staring back at them probably has other, bigger issues going on.


or, it could be a real thing with people and you should gfy


It's just an addition; it doesn't remove any real estate from the existing screen. Use a dark/black background/strip along the top and it should solve the issue. Sure, not ideal, but solvable.


Personally, the investments in the camera and display are wasted on me. I run my MBPs in "clamshell mode" and use external displays and cameras.

The 64 GB of RAM is huge though. It only took them half a decade to figure it out.


64GB RAM was available on the 2019 Intel 16" MBPs (at least). This comment was written from one of those with 64GB RAM.


You should wait for the "Mac Mini Pro" which will have M1 Pro and Max options


For me that’s not an option, because I use my MacBook with an external display all the time, but I also use it as a laptop from time to time. Mac mini is not a solution for me.


Yeah, I'm kind of in that inconvenient spot as well. I use my laptop docked 99.99% of the time.

That other 0.01% of the time is the killer.


I am a fellow full-time clamshell MBP user.


Wow, looks like Apple has abandoned every bad decision on the MBP of the past 5 years in one swoop. No touchbar, increased key travel, and added back HDMI/SD/headphone/power jacks.

Plus a bumped-up RAM limit, M1, new displays, 120Hz... Wow.


As someone who bought a mid-2018 with the escape key on the touchbar, then watched the next model get its own key, then watched all these ports get added back... I'm extremely frustrated, to say the least. Trade-in is valued at $840. If I had known either of these two updates would have gone "backwards" like this, I would have waited to purchase.


Funny you say backwards: this 2016 review predicted this (in its way) https://blog.pinboard.in/2016/10/benjamin_button_reviews_the...


This is amazing: having only vaguely followed the announcements, I thought it was published today until the very end.


I've been sitting on my 2015 MBP, considering it good enough for my needs until now. I'm sure I'm not the only one.


You aren’t. Very ready for this upgrade.

ps… 6 years for a laptop that… works very well to this day is why I have no qualms about the price of this thing.


Same! Still have the 2015.

Really the only thing that doesn't work well on it to this day is the dreaded Microsoft Teams, which brings the thing to a screeching halt.

Well, plus trying to do more than minimal VM/container use, to be honest.


We have a lot of people at work with 2015 models - and they are SUPER happy. I have the 2018 model with i7 and I can't really see any difference in performance. I love the display, but mostly I use a 27 inch high DPI external screen along with an Apple magic keyboard.

One thing I strongly dislike about the i7 is how hot it can get with long compiles and such. So I think it's time to upgrade to the M1 Pro or Max.


I'm fairly sure that is an MS Teams issue; it runs like trash no matter how much hardware you throw at it.


Of course it's an MS Teams issue, but I am required to use MS Teams for work, so it's problematic to be using a machine that has trouble running it!

Unless you're saying even a new M1 mac has just as much trouble running it!


Got the same life out of mine, although the battery swelled up this year. :-/

Didn't bother to look into the cost of replacement as my initial research indicated that they'd want/need to replace the keyboard and upper chassis as well, so probably a $600+ repair. So I opted for a company laptop instead.


Yeah, my (maxed-out at the time) 2012 15" retina MBP is amazingly still usable, and there's finally a suitable replacement.


Team 2015 MacBook Pro here. This thing still works so very well and only turns into a pavement grinder when it comes to the fan sound when I play Civ6.


After the 2016 models came out I felt like I'd won the lottery when I found a 2015 model going cheap-ish.

It's been a long wait to upgrade


Yep. I have a Mac mini, a 13-inch retina MBP, and an 11-inch Air, all from around 2013, and they just dropped off the macOS update path in the last few months. All are more than adequate for general office work, including software dev.


Back in 2018 I chose to buy a refurbished mid-2015 specifically to avoid dealing with those issues. And feeling no frustration whatsoever :)

I guess when you take a stand and refuse to hand over your money for anti-features, you do get rewarded sometimes.


The early-2015 model has been working wonders for me over these past years. Now the hard part is choosing between the new MBP and a Framework. Hard choices.


Having had to send in my early-2015 for repairs after accidental liquid damage, my next purchase will be a Framework. Mac repair costs are insane. I would have tried the repairs myself (thank $DEITY for YouTube repair videos), but it was going to be a multi-hour process fraught with risk, so I sort of get why Mac repairs are costly.

Since I had no AppleCare, as I chose to "self-insure" (it's a 6-year-old laptop), and the Apple Genius helpfully informed me the repair costs were the same as the price of a brand-new Apple laptop (what a coincidence!), a 3rd party handled my repairs, and it was still pricey. I'm comfortable with DIY repairs, so I see a Framework laptop in my future when my Mac bites the dust or needs more costly repairs.


The Framework laptop is a good choice—I'm typing this reply from one.

My least-favorite thing about the older MacBook Pro I migrated from was that (even when it was covered by AppleCare+) both times its battery failed I needed to wipe its SSD, hand it over to Apple for a week, and then restore it from a backup when it was returned.

If my Framework's battery ever fails I can order a replacement for $59 and replace it myself in minutes.

I see a lot to salivate over in the new MBP. But as long as I can avoid it, or until batteries become radically more reliable, I'll never again rely on a main laptop without a user-replaceable battery.


How did the switch to Framework feel in terms of daily use? Most importantly: how is the trackpad on Framework?


I'm mostly happy with the trackpad. I don't think it uses a haptic click like the MBP's trackpad, for whatever that's worth.

My one complaint about the trackpad, in comparison to the MacBook Pro's, is that while both support using two fingers to right-click, I have to be a bit more conscious about spacing my fingers slightly apart on the Framework laptop's trackpad (in Windows, if that matters) to register the right click, compared to on the MacBook Pro.

The build quality of the Framework isn't totally on par with the solid aluminum chassis of the MBP, but it's totally fine for me. The Framework's fan sounds less annoying, probably on account of the wide airflow through the bottom of the case.

The Framework is much faster than my old Intel MBP, but not quite as fast as an M1 MBP would be. The battery life is middling (but fine for my purposes). Overall I'm very happy with my choice after a few weeks.


Why couldn't you change the battery of the MacBook yourself?


Both times that the battery swelled up previously, the laptop was still under warranty. Apple wasn't going to hand me a new battery to install myself; I had no choice but to let them install it if I were to have it covered.

Now the laptop is out of warranty. If it were to fail now, I could in theory buy a battery replacement kit from iFixit, and muck around with adhesive removers and many delicate parts and hope I don't break anything else in the hours-long process: https://www.ifixit.com/Guide/MacBook+Pro+13-Inch+Function+Ke...

That's a fairly user-hostile situation, though. There's a reason Apple doesn't advertise the battery as user-serviceable, in contrast to some other manufacturers.


Ignore the "trade-in" value. Your machine has excellent resale value, because it's a Mac. Take advantage of it.


> Your machine has excellent resale value, because it's a Mac.

If the laptop is from Apple's garbage era of 2016-2019, who would pay good money for it? Macs don't unconditionally have excellent resale value; the resale value depends on the quality of the item.


A lot of people will pay very well for those machines. I hear that you think otherwise, but that's because you haven't actually bothered to check. I agree with you that the 2016-19 machines are inferior, though.


Put it on eBay with an auction starting at 99c and see where it ends up, you'll be surprised.


Same here. I bought the mid-2018 for $3.5k (full spec). It had battery issues, and now I think it's mostly just junk, especially since it's quite slow with the new macOS.


Use the keyboard service program[1]. A full keyboard replacement includes a new battery as well.

https://support.apple.com/keyboard-service-program-for-mac-n...


You should never upgrade to the latest macOS due to performance loss issues.


It depends. Sometimes macOS upgrades are worth it for new features.

For me personally, I installed Big Sur for the ability to use Stereo HomePods as an AirPlay output (previously you could only choose one, but not both in stereo).

It made my movie/tv/video watching experience so much better that it's worth the downside of slightly lowered performance.

That said, because macOS is not a fully sandboxed system like iOS, it's probably worth doing an erase install every 3-4 years, especially if performance has dropped noticeably. I did this with the 11.4 mid-cycle update earlier this year and it was helpful on my 2014 machine.


You definitely should, because of all the hacks that have come out.


Seems Apple loves to Think Different every couple of years. They make these paradigm shifts so quickly, it's a blessing and a curse.


Just swap CapsLock with Esc
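
For anyone who wants that remap without third-party tools, here's a minimal sketch using macOS's hidutil (usage IDs per Apple's Technical Note TN2450; note this style of remap doesn't persist across reboots unless you wrap it in a launch agent, and System Preferences > Keyboard > Modifier Keys can map Caps Lock to Escape with no code at all):

  import subprocess

  # Remap Caps Lock (0x700000039) to Escape (0x700000029); usage IDs per TN2450.
  mapping = ('{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,'
             '"HIDKeyboardModifierMappingDst":0x700000029}]}')
  subprocess.run(["hidutil", "property", "--set", mapping], check=True)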


This is a price you have to pay for being an Apple customer.


Damn, I liked the touchbar :-(

I think that Apple's biggest mistake is not making it more obvious that the bar is editable in all programs. It becomes pretty useful when you can keep only the functions that you use the most.


While the anti-Touch-Bar folks across the internet are more vocal, I love the Touch Bar.

What's sad is that by killing it on the new MBP, they essentially kill it on my laptop too, as fewer and fewer developers will care to implement support for it. I wonder how long Apple software like Xcode will support the Touch Bar.

Edit: times it's superior to buttons: scrubbing videos, changing Safari tabs.


It's not that people are against the Touch Bar. People are for physical function keys.

If they had added the Touch Bar in addition to physical function keys then there would have been less resistance to it.


I am against the Touch Bar and would rather have a keyboard without function keys than a keyboard with a touch-sensitive area that I can easily brush with my hands while typing.


I’m against the touchbar, and haven’t used it at all for 5 years. It seems like it could be useful for casual users, but it offers almost nothing to pro users who already have keyboard shortcuts memorized.


It offers a lot to pro users who want to have more shortcuts than are practical to memorize.


I would rather have those shortcuts visible on the screen, where I am already looking, rather than having to peer down at the keyboard to find them. And I'd also rather use the methods I already have for issuing commands to my computer, the keyboard and pointing device, rather than using a third method for some random subset of commands.


> than are practical to memorize

How many could you fit on the touchbar? 20?


Around 60 I think. About six that were perpetually visible, then a dozen that were revealed via a "folder" that appeared when one of those was tapped. The remainder swapped out per-app (like, one set for Safari, another for XCode, etc.).


So you never changed volume or brightness?


They could have easily changed volume on their display - I had the touchbar but never liked the fact that you had to look away from your screen just to change the volume.

Changing volume is definitely one of those use cases where the physical function keys were much better IMO - it's one physical button press rather than touching the button, looking down at the slider, and trying to drag it into the right position.

I also use my laptop in a dock about 50% of the time, so end up just using the on-screen volume adjustment and keyboard shortcuts anyway... If Apple really believed in the concept they would have put the touchbar into their keyboards too.


They definitely could've made this more discoverable, but you can touch and immediately drag to change any of the sliders. No need to lift your finger or have a two-step process.


At that point I might prefer a touch screen that can be disabled. The extra real estate for the touchbar, given its minimal functionality, didn't make sense to me.


I also just cannot stop hitting it by accident. It's been a year and I'm still doing it constantly. I don't know what's wrong with me.


You and everybody else :) I think the combination of placement and behavior/sensitivity contributed to its demise.


Less? That's like 1.5 additional browser bars sitting there doing nothing. And we all know how much we hated additional browser bars (Remember the old times when we still had those? lolol).

As of now, it's 0.5 additional bars. The whole row after ESC is pretty much useless.


I can't imagine having so few Safari tabs open that changing tabs with the touchbar would be useful ...

Honestly, I'm on the side of "the touchbar is inoffensive". Never bothered me one bit, but I don't use it anyway so I don't care that they're removing it*. I'm sympathetic, though. I always liked the butterfly keyboard, and the general opinion on that has been made patently obvious.

* I wonder if this has to do with the fact that I never look at the keyboard, so it just doesn't occur to me to use the touchbar, same as I never saw the point of keyboard backlighting.


I also love the touchbar; the only button it's not superior to is the escape key. Who touch types the function keys anyway? Love it for the volume and screen brightness controls.

Not too worried about my laptop being killed though, they screwed the keyboard up so hard it's essentially dead anyway. Have to take it to the Apple Care center to have it revived again sometime soon because I've got 3 sticky keys that won't unstick.


I touch-type volume and brightness keys all the time. Back in the olden days I touch-typed function keys for debugging (F5 = continue, F6 = step over, etc).


Volume and brightness are probably the main reason why I absolutely hate the touchbar. It's clumsy to adjust, they never respond quite as fast, and sometimes take 2 or 3 touches to actually actuate, despite the button on the LCD changing its background - which means it physically registered my finger but decided to ignore it.

Second gripe is precisely its main selling point - that it can change based on context. I don't want my keyboard changing. I don't want to have to look down to operate my laptop; it breaks the flow and slows me down. I had 2 laptops with the Touch Bar. I tried every software for customising it: BTT, MTBMR (or whatever it's called), and at least one other. I always eventually pinned just the few buttons I use (no sliders): volume/mute, screen brightness, keyboard backlight.

Then, there's also the fact that you can't adjust its brightness or turn it off, and it's always emitting the full spectrum of light, which I find uncomfortable in the dark or late in the day. Granted, this is a smaller thing, which is why it's at the bottom, but it adds to the list of annoyances.

Edit: just remembered another one - it can freeze. Most of the time it can be fixed by killing a process. Other times, you need to reboot the computer.


> changing Safari tabs

FYI, command-<number> will let you switch between browser tabs. So command-2 will bring you to the second tab (starting from the left), etc. I tend to only navigate between the first five tabs or so in this manner (basically, what I can reach with my left hand), and I drag tabs I'm likely to navigate to like this to the left.

What this looks like in practice is:

<tab1> some admin console </tab1> <tab2> email </tab2> ... <tab10> documentation about the admin console </tab10> (drag documentation in tab10 to occupy tab2)

Now I can use command-2/1 to go between the console documentation and the console itself. IMO, the cost of dragging a tab to the left is much cheaper than having to break your visual connection with the screen in order to hunt through the touchbar.


It was buggy as hell, good riddance. For me the biggest annoyance was when volume buttons would freeze the touchbar, and the only way was to reset the audio system! Ugh!!

(Make it inherently safe - physical keys, so fewer chances of programmers ruining it).


>Damn, I liked the touchbar :-(

If Apple had released computers with the touch bar above the existing physical function keys, everyone would love it and other companies would provide a similar feature, the same way they have all converged on the industrial design Apple pioneered 20 years ago.

Instead, Jony Ive insisted on the touch bar replacing the function keys, and here we are. At least the butterfly switches are finally dead. (It only took years of worldwide embarrassment and Ive's departure to make it happen!)


As someone who bought a new MBP 2 years ago, and never wanted a TB, why do you like it? What's the use case? In my situation, I run in clamshell mode, connected to a large 4K monitor 90% of the time, so I can't even touch it most of the time, but I never found a use for it that wasn't inferior to the function keys it replaced.


In my particular case:

* Top row of physical buttons wasn't a big deal since I map Caps Lock to Esc.

* Changing volume and brightness is better on the touchbar: the buttons are large enough to find by touch, and sliding a finger is faster and more granular than repeatedly pressing buttons.

* Here's a tip: you don't have to move your finger from the button to the slider. You can just slide from the volume button.

* BetterTouchTool allowed me to have any shortcut in the touchbar.


I mapped out loads of complex shortcuts and automations to various apps on both a per-app and global basis.

- Instead of cmd-ctrl-opt-D in Safari, I’d hit the touch bar button that looked like an iPhone.

- Instead of hunting and pecking for XScope screen calipers, I would touch the bar button that looked like calipers.

Etc. I had several dozen per-app automations and shortcuts mapped to easily visible and understandable buttons, now I’m back to wondering if it was F3 or F4 that meant “hide the sidebar” in XCode.

This is a huge step backward. The Mac community has decided that it wants to stay in the past; Apple is saying, “fine, you dinosaurs want the ultimate 2015 MacBook Pro? Here it is. We will leave our engineers who think about the future to the iPad.”


Touchbar was awful. Making it editable in all programs was a nice gimmick, but in the end, who wants to look down at their keyboard to see which key they need to press next? There's also no mechanical feedback, which breaks the symmetry between the touchbar keys and every other key.


I loved the idea of it, but in practice I found that I barely ever used it, and the bugs and issues were plain annoying. Stuff like it getting stuck in 'volume control' mode after engaging the slider and being unable to close it, or persistent touchbar apps not being so persistent. At one point I attempted to use one of those dock-to-touchbar apps, which worked great and I assumed would help my productivity since I use the dock in hover-to-show mode, but I found that it often just disappeared, taken over by default functionality. That might've been an app bug, I don't know. The promise of it was really cool, but it never ended up being all that useful for me, so I sort of forget it even exists.


So you're the one.


Yeah, me too - used it only for sound and video controls though.

I think they gave way too few APIs to devs for it to be truly functional.

The only time it let me down was with some emulators that needed the function keys, but the bar wouldn't display them.


Because I work at home and never travel anymore, I've been using a late 2019 15" for 2 years and I'm not sure I've had it in "laptop mode" often enough to have a real opinion about the touchbar.

I use an external keyboard and a pair of 4K screens with it, so it runs closed at the back of my desk. :)


Ha, is this going to be like Google's blob emojis, where everyone hated them while they were in use, and as soon as they were finally removed, loads of blob-lovers came out of the woodwork...


Function keys should be usable in many programs as well...


> Damn, I liked the touchbar :-(

Yep, that's the downside of a quasi-monopolist. Limited choice.


Bringing back a dedicated HDMI port is a downgrade. This port can only do 4K 60Hz HDR (implying HDMI 2.0), which is inferior to previously-supported standards in addition to taking up space as a single-purpose port.

The Intel MacBook Pro 16 supports DisplayPort 1.4 over USB-C, which can do 4K 98Hz HDR, and doesn't take up dedicated single-purpose ports.


Nope! Not a downgrade. Distinct upgrade. You are aware this machine has THREE Thunderbolt 4/USB-C ports, yes?


How many 4K monitors can it drive at once?


The maxed out configuration officially supports 3 ProDisplay XDRs (6K), one 4K screen via HDMI and the built-in screen.


“Connect up to three Pro Display XDRs and a 4K TV with M1 Max. Or connect up to two Pro Display XDRs with M1 Pro.”


Thank you!

Edit: but I wonder how many Hz the M1 Pro can drive a pair of 4K displays at...


Theoretically Apple could implement HDMI 2.1, which supports 4K/120Hz and above with DSC. Maybe they think HDMI is just there for compatibility.


The tech specs page only says 4K 60Hz HDR


Yes. The point is that HDMI itself is more advanced than Apple implemented.


Not a downgrade, as many conference rooms are still equipped with HDMI and it's still more practical to use when you are unsure about the setup.


Can't you still use the USB-C ports for the higher bandwidth if you want to (and have the dongle)?


But you can't use the HDMI port as a USB-C port. I would much rather have another USB-C than an HDMI that never gets used.


True, but you also have a dedicated power port, whereas before, if you wanted to plug in, that had to be one of the USB-Cs too.


I doubt there's another USB-C bus available to be used, so it's not really stealing one away (or if it is, it's shared with other built-in peripherals).


And they made a new bad decision: a notch.


This isn't a bad decision. They are literally adding screen real estate in an area that would otherwise be useless bezel.


The new Dell XPSes managed to squeeze their camera/mic up top without needing a notch[0]. I'm guessing Apple had the ability to do so too but decided not to, either because a) they're planning on adding Face ID at some point in new models, and/or b) they're trying to "unify" the design aesthetics of macOS and iOS.

[0] https://m.media-amazon.com/images/I/81FV+91am5L._AC_SL1500_....


Dell XPS webcams are generally really inferior to MacBook Pro webcams. https://www.youtube.com/watch?v=QgFd_w2n1es&t=193s shows a good comparison between the M1 MacBook Pro from last year and the 13" Dell XPS. The difference is night and day, even though the M1 MacBook Pro has a 720p resolution. I imagine the difference is similar with the larger 1080p webcam on the new MacBook Pros that, as mentioned in the announcement, has a larger aperture to let in more light and a larger image sensor.


Very easy to squeeze a shitty camera wherever you feel like it. Maybe Dell should move the Fn key to the left of the Ctrl key instead so their keyboard would be tenable.


The bad decision is not adding Face ID to the notch.


Why would I want face id when I can just touch a key?


Not everyone has the same abilities


If one can’t use a keyboard, maybe the laptop form factor is not the right one to choose.


I'm not who you are replying to, but as someone with a condition that leads to dry, cracked skin on my fingertips, it's not that I can't use a keyboard, it's that fingerprint scanning doesn't work for me. Thankfully it's not the only way to unlock a MBP.


This is going to sound awfully like shilling, but I'm a big fan of the "unlock with Apple Watch" feature of the Mac, and I've tried to reverse engineer it for Linux.


Look into KDE Connect which has a similar feature. Not proximity based as far as I know, but it handles the hard part of unlocking the device.


My problem with it is it only works 75% of the time. Another 15% of the time it just hangs indefinitely. And 10% it just doesn't even try.


This makes me curious whether you have used a Face ID device or not. I remember saying the same thing, but now I kind of "hate" using a Touch ID device.


I'd take a touch sensor on the back of the phone, like is common on Android devices, over Face ID any day of the week; by the time the phone is level with your face, it's already unlocked.


Then you’d have to pick it up each time (or use PIN). FaceID works pretty well even when the phone is lying in front of you.

I also think that with FaceID my phone is generally unlocked when I pick it up as well.


Apparently you haven't been wearing facemasks for the last year like the rest of us...


I have, but I think I unlock my device at home/work/maskless 99 times for each 1 time I need to do it with a mask. And 99 times I do that, I am generally in a place where I can just pull down my mask safely for 1s. That last 1 time I am OK to use the PIN.


Because with Face ID the computer would be unlocked before your finger even reaches the key.


They can't fit Face ID within the thickness of the lid on the MBP.


Maybe they are going to remove Face ID from iPhones instead, which would be a good decision?


I would like that, but I assumed I was in the minority - have you heard that's planned?


I've never heard a rumor of anything approaching "Apple plans to get rid of Face ID". I don't see how it's possible, at least until fingerprint readers under the screen are 100% reliable, and even then. The only time I've seen people complaining about Face ID is when wearing masks (though pairing it with an Apple Watch fixes that).


> at least until fingerprint readers under the screen are 100% reliable

I am using a $200 Oppo with screen fingerprint reader, never had any issues.


Interesting! How fast is it?


Takes 1 second or maybe less than that. I have never thought about it because it just works.


I don't see Face ID ever fully going away, even if Touch ID returns as an option. The depth-sensing camera is used for other purposes, too, like Animojis.


Pure speculation, but I reckon this is due to security concerns. The camera on a laptop is off by default, including when the laptop opens. I for one have always been a little concerned that most phones (Android and iOS) don't have a "camera on" indicator.


Hey, I can agree with this and who knows if they aren't planning on doing it.


Or use a hole-punch camera.


>"This isn't a bad decision."

Of course it's not a bad decision. It's an idiotic decision. They literally added something that spoils what was fine. They could have just fixed their previous idiocy, but no... they had to add another one. Even the fact of its presence is annoying enough to avoid buying such a model.

It is "uuugly" (with S.Jobs intonation) and that should be enough.

It's also an invitation for more software bugs and unnecessary complications. My menu bar is set to autohide, and even that is not working properly, because some windows from time to time get stuck in a mode where they fail to extend themselves to occupy the space where the menu bar was. They get stuck at FullScreenSize.Height - MenuBar.Height, and you cannot change it unless you reboot the machine. It's obviously a bug, and they haven't been able to fix it for many OS versions already. Imagine adding to this bug another complication with the notch... Goodness.

Let's not forget that other idiots are going to copy it everywhere without too much thinking, too.

"Ugly" argument should be enough. But there are more: How about mouse pointer. Should it be stuck once you reach this notch or it should pass through? Both ways it is idiotic. How about full screenshot - it will be with notch now? How about those apps that would fail to hide it? I am pretty sure you'll stuck with this status bar spoiling the whole picture here and there. How about multiple screens and complications with moving windows from one screen to another with keeping their size or complications with screeenshots when multiple screens are connected? What if you have a lot of menu items in your app? Some of them will be after the notch?

And all of this for what? To solve something that was already solved a long time ago by the autohiding menu bar?

For my taste, adding this absolutely useless notch is a really idiotic decision. The very presence of this ugly notch spoils the feeling.

Even knowing that this device was made by people who find this idiotic notch aesthetically pleasing spoils everything, because you know it's not the first idiocy, and not the last, that you'll find when you dig deeper. At least this is my expectation now. I was correct about the ports, I was correct about the keyboard, I was correct about the idiocy of the touch bar and the lack of MagSafe, and I think I am not mistaken here either. They could've asked me; it would have saved them a lot of redundant effort in production. I use my laptop to its maximum and described my usage in detail somewhere in the comments. I really use a laptop as a laptop, and I use each and every feature of it.


So you would rather have a thick bezel containing the camera, as before?


It's not useless; remember that full-screen is a thing?


If somebody wrote a plugin that simply moved the menubar (and the maximum top of fullscreen apps) $height_of_notch pixels lower and drew nothing but black above it, would you be happy? Because that would give you parity with existing macbooks.


I would be incredibly happy with this but I'm not sure it is possible. I certainly hope so!
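(It does look possible in principle. A rough sketch, not any shipping feature: since the menu bar is drawn translucently over the wallpaper, painting the top strip of the wallpaper black should make that whole region render black and merge with the notch. This assumes macOS 12's NSScreen.safeAreaInsets plus the standard NSWorkspace wallpaper API; the output filename is just illustrative.)

    import AppKit

    // Sketch: compose a wallpaper whose top strip (the notch region) is
    // black, then set it, so the menu bar blends into the notch.
    guard let screen = NSScreen.main,
          let currentURL = NSWorkspace.shared.desktopImageURL(for: screen),
          let current = NSImage(contentsOf: currentURL) else { exit(1) }

    let size = screen.frame.size
    let strip = screen.safeAreaInsets.top        // notch height; 0 if no notch

    let composed = NSImage(size: size)
    composed.lockFocus()
    NSColor.black.setFill()
    NSRect(origin: .zero, size: size).fill()     // black background
    // Scale the existing wallpaper into the area below the black strip.
    current.draw(in: NSRect(x: 0, y: 0,
                            width: size.width,
                            height: size.height - strip))
    composed.unlockFocus()

    // Write the composed image out and apply it as the new wallpaper.
    if let tiff = composed.tiffRepresentation,
       let rep = NSBitmapImageRep(data: tiff),
       let png = rep.representation(using: .png, properties: [:]) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("notchless.png")
        try? png.write(to: url)
        try? NSWorkspace.shared.setDesktopImageURL(url, for: screen, options: [:])
    }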


I read that hiding the notch with a black bar is actually a built-in feature they just briefly brushed on in the keynote.


Didn't they just say that it looked "really good in dark mode"?


I rewatched it and have to confirm what you wrote, unfortunately. It was just a rumor, sorry.


I can almost guarantee there's going to be something like this built into the OS.


No problem there. "Full" screen can just start below the notch. (Don't know if they'll do that, but they should.)


Looking at the videos, that appears to be exactly what they did. That makes total sense for not breaking existing apps but now I'm curious whether developers can opt-in to use the space around the notch in full screen mode.



Not to mention, with the mini-LED display, letterboxing will be indistinguishable from the bezel.


If you're curious, this was just posted: full-screen mode renders below the notch (which is still more pixels than the previous models), and developers can opt in to render into the rectangles they call auxiliary areas.

https://developer.apple.com/design/human-interface-guideline...
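(A small sketch of what that looks like in code, assuming the macOS 12 APIs that shipped alongside these machines: NSScreen gained safeAreaInsets plus auxiliaryTopLeftArea/auxiliaryTopRightArea for the regions flanking the camera housing.)

    import AppKit

    // Sketch: detect a notched display and query the usable regions
    // on either side of the camera housing (macOS 12+).
    for screen in NSScreen.screens {
        let insets = screen.safeAreaInsets            // .top > 0 implies a notch
        if insets.top > 0 {
            print("Notch height: \(insets.top) pt")
            print("Left of notch:  \(String(describing: screen.auxiliaryTopLeftArea))")
            print("Right of notch: \(String(describing: screen.auxiliaryTopRightArea))")
        } else {
            print("No notch; full frame: \(screen.frame)")
        }
    }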


Ah, the classic "glass is half full" discussion


I think you misunderstood the parent comment. Full screen doesn't somehow add pixels into a bezel.


Now you get the same amount of window space without losing the menu bar.


Full screen doesn't extend to the sides of the camera on current laptops. All that's there is the bezel.


There is a generous bezel remaining below the screen. They could simply shift the display downward and have the same size display in the exact same footprint without the notch. It's a purely aesthetic choice.


I think it's really funny how they undid every single complaint this crowd had, and rather than focusing on that, people are complaining about the notch.


Well, it's kinda like if someone made a really nice meal for you but left a booger on the top of the plate.

You'd complain about that booger, right?


No, it is like if you ordered 15 donuts, got 2 free ones, and are mad because 17 donuts don't line up neatly when put in 3 lines


How unnecessarily dramatic. They fixed so many things, on the balance this seems like a good deal for those of us that use MBPs. Implying that you’d reject all those changes over the notch seems quite silly to me.

Personally I think everyone will forget the notch in a few weeks. We got over it for phones, which have much less screen real estate to spare. If the extra space enables a permanently visible status bar in OSX, even when full screen with apps, then I personally will be thrilled with this change.


It's a plain analogy, any drama would be created by the reader's reaction.

People also stop complaining simply because they've complained enough. It's definitely not always the case that they "get over it". People that need a Mac will certainly rationalize since they have no choice.

But given a choice, people would absolutely not buy a laptop with a turd like this notch front and center.

As usual though - Apple gives you no choice.


Nah I'm not gonna spend 3500 on this.


Assuming you’re willing to buy a MBP personally, skipping it over the notch seems inexplicable to me. In every other way this seems like the best MBP delivered in a literal decade, assuming they didn’t mess up the keyboard. Much faster processor, faster display, better keyboard, better peripherals, and finally removing the 16GB memory cap. They even maintained compatibility for people like me who prefer single cable docks.

Of course if price is the reason, then that makes more sense. Apple seems to be returning to a wider (and more reasonable imho) price and performance gap between the pro and the air. If I was in the market for a laptop, and I’m not because I think laptops are silly in general[0], I’d go for an air. The pro is very much in the “work will buy one for me” price, and it offers performance I don’t really need out of a personal machine.

0 - As I’ve said before, laptops generally compromise too much for my tastes. Like many others I work entirely from a desk, so my personal mac is and always will be a mini. But since work will inevitably give me one of these, I’m thrilled that they’ve made it so much better, even if I wouldn’t buy one myself.


I wanted to get one personally, 'cause I'm still on a 2014, but the design puts me off. It looks like one of those old-school 2008 models. The notch is just ugly.

I'd get one, but I need it to look premium if I'm dropping $4k, and right now it looks like it's from 2008.


It’s your money, but this mode of thinking is utterly alien to me.

The point of the MBP is that it's for professionals. It's a tool, not a fashion statement. The idea that it's not worth the money because it looks outdated is utterly baffling to me; it's worth the money because they've stuffed it full of the most performant components Apple has ever put into a laptop. I want it to be a function-over-form machine, and worrying about its looks as part of the buying decision genuinely never crossed my mind.

If you want a fashion statement laptop and don’t need the performance, then don’t get a MBP. Again, your money; I don’t get it, but you do you. But the issue here isn’t that the laptop is bad per se, the issue is that that laptop wasn’t made for your use case.


> It’s a tool, not a fashion statement.

Pffft, as if Apple didn't market every. single. one. of their products as a fashion statement.

Apple clearly has always chosen form over function and being different over function. That's the only reason why the ridiculously useless Dock exists at all - marketing loved it since it embodies both of those traits.


> Pffft, as if Apple didn't market every. single. one. of their products as a fashion statement.

Nonsense. The Mac Pro is clearly marketed as a professional tool for professional use, always has been. Ditto with their high-end displays. I don't even think they had these objects in the last Apple Store I went into, actually.

Furthermore, every single ad I’ve ever seen for the MBP has been about artists, creatives, and programmers using it to make things. They have always presented it as the professional’s tool for creating stuff, even if at times that’s been a bit of a farce.

Now compare how they presented the MacBook, a laptop that they sold in gold color. That is a laptop they presented as a fashion choice, and interestingly it’s also the laptop they abandoned first.

> Apple clearly has always chosen form over function and being different over function

It's only "always" been this way if you're relatively young. In fact, Apple's turn towards form over function sometime after Jobs's death caused quite a bit of angst here, both in terms of their hardware and software design. I'm old enough to remember when the MBP was unquestionably the best laptop a developer could buy, and I was very sad to watch it slowly lose ground to other laptops as Apple pursued thinness over user experience, hardware specs, upgradability, and durability.

If anything, what I'm seeing today seems like a return to the Apple of earlier in my career, when they made professional-grade laptops that led the pack. There are still some changes I'd love to see, such as more upgradability, but an Apple that's willing to make its devices thicker if it improves user experience is very much a "function over form" move.


I have a suspicion that the most vocal notch haters are not even users of Apple products. Notch on an iPhone? It's in the status bar anyway, so it mostly does not matter. Notch on a Mac? Same. Right now I have just empty space in the center of the menu bar. I could not care less if the notch takes a bit of that space.


We don't owe Apple anything. Apple is a company. It's supposed to satisfy customers, not the other way around.


Huh? Who said you owe anything to apple? What a weird suggestion.


Don't forget that the older power bricks can be used on this thanks to the MagSafe-to-USB-C charging wire.


Is the cable necessary? Can they no longer charge through the USB ports?


Looks like it'll charge via MagSafe or USB-C, but MagSafe will go faster on the larger models since it can exceed 100W.


I'm afraid other manufacturers are going to copy it.


I mean, I have a 2019 MBP, and right now the space next to the camera module is black bezel. What possible harm could be done by adding pixels to those areas?


But of what use are they? It saves you like a dozen or so pixels when apps aren't in full screen, and they don't do anything in full screen mode. Wouldn't hiding the menu bar save the same amount of pixels without the notch?

I personally find notches to be very distracting visually, and I just don't like them from a design standpoint.

I think Apple even recognizes this. Looking at their marketing material, you have to scroll pretty far down to ever be shown the notch, as it seems they're intentionally hiding it by showing apps in full screen mode.


The notch issue can probably be fixed just by blacking out that strip, so the area around the notch is just black space. I hate the notch too, so I'll probably just do that.


The notch is awful. But the prices are worse. I really wanted to buy something today but those two factors are very off-putting.


I guess we'll see. It obviously fits OS/X most cleanly since the notch just effectively becomes part of the menu bar.


It's hard because of Windows.


Would you prefer a larger bezel?


I would prefer something like the Dell XPS laptops, very minimal bezel with a camera and screen that still looks edge to edge. I'm talking about the new XPS ones, not the old ones with the camera at the bottom of the screen.


The XPS has a 720p camera and people complain on forums it has terrible picture quality. It also has thicker bezels than these new macbooks.

I think apple made the right trade-off. Thin bezels and a good camera at the top of the display. Sadly you cannot have both in a laptop without a notched design.


That approach leads to a webcam that's garbage. Apple chose the notch approach to allow thin bezels on most of the screen while still having a 1080p webcam with good low light performance.


For folks following this: the 2020 Dell XPS cameras have absolutely TERRIBLE quality.

https://www.dell.com/community/XPS/XPS13-9300-2020-version-w...

The Apple camera solution here, while it doesn't make you happy, may make others happy.


Hardware performance and design don't seem to be the same here though - XPS could have gone with a better camera in their smaller bezel and GP's point still stands.


It's unlikely the reason for that on the Dell is the design of the bezel; most likely they just cheaped out or got a discount on several-year-old parts.


Sure, but when people say, e.g., "Dell does this better" - every time I go look at the Acer or Dell product, they haven't actually focused on what users want.

My guess: looking good on a Zoom call for your work or your sorority or your dating videos is going to matter a TON more to folks than whether something is integrated into a fat bezel or sits in a notch.

This debate has already occurred, BTW, with the phones, and despite various claims that the notch would destroy iPhone sales, it did not.

That said, I build my own PCs (talk about driver issues long term) and do purchasing of Lenovo and Dell for business (and a Dell "server" costs a mint even though what's inside is also not THAT amazing), and I don't use a Mac personally, but my family does, so I'm not totally blind to the value offering they have.


Yep, and the notch was a lot smaller than I expected from the discussion when looking at the apple site.


Then buy a Dell XPS laptop? Why do macbooks need to be perfect for everyone?


Is a company not beyond reproach? Can't I like the product but still criticize its flaws?


But it’s not a flaw. There’s nothing objective here. The notch is preferable for many, and annoying to some.


What kind of comment is this? Your parent expressed a valid criticism followed by a reasonable solution.

It's not like Apple can't do the same as Dell.


I have to think Dell and others could have put in a notch if they wanted to make that compromise. They clearly felt it better to do it right (and find a way to nestle the camera stuff into the bezel) than to go full notch.


I would prefer the menubar to not be messed up.


I would be surprised if the new OS didn't handle that


> Would you prefer a larger bezel?

I would, if only along the bottom!

tl;dr: a thicker bottom bezel allows the necessary space for a better keyboard layout without shrinking the touchpad, and makes the laptop more usable when your eyes are at heights closer to that of the screen rather than looking down on it from above

Because I have poor eyesight but I like to keep a lot of text on my screen, I often work partially or fully reclined with my laptop lying on my chest or my stomach. When I do this, the height of my fingers as they rest on the keyboard tends to obscure the bottom part of the screen on 'modern' laptops with super-thin bezels all around, so I have to reduce the height of my full-screen terminal.

On older laptops, where the bottom bezel may be a full inch or more tall, I don't need this. Additionally, I prefer a full, standard, IBM-style keyboard layout: a dedicated row of F keys, spaced out in the standard way, and with full-sized arrow keys. One problem with such keyboards is that they compromise the size of the trackpad, because of the space they take up on the bottom of the laptop. On laptops without trackpoints, or with designs centered on large, excellent trackpads like MacBooks, this cannot work well.

So for me, an ideal hardware setup for input on a laptop might well be something like a MBP, but with a 3:2 display and a bottom bezel 1-2 inches tall, which would allow for a full-size keyboard alongside a spacious, Mac-like trackpad.


Apple fits a camera, a lidar, and god knows what other sensors on the front of a phone in a notch that's three times thinner than the one on its own laptops.

So yeah. They could have a thin bezel. They deliberately chose not to, even if it eats into the precious little space of the top menu bar.


It looks like the iPhone is about 2-3 times thicker than a MacBook lid. I'm not sure they actually have the space to put iPhone equivalent hardware above the MacBook screen.


The module itself is very thin: https://www.ifixit.com/Teardown/iPhone+12+and+12+Pro+Teardow...

I believe they could've quite easily reduced the bezel width.


For content consumption yes, creation maybe not.


What content are you consuming? Because if you remove the screen area next to the notch, it's 16:10, and if you're watching videos, you're probably doing it at 16:9, and games at 16:9 or 16:10.


Yes. Yes I would.

I would rather take a bezel and reduce the amount of e-waste this custom garbage inflicts on the world. Add to that, this will most likely make screen replacements significantly harder and more expensive.


I think the e-waste angle is mostly unrelated.

Can you explain why you like a larger bezel? It seems like a hands-down negative to me.


For me, notches are just ugly. The main purchasing decision behind my current phone was its lack of a notch. I hate the feeling of something interrupting my screen.


> I think the e-waste angle is mostly unrelated.

Is it though? I was able to replace a broken Macbook screen a couple of generations ago. It was hard and took a while, but I managed. I'm very sure I would not be able to do this given this kind of screen.

I guess we'll see when the first tear-down comes out. I wouldn't be surprised to find out that the only way to fix the screen is to replace it together with the webcam.


There will be 3rd-party screens that fit this, just as there were 3rd party screens that fit every other macbook.


I'm pretty sure there haven't been any "3rd-party screens" for MacBooks since Retina displays came along. It's all sold as display + case, with the LCD bonded to the glass.


Probably true. I haven’t looked into this since I did a screen replacement on a MacBook in like 2013


macOS has a grey menu bar where the notch is, and that area stays unused 100% of the time.


The XCode menus on my 2015 MBP 15" go well into the 'notch zone'.


Yeah I wonder how this will be resolved. The screenshots of Photoshop on the Apple store page show a menu that runs quite close to the notch. Will menu items be truncated or pop up randomly on the other side? How will it work for less terse languages like German?


I routinely don't have enough space there. Any professional app (and these laptops are squarely aimed at professionals) will eat into this "unused gray space".

Add a few apps that put icons in the menu bar, and suddenly you already have no space even without the notch.


Switching to fullscreen keeps everything below the notch, judging from the screenshots. I'm betting they'll add an accessibility option to keep the UI below the notch as well.


I don't have any issues with the notch.


Did they ever remove the headphone jack?


No, but a lot of people seem to think they removed it. They didn't.


They did remove the input jack with G3 (2012-2015), and S/PDIF (digital line out) with G4 (touchbar) though.


There’s optical digital line out up until the 2015 Macbook Pro, sending light out of the headphone jack. You need to add a little plastic adapter compared to standard TOSLINK cables, which I presume were getting too big.


It's interesting how silently they dropped the optical line out; I never saw it mentioned. I needed it for something on my iMac Pro the other week and was boggled that it didn't support it. The argument I read was that the DAC is so high quality now, digital out is simply not needed.


I’m still slightly bemused as to why they dropped optical out from the Apple TV too. It’s not like it’s supposed to be a media hub at the centre of your house or they run a music streaming service or anything!


I assume because modern HDMI supersedes it. Sucks if you have an old receiver/amp tho.


I specifically got a device to split audio out of the HDMI stream (technically it just copies it out) to RCA so I can pipe it back into the stereo in zone 2. Annoyingly complex but it works.


Optical out was a real blessing back when electrical connections (iMacs to amps) were noticeably noisy.

To go from optical back to electrical just seemed wrong from a tech evolution standpoint. Not hearing a lot of call to rip out multimode fiber in backbone networks and replace it with a shinier coax. But HDMI seems to be meeting the need, and there is always the aftermarket for fully annealed 99.999% oxygen-free HDMI cables to consider.


All the home audio stuff supports audio over hdmi now. One less fragile cable to worry about, it’s great.

For legacy equipment you can get a box that takes hdmi in and spits out spdif.


SPDIF has less bandwidth than HDMI and doesn't support formats like lossless surround or Atmos.


> There’s optical digital line out up until the 2015 Macbook Pro, sending light out of the headphone jack.

Yes, that's what I wrote was dropped in G4.


No but they did move it back to the left side, which matches the location of the wire in most wired headphones. No more need to wrap the wire around the back of the machine...


It's still there, now with better support for high impedance headphones which is great for music producers.


Music producers are using an external interface regardless, and if they're on the go and just using the headphone jack, it doesn't much matter, because they know they aren't getting a true monitor-grade signal to judge the mix with.

This is just great for headphone nerds and audiophiles.


Fair-ish points. For on the go, they can still take their dedicated over-ear monitors though, thus reducing the compromises. Sure it's not the same but better than the alternative.


Wouldn't music producers use a dedicated DAC though?


It depends on whether they want to test how things sound through a Mac and happen to have high impedance headphones lying around. Seriously though, the DAC in the Mac is very good; you're talking $300+ to get something better, and for my money it sounds nearly as good as a Topping D90, for example.


The $9 iPhone headphone dongle is a better DAC than most audiophile products, and has a bigger R&D budget than the entire industry. Headphone amps are real for electrical reasons, headphone DACs are a scam by the gold plated cable people.

(The Google equivalent dongle is also good.)


I don't think the same. I remember when they removed the jack I was using an iPhone with a pair of wired CX 880i's. On the old iPhone they sounded great (of course not a true audiophile setup, just a comparison). After I got my iPhone 7, which removed the jack and came with the dongle instead, they sounded absolutely terrible through that DAC. The change might not be noticeable for the average EarPods user, but those Sennheisers, which were great earbuds for their time, literally sounded worse through that DAC than the not-so-good Bluetooth headphones of the day. With all the world moving to Bluetooth/wireless (speaking of casual use, not pro setups), I doubt this will ever change.


https://www.audiosciencereview.com/forum/index.php?threads/q...

You certainly shouldn't be able to hear DAC differences through earbuds, but they shouldn't have issues being driven either unless they're really exotic. If I had to guess, I'd say the plug wasn't seated properly, or there was an issue with the in-line microphone.


In a studio: Sure. But on the go it could be a nice asset.


I don't think Apple has, but people might have feared they would. Now, by adding those additional ports, it's clear what route they're going, and the jack is safe. I guess that's what he meant...


No; it's still there.


> bumped up RAM limit

But the RAM price is now more than triple the consumer price: top-tier 32GB RAM costs 150€, while the Apple 32GB upgrade costs 460€.


A 3x markup has been the standard Apple tax for many years (also for storage).


> Plus bumped up RAM limit, M1, new displays, 120Hz... Wow.

They couldn't use LPDDR5 RAM with Intel chips. They wanted to bump up the RAM years ago, but couldn't do it until now.


2019 16" maxed out at 64GB, which is the same as the M1 Max.


They specifically mentioned the speed. DDR4 is used in 99% of PCs right now. Apple can use faster memory on the die directly.


Ah true, there might be a bit of crossed wires here. It seems like @mizzack was using "bumped up" to mean the amount of RAM, while @Alex3917 used the same phrase but this time to mean LPDDR5/speed, but obviously I interpreted the latter as amount (because that was the context).


I was also talking about the amount of RAM. The reason they didn't allow people to increase the RAM at all for a couple of years was because of the power consumption. After people freaked out they eventually added an option, but kind of discouraged it. I had forgotten that they even added that option, though.


> increased key travel

I hope that is true. But all they state is:

> For the first time, Magic Keyboard brings a full‑height function key row to MacBook Pro — with the tactile feel of mechanical keys that pros love.

Nothing really concrete about key travel.


Some of this had been previously fixed but went unnoticed by some: the keyboard was fixed on the last Intel MBP models, the headphone jack was never gone to begin with, and the RAM cap had already been raised.


It seems the "one last thing" this time is, well, a notch at the top of the screen, on a proprietary system that makes heavy use of the top of the screen as part of the UI.

The whole value proposition of the Apple ecosystem is that it is all beautifully designed, integrated and pleasing, such that it inspires you to go forth and design beautiful things.

But now they've gone and stuck a webcam in the middle of the menu bar. Jobs is probably rolling over in his grave.


Actually, I just watched their video. It doesn't look too terrible.

I suppose the only way to know if it's annoying is to try it.


The menubar is 60% taller; that's a lot for something that's supposed to be subtle and out of your way.


The headphone jack never went away. Key travel was also fixed a few years ago. What do you mean by power jack? The USB-C ports function as power jacks. MagSafe is incredible and I'm so glad it's back.


I just scrolled through the posts and felt they were laggy. I think I am already biased toward 120Hz even without having experienced it myself.


>Wow looks like Apple has abandoned every bad decision on the MBP for the past 5 years in one swoop. No touchbar, increased key travel, added back hdmi/sd/headphone/power jacks.

It's taken two years to fully undo Jonathan Ive's obsessions.


> headphone

Did they remove it? I checked my wife's MacBook Air (the one with 2 USB-C ports) and it has a headphone jack. Did they remove it only on the MacBooks Pro?

> increased key travel

Is that good? I prefer smaller key travel; it makes typing faster.


No they never removed the headphone jack.


Actually, they kept one of the oldest mistakes: overcharging. $400 to upgrade from 16GB RAM to 32GB. $200 to upgrade to 1TB SSD. These guys control the whole supply chain without middlemen like WD, Samsung, Crucial, etc — and they charge basically double what anyone else charges.

I'm not paying $2k for a laptop below what I need as a developer, when <$2k from literally anyone else will get what I need.

Apple is ridiculous. Inflation is murder right now, and they JACK the prices without delivering more value.


The Lenovo X1 Extreme charges $340 to go from 16GB to 32GB, then $680 from 32 to 64GB (regular price; there's a clearance sale now), and $300 to upgrade the SSD from 512GB to 1TB. It's $2400 for an X1E (at sale price; regular is $4.3k!!), roughly on par with the $2k 14" Pro.

Surface Laptop Studio with similar specs at $2100. Costs +$600 to go to 32GB and 1TB (which == $400 + $200 upgrade you mentioned).

Dell XPS 15" is $2400.

If you were even dimly aware of current prices for high-end laptops you'd know the Mac is priced in the same expensive range. If you prefer $1K plastic crap that's fine. Some people like Velveeta and Canned Spam too.


What's interesting is Dell is shipping machines at top end with chips from generally older / slower nodes. Somehow Apple has some kind of lock on a lot of capacity at TSMC at the top end of the node range.

That XPS 15" is probably running a Comet Lake chip.

https://en.wikichip.org/wiki/intel/microarchitectures/comet_...

I.e., 14nm, launched in 2019.

That node size (less the ++'s) goes back to Skylake (which was amazing at the time, actually), and that's from 2015.


Don't buy them?

Acer sells MUCH cheaper machines all day:

$1.6K for a Predator here:

https://www.walmart.com/ip/2021-Flagship-Acer-Predator-Trito...

Apple has been told that their products (Watch, AirPods, iPhones, etc.) are too costly for decades now. And they still make money selling them.

What I don't get: Apple's stuff seems to last forever. My Dells just crap out (battery, drivers, updates, etc.). My wife is on a Mac, and she just goes and goes. I'd love to see the support histories of old Acers and Dells vs Apple.


I don't understand comments like this. They're not buying it and they're explaining why.


No, they are saying Apple is making a mistake and that "Apple is ridiculous."

What they don't value is that Apple, in the midst of a chip shortage, is going to be delivering an absolutely bleeding-edge-node solution within ONE WEEK of announcement.

I get it, your Dell Intel whatever is better and/or cheaper. I used to be on Dell and now am on Lenovo. But there is a reason Apple charges what they do: a totally integrated solution that just works, battery life for miles, not too hot on the lap, etc., and available in seemingly insane quantities.

Seriously, Ford parks its cash-cow F-150s while Apple is turning out millions and millions of phones (100 million phones alone per year)?


Eh, I'm still using a ~4-year-old iMac, one of those that are painfully difficult to upgrade the RAM in. (And I keep repeatedly hitting the limit right now.) And I'm trying to keep it in sync with an even older MBP.

And you're right: I don't appreciate that Apple can deliver a week later. That's not valuable when my machines still work. Not paying a premium for that.

I've been chafing at the upgrade cost of Apple laptops for more than a decade of using their products. The reason customers keep paying is the reputation, and their short memories. The keyboard debacle, the screen issues, early MacBook Airs burning people's laps… and those are just the laptop issues.

I'm upset because this is just another instance where Apple's profit margin goes up substantially. Their markup is basically 4x the price of aftermarket RAM/SSDs.


You're being hilarious.

No one is using Dell or Acer here. We're just holding on to our old MBPs.

I wouldn't care about the notch on a $2k laptop, but I do care on a $3500 laptop. On top of that, I'll absolutely need more storage and RAM, so it's a $4k laptop. But the notch. So ugly. Welp, guess no 2021 MBP for me.


90 million iPhones in Q1 this year, 55M in Q2. Closer to 200M/year.


I own MBP 2012.

It has:

* dead SSD, replaced on warranty

* dead battery

* dead keyboard

* dead audio

* some hardware issues, I think related to GPU (it panics sometimes)

* replaced charger, second charger is in bad condition, probably time to think about replacing it as well

It's the most terrible laptop I've ever owned. I've owned plenty of laptops and never had any issue with any of them. The MacBook really set the bar.

That said, it kind of works... I use a USB keyboard and a USB headset, keep it always plugged in, and don't push it too far, as it starts to throttle very fast. But it runs some ancient macOS version and constantly nags me about updating.

I'll buy the new MacBook. Because of macOS. Reliability-wise I don't have any expectations of Apple. I've bought plenty of Apple devices and had plenty of issues with many of them. They definitely don't last forever. Apple software - that's what hooked me.


As far as anecdotes go, I'm 100% opposite I guess.

I have a 2013 MBP, but have removed OS X and use it solely as a Windows machine. Nothing ever broke on it, and I chose it solely for build quality and reliability.


I'm running (admittedly reluctantly) fine on my 2014 MBP. It works for everything I need it to. It can be slow for some tasks, but I still get all the updates.

I’m interested in the disparity between Apple and others with enterprise support programs. Virtually any other company will send you a loaner or replacement laptop under warranty and repair yours. Apple seems to just say “That sucks” and might fix your laptop.


> What I don't get - Apple's stuff seems to last forever.

Especially the keyboards, and the speakers in recent years.


I've been a Mac user since the 90s and a long time apologist for the Apple premium... I understand and have generally been totally fine paying for design, fit and finish, longevity, and attention to detail that's hard to come by elsewhere. That said, the absurd markups on storage and memory have always been REALLY aggravating to me, and were a factor in my decision to build a desktop machine instead of getting another Mac last year. It's not even a question of affordability, it just rubs me the wrong way. $400 for +16GB of RAM in 2021?? Nonsense.


Did you hackintosh? Because I sure as hell am not running Windows and Linux is great for VMs/containers/servers but doesn't run all my local software.

If you don't mind Windows, lots and lots of cheaper options. But you already knew that.


I installed Ubuntu and Windows, and set up a Mac VM in case I needed it, but haven't really used Windows or the Mac VM. I do use a lot more VMs than I used to; having lots of cores, RAM, and fast NVMe drives has opened my eyes to a different way of using the computer.


What do you normally pay for 16GB of LPDDR5 RAM?


Supply chain issues complicate things a bit, but if I'm searching correctly the list price for Micron modules is ~$50 or less per 16GB.

Ultimately, though, my beef with their pricing is not really about this specific release which obviously uses cutting-edge components, but about Apple's general approach to pricing these upgrades. They'll happily sell you old tech with the same crazy markups. You can still buy a new 27" Intel iMac with 8GB of RAM and the option to upgrade to 16 for $200. That's just a normal 8GB DDR4 SODIMM, which you could instead order for $29, and it installs in about 15 seconds.

I get that businesses need margins to survive, but Apple's stinginess and customer unfriendly price structure with storage and memory has been going on for a long time and just really bothers me.

I like their products! Just let me spec it such that it deserves the "Pro" label without feeling like a chump.


> I'm not paying $2k for a laptop below what I need as a developer, when <$2k from literally anyone else will get what I need.

That's perfectly fine, you don't have to use a MacBook if the value isn't there for you.


Are they also not allowed to comment about it?


Just seems a bit pointless, that's all.

I'm never going to buy a Bugatti Chiron, but you won't find me going to their forums and bleating on about how the gas mileage isn't as good as my Ford Focus...


I have used Apple products because, if you weigh how long they last against the original cost, they cost less over time.

Or at least they did. This loss in value is what I’m complaining about.

And this isn’t Apple’s forums, now is it?


Here's the deal. I don't care if I can get something workable for $1000 less. When I'm making many times that with the laptop as my primary interface for work, I'd rather spend more just to make me feel that tiny bit better each day.

The touchpad alone is probably worth it.


Before all the touchbar and butterfly keyboard nonsense, MacBooks were the best laptops on the market, hands down. It looks like they may have reclaimed that spot with this update. I'd argue worth the premium if that is the case.


They charge what the customer would pay. As long as people keep buying, the prices will keep increasing.


I'll admit up front I haven't done the math on this MBP, but are the prices really so bad on a laptop$$/year basis?

If you look at a feature comparison, sure, Macs have always looked like a bad deal, but that evaporates when you bring time into the calculation. The surveys I have seen all seem to place Macs at about half the failure rate of other devices, and they seem to last twice as long. Everyone seems to have a shocking story from Lenovo/Dell et al. about shoddy hardware and incompetent support for anyone without a $10M+ PO. If you pay more up front, you can make it back by using the same device for longer.


This is literally what they have always done and they don’t see a reason to stop


Get used to it. Apple will likely bust their competition at some point.


touchbar was actually nice


This ^^^^. Absolutely!


> Wow looks like Apple has abandoned every bad decision on the MBP for the past 5 years in one swoop.

Not unless they disabled all of the spying in MacOS. Doesn't matter if you bought it, it still isn't yours. Hard pass.

https://news.ycombinator.com/item?id=25078034&p=2


Priced out the one I would buy and it is $4300. Seems the value isn't quite there. Hard to say though without playing with it for a while. So clearly I need to go back to work for some tech company that will buy one for me :-).

It is more interesting to me to reflect on the comments from about 5 years ago that "ARM will never be in laptops, it's a 'phone' processor."

That statement was both true and ignored the reality that if you wanted to put an ARM processor in a laptop, you could add features and design it to work that way. Chip design is expensive of course, and so being a company like Apple really makes it possible to do this sort of thing as a 'risk' venture, but what surprises me is that Apple spent maybe $10B over the years developing an in-house ARM design capability (after buying PA Semi), and here it is paying them some huge dividends.

If you compare that to Microsoft's efforts trying to use off-the-shelf ARM chips in the original Surface RT and now the Surface Pro X, you can really see the advantage of having the chip designers and the software designers depending on each other. That was true in the "WinTel" era when Intel and Microsoft were joined at the hip, but it was never the case for ARM CPU vendors, who were more concerned about being in the next flagship phone than whatever it was that Microsoft was doing.

What an interesting alternate history if Microsoft decided to develop an in house production chip capability at the same time they decided to get into the hardware business for "real." [1]

[1] Yes, I'm aware they have done custom chips, the pixel/pen processor in the Surface is one such but they haven't really jumped in with both feet like Apple did, and certainly haven't had it front and center as long as Apple has.


You'll get a solid 4-5 years use out of such a device. I did out of my 2016 MacBook Pro.

Even if you depreciate it at $1k/year, that's not too bad for a professional tool. It's not worth $0 after 4 years either.


Heck I was still using my 2012 Macbook Air until pretty recently, a decent machine if you don't mind that it gets really hot under load.

Barring any major fiascos (like past keyboard issues), the 2021 MBP does look built to last. We're in a good spot now in terms of stuff like a great screen (if you don't need constant 120Hz), low-power video decoding, a solid 8-core CPU, and probably even a passable gaming GPU. Should remain very usable even as it's outclassed in subsequent generations.


If you get the M1 Max with 32 core GPU, I bet that's actually well above "passable" for most gaming needs.


This is the model I’m getting.

Now if only I could get Deus Ex: Human Revolution to work on it.


How many games run well on ARM?


The Steam Deck is about to find out!


Steam Deck is running an x86 AMD processor, not ARM.


The Steam Deck does not use an ARM CPU.


My MacBook Pro from 2012 (the non-Retina one with an optical drive, which I replaced with a secondary SSD) lasted me over 8 years. It finally overheated during summer (my house was very hot and the fans were probably clogged). I do iOS and Android app development, so that $1000 device made me tons of money in return.


Yep. Still running a 2013 MBP here


Same here; for the first time in 8 years I'm considering upgrading.


Same with my MBA 2012. Works great. Will get an M1 something or other and pass the MBA on to my daughter. If I get approx 10 years out of the M1?, I’m happy.


> Seems the value isn't quite there

Looks like you are not in the target audience for those machines. The value is absolutely there if you need the CPU to do as much work as possible. As someone who spends a big chunk of my day compiling, a 2x speedup means I need to wait half as long. And it won't even toast my lap. The 1K-nit screen means I can finally work outside in summer. This is _exactly_ the machine the disgruntled ecosystem has been calling for, for at least 5 years now.

While the 5nm process will certainly yield good AMD CPUs, the iOS/macOS optimizations built into Apple Silicon are hard to match for general-purpose CPUs. And the GPU story looks bad for Nvidia: the Max's 32-core GPU uses half the power envelope for almost the same output as the mobile RTX 3080. And afaik the 3080 is already produced on 7nm, so there will be some engineering to do to get there.


People are comparing Apples and Oranges. (Ha)

Look at a Mobile Workstation class laptop and those are easily 5k+. Those are the ones to compare with the high end MBPs. Not your USD 1500 Windows laptops.


I have a maxed-out laptop with a discrete RTX card, 64GB RAM, dual SSDs, a 4K OLED and what not. It was priced at around $4500. The battery life is about 2 hours and the fan runs so fast that it almost sounds like a siren. I had to do hours' worth of tweaking to force its CPU and other components to run at very low speeds to make it usable untethered. It needs so much power when charging that airplane seat chargers refuse to charge it.


How did you even open the lid more than 60 degrees?


And mobile workstations will keep you tethered to the wall and you can only tote them in an XL backpack.


FYI the 1k nits is only for HDR content. I believe peak brightness in the sun is capped at 500 nits for SDR content.


> Seems the value isn't quite there.

The MBPs are astonishingly good machines, but I agree the price always forces tough questions about value and fit.

I suspect the lack of a Mini update was in part to bolster purchases of these more expensive laptops. So many people are traveling and commuting less now that cheaper Minis would have undermined sales of the MBP.

I'm running a Pro Display XDR off a 2018 Intel Mini with an eGPU, and am going to hold out for a presumed 2022 M2 Mini.


I don't think so. The iMac, sure. The Mac Pro, maybe. These machines don't compete with the Mini.

I like the Mini in theory but I'd never shell out to buy one. I almost never work just from a desk. With WFH, the average person is more likely to lie in bed or on the couch, or work from the kitchen counter or dining table, than to work from a proper desktop.

(Not even to mention the fact that you'd need to have/buy a good display and keyboard setup to go with it.)


My experience matches this. I have converted a guest bedroom upstairs into my office but I can't sit there all day and work. I go down to the living room after lunch and also work from the back porch whenever I can. I have a desktop PC but I barely touch it.


> Priced out the one I would buy and it is $4300. Seems the value isn't quite there.

I was also surprised at how high the maxed out laptops can go in price. From memory, around the 2015 model line the top end would be at around $3000 without the crazy disk options.

There was a rule of thumb that Apple priced their products at roughly $2/day of use. That would give 4 years for my 2014 laptop, and it mostly did. That would be 5.5 years for the top-end 16" laptop; I guess that's reasonable?
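
Taking the $2/day figure at face value (it's a forum rule of thumb, not anything Apple publishes), the implied lifetime is just price divided by $2 x 365:

    # The $2/day rate is this thread's rule of thumb, not an Apple number.
    def implied_years(price_usd: float, dollars_per_day: float = 2.0) -> float:
        return price_usd / (dollars_per_day * 365)

    print(implied_years(2900))  # ~4.0 years, close to the ~$3000 2014/2015 top end
    print(implied_years(4000))  # ~5.5 years for a top-end 16" configuration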


FWIW, while I’m disappointed by the pricing I bought a 2019 16":

- Mid CPU

- Lowest available discrete GPU

- Max RAM

- 2 TB SSD

RAM was my absolute must have, and to get the same on today's MBP:

- I would have to at least max the CPU

- That’s substantially better GPU than I’d buy otherwise

For the same RAM and storage, the price is almost identical (in fact it might be identical but I don’t feel like digging up my old order to compare such a small difference).

I am disappointed they didn’t move the price down more. And I’m disappointed I have to buy more compute than I care about to get the other specs I want. But it’s clear at least to me this is more value for my money than I paid 2 years ago.


It’s worse if you bought a recent Intel-based MacBook Air. I have a 2018, and its performance and battery are bad compared to the M1.


Consolation would be that you still have full backward compatibility with all the Intel ecosystem.

I like the M1, but I could have chosen the 16" Intel at the time I bought it, even if only to switch to the M1 Pro/Max next year, for instance.


So far I have not seen a compelling need for full backward compatibility with Intel.

Have you come across many instances where you wish you had an Intel CPU?


I mean I paid probably 3-4x what you did for a similarly outclassed computer so IDK if it’s worse.


Ah I thought I read you had a 2016.

I realize I haven’t looked at the exact price / performance jumps between the releases across MBA and MBP for Intel to Apple Silicon.

I’m curious just how big of a change there was across common metrics.


Money printer go brr.


Apple has always been a hardware company though. Software has just been for the purpose of selling their hardware.

MSFT is the opposite. So it makes sense it would be much slower/harder for them to get going in this area.


I completely agree with that, Microsoft's foundation was software. What is intriguing is that when they decided to "go hardware" (which started with Mice and Keyboards and later game consoles, tablets, phones, and laptops) their execution strategy didn't evolve to making their own silicon until much much later. Apple was much more aggressive here with the PowerPC alliance after Motorola let them down followed by later moves into chip design.

I am always fascinated by seeing how the "core competency" of a company constrains its execution options later in its existence. Sun, for example, really was a 'sales' company first and it couldn't "sell its way" out of the wave of Linux clusters that started displacing its servers. Now as a hardware company it too was let down by Motorola and that was some of the genesis for SPARC much like Apple's move. But as a sales company Sun kept shooting anyone who used their "open" architecture CPU by ruthlessly out selling them.


> Apple has always been a hardware company though. Software has just been for the purpose of selling their hardware.

I'm not sure that's the right perspective. Steve Jobs said in an interview [0] that "Apple views itself as a software company."

[0] https://www.youtube.com/watch?v=dEeyaAUCyZs


They definitely don't behave like a hardware company. And arguably much (most?) of their value comes from their software.

But at their core selling hardware is where the money comes from.


I wish the non-Apple ecosystem offered a clear alternative that allowed us to choose specs like this :-(

I look at Lenovo ThinkPads with Ryzen 5000 CPUs; they are stuck with crappy displays.

Dell has better displays but not the Ryzen CPUs I wanted.

Framework is modular but no Ryzen.

Nothing makes me want to open my wallet.

I’d love almost exactly a MacBook Pro spec with the M1 Max replaced by a Ryzen 5000 series CPU and 64 GiB of memory, and an excellent display. Maybe that unicorn, if it ever ships, will cost as much as your configured MacBook Pro :-(


Eventually the Ryzen Framework will come and then the whole discussion is over for me. I won't buy a >$2000 device that isn't upgradeable, repairable and has a poor Linux story.


Same with me; when Framework gets big enough to provide the top-dog choices, I'd happily pay a premium for the modularity and run Linux. Hope that day is not too far off.


I would buy that in a femtosecond.


So I was looking for an alternative and bought a Lenovo Yoga Slim 7: Ryzen 5800H with a 90Hz 1440p screen.

It runs loud af, the fan is constantly pulsing, but it's an alright machine for ~€1200.

It's now going on eBay to fund the MBP purchase; I want better thermals out of my laptop.


The first-gen M1 Mac was the first Apple product I ever bought. I can't say I like the Apple ecosystem or macOS; I'm the opposite of an Apple fanboy, and the price they made me pay is another reason for that. But simply put, anybody who wants one and can afford it will buy it. There aren't any alternatives.


As a review, I like this. An expanded list of likes/dislikes from someone with universal dissatisfaction is always good to read.


> Seems the value isn't quite there

I'm curious: what exactly is this machine missing in your opinion, and given the choice, what would you go with other than the recently introduced MacBook Pro?


I can't speak for the other commenter, but for me the M1 air is so good I wouldn't feel the need to pay several times more for a "pro" machine.


Yeah. The M1 Air is also fanless and thinner and lighter. The prospect of fan noise in particular is a turnoff.


I expect to be surrounded by pitchforks and torches for saying this, but I genuinely believe that Linux is the true winner of today's presentation.

The last few months have been watershed times for both Apple and Microsoft. Starting with Apple, their developer relations are starting to crumble in a major way. Not only have developers stood up against their highway robbery, but other companies like Microsoft have one-upped them just for the hell of it. Today's presentation was almost completely devoid of any software discussion, which is really what puts the nail in the coffin for me. A 5nm laptop chip will always be more powerful than the 14nm one I've got, but at least my chip runs the software I want. Without a commitment to replacing 32-bit libraries, updating their coreutils or making a package manager developers actually want to use, I think Apple's message is clear: We're pivoting away from the 'Pro' market and heading straight into 'Prosumer'. To their credit, it seems like they've made a fine prosumer laptop.

Microsoft, on the other hand, has alienated their enthusiast community with Windows 11. The strict CPU requirement in particular was a bad move on their part, and it's going to leave a lot of their install base out in the cold. While I don't expect a whole lot of consumers to pivot to Linux, I can imagine a lot of developers making the switch. Many devs already use WSL, so they're pretty familiar with Unix systems as-is. The Windows-to-Linux onboarding experience is getting better and better as Wine continues to close the gap in compatibility. Especially for the gamer/enthusiast market Microsoft traditionally caters to, the value proposition of using Windows is diminishing by the day. Windows 11 does nothing to change that, and actively impedes your workflow if you're on a Ryzen CPU.

I think for a lot of people's workflow, Linux will be the only thing left that "just works" in a few years time.


This is a pretty bad take.

I don’t see anything particularly wrong with Apple’s developer relations. You certainly might hope they would give us more than they do, but you always hope for that. Bottom line is, Apple gives us what we really need — a market.

Today’s presentation was almost devoid of discussions of software… because it was about hardware. It’s not like WWDC doesn’t exist. You can hear about Apple’s software as much as you would care to.

“Apple doesn’t care about pros anymore” is a perpetual refrain, but it seems to go through phases of being more true and less true (there are many kinds of pros, so it’s never going to be all true or all false). We’re in a middle state at the moment, IMO, mainly due to the uncertainty about what the higher-end pro Apple silicon story will be. Still, these current laptops look awfully good for many “pros”.

Linux has a lot of nice points and some pretty good distros that are improving all the time, but that’s not near enough to displace Windows or macOS.


I use Linux daily. "Just works" is a stretch as soon as you throw any GUI + desktop environment into the equation. Even more hairy if you add an nvidia GPU in.

The average consumer is more likely to switch to ChromeOS (which I guess is Linux underneath, for now) than to any Linux distro.


> Even more hairy if you add an nvidia GPU in.

You're also describing Mac with that one. But worse.


Macs haven't had nvidia GPUs for a while now, so not surprising they're unsupported.


Unsupported by Apple is one thing. Not allowing Nvidia to release a driver for macOS 10.14 and later is another.


Apple is actively preventing them, or Nvidia can't be bothered? Hadn't heard the former before.


"Beginning in macOS 10.14.5, software signed with a new Developer ID certificate and all new or updated kernel extensions must be notarized to run. Beginning in macOS 10.15, all software built after June 1, 2019, and distributed with Developer ID must be notarized." - https://developer.apple.com/documentation/security/notarizin...

You can't release a driver for Mac without Apple's permission. Apple won't allow Nvidia to release drivers for newer cards. Nvidia has stated this on multiple occasions.


I tried to install Ubuntu on an older desktop this past weekend, because that's a solid choice to revive any aging system, right? ... nope, it wouldn't boot because Canonical decided to drop BIOS boot support on their installers.


This is not true. Ubuntu supports BIOS boot without any issues. I booted 21.10 yesterday on a Core i5-540M and it worked fine.



"Today's presentation was almost completely devoid of any software discussion, which is really what puts the nail in the coffin for me."

Literally what? This isn't WWDC. It was an event for presenting new hardware. There was zero chance of it being software or (especially) developer focused.


Though ironically the macOS Monterey RC actually came out the same day as this event.


It’s not ironic; new releases of macOS always coincide with the release of new hardware.

And it always follows the announcement of the release at WWDC around July.


I don't mean ironic from Apple's perspective, I mean ironic given the comment complaining about "only new hardware".


I don't know if you've been following marcan & co. and their work on bringing Linux to M1 Macs, but it's moving along nicely. Pretty soon there will be an interesting conundrum for believers in open systems: buy high-performance Mac hardware but run an open OS on it, or take the high road and buy a "worse" laptop from a more open vendor.


One could view Marcan's work on Asahi Linux as "making M1 open".

Going out on a limb here, but as a firm believer in open systems I think it's fine to buy an M1 Mac to the extent it's guaranteed to run Linux (or OpenBSD, or whatever) without any problems.

Systems that can't do that are the ones to really avoid.


And yet I still have to hard-reboot after closing the lid on a fresh Ubuntu install on a 7-year-old Lenovo in 2021.


Don’t worry, 2022 will be the year of Linux on the desktop!


This is the one I bought!


Epic update. Crazy fast chips, 120Hz, mini-LED, more ports, 64 gigs of RAM. It's literally everything you could want in a new MacBook Pro.


Took away the ports just to bring 'em back and play the hero. Apple playing 5d chess.


I'm guessing this has to do with Jony Ive's departure. He was notorious for design over function.


I'm guessing it has to do with the new M1 chips and their great power efficiency. Back with the Intel chips, an SD card slot or HDMI port consumed valuable internal real estate that could have been filled with battery instead, and filling it with battery was a better trade-off for most people.

With the new chips, they can comfortably sacrifice some battery size.


Disagree. 2016 15" MBP was 4.0 pounds. 2019 16" MBP was 4.3 pounds. New 16" MBP is 4.5 pounds. Apple simply decided in 2016 to shrink the size and weight by removing ports and other things, and then to increase the size and weight by bringing them back.

Personally I'm surprised they didn't decide to keep the same weight by shrinking the battery. Not that many people absolutely need a 21 hour battery life, though it is nice.


Pretty sure the batteries are bigger in the new ones vs the Intel ones, at least in capacity (6000mAh vs 5000mAh). I imagine they are the same physical size; if not bigger.


Apple isn't sacrificing any battery size on the 16" at least. 100-watt-hour is the standard max size for TSA.


He and Steve were a great combo, keeping each other in check. Ive should have left a long time ago.

He is now designing at Ferrari: https://www.wsj.com/articles/ferrari-hires-former-apple-desi...


The new and improved Ferrari design rumored to include:

- Removal of the brake pedal since you should only need one input to control things

- The gas inlet port relocated to a more aesthetically pleasing location in the very center of the undercarriage

- Total width of the car down to a svelte and sexy 3.5 feet because the thinner it looks the better it is


There is no steering wheel, simply a flat panel where you swipe to steer.

There is 3D Touch where a long press on the steering screen lets you set cruise control


Steering wheels are available but sold separately.

You have to attach them via dongle, but the only input port is in the trunk.


> The gas inlet port relocated to a more aesthetically pleasing location in the very center of the undercarriage

Why is this a problem? You shouldn't be trying to drive it while it's fueling anyway


> - The gas inlet port relocated to a more aesthetically pleasing location in the very center of the undercarriage

It also requires the car to be on the front wheels, with the back awkwardly suspended by the fuel hose.


You joke but the new Ferrari has capacitive buttons everywhere. Even the rear view mirror adjustment buttons are capacitive. Whether this is Ive's doing, I don't know.

https://dealerimages.dealereprocess.com/image/upload/v157591...


You joke but the new Ferrari Roma has touch buttons on the steering wheel and the UX has been widely panned.


As far as I understand, he was absent from about 2015 on. There was an article, which I can't find right now, that discussed the mess this left the design team in. He basically retired to his home, and he would require the design team to come present to him at some local place (he wouldn't go to Apple). Then he'd give zero feedback. (This is from my recollection of the article.)

Pretty telling since that was sort of the heyday of Apple's devices having all sorts of conflicting designs.

Edit: I think this is the article: https://www.wsj.com/articles/jony-ive-is-departing-apple-but...

With a summary at: https://appleinsider.com/articles/19/07/01/jony-ives-departu...

Furthermore, it's hard to imagine Ive and Ferrari gelling. Ive's style is about as diametrically opposed from Italian design as I can think of. I characterize his design as character-less, something Italian designers don't go for.


He may have been hired to sit in his home and not design anything - for the name only.


I get the feeling that it's part ego from Ive and part Cook actually not caring about design as Ive mentioned.


Jony Ive wanted USB-C to be 'The One Port to Rule Them All'. That didn't work out, for a number of reasons.


As someone who originally didn't like giving up my ports and thought Jony Ive was a bit too overzealous about thinness, I actually don't think he was wrong.

USB-C is pretty great, and the sooner we shift things over the better. When I bought an MBP with only USB-C ports I just bought a little dongle thing that slots into my two ports on the side and gives me USB-A, HDMI, SD and microSD. I've used the thing around 20 times and it's never been a burden because I carry it with me in my laptop's case.

At this point I actually see adding the other ports back as a step backwards. HDMI is old and aggressively large in 2021. My 3 devices that use an SD card actually use a microSD card so I need to carry around an adapter anyway.


I have never understood this push for light and thin.

Even twenty years ago I wanted just the best features in a laptop.

Give me a quality machine at a reasonable price, and I would drag around a boat anchor.

I also wanted repairability, and room for future upgrades inside the laptop, but gave up on that one.


It almost has; it's just playing the long, long game. People don't replace their monitors, TVs and projectors constantly, but eventually they will all be bundled with USB-C cables. Last time I was in an office, every cable had a USB-C dongle stuck to it permanently.


I totally loved USB-C being the only port(s) on the device and thoroughly miss it on my Thinkpads.


> Apple playing 5d chess.

I think you're giving them too much credit. This is Apple's way of admitting they were wrong.


5d chess is usually only ever said in a sarcastic context, so the implication is that Apple didn't plan this at all


Well, at least they admitted it. If they had doubled down on the butterfly keyboard and Touch Bar crap, they would have rapidly lost ground to a better-designed, Linux-capable notebook, or just become one of the many commodity vendors.


More to the point, they now have someone else designing these laptops.


It would be interesting to read the former designer's take on the new machines.


The worst part is that everybody copies them. Now, two USB-C ports, one of which is used for power, is standard.


I far prefer just plugging in one USB hub than plugging in like five different cables. I have a Thinkpad with loads of ports, but I just use two (one for a wireless mouse that's always plugged in).


I've got a reasonably cheap screen with USB-C input. The screen is the hub.

It's great, as I only need to plug in one cable and my laptop is powered, connected to both my external monitors (DisplayPort MST) and USB peripherals.

It's also a reason I asked not to have a Mac when I changed jobs. The Intel Macs and the older M1s don't support MST.

I wonder if the new M1s finally support it?


MST would work correctly _in Windows_ on my last-gen MBP, so I think it may be a limitation of macOS. Apple has only ever officially supported MST over Thunderbolt, never over USB or DisplayPort.


I think the usefulness of a bunch of different ports/port types is for when you're on the go. A hub/dock is of course way more convenient when you're at your main workspace. But if you're traveling or even just on your couch, having extra ports makes things much more convenient.


I like the hub/dock lifestyle as well but… USB-C/Thunderbolt is complex and there are many gotchas.

As more folks returned to the office, we get lots of issues with shitty network and other drivers on docks.


The issue with my rMBP is that if I have too much on said single port, the laptop overheats and crashes. I end up with all 4 ports plugged in anyway to get decent performance.


It's good to have multiple options. I lug my laptop around and (rarely) need to connect multiple devices to it and sometimes I run out of ports. At home, I use a dock and the ports are useless, but that doesn't mean that I don't like the option to have more ports on the machine.


the great thing with this update is that you can still do that :)


I prefer USB-C with PD to all of the vendor specific ports except maybe MagSafe.


Same, but there still usually aren't enough ports in total. Looks like the new MacBook Pros have 2 ports on one side, 1 port on the other. Keeping at least 2 on each side would've been much better.


I had the same thought, if you're charging on the right side you don't have a USB port left there.


That is the biggest problem with USB-C IMO. The ports can all have different capabilities and it's not always obvious what they can do.

For the most part though it seems they've settled on Thunderbolt, Charge, and bog standard ports. Are there any that do Thunderbolt + Charge?

It's a shame about the name, because the Thunderbolt icon would be perfect to indicate a charge port. Instead we're stuck with a little plug icon that usually isn't discernible.


That doesn't mean that people are copying them. Thunderbolt 3 adopted the USB-C format. Until USB-C, USB wasn't capable of delivering the power required for charging. It made logical sense to switch charging to USB-C. If anything, Apple copied USB-C.


I think the point was 2 ports is 1 port in practice. And so few ports used to be rare.


Sadly no. My new Thinkpad has only one USB-C port and it's used for power. So I cannot use the USB-C port while charging. And it has three USB-As and a freaking HDMI.


Maybe next year's refresh will have USB-A ports again!


Apple, according to the rumors, considered even bringing back USB-A but decided against it.


It would be nice if they could have squeezed one in again so you don't need to pull out a dongle/hub every time you want to connect a "legacy" USB device when traveling. But that's not really a realistic wish, and I wouldn't have wanted them to take anything out to accommodate it.


What legacy USB devices are you plugging in while traveling?

Traveling is when I'm least likely to do that. And regardless, I travel with this[1] anyway, so no problemo.

[1] https://www.amazon.com/Purgo-Adapter-2018-2016-Delivery-Thun...


I'm not the one you asked, but heres a few things that I might need to use a USB-A port for when traveling:

* My yubikey to perform 2-factor auth

* The checkra1n jailbreak for my work iPhone (I do some iOS dev and occasionally need a jailbroken device. For whatever reason, it doesn't work with a USB-C-to-lightning cable, it has to use a USB-A-to-lightning cable + a dongle.)

* A USB flash drive for transferring a presentation or whatever. (I know these come in USB-C variants, but I don't have one.)

* Charging all the random things that charge from USB-A ports.


Gotcha. Well except for the weird checkra1n thing, everything else is easily migrated to USB-C. To help with that migration, I highly recommend this[1] cable (or one similar to it) as a technological Swiss Army Knife.

I made the transition at the beginning of the year to "USB-C everything I can" and it's very liberating. Ironically the only devices I have left that are not USB-C are my iPhone and AirPods (and my Kindle, but they just released a USB-C model which I will soon use to replace my current one).

[1] https://www.amazon.com/SDBAUX-USB-Compatible-Electronic-Tabl...


Yea, fair enough. I am generally migrating in that direction, but probably a bit slower than you.

Looks like a handy cable.


As I said not something I could reasonably expect or that Apple should reasonably have done. Just another set of changes to make/stuff to buy.

In practice I’ll probably mostly just deal with for a while by carrying a hub with me.

Of course I have a ton of other things at home, but that's more easily dealt with using various USB hubs.


Yubikey has a USB-C version with NFC now.


Would USB-A fit in a MacBook? No way for the Air, right?


I mean, Apple gets to decide how thick the laptop is, so they could make it work if they wanted to. Didn't the original Macbook Air have a USB-A port?

Also, HDMI ports are approximately the same height as USB Type-A ports, so presumably it would fit in this design.


> Didn't the original Macbook Air have a USB-A port?

It did


I am happy if USB-A dies.


For me C can die. Having multiple specs on one plug and various cables that all behave differently is ridiculous.


But that notch is kind of a really ugly blight on the face of an otherwise good-looking update. Especially since it's lacking Face ID or something equivalent to Windows Hello.


Go back to when the iPhone introduced the notch. There were so many comments just like yours but when was the last time you heard anyone say anything about it? It gets a lot of attention because it's quite visible — literally right in front of your eyes — but it's also in an area which is dead space most of the time and the most common scenario where it isn't is full screen video conferencing, when you'll be glad for the better quality.


> but when was the last time you heard anyone say anything about it?

Literally during the latest iPhone announcement where Apple was even timid about showing off the front of the display or talking about the smaller notch, as everyone mocked how dated the notch looked.


Was it really “everyone”, or just a few very online people in your social media? The sold-out pre-orders suggest this was not as universal as you're portraying it. For example, looking at the Ars Technica coverage linked below, I see a note that the notch was smaller, and you have to go 4 pages in before there's a single comment complaining about it, and that's in a community which loves to critique Apple designs!

https://arstechnica.com/gadgets/2021/09/apple-makes-the-ipho...

Again, not saying this is nothing or that I love it, only that the buying public does not seem to feel anywhere near as strongly about it and I've never heard anyone complaining about it on a regular basis. Most people get used to it, if they're curious they probably understand that it's for the camera, and get on with life. I'd be quite surprised if this did not follow a very similar trajectory — especially since the display is larger so you could black out the entire top of the screen in software and still have more screen real estate than the previous model.


People buying a device doesn't mean they liked literally every aspect of the device and have no complaints about any aspect of it.


No, but it definitely means that it's not the big deal which overheated rhetoric in forums often implies. Experienced users are often extremely reactionary and wildly overstate the impact of highly visible changes but in most cases if you actually survey their usage later they've almost always gotten used to it.

In this case, the 16" screen gained ~0.2" diagonally and 314px extra resolution. Those extra pixels mean that even with the notch you're going to see more on the screen than you could before and in my experience that's far more likely to be the part which shapes people's impression of a new device.


> People buying a device doesn't mean they liked literally every aspect of the device and have no complaints about any aspect of it.

10 people complaining online doesn’t mean that all the millions of purchasers dislike it too.


I like the notch. It makes the phone look different to every other glass slab on the market.


Why are you moving the goal posts? At first you asked whether "anyone" talked about it, the parent gave you a reasonable answer, and now you're talking about "everyone." Accept that some people do talk about the notch, however small the minority opinion is.


What I should have said is anyone talking about it unprompted — there’s always someone talking about any design decision but you see the average priorities based on when people talk about them unprompted. People complain about the Touch Bar or the old keyboards a lot, or the sided charging / performance issues, but the notch seems much rarer to hear unprompted complaints about.

That doesn’t mean it’s zero but it suggests that the compromise was actually quite reasonable, even if there are a small number of people who vocally disagree.


Have you noticed how screenshots and background images in iPhone marketing material are chosen so that the notch is almost or totally invisible? It borders on false advertising. It's a design company; they know best how ugly it is.


Do you have some examples? Looking at https://www.apple.com/iphone/ the notch is emphasized on the icons at the top of the page and the only images where it’s not clearly visible are the ones showing the camera on the back of the phone.


I think the notch works ok on the iPhone as iOS is specifically designed with it in mind. The small spots abreast of the notch are fine for battery meter, clock, and reception bars.

The Mac notch is right in the middle of the toolbar. I have lots of applications that use that space for menus and will be curious how cumbersome it will be now that we need to literally navigate around it.

Will we need to move the mouse down to get around the notch, or will the cursor be allowed to go under the notch?


In the screenshots shown, most of the apps do not have menubars which extend to that space at that display resolution — this could definitely be more of an issue for people who increase the size for accessibility reasons.

From the full-screen screenshots, it appears that it's simply black across the top of the screen unless the application has been designed to support it. That looked like the regular Safari developer tools with an unbroken bar slightly lower down in the video.


My thought is that it’s a pro laptop. Does it really need a built-in webcam? How many people never use them or even tape over them? I could live without it, and if it’s actively providing a detriment, that’s all the more reason to get rid of it.


You mean like (I'm guessing but I think it's a safe guess) the vast majority of "pro" users who videoconference? The opinion that a laptop maker should take out a webcam in 2021 is... interesting.


I have never used a webcam in my life, and I'd gladly replace it with proper display.


Yes, they should make a camless model just for you.


Agreed. In fact, I think I'll mind it less. On my iPhone the only time it looks funny is in landscape, otherwise it just sits in the top-bar. On a laptop that isn't an issue.

I think for me it'll fade away, especially with a dark-mode UI.


> On my iPhone the only time it looks funny is in landscape,

Laptops are effectively always in landscape.


But the notch on the MacBook Pros is on top, not on the left.


When I use it in portrait, it’s on the side.

And yes, sarcasm.


There are literally still so many comments just like that and plenty of people who simply won't buy a phone that has a notch. The comments never stopped.

The screenshots showing application menus almost hitting the notch are ugly af. I'm sure that there are applications that have even more menus than Photoshop or Premiere.


> There are literally still so many comments just like that and plenty of people who simply won't buy a phone that has a notch. The comments never stopped.

… and yet, sales have continued to be extremely high, which suggests that is a self-selected group of extremely vocal opponents rather than a real market trend.


...as if the notch is the only deciding factor.

I hate the notch, but I'll still buy iPhones all day long if my only other choice is to buy products with questionable privacy from an advertising company versus dealing with Apple's shitty designs.


That's their point, it's literally self selection: people who do not care about the notch are not going to write comments about it, and people who care are never going to shut up about it, giving an extremely poor window for understanding how much it actually matters or how it impacts real use. That complaints like this are self-selective is both well known and the OP's entire point; it's amazing how much effort all the replies seem to be going through to not understand this.


Notches on cell phones are so 2018. Apple is really behind in that area.


I guess we always have the option of showing black on that part of the screen and effectively getting the old display back (ie a huge bezel).


You get more usable screen though. I am curious how legacy apps with way-too-many menus will display with the split.


The split is in the global menu bar, which is owned by Apple design-wise. It will be interesting, but I imagine it will work fine. Also, don't forget early OS X builds had a centered Apple logo, with logic to handle this.


They'll display the same way they did on the smaller screens, or with lower DPI settings enabled I'm guessing.


My only concern is if it will distract me from work with it always sticking out.


It is useful because it will be in the middle of the status bar. After 2 hours, my guess is nobody will notice anything weird. Like with the iPhones, but this one actually makes sense.


It conflicts with existing Apple design decisions, such as Safari in full-screen with the compact tabs, or Apple Calendar in full screen mode. I wonder how that will work.


Full-screen apps do not use the notch space, as seen in the full-screen demos.


So they're not really "full screen" any more? Why even bother with the feature, and just switch it to "maximize" instead?


They are as big as pre-notch. It's just that now the toolbar has moved into a space which was once black bezel. Full screen is still useful for hiding the toolbar for a distraction-free view.


Except for purposes of immersion in games and movies, I might just always have the top menu bar visible if the treatment for many full screen apps is just to have an opaque black bar with no content.


I don't think Apple really designed Safari to be ordinarily used in full-screen with compact tabs.


People will get used to it... but they could have just shifted the screen down, using that chin. For example, there are plenty of Windows laptops with 3:2 aspect ratios where they use up all the vertical space.

They could have just moved everything down for IMO a slightly better design.


It won't be in the middle of the status bar when running anything full screen.

Some people run applications full screen most of the time. For example the browser, text editor, chat, etc.

I wonder if it automatically inserts a black bar in those situations.


It does.


I agree that the notch is a bit large to just be a 1080p webcam. I don’t hate it though, aesthetically. If the next iteration doesn’t have Face ID, I’ll be a bit annoyed. Seriously considering trading in my M1 MacBook Air for the 14”. Tim Cook really knows how to drain my accounts!


sure. But with video conferencing being a new part of the job, it's likely a good tradeoff


https://www.apple.com/macbook-pro-14-and-16/

Looking at the product photos it looks like they black-bar the entire top of the display when anything is full-screen? So that seems to make the notch even more intrusive as you lose out on all the screen real-estate next to it when something is fullscreen.

Or rather, the menu bar is now a permanent bezel that nothing else can occupy?


> you lose out on all the screen real-estate next to it when something is fullscreen

That would have been bezel in a notchless design.


The bezel wouldn't have been as big. The resulting "fake bezel" is larger than necessary to house the camera, as it needs to shift where the camera is relative to the physical dimensions of the display and the border that it needs.

Just see literally any other laptop that has both a fairly small bezel and fits a webcam in it, like the Dell XPS 13. The camera in the macbook may be better and require a larger lens assembly as a result, but it's surely not as big as the bezel + notch itself requires.


From the picture, it looks about the same. The top bezel is huge on the 2020 macbook.


As someone who always hides the menu bar, though, having a notch is going to kill being able to switch tabs in Chrome, search in Outlook, or broadly, use any controls at the top of a window in maximized mode. I’m guessing the hidden menu bar is going away on this machine because it would break usability of all other apps.


Someone will write an app that blacks this bit out for you, and you can carry on with your life like an Englishman in a Parisian restaurant…


The notch is an explicit design choice: it's visible and iconic. It differentiates the new models, and you can make a reasonable argument that that space is always used by the menu bar in macOS currently, so this is a net improvement without question.


And they got rid of the touch bar which everyone dumped heavily on


I love how they boasted about bringing back the regular keyboard, even though they're solving a problem they created.


Damned if they do, damned if they don't? Sounds to me like they listened and changed course.


I think the parent comment is taking issue with how they frame it, not with the actual decision


What are they supposed to say? "We're sorry we tried something new and it didn't work out?"


"We listened to you and brought back the functions keys–and made them full size!"

That's just my half-assed stab at it. Presumably someone whose job is to write these things for a living can improve it meaningfully while maintaining the essence.


They said pretty much this, minus the "we listened to you" part. I don't know why you need a pat on the back. Are you going to send them a thank-you card for everything they do right?

It's a company dude, Apple is not your buddy. They're not listening to you, they're listening to market research and usage analytics.


Listening to market research = listening to users.


They are trying to sell Me a product though


Yes?


Why?


Because it's the truth?


Yes, but instead of acting like bringing back the regular keyboard is an amazing achievement, they could just admit that the Touch Bar wasn't a good idea. But admitting errors is not the Apple Way.


It was a fine idea, just like 3D Touch on iPhones, that just didn't get enough use for the complexity. It was a gamble and didn't pay off.


The difference from 3D Touch is that they removed existing features. I wonder what would have happened if they had kept physical F keys along with the Touch Bar.


They still haven't solved the tiny inverted-T arrow key arrangement. They need to improve on IBM's former 6-key cluster below the right Shift key, or arrange full-size arrow keys in a cross pattern breaking out of the rectangle at the lower-right corner.


How are they supposed to frame it? “Sorry we messed up the keyboard. We’re putting a regular one back in.” Doesn’t seem like great phrasing for launching a new product.


I'd prefer that. But I guess that's why I'm not an apple customer


Seems fine to me.


They didn't boast. Saying many people prefer hardware keys is closer to admitting a mistake than boasting.


They were "boasting" about making the function keys taller.


Which makes sense because some of the earlier MacBook keyboards with physical keys, like the one I'm using right now, had half-height keys.


If you listen closely you can hear the IT procurement POs being opened up en masse.


Looks like they only “got rid” of the Touch Bar for the more expensive model. Really trolling their customers now


Well, the 13" Pro already has the M1 chip and will probably be gradually replaced by the 14", so there's no reason to update it. Where's the trolling?


No they didn't? The 14 and 16 inch don't have the touchbar.


Exactly. No touch bar and cost more money


The Air doesn't have a touch bar and costs less money. It's also a perfectly good -- no, sorry: an excellent laptop for most applications, including development.

I suspect the 13" M1 Pro was probably just something to cater for those who were skeptical about a fanless laptop since no one really knew back then how well these machines would perform. Now that the M1s have been out for a while and no one seems to be complaining about the Air, they can drop the M1 Pro.


> It's literally everything you could want in a new MacBook Pro.

If it doesn't run VirtualBox then I can't use it to write code, so why does it even matter if it's faster?


> why does it even matter if it's faster?

Have you considered that plenty of people don’t need VirtualBox to write code?


Maybe you should be questioning why you need VirtualBox to write code


Security, provisioning, dependency management, infosec, etc.: the list of reasons for someone to develop in a VM is endless.


Yes. Also, VirtualBox is not the only way to run VMs.


Parallels supports Linux VMs. https://kb.parallels.com/128445

You could use that now or wait for VirtualBox to catch up (I assume).


I'm writing this comment from Ubuntu running on my M1 Macbook Pro, so...?


How is the GPU performance?


No idea really, not sure I've done anything GPU-heavy on Ubuntu.


You can run Linux with UTM.


Can I keep my existing vagrant / Ansible setup? Or, basically, what would I need to use in order to provision the instances?


You can keep your Ansible. Does Vagrant support alternate providers for running VMs? If so, then yes, you can keep your Vagrant too.


It looks like Vagrant doesn't work yet with UTM: https://github.com/hashicorp/vagrant/issues/12518

Conceptually I'd love one of these machines, but realistically, if I can't develop locally in an Ubuntu VM then I might as well just wait a year or two to buy one, until that's possible. I'm not going to use Docker, and I have zero interest in spending multiple weeks rewriting all of my provisioning scripts.


Vagrant has a QEMU driver, which is what UTM uses anyway. I can't say it will do what you need the way you want but I suspect it's possible to get there.

QEMU can virtualise arm and emulate x86.


Thanks, I'll check this out! I just need to run Python on Ubuntu, so hopefully that isn't too crazy of a use case.


You can always use docker


Docker doesn’t really work on Mac. Like it technically runs, but just barely. Because it uses containerization inside virtualization, it pegs the CPU just to run a hello world app and completely drains the battery in 30 min. Oh and the fans sound like an airplane taking off.


I don’t know anyone at work who would agree with this. Docker is fine on macOS, honestly always has been. Sure, it will show up in Activity Monitor, but so will any other software that is actually doing something. I have never seen my CPU peg because of Docker, and I regularly run large stacks, use Earthly for all my builds, etc. The overhead of the VM + containerd is truly not that large.


CPU % in top and similar tools isn't a great proxy for energy usage anyway, it's a lot more complicated than that.


Docker is terrible on Macs, but I used to run Docker+vscode+Firefox+Brave+Slack+misc. I'd run into issues on my 2015 MBP from time to time, but on the 2019 models it was fine.

gulp.watch would often eat an entire core, and stuff in Docker ran slower than on Linux, but my laptop could keep up. vscode didn't lag and Firefox ran just fine—all with 5+ Docker containers running.


I have no idea what you are talking about. I have about 7 containers running the full work app and I can't hear any fan noise and the laptop is running cold.

This thing is running Rails, mysql, elastic search, redis, mongodb, and workers inside docker. As well as nodejs, vscode, a handful of electron apps all at the same time and the laptop is cold.


Not my experience at all, you must have something weird in your setup. For a full dev environment I'm running containers with postgres, cockroachdb, several ruby apps, kafka, some JVM based stuff, and clickhouse and it does hit battery life a bit but certainly not pegging the CPU, actually it's using less than Slack...


Not my experience either on a fanless M1 Air. I can run lots of containers without it perceptibly heating up or slowing down. Battery life does suffer but I can still usually get at least 4-5h of solid programming like this.


> micro LED

Mini-LED backlighting. It’s still an IPS panel. MicroLED is something else entirely. Had me excited for a second.


Yup. This was more than many of us demanded TBH.


Did anyone miss the headphone jack now supports low-impedance headphones?

No reason they had to do that - just pure icing on the cake.


High-impedance headphones. Driving low-impedance headphones is easy.
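
A back-of-the-envelope way to see why the high-impedance case is the hard one: for a given power delivered into the load, P = V^2/R, so the required drive voltage grows with the square root of impedance, and portable outputs are usually voltage-limited. (The 1 mW level below is just an illustrative figure.)

    import math

    # P = V^2 / R  =>  V = sqrt(P * R). 1 mW is an illustrative target level.
    def drive_voltage_rms(power_w: float, impedance_ohm: float) -> float:
        return math.sqrt(power_w * impedance_ohm)

    print(drive_voltage_rms(0.001, 32))   # ~0.18 V: easy for any jack
    print(drive_voltage_rms(0.001, 300))  # ~0.55 V: needs ~3x the voltage swing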


Yes, but let's see what the price tag for an M1 Max with 64 GB is, and then we can rejoice. Or not.


$4,298.65 in WA after tax, free shipping.
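
For what it's worth, that figure is consistent with the $3899 list price mentioned below plus a combined sales tax of about 10.25% (the exact rate depends on the WA locality; this just checks the arithmetic):

    # Implied sales tax rate from the two figures quoted in this thread.
    after_tax, list_price = 4298.65, 3899.00
    print((after_tax / list_price - 1) * 100)  # ~10.25 (percent)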


Sounds like it's time to make a trip down to Oregon


Same price as the i9 it replaced, no?


5600M SKUs were more expensive than equivalent M1 Max options.


$3899.


I’m so impressed by the screen, but it’s too un-ergonomic for me, so I’m sad instead. It would have been so nice if they had finally released 27” desktop screens to connect to these wonderful laptops.

It’s not like a huge majority of office workers have been staying at home or working from the office for the last year and a half or anything. The coffee shop surfing and working on planes/trains must have taken a dive.


Just buy a non-apple screen? I run 3 Samsung 4K screens on my MBP.


What a beast! I already feel sorry for my MacBook Pro 13 with M1. Left in the dust essentially.


The new models are a lot more expensive though. And the entry-level 14-inch has an 8-core variant of the M1 Pro, which probably will not perform that much better than the 8-core M1 in the 13-inch.


Same for my M1 Air! :D... but then I remembered that Apple does trade-ins! :D


M1 Air, and not at all feeling this way.

The original M1 Pro had a Touch Bar; that alone made it suck. But the M1 Air is still thinner by a mile and has no fan!

Many people overlook this, but no fan means no dirt will get into the chassis. That is HUGE.


I guess the M1 Air is targeted towards execs, the business crowd, marketers and such, whose primary day job isn't creating; nothing that'd require Pro-level capabilities anyway.


The M1 Air is the best laptop for the average casual laptop user (school, emails, photos, etc). The battery life alone makes it a huge upgrade over any Windows machine. I also love that it doesn't have any fans, since that's usually the first thing to go on any laptop. I expect to get years and years of good use out of the Air.


Or programmers. We don't need that much juice, really.

If our stuff makes the M1 run slow... it's probably poorly optimized code.

Of course, game dev is exempt from this.


I already have an M1 Air, and I'm looking to buy the 14" MBP just for the display (working outdoors, yay), battery life, and RAM (16GB + higher bandwidth). There's enough horsepower in the M1 for me; the difference from the i7 MBP 16 isn't even noticeable.


The 14” has several hours less battery life than the M1 Air. Dealbreaker, if I’m honest. https://www.apple.com/mac/compare/?modelList=MacBook-Air-M1,...


That's... interesting. Seems like the "wireless web" stat is really helped by the additional 2 efficiency cores on the Air, while the video stat is helped by the dedicated media silicon on the new Pro. Not to mention the question of brightness (40% brightness on the Pro is 100% brightness on the Air). Good to keep in mind.


Many types of programming require big resources these days. Training a machine learning model is the classic example.

But also those who run multiple VMs at the same time.


> But also those who run multiple VMs at the same time.

I tried this route a few years ago but gave up out of sheer frustration. I wonder if it's improved now. At this point I'm more inclined to set up my local stack in AWS and be done with it. Like, no local laptop testing at all.


I wish they offered a non-mini-LED display option, since developers don't need that level of quality. Apple needs to break the "Pro" tier down further.


Au contraire, as a developer staring at screens all day I think it's very important to invest in high quality displays


As a developer, I think we should invest in big external monitors, not look at a tiny 14-inch display, however nice it is.


Agreed. Laptop form factor itself is also bad for ergonomics.


Generally agree on investing to avoid eye strain, but this is overkill.


How can you generally agree but then say this display is overkill? The previous displays were definitely not the peak of quality to avoid eye strain.


To reduce eye strain, reducing glare is important, so using a non-reflective screen, even at some sacrifice in quality, is better (or ideally e-ink). XDR displays (and previous displays from Apple) are great for quality, but not fully optimized for reducing eye strain.

https://www.eizo.com/library/basics/10_ways_to_address_eye_f...


I would argue dynamically backlit screens are actually terrible for development. High contrast is intense on the eye.

I bought a QLED TV 3 years ago and tried to use it as an external display for my laptop. It's painful to look at. You switch from a white window to a black one and the TV adjusts the backlight dynamically and blasts your eyes. I would react by putting my hand in front of my eyes, as if I had looked outside from inside a cave after getting used to the cave.

Something like e-ink would be best for office work indeed. The lighting is external, so it's controlled by the user and in line with the rest of the room.


> High contrast is intense on the eye.

Is it really the contrast though? No contrast is sharper than black and white, and reading a book is fine.


Sorry, I meant to say brightness contrast: high-end dynamic displays will turn off the backlight in black areas, and turn it all the way up in white areas.

Let's say your movie shows a full moon at night. Looking at that moon on the screen is now like looking at a torchlight in the night in real life. The moon section of the screen is blasting high brightness right at your eyes.

This is what I meant. I tried doing office work on such a monitor and quickly realized it's good for immersion, but terrible for office work, for which you want stable brightness.


Designers use MacBook Pros too.

Everyone likes to have a mobile workstation.


Yes, this is excellent for designers, but there are also many developers.


mini LED, not micro LED


Yup my bad


Yet there still is no support for writing HFS (not plus) formatted floppies. Yeah, not really everything I could want...


Kills the Touch Bar and the low-travel keyboard; adds back weight, the card reader, ports and MagSafe.

Feels like an apology for prior design decisions. My 2011 MBP is new again!


I have owned MacBooks only since 2017 and found the older (2011-era) aesthetic design to be bulky and not too appealing. I own a 2020 model, which looks sharp and professional. I don't like them going back to these rounded edges on the bottom. I love everything else; this is a simple gripe over aesthetics.


> I have owned macbooks only since 2017 and found the older (2011 era) aesthetic design to be bulky and not too appealing. I own a 2020 model, which looks sharp and professional.

There used to be a time when “professional hardware” meant versatility and durability over pure aesthetics. I’m not saying we can’t have both form and function in professional-grade hardware, but with the later iterations of the MBP Apple put form ahead of function, and then trained an entire generation of developers to look at hardware superficially. I honestly weep for the industry if this is the path we are destined to continue down.


I never get people who moan about people caring about aesthetics.

Computers are our workplace, we spend most of our lives in them.

Would you be just as productive if we put your desk in a reeking pig sty?


You’re conflating form and function with your example (“reeking” is not an adjective of form, and a pig sty is not even a functional office space to begin with).

I have no issue with people wanting their work environment to be pretty, but aesthetics shouldn’t hamper a person’s ability to get their work done (at least not unless you work in an industry where aesthetics is your job).


It’s about wanting your environment to be pleasant.


I see this as proof that any company can do both groundbreaking performance and a solid design, let alone Apple. I don't want them to go back to Intel with discrete GPUs, or to removing the extra ports; I want a sharper design is all. It's not much to ask for both. Apple, after all, cares a lot about design. I don't know how they came up with such a design while their whole iPad lineup (bar the basic iPad) and iPhone lineup has a boxy design, but this MBP is rounded.


“Rounded” is a common design element throughout the history of Apple, including iPhones and iPads. It also has zero bearing on the function of the device, so who cares if it’s a little more boxy or rounded? It’s supposed to be a “Pro” device, not some piece of art.


I suppose I'm the only one who cares, and I never claimed it affects the functioning of the device. Like I mentioned in another comment, it is a very minor and picky thing, but Apple takes such great pride in uniformity and design. With all of their current-generation iPhones and iPads having a boxy design, it is strange they resorted to such a design choice, and in their top-of-the-line MacBook Pros no less.


I think there's a lot I agree with there.

But I will say, the aluminium body is genius in terms of form and function. That's not an aesthetic compromise, it's the right thing.


I think many owners of the old machines like them for how they worked.

Function over form.


I miss the 2012 model every time I use the 2018 model I own now. One was peak Apple design: everything worked, it was robust, and it was just genuinely pleasant to use. The other was rock-bottom form over function, with a disastrous keyboard that I never got used to: crap layout, lack of essential keys, shitty cheap feel, etc. And that was before it started having the well-publicized issues.

I like that they are going back to basics with the new models. Not crazy about the notch but otherwise it is all good. Decent screen, memory and ssd are super expensive but at least there's plenty of it now.

I'll wait a few months to see if their quality levels are back where they should be. Because I'm beyond taking their word for it after my last experience. I'm particularly curious to see if it delivers on the performance hype and thermal behavior when actually using it for doing work. If that's even close to what they are promising, it's going to be a bad year for Intel.

Does anyone know if those neural processors do anything useful or is that only for people that use specialist stuff like video editing tools? I run docker, intellij, vs code, etc.


>Function over form.

All the time. The MacBook Pro is meant to be for pros, and pros care about things like having physical function keys over a "light blue color that matches my shirt :)".

I haven't bought a MacBook since 2016 because of all the trash they've been doing. A laptop without MagSafe would last ~2 weeks in my household, so that alone makes it for me.

Also, best battery, best screen, best trackpad, ports. Assuming nothing weird comes out later (screen or thermal issues) this will be the best laptop on the market for the next 10 years.

Great decision to have fired Jony Ive back then.


Oh I totally get that, I like functionality and the 2017-2020 MBP barely qualified as Pro laptops. I have a huge windows desktop for my actual pro work, MBP was just a mobile device and for some freelance work. A sharp design would be nicer is all I'm saying


Well, perceptions change. In 2012, the aesthetic and design were state of the art. I felt I looked good every time I pulled it out.

But I still use a 2012 MBP. It doesn't look sharp anymore, and it's always been pretty heavy, but the keys work sooooo well, the touchpad is a miracle of engineering, and after 9 years of daily use everything still works beautifully. And frankly, I mostly don't notice any lag still (though I might if I was doing video editing). And the port selection meant that I could hook up to just about anything with no hassle ... including ethernet. Wifi is still inferior to an ethernet port.


I understand that perceptions in fashion and design will keep changing. But it is only recently that Apple adopted a boxy design in all their iPads and iPhones except the basic iPad and iPhone SE II. Why not follow that? The trend is set across the board. These top-of-the-line MacBooks have a design similar to the basic $329 iPad; I can't wrap my head around such a decision.

I know it's a very minor and picky thing, but Apple takes such great pride in uniformity and design that it's strange they resort to this.


I've owned MacBooks since 2003. Typing this on a 2020 MacBook Pro 13". It's just been a long evolution of getting thinner. IMO, and as others have stated, going all-in on thinness drove a series of design mistakes that seem to be reversed here. Rounding out the thinner leading edge gives a bit more space, and I would guess makes it feel more solid. I don't like the thinner front end being a little suspended above the surface of the table; it makes me worry about it bending.

The 2003 12-inch PowerBook with the aluminum keys was like a textbook made out of solid metal. That keyboard felt great.

This looks like a great machine to me and I am a little sad I bought in Feb!


The 2015 15" Macbook Pro was 1.8 cm thick. The 2019 16" Macbook Pro was 1.6 cm thick. The mew 2021 16" Macbook Pro is 1.68 cm thick. So the 16" got slightly thicker than the old one but not as thick as the old 15".

The 2020 13" Macbook Pro was 1.5 cm thick. The new 2021 14" Macbook pro is 1.55 cm thick. So that is only 0.5 mm difference.


Yeah, the design sucks. But these are machines that are going to be used 8-10 hours a day. Better to be functional than trendy.


I know I'm in the minority regarding the keyboard, but on my 2019 model it's been flawless and I really have come to like it quite a bit, more than the one on my 2014 even (except for the vertical arrow keys that is. Seems like nobody at Apple actually tested their usability)


Is it the late-2019 16" MBP?

If so, that's not the keyboard everyone hates. They were already backtracking on the butterfly switches by that point.


No, it's the 13" with butterfly switches. The only thing I worried about was dependability but they added the extra rubber membranes that year (I think) and anyway, the keyboard has caused me zero issues, and made every other keyboard feel clunky to me, including my previous favorite keyboard on my 2013 15" MBP.


I recently got a new M1 MBP and really love the utility of a customized Touch Bar (using BetterTouchTool). I'm sad it'll be gone! It's actually great - but certain functions are missed (like dedicated volume buttons).


I was laughing when every single one of these changes was presented as "revolutionary" in the keynote. No, you just had to revert everything because your previous revolution was universally hated by users.


Ignoring that none of the changes was presented as "revolutionary" (the touchbar wasn't mentioned, magsafe was referred to as "Brought back", and the ports were simply noted for convenience), for some small but very loud subset of HN users, an Apple event is all a giant lie if it apparently isn't hosted by some sneering Apple detractor.

It's an Apple product launch. Like every product launch ever in the history of ever, they point out the features of the thing they launch.


They said "function keys, replacing the touchbar." So they did mention it.

But yes, I don't care how they market it. They'd be dumb to pause and say "oh yeah we were stupid, here's your old toys back."


> They'd be dumb to pause and say "oh yeah we were stupid, here's your old toys back."

Maybe I'm weird, but my respect for a company that did that would go way up, not down.

Obviously they wouldn't say "we were stupid", but I'd absolutely appreciate an admission along the lines of, "during our design journey over the past X years, we've realized our customers prefer having a full function key row on their keyboard / more ports / MagSafe / etc., so we've listened and are bringing them back!" To me, that signals a group of folks who know they are fallible, listen to customers, and do their best to meet customer needs.

But of course admitting those sorts of things wouldn't be consistent with Apple's brand. Apple is all about "we know better than you know what you want and can do no wrong". Which is fine, and seems to have created a lot of success for them, but it's always turned me off.


> Maybe I'm weird, but my respect for a company that did that would go way up, not down.

I think technically-minded people would take that well, but the business/management types don't like doing stuff like this, likely because it has a chance of making the stock fall (or even just not rise as much as they wanted it to).


I wonder how much the mechanical keyboard community grew simply because so many MacBook owners hated their laptop keyboards.


> "oh yeah we were stupid, here's your old toys back."

"We heard your feedback and we listened."


They've certainly called the touchbar revolutionary:

https://twitter.com/apple/status/791704819811573760


> I was laughing when every single one of these changes was presented as "revolutionary" in the keynote.

Exactly none of these changes were presented as "revolutionary", that's just in your head.


I had the feeling they owned it. Like, the woman said something about "no need for adapters" and you could have maybe seen a little smirk on her face while saying that - but I don't know...


they listened to customers and reverted a lot of bad decisions from the past. What else do you want? That they admit they made bad decisions in what is essentially a sales event/pitch? This is more than anyone could have hoped for imo.


The more hilarious thing was the hate for it.

I would much rather have more usb-c ports that can do anything and add dongles, than have ports that are functionally limited.


>> because your previous revolution was universally hated by users.

Nah. It was hated by a small but vocal minority of users, of which HN has a lot.


You're provably wrong: if it was a minority, they wouldn't have reversed course. See the headphone jack on the iPhone.


"Provably wrong"? On the contrary, if it wasn't a small minority, Apple wouldn't have continued to break sales records with each iteration they release.

I posit that "hate" is too strong a word. I would describe the changes as merely unpopular.


Power users forget that function keys and even command shortcuts are pretty much useless for normal users.

> About 90 per cent of computer users don't use CTRL-F to search for a word - as they don't know such a keyboard shortcut exists, a Google survey found.

The results stunned Google's Uber Tech Lead for Search Quality and User Happiness, Dan Russell:

"I think we just all assume that we all know it, but no one actually does."

https://www.smh.com.au/technology/only-one-in-10-know-what-c...


To be fair, the Macbook Pro is aimed squarely at power users. The majority of normal users are better off with the Air.


>Nah. It was hated by a small but vocal minority of users, of which HN has a lot of.

I have not met a pro user that wasn't annoyed by not being able to plug in HDMI at some point without a dongle...


It was even worse. Those MacBook owners would annoy everyone else in the meeting when they couldn't plug in and had to send their slides to someone else with an HDMI port and keep repeating the words “next slide”. It was viral annoyance.


My employer issues a dongle with a type A, type C, and HDMI port on it with every MacBook. That only helps if you actually have the dongle with you. One popular option was to just keep the dongle attached at all times. Personally, I found an adhesive pouch and attached that to the back of the monitor, and carried the dongle around that way.

I wonder when I'll be able to get one of these new Macs at work ...


At my office they just gave up and replaced the HDMI connectors with USB-C connectors in every meeting room. Now it's the people with the older windows laptops who need a dongle.


Nice to meet you. I could not care less. If I need to plug in something with a cable, it may as well have a dongle too.


I personally think if it was just a vocal minority, Apple would not have relented. They play the long game, so they probably saw a decrease in their user base somewhere, or at least a trend there.


>small

Do you have a source for that? Because if that was the case, it would seem silly for them to undo.


I think he meant "hate" is too strong of a word. My impression is it just didn't serve much of a purpose for most people. I almost never use the function keys for anything but volume control anyway, and probably wouldn't have used the touch bar for anything but that either. It was just a waste. I probably wouldn't have hated it if I got one, maybe even found some nice uses for it, but I still think it was the right choice to get rid of it.


Not really, the performance will make your 2011 mbp feel an extra 10x older now.


And then they added a notch hahaha


A fair warning to developers: You're getting into an adventure with M1.

From Docker images not built for M1 segfaulting under qemu (e.g. Liquibase, for Spring developers), to _significant_ trouble getting React Native apps to build with the M1 and Xcode.

Don't get me wrong, I have a Macbook air M1 and I love it, but it hasn't been a love without pain.

Also, the magsafe feels like it comes too late. Almost like a political feature in response to the EU measure of enforcing a single charging cable.


The React Native issue is finally resolved in 0.66[1]. Before that it was walking on eggshells, dealing with a weird combination of Rosetta and arm configs. I have not faced Docker issues except in the first few months, but maybe that's just our workflow, since we already had AWS Graviton instances.

[1] https://reactnative.dev/blog/2021/10/01/version-066#better-s...
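
For anyone stuck on a pre-0.66 version, a common stopgap (an assumption about your setup, not an official recipe) was to run the iOS tooling under Rosetta so CocoaPods resolves native deps as x86_64:

    # Install Rosetta 2 once, then run CocoaPods under it
    softwareupdate --install-rosetta --agree-to-license
    arch -x86_64 gem install ffi
    arch -x86_64 pod install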


I'm curious about the docker image situation.

I had thought that the M1 could natively emulate x86 instructions. So why then can you not run the native x86 docker images? Is it a virtualisation issue?


The M1 does not emulate x86 instructions (in hardware). Emulation is provided by Rosetta 2, a combined ahead-of-time and just-in-time emulation that can run x86-64 macOS binaries on Aarch64. However, there is still some hardware support for Rosetta 2 in the M1, e.g. to support strong memory ordering.

More details on Rosetta 2 internals can be found at https://ffri.github.io/ProjectChampollion/
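
As a quick way to see the translation layer in action, macOS exposes a sysctl that reports whether the current process is translated (system binaries are universal, so the arch command can launch either slice):

    sysctl -n sysctl.proc_translated                 # 0 = native arm64
    arch -x86_64 sysctl -n sysctl.proc_translated    # 1 = x86-64 under Rosetta 2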


It does, but it's slow, and then the disk emulation makes it slower. Yes, you can do it, but it's not much fun.

Improvements are on the way.


Apparently, it does work in most cases, but aarch64 images have better performance and are more stable.

https://docs.docker.com/desktop/mac/apple-silicon/
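
If an image you depend on has no arm64 variant, you can also pin the platform explicitly and let Docker Desktop fall back to qemu emulation (postgres is just an example image here):

    # Force the x86-64 variant (emulated via qemu, noticeably slower):
    docker run --platform linux/amd64 postgres:13

    # Prefer the native arm64 variant when the image publishes one:
    docker run --platform linux/arm64 postgres:13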


There is no hw emulation, it's all software.


Is there any reason Apple couldn't implement a magsafe type connector but using USB-C form factor? By that I mean, you could use any USB-C compatible cable in the same port (just without the magnetics)

It seems like they could have gotten the best of both worlds that way.

Am I taking crazy pills?


You can find these on amazon, for example:

https://www.amazon.com/Ansbell-Magnetic-Adapter-Transfer-Com...


USB-C is already a friction fit. Somehow adding magnets to it would just make it attach more strongly, which isn't the point of MagSafe. I feel like making a slightly loose USB-C cable but with magnets would be a bad user experience, and probably against the USB spec.

I don't really get what you're suggesting.


Surely an alcove could be created such that the usb c connector creates a firm fit once the magnets latch on.

e.g. magnets pull retractable flaps open such that the magsafe USB-C can be retracted easily. When using regular USB-C the flaps are locked in place, so works in that case too.

I'm sure it's possible, but perhaps not worth the effort. Not one of humanity's most difficult problems.


“Retractable flaps” sounds like it’s begging for a -gate suffix. Flapgate just rolls off the tongue. Apple doesn’t like moving parts, for obvious reasons.

I think Apple did exactly the right thing here. If I have my MacBook hooked up to my desk monitor, I’ll just use a single USB-C cable for power, video, mouse, and keyboard like I do today. That kind of desk setup isn’t a tripping hazard.

If I take my laptop to a coffee shop and need to charge it, I’ll use MagSafe because it’s more likely to be tripped over or yanked in that situation. What’s key is that I only needed power at the coffee shop. What else would be going over the cable in that kind of setting?


It's very difficult to justify the purchase, because I switched to a thin client for serious development long ago.

I guess I'll use the unrivaled performance per watt to drive Jetbrains.


> I guess I'll use the unrivaled performance per watt to drive Jetbrains.

So you're advocating for nuclear power?


You can still charge using USB-C


Which has me wondering. The MagSafe brick is 140W and current USB-C can only do 20V x 5A (100W). The new USB-C version can apparently do 240W - but will the USB-C ports on these laptops be able to pull 140W+?

edit: more digging says that the MagSafe will be using the USB-C PD 3.1 spec - which presumably means 240W potential, or at least as much as the brick. I only wonder because I have stupid issues like the Microsoft Surface Book 2 that tries to pull more power than the provided power brick can handle!

https://usb.org/sites/default/files/2021-05/USB%20PG%20USB%2...

edit 2: okay, while the brick will be PD 3.1, the Thunderbolt 4 ports (USB-C) only follow the older PD spec, so they will likely be limited to 100W.

https://www.theverge.com/2021/10/19/22734233/apple-140w-macb...
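
For reference, the PD fixed power levels are just voltage x current; the 28V level is presumably what a 140W brick negotiates (an inference from the spec, not something Apple documents):

    100W = 20V x 5A   (max under USB PD 2.0/3.0)
    140W = 28V x 5A   (one of the USB PD 3.1 EPR levels)
    240W = 48V x 5A   (max under USB PD 3.1 EPR)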


Yes, that is true. And I hope that would be the default (at least in Europe).

But the timing of this launch (when all new machines were already using USB-C), right alongside the European Commission's decision, is quite a coincidence. Which makes the feature feel a bit out of place, or just a good upselling opportunity for Apple.


Honestly, I’m not sure I see this. They made the charging cable USB C at one end, shipping a standard USB C PD adapter and even mentioning that a USB C to C cable can be used with the same adapter to charge the same laptop. I don’t think they could have done much more without compromising MagSafe, and this allows for the same portability of cables and adapters that people like about USB C PD.


Do you have more information about that Liquibase issue?


What’s the problem with building React Native apps?


I have been waiting on this announcement to buy a new Mac, but now that they are out, I don't think I will buy one.

The prices are 2x a Macbook Air, but the utility for me doesn't match. If the 14" was closer to the previous 13" MBP price of $1500 I would be ordering one now, but I will be getting a Macbook Air instead.

Note to Veterans: Apple gives a 10% Veterans discount on everything, including the refurbished store.


Isn't the actual MacBook Pro always like $1800 or more? It's been confusing since they were selling two different computers both called the MacBook Pro for so long, but the workhorse with dual fans was always this much, and that wouldn't even get you the RAM bumped up to 16 GB iirc.


In addition, the MBA's minimum power draw is much lower than the new MBPs'. The M1 chip has four high-efficiency cores, whereas the M1 Pro and Max have only two.

However, mind that the screen brightness is much lower at 400 nits vs. 1000 nits on the new MBPs. Hence, using the laptop in the sun might be less convenient in comparison.


you get +8 GB of RAM, +256 GB of SSD, a much faster CPU, and a 120 Hz display for the extra $1000


These things don't cost anywhere near $1000 combined. It's a "pro" tax with the standard Apple tax on top. That extra RAM costs, what, $50? The extra SSD space is maybe another $100? The display is the only wildcard here, and you'll have a hard time convincing me that's an $800+ component upgrade.

If you want the M1 chip and/or want to be in the Apple ecosystem the price is probably reasonable, but in comparison to the price points of previous-gen MacBook Pros stacked against competitors, this one seems overpriced.


>That extra RAM costs, what $50? The extra SSD space is maybe another $100?

I have had this told to me endlessly. And then I see the laptops these people buy, and it's the cheapest machine with the highest numbers on the sales page. The laptops are abysmal quality and fail on every single metric not directly listed on the spec sheet.

There is so much more to an SSD than capacity, and the same goes for every other component. Also take into consideration that these components are all on the same package, which makes them significantly faster and harder to produce than the average M.2 drive: the more components on the same chip, the more likely there will be errors, so the top-spec chips are the top of the production batch.


Much better mic, much better camera, much better speakers, much better battery. There’s a lot more in this 2k Pro compared to Air.


On top of the speakers and battery as mentioned, the extra RAM actually has better bandwidth (the M1 probably has 16x1, the M1 Pro 8x2). Still a hefty price tag, I agree. But not out of line with their previous "true" Pros (the 2-fan variety).


I agree that you get a lot, and that the new laptops are great. If I was going to be using this full time as my work computer I would get it, but as my personal/fun computer the upgrades don't make sense for me.

I can't wait for my work to upgrade me to one of these :)


Is the CPU actually faster on the entry level? Both M1 and the M1 Pro have 8 cores.


Different kinds of cores. M1 has 4 performance (high power) cores and 4 efficiency (low power) cores. M1 pro base SKU has 6 high power and 2 efficiency, higher SKUs have many more performance cores


Hard to justify $2k for 512 GB of storage. I think I will go the MacBook Air route too and just upgrade the storage. For the price difference I can get a Mavic Air 2S drone.


I'd love to have it, but OSHA requires that we don't work too much on the laptop itself. So I would need a keyboard, a large screen and a mouse…

I must not be the right audience. But would a videographer really edit videos on their laptop?


It’s a desktop replacement. Someone does all their work on it and sometimes takes it with them - it’s how I use mine.

And this can drive my four monitors so it’s a candidate for an upgrade.


Different priorities for different people.

13" and 14" are unusable for me. They're just too small.

I used to love that form factor when I was running through airports in a different country each month, which is why I'm typing this on an 11" Air. But in today's world, a bigger screen is more important.

This one has 16.2 inches, which is almost as big as the old 17-inch Mac laptops, so I've ordered one.

If I ever get back to traveling a lot, it'll be the 16" at the hotel, and then just take an iPad to the client. No need to drag a whole computer with me anymore.


It was kind of confusing the way Apple did it with the Intel 13" MacBook Pro. There was the "2 Thunderbolt 3 ports" model, which started at $1299, and the "4 Thunderbolt 3 ports" model, which started at $1799. Other than the ports, the 4-port model had an entirely different generation of CPU and the Touch Bar. When you compare it to the "4 Thunderbolt" model it is replacing, it is a slightly more palatable $200 increase. They are keeping the 13" M1 MacBook Pro available as an entry-level MacBook Pro.


It was obvious to me when Apple released the M1 13-inch Pro next to the Air that the 14-inch would land at the higher price point. That's why I immediately bought the Air. If I were buying a new MacBook today, I would be tempted by the 14-inch Pro, but I would probably still buy the M1 Air.

The air is still a very nice laptop. It is really fast, faster than the 9600K in my imac. And it has a beautiful screen with rich colors that I never use at max brightness indoors.


Same - I am quite glad I went for the Air


Exactly this. I use cloud VMs for heavy lifting. But if I used a laptop for even an hour every day, I would snap up this $2k machine.

I just wish they had a $1000 M1 laptop with ports.


Honestly, same. I was afraid I'd regret purchasing my M1 Air last year, but my needs for a personal laptop have gotten simpler as time goes on. I want a small, thin, and light laptop that's easy to travel with. I've got a big gaming PC if I need power.

My work will probably upgrade to these when it's time to refresh our work machines, but for my own personal use the M1 Air just hits all the sweet spots.


Well, the MacBook Air at $1k is still better than any MacBook Pro made before now.


I agree, with the caveat that the M1 Air is absolutely crazy value if you aren't constantly doing intensive tasks and don't mind paying less and living with dongles.


I really enjoy the way they admitted their design mistakes of the past by having the speaker mention that users love tactile function keys, and bringing them back. They tried their best to say it was a mistake without explicitly stating it. It's almost like they did real design and usability studies, and then acted on them!

I find it so surprising that in this day, a big tech company is actually listening to its pro user segment (at least more than before).

Fingers crossed for lightning port switching to USB-C, Touch ID making a return to mobile devices (does not have to replace Face ID although I wouldn't care if it did), and the screen notch going away over time.


It's laughable incompetence and hubris, honestly. If you read any material from the original UXers of Apple they would have said the same. "Fingers on glass" can never beat the tactility of having actual keys. The versatility it affords mobile devices in terms of gestures and such is inappropriate for professional devices where we need speed and accuracy.


> (at least more than before).

I think this happened because Jony Ive left Apple.


This is literally a dream laptop. Improved in every area and lacking in none I can think of right now. And the price is crazy cheap for this performance if you ask me.


It's not crazy cheap; a lot of people overvalue mobility when 90% of white-collar work is done at home or in an office.


Especially now, I'd assume more people are doing work at home _and_ at the office. Sure, you could get two desktops, probably for cheaper, but then you have the cost of time wasted switching contexts, messing up file sync, etc.

I personally work from home but go to a coffee shop as my “office” for a bit every day. Mobility is crucial to my, and many others’, workflows.


Or on the bench in the park, at the beach, on a boat, camping out of an RV. Remote work opened so many opportunities to work in awesome places all over the world, why wouldn't you take advantage of it?


Even when mobility just means from the office to the kitchen to the couch, that's a pretty big benefit.


Even when mobility just means flying to another state for a week, a laptop is still worth it. I would never switch to developing from a desktop now that laptops are so capable.


Cheap compared to what you get in similarly specced non-Apple hardware. This will blow the pants off a similarly priced ThinkPad.


I work 90% at home... but my job requires a computer 100% of the time.

If you make 120k, travel once per month, and you waste ~3 hours of productivity each time you can't take your desktop with you... you would have made up for the difference between a $2k desktop and $4k laptop just in wasted labor.


when you work on computers your whole life, it's nice to be able to get outside every once in a while


If they can (really) deliver desktop performance, then they are (really) cheap. There is nothing on the market that can deliver such performance that isn't a desktop. But the jury is still out on that one; I'd wait for the reviews.


There is one area where the M1 is lacking - software support. There are still areas (eg Docker) that don't work well. It's definitely coming but it isn't here yet.

If you work outside of those areas though, an Apple Arm CPU is amazing.


I have an m1 air and I’ve used it for angular and node web dev including docker. Docker support for me was rough back in march, but it is flawless now. There is no difference to using docker on my intel imac and my m1 air. The rest of the ecosytem has also caught up. Nvm and older node versions were a challenge for a while, but now it works well and I can install old intel builds of node without a problem and use nvm to switch to them.

I was worried about software compatibility on m1 and was glad to have my intel imac for those things it couldn’t do, but in practice it now does everything. Some things require a bit of googling, but that’s par for the course for developer tools. And on the upside, the m1 runs rings around the 9600K in my iMac and it does this without a fan.


This is what I'm afraid of, too. Maybe things are working well and seamless, but the last thing I want to do is have to worry about being the first dev on my team using non-X86, and pushing Arm versions of Docker images all over the place or having to modify my Dockerfiles to use Arm versions instead of X86.

One of the reasons Mac was so nice was that everything worked the same locally as it did on Linux prod machines, whereas even today there are subtle bugs with naive Java or Python code that doesn't deal with paths or line endings correctly. On big multi-team projects, there are always some junior devs on Windows that just don't get it right.

This seems like another problem waiting to happen as more devs start using Arm, while the majority of developers and environments still use X86.


I've been using x86_64 docker containers with the QEMU emulation just fine, and most open source available docker containers are ARM compatible these days.

I don't push containers from my laptop to anywhere; those are built on CI/CD... and we are building both x86_64 and ARM containers these days, as we are launching new workloads on Amazon's Graviton because it is more cost effective.

The eco-system is not just x86 anymore.
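
For what it's worth, a minimal sketch of such a CI build step (buildx with qemu; the image name and registry are placeholders):

    # One-time: create a builder that can target multiple platforms
    docker buildx create --name multiarch --use

    # Build amd64 + arm64 and push a single multi-arch manifest
    docker buildx build \
      --platform linux/amd64,linux/arm64 \
      -t registry.example.com/myapp:latest \
      --push .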


It isn't too bad in my experience, having been using an m1 Air since release.

For images with an Arm version available, and that's seemingly quite a lot, it mostly just works(TM). No Dockerfile changes needed.

If you have no other options, you can even run x64 stuff with a bit of low impact futzing. Docker Desktop will use QEMU to do this for you, nearly seamlessly. (though rather slowly)

The biggest gripe I have is the local volume perf, which is "tolerable." This is a problem for Intel Macs too, although to a lesser extent.

(edit for typo)


What issues with Docker do you have? I use it every day and have no issues


It’s a vm


It was also a vm on intel. It has always been a vm on any OS that’s not linux.


This is always going to be the case though. Linux is not macOS and there is no way to do containers on macOS like Linux does.


You also wouldn't want native Mac containers. The idea is to run everywhere. Bundling Linux in is an improvement for portability when production for these containers will be Linux.


The lack of Boot Camp is a mistake too imo, given the nature of enterprise software. I know people who are beholden to Autodesk software that doesn't even have a Mac version, where the vendor recommends Mac users install via Boot Camp - which you can't even do anymore, despite there being arm builds of Windows.


I fully expect Docker M1 support will match Intel shortly after this new macbook starts shipping


why? it's the exact same cpu arch they've had since the M1 was announced. Things are better but not 100%, and there's no reason to assume they will magically get fixed


Docker has worked almost flawlessly for me. I had a single container which did not run properly but I just rebuilt the image on my M1 mac and it worked fine.


I was surprised at the price point as well


Maybe I am missing something, but the price seems high.

Possibly this is reflected in my general consumer use case, but all of my high computation oriented use cases are offloaded to a desktop.

Where are you deriving the majority of the value for this offering?


In general, the people here will use the perf to run webpack faster.


webpack is usually fast enough, jest is where I need CPU & RAM.

If I were to write tests, that is.


Pro tip: if you can switch some tests from using JSDOM to Node as the test environment they will run significantly faster.
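
Concretely, that's a one-flag change - a sketch, and it only helps for tests that never touch the DOM:

    # Run the suite with the lighter Node environment instead of JSDOM
    npx jest --env=node

(Jest also accepts testEnvironment: 'node' in its config file, or a per-file @jest-environment docblock, if only some files qualify.)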


Ah, thanks for the tip. It will probably not change much for the project I currently work on, with tests using react-testing-library, so JSDOM is required in nearly all test files.

But at least now I know :D


>all of my high computation oriented use cases are offloaded to a desktop.

That's very unusual. If you want a laptop for the easy stuff you're usually looking at a MacBook Air or iPad in the Apple ecosystem. These laptops are focused on people who want to be able to do "everything" on a portable computer.


>These laptops are focused on people who want to be able to do "everything" on a portable computer.

At some point when you are "doing everything" you start to be tied to a physical location anyway. Multiple screens, hook ups to external equipment. So it just doesn't make much sense to me to actually need to do everything in a portable manner.

Edit: Apologies, I didn't address the part about the Mac OS ecosystem.


This is for people who require portability epsilon percent of the time.

If you never require portability then a desktop is sufficient.

If you frequently require portability then it's worth getting an air and building your workflow (if possible) around offloading heavy computation to the network, e.g. heavy use of build servers, SSH, syncing.

But many people spend most of their time at a desk, but sometimes require portability. For example in a meeting, visiting a client, working from home occasionally, etc. For these people a heavy duty laptop is the best solution.


> These laptops are focused on people who want to be able to do "everything" on a portable computer

Like 3D rendering? Or deep learning training?


I suspect there are people who want to do 3D rendering on a laptop, and these devices are probably a good option for them.

More broadly, that's why I put 'everything' in quotes. There will always be cutting edge workflows that require the power and cooling you can only get on a desktop or server, but that's a minority of a minority.

There are plenty of people who want to run multiple VMs, edit/render 4k+ videos, edit large audio projects, do semi-complex data analysis, or any of 100 other tasks without needing to choose power vs portability, or manage multiple devices.


I've always wondered who is buying these super-fast Apple laptops. Freelancers with that much cash to throw around on a laptop, who wouldn't be buying an iMac or Mac Pro instead for nearly as much? If you are doing this sort of stuff for work, you probably already have a server you connect to, or at least a beefy workstation under your desk for the heavy lifting. A lot of people in this space just use a cheap i7 Dell workstation from their employer, and that's that. Especially in the age of AWS there is no reason to run a lot of data analysis locally anymore when you can get compute power for seconds at a time. I feel like they must sell so few of the top-spec MacBooks.


Remember, stuff is “cheaper” for freelancers / businesses than consumers. For me, on $2k in self employment income, I have the option of either:

1) Giving a bit less than half of it to the government and getting nothing, or

2) Giving all of it to Apple and getting a MacBook.

Works out to an effective 35+% discount on every business expense.


I have always used an MBP for rendering - usually not for final renders. I'm not always connected to high-speed internet, so it is handy for work-in-progress renders, especially when you are not in the office.

And even if you are working on a model that is on the server, you still need good specs in order to work with it.


You can remote into the server and use that horsepower from anywhere, no?


Freelancers who work from home and an office or on the road. I edit photos and video and haven't had a desktop machine for almost 20 years.


I'm sure a few people are doing this but it seems like such a teensy market compared to freelancers who generally work in one place in front of a massive imac screen.


Why do you think that freelancers “generally work in front of a massive iMac screen?” That’s anecdotally very wrong, especially depending on what type of freelancer you’re talking about (the term alone is too broad to be useful).


> I've always wondered who is buying these super fast apple laptops?

Employers for their employees?


Usually they do a bulk order of the base model.


The previous "high end" 13" Macbook Pro started at $1800 so this is a $200 increase for larger higher resolution 120Hz screen, much faster performance, and all the ports they took out years ago.


>my high computation oriented use cases are offloaded to a desktop

That used to be pretty common. But I'll almost certainly use one of these to replace my ~2015-vintage iMac and MacBook Pro. Unless you want to keep the contents of a laptop and a desktop separate, or have very high-end multimedia workload requirements, it probably doesn't make a lot of sense to pay for and manage two separate devices at this point.


I don't have a desktop.

I think a lot of pros these days don't.


Yeah, I just scooped one up. Just a few hours after this announcement, shipping is no longer Oct 26 but instead Dec 10.

From what I can tell this is very high demand. Obviously we don't know enough about Apple's launch planning and inventory, but it seems like this only happens when there is very high demand.


It was at December 10th within 30 minutes of the keynote.


Maybe it's certain customizations that needed that much lead time, whereas a base model did not.


Strange definition of crazy cheap, especially since we don't know how well it compares yet.


they brought back the t-shaped arrows too!


That's been a thing for a couple of Intel generations now. I think only the 2016-2017 laptops had that horrible hodgepodge of arrow keys.


Yeah, I ordered two maxed-out Macs for my dad and me. Price is not an issue when you factor in opportunity cost.

Everything I need and more. Otherwise I would need to fork out $20k for a Mac Pro.


I can think of a huge area in which it's lacking: GPU performance.


Did you check what they were saying about the M1 Max? It is comparable to the fastest GPU in a notebook they could find while using 100W less energy. We'll have to wait for the reviews next week to see what that actually means, though.


Also, you can't compare GPUs _exactly_, since Apple ignores everything except their own Metal API.


If you care about GPU performance, you don't care about 100W of power.


If someone is buying a laptop they are typically going to care about what 100W savings means in a laptop: less heat and noise, longer battery life or less weight, longer component life, ...

Regardless of whether GPU speed is a top priority or not.


You can't really trust Apple's claims on GPU performance though; we'll have to see how that really pans out on actual benches.


Apple were making direct comparisons to the GE76 Raider 11UH-053 which has a RTX 3080 in it, indicating that it had the same performance for 100W less power consumption. That seems pretty competitive to me!


In what way is it competitive? Like, are there any big titles that I can fire up and have it perform like a desktop gaming PC?

Or is it merely competitive in specs or in applications like Illustrator?


They do not market these as gaming devices. The GPU is intended for rendering content for pro use cases. It is competitive if this is your use case.


Where are you seeing that? The only comparisons I've seen were labeled as being against Intel and AMD integrated graphics.


It’s in the presentation.


Everything on the OP page that talks about GPU perf has a footnote comparing speeds to Intel Iris or the Radeon Pro 5600 (which is about the same).


Yes, it's in the presentation - the video. Skip to the graphs. They have tiny footnotes giving the models they're comparing to; the highest one is a mobile 3080.


Really? The M1 is a bit slower than a 1650, so 4x that performance gets you solidly into 3000 series range. There might be a handful of PC laptops that can outpace the M1 Max, but none in remotely comparable a package.


There is no way a SoC is running in the same range as an Nvidia 3000-series GPU.

The GPU performance of my M1 Mac Mini is roughly comparable to that of my Pixel 5 and Quest 2--both of which are running the Qualcomm Adreno 650 GPU.


> performance of my M1 Mac Mini is roughly comparable to ... the Qualcomm Adreno 650 GPU

I find it hard to believe and benchmarks [0][1] don't seem to support that.

[0] https://www.notebookcheck.net/Qualcomm-Adreno-650-GPU-Benchm...

[1] https://www.notebookcheck.net/Apple-M1-GPU-GPU-Benchmarks-an...


If (big if) the graphs in Apple's M1 Pro & M1 Max article are to be believed, you're getting 90% of the performance of an RTX 3080 Mobile (a 3070 with a clock-speed cut) at a quarter of the power draw. This seems to be quite a development.


For all the other claims on GPU performance, they call out that they are comparing to the integrated graphics, and they name the GPU specifically. With this one claim, they name a computer, but not the GPU. "Compared to a computer that has a laptop 3080 in it" is not the same thing as "compared to a 3080."

And they don't define "performance". Is it fill rate? Is it TFLOPS? Is it memory bandwidth? The M1 Max has a 32-core GPU. That sounds like a lot! It's not! The Nvidia Laptop 3080 has 6,144 CUDA cores - nearly two hundred times as many. I'm sorry, but supposedly achieving 90% of the "performance" of a GPU that has 2 orders of magnitude more cores is just not in the realm of believable. They are being intentionally evasive here.


Wouldn't the 3080's core count you mentioned be more comparable to the 'Execution Unit' quoted on Apple's slides: 4096 EUs on the 32-core variant?

https://images.anandtech.com/doci/17016/2021-10-18%2019_21_1...


They very much put some numbers on performance in the keynote, you can go back and watch it.

The Max:

10.4 TFLOPS

327 Gtexels/s

164 Gpixels/s


What I see on the desktop 3080 is:

465 Gtexels/s

164.2 Gpixels/s

If those numbers from Apple are real -- that's bonkers.


It's refreshing to see them actually listening to customer feedback, I don't think this ever would've happened in a Jobs/Ive world. Would they have released "pro" focused M1 chips/laptops? Absolutely. Would they have gone back on the (IMO) trainwreck that was touchbar and removing everything but 2x USB-C ports? No way.

ALSO, I'm actually kind of shocked they're supporting charging by both USB-C and magsafe. That is 100% the right thing to do and 100% the opposite of what Apple would normally do (namely lock you in to having to buy magsafe-3 and magsafe-3 only to charge the laptop).


Obligatory reminder that Apple killed the miniDIN-8 and ADB ports on the original iMac before almost anyone had ever seen a USB cable in real life.

They sold the notion of charging off your monitor cable pretty hard; there's no way they could go back on that.


And to be honest, I use my laptop for home and office work. It's very nice to be able to just (un)plug one cable and have everything (charging, 4K100 video, keyboard, mouse, webcam) working.

If Apple would have killed charging via USB-C PD, it’d have been a big regression for me.


Happy to see they added Magsafe but ALSO still support charging via the USB3 ports.


Controversial opinion, but I think adding Magsafe back (along with the other ports) was a step backward for the industry.

Apple pushed the whole industry to standardize on one connector for everything. If anyone really needed those other ports, a small $30 portable hub would add them all back.

Magsafe 3 is just another connector that will add more e-waste and proprietary cables to the mix. It's nice that they kept the USB-C charging, but now non-2021+ Macbook users can't reuse the charging cables.


Agreed. I'm a fan of function over form, but Apple standardizing on a single port pushed a lot of accessories and other manufacturers towards USB-C. I use my (USB-C) mac charger for my phone all the time, simply because it's there and the cable is longer.

I didn't really find Magsafe that useful anyway, as I'm pretty much always hooked up to a monitor via USB-C (which charges as well).

I guess it's nice for those that miss it, but we were starting to finally get to a world with a one-size-fits-all cable. Remember back in the day when old-style cell phones/laptops all had proprietary chargers?

But it seems the momentum for USB-C is there, so hopefully consolidation continues. Anyway, the EU and maybe others seem to be moving towards legislation to consolidate I/O types anyway.

I'm guessing the HDMI is partially due to bandwidth limitations of USB-C, though I forget the specifics.


I would need to look it up, and of course it depends on the version, but afaik HDMI usually lags behind DisplayPort in terms of bandwidth and features.

The SD card slot I can understand; with MagSafe I feel the same as you, in that I didn't really care for it. But HDMI... they might as well have put a VGA port on there haha


I guess it's for TVs and conference room projectors, which never support DP, and thus it's very much a real 'pro' feature for office work.

So, SD for pro photo/video (and they are very right about full-size SDs here), HDMI for pro office, USB3 for everything else, seems legit.


> I didn't really find Magsafe that useful anyway

It's like insurance: you don't need it for the most part, but when luck is not on your side, that is when you need it the most.


It is function over form though. The number of times I thought I'd plugged in the charger and hadn't done it properly, and the number of times I've tripped on the cable while working on the couch, alone make it worth it. The simple indicator light on the MagSafe makes it so much easier to see and confirm that it's charging or fully charged.


It almost never gets commented on, but you can sneeze on the USB-C ports on the MacBook Pro and they will unseat enough to stop charging. It's a shame the standard didn't get notches like the Lightning cable, which can easily hold the weight of a dangling iPhone Pro Max.


> I didn't really find Magsafe that useful anyway, as I'm pretty much always hooked up to a monitor via USB-C (which charges as well).

I'm the complete opposite. I have to buy USB-C cables solely for the MacBook Pro, since nothing else I have uses it. I have plenty of HDMI and DP cables but literally only use USB-C on my laptop because I was forced to. Reverting back is a welcome change. USB-C only works as the sole standard if you're only buying relatively new tech products; otherwise it's the XKCD meme about an 11th standard.

My TV doesn't have USB-C, my monitors don't have USB-C, my headphones and microphone still use a jack, my phone has no USB-C, etc…


Yeah, but mac standardization was pushing all of those devices towards USB-C. I'm sure we'll get there eventually anyway, but it's a nice push towards standardization.

It does create more friction in the short term though.

I don't see any reason we shouldn't move towards a single standard for all I/O, whether it's USB-C or otherwise. Maybe there are technical limitations to supporting so many things in one spec; I'm not that deep into the weeds.

Otherwise it's just incredibly wasteful to have all these different connectors, and it's anti-consumer. I mean, I used to have a bag with 100 different cord types I'd have to rummage through every once in a while to find what I needed. That shouldn't have to be a thing.

Fortunately we're down to like 4-5 standards now...


The e-waste from chargers is a non-issue. Our oceans aren't filling up with dongles, nor are our landfills.


Also the whole point of magsafe was to save laptops from smashing when someone trips on the cable. They likely reduce ewaste in total.


> Controversial opinion, but I think adding Magsafe back (along with the other ports) was a step backward for the industry.

It's too useful not to add back. Every laptop should have something like this; it prevents accidents and damage.


And that tiny little light is so very useful when you're plugging it in just to charge it...


I agree with this. The mag connection should have been at the power brick and keep the USB-C on the device.


That doesn't prevent the kind of accidents that a magnetic connector on the machine does.


You can still use your old type-c cable, I don't see any issues with it.


Reusing charging cables is only a big deal in online arguments.


At least it is a USB-C terminated on the charger end. That means you can reuse existing USB-PD chargers while only needing to replace the cable.


>Apple pushed the whole industry to standardize on one connector for everything.

Because people actually want one cable for everything.

USB-C only guarantees 60W charging with a basic cable. Instead of telling users to find a USB-C PD cable with Thunderbolt 4 support and this or that logo on it, Apple now tells its users: if you want maximum speed and charging, please use MagSafe.


> Because people actually want one cable for everything.

The problem is that the "industry" gave us one port for everything while giving us a confusing mess of mixed capability cables that use that port.


> USB-C only guarantee 60W charging.

The original PD 1.0 spec went up to 100W (20V/5A). The newest PD spec (3.1) can do up to 240W. And I think Apple's MagSafe cable terminates in a male USB type-C, so it still uses PD internally at least.


More than 3A on PD requires tagged cables; I'm not even sure I have any 5A-capable cables here. I do have tagged cables, as tagging is required for USB3 over USB-C, which is negotiated with the aid of PD. MagSafe does make it somewhat simpler for the end user, and I never really liked that the main cable for a portable could be so easily damaged. I liked MagSafe and the magnetic connector on the Surface laptops quite a bit.


Precisely. Not all USB-C cables are PD cables. To tell the story again:

I was trying to help a lawyer out and trying to explain why the $5.00 USB-C cable he'd bought from Amazon wasn't delivering 4K video to his expensive monitor AND powering his laptop too.

Me: OK: so its a USB-C cable, but its not a high data rate USB-C cable.

Him: But, its a USB-C Cable.

Me: but, no, not all USB-C cables are high speed cables. And some of them can't do high speed and power delivery

Him: but... its a USB-C cable: it plugs into the port.

Me: Um... just because it plugs in, doesn't mean its going to work. You can have USB-C cables that are actually slower than the old USB ports.

Him: but.... shouldn't it just work?

And so on. For... 15 more minutes? Maybe 30? I finally got him to buy a "proper" Belkin USB-C cable, bought from a company that shall remain anonymous. But let's just say that a "refurbished" cable was shipped, which, surprise, surprise, didn't work for ANYTHING. This basically sums up everything that is wrong with Tech Thinking vs. User Thinking.


It's tricky because the good part of the port is that you can plug anything into it: Thunderbolt, USB, etc. But the bad part is that you can plug anything into it and it's hard to tell why it might not be working as expected.


Finally this dongle hell is over. It was so cringeworthy to see all the Apple users carrying around a bag of dongles. Reminded me of the old LAN party cable fun…


Who was carrying around a bag of dongles? Any sane person would just get a tiny hub that fits in your pocket and has USB-A, HDMI, DP, SD, and whatever else you need.

Which is honestly a much better solution than adding SD and HDMI back to the Macbook. I know it's a controversial opinion around these parts, but I never understood the dongle-gate fiasco. A $30 portable hub will add those parts back to anyone that needs them. I sure as hell don't.


It starts like that, then some ports fail on that $30 hub, and then you are buying more junk you wouldn't have needed in the first place if Apple had more sense.


I don't know. I've had my MacBook Pro since 2016 along with its HDMI/USB-A dongle, and while the MacBook screen failed on me (twice!), I still have the same dongle.

But to be honest I don't need it suuuper often. Just for my USB-A microphone and the occasional projector somewhere. I live in the USB-C future Apple is turning away from haha


USB A. Ethernet. Dongles are here to stay.


I can't remember the last time I've needed USB-A when not at my desk at home, except in the weird cases where I left my Yubikey 5C on my other laptop and needed to plug in my Yubikey A. Generally, when I actually need USB-A, I need a _lot_ of it, so hubs/dongles are a necessity regardless.

And if I'm going to use a hub, I might as well get HDMI, Ethernet, charging, etc. out of it.

The addition of HDMI has eliminated at least 90% of the cases where I've needed to take a dongle somewhere with me.


I need to use usb-a every day to charge my 2020 iphone with the cable that came with it


The iPhone 12? You must have gotten a bad shipment because they’re supposed to include a USB-C to lightning cable.


Second gen iPhone SE.


I bought a USB-C Lightning cable for my older iPhone X, but went to magnetic wireless charging after, so I hardly ever use it.


I have a wireless charger, but I will opt for the cable most of the time since it's so much faster.


It's a trade off: wired is faster at charging, wireless is faster at setting up the charging session. If you are going to bed anyways, wireless wins since you just plop the phone down and it is charged when you wake up in the morning.


Even then I'll take wired charging. It takes all of one second to plug a Lightning cable into the phone. I've plopped my phone down on my charging pad on the other hand, and walked back to find that I had just missed my mark, and the phone was sitting there for hours not charging at all. That happens pretty often, and now I need to spend an extra second to confirm that the phone is getting charge while on the pad, vs. Lightning, where it's obvious and just works.


> I've plopped my phone down on my charging pad on the other hand, and walked back to find that I had just missed my mark, and the phone was sitting there for hours not charging at all.

Yes, this is why Apple is including magnetic guided wireless charging in its new iPhones. Lacking that, I had to carefully choose my charging pad to avoid these kinds of problems.

Fishing around for the Lightning cord and then trying to line it up with the charging port on the phone is annoying, especially at 5 AM when I'm half awake. Not sure this would be a win for everyone, but it is for me.


Mouse, external keyboard, headset... They all use USB A only.


If they were carrying around a bag full of dongles they'll still need most of them, as this just adds an HDMI port.


And SD card, for photographers and 3D printers


This was my biggest concern. Having the option to use MagSafe is much better than always being required to carry around yet another charger.


Same here. I want to use MagSafe at home and the office, not carry around a bulky bespoke charger just for one device while traveling.


From what I can tell, the charging brick is the same either way. The new MagSafe is just a different USB-C cable that has the MagSafe connector on the end.


Ah, awesome, that was actually something I was hoping they would do (and probably suggested once or twice in some HN threads). If I can just plug the MagSafe cable into any old USB brick, that's way more convenient.


This was my biggest concern too, that my £300 dock would be rendered instantly obsolete. It’s nice that I could still plug these new pros into a dock with a single usb-c cable and have charging, multiple displays and usb connectivity all with one cable.


EDIT: Never mind


MagSafe cable does indeed ship standard - see https://www.apple.com/macbook-pro-14-and-16/specs/.


The MagSafe Cable + relevant USB-C Adapter is present in the box, per the site.


It’s listed as “in the box” on the product page.


It is shipped with a USB-C-to-MagSafe cable in every configuration I see.


For me personally, they hit it out of the park -- except for the notch, which seems like a complete showstopper to me at first sight. Maybe it's not so bad in practice if games and full-screen videos keep that whole strip solid black? But almost all the screenshots are of fullscreen apps, so it seems Apple thinks it looks a bit ungainly too.


I can live with the notch, but honestly I would pay more for an option to remove it along with the camera.

For me, it really depends on how this camera compares with a standard Logitech C920. If it's as good or better then I'm fine with it, but if it's just a minor upgrade over the terrible pre-existing camera then I'm not sure why they would mar an otherwise fantastic product.


Apparently the camera in last year's M1 MacBook was already a huge upgrade over previous ones, and the new MacBook Pros have a higher-quality 1080p sensor. Both leverage some of the special IP blocks on the chip to make the image even better, so it should almost certainly do better than the C920 (especially around audio capture as well, if you use that).


The M1 cameras are not a huge upgrade, they are the same as the ones macbooks have had for years now; they just now make use of the M1's image processing hardware, and even then the improvement is nothing to write home about.

I love my M1 Air but this is one area where it doesn't really shine.

If the 1080p camera is anywhere near the one in the new iMac, it should be better than anything you need.


Interesting. I can't speak to the Pro, but my M1 MBA camera didn't* seem any different from the one in my 2013 MBP. If this camera is actually better than the C920, that's pretty cool.

Does anyone know how an iPad Pro camera compares? I'm guessing that'll be a decent indicator of what the new MacBook hardware is like.

*: I use the past tense because the camera lasted about a month before it randomly died. Now the camera light is always on (or at least it was before I taped over it) and the OS no longer recognizes it. (None of which I really mind, since I was planning to tape over it either way.)


Same. I finally broke and bought a nice webcam. I'd rather just not have one on the laptop than a notch!


I wonder how this will work with Apple Calendar or Safari in full screen (with the new iOS 15 compact tabs). I wonder if they are moving to a concept of full screen where the top bar is always visible.


From the marketing images on the website, it looks like fullscreen apps will just have a black bar across the top. Kind of frustrating! The inaccessibility of the top menu bar is the main reason I never use the native full-screen mode on macOS. Since the menu bar is now within the extra added space on top, I hope we eventually get the option to leave it visible.


The new version of macOS allows you to show the menu bar at all times, even if the app is in full screen:

> Full-screen menu bar. You have the option to display the menu bar at all times in full screen so you can easily view the app menu and other glanceable information anytime.

Source: last feature in https://www.apple.com/macos/monterey/features/


Oh, nice! Thanks for pointing that out. Seems a little strange they didn't use that in the marketing shots to show off the extra screen real estate.


If for many apps the top part of the screen is just going to be opaque black with no content, then that might push me towards a kind of full screen where the top menu bar is always available.


It'd be interesting to see a design like Safari 14 with the address bar to the left of the notch and buttons on the right.


Honest question, did you ever use an iPhone with a notch?

In my experience, it kinda disappears within minutes. I barely notice it’s there.


The notch gives us smaller bezels / more screen real-estate. It _kind of_ makes sense.


> smaller bezels

Not really, no. The Dell XPS for instance has just as small bezels but without a notch.


The Dell XPS has, hands down, no joking, the absolute most dogshit webcam I've ever seen. Maybe they need a notch.


And for the longest time they put it in the bottom left corner for a good view up your nose. They also seem to mount the mic inside the fan bearings.


I’d heard about the dreaded bottom camera.

I bought an XPS for my fiancée a few weeks ago. I’d heard the XPS camera wasn’t great, but it turned out far worse than I could have feared. They couldn’t have made a worse camera if they tried. She needs a decent webcam for doing teletherapy, so we had to send it back.

Unfortunately, I’ve not been able to find a laptop that has a good webcam. We’ll probably just get her a MacBook Air, which is at least serviceable in that regard. (I’d get her a 14 inch MBP but she won’t let me spend that much on a laptop.)


Having used many Dell XPS laptops with the webcam in different locations, I can say they're all terrible. If a notch is the compromise needed for a thinner bezel while retaining good webcam quality, that's fine with me. You don't lose any screen space after all. It just means the menu bar (which is a real waste of screen space) gets moved up and your "desktop area" is the true rectangular size of the display.

Sure it looks odd but I can see it working very well due to macOS's menubar UI design. On Windows it wouldn't work nearly as well. Interestingly it would work very well on GNOME (with the clock moved to the right hand side) as well!


720p webcam though, just like the old MacBooks. I don't really know how thick the screen is on either.


I suppose it does boil down to how much they improved the webcam. If it's still crap like the old one, why bother with the notch? Just slap a 720p webcam on there in the bezel and ditch the notch.

Sidenote: I think Apple could make a killing selling a $200 4k webcam. The market doesn't have many good options now, and I'm sure many Mac users would buy one.


The original Apple iSight camera (640x480 resolution Firewire cam) was really nice for 2003. It retailed for US$149.

https://en.wikipedia.org/wiki/ISight


I imagine most games and videos can be run in 16 by 9 aspect ratio and that'll draw black bars above and below thus hiding the notch anyway.


From my understanding the screen is 16:10 plus the extra space on both sides of the notch.


I hadn't even thought of gaming, but most games I play tend to use the entire screen. Though maybe with a ratio above 16:10 that will be different?

Games have never shown great support for Apple's weird decisions, so I guess I would expect some issues there, unfortunately.


Heh, if push comes to shove, I reckon games will have to add support for `safe-area-inset-top`
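
(That's the CSS env() variable from the iPhone notch. For native games, my guess - purely a guess - is the AppKit equivalent will be NSScreen's safeAreaInsets, which is real macOS 12 API; the viewport math below is just an illustrative, untested sketch:)

    import AppKit

    // NSScreen.main and safeAreaInsets are real AppKit API (macOS 12+);
    // the "keep the UI out of the notch strip" math is made up.
    if let screen = NSScreen.main {
        let insets = screen.safeAreaInsets       // .top > 0 on notched panels
        var content = screen.frame
        content.size.height -= insets.top        // render UI below the notch
        print("notch-safe content rect:", content)
    }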


Complete showstopper? A notch?


It doesn’t seem to bother phone users that much.


And so ends the last of the Dark Days of Jony Ive. Don't get me wrong: he innovated a lot. My theory is that when Jobs died, he lost his counterbalance and his designs suddenly became uncompromising (no, that isn't a good thing).

It's when we saw the 12" Macbook as the crusade for thinness at all costs (terrible performance, only one port, a terrible version of the macbook Air), the end of the Macbook Air (before getting resurrected a few years later), the butterfly keyboard (allegedly to save 0.5mm in thickness), the Touch Bar (primarily there to boost Average Selling Price) and the loss of MagSafe.

I was surprised last year how good the M1 was. The second generation looks even better. This thing has function keys back, no Touch Bar, MagSafe!!!, some non-TB ports and up to 64GB of RAM.

Shut up and take my money!


The 12” MacBook was beloved by my elderly parents, who liked the low weight design and didn’t need to do anything other than check email and the web. Horses for courses. It was a great machine for the right audience.


My girlfriend also loved her 11" Macbook Air and was disappointed that the smallest option is now a 13" Air.


The 13 inch m1 air is barely bigger than the 11 inch thanks to smaller bezels. I have both, they don’t feel much different in practice.


Genuinely curious: what could you do on this that you couldn't do on iPad?

I am perhaps biased on this issue because I'm still mad about what happened with the Macbook Air (13" in particular). I bought one in 2011 and it was an amazing laptop. It was a sweet spot of compromise on power and portability.

But the best feature was the price: there is a world of difference between a $1300 laptop and a $3500 laptop. Both are reasonable sums of money but the first is infinitely more "replaceable". You don't feel as bad about losing it or having it break.

What came after was much more expensive, with features literally nobody wanted (e.g. loss of ports, loss of MagSafe, a significantly worse keyboard, more expensive repairs, the Touch Bar). The 13" MBA languished for years, needing just a display upgrade and nothing more.

I knew it was the nail in the coffin when the 12" Macbook came out because Apple wouldn't have 11", 12" and 13" SKUs (they also had the 13" MBP by this point) and I was right.

So for me the 12" MacBook was everything bad about the new MacBooks, with worse performance and a higher price. I didn't (and still don't) really understand whose problem the 12" MacBook solves.


And the thing is, I don’t mind they put a weird, shallow keyboard and one USB C port on that 12” MacBook. They were trying to make the lightest laptop they could. Those compromises made sense.

The same compromises made no sense on high-end pro machines.


The 12" wasn't for everyone, but I know 2 people who had one and they both thought they were just fantastic.


I have the first (2015) version of the 12”, it’s still my main laptop. It’s powerful enough for light coding and music production which is all I need. I love the form factor and if they brought a new one out with an M1 chip I’d buy it in a flash.


The 12” was a truly awesome machine, and it would’ve been even more great with the M1.


I've been extending the life of my MacBook Pro 2012 (13") all throughout those Dark Days (with 8GB RAM from the start and an SSD added later), as a software dev using it day in, day out. It's really at its end now: it runs only on AC, and the only reason I can still run (yet not debug) the apps I build is a bit of a hack.

I skipped the crappy keyboard, and loud, hot running versions. The M1 was a bit new and low on mem for me (wrt the future), but I just ordered the M1 Pro 32GB.

The only hesitation I had concerns ARM support. It sure is getting better and better, but if they had added one more superb Intel model right before introducing the M1s, I'd have gone for that. But that might have hampered M1 adoption, so I can see why they didn't.


I've had a MacBook Pro M1 since last year. With Rosetta and the added performance you will not notice any issues running any normal Intel app. Most apps now even have a universal version. Only a very few edge-case apps are unavailable. So I am happy and really love the power and longevity that the M1 brings.

You can always check if apps work normally through: https://isapplesiliconready.com/


I think the 12" MacBook was a great design. The ultimate compact laptop - I even would have bought one if the Apple Store had had one in stock that day. But it didn't work out technically. Maybe it would have, if the Intel 10nm process had been right on schedule. With the 14nm parts, it was just too much of a compromise. With the M2, Apple might consider bringing back that form factor; that should work nicely. And move the Air to 14", tiny bezels. That would be the outer form factor of today's 13".


Who do you think is responsible for the new iMacs?

They took a very versatile all-in-one, removed almost all of the ports and added an external power brick -- just to make it a bit thinner. Which is even more ridiculous considering it's a desktop.


The iMacs have almost always been just laptop components in a display case (I'm obviously excluding the iMac Pro). I think getting rid of the USB-A ports is fine, and the SD card slot was always in a weird, hard to reach spot.


Ok, these I find a little weird, honestly.

I guess they fill a niche for some? My best guess is that these were a compromise with what the M1 could then do. It's kind of the same reason we didn't have M1 MBPs last year. That's just a guess though.

I hope to see less-compromised M1 iMacs in the future though.


I love my 12" MacBook and will be sad when I have to replace it. Its single port is not a big problem for me, and I really appreciate how light it is. If I replaced it with an M1 MacBook Air I'd be bothered by how much heavier it is.

If they had made an M1 version of the Retina MacBook, there would've been no complaints about the performance.


> It's when we saw the 12" Macbook as the crusade for thinness at all costs (terrible performance, only one port, a terrible version of the macbook Air)

Honestly, I'm kind of surprised they haven't brought this back with an M1 in it. It was mostly terrible due to compromises forced by the terrible chip it used; an M1 would be within its power envelope.


I liked the 12". It was an iPad that runs macOS. Rob Pike (golang) once said: "my two-year-old 11" MacBook Air is the only piece of computing hardware to make me happy since I can't remember when." (https://usesthis.com/interviews/rob.pike/)


It's great Apple is changing course back to where they were in 2013. What sucks is that all those laptops built between 2013 and 2022 will be dogs on the used market.

For my money, I'm getting the Framework laptop and I'm going to bite the bullet and run (in order of preference) FreeBSD, NixOS, or some Linux (Ubuntu, probably). I'm tired of Apple's shit.


> FreeBSD, NixOS, or some Linux (Ubuntu, probably). I'm tired of Apple's shit.

You're tired of Apple's "shit", so you want to build a laptop on top of hardware that cannot be taken to a store and replaced the same day (Apple) and install an OS like FreeBSD which is going to be rock solid but supported by virtually no one?

I'm all up for FreeBSD and Linux - love them and have run them both on laptops and desktops for years - but given I want to get work done and collaborate with other people, they're not good options.

Personally I think you'll build a great machine that will blow the socks off of a MBP, but you'll be battling with interoperability with other people and businesses; and you'll be battling the OS itself to keep it working as a slight driver change breaks everything.

And that doesn't even mention the hardware issue: what happens when something pops?

Just some thoughts to consider.


Considering the framework laptop is the most user repairable laptop ever released, your comments are pretty confusing.

https://frame.work/marketplace


I'm not convinced I'm going to be able to swap a USB-C port for a DisplayPort one and have FreeBSD just go, "OK! I can handle that!" It's more likely it's going to laugh in my face and simply not do anything at all. Worst case: it'll crash.

And even after shutting down and swapping the components out, I'm not convinced drivers won't be an issue.

Do you have more information on this?


There is nothing fancy or proprietary about this; it's a USB-C to DisplayPort adaptor that is designed to fit flush into the body. You can use the same adaptor on a MacBook if you wanted to, and you certainly don't need to shut down the machine to do so.

https://frame.work/products/displayport-expansion-card


https://jcs.org/2021/08/06/framework

> 2021-08-09: I added Tiger Lake LP support to pchgpio, enabling GPIO interrupt support for ihidev, which was needed to make the touchpad work properly in non-polling mode (and apparently not cause the EC to randomly freak out and power down the system).

> 2021-08-31: Closing the laptop’s lid does not currently cause an S3 suspend due to a bug in the DSDT of the Framework Laptop which references an invalid EC device in the ECDT. I’ve been told by Framework that this will be fixed in a future BIOS update.

> 2021-09-21: A fix has been committed for the touchpad not working after S3 resume.

Let's get the basics working first before we call Apple's products "shit".


Well, like so many people, I've not given much thought to what I'll do if my laptop "pops", because I've had this one for 8 years now, and it's been pretty solid all that time. Presumably if it blew up tomorrow I'd use my phone to pick up a used laptop of some kind and restore from backup. Right now that's an MBPr. In future I hope my FreeBSD Framework. As for sharing, I believe I can run what I need to be productive under FreeBSD. But time will tell. (It would be nice if a FreeBSD guru would volunteer to baby the Framework crowd with a guide. I would click the paypal donate button.)


I think you'll be fine, but I just wanted to point out the possibility that there could be interopability problems with other people, especially if that machine is something you take on site with you to clients, etc. That might not be a thing for you, though.

Would love to see how you go because, and this is where I get honest: I want to move to FreeBSD on the desktop and basically just use a browser and VSCode with Remote-SSH. That's what I do now on a $3,000 MBP. It's something I can do, technically, on a $300 Chromebook.


The touchbar Macbook Pros were released 2016 rather than 2013.


The 2013-2022 machines will probably still be in demand from anybody camping out on Mojave to keep 32-bit support or with Intel-related needs, though that demand profile will of course change over time.

Framework + Linux sure seems compelling, though. I'm definitely tempted to try it.


Some people will still need an Intel machine or a Boot Camp machine for software compatibility reasons.


> all those laptops built between 2013 and 2022 will be dogs on the used market

In what sense? I'd bet they still sell for more than similar Windows laptops from those years.


I assume they mean by comparison to what they would have gone for in an alternate timeline where Apple hadn't bungled the design in the first place.

They could have released another five years of evolutionary updates to the MBPr line instead, and then everything released between 2015 and the first half of 2020 would have looked less obsolete by comparison. i.e. There would be more demand in the used market for those machines if they were more cosmetically similar, even if lacking Apple Silicon and mini LED screens.


> What sucks is that all those laptops built between 2013 and 2022 will be dogs on the used market.

Prices will fall, and they have been falling, but they're not going to crater. The high-end models will sell to people that have workloads that are Intel-only. The cheaper models will sell to people who want to pay less than the price of a new Mac, or who don't understand what they're buying and think they're getting a good deal on a 3-year-old pro Mac priced above the M1 Air.


what happened in 2013?


When I interviewed at Apple years ago there was a poster on the wall with a picture of a MacBook with a MagSafe charger connected. A kid was walking by the desk the laptop was sitting on and was about to cross the charging cable. Without MagSafe, the laptop could've fallen on the kid's head.

The writing on the poster said: Come work with people who invented MagSafe to save children's lives. (or something similar)

I really liked that poster. Never worked at Apple but I still remember that moment.

MagSafe is great! What a shame they took it away for a few years!


In another commercial, a child walks near the edge of a cliff, loses balance and grabs the one thing in his reach: a laptop charger cable. Unfortunately, it's MagSafe, and the child falls into the abyss ...


Because if it hadn't been magsafe, the child would be tethered to… a laptop? Presumably bolted on to whatever surface it was sitting on?


His dad was carrying the laptop.


And also his dad is Jony Ive


This really happened, I read about it on LinkedIn.


This really happened, I was actually the laptop in this story


Thank you, guys :))


I won't be getting one, because the M1 Air is currently more than enough for comfortable webdev, but I'm genuinely happy for people who will be getting one, because those things are incredible!


My first hope is that the Pro line leads to more ubiquitous ARM support for professional software tools.

I'm lucky that most of my WebDev workflow works fine on M1, but there are those rare instances I bump into something that doesn't work quite right. And unfortunately, these instances tend to be pretty frustrating.

My second hope is that more ubiquitous ARM support lights a fire under other CPU designers to build better high-performance ARM chips for other desktop operating systems. I've been interested in switching to Linux for some time, but the hardware/performance of my M1 Air simply has no real competition at the moment.

An ARM-powered Framework laptop running Arch is my dream.


Are there any recent examples of web development things that haven't worked right on the M1? I'm looking to upgrade but am worried about this.


Node 14 won't compile natively on arm64 right now. Though I do understand it's installable with Rosetta.


This is me too! I’m content with the M1 chip for mobile and ML development. I know this machine of mine will last a few years before it needs an upgrade!


Don’t worry, webdevs around the world will do their best to hasten it as much as possible!


I think every React component should be a docker vm.


Same, I have the 13" M1 Pro and I don't regret it.

I don't.

I don't regret it...


So true. Maybe the refreshed Air in a few months will have more ports and be an even better value (relatively speaking).


Yes. Agreed.

Most people do not need the new notebooks. They would be content with the Air and the 13-inch Pro.


Everything they’ve said about the new MacBook Pros is extremely promising, but they had to add a notch to the screen. Just why? All for the sake of reducing the top screen border by a couple of millimeters.


Try this: Instead of thinking "Boo, they took away part of the screen with a notch" think "Yay! They extended the screen a few mm on either side of the camera"

If you want to join us in the cult of Mac, learning little contortions like that will make you a lot happier.


I remember naively asking on the Mac forums once, whether there was any way to have the laptop lid closed in Mac OS without putting the laptop to sleep.

Of course not, I was told. Apple's laptops use a superior thermal design, and the laptop could be damaged if the lid is closed while it's left running. Who would want to do that anyway? Better to have good cooling.

I didn't even bother to reply and mention the fact that my Windows install on the same laptop allowed it without complaint.


For whatever it's worth, I've run my MacBook Pros in "clamshell" mode (i.e., lid closed, connected to an external monitor and keyboard) all the time -- I did it with every work Mac laptop I've had and with my personal one back when I was using it as a desktop. I know there's lots of "never do this with a MacBook, it will overheat!" advice out there, but it was just never an issue for me, and this was across enough different devices and generations that I don't think I was just consistently lucky. Are you describing something else, e.g., closing the laptop lid without connecting an external display and preventing it from going to sleep?

I never had an issue with that when I was using Windows work laptops, either, except that I recall both ThinkPads that I had seemed to have about a 50% chance of a full kernel-panic-style crash when I unplugged the external display. This was back when ThinkPad was an IBM brand, though, and IIRC I was running Windows XP on both.


This was back around 2012, and it looks like not sleeping when the lid is closed is officially an option in the power settings now, so it sounds like things have changed. Although I don't think overheating was ever a real problem in the first place. With my 2012 MacBook Pro when the screen is closed, the air from the vents can still escape out behind the screen.


I remember clamshell mode working in 2006. Didn’t use it much though, was afraid it would fry my Macbook Pro. You could boil an egg on the first Intel Macbook Pros.


What? Of course you can run the laptop with the lid closed, it’s called clamshell mode.


This was in around 2012, maybe attitudes have changed. This isn't my thread, but shows what responses to the question were like at the time: https://discussions.apple.com/thread/2805582


You only have to go a few responses into that thread to get an explanation that the action of closing the lid will trigger sleep, but an external keyboard and mouse will wake it back up, allowing for the machine to be used while closed. You may not have originally intended to make the distinction between having the lid closed vs. the process of closing the lid, but it's clearly the former you were worried about, and that use case has always been supported.


Mac laptops have supported clamshell since at least 2000.


But it takes up menu bar space, which is already scarce even without the notch. Some people may have many app icons and "widgets" (e.g. iStat Menus) on the menu bar.


Agreed on this point.


Why not?

I hated the idea when it came around for the iPhone - but actually using one showed me that it just isn't an issue. There's a status bar at the top anyhow, and the center of it is unused.

On a laptop, I see it as just extra pixels dedicated to the OS's status bar.


Update:

>macOS Hides the Notch on New MacBook Pro in Full-Screen Mode

https://www.macrumors.com/2021/10/18/macos-hides-notch-on-ne...

I take back what I said about the notch. It is not as bad as it initially seemed to be.


The notch occupies otherwise wasted space in the menu bar - how is that not purely positive?


Well, unless you’re only ever using text edit, the menubar is there for er, menus? It’s going to be interesting to see the gymnastics around this


There was at least one shot of it -- it'll be a gap in the menu, with top-level menus flowing around the notch if necessary.


It’s not really clear what your point is. There is plenty of room in the menu bar for the camera to be in the middle.


Is there? Photoshop has a bunch of menus, not sure what they're planning to do. See this screenshot for example:

https://i.imgur.com/hHEl3cN.png


Menus are positioned by the OS, presumably they won’t collide with the notch. There is no problem.


I did notice Xcode used up all the space before the notch:

https://imgur.com/a/Qlxr7t2

And apps that don't have a traditional Mac menu just have a black bar hiding the notch:

https://imgur.com/a/McNBTHY

I actually don't know what Mac OS does today when it runs out of room for menu items.


I bet there will be software adding a black bar to hide the notch by tomorrow, so no big deal. Those are additional pixels anyway.


They basically showed that in some fullscreen images in the demo, so I think that's how it'll be.


Top-Center is among the most useless screen real estate. It takes just about the same amount of space as the keyboard language selector a bit to the right, or any one of the menu items to the left. And those two interfaces never fill up to the point where they need that center spot (or I'd be cleaning out those mostly annoying gadgets top-left).


>Top-Center is among the most useless screen real estate.

Especially on MacOS. 100% like this design - good camera, smaller bezels, useless space being cut out.


Top centre is where Gnome Shell on my Fedora install shows the time, notification indicator and application notification icons (hexchat, Steam, etc.). Hardly a useless part of the screen!


MBPs are designed for macOS, if other OSes do other things with the hardware, that's on them. Why should Apple care when probably 99% of their users (if not more) are exclusively using macOS on the hardware with other OSes in VMs?


But it doesn't work on my non-standard OS that wasn't designed for the hardware!


Can you make Gnome Shell do something else, then? Or if you don't want the extra pixels, I'm sure you could arrange it so that your OS does not use them.


Sure, you can move the clock and indicators, but it's the default place for them. Also not sure about e.g. fullscreen applications.


More smudges on the camera, since this is where the user grabs to open the lid, and the camera seems to be closer to the edge.


Not to mention - how do you cover the camera to maintain privacy if it's now part of the screen?


I don't understand. Was your camera cover extra wide or why would this be different now?


My ThinkPad P1 has a movable mechanical shutter I can slide to physically block the camera lens. I don't see how you could do that reasonably with a notch design.

Before that on previous laptop models I used a post-it note or a piece of paper held in place with a clothes pin.

With the notch design you can't do that either without blocking parts of the UI!


> With the notch design you can't do that either without blocking parts of the UI!

I thought I was following until this part. Why couldn't you just put a post-it or whatever over the camera itself? There are no pixels where the camera is (although I would be surprised if this is not something under active research).


Oh, looking more closely, the notch is actually quite wide - yeah, it might be doable to fit a post-it over it. Still a lot less maneuvering space for that than on a normal laptop.


I'm sure someone will leap in to correct me if I'm wrong here, but I think the LED "camera on" indicator is purely hardware on modern MacBook Pros -- you can't override it in software -- and I doubt that's changed with this one. So I'm not sure how big a deal this would actually be in practice.

(I would probably go for a bit of black electrical tape if I were really worried about that, though.)


The camera is pretty much in the same place?


The vertical screen space they might have freed by adding a notch was immediately taken by increasing the top bar height. I don’t see a purpose for such a trade off.


You still end up with more usable space since now the menu bar isn't taking up those pixels.


If I were to hazard a guess, it's more of a brand recognition thing - now you don't even have to see the apple on the back to know someone's using a macbook.


> If I were to hazard a guess, it's more of a brand recognition thing - now you don't even have to see the apple on the back to know someone's using a macbook.

The intuition driving your hazarded "guess" is unerring.

A recent leaker called out the notch, which brought renewed attention to a different leaker, ty98, who mentioned the notch (and other details) as early as August. [0]

Coupled with confirmation of ty98's leak that there is "No "MacBook Pro" logo on the bottom bezel", the MacBook Pro notch may very well be a branding element.

Instant visual recognition without using words. The notch is branding that transcends language.

[0] https://www.macrumors.com/2021/10/17/apple-notch-design-obsc...


It's a macbook though. You can just write a program that moves your screen down a millimeter or however much and live your life notch free with peace of mind.


It sits where the menu bar is anyway. Similar to the status bar on iOS.


I agree. I finally upgraded from an iPhone SE to a 12 Mini. The notch is horrible and I'm not sure why people wave it away with "I got used to it". I've had it for months. No idea what lunatic at Apple thought it would be nice for a laptop.

Can’t they come up with something a little more original than removing screen real estate to stand out?

I will be switching back to an SE just based on how overall unwieldy the new phone is. The notch is just one of the nails in the coffin.

The notch to me says, “now I know which products not to buy”.


Really impressed by the new MacBook Pro. Part of me would really like to order a 14" model with M1 Pro and 32GB RAM. However, I really don't like Apple's direction with macOS. I'm still running a 13" MacBook Pro from 2013.

What's a little strange is that I first came to the Mac with the introduction of OSX 10.0, although I wasn't overly keen on the hardware at the time. Now, I feel that the situation is reversed. I really like the hardware and am beginning to despise the software -- possibly to the point of abandoning the platform completely.

Several years ago, I moved my music collection out of Apple's software and I use an Android phone. I use an old iPad for web browsing and Youtube. I purposefully transitioned to a point where I can leave the Mac platform without a huge effort. It makes me wonder.


I'm exactly the same. I was looking forward to leaving Apple's ecosystem and getting a Framework laptop or something similar (I'd prefer AMD and I'd like to run Qubes, which is rather picky with its hardware). But with this release Apple is making it hard to do.


Real keys and no Touch Bar, advertised as a feature in the presentation. What a leap of innovation. Though Apple makes the best hardware, they are as arrogant as ever.


How exactly would you expect them to convey that information without sounding arrogant? We all know about the TouchBar. They know it was a mistake, we know it, so why mention it in what's essentially a sales pitch?


The new "feature" is that the function keys are full height. Apple laptops had half height function keys for as long as I can remember.


I didn't even notice that they were full height. There was something off aesthetically when I saw the pictures of the new keyboard and I figured it was just the missing touch bar.

As a touch bar hater from the beginning, I'm super excited to be able to hit keys to play/pause/skip music, mute, and adjust brightness without having to look at the touch bar to do it.


So many of the 'features' are merely the return of a few features kept hostage for too long.

Ports, a half-decent webcam, scissor switches & MagSafe are exactly the 'features' that took zero effort from Apple's side to implement.

I like that this seems like the first real 'pro' device Apple has produced since the 2015 Mac, but the reason it took this long has entirely to do with hubris rather than technical constraints.

The M1 series of processors are IMO the only new standout innovation in these current-gen devices. That being said, it is seriously impressive innovation and the other laptop manufacturers appear to be even more complacent. (Looking at Dell and Lenovo making the same device for 10 years with only the smallest of changes each time)

I like what Microsoft is trying to do with their devices, but they don't seem to be too keen on competing against their 3rd party customers directly. So you get laptops with weird form factors or ones that try to go for value-for-money instead of top class performance.


It's more amusing than anything. I'll take the correct product decisions over pitch-perfect PR any day.


Sure, but did anyone have a serious use for them? Surely there are people, but I have the _feeling_ that the Touch Bar was a failure from the beginning. Also they brought back a magnetic snap cable for charging (the same one my mid-2012 Air that I'm typing on right now has) and they added an HDMI port and an SD card slot. It's all coming back ^^


Wow! Apple just can't win with you, can they? I'm glad they aren't arrogant enough to revisit some of their design decisions.


I suspect what the OP is reacting to is not revisiting a design decision but the presentation of it.


Other companies say "We listened to your feedback and delivered."


64GB of 512-bit unified memory is REALLY fast/huge. This will be better than many training GPUs for ML...

Better than dual socket servers...

I wonder if the mac pro will be dual proc...


So, now we know that LPDDR5 will be coming with at least 16GB per die stack. A doubling from LPDDR4. One package = 128 bits, double the regular DIMM I/O width.

I see it's not too far behind even HBM2, of which we may never see a mobile variant.

I have long pointed out to people making laptops that LPDDR4 is much cheaper than DIMMs overall, despite the nominal per-GB cost being higher.

The elimination of manual assembly, termination, and extra through-hole parts, along with LPDDR actually taking less PCB area, fewer layers, and being less demanding of the PCB material, easily compensates for the higher chip cost.
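
Back-of-envelope, assuming these are LPDDR5-6400 parts (Apple hasn't stated the speed grade), the 512-bit figure lines up with the advertised bandwidth:

    512 bit / 8 = 64 bytes per transfer
    64 B x 6.4 GT/s ~= 410 GB/s

which is right where the 400 GB/s Apple quotes for the M1 Max sits (and 200 GB/s for the 256-bit M1 Pro).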


Yeah, but can it run CUDA pipelines?


Considering that CUDA is a proprietary technology from Nvidia, how could they?


There are a bunch of CUDA translation shims being worked on.


Not holding my breath, it's been almost 5 years without a working CUDA shim. Hopefully this will push that work over the edge though. If I had the relevant skills I'd contribute...


I think they want everyone to move to Metal Performance Shaders. I've done some stuff with them, but they're not nearly as developed as CUDA.


Any idea how the neural engine does vs the gpu?


doubt it


I was excited to replace my 2017 MBP with a new model that has a decent keyboard and old-school features like multiple kinds of ports and an SD card slot.

With a starting price of $2000, I don’t know that I’m going to pull the trigger right now. My guess is this is a sign of a supply chain crunch, and they are maximizing profit instead of revenue, which makes sense.

Still disappointing that getting these basic-seeming features (I don’t care about the performance) costs $600 more than my last MBP. The base Pro shouldn’t jump in price by 50%.


They've made 16GB and a 512GB SSD the base spec.

Maybe they're tired of the "only 8 GB ram" comments that always get posted. /s


Yeah they’re basically forcing everyone to purchase a non-base amount of storage (and RAM, to a lesser extent) and charging them for the pleasure.


For me the price of RAM and SSD were the downers. I was hoping that after over 7 years I'd be able to buy a new computer with more RAM and drive space without paying that much more, but to upgrade I'd need to spend $600. The RAM is less important for me, but the fact that drive space hasn't grown much is a bummer since data accumulates every day and the kinds of data only seem to be growing in size.


The 13” is the base Pro.


I don't really like the chassis design. It looks dated. But the rest of the Macbook Pro sounds really exciting!

For those who missed the keynote, here are some laugh-inducing moments - "Our Pro users love to use physical function keys. So we have added them"

"Our Pro users like to connect a lot of devices without using a lot of dongles"


I had the same thought. It looks like the 2011-2012 era chassis. Kind of funny, but I imagine it's necessary for the added ports. It probably couldn't work with the slimmer form factor because of those, and scaling it up would make it too big. Guess we can't have our cake and eat it too.


It looks like a PowerBook to me, and yeah, it looks a little dated, and the curves look strange. But Apple being willing to release a laptop that "looks strange" is to me a signal that they want to value function over form on Pro laptops, and that they care more about cooling and power than they do about thinness and "elegance". It's a great step in the right direction. They can make the MacBook Air as pretty and "elegant" as they want but anyone with a boiling-hot 16" MBP will sure tell you how elegant it is to have a laptop burn their lap and throttle all the time.

It was time Apple valued pro users more than they valued their laptops looking good in hero pictures.


I suggest thinking about design not as belonging to a 'date' or an era, but in terms of whether it solves a given problem.

We've grown up in an era of designers who make us think that design is aesthetic and, like fashion, evolves. I think of it like an engineer - its job is to solve a problem.


I'd guess that's about thermals. Thin looks good but runs hot.

Apple have de-Ived and gone for practicality over fashion-accessory style. I'm not convinced that's a bad move.


I have nothing against it being thick. The 2012 model was thicker, but I like its design.


It's not really thick though. In fact, the 14" is 1mm thinner than the M1 13" Pro.


I think the move back to an older design language is purposeful, to remind people of the "good old days" before the 2016 design.


Yeah it’s surprisingly ugly. Maybe it’ll grow on me but I can’t help but think this is the first post-Ive laptop


How often are you looking at the bottom of your laptop?


Not often but an aspect I respect about Apple is how they get the details right even when you don't see them. A profile of Ive mentions how they've thought about stuff like the color of the internal chips:

> One afternoon, Ive and Bart André removed the bottom panel of a MacBook laptop, revealing black and silver components arranged, with unnecessary orderliness, on a matte black circuit board. Ive looked down happily. “This is such an extraordinarily beautiful thing,” he said. André noted that, in a competitor’s computer, the board would be green. He sounded embarrassed on behalf of that other machine. On the same table was a plastic model of an existing Apple headphone—an EarPod—the size of a golf driver.

Having ugly feet on a laptop does betray a slight change in values. And yes, it's minor but it's like the extraneous requests on a rock star's rider. If they're not getting this right, what else are they missing?


M1 aside, it's pretty clear that Jony Ive was holding back Apple's designs here. While he worked at Apple it was thinness at all costs, right after he leaves the MBP suddenly gets a tiny bit thicker in order to return a huge chunk of the features that were previously removed. Probably the only thing on this new laptop that can't be pinned on his departure is the M1 series of chips.


The 14-inch actually gets barely noticeably thinner.


Not trying to be snarky here, but... it'll probably come across that way.

Will we ever see the end of beach balls? I've got an M1 Mac mini, and ... I see far fewer, but I still see them. I don't understand why, with so many cores and so much 'fast' and 'powerful' stuff, my computer will still freeze and lag doing seemingly normal stuff.


The beachball isn't directly related to CPU speed. It indicates the main thread of the foreground app is busy for more than a few seconds. It may be, for example, blocked waiting for something from the network -- which is one reason it's important for macOS developers to move filesystem and network access to background threads as much as possible.
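
For the non-Mac devs, the usual fix looks something like this (an illustrative sketch - DispatchQueue is the real GCD API, but loadFile is a made-up example function):

    import Foundation

    func loadFile(at url: URL, completion: @escaping (Data?) -> Void) {
        // Do the potentially slow read off the main thread...
        DispatchQueue.global(qos: .userInitiated).async {
            let data = try? Data(contentsOf: url)
            // ...then hop back to the main queue for UI work, so the run
            // loop keeps servicing events and no beachball ever appears.
            DispatchQueue.main.async {
                completion(data)
            }
        }
    }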


It's been a while since I developed on Mac OS, but ISTR that beach balls happen when the main thread blocks and doesn't respond to events. If an application does that it's due to the developer putting too much on the main thread, as much as whatever the CPU's doing.


Based on long experience I suspect most beachballs nowadays are because of locks, not lack of CPU cycles. The UI thread is waiting for something to happen in the background that should have been fast, so the developer did it synchronously, but for some reason it isn’t. Maybe a helper process failed, or there’s a network glitch, or maybe the developer just didn’t get multithread locking right and the wait will never end. In any case, a faster processor does nothing to help.



Good reason to support the Linux port project.


That's less of an Apple problem and more of a developer problem.


I don't plan to go back to MagSafe. I have multiport USB-C chargers all round my house and don't really want to buy lots of cables or proprietary chargers to add to the mix. I'm sticking with USB-C even if it's slower. SD cards I don't use. And HDMI I rarely use so dongles are fine, but it's nice to have I guess!


In a similar boat re: chargers. It's fantastic being able to grab any ol' USB-C charger (of sufficient power) and charge just about any modern laptop short of maybe the beefier gaming laptops / portable workstations (I haven't tried charging my Dell G5 via USB-C, for example, so maybe it'd work, but I ain't betting on it).


If I choose to buy the 14" model, which comes with a 96W USB-C power adapter, does anyone with experience of Apple products know if I could still use my existing 65W USB-C charger to power or charge the MacBook?

My Dell XPS 15 does allow me to charge it with a 65W USB-C charger which is nice when I don't want to lug around the beefy 130W power adapter but I do have to be careful not to stress the CPU too much or it will start draining the battery. I wonder whether MacBooks have a similar "graceful fallback".


The new Macbook Pros support charging through the USB-C ports or the MagSafe port.

You should be fine using the USB-C ports with your adapter.

It's as yet unclear how well the MagSafe cable would work with your adapter.


My 2016 MacBook Pro came with an 87W charger and I always run it off a monitor that supplies 60W over thunderbolt and it never drains. I used to mostly run it off a 45W charger and would only see it slowly drain if I was really pushing it with compiles.


Thanks, that's just what I wanted to know.


I don't have any USB-C but I've got some MagSafe. Unfortunately this is MagSafe3 so I assume that means buying new chargers. I like the SD and HDMI, but IMHO they are still missing "standard" USB ports which are used by most flash storage devices. It's not even about the cost of buying new devices for me, it's about someone handing me an SD or a thumb drive with some data on it and being able to just use it. Apple is very very poor when it comes to that.


Interestingly, Apple now sells MagSafe 3 -> USB-C cables.


They said you can still charge it via USB-C as well as the magsafe. Your choice.


The other end of the new MagSafe cable is a USB-C plug, so that should be fine.


Yeah I don't even use any USB-C charger to power my MBP. My USB-C monitor delivers 90W to the MBP.


Agree on the chargers. The SD card slot is great if you do anything with Raspberry Pis or photos.


Or buy an adapter like this https://www.amazon.ca/BASEQI-aluminum-microSD-Adapter-MacBoo... and have yourself some extra storage expansion. Might be slow, but good for things you access rarely.


This was a "Shut up and take my money" day for me.

I've been running on a 3-year-old machine, straining at the leash, and it's time to change.

I made my order about a minute after the store went live, and I won't get it until next month. I suspect part of that is because I'm getting the M1 Max processor.

They'll make a lot of money this week.


> I've been running on a 3-year-old machine, straining at the leash, and it's time to change.

My daily driver MacBook Air is very close to ten years old. I wonder if I'll get ten years out of this one, too.


Xcode is getting so damn big that I'm actually running out of disk space. It's insane.

I develop on the computer, so I tend to keep it busy.

My computers are always in great shape, aesthetically, but I run them hard. The fans are generally going from about 6AM to 9PM.


It would sure be nice if they used any of that spare space inside to let us load up our machine with M.2 drives. Maybe that technology is too futuristic though, much like function keys or magnetic chargers.


Not much extra height. They are barely thicker than the last generation.


Sure, but there's obviously extra space inside because of the SoC. M.2 drives are minuscule; if Valve could fit one in the Steam Deck, Apple has no excuse.


Same, I'm replacing my 2016 13" MBP. Thought about getting the 13" M1 but decided to wait for this refresh and I'm glad I did.


I'm very curious to see how the GPU with 64GB of RAM performs when fine-tuning deep learning models. It might be impressive.


Do any deep learning frameworks even have metal support? I could see inference working well on these laptops but they lack so much of the specialized hardware I'd be surprised if training was even possible for most useable models.


Apple is maintaining a Tensorflow plugin: https://developer.apple.com/metal/tensorflow-plugin/


There's been work on PyTorch to move some over, but it's still all CPU (which still, on an M1 isn't HORRIBLE) last time I checked.


Tensorflow supports Metal AFAIK.


Wouldn't you use the Apple Neural Engine for that?

https://www.infoq.com/news/2020/11/apple-tensorflow-accelera...

"Recently Apple released the new M1 "system on a chip," which not only contains a built-in GPU, but also includes a 16-core "Neural Engine" which supports 11 trillion operations per second. Apple claims the Neural Engine will support up to 15x improvement in ML computation."


The neural engine seems to be only about inference. For training, it seems most systems rely on Metal, like the Apple Tensorflow plugin [1]. But I have never tried to do ML on Macs so I may be wrong.

[1] https://developer.apple.com/metal/tensorflow-plugin/


Do people really train locally? I'd have thought the field had moved to AWS or some other HPC setup by now. Seems like a waste buying such a nice laptop just to melt and abuse it when you can abuse Amazon's hardware instead. The battery won't be happy being discharged every day, since in my experience MacBooks don't bypass the battery when on AC power unless you start the computer up with the battery unplugged (not so easy on newer computers).


The greatest improvement: no Touch Bar.


TBH I'd love to have the Touch Bar, but improved with haptic feedback, making it much more useful.

It helped me a lot while debugging and I loved the customization it offered for the "static" keys on the right, but having no haptic feedback always caused me to miss the button unless I stared at it, which largely defeated the purpose.


Haptic feedback, and also elimination of the lag. It often takes >100ms for the bar to respond, which makes a quick adjustment to the sound slider impossible.


The lag truly makes it unusable. Trying to mute? Hit the mute button. Did it register and is just laggy? Or did it miss the tap altogether? Why have an always-on touch bar that doesn't even respond to taps consistently?


Yup, agreed. Those were solvable problems though, and really easy ones. The bar itself would have benefited from ProMotion for example: be super responsive while touching/dragging sliders, and stay at 10Hz or even less when static. Couple it with haptic feedback and I'm pretty much sure that many would love it. It had great potential but they killed it.


Second greatest: Restoring popular ports that were previously removed. Such innovation :D


I love my touchbar.


I liked the concept when I first saw it but honestly it's more trouble than it's worth. On my old work laptop, it would sometimes bug out and stop registering touch. That's a problem when you are in the middle of a call and you need to adjust volume because one of the speakers is too loud or too quiet.

I also don't like how it's harder to operate than physical buttons. Too many times I hit the wrong spot on the bar and, for example, ended up putting my laptop to sleep instead of adjusting brightness. I've also tried to configure Ableton Live to do something useful with it (maybe mute/unmute tracks or control their volumes), but with little success.

Long story short: I'm happy to see it's gone.


I wish they'd offer to remove the touchbar from the laptop I have.


I think they left it on the 13" MBP? But why?!


Because there isn't a new 13" MBP. 14" and 16" are the new ones with the M1 chip.


I don't see the touchbar on the 13". (EDIT: Ah, 14")


The perfect laptop now


Can't believe they did so many positive changes with the MBP only to add the notch from the phone in too. Almost the perfect update. Does this mean every app in full screen has to be updated to account for it?

> posted in wrong thread so copying here


Maybe I've drunk the Apple Kool-Aid, but there's another way to look at this: it's a perfectly rectangular 16:10 display, plus they've added an extra (albeit imperfect) strip along the top, 74 pixels high, allowing them to push the macOS menu bar into the bezel, leaving you with a clean and unencumbered 16:10 desktop area for your content.
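
The published resolutions bear this out: on the 16-inch's 3456x2234 panel,

    3456 x 10/16 = 2160   (height of a 16:10 area at full width)
    2234 - 2160  =   74   (the extra strip that holds the menu bar)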


This is a good framing.

It is not a loss, it is a pure gain.


That makes me happier, but it's a deep gash in an otherwise perfect rectangle. It's a little like buying a brand new table (or car) and the first thing you do is accidentally put a deep gouge in it. Perfectly serviceable? Yes. But still needlessly marred for the life of the product.


It's function over form. Exactly what people always say Apple should focus on.



Yep!


I don't see much issue with having a notch on a laptop since it now lives in the middle of the menu bar. That's typically negative space in most apps anyway.


For my full screen web browser it's right where the tabs go.


No it's not. In previous iterations, your tabs went below the camera. This doesn't change that.


Not sure I understand your distinction, the screen is taller now, if my browser wants to render to the top of the screen it has to contend with the notch.


If your browser changes nothing, the top bar will be blacked out and your tabs will display exactly as before. This only changes if your browser explicitly updates to handle the notched display.


It sounds like they're saying the OS will automatically add a black bar to the top in fullscreen mode, which will look seamless thanks to the contrast ratio of mini-LED.

If that is the case, it sounds like a pretty elegant solution and significantly reduces my concern about the notch. It also addresses my confusion about some of the images on Apple's website.



I was going to answer "presumably this is why Safari has always put tabs below the address bar", but the most recent version finally changed that by default. :\


I can’t believe they didn’t color the menu-bar black to hide the notch.

I suppose the notch is only in the way if you are watching 4:3 video in full screen. For every other use case, the notch is hidden in either the menu bar or black, horizontal bars.

I’m also wondering if the led backlighting is arranged so that backlight is completely turned off in the black horizontal bars shown when viewing 16:9 video in full screen.


>I’m also wondering if the led backlighting is arranged so that backlight is completely turned off in the black horizontal bars shown when viewing 16:9 video in full screen.

That would likely help a lot, very interested to see it in action.


I wonder how it will work with a mouse. What happens when you're hovering over the menu bar and go inside the dead area? Does the cursor disappear and return on the other side? Making the dead area unvisitable (so the cursor keeps following the border) is probably the most logical solution, but it makes it very awkward to go from one menu item to the next if the notch is between them.


No, remember that the notch extends the display upward, and does not expand downward into the display. Apple's reasoning is that now you get a 16:10 display without the top menu bar encroaching on it.

Which I'm perfectly fine with. Useless black area made useful.


Is MacOS also going to restrict all apps that run full-screen to the 16:10 display area and not the strip?

If they have done that, it's fine.

But even then I can think of a bunch of people who buy MacBooks to run Linux.. it's going to be a showstopper for them.


Isn't that space just menu bar 99% of the time in Linux also?


The Apple Event showed that full-screen apps and videos have a black bar extending the whole notch height; which still leaves you with an Apple standard 16:10 screen area. Since it's mini-LED, it'll be just as black as the previous bezels, and since the non-notch area is still 16:10, you're not missing anything at all; and the menu bar in a sense "doesn't take any usable space" anymore.


Nit: mini-LED is not the same thing as micro-LED, so it remains to be seen whether the LEDs are small enough/aligned in such a way that the black bar at the top doesn't have any backlight bleed.

(Maybe this was mentioned in the presentation...)


The funny thing is I checked the webpage to make sure I got it right, but still copied it wrong.

Ha!

Thanks for the correction

They did announce a very high contrast ratio so hopefully it won't be too bad.


That portion of my screen on my Mac has been a solid grey square with no meaningful information for 99.9% of the time that the laptop has been powered on. The aspect ratio is 16:10 so it’s not going to get in the way of any 16:9 content. I feel like this is a pretty bad take. Would you prefer a solid black bezel?


By default full-screen apps do not use that space. It remains black.


I suspect there'll be a setting to have the notch area ignored for full screen apps by default, or something like that.


Not a huge deal for me. I don't notice it on my phone, and I don't expect I will on my new Mac either.


> Does this mean every app in full screen has to be updated to account for it?

A simple black bar in full screen apps would be sufficient I guess.


I wonder. Maybe full screen will behave differently and the menu bar will never be hidden?

Otherwise I am looking at my web browser in full screen and how do you design around that? You have to push the tabs down anyway, might as well just make the menu bar static.


You could get clever and have apps treat it as a dead zone (tabs jump to either side of it when you move them around), but that sounds like a pain.


I can’t believe they’ve done that either. I’d honestly consider getting one, but I can’t stand notches on any device.

And I’m sure we’ll see everyone else start to copy this “feature”.


This notch looks like it'll be fine. The notch I'm worried about is the one all the PC makers are gonna decide they need now but with zero integration or standardization with Windows.


Feels like a return to form, finally approaching this machine from the needs of the users rather than what's compelling for the industrial design team.

Every bad decision of the atrocious 2017+ era laptops reverted.


I don't know if I'm allowed to swear on here, but thank fuck for this. It imposes practically nothing on Apple to do this stuff; they can keep their 28% margin, keep things soldered if they want, but making functional laptops that aren't designed to be stared at or look good in pictures is a bare minimum.


Unfortunately it looks like Apple still has no option to disable temporal dithering or other sources of flicker (PWM, pixel inversion, etc).

A minority of people have binocular vision dysfunctions (like convergence insufficiency) that give them severe eyestrain, headaches and migraines when this flicker occurs. Apple should treat this like an accessibility issue (like VoiceOver) but does not. The current treatment from behavioral optometrists is not always effective.

I recently found a whole community on this: https://ledstrain.org.

If you have expertise in displays, please join us on LED Strain! We were hoping Apple would address these accessibility issues and let people with vision problems use their products.


Have you considered trying Neurolens? http://www.neurolenses.com/

If you've already tried prism lenses to help address the convergence insufficiency, then these won't help. These are basically prism lenses, but progressive: the prism is strongest at the bottom and nonexistent at the top, so you can wear one pair of glasses all day and have them help with devices while still getting good distance vision through the top part of the lens.


Thanks for pointing this out.

We actually have a discussion about Neurolens on LEDStrain: https://ledstrain.org/d/754-trigeminal-dysphoria-and-neurole...

I plan to give them a try. But unfortunately they do not work for everyone.

Have you had success with them? Many other members have convergence excess (esophoria) or vertical heterophoria.


I have exophoria and myopia, and I am quite comfortable at the computer all day. Before I was diagnosed with exophoria and got these, I found my own way to cope, which was to wear an old pair of glasses with a much weaker prescription in terms of diopters. Whenever I was at the computer I would need to switch to the weaker prescription or my eyes would bother me. With Neurolens, wearing the same glasses all day, whether or not I'm at the computer, feels perfectly fine, and I have zero eye strain at the computer.


Great to hear you found a solution. I believe Neurolens is focused on treating exophoria where the prism correction required differs at near vs. distance. In most patients, the correction required at near is greater.

The Neurolenses are also unfortunately very expensive and seem not to be covered by insurance. But I think most people would gladly pay out of pocket to get relief. I surely would.


Do you have to use laptops with cathode backlights instead?


Ah, so for most of us LED itself is not strictly the problem. Low-quality ones with a low refresh rate, like in some department stores, are a problem.

It is likely other sources of flicker, like temporal dithering (FRC); you can see it here (it happens on Windows too): https://youtu.be/0y-I3hqQgCQ

The visual noise/dots you see is the dithering. The e-ink display is slow enough that we can see it.

People with BVDs are strongly affected by that noise... and unfortunately a simple pair of glasses does not help.


Can't believe they made so many positive changes to the MBP only to add the notch from the phone in too. Almost the perfect update. Does this mean every app in full screen has to be updated to account for it?


I don't think it will be that bad on Macbooks, because the notch will be embedded in the top menu bar.


No. It looks like the notch only cuts into the menu bar area and letterboxes fullscreen apps.


If it works like the iPhone, it'll be alright. It doesn't really cut into screen real estate. You just get a little extra notification area


Well, the iPhone has a touchscreen, whereas this will be used with a mouse.

I'm wondering what the behavior will be when you run the mouse across the top of the screen.


On the contrary...you get more screen real estate.


> letterboxes fullscreen apps.

Sounds odd but I'll see what it's like in practice


Look at the old MacBook Pros. The bezel at the top was thick. All they did was cut into the bezel, so, in letterboxed mode, the display is the same size as the now-previous generation of MacBooks.


I wonder if the OS just presents a virtual lower screen resolution in full screen mode?


I'd hope macOS can account for it.


It seems the era of few ports, gimmicky keyboards, and thinness over function in MacBooks is over.


The notch is a major design compromise, and Apple is not trying hard enough. I cannot imagine paying six thousand dollars for a new laptop and literally staring at an unforced error every day. (See Dell XPS bezel-less displays like this: https://twitter.com/SpencerDailey/status/1450170126360358914)

I hide my menu bar for maximum space, and you can't do that here without eating into your main apps' vertical space. The notch also reminds people of iPhone features, which makes a touch screen an even more obvious omission, as well as Face ID. (Face ID is trending on Twitter because people assume this laptop ships with it: https://twitter.com/MKBHD/status/1450162489795170307 )


Dell XPS laptops not only pull off a bezel-less design, but also fit a webcam with an infrared component for Windows Hello face unlock in there too. Apple has a notch with no Face ID.


Have either of you ever used an XPS camera? They are terrrrrible.


What's the general need for microSD?

My XPS 13 has one, but I've never understood why it's such a big deal. I almost never use it.


There are a lot of different use cases for SD card/microSD. It's not useful to everyone, but I think it's definitely useful enough to have built in. Same with HDMI... I'd probably use the SD card more than HDMI (like for setting up a Raspberry Pi, accessing videos from a dash cam, or using my old camera, which is still better than an iPhone for some types of pictures). I think those ports are aimed at professionals in certain sectors, but lots of regular people use them too.

Recently I've also used microSD with a USB-microSD adapter instead of USB sticks (like for music in the car)... last time I wanted to buy a USB memory stick it was the cheaper option, and the adapter isn't much bigger than a typical USB memory stick anyway.


As someone who presents a lot, primarily at events, it's nice to have HDMI built in, given that HDMI is the standard for projector connections and I expect it to be so for a long time. (VGA stuck around forever.)


Even without presenting a lot, being able to impromptu throw my screen up onto whatever random TV/projector is around via HDMI has come in handy surprisingly often on my 2013 MBP. If I had been limited to DisplayPort, I would’ve been able to do that <5% of the time.


True. At a conference, I should have all my dongles with me. But I don't in a random conference room--not that I've spent much time in my local office for years.


VGA is still around at my college, and the projectors are what I would call utter trash.


Given how much Apple charges for storage, and the fact that it is not expandable, SD card slots offer a reasonable way to get more storage. With a starting price of $2,000, this may not make sense for people who don't need a pro-level machine but were looking forward to features like this.


SD cards are not a reliable medium-term storage mechanism. They wear out much faster than SSDs.


Yup, I’d just use it for storing my iTunes library and one of my iPhone backups. I currently use an external drive, which I don’t take everywhere. SD would make it trivial to have both of these with me at all times. I’d also be able to keep all my photos and downloaded videos for travel, without having to worry about filling up my internal storage.


Yep. But there are plenty of cases where that's still useful. For example, I have all my Docker images on a removable SD card. It's all cached so it doesn't matter, but it's hundreds of GB I don't need on my main storage.


That's exactly what I did - for previous MacBook Pro models with SD card slots you can find flush SD cards or adapters with an aluminum end that blends right into the case (you can still remove it fairly easily, but I hardly ever do).


Photos. Photojournalists often need to send pictures they just took to their agencies, racing to be the first to get the content onto the web page.

I actually use the microSD slot on my XPS quite often for camera-related stuff.


Photo/video people use those ports constantly, like all day every day.


Can confirm. Outside of the higher-end cameras (photo and video), EVERYTHING still uses SD cards.

And it's easy to adapt microSD cards to it for Raspberry Pis, drones, dash cams, phones, audio recorders, etc.


OK, but there are dongles for that. I've used one (the same one) for 10 years, since I got my first camera.

I used it before laptops had an SD port, and now use it again when they don't have one. No issues with that.

In the case of cameras, there are SD cards that are WiFi-enabled, and newer cameras should support wireless transfer anyway. Isn't that easier and faster than pulling out the card?


>In the case of cameras, there are SD cards that are WiFi-enabled, and newer cameras should support wireless transfer anyway. Isn't that easier and faster than pulling out the card?

No, because transferring 1TB+ of data from your camera over the crappy built in wifi chip is a much worse user experience than just plugging in an SD card. Lots of cameras also require you to connect to the camera's hot-spot, which means while transferring that data you are not on wifi. If you're just sending a few jpegs over, fine, wifi is great, but for people to whom the "Pro" moniker actually matters, it's a big deal.

Dongles are fine for sure, but having a slot built in is better, and there's no real reason not to have one other than "aesthetics"


> No, because transferring 1TB+ of data from your camera over the crappy built in wifi chip is a much worse user experience than just plugging in an SD card.

I didn't think that pro cameras supporting 1TB SD cards would have crappy WiFi chips; I stand corrected.

But SD cards are usually not the fastest storage media.

> there's no real reason not to have one other than "aesthetics"

There is a reason: they take up space inside the laptop that could be used for something else.

Not sure if that applies to the MacBook, but additional ports are not a free lunch.


>OK, but there are dongles for that

Well, why have any ports at all? A single USB-C port suffices; for everything else, there's a dongle.

No need to stop at ports, either. If the SD card reader can go through USB-C, so can the webcam, audio, and the other peripherals.

Surface and iPad have shown us that a portable device doesn't need a keyboard/touchpad, and those who want them can connect via bluetooth.

Of course, even having bluetooth built-in is superfluous when perfectly fine Bluetooth dongles are available.

Finally, there are great USB-C portable monitors out there. Why limit the users to something built-in, especially when not everyone has a need for it?

The same goes for the battery, given the abundance of USB-C power banks.

Once we get rid of all these unnecessary bells and whistles, we'll end up with peak MacBook: a shiny, metal square with a single USB-C port, an Apple logo on top, and all the dongles one can dream of (available separately).


Besides photo/video, which was already mentioned, SD is commonly used in 3D printing (to transfer STL files to the printer).


Photo/video import, mostly. I don't feel like transferring 1TB of data over the crappy built-in WiFi chip on my camera, or having to plug the camera in and have it act as a fancy card reader.


For microSD, none, but SD card readers (the one here) are extremely useful to photographers.


> what's the general need for microSD?

I use a USB pendrive formatted with a case-sensitive partition to cache data for an app I use, because it's far better to spend $20 on a pendrive than $200 on an SSD upgrade.

Given the choice, I'd prefer to use an SD card for that rather than a USB pendrive, as the pendrive requires either a USB hub or a USB stick always sticking out of the chassis.


Importing pictures and videos from cameras, car DVRs, GoPros, drones, Raspberry Pis, etc., or just plain storage.

It's much faster to pop the SD card in than to connect via USB to transfer files, and much more convenient when you're not at a desk.


Import your day's photoshoot to review your work, reformat the card, and, most importantly, back it up to Time Machine.

I wonder how many people now just shoot on their iPhone 12/13, at least for snaps.


I am happy to get it. I still shoot with a large-sensor digital PAS camera and bought a HyperDrive just so I could rip the cards and have an HDMI port.

Excited to leave that thing behind.


Wish SD cards were better supported by stereo equipment. Recently realized that USB flash drives are kinda clumsy for that.


Probably for the photography workflows where you grab the card out of your camera and then process and edit your RAWs.


AV/Photo import?


I've got to say these M1 chips are very tempting. I've been a Linux guy for a long time now, and these make me contemplate the switch. I really wish Apple had a real competitor in the laptop space, but honestly no one seems to come close to Apple's hardware quality. I'm really hoping to see M1-style chips for non-Apple hardware in the near future.


Intel's 12th gen, Alder Lake, does mix high-efficiency and performance cores like the M1 series... but it's certainly not a total on-die solution like we see with the M1.

Also, the wait time for Alder Lake to ship in actual laptops is unknown.


The hardware is amazing, but I could not see myself going back to a proprietary closed-source OS. I would like to choose the distro (hey, some people want Nix, some are OK with Ubuntu), X11 vs. Wayland, pick the display manager, decide if I want to run LTS or bleeding-edge, etc. My only hope is that some day Linux supports all MBP hardware well enough to use this as a Linux laptop.


It seems like Apple purposely comes so close to offering the perfect laptop only to falter hard on one or two features. In this case, it's the battery life. Continuous web use time is down from 17 hours on the 2020 13" M1 to 11 hours on the new 14". That's "up to" time, mind you, and with real use as a developer I expect to get 75% of that at most. So 11 hours just isn't enough. I buy a laptop for mobility; I shouldn't have to plug it in at all during my work day. Yes, it's better than competitors, but still lacking.

I would have been happier if they took the 13" M1 and added the ports.


>In this case, it's the battery life

Pretty sure you are expected to go with the Air if you want the battery life.


Yes, this is not an upgrade to the M1 at all - it is a parallel product! Seems like the M1 can still be a very good deal.


Ok, I believe you're being sarcastic, because this little paragraph is ridiculous, so I'll address it as if you are - because then you have a very good point.

This is Apple's Pro line. There is no more performant laptop that they make. It's a portable workstation - a home tower that you can take with you. If people want battery life of 10+ hours or a light weight, they get the model optimized for that. Apple seems to not offer a real pro laptop (portable workstation) - period. People who use these pro laptops are either compiling code or doing heavy graphics.

Work gives us Dell Precision laptops with 9 hours of battery life during casual use, a pretty dang powerful discrete GPU, and, oh, Xeon CPUs. No offense, Apple, but this "pro" laptop is a toy in comparison.

So I don't get Apple's plan. I couldn't seriously use their products for work unless I wanted to lose productivity. Why don't they make something that's actually "pro" instead of calling what everyone else calls mid-tier "pro" and completely excluding the actual pro target market altogether? Do they just not want more money?

As a developer, waiting less time for your device to finish an operation (while sitting there reading HN) makes you not need as much battery life. And the Dell seems designed for a full workday plus an hour. Now, I'm not a dev - I did that for a year of my 20+ year career; too much sitting looking at code. I do, however, script a lot - with gigs of text output from data collection of logs and performance data, usually doing basic calculations or transformations. I look a lot at large datasets and graph them. I need a big GPU, I need fast CPUs. I mount a 20G RAM drive because the disk is too slow.

Now yeah - when compared to another Apple laptop w/ a Xeon, the M1 is faster for single-threaded specialized workloads, because Apple used CPUs a generation behind everyone else, had worse cooling, and still lost on multicore. In real life, when I run a shell script on my laptop and a coworker runs it on the same dataset on an M1, his is an all-day run - he does it overnight. I do it over a long lunch - it's not even a real comparison.

Now yes - My laptop gets hot, fans are blasting, and if my charge is at 80%, it literally won't charge while the script is running. But... that's what I want from a "pro." Now, his isn't a pro, and it stays barely warm. But we're talking the difference between 7 hours and 2 hours here. I don't think the "pro" is going to be all that much better.

So I get the apples-to-old-apples comparison. However, apples to flagships from others - I highly doubt Apple is a serious contender for the portable workstation, which is what "pro" should be.


Let's wait for some benchmarks, I see no reason not to have an open mind here. They seem to have taken a very serious shot at this, and even on paper, having up to 64GB of VRAM available opens up a lot of opportunities.


The RTX 3090 - literally the top-performing card available, one that doubles as a space heater - has 24GB of VRAM. Either you're wrong, or Apple built a Civic with a huge exhaust, which will sound like a vacuum cleaner...

Here are the top Dell specs to compare: 128GB ECC RAM. 6-core Xeon. NVIDIA RTX A5000. Huge exhaust panel on the back (keeps the CPU clocked high long-term instead of just for bursts). 120Hz 4K display (they've had the 4K for many years now). 14TB storage.

And you know what? It's pretty thin. And very sturdy.

Now, I'm not crapping on Apple's new laptop. I'm sure it's awesome and can compete very well with the mid-tier laptops from other vendors, at twice the price and half the durability. And it's always been that way. From keyboards that break from typing, to keyboards leaving key imprints on the screen. Can't expect much from a looks-first company that can't get a keyboard right and builds its phones out of slippery, fragile glass.

My issue is with their constant misleading meaningless marketing garbage. They are the orange clown of computers. If they made an umbrella, they'd build it out of laser-drilled ice, in the shape of a beautiful swan, and melting would be the built-in cleaning feature.


In terms of the outsides, this matches up almost exactly with Benjamin Button reviews the late 2016 MacBook Pro https://blog.pinboard.in/2016/10/benjamin_button_reviews_the...


Does anyone know why it would need a 140W power adapter? I thought the M1 chips are supposed to be more power efficient than the Intel ones, and they never needed a >100W power adapter in the Intel days.

The new USB-C spec supporting >100W power delivery (USB PD 3.1 Extended Power Range, up to 240W) just came out; I really hope they are following that spec in their new 140W USB-C power adapter.


It’s for the fast charging feature, where it charges 50% in 30 minutes.


Fast charging possibly?


The worst thing about the NotchBook - this union of iPhone notch with MacBook - is that all the other laptop manufacturers have a way of copying Apple's decisions: non-replaceable batteries, soldered in RAM and SSD, fewer ports, near-zero key travel for keyboards, and now a notch. The Dell XPS line has had ultra-thin bezels for ages with no notch but now future XPSes might have a notch just because some Dell suits thought, "Look at how many of these NotchBooks Apple is selling. Consumers must want the notch." But correlations of sales are not causations of sales.


I know it will be an unpopular opinion here, but bringing back the HDMI and SD card ports makes the MacBook thicker and eats up space that could be used for battery instead, all for ports I will never use. I wish there was another option without these ports.


Given the power efficiency of the M1, I’m not sure it matters all that much.


The 16" is already built to maximum battery capacity by law


Interesting, honestly didn't know that was a thing.

Is this the first MacBook that's hit the 100Wh limit, or has this been a barrier previously? But I mean, that's it? No doubt there are gains to be made with a better battery, charging, cycling, cost, and so forth. But without changing the regulations, a better battery can only get you more space, not higher capacity. Wild to consider.


It's not. There's _no_ law that dictates battery size. 100Wh is the largest size you can take on airlines. Economically, no laptop manufacturer is going to make a laptop you can't take on a plane.


Some laptops have two batteries, one of which can be hot-swapped or removed and charged while the other keeps running in the meantime. Would love to see that on Macbooks and could be a fair compromise for the airplane problem, though obviously it will not happen. (The battery was user-removable on early Macbooks but that was long ago)


Then get an Air.


I agree. I don't want in-laptop HDMI or an SD reader - I can use an adapter for the half dozen times I'll ever need them.

But I also want the high-end processor and memory, so, alas, here we are.


Can anyone comment on how well the M1 chips work with x86 development workflows, e.g., Brew, Docker, etc.? I know there are still problems if you're heavily dependent on virtualization software like VMware and VirtualBox.

For example, will I be able to just do the same "brew install x" for the majority of the *nix apps, libs, services, etc. that I use daily?

How about Docker Desktop and Docker images? For example, if all my teammates are using older Intel Macs, PCs, etc., and we deploy and develop using x86 images, will I need to be very careful that I don't end up pushing ARM-based Docker images to our repo? Will I need modified Docker / Docker Compose files that reference ARM versions of our images so they can run on my machine?

This just seems like a pain if you're the only one using ARM while the rest of the team and the various environments are on x86-64. It took years before most Node, Python, Java, etc. incompatibilities between Unix and Windows were ironed out, and I still run into issues when, e.g., an inexperienced developer on Windows hardcodes some 'backslashed' path in their app or assumes Windows line endings instead of standard Unix.
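
For what it's worth, here's the kind of ceremony I'm picturing (just a sketch; "somepackage" and "myapp" are placeholder names, and the paths assume the standard Homebrew prefixes):

  # Rosetta 2 lets an x86-64 Homebrew live alongside the native ARM one
  # (x86-64 brew installs under /usr/local, ARM brew under /opt/homebrew):
  arch -x86_64 /usr/local/bin/brew install somepackage
  /opt/homebrew/bin/brew install somepackage

  # Build images explicitly for the team's platform instead of the host default:
  docker buildx build --platform linux/amd64 -t myapp:latest .

  # Sanity-check an image's architecture before pushing:
  docker image inspect myapp:latest --format '{{.Architecture}}'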


I really hate the feet. Just the bottom half of the form factor looks like they're going back to 2006 design.


Very ugly, but luckily you don't look at that part of a laptop much. Especially on Macs, the rubber feet just fall off and the case ends up scratched and marred over time.


Yeah, I have a 2012(?) model, and one of my only complaints is that the feet have all worn flat and smooth and feel like they are going to fall off. And if I want to overthink it, it's probably one of the cheapest-feeling things about the machine.

What do you mean the rubber feet don't stay on? They make millions of these, milled out of aluminum, and they can't keep the feet on?

Personally, I would trade a lot of design elegance for some hardy laptop feet.


I have two 2012-era Macs, and both are on their second or third set of rubber feet by now. The adhesive just gives out over time.


If the M1 Max maxes out at 64GB, does that pretty much imply no new Mac Pro until the M2?


The max memory isn't an indication of that; the M1's max of 64GB is limited by LPDDR5 module capacities.

What would block a new Mac Pro is instead DDR4 or DDR5 support (in place of LPDDR5), and also PCIe lanes. Both would likely require yet another change in silicon design.


It looks like it; Apple has previously indicated Apple Silicon Mac Pros will come in 2022. That will be interesting to see. I'm wondering how the economics of a super-high-end processor exclusively for the Mac Pro could work out. It seems unlikely a single chip aimed just at the Mac Pro market could be economically viable. I wouldn't rule out a multi-CPU architecture with dual high-end M2 processors.


Yup. They will have 20- and 40-core arrangements, per Gurman:

https://9to5mac.com/2021/08/01/apple-silicon-roadmap-gurman-...


The rumored 20-core M1 Extreme and 40-core M1 Plaid haven't been announced yet.


Was so hoping for a Mini. Sigh.


If I could get my current M1 mini with the 64GB chip, it would be absolutely perfect.


Gotta be just a matter of time.


Same


Soon I bet. Probably only so much they can ship right now with all the supply issues.


Yeah, some of the configs are already estimating December ship dates, and that's like 20 minutes after the store opened. They are not going to be able to make enough Pro chips.


I can live without HDMI, but it never made sense to me to put SD on the chopping block. In what sense was USB-C ever supposed to be an alternative to SD card slots?

Edit: For clarity, read my reply to anamexis before responding to this.


Opposite for me - I almost never used the SD slot, so a dongle was no problem for me, but I used HDMI all the time.


I'm not commenting on whether storage expansion or video output is more important. That obviously depends on the person.

I'm saying that while USB-C is an effective solution for video output, and therefore can conceivably overtake HDMI at some point in the future, it doesn't address the same use case as SD at all.

I can use an SD card slot for permanent expandable storage. I can stick a terabyte card in there and forget about it until I need to move it into a new device. USB-C doesn't help with that use case at all. It's actually worse than USB-A, because at least the latter has decent slim fit drives available that will stay in without sticking out too much.


Hidden in your comment is the intent: Apple was selling the MBP with iPhone marketing intentions. You can put more of a $$ premium on storage if the SD card slot does not exist, much like on an iPhone.


>and therefore can conceivably overtake HDMI at some point in the future,

HDMI will live on forever in the TV space and in professional equipment. USB-C / DisplayPort was only ever an effective solution in the computing industry, not in consumer / professional electronics.

Same with the SD card; both are targeted at video / graphics professionals.


Does USB-C have some fundamental limitation or downside compared to HDMI? As a non-expert in HDMI or the needs of professional TV equipment, that isn't clear to me. The only thing that comes to mind is possibly increased attack surface (BadUSB-type attacks).

I'm of course speaking without consideration for whatever momentum the two solutions may or may not have in the current marketplace.


You can't run 10-meter USB-C cables for video, unlike HDMI.


And this debate is why Framework's approach to port modularity makes so much sense and why it's not just a gimmick, despite what early critics were saying around here.


Even for users who want the default ports, having the ability to swap out ports that have worn out without having to replace the entire logic board would be a huge win.


HDMI makes more sense, IMO. You can hook up an SD card with an adapter, or connect the camera via USB-C or wirelessly. With displays you aren't as flexible.


There is no display that accepts HDMI that would not work with a USB-C -> HDMI adapter. Add in the fact that HDMI is generally inferior to connectors like DisplayPort and you have a wasted slot. I can almost understand the SD card, but the inclusion of HDMI is just a waste of port space.


Maybe I'm just unlucky, but I've had consistently bad results with USB->HDMI adapters. From not working at all, to random CPU spikes, to my current situation of the external monitor intermittently flashing blue. This is with both Apple and third-party dongles.

> the inclusion of HDMI is just a waste of port space

This is Apple; they'd have zero ports if they could get away with it. Clearly they've done research and found that it's important to a significant portion of the target market.


I'm guessing it's mostly for people connecting to projectors in conference rooms.


Lots of people use SD cards as basically "permanent" expandable storage. And this is something that's much more necessary on lesser MacBooks because they come with such limited standard storage. A 256GB SD card is under $40, while adding that much storage to the SSD in the MacBook costs hundreds.

HDMI is just one of the many display ports around, and the people using it are probably tied to a desk or something, so having a dongle for it is really no big deal.

I don't mind using my Anker for HDMI, but it sucks that I have to use it for SD.


I had an SD card for extended storage in my MacBook (until I upgraded the SSD - those were the times), but if we're being honest, this is an issue created by the ridiculous storage prices Apple charges. Otherwise you'd just buy a large enough SSD without agonizing over whether you'd even need it.


That's part of the problem, but it's also convenient to have a storage card that you can easily move between machines as needed.


I stopped using SD cards around the time that my iPhone replaced my DSLR. And when I worked in more advanced settings (photojournalism, film school) it was CompactFlash anyway.


> I can live without HDMI, but it never made sense to me to put SD on the chopping block.

HDMI means external monitor. For some (most?) an external monitor is a must-have.


HDMI => external monitor, but external monitor !=> HDMI


USB-C supports video out.


In an average city block, I'd expect that ~99% of displays—TVs, monitors, projectors—would support HDMI-in. Plus A/V receivers, for that matter.

Maybe, maybe 5% would support DisplayPort over USB-C (or anything else that lets you use USB-C for video in), and that's after a few years of Apple pushing it as The Next Big Thing. I fully expect HDMI will be more common than DP+USB-C in five more years, and quite likely in ten. For one thing, it's really nice to install in offices and houses because you can have long runs of it (50+ feet), with relatively cheap cable (compared to video-capable USB-C, certainly) and it'll still work. If something replaces it, I think it'll either be wireless or some other cable, not USB-C.


To be clear, I'm explicitly not commenting on market penetration of either solution, but rather their inherent capabilities. I don't see USB-to-HDMI cables or adapters as particularly cumbersome, whereas there isn't a way to just cram a storage module fully inside a USB-C port as though it were an SD slot.

As far as the cost of HDMI vs USB-C for long runs, that's a good point. I could see similarly cheap/efficient video-only USB-C cables potentially addressing that use case, but I don't know enough about the internals of USB-C to comment on whether that's viable. I could also see HDMI being used only behind the scenes in the future (like how computers don't typically have fiber optic or coaxial ports).

At the end of the day, it's not like I'm suggesting that HDMI should be removed from computers. My point is that the argument for removing it (whether or not you agree with that argument) never applied to SD ports in the first place.


USB-C monitors aren't particularly common, and it can be hard to find good ones. I don't think LG makes their 5K UltraFine anymore, but they still make 4K monitors with USB-C video. It's pretty convenient because one cable connects to the monitor and provides power to the laptop.


But only over much shorter cables (up to 1 m) than full-size DisplayPort or HDMI.

In my home, I use 2 monitors. One of them is close enough to the laptop that a USB-C cable can be used. The second is too far away. If the laptop had no DisplayPort or HDMI, I would have needed a dongle.


There is a large selection of 2m+ USB-C to HDMI cables.


Those include the USB-C to HDMI converter in the connector, instead of in a separate dongle.

It is more convenient than a separate dongle, but not as convenient as having an HDMI connector on the laptop.

If you have HDMI on the laptop, you might find an HDMI cable wherever you go. With only USB-C, your only option is to always carry the USB-C to HDMI converter cable with you.


> In what sense was USB-C ever supposed to be an alternative to SD card slots?

You use a USB-C dongle.

99% of people never use SD cards so don't care. The people who do care can use a dongle.


The point is that people who do care want an always-in solution.

I have two low-profile always-in peripherals in my 2017 Air: an SD card and a YubiKey. I have the same on my modern Dell Precision (which is otherwise an MBP-esque all-USB-C chassis). These can be taken out, but rarely need to be, and are always available. That setup wasn't possible with any current Apple device (until now).


Apple put the SD card slot back in, we can stop pretending now.


Yeah, I think it's an unfortunate step backwards. Now I have to carry around an SD card reader that I'll never use, hard-connected to my system.


If you never use it... why do you have to carry it around?


Because it’s built-in. I can’t leave it behind. That’s the whole point. It’s also e-waste - a whole slot most people never use.


I am pretty sure 99% of MacBook Air users may not care.

But the MacBook Pro has a different audience. Lots (if not all) of media professionals need it. And I am willing to bet it is more than 1% of the MBP user base.


What % of people are media professionals? Tiny.


I seriously doubt Adobe would bother maintaining Photoshop, Lightroom, and Premiere Pro on the Mac platform if it were tiny. I would not be surprised if it is a decent double-digit percentage, something like 30%+.

Before web development and iOS programming were a thing, they were the majority of professional Mac users.


Ahh, Apple, always looking out for the 1% of users!


And here I am with my Lenovo S10-3, still doing fine in 2021. My thinking is that if I can produce something fast with this machine, then half of the battle has already been won, because by default I'm optimizing for the lowest possible spec of my target market.

Obviously, I know this won't apply to everyone, but I hope this will keep working for me for a couple more years.


I'm using a 6 year old ThinkPad X250 with 4GB of RAM.

The benefit of using it is that I have zero tolerance for bloated software because it just won't run on my machine. This includes the software I make.

And you're right, developers SHOULD be using the lowest possible spec of the target market. Whenever I visit some bloated SPA that crashes my browser I always imagine the developer being some smug chap with a fully specced MacBook Pro patting themselves on the back for what a smooth website they've built.


The IdeaPad? Is it your main computer? I have an S10-2 I bought for installing a transflective display, and I can't imagine using it for much more than noodling around. I just wish it were slightly larger, to accommodate a decent keyboard and a slim battery.


I have another machine for mobile app development. It's also used for consuming multimedia and gaming. The low spec machine is mostly for web development and my main Linux terminal for managing other local/remote machines.


For me, it's the memory bandwidth. No other CPU comes even close. A Ryzen 5950X can only transfer about 43GB/s; this thing promises 400GB/s on the highest-end model.


The 16" has a 140w charger, the 14" has a 97w charger. Did the TSA stop limiting chargers to 97-100w? I priced both w/32core 32gb 512ssd and the 16" is only $299 more than the 14" .. I want the 14" though, I'd like that 140w charger..

edit: I confused this with the battery maximums, nevermind! Thanks for letting me know!


TSA regulates the SIZE of the battery, not the charging mechanism.

https://www.tsa.gov/travel/security-screening/whatcanibring/...


The limit is on battery capacity at 100Wh, not charger wattage.


I would be extremely surprised if the TSA started checking the wattage on power supplies. I've never even done the separate liquids thing.


That's a good point. I've never had a bigger charger than the 97W one, so I've never had to think twice about it. I wonder if this will be an obviously-140W brick; when I see bricks like that on gaming laptops, they're usually meant to sit on the floor, not plug into the wall directly.


The flight limit is on the battery, capping it at 100Wh. I have a laptop with a 230W power brick and have never had any problems.



64GB RAM, 400GB/s! Ports! No Touch Bar! But 64GB RAM on a MacBook, finally! Sold! Mine arrives next week!!!! And it is less than $4,000!


These are some top notch MacBooks!


They are a touch screeny if you ask me.


I hate the notch. I wish I could get the 14-inch body with the plain M1 (not Pro/Max) chip in it for under $1,200. All I want is last year's MBP with ports, but I don't feel the need to spend $2K on a laptop I use infrequently.


All laptop manufacturers have relied on Intel for years. That was fine when Intel was competitive. But now all those manufacturers are massively behind. They can either wait for Intel to catch up (unlikely), switch to AMD, which is better, but still behind, or they can try to move to ARM. Though that's really hard since they're relying on Windows.

Really this is a massive miss from Microsoft and their partners that many saw coming years ago. It's obvious that this is precisely why Apple likes to bring tech in-house. To avoid depending on something that isn't competitive.


The cool thing about these Macs is that Apple is starting to decouple the processor from the type of Mac: you can choose between a Pro and a Max. If they do this for the whole product line, it would be interesting.


As a PC user, I'm sold and looking to get one soon.


I know what you mean; my current PC laptop is a horrible experience, and I used MBPs exclusively before this one. It's a Dell XPS 13 2-in-1. It's so, so cool and pretty, but it throttles so much that it's absolutely useless for dev work. I got the full-blown $2,500-ish 32GB/16-core config or whatever, too. Such a waste; I'm sending it to my mom and getting a 14" whenever I can.


I recently shelled out €1,200-ish on a new laptop too, and it already looks archaic compared to this.

The Dell XPS lineup is horribly overpriced with notably worse performance, and the lack of a Ryzen option kept me away from it.

I love my Lenovo's keyboard, though.


As beautiful as the hardware is, I just cannot stomach buying another machine I cannot comfortably replace the battery in. Either Apple returns to its earliest roots of making machines that are easy to repair, or they'll find me and many others buying machines from the likes of Frame.work instead.


Is it difficult to replace the battery in these new models? I changed the battery in my 2016-ish MacBook Air in about 5 minutes, and it looks like my 2018 MacBook Pro is the same design.


It's extremely difficult (and risky) to replace the battery in any 2016+ MacBook Pro. In fact, it's so difficult, even Apple won't do it, so when you send a MBP in for a "battery replacement", they replace the entire top case instead of just the battery [1].

The Air models have different designs. As the other commenter pointed out, you're probably referring to the early 2017 Air, for which battery replacement is a breeze. The 2018+ Air has adhesive pull strips, making replacement relatively easy: https://www.ifixit.com/Guide/MacBook+Air+13%E2%80%9D+Retina+...

If Apple hasn't switched to adhesive pull strips for these models, it's very disingenuous to advertise that they were "Designed with the earth in mind" like this website says. The battery is the only "consumable" component in the laptop, so it's completely unacceptable for it not to be designed for replacement.

[1] https://www.macrumors.com/2018/11/07/2018-macbook-air-batter...

> the battery can be individually replaced in the new MacBook Air [...] In all other MacBook and MacBook Pro models with a Retina display released since 2012, when a customer has required a battery replacement, Apple has replaced the entire top case enclosure, including the keyboard and trackpad.

This implies that (at least from 2016-2019) they were replacing the entire top case assembly in MBPs.


> The battery is the only "consumable" component in the laptop, so it's completely unacceptable for it not to be designed for replacement.

Unfortunately, SSDs don't last forever either.


iFixit doesn't really agree with you about the MacBook Pro, with its glued-in battery. The MacBook Air, however, was ironically easier to fix:

https://www.ifixit.com/Guide/MacBook+Pro+15-Inch+Touch+Bar+2...

https://www.ifixit.com/Guide/MacBook+Air+13-Inch+Early+2017+...


I guess I'm pretty much alone here, but I don't really like these new MacBooks.

The things I'm happy about:

  - Function keys are back!
  - They kept Touch ID
  - Chips are probably pretty good
  - Headphone jack, yay!
But what I really dislike:

  - The case design looks kind of outdated. I'm getting MacBook Pro 2006 vibes here.
  - The base price starts at $2,000 (€2,250)!
  - The notch... On the iPhone, the notch was justified by a whole sensor array for Face ID; now we get something similarly sized for a single 1080p camera. With a hole-punch I wouldn't have said anything, but here I'd rather have 2-3cm of bezel than a big notch.
  - I really bought into the USB-C future! I know that's not the case for everyone, and the addition of the SD card reader is welcome. But the HDMI port seems a bit... strange to me. It doesn't exactly cry "future of connectivity". I charge my MacBook with USB-C, I connect my screens with USB-C (to DisplayPort), and for my occasional USB-A and HDMI needs I get out my one dongle. This actually seems like a step backwards to me.
I know lots of these are highly opinionated, but...yeah: Bummer for me.


You lose one USB-C port to get all the others. I've never needed 4 USB-C ports at a time, but I have definitely needed all the others. And one USB-C port is replaced by power, which I often had to burn a port on anyway. So whether you bought into the full-USB-C future or not, I don't see the extra ports hurting you. Maybe a 'whatever', but I don't see it as a reason to dislike the machine.


I guess my line of thinking is that Apple going full USB-C was one major reason the port took off for computers (USB sticks, drives, monitors, etc.). It feels like they surrendered after winning.

I never really cared for MagSafe, and I'm also not super annoyed about it being there (it's just an old/new proprietary charging port), but the HDMI port... that creates the same feeling for me as if they had added a USB-A or VGA port, honestly.


I don't really see it as surrendering at all. The vast majority of consumers don't need the power the Macbook Pros offer. I see this more as an (IMO overdue) acknowledgement of the target audience. The M1 iMac and the M1 Macbook only have USB-C ports.


I was really hoping for WiFi 6E and Bluetooth 5.2, but besides that I'm floored!


I really wished for a 120Hz 6K external monitor that doesn't have all the HDR / dimming-zone fluff and doesn't cost $6,000 USD.


These devices look incredible to me. Can anything come even close to their performance per watt? And with a great screen, sound, and a good webcam?

And as an iPhone user, the notch really isn't a big deal. In fact it's a positive, because you get more usable screen space.

Yes, they're expensive, but for something we use every day for 8+ hours, it seems worth it.


Amazing update. I still wish the M1 Air supported 2 external displays, though, as I would still prefer the smaller form factor, lighter weight, fanless design, and cheaper price, since I don't need the Pro power. Hopefully the refreshed Air will get support for 2 external displays sooner rather than later, as it is otherwise perfect.


If you use DisplayPort, you can add as many as six external displays on an M1 Air [1].

[1]: https://www.macrumors.com/2020/11/24/m1-macs-able-to-run-six...


DisplayPort != DisplayLink.

DisplayLink hardware isn't known to be super stable - since it all runs through USB, things can get slow or laggy, and can sometimes be subject to bandwidth limitations (at the port, the controller, etc.).

DisplayLink also uses software encoding, I believe - theoretically this shouldn't be an issue for the CPUs in this machine (they can handle something like that pretty easily), but there is a noticeable performance difference versus connecting displays directly via HDMI or via a TB3 <-> DisplayPort connector.

You're also reliant on DisplayLink's drivers, which haven't always been the best on macOS. I've heard that this was fixed more recently, but haven't tested.


I wonder what this means for the new M1 Pro then. Can I have 2 external monitors AND the integrated screen on? I guess we'll have to try.


Yes, you can.


I have a feeling that if they had released an updated Mac Mini with the new chips, it would have appreciably dented the sales of new MacBook Pros.

Too many people don't agree with the notch aesthetic and would probably pair a discrete unit with their preferred display. Right now Apple doesn't want their racehorse being outrun by an underdog.


I feel like I'm alone with this opinion, but I can't buy one of these, as much as I'd like to. I still have to use Windows half of the time because of software that's missing on macOS. So either I buy and maintain 2 laptops or settle for one, and since pragmatic wins over shiny, I'll get a boring old Windows machine.


Looks like the news of the MacBook Pro's death has been greatly exaggerated.


Well, Apple decided to stop digging their grave and finally recognized that the 2016 MBP was just a big mistake. No ports other than USB-C, no MagSafe, the Touch Bar, a crappy keyboard...


Does anyone know the implications of the new M1 chips for neural network training? If I understand correctly, we'll see up to 64GB of GPU-addressable memory on the new M1 processors? The processing performance will not match a dedicated graphics card, but 64GB is a game changer, especially for large models, isn't it?
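
The only route I'm aware of so far is Apple's Metal plugin for TensorFlow (as far as I know, PyTorch can't use the Apple GPU yet). A minimal smoke test, assuming the tensorflow-macos and tensorflow-metal packages:

  # Install Apple's TF build plus the Metal PluggableDevice plugin:
  python3 -m pip install tensorflow-macos tensorflow-metal

  # The GPU should then show up as a TensorFlow device:
  python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"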


Hopefully they didn't change much in terms of the software interface, so Asahi Linux will work [1] out of the box.

[1] https://asahilinux.org/2021/10/progress-report-september-202...


How many external monitors does it support? I got an M1 Mini only to find out it could only drive 2 4K monitors, and the M1 notebooks can only use 1 external 4K monitor. I'm not getting one of these until I can run 3 external 4K monitors like I'm doing on my 2019 MacBook Pro 16-inch right now.


2 or 4 external 4K monitors, depending on whether you choose Pro or Max.

M1 Pro - 2x 6K monitors plus the built-in display

M1 Max - 3x 6K displays plus an additional 4K display plus the built-in display

External displays are 60Hz only.


Even if only one external display is connected? That's a bummer. Having a high-refresh-rate laptop screen alongside a 60Hz monitor is not good.


I was falling in love with this new MacBook until I saw the notch. They had to add a notch to the screen. Why?!


Looking at the empty space in the center of whatever they call the top menu bar in macOS on my machine, I can kinda see why they'd think it was OK to claim some territory there.

It will probably be annoying to people who use programs in full-screen mode, I guess, unless that mode just blacks out the whole section and treats it as bezel, which I assume is what all video players will do in full-screen mode regardless.


Or they shrunk the existing "notch" from the full width of the display down to the least-used real estate on the machine.


HDMI and MagSafe are back!


Honestly, both of those combined with physical function keys matter more to me than any processor update they could've delivered. All around, this is just a very solid machine that's completely worthy of the Pro title.


I really hope the MagSafe cable is not fixed to the power brick; that is the only bad memory I have of MagSafe 2.


Luckily it's not; the cable is a 2m MagSafe-to-USB-C.


I’m seriously considering buying a middle-spec M1 MBP, and I also have a preorder for a DIY Edition Framework laptop[0]. I realize devices which give users more freedom to tinker, repair, customize, improve, etc. may never compete for market share with locked-down devices which are harder to repair, upgrade, and recycle, but my hope is that the former will at least remain a viable option. It’s tempting to splurge on a top-of-the-line MBP, but I prefer to split my money so at least some of it goes toward sustainable and freedom-respecting computing.

[0] https://frame.work/laptop-diy-edition


Well, you get to vote with your wallet.

I'm also deciding between the M1 Pro and the Framework Laptop, but I'm not sure the Framework will be on par with what Apple delivers function-wise.

Regarding Framework using an Intel chip, I haven't really seen a strong argument comparing its specs with those of the M1 other than repairability, and I'm not sure that's everyone's priority. Sure, if you've ever spent $2K repairing your Mac because you don't have AppleCare, repairability becomes an important issue.


It's not just repairability. It's that everyone in this thread says "Apple gave us HDMI! They got rid of the Touch Bar!" Maybe this falls under "upgradability", but the idea of Framework is that the user decides what to include. Like with FOSS vs. proprietary software: if the developers remove a popular UI element, there's a good chance you can add a compiler flag or find a fork with what you're looking for. (This is the idea, but we'll see if it pans out in reality. I don't like Framework's smooshed arrow keys; the idea is that I can switch them out for a 3rd-party keyboard I like. The reality is that there is so far no 3rd-party keyboard.)


> the idea of Framework is that the user decides what to include.

I agree that this is true, at least in theory. I also had a preorder for the Framework laptop. In practice, the problem is that modularity in this case seems to come at the cost of fewer ports, even if those ports are more flexible. The Framework has up to 3 available USB-C ports after accommodating a charger. That matches the MBP in USB, but the Mac also adds HDMI and the SD card reader. One could even make this case with some other laptops like the X1 Carbon (2 USB-C, 2 USB-A, HDMI) compared to the Framework, for those who would have otherwise equipped the Framework with a few USB-A ports.

Obviously, being able to change these ports has long-term benefits where the Framework shines, but personally I can't see my preferred ports changing enough to offset the cost of modularity.


It's really hard to tell if the $800 upgrade from 16GB to 32GB of RAM is worth it, since you also get a mysterious 16-core-GPU to 32-core-GPU upgrade...

  10-Core CPU, 16-Core GPU, 16GB Unified Memory, 1TB SSD Storage: $2,699.00
  10-Core CPU, 32-Core GPU, 32GB Unified Memory, 1TB SSD Storage: $3,499.00

Both are otherwise identical: 16-core Neural Engine, 16-inch Liquid Retina XDR display, three Thunderbolt 4 ports, HDMI port, SDXC card slot, MagSafe 3 port, Magic Keyboard with Touch ID, Force Touch trackpad, 140W USB-C power adapter.


The M1 Max also has twice the memory channels to feed that GPU. Instead of a 256-bit-wide "quad-channel" memory controller, it has a 512-bit-wide "octa-channel" memory controller. That's an immense amount of bandwidth with LPDDR5.

There's an in-between 24-core GPU option for less money, but it's still quite the jump.
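
Back-of-the-envelope, assuming the LPDDR5 runs at 6400 MT/s (an assumption, but it's what the stated figures imply):

  256 bits x 6400 MT/s / 8 = 204.8 GB/s  (M1 Pro, "200GB/s")
  512 bits x 6400 MT/s / 8 = 409.6 GB/s  (M1 Max, "400GB/s")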


$1999 starting price. Even higher than predicted.


These prices all feel like standard MacBook Pro pricing to me. I just ordered one (M1 Max, 64GB, 2TB) and it cost as much as that tier always seems to cost me.


On the other hand, the 16" max ram 64GB, is less than $4000, which is less than I paid for my 2018 32GB RAM i9 pro.


These all look amazing. Interesting to see them drop the popular 15.4" size. All my accessories sized for that form factor will have to be purchased again; I'm thinking especially of the vertical dock I have.


I am not interested in Apple's ecosystem. While I stay with x86, I wonder if and when AMD and Intel will catch up, or if another ARM chip maker will release a chip this good without tying it to a proprietary system.


At this point, I think AMD and Intel are _at least_ 5 years away from being competitive if they shifted all R&D to ARM and started funneling insane amounts of money into it. Without that shift in focus though, it could be a decade? More?

Apple has funneled all of their success from mobile straight to the desktop, and they are actively watching any competition for a sign of increased development. This is basically still the first generation of Apple Silicon arguably and it's absolutely bonkers. Let's say Intel tries to shift to ARM for some new product. Apple's first response is going to include releasing some crazy spec boost that makes Intel's attempt look like a joke, with the likely intention to discourage them from trying again.

I think it's more likely at this point, that we see Apple assume a sort of leader role similar to what Intel has been losing since the release of Ryzen. Apple will hold the lead so long that they get complacent until someone can slowly sneak in a new product that leap frogs their technology.

Intel held that top spot for how long though? Over a decade? Almost two? I can easily see Apple taking a similar foothold if they can bring the price down over time, similar to older iPhone models. Imagine buying a "new" $400 laptop in a few years with an M1 Pro/Max chip inside because it's just old tech at that point. Apple already does this sort of chip recycling with products like the HomePod and Apple TV. Both use SoCs from older iPhones that cost 5x as much when they launched.

The fight between Intel and AMD was already interesting enough, but adding in Apple is going to make the next decade a roller coaster! Best of all, it's a win/win for consumers as these companies try and out compete each other.

I'm excited for it!


I don't think the red and blue teams are that far behind. A high-end AMD mobile processor like the Ryzen 7 5800HS easily beats the current M1 in most CPU-bound multithreaded workloads, like compile times. Even if the numbers are true and the M1 Max is 75% faster than the current M1, it will just be slightly faster than the AMD chip, which is produced on an inferior process node. And you can get the 5800HS in laptops that cost like $800.

The next generation of Intel CPUs, Alder Lake, which also splits performance and efficiency cores, showed some promising numbers in recent leaks and should be competitive with the M1 Max in CPU-bound workloads. Where Apple shines is all the proprietary additions to the CPU that will massively benefit content creators (video & audio), but as a developer I don't think I can profit much from them, and I prefer to stick to open standards and technologies.

I'm also super excited and can't wait to see what the next years bring us. As you said it, we are going to be the biggest winners here.


$2,500 USD (£1,800 GBP) to buy in the US vs. £2,500 GBP here. Apple are a total rip-off.


Tariffs must be part of this.


Still no LTE modem. The Apple Watch can be configured with an LTE modem, but the highest-end MacBook Pro cannot. Is this because of Qualcomm royalties? Is there any technical reason why the market has failed?


I imagine it's because tethering with your phone is good enough for the vast majority of people.


Why does Dell ship laptops with one?


I assume Dell still offers dozens of laptop configurations, so it’s easier to find your preferred combination of features (if you don’t mind researching).

Apple keeps a much more streamlined lineup than any other major laptop manufacturer.


I travel a lot. I hate having to babysit the tethering and to worry about using up my phone's battery when I am trying to get some work done. It would be awesome to have an Internet connection that works all the time, as with iPhones, iPads, Watches, and laptops from other companies.

It is obviously not a significant engineering challenge to put another radio in the laptop if the Watch has it.

I read somewhere that Apple has to pay Qualcomm a royalty based on the price of the device and that this is why MacBook Pro buyers cannot have an LTE modem. I don't understand how other vendors are able to ship an LTE modem if this is the case. Can anybody enlighten me?


You can plug your phone into your laptop, and then you don't have to worry about the battery. Why waste money and resources on duplicating something you're going to have on-hand whenever you're using your laptop anyway?

Apple has made it a particularly seamless experience between macOS and iPhones, too. As long as you're on the same iCloud account (or iCloud family) as the phone, the phone will always show up in the WiFi list as a hotspot. You can work off your phone's internet without even pulling it out of your pocket to set anything up.


I know exactly how much of a hassle it is. I use it several times a week.

Edit: Someone brought this to my attention: https://www.pcmag.com/opinions/why-apples-macbook-pro-will-g...


Both 14" and 16" can use the highest spec Pro processor. But the 14" only needs 96W while the 16" can go up to 140W.

Does that mean that the 14" will be throttled on heavy workloads?


Good question. It also has a smaller battery. I initially ordered the 14" with the M1 Max and 64GB of RAM but then switched to the 16" because of the 17-hour vs 21-hour battery life.


I have been very happy with my M1 systems, and I’m looking forward to getting function keys back. However, the SD card capability is coming back at a time when pros are moving to CFExpress cards.

Unfortunately, there are some form factor differences between these new, much faster cards used in higher-end Sony, Canon, and Nikon cameras. It's a bit confusing since there are three types of CFExpress card, and I wonder if the new MacBook Pros can read any of them.


No FaceID. Interesting.


Why is no one talking about the base 14" with the 8-core CPU and 14-core GPU? There wasn't a single mention of it in the presentation.

How does the new 8-core M1 Pro compare to the 8-core M1?


Unpopular opinion: I am surprised by how much people like MagSafe. Charging over usb-c seems so ideal: one cable for all of my devices. And in a decade, the number of times I've tripped or banged on the cable in a way that having MagSafe would have made a difference is zero. Just trying to understand others I guess... What makes it so appealing, given you have to carry around more stuff to use it?


USB-C locks into place - not ideal for when someone trips on your cable and brings your laptop crashing onto the hard floor.


Yes... I understand. My question was why this seems so desirable when it requires carrying more equipment, and it seems like an unlikely scenario most of the time.


Different strokes for different folks. I personally already have to bring micro USB, USB-C, Lightning, and a couple of proprietary chargers wherever I go, plus MagSafe 2 when I bring my personal laptop... so it's not like I'm going from an all-USB-C world just because of MagSafe. And I have tripped over my cables many times and been saved by MagSafe. My work laptop has come crashing to the ground a couple of times so far because a family member's dog pushed on the cable... I don't want to have to deal with that risk profile for my personal laptop!


Anyone else getting an old PowerBook vibe off of these?


These are the best Mac laptops in YEARS. I'm only 2 years into a usually-3-to-5-year replacement cycle, and I'm VERY tempted, especially with the trade-in offer on my 2019 model.

I guess it was Jony Ive that was pushing the "thinness above everything else" mantra that gave us embarrassing keyboards and no ports other than USB-C. I'm very glad he's gone and that's over.


Oof. I was hoping at least 1 version of the 14 would be closer to $1500. These are just too expensive for me. For that reason, I'm out...


No one seems to be commenting on the 140 watt power adapter.

With the caveat that I already dropped $7k (after tax) on a top-spec model, I am very interested in seeing how hot these get. Particularly in light of how cool the M1 runs.

Prior to my current M1 MBP, my daily driver was a maxed-out 16" MBP. It's a very solid computer, but it functions just as well as a space heater.

And its power brick is only 100 watts...


The 140W power adapter is for fast charging, I believe.


Too bad Apple shows performance compared only to Intel CPUs. I'd like to see it also compared to the 8-core M1.

Edit: Correction. The other link, directly about the new processors, does give comparisons to the standard M1. It's really impressive! But it's also unnecessary unless you're doing some very heavy, specialized work. Normal modern full-stack development probably won't be noticeably faster.


Having used the M1 Air for almost a year now, I can't imagine going back to a laptop with a fan. No matter what I do, there is never any noise. I'm sure the fans in the Pro models almost never come on, and you could also install something to limit the max CPU frequency before the fan kicks in, but having no fan is the next level.


I'm happy to see MagSafe back. Even though I usually have mine sitting on a stand, I'm just lucky that I haven't yanked everything off my desk at this point. HDMI I couldn't care less about, but I suppose it provides another way to hook up to my display. I use a USB-C-to-DisplayPort connector most of the time, though.


The only thing I'm disappointed on is that I was really hoping for Ethernet-over-MagSafe like the iMac power adapter.


Hooray for the return of MagSafe and the other ports! Thank you for listening, Apple.

Just one mention of "games" and it's about the display resolution, so it sounds like Apple still doesn't care about getting more game developers on Mac.

> thermal systems move 50 percent more air, even at lower fan speeds.

Does that mean it's even louder than before?


Darn! My current company has a 4-year replacement policy. Guess I'll have to switch company now to get a 2021 model.


These laptops are very impressive, but until there is clarification on their CSAM scanning plans/their future intentions with that, it's going to make me wait. The thought of my OS being integrated with ANY government database with only the assurances of 'trust us, we will say no' still repels me.


It sucks that such brilliant hardware is effectively locked to such mediocre software and corporate policy.


I can't believe none of the news websites are using the title 'Apple turns its silicon up a notch'.


Weird that only the 13" Pro retains the touchbar. The 13" Pro is in a really weird place now. I don't know who it's for at this point. I can't think of any reason I'd choose it. I'd either get the Air or the 14" Pro. I suspect it won't remain in the lineup for long.


It's weird how the only Touch Bar Mac now is that weird M1 MacBook Pro, which nobody will be buying when the 14-inch is so much better for not much more money and the MacBook Air is almost as good for a lot less money. They went from every Pro MacBook sold having a Touch Bar to effectively none of them.


Ooh they're shipping the 16-inch with a 140w power adapter: https://www.apple.com/shop/product/MLYU3AM/A/

Sadly it looks like it's still not GaN; this thing is a beast...


As someone who has felt stuck for many years with his mid-2014 MBP, all I can say is: shut up and take my money.


2012 MacBook Air here. Yep, I ordered a new 16" machine minutes ago.

It's funny how people on the internet were crying, "16GB is completely unusable!" Ummm... I have 4GB, and it works fine for everything. I'm even running Catalina.

Maybe it's not good for games? I don't play computer games. Maybe that's where all the teeth-gnashing and kiddie posturing comes from.


It's not only games. Doing web dev with Docker instances up and running, many tabs, etc., eats through that RAM fast, so it's nice to have a lot of RAM to keep the dev experience snappy. That's my 2 cents, anyway.


MBP late 2012 right there with you


Have MacBook Pro sales been declining for the past few years? I can understand removing the Touch Bar as a failed experiment, but I thought Apple was obstinate about the move to USB-C. Really did not expect this backpedaling.

Edit - I'm definitely pleased with the move to add more ports. Just surprised.


Is there a reason Apple couldn't just implement MagSafe using a USB-C port?

Is the prong too long/deep to be workable?


The pricing suggests that the M1 mini was priced aggressively as a marketing strategy, and that now that the dust is settling the cost of the M1 powered macs is going to be less of a value win than I'd initially hoped.

Looking forward to actual use benchmarks for the things I'd use it for.


On paper, the M1 Pro 10-core CPUs are faster multicore than any Mac ever except for the Mac Pro and iMac Pro. They're faster than the i9-11900k.

It remains to be seen if they are as fast in practice as they are in theory, but if they are, I don't think they're overpriced.


Maybe they added the notch so that when they remove it next year, people will have to dish out another $2-5k to get the non-notch version. Especially given that there were a lot of rumors that the iPhone was going to get rid of its notch this year. Now that's likely to happen next year.


My iPhone 13 has an amazing front facing 1080p camera, FaceID scanner thing, speaker, and other stuff crammed into a small notch, but the notch on this is even larger and only has a camera? What gives? Why is the notch so wide?

(I'm still getting one to replace my 2012 MacBook Pro)


Your iPhone is a lot thicker than a MacBook's lid.


God I want Linux on these things. Apple have built some very sexy hardware, but I'm still not into the OS. It's just... sluggish (mostly in terms of productivity, not performance), too barebones, too basic and toy-like. I need a real OS to make my dream laptop.


Seems like a great improvement over the latest models (since 2015). Still, I'll wait until they fix issues with virtualization. I don't know how virtual machines will perform on the M1 but I doubt it will match performance on PC Linux machines.


I haven't been able to see if the DAC that powers the headphone port supports 24-bit/192 kHz yet. Being able to listen to Lossless audio without a USB DAC would be nice.

It seems possible that Apple would add that since it's a feature of Apple Music Lossless.


During the video, they quickly mentioned that the jack supports high-impedance headphones and later specifically mention that you can connect “high-fidelity” headphones. Skip to about 33:30 in the video. So while nothing shows up on the spec page on their website, I wonder if they’re taking audio output more seriously.


The notch doesn't look annoying on the Mac as that area is used by the menu bar anyway. However, I don't understand the reason for it since it doesn't have faceid. Why not just go for a punch hole design? What else is in that notch??


Does anyone have a handle on how the new M1 Max is expected to perform on deep learning training runs vs. an NVIDIA 1080 Ti / 2080 Ti? I think the 400 GB/s memory bandwidth and 64 GB unified memory will help, but can anyone extrapolate based on the M1?


If you find out, let me know! I'd love to know this, too.


Uhhhh, can the MacBooks charge via USB-C? Because I've been investing heavily in USB-C chargers (they are $80 and I bought 5 of them for traveling, different rooms, etc.). If they can't be used for charging anymore, I'm going to lose my shit for real.


Yes.


I put off a trip to Bali until this update. Last time, the screen cracked due to heat buildup with the lid closed and the fan trying to cool in a positive feedback loop. Had to fly to Singapore to replace it. This time, I'm carrying my (soon to be) old MBP as a backup.


Really happy with my M1 air for home use, but would really like one of these for work!


I'm curious about all the distaste people have for the notch. It's more screen space. Would people prefer a larger bezel? My guess is you'll have the option of a virtual bezel for apps that don't properly account for the notch.


I prefer a larger bezel because the notch takes the menu bar space, which is already scarce even without the notch. Some people may have many app icons and "widgets" (e.g. iStat Menus) on the menu bar.


When competitors like the XPS 13 have only about 1mm of extra bezel all around? Yes.


I still don’t understand. Why is a larger bezel something to be desired?


You get a nice, perfect rectangle. It goes nicely with all the other rectangles in the product (case, keyboard, trackpad). Two lines of symmetry everywhere. Now the main rectangle has a big chunk out of it marring it and removing the horizontal symmetry. The larger bezel keeps everything looking nice. It's not like an extra 1/2 inch on the top is going to let me put my laptop anywhere easier. (Except on a plane while the person in front of me is reclining. Which happens a couple times a year, I can live with that, compared to staring at the gash every day.)


I am so excited that the new MacBook Pro contains so many things that customers had been yearning for.

Couple that with 64GB RAM and the M1 Max, and if I am going to spend > $3000 on a laptop, this MBP is basically the only game in town for me.


A big notch and all of that for no FaceID? Only Apple could get away with that.


Killer specs, better keyboard (no touchbar), more ports, and yet the most frequent comment here is about the notch. HN is a tough crowd to say the least, in this instance IMO to the point of it being comical.


These look great, but why bother continuing to sell the 13" or the Air? Or said another way: if the Air and the 13" are basically your "loss leader", then why continue to sell both?


I'm so disappointed to trade a multifunction Thunderbolt port for a single-use HDMI port when the same thing can be done with a £5 cable. I wish I could get that M1 Pro/Max chip in my current MacBook Pro.


I have to admit, that was frustrating. I’ve been in one meeting, once, where I needed HDMI but didn’t have any dongles.


Apology accepted, but way to drag your heels.

You couldn’t just say sorry could you, Apple? You had to throw in a little barb at the same time. That’s ok: you wouldn’t be you if you weren’t overly prideful.

Escape keys are 1U.


I have a Framework laptop on order. But I must say that I am very tempted. If my attempt at daily-driving Linux doesn't work out, I will probably get the M1 Max 14" MBP.


At least there are good choices all around these days. Until very recently it was 'everything sucks'.


I dread manually resizing tiled windows every single time now. This is especially true for those of us who do code reviews in two side-by-side non-fullscreen windows.

> Comment moved from other post.


With all these computational capabilities and an upgraded camera, I’m surprised that Apple put a notch into their MacBook Pro line but left out FaceID. This seems like a bizarre omission.


Now as soon as they add a lightningbolt port (not to be confused with thunderbolt) for the (ahem) headphones jack, then we can finally do away with all of the hideous dongles.


I hope this lasts me 5 years like my 2016 MBP! Ordered, and aside from the notch (they are adding Face ID later, I take it? And Minority Report gestures?) I think it's PERFECT.


Wondering how videogames will play on a specced-out MB pro, and whether this insane amount of GPU power will drive some developers who also are gamers to this machine.


How come when you upgrade to the 24-core M1 Max, it adds $600 when the upgrade is listed as +$200? And when you try to remove the upgrade, the price drops by only $200.

Is this a bug?


I think the reason is that the upgraded CPU ($200 extra) automatically triggers an upgrade of the memory config from 16GB to 32GB ($400 extra). That explains the $600 increase.


Ah, that makes sense.


Looks like they focused a lot more on utility this go around. Going to try out the 16 inch but hoping it can fit in my bags well and isn't a hassle for traveling.


I guess Moore's law is slowing down?

The single-core benchmark for this is about 2x better than the score for my 2012 MacBook Pro. 2x in 10 years doesn't seem that great.


Pretty bummed about MagSafe.

No more taking one charger for 5 devices; now there's another single-purpose cable that isn't so easy to get a workable backup/replacement for.


You can still charge via thunderbolt / usb-c!! :)


Aaah that's nice. I hope they don't bundle the charger then.


Do they have an escape key? (This is not a joke question.)


Yes, there are plenty of pictures of it on the page https://www.apple.com/v/macbook-pro-14-and-16/a/images/overv...


Fixed everything they've broken in the past 5 years then added a notch... They haven't had a perfect machine in years, and the trend continues.


So close. But that camera notch would drive me nuts.

Curious: does it cut into a normal resolution/ratio or is it giving you extra pixels at the sides of the camera?


Extra pixels on top of 16:10, IIRC.


I want to hate it because I really wanted normal USB back, but I guess that ship has sailed.

I love it. I want one. Gonna be tough not to hit that order button!


And just like that, the outrage about the photo scanning malware Apple installed on their iPhone is forgotten.

"The screeching voices of the minority" indeed.


First of all, they never "installed" anything, it was a future feature. Second, they delayed it (possibly indefinitely) after the huge pushback.


I heard they reconsidered.


Interested whether the MagSafe cable ends in a power adapter or in a USB-C plug.

E.g. can I attach a USB-C-to-USB-C cable to the accompanying power brick when traveling?


It ends in USB-C =)


Anyone else think the new cross section looks a bit odd? Very flat on top.

I am glad they did not copy the rounded off corner keys from the new iMac keyboards, though.


Hurray! They finally built a machine that could replace my 2012 MacBook - a proper keyboard with fn keys, SD card slot and MagSafe. HURRAY!!!!


Beast machines for sure; Apple is doing a good job with their M-series chips. I can't wait for the next Intel commercial: complete meltdown.


Now I just wait for the 14-inch Air? All I want is a 16/32GB Air with this 14-inch screen. I feel like I'm always in the gaps of Apple's lineup.


The maxed out 16-inch model comes in at a cool $6099.


Yup, but no one needs 8TB of internal SSD space. ~$4000 maxed out with 1TB.


Can anyone recommend a dock that does 140w charging? Right now I use a Caldigit one but it only puts out 90w I think so I’ll need an upgrade.


Now if I could just get one with an ortho keyboard layout…

The 14-inch Max with 64GB of RAM looks fucking T A S T Y. If only my hands could laptop keyboard ={


Am I the only one who likes the Touch Bar? It was fast, useful, and configurable, and I rarely use the Fn keys anyway; nothing in macOS uses them.


Well my desktop PC picked today to decide it just wasn't going to boot up any more, making this a pretty straightforward decision...


The keybed now appears to be of a similar color to the keys. Does this indicate that the keyboards are now easily replaceable units?


Does anyone know if the faster internal drive is included on the base model? Storage speed makes a huge difference for practical use.


These are pretty much everything I’ve been waiting to upgrade to from my Pro 2015 model which has gotten very long in the tooth now.


This is beautiful... I'd pay $4k for a mac without a damned touch bar!

Now if they can just get rid of the legacy lightning jack on the iPhone.


Why does the notch have to be so big if it's only used for a camera unlike iPhone where it hosts a multitude of sensors?


Holy smokes. I was impressed. I can’t understand how some folks here are underwhelmed. Who is buying and what did you buy??


Come on! A 140W adapter? I thought that the efficient M1 design would let us return to the 80W range for the 16".


Apple has finally put a braided cable in their charger. I hope we can say goodbye to 20 years of charger cable breakage.


I’m so frustrated. I bought the 13inch M1 in July, and was bummed it only supports one external display and has only two USB-C ports. This makes it pretty frustrating to use at my workstation (can't use all my monitors, hard to connect my keyboard/mouse/peripherals). Not even 3 months later they release this? It feels like such a fucking gut punch, I would have returned my M1 had I known that this was coming…


The October refresh timeframe was posted pretty much everywhere on the internet that mentioned an updated M1 MacBook Pro. The M1 was released last November. This didn't come out of nowhere.


If I recall correctly, I think they were up front about more processors/updates coming later. The M1 is still a good computer, but that 13in was never going to be the same as the fully new gen of MacBook Pros.


Is 8GB more memory and an upgraded processor worth the $700 price difference from MacBook Pro M1 base model?


Did anyone catch how many external monitors these can drive, and at what resolutions and refresh rates?


Up to two external displays with up to 6K resolution at 60Hz at over a billion colors (M1 Pro), or up to three external displays with up to 6K resolution plus one external display with up to 4K resolution at 60Hz at over a billion colors (M1 Max).


The 1000-nit display (1600 nits "peak", whatever that means) is a major selling point for me!


The number of ports on a Mac seems inversely related to how much Jony Ive was involved in its creation.


The notch is awful. It took Apple years to finally upgrade their crappy 720p camera to a mere 1080p, and now they've added an ugly notch around it that ruins the aesthetics of every program in fullscreen. Even in normal mode I have several menu bar apps that show things like CPU load, bandwidth, etc., spanning almost the full width of the screen; the notch will certainly make that worse.


I prefer my Linux machine, but with hardware this good it feels kinda dumb to be on Intel/AMD...


Just configured an MBP 16" with M1 Max, 64GB RAM, and 2TB SSD. Delivery (USA): December 23rd. WTF?


Simply too pricey… my two cents. I like seeing the return of useful ports, but $2,000 is a bit high here.


Apple is offering $1400 for my 16” 2019 Machine. Really tempted to take it and put an order in.


As someone who loves ThinkPads, I want to cry. Apple is, finally, wiping the floor with them.


I'm curious as to why these much more energy-efficient chips require a significant upgrade of the power brick, now at 140W; that's more than 40W higher than with the Intel chips (and only for the 16" model).

Not a big deal, but I can't see what would require the massive upgrade (and I'm a bit frustrated that all my cables are only rated for 100W).


If they'd only add touchscreens. Sigh.

Edit: Ouch, so many downvotes just for asking for a feature. Weird.


I'm very, very happy they haven't done this yet.


Why? Just don't buy that model.


Because it's a crap idea and design; I'll use a tablet if I want to poke at things. A touchscreen on a laptop is like driving a car with joysticks: it's not ergonomic.

Also, Apple never does things half-assed; if they put a touchscreen in, they'll do it for all models, just like they did the Touch Bar.


Touchscreens are the big missing feature on macOS.

It's not useful as a primary input method, but after using a Windows Surface computer for some time I'm surprised how often the touch display is useful.

Lots of websites are optimized for touch / mobile first. Whether you are filling out a form or watching Netflix or Disney Plus, touching the screen is just much more convenient. Keyboard navigation is increasingly an afterthought on many websites.

Macs now support running iOS apps. Using them without a touch screen is going to be a very poor experience.

And finally, some things like annotating PDFs are really cumbersome without a touch display; when I need to do that on a Mac, I just print out the page, because using the trackpad or mouse for annotations is just not an option for me.


What for? I'm sure some people use them to draw and stuff but it seems pretty niche to me.


The laptop I bought for my dad has a touchscreen "by accident", as neither of us recognized it as a feature to go for. Now he says it's the best feature of the laptop, from pinching on maps to selecting checkboxes/radio buttons. I don't know which of us was more surprised to find it useful.


Interesting anecdote!


Ooohf, I can't stand having fingerprints on my screen. Also, touchscreens are risky when you have toddlers around.



That's bad UX unless the whole of macOS is redesigned to be used with fingers. Not happening soon.

Not saying never though.


> unless the whole of macOS is redesigned to be used with fingers

People use a lot of web apps that were designed mobile first, and using them without a touch screen sucks.

You wouldn't use the touch screen as primary input, but it would make a lot of things much easier.


It would also require the whole thing be designed differently.

The hinge would need to be sturdy enough to not wobble when the user is dragging or tapping with a finger. The body would need to be heavier than the screen by a big enough margin to not topple the whole thing over when tapping.


Wow, scrolling controls the animation frame by frame. Never seen that. Cool and frightening.


I don't miss front-end work, but Apple always has those fancy scroll effects going on, ha.


Cool, the screen has a chunk taken out of it, great stuff. I'm sure everyone will consider this a feature, and next year you'll hardly be able to get a PC laptop without a third of the top of the screen devoted to a notch and hard radii ground off the top corners, as well.

Glad I bought my M1 when I did. At least it's a normal laptop.


It has a chunk added to it. It's a 16:10 area with an extra 74 pixels on top for the menu bar. It renders full black in fullscreen apps.


It has two extra chunks of screen where before it was just black bezel? I'm not a user of these machines, nor do I use iPhones, but I still don't understand why people hate notches so much.


The best features are backflips.

Says a lot about recent Apple choices - the backflip says the most.


It's amusing that a lot of the features being hyped are simply Apple giving up on terrible design decisions they've been forcing for years and just providing what every other laptop does (a normal keyboard, no Touch Bar, physical function keys, ports for HDMI cables, headphones, SD cards).


Will we ever get another 11" MBA ?

Best laptop form factor of all time. Of all time.


That's the 13" M1 MBA now, it has basically the same dimensions (+-5mm)...


So which one? 14 or 16? As the main driver for a person who uses a desktop


I came into these comments and one guy said "The machine I spec'ced out only costs $4300" and another guy said "They just put back the ports and features they should have kept in 2011" and I noped out of these comments. Your mileage may vary.


So basically Apple just went back on the things they tried to fix that weren't broken, and then messed up the aesthetics by making it a design from 2010 and adding a notch. Why do they do this? Does the design team not understand how to design MacBook Pros?


> So basically Apple just went back on the things they tried to fix that weren't broken, and then messed up the aesthetics by making it a design from 2010 and adding a notch.

"Apple corrected nearly everything about their laptop design that people complain about, used a design this is considered by some to be the best MBP design ever [1], and put a small intrusion on the screen in the least-used-possible spot in order to provide more overall screen real estate. Why does Apple hate us?"

[1] https://marco.org/2017/11/14/best-laptop-ever


If it disturbs you, there will surely be software that moves the menu bar below the notch and makes the top black. This is a Mac and not an iOS device, after all. I'll take the extra pixels.


So, they were wrong when they removed those things, and they were wrong again when they put them back?


No, I meant they were right to just put back what wasn't broken. I guess I worded it wrong. I am just complaining about the design team adding the notch.


Wait, what notch? Do the processors have notches now?



I'm curious to know why they decided to keep the headphone jack. I thought the narrative was that users don't need it: it's obsolete and wireless audio is the new normal. Yet, it is still there. It seems slightly incoherent.


What is the closest thing to these that runs GNU/Linux?


ESC key still present!


And function keys!


Yay, MagSafe is back.

Boo, MagSafe is now a brand new one, incompatible with old chargers and even USB-C chargers. YAY! Rejoice!

Jesus, I hate Apple's policies on dongles and such.

But at least they acknowledged that everyone wants an HDMI connection. Everyone.


When will they open up the laptop specs/pricing?



Details available already on their online store, though it's unresponsive at the moment as everyone is trying to check it at once.


They already did. On the UK store:

14"

1. 8-Core CPU, 14-Core GPU, 16GB Unified Memory, 512GB SSD Storage - £1,899.00

2. 10-Core CPU, 16-Core GPU, 16GB Unified Memory, 1TB SSD Storage - £2,399.00


Still no OLED or MicroLED. And a new charger type... great!


2k USD for a laptop in 2021?

Apple's goddamn genius. Oh, it has a newer processor that'll make awful software such as Xcode and every internet browser a little faster? Well, I'm in!


To be fair, a decked out i9 laptop can easily end up in the same price range. So I don't think it's that outlandish really.


Sure. I'm making the argument that 2k for a laptop form factor is outlandish, not that there aren't products as bizarre as 'gaming laptops' that perhaps cost even more.

I have yet to see a mass market use-case for a powerful laptop. Who are these people that need to regularly go from place to place and need to do compute-intensive work in those various places?

I've been given laptops at work - everyone would've been better off with a mac mini 100% of the time (assuming you can upgrade those?) - the only people who don't connect their laptop to a big monitor anyway are people asking for neck and shoulder problems within a decade.


Why are the new models fatter, wider, and heavier than the old ones?

https://support.apple.com/kb/SP809?locale=en_US https://www.apple.com/macbook-pro-14-and-16/specs/

Shouldn't it be the other way around with the integrated chips, thinner screen, etc.?


People, including me, have been pleading with them for the past 7 years or so to stop chasing thinness over better battery life, thermals, and cost effectiveness.

I am delighted with this change. If you want light and thin get an Air; I want thermals, power, and bang for my buck.


I’m guessing:

1. Ports need more space

2. The thermal solution probably needs more space. MBPs have had terrible thermal solutions for the past several years. Obviously the M1 Pro/Max will be more power hungry than the M1, and that requires better cooling.


All the processor options are really confusing.


I just bought a new Air like 3 days ago...


> magsafe

Can we still use a USB-C port for charging?


Yes.


Nice, ports are back. Very nice. Prices are up. Not nice. The deal breaker is the notch. Sorry, Apple, I am a designer; I cannot look at this 24/7. Even if the top bar is black, the notch will be visible enough to distract my visual line. The more I look at this, the more I like the Framework laptop.

P.S. Happy downvoting, have a nice day and stay safe.


Agreed. Icing on the cake: an incredibly specced Framework laptop costs less than the starting price of the 14" MBP. And it has a 3:2 display ratio. And I can choose my ports!

I'm confused why so many people seem hellbent on convincing me that a notch is suddenly a good thing. Can't y'all understand that I have a personal preference for rectangular screens without any holes in them? I admit it doesn't make the laptop unusable, but IMO it's an unacceptable compromise in a $2k+ machine (which, if I specced it out, I'd end up paying close to $4k for).


I think everyone would agree the notch isn't good. But it's a tradeoff. Do you want a decent webcam? That's apparently the price we have to pay.


That's absolutely bonkers. I can think of 20 different ways to provide a decent web cam without having a notch in the screen. I would have preferred for them to leave the webcam off entirely before I would have ever settled on such an absurd solution.


Then go buy a laptop without a notch, I'll happily take the extra screen space that's usually wasted by the menu bar anyway.


I'd love to hear some of them…


My 3-year-old Samsung has a 3mm hole in the display for a camera that is better than this one.*

*Citation needed, but it's vastly better than the current-gen MBP's, and good enough for video calls.

Disclaimer: I don't care about the notch one bit, already ordered one, but saying that the notch is needed for a good camera is IMO stupid. IMO they wanted to add Face ID but didn't have the time, or enough chips, to do it in this iteration.


There was nothing wrong with how it was done before.


- Thicker top bezel

- Have the camera portion of the lid stick out a bit

- External webcam

- Move the "notch" into a corner where it's less obtrusive

- Put the webcam in the laptop base instead


Thicker top bezel is the only viable one. But just at a glance it looks like the bezel + notch height is about the same height as the current bezels. I don't really see either option as necessarily being better given that.

Having it stick out is a no-go because things sticking out of a laptop are at risk of breaking off.

External webcam is something extra to carry around and mount. That's even worse than a dongle.

I don't agree that a notch in the corner would be less obtrusive. And the lack of symmetry wouldn't do the appearance of it any favors.

Cameras at the bottom have been done before and they suck. They're called nose cams for a reason.


>I'm confused why so many people seem hellbent on convincing me that a notch is a good thing

Most people, I assume, don't care about the notch.

Also, many people use their MacBooks without opening them most of the time.


Can you help me understand that? I have an iPhone with a notch and I find it's totally fine. Day to day I don't notice it; it just kind of blends into the design of the phone. And I assume the situation here will be the same. Most software doesn't use the whole menu bar anyway. It won't be on top of anything except if you use that space to watch full-screen video.

I’m not excited about the price, but I’d totally take tiny bezels and an upgraded webcam in exchange for the notch. This looks like an excellent upgrade.


I up-voted your comment; you ask a reasonable question and I will try to answer it honestly.

It comes down to use case.

There are different levels of design craft. The more you, as a designer, train your mind and eyes, the more you see things that are invisible to regular people.

I can spend days cleaning up the white space between glyphs/kerning in typography, and this notch will distract my eyes constantly. Call it OCD or professional deformation; the notch will drive me crazy. :)

When designing, you want every visual distraction absent from your screen; for example, I use 50% gray for my wallpaper to have a middle-gray reference, etc.

You cannot compare iPhone UX with design-process UX. When you use your phone, you have a legitimate reason for compromise. When you use Pro-labeled hardware, you expect all Pro use cases and Pro UX to be respected.

Don't get me wrong, I like the overall design (even the redesign of the shape of the keys), and there is no doubt that these are top-notch monsters. But for my use case, this is a no-go.


Idea: Write a utility to add a software bezel for folks like this. We can call it distraction-free mode. We'll take the extra space back and replace it with a black bezel. The webcam and front-facing utilities are now within that space. You can toggle it on and off.

> I can spend days in clearing the white space between glyph/kerning of typography and this notch will distract my eyes constantly. Call it OCD or professional deformation, the notch will drive me crazy.:)

As an aside, most designers I know use external monitors. Do you find yourself doing this type of work on your laptop display?


> As an aside, most designers I know use external monitors. Do you find yourself doing this type of work on your laptop display?

Guilty as charged (two Eizos), but I work outside the office regularly, or on the weekends when "inspiration" strikes.

What is the point of a Pro laptop when I cannot perform my work at maximum UX comfort?

Another use case is that I develop my photos on the go, and so on. :)

I am still partially in the Apple ecosystem (mainly due to C1), so by the looks of it I will have to transition to something different.

> Idea: Write a utility to add in a software bezel for folks like this. We can call it distraction free mode. We'll take the extra space back and replace it with a black bezel. The webcam and front facing utilities are now within that space. You can toggle it on and off.

My first reaction was this idea also. I am sure that as we speak someone is firing up Xcode, and it will be available if Apple approves it. :)


The notch is not just a blow to the aesthetics; it is actually annoying when it comes to software with long menu bars. Also, what happens when you install many plugins in the menu? The notch is literally taking up good space that could be used for menu bar apps.


It absolutely should be an option, but it'd also suck to lose that portion of the screen's vertical real estate in its entirety - especially when 16:9 / 16:10 aspect ratios already make said vertical real estate scarce as-is.


The aspect ratio of the new MBP seems to come out to 16:10.4. I assume that the 0.4 is the notch zone, and the part of the screen below it is 16:10. So it's really just a standard 16:10 with some extra pixels, which sounds good to me.


>Idea: Write a utility to add in a software bezel for folks like this.

I'd be surprised if this isn't an OS level option.


Ha! I would bet anything I own that there isn't, and won't ever be one, knowing Apple. That would be admitting there's a problem, and that some users may not like the new innovation allowing for more screen space.


It is an option, at least in full screen mode. https://www.macrumors.com/2021/10/18/macos-hides-notch-on-ne...


That's really when it would matter. macOS has an ever-present menu bar at the top of the screen. A notch would be entirely unintrusive, and practically invisible in Dark Mode.


Knowing apple, there won't be. iPhones don't have this option, but my OnePlus does for the camera hole.


> When you use Pro labeled hardware you expect all Pro use cases and Pro UX to be respected.

This can't possibly be the case, though. There are infinite Pro use cases and trying to respect them all would result in a confused and inferior product.

I'm not saying that your reasons for disliking the notch are wrong, or that they should have the notch. But they're working within certain constraints -- there's just no way to fit a decent webcam into a laptop screen without making space. As someone who's recently been trying to help his fiancee pick out a laptop, where said fiancee's primary criterion is that the laptop webcam shouldn't suck, I empathize with the decision.


I disagree, respectfully.

The main target audience of the MacBook Pro started with designers, writers, and musicians (just looking at the shelf towards my old Titanium PowerBook G4).

There are not many professional use cases that will accept losing screen real estate and gaining visual distraction. Screen real estate is the main differentiator in professional UX.

One of the reasons I mentioned the Framework laptop is that a small company can get super-close to a professional audience by approaching pro solutions with common sense.

And the mighty Apple, with trillions in the bank, cannot come up with something "innovative" and instead chooses the "trade-off" approach?


> The Mighty Apple with trillions in bank cannot come-up with something "innovative" and chooses the "trade off" approach?

Do you think having money means Apple never has to make tradeoffs? There are always tradeoffs. It's just not possible to make a good webcam that exists in the tiny space at the top of an ultra-thin screen with ultra-thin bezels. Most manufacturers just make terrible webcams, including the Framework that you've brought up as a good alternative (which has much thicker bezels besides).

That's one way to go. Another way is to make space for the webcam. Apple can't magically throw money at every conceivable problem until it disappears.

You don't have to buy one of the new MacBooks. I suspect that most designers aren't going to have as much of a problem with it as you're suggesting, though. Outside of full screen most folks won't notice it after a while, and when you go to full screen Apple adds a black bezel anyway.


> It's just not possible to make a good webcam that exists in the tiny space at the top of an ultra-thin screen with ultra-thin bezels.

Maybe they could consider, you know, not dogmatically pursuing ultra-thin bezels for their own sake?

> and when you go to full screen Apple adds a black bezel anyway.

That's good to hear, but that then entails entirely removing a chunk of vertical screen real estate - which is already at a premium (relative to horizontal) with widescreen aspect ratios. Contrast with the Framework (or, similarly, the Pixelbook), which doesn't need to do that and boasts a 3:2 aspect ratio to improve upon that vertical real estate.


A laptop with thick bezels and no notch is the same as a laptop with no bezels, a notch and those pixels disabled in software. It sounds like you want it both ways - you don’t want to lose your vertical real estate (“which is already at a premium”). And you don’t want the notch. In the same paragraph talking about how important vertical real estate is, you hold up other laptops like the framework as ideal even though they lose vertical real estate via the chunky bezel.

I’m confused. Do you want design aesthetics (no notch)? Or do you want more vertical real estate (less bezels + a notch)? Having both would be the best. But given we can’t have that, if you were in charge of the MacBook Pro design, what would you choose?


> you hold up other laptops like the framework as ideal even though they lose vertical real estate via the chunky bezel.

Because they more than make up for it with the 3:2 aspect ratio.

> Do you want design aesthetics (no notch)?

Don't care about aesthetics.

> But given we can’t have that, if you were in charge of the MacBook Pro design, what would you choose?

Bigger top bezel and - ideally - a 3:2 aspect ratio.


I still don't understand. Lets say we have two machines:

- Machine 1 has a large top bezel and no notch

- Machine 2 has a tiny top bezel and a notch, but the pixels to either side of the notch are disabled in software (and never used). The part of the display thats enabled has identical geometry to machine 1.

Aesthetics aside, aren't these machines identical in every way? You have a strong preference for machine 1. Why?


> You have a strong preference for machine 1. Why?

Because Machine 2 offers nothing of value over Machine 1 - only more complexity and more opportunities for hardware and software issues.


> There are not so much professional use cases which will agree on removing screen estate and adding visual distraction. Screen estate is the main difference in professional UX.

You seem to have a misunderstanding of the new screen. Apple is reducing the bezel around the screen to give you more screen space. How do you handle front-facing hardware like a webcam when you use up all the available space? Their solution is to surround that area with a small boundary (the dreaded notch).

You can see that the Framework Laptop you cite has a large bezel around it to provide that top bar. You have less screen area here.

From the screenshots, it looks like programs running in full screen mode are pushed down to leave that top area, to give you your distraction free experience: https://www.apple.com/macbook-pro-14-and-16/


Yes, I clearly see in the first frame a wallpaper with a black background that hides the notch. :) So they give me real estate by removing the bezel, then take away vertical display space to hide the notch. Nice solution, but as I said, not for my use case.


Why does it not work for your use case? When the areas either side of the notch are unused (as in full screen mode) you just have a regular 16:10 screen with a top bezel. If previous MacBook screens worked for you then this one will too.


The stated resolution of the screen is 3024x1964. If you do the math, this basically amounts to a 3024x1890 16:10 display, with a 3024x74 extra display on top that is interrupted in the middle. Considering that the menu bar usually has empty space in the middle, isn't it a strict screen real estate gain to move the menu to this extra display?

I mean, what would you have preferred, exactly? If they had stuck to a 3024x1890 display, no one would have had the slightest complaint, but it would be unequivocally less than what we're getting. Again: the notch is located outside of the same 16:10 screen every other MBP has.
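
A quick sanity check of that math, as a throwaway Swift snippet (the only inputs are the numbers from the spec sheet):

    // 3024x1964 panel = a 16:10 region plus a 74 px strip for the menu bar
    let panelWidth = 3024
    let panelHeight = 1964
    let sixteenTenHeight = panelWidth * 10 / 16     // 1890
    let menuStrip = panelHeight - sixteenTenHeight  // 74
    print(sixteenTenHeight, menuStrip)              // prints "1890 74"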


If it's about deliberately training your mind and eyes in order to use tools efficiently as a professional designer, then surely one could apply that training to focus on relevant parts of the screen and not the notch. Sure, if all else is equal between a laptop with a notch and one without, choose the one without the notch, but it's odd for the presence of a notch to be a dealbreaker for a computer that one otherwise would have chosen.


> I have an iPhone with a notch and I find it’s totally fine.

I have a Motorola with a notch and I find it mildly noticeable and annoying. It's tolerable because it's a phone and I don't necessarily care about screen real estate maximization on a media consumption device (and because this phone is a temporary daily driver while I wait for my Astro Slide to finally ship), so I'm slightly more forgiving of e.g. the constraints on visible notification icons or the fact that Android's network traffic indicator is dead-smack in the middle and therefore "under" the notch (i.e. entirely useless unless the phone's in landscape mode).

For a laptop, it's different; judging by my workflow at my previous job (wherein I used a Mac), that'll almost certainly do funky things with either long lists of menu bar options or long lists of status bar widgets, and will almost certainly do funky things with full-screen apps that have buttons at the top of the screen. It's telling that all of Apple's full-screen productivity app screenshots on that page are on the 16-inch, which doesn't have the notch (EDIT: or maybe it does; hard to tell from the renderings); all the renderings of the 14-inch either have stuff like Zoom or show the apps as windowed. The one rendering of the 14-inch running Photoshop demonstrates pretty plainly that PS' menu bar items are right up to the notch; if a program uses more than that, what's the behavior? All items get squished? Items at the end go to the other side of the notch? Items get hidden entirely "under" the notch and become inaccessible?

GP's right, on all counts. The fact that said GP got downvoted into oblivion for daring to express a totally reasonable and legitimate opinion is disappointing, to say the least. And meanwhile, I've been overwhelmingly happy with my Framework, which doesn't resort to such gimmicks like notches and even boasts a superior 3:2 aspect ratio for yet more effective screen real estate. The M1 in these Macbooks is interesting and tempting, but a Mac Mini would readily scratch that itch just fine at a fraction of the price - especially if I'd have to use an external monitor anyway to prevent any interference with applications by that notch.


> Most software doesn’t use the whole menu bar anyway.

No, but all those little things you've installed that put up a menu bar item on the right add up - I'm sitting at a cafe on my laptop right now, and the menu widgets come to pretty much the center. If I unhide the less-frequently-used ones that I hide with Bartender then they're covering 5/6 of the width of the screen.

I am an artist and spend a lot of time with Illustrator in fullscreen mode, with no visible menus. It'll be pretty annoying to have this thing jamming into the middle of my work area. Especially given that I think I could probably count on one hand the number of times I've used the cameras hiding in the frame of my previous Macs. Anyone playing games will be annoyed by it too.

Hopefully they will not make this same move in the next round of Airs, as that's what my next computer's probably going to be.


> spend a lot of time with Illustrator in fullscreen mode, with no visible menus

My guess is that for these situations and for full screen games, you’ll see the screen size reduced by the notch height. And if not, I’m sure there will be an extension that makes the menubar a black rectangle.

Not optimal, but there are ways around the notch, if it bothers you too much.


Yep:

They hinted at this with some of the screen shots they showed:

https://imgur.com/a/McNBTHY

I don't know enough about macOS GUI development, but I'm guessing that if an app requests a full-screen display, it gets told the resolution is the panel resolution minus the top strip, and that strip is just blacked out.
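
For what it's worth, macOS 12 does expose this to apps: NSScreen gained a safeAreaInsets property. A minimal sketch of the "resolution minus the top" guess (just a sketch; how individual apps actually handle it is up to them):

    import AppKit

    // Minimal sketch (macOS 12+): safeAreaInsets.top is nonzero on a notched
    // panel, so an app can lay its content out below the camera housing.
    if let screen = NSScreen.main {
        let full = screen.frame
        let topInset = screen.safeAreaInsets.top   // 0 on notch-less displays
        let usable = NSRect(x: full.minX, y: full.minY,
                            width: full.width, height: full.height - topInset)
        print("Full:", full, "usable below the notch:", usable)
    }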


I guess sometime next year I'll get to find out if Illustrator actually requests Full Screen Display or if it just throws up a window with no chrome that covers the menu bar!

I am pretty sure it does the latter.


Sounds easy enough to fix in software if it's a problem.

Mind you, if it is a problem, who knows how long it'll take Adobe to actually fix it.


What happens when you try to move the pointer ‘into’ the notch? I can think of several ways to handle this, none of them good.


Or if your menu bar has so many items it can't all fit to the left of the notch? Would probably have to straddle the notch, but would still look pretty weird.


There’s a screenshot that shows this happening


And what happens in split screen mode (hidden menu)?


One of the things I dread is using two program windows side-by-side when doing code reviews.

Most of the arguments are about full screen hiding the notch with a black title bar. What if you don't use full screen all the time? The notch would be right where two windows overlap, unless you manually resize both windows every time. Every day, for 8+ hours, every time you open a code editor.

Extra screen real estate is great, but I am sure they didn't have to go all the way to the top. The point is getting moot because now the chins are unusable space all the time as well.


Well, if you are also exposed to a notch-less experience elsewhere, i.e. at work, it can become quite annoying. If that's your whole world, I can imagine you just learn to ignore it.


I don't like the notch either, but you're comparing this to the Framework laptop? This is literally a monster in performance compared to some of the most high-end laptops on the market.


Yeah the prices are up but you're getting so much more computer for your money. Just the new screen alone is worth the extra.


Isn’t the notch strictly an improvement? The alternative is to lose that top area to a bezel. This feels like extra screen for free.


I agree with you on paper but I don't like how the notch just looks out of place. I don't see any way around it here, though; to get decent laptop cameras something has to give.


It probably can be fixed in software by filling the areas on the sides of the notch with black and moving the menu bar below the notch. Since this is a mini-LED display, the black should be pitch-black.
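
Something like this hypothetical sketch, perhaps (the function name is made up, and whether the window server actually lets an app paint over the menu-bar strip is untested):

    import AppKit

    // Hypothetical "software bezel": a borderless, click-through black window
    // covering the strip that flanks the notch. Illustrative only.
    func makeBezelWindow(for screen: NSScreen) -> NSWindow {
        let stripHeight = screen.safeAreaInsets.top  // height of the notch area
        let strip = NSRect(x: screen.frame.minX,
                           y: screen.frame.maxY - stripHeight,
                           width: screen.frame.width,
                           height: stripHeight)
        let window = NSWindow(contentRect: strip, styleMask: .borderless,
                              backing: .buffered, defer: false)
        window.backgroundColor = .black    // pitch-black on the mini-LED panel
        window.level = .statusBar          // assumption: above normal windows
        window.ignoresMouseEvents = true   // let clicks pass through
        return window
    }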


I think it depends where the dimming zones are. If they're behind the relocated menu then you might see blooming above it.


It is and it isn't. You are getting extra space for free, but some people like me have a very full menu bar and it's going to get very awkward with the notch.


Bartender is a game changer. One of the first things I install on a new Mac.


Very cool. Are there any other such tools you would recommend as a must-install?


It's extra screen, but with limited utility due to that notch being there. Having that extra screen without the notch would've been even better.


I hope they'll make a Mini soon


When are the 27" iMacs coming?


We've passed 1337 comments


The insides look great, the connectivity seems decent, but notch and chin, really? Like cheap Android phones? :-))


The article reads like an ad.


Apple is becoming more and more like other brands. It's losing its "think different" mojo.


No Touch Bar; I'm sold.


What happens if you plug in an external display? With the non-standard 3456-by-2234 ratio (neither 16:9 nor 16:10), will there be black bars on the sides when you mirror screens? If you don't mirror, macOS still does not rescale DPIs properly (unless it's Apple's own $4,999 Retina monitor).
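
The aspect-ratio arithmetic, at least, is easy to check; this says nothing about what macOS actually does when mirroring, just the geometry of fitting a ~1.55:1 frame onto a 16:9 panel:

    // Fitting the 3456x2234 (~1.55:1) frame inside a 1920x1080 (16:9) display
    let src = (w: 3456.0, h: 2234.0)
    let dst = (w: 1920.0, h: 1080.0)
    let scale = min(dst.w / src.w, dst.h / src.h)  // scale to fit entirely
    let fitted = (w: src.w * scale, h: src.h * scale)
    let sideBar = (dst.w - fitted.w) / 2           // ~125 px of pillarbox per side
    print(fitted, sideBar)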


Apple has done it again.


> Designed with the earth in mind.

Lol


Finally a decent camera


No FaceID. Interesting.


Yes that is interesting. Maybe it would have made the notch bigger? :)


The iris is superior to other biometric modalities in so many ways that they appear to be almost toys in comparison.


FaceID isn't iris-based.


Yes and I assume that's why it didn't survive.


It's in the latest iPhones.

It was never in the Macbooks.


Huge vote of no confidence for the USB-C-only future, and I'm not sad about it.


Big sur = Big brother


This is everything I wanted. But the notch... I can't really accept that.


They increased display area with that notch and it only covers the menu bar.


I agree that in normal use it's only covering the menu bar and probably won't be much of a deal. What happens if an app has a lot of menu options? What will that do to the bar?

What about gaming? is this something the developers will have to work around or will the game only be below the notch?


I think on the iPhone the notch is more of a deal because you use it in a different orientation, and beside the notch you couldn't squeeze in much content, like the battery percentage.

On macOS I think it's not a big deal for me, considering I almost never maximize the screen, and the extra 0.2" of screen real estate from the smaller frame is nice to have.


Dancers? How much of the $1999 goes into advertising?


Ok. Now make it repairable.


it has a notch


> HDMI, SD Card, and MagSafe. Things people on the internet inclusive but not limited to HN said they will never come back because the future is USB-C.

Gonna be downvoted for the snark but... I'd like to hear now how the "a usb-c dongle fixes everything and it's perfect" Apple crowd will backpedal on this.


I’m that crowd and while I think MagSafe is a good addition back to the lineup (single-purpose and the other end is still USB-C) I just don’t see the point of adding back HDMI and SD cards. Now we get to go back to the era of jeez I sure hope there’s a dongle somewhere instead of moving to a future of just one cable.

Maybe we should keep Lightning too instead of USB-C for the iPhone.


I think everyone thought that was just over the horizon for a long time, and it never quite got there. My monitor is my dongle for everything, but I'm lucky in that I didn't have a monitor before the USB-C everything era.


Don't get me wrong, I'm in that camp myself. I have a Dell XPS with two USB-C ports, my LG monitor is my dongle when at home, and I love it. And if I were the traveling-consultant type, I would just use a dongle until everything is USB-C. I was also totally on Apple's side when they removed the DVD drive while optical disks were still widely used; it was the right tradeoff for slimmer laptops.

I was just being snarky at the Apple fanbase that cheers every Apple decision, no matter what. Especially when Apple itself backpedals on something.


The greatest improvement: no touchbar.


Please don't post duplicate comments to HN. It makes merging threads a pain.


It's the "Fuck you Jony" edition. Ports (including HDMI!), MagSafe, and no touchbar. Would be a fine laptop if it didn't have government spyware baked in that I can't turn off even if I power it down.


I wish we could get a MacBook Air in these screen sizes. I don't need the extra power, just the bigger screen.


You can order a MacBook Pro 14" with the M1 (not M1 Pro) for $2000.

EDIT: My mistake, it's an 8-core M1 Pro with 14-core GPU (so neither the 8-core M1 with 8-core GPU, nor the 10-core M1 Pro with 16-core GPU) at that price.


To be specific, a 14" or 16" MacBook Air closer to the $999 of the 13".

I guess that would cannibalise sales of the MacBook Pro, but the M1 MacBook Air is totally sufficient for regular browsing and office work.


Wait, the base price of the MacBook Pro isn't for the M1 Pro? That is confusing as hell.

EDIT: The base $2000 new MacBook Pro does come with an M1 Pro on the US Apple Store....


Theory: the notch is Apple's way to keep people from taping over the webcam, at least by a few percentage points. Who benefits from less tape, at population scale, on a little-used but programmatically controllable camera?


Still no removable storage? How are you supposed to get your data off if it dies?


You're kinda supposed to be doing that periodically BEFORE it dies.


Emphasis on periodically? You'll always lose the data between now and the last daily or hourly backup. Not great.


Or I could just remove the SSD and not have that tradeoff.


So my options are either a constant network connection or always keeping an external drive plugged in assuming I want to maintain 0 loss. Great! That's totally easier than just being able to remove the SSD.


What do you do if your laptop gets wet or lost? Or if someone runs it over? There’s not a ton of instances where the system won’t boot but somehow the SSD has survived.

Besides, the recent storage Apple has used is all encrypted with the keys stored in the Secure Enclave. Data cannot be recovered if you don’t have the board (even on the Mac Pro, IIRC).
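
For illustration, the scheme is conceptually key wrapping: the SSD contents are encrypted with a volume key, and that key is itself stored only encrypted under a hardware key that never leaves the enclave. A minimal sketch of the idea (NOT Apple's actual implementation; names and the use of Python's cryptography package are my own stand-ins):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Stand-ins: the real hardware key is fused into the SoC and is
    # never readable; this only illustrates the general concept.
    hardware_key = AESGCM.generate_key(bit_length=256)
    volume_key = AESGCM.generate_key(bit_length=256)   # encrypts the SSD

    # Only the *wrapped* (encrypted) volume key ever touches storage.
    nonce = os.urandom(12)
    wrapped_key = AESGCM(hardware_key).encrypt(nonce, volume_key, None)

    # Without the board (i.e. the hardware key), the wrapped key and
    # the ciphertext on the flash chips are useless.
    assert AESGCM(hardware_key).decrypt(nonce, wrapped_key, None) == volume_key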


Of the 2 laptop failures I've experienced, neither would boot, but the data was recoverable because the drive didn't fail. Only in a vanishingly small percentage of cases is the computer so destroyed that the SSD is also destroyed. The Secure Enclave is just another failing on Apple's part: if I'm encrypting my data, it should be with keys I control. Apple should have a backup plan for users. Not doing so is negligent.


you should be using the cloud for everything, obviously /s

More seriously, I had not considered the secure enclave angle, which is worrisome.


Furthermore, why bother encrypting anything if you then upload everything to the non-E2EE cloud?


Well, there is an SD slot, so if you're really concerned about being able to back up your work when you're not network-connected, you can periodically copy to that. If your laptop dies, there's no guarantee the SSD isn't itself the cause of the failure, and it can fail from the same external event, e.g. spilling something.
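
If you wanted to automate that, a minimal sketch (the source folder and SD mount point are assumptions; adjust for your setup):

    import shutil
    from pathlib import Path

    src = Path.home() / "work"             # assumed source folder
    dst = Path("/Volumes/SD_BACKUP/work")  # assumed SD card mount point

    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        # Copy only files that are new or modified since the last run.
        if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)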


Have backups, like you're supposed to? Removable storage or not, you should have backups of the device.


[flagged]


Don’t fish for karma by making the exact same comment on four threads.


When I saw the notch I immediately thought of Face ID. But comparing how frequently we unlock our Macs vs. our phones, it makes sense not to include it.


Windows Hello is widely praised as an excellent feature, I don't see why FaceID on Mac would somehow be different.


They probably want developers to optimize around the notch shape. It would be dumb to change the shape next year when they add Face ID.


It could be "Face ID ready" without them telling us, and they could share that it will be coming with macOS 12 on M1 Pro+.


My thought as well. That's a software feature as long as the requisite hardware is there. There's a track record of hardware sitting dormant on some Apple devices, too.


Touch ID is so fast, I don't mind.


edit: this was wrong

16GB max RAM on the 14 inch model. No thanks.


This is not correct.

32GB on the M1 Pro and 64GB on the M1 Max in the 14".


It can be configured up to 64GB with the M1 Max.


You can configure to 32GB for an extra $400.


Not a detachable or a 2-in-1? Pass


Unfortunately still no bitcoin payment option.


So we're back to 4 MacBook laptops again, with 3 of them being called 'Pro'.

Hmm


How do you count 4? The old 13" and 16" will be phased out and the 14"/16" will remain. Then there's the MBA.

So 3 models, with 2 of them being 'Pro'.


The regular M1 13" Pro is still being offered; I would probably still count that as the 4th model.


13" M1 Pro is staying


Disappointed they're not releasing a Max Pro. Guess I'm waiting for the next cycle.


I believe they were uncharacteristically open about the schedule from the beginning of the CPU transition, with the Mac Pro coming in 2022.


iPhone 13 chips are on back order. I would love a Max Pro, but I think it is going to take some time to reach market.


Mac Pro M1 Max Pro™


MBPs are irrelevant in the age of Surface Book and Framework


Nice hardware. It's a pity that I don't trust them with my data any more. What will they scan and report?


There's no greater example of the decline of our society than the contrast between HN's focus and attention on this product announcement versus on the NSA leaks.


That notch thing will put me off! It's an ugly, impractical thing that causes extra effort and potential trouble unnecessarily. I avoid iPhones because of it; I will do the same with laptops.

And that MagSafe -> no MagSafe -> MagSafe again disarray! Come on, make up your minds. I held off buying a new computer for a looong time because I did not want to leave MagSafe behind; mine recently died, so I was forced into USB-C charging, which I did not want at all. OK, I reshaped my habits, and now this?! A brand-new, incompatible MagSafe again?! I don't want to switch yet again; MagSafe 1 to 2 was a pain on its own, then USB-C was a torture, and I will not do that again! OK, it presumably charges through the Thunderbolt 4 ports with USB-C too (theoretically; one never knows, from what I've heard), but you trumpet your new MagSafe proudly and loudly after killing it!? Jesus!

How come Apple can ruin and confuse things so effortlessly? Do they have mandatory training in that or what?! Big kudos for removing the Touch Bar and having (having?) a proper keyboard again after the pathetic series of stumbles from trying to reinvent the wheel and messing up the lives of millions (the M1 Air is still flawed; it repeats keys frequently). Hallelujah for having HDMI again and finally paying attention to practicalities in a consumer product, not just appearances and show-off. But come on! Now this idiotic notch thing and the zillionth way of charging again? Very off-putting, again, continuously now. Actually not interested in buying. (Luckily I do not need a power plant, only a decent performer, so I can live with my M1 Air for a long, long time, assuming the keyboard holds up. When my wife's old Air dies, we will stick to that or turn back to some serious manufacturer making laptops for use, not for show.)


No matter what changes Apple implements (even if it's reverting to things people wanted back), people like you will always find a way to complain. Having used an iPhone for a while, the notch really isn't noticeable, and I'd take it over a thick black border at the top any day.


Open your eyes: people find problems with stupid, impractical ideas, not the good ones. Don't you read the internet?! At least read my post then, please: welcome changes are mixed with anger at repeated stupidity, avoidable inconsistency, and questionable design-for-design's-sake elements that upset practical use on a product made for serious work. Why don't you read the whole post, not just the part you don't like?! : /

Let me quote myself: "kudos for removing the Touch Bar", "having (having?) a proper keyboard", "hallelujah for having HDMI again". How can you reliably say people always find fault with everything? Only find fault when there are faults, for f's sake!

I use an iPhone too, and the notch is not noticeable because it is not there! Same for my MacBooks, all of them!!


You can blame Apple for having made dumb decisions in the past, but you can't blame them for fixing them now. Also, what additional efforts and problems with the notch are you talking about?


They had one single, lone little sign of doing something right; don't make me quote Mr. Wolf from Pulp Fiction here, please! : /

Long, long experience shows that Apple doesn't care about practicality as much as appearances! One swallow makes no summer, as they say. I don't blame them for the step in the right direction (please actually read my lines!); I blame them for the mindless chaos they caused, and I'm saying I will not be shuffled around by them again. I've been burned; they made sure of that.

Btw, you know the space beside the notch doesn't display itself; someone has to take care of it! That is an avoidable effort, as ALL the other solutions show. Also read the posts about the notch on the phones; this is not a new thing, and now it has escalated unnecessarily. We will see how this affects existing software with long menus, lots of menu bar icons, resolution changes, screenshots. It has to be accounted for when it is there, compared to when it isn't.

Too little too late!


You know it still supports USB-C charging, right?

Rather than confusion, I would see it as Apple finally turning things around and getting back on a road that is less "design above all else".


How could you possibly know that, especially after a decade-long fallacy of design over practicality, and with this notch thing?! You have no foundation for the claim that this will happen based on just one lone step back in the right direction.

The best we can say is that after intentionally ruining practicality for the elusive gods of design, there is a little light of hope; that's it!

(btw: "it presumably charges through Thunderbolt 4 ports too")



