
Where is the graphics test? The M1 Max versus an NVIDIA 3080 or similar?


I'm waiting to be corrected by someone who knows GPU architecture better than me, but as far as I can tell the synthetic benchmarks can trade blows with a 3070 or 3080 (mobile); the actual gaming performance isn't going to be as rosy.

Also, recall that very few games needing that performance actually run on macOS.


"Also recall that very few games needing that performance actually work on MacOS"

But many Windows games do run under CrossOver (a commercial wrapper around WINE; well worth the measly licensing fee to me for the seamless ease of use) or the Windows 10 ARM beta in Parallels. I got so many games to run on my M1 MacBook Air that I ended up returning it to wait for the next round that could take more RAM. I'm very, very happy I waited for these, and I fully expect they will replace my Windows gaming machine too.


Well, the Apple G13 series are excellent rasterizers. I'd expect them to do very well in games, especially with that massive bandwidth and those humongous caches. The problem is that not many games run on macOS. But if you are only interested in games with solid Mac support, they will perform very well (especially with a native client like Baldur's Gate 3).


"very well" 45 fps top at 1080p medium settings https://www.youtube.com/watch?v=hROxRQvO-gQ

Take a 3-year-old card and you get 2x the fps.


Which other passively cooled laptop can do that? And what 3-year-old card are you comparing it to? Hopefully something with 20 W or lower power consumption.

45 fps at medium Full HD is not far off a 1650 Max-Q.


Apple compared themselves to a 3080m; the perf from an M1 is not even close to a 3-year-old card. I don't care if it takes 10 W if I can't even play recent-ish games at 60 fps.


You may have confused last year's M1 (the one in the video, available passively cooled in the MacBook Air) with the new M1 Pro and M1 Max (the ones being compared to the more powerful counterparts).


That's the M1. Yes, those numbers are superb for something that's competing with integrated graphics.

M1 Pro and M1 Max will be legit gaming GPUs.


What is the most intensive game you could actually run?

I was going to say Fortnite, but I'm guessing that's not the case anymore.


Baldur's Gate 3, Metro Last Light, Total War…


It's really hard to compare Apple and Nvidia, but a bit easier to compare Apple to AMD. My best guess is performance will be similar to a 6700xt. Of course, none of this really matters for gaming if studios don't support the Mac.


The mobile RTX 3080 limited to 105W is comparable to about an RX 6700M, which is well behind the desktop RX 6700XT.


The gaming performance will be CPU-bottlenecked. Without proper Wine/DXVK support, they have to settle for interpreted HLE or dynamic recompilation, neither of which is very feasible on modern CPUs, much less ARM chips.
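To illustrate the interpretation half of that: a toy Swift sketch (purely illustrative, nothing like how a real emulator is structured) of where the overhead comes from. Every guest instruction pays fetch-and-dispatch costs on the host before doing any real work; dynamic recompilation amortizes that by translating hot blocks once.

    // Toy guest ISA with two instructions.
    enum GuestOp {
        case addImm(reg: Int, value: Int64)
        case halt
    }

    func interpret(_ program: [GuestOp]) -> [Int64] {
        var regs = [Int64](repeating: 0, count: 16)
        var pc = 0
        loop: while pc < program.count {
            switch program[pc] {          // dispatch overhead per instruction
            case .addImm(let reg, let value):
                regs[reg] &+= value       // the "real" work is a single add
            case .halt:
                break loop
            }
            pc += 1
        }
        return regs
    }

    // Two guest adds cost dozens of host instructions each.
    print(interpret([.addImm(reg: 0, value: 2), .addImm(reg: 0, value: 3), .halt])[0])  // 5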


From all reports, Rosetta 2 performs remarkably well. Apparently they added special hardware support for some x86 features (notably x86's total-store-ordering memory model) to improve the performance of dynamically recompiled code.


Does anyone know how much VRAM the M1X has? Because I bet it's far less than a 3070 or 3080.


There's no VRAM; it's unified/shared memory. And there's no M1X: there's M1 Pro and M1 Max.


I think we can safely shorten them to M1P and M1X.


Can't allow that. Either M1P and M1M or M1O and M1X.


M1 Pro & Max (and plain M1 too for what it's worth) have unified memory across both CPU and GPU. So depending on the model it'd be up to 32gb or 64gb (not accounting for the amount being used by the CPU). Put differently - far more than 3070 and 3080.


The memory is unified, and very high bandwidth. No idea what that means in practice, guess we'll find out.


It's very high bandwidth for a CPU, but not that great for a GPU (400 GB/s vs. 448 GB/s in the 3070 and 936 GB/s in the 3090).


It's not quite apples to apples. The 3070 only has 8 GB of memory available, whereas the M1 Max has up to 64 GB. It's also unified memory in the M1, so there's no copy required between CPU & GPU. Some stuff will be better for the M1 Max and some stuff will be worse.


>> ...not that great for a GPU...

* almost equals the highest laptop GPU available


But it's also a premium product, so matching a 3070m isn't really above what you'd expect for the cost (efficiency is another story, though).


On the other hand, it's zero-copy between CPU and GPU.
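Concretely, here's a minimal Swift/Metal sketch of what the zero-copy path looks like (illustrative only; the buffer size is arbitrary and none of this is benchmarked):

    import Metal

    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("No Metal device found")
    }

    // One allocation, visible to both CPU and GPU.
    let count = 1024
    guard let buffer = device.makeBuffer(
        length: count * MemoryLayout<Float>.stride,
        options: .storageModeShared
    ) else {
        fatalError("Buffer allocation failed")
    }

    // The CPU writes straight into the memory the GPU will read:
    // no staging buffer, no blit.
    let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
    for i in 0..<count {
        values[i] = Float(i)
    }

    // On a discrete GPU you'd typically allocate .storageModePrivate
    // VRAM and enqueue a blit to copy the data across the bus;
    // on Apple silicon the GPU can consume `buffer` as-is.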


Of course, with a 3070 and up you have to play with headphones so you don't hear the fan noise.

This is the best feature of the new Apple CPUs if you ask me: silence.

Now to wait for a decent desktop...


The memory is unified, so whatever RAM is on there (16, 32, or 64 GB) can be allocated as VRAM.

That's why during the presentation they bragged about how certain demanding 3D scenes can now be rendered on a notebook.


I still didn't get their example about the 100 GB spaceship model: the max RAM supported is 64 GB...


Up to 64 GB…


It's probably bad; the M1 could not get 60 fps in WoW. So when I see an Apple comparison, I take it with a grain of salt, because the M1 is not able to run any modern game at a decent fps.


My M1 cannot show https://www.apple.com/macbook-pro-14-and-16/ without stuttering (esp. the second animation of the laptop opening and turning).

In both Safari and Chrome.


That's because it's actually a series of JPEGs rather than a video(!!). The same happens on my Intel Mac.


Are modern games built with Metal? Pretty sure Apple deprecated OpenGL support. Macs have never been gaming computers.

The GPUs in the M1 family of Macs are for “professional” users doing video editing, content creation, 3D editing, photo editing, and audio processing.


MoltenVK is Vulkan's official translation layer to Metal, and doesn't have too much overhead. Combine it with DXVK or vkd3d to translate from DirectX; DirectX before 12 is generally faster with DXVK than with Windows' native support.


"the M1 is not able to run any modern game at decent fps."

Do you have first hand experience with this? I do . We play WoW on MacBook air M1 and it runs fantastic . Better than my intel MacBook Pro from 2019


"Running fantastic" is what Apple would advertise, but what matters is fps, utilisation and thermals when benchmarking games


Define "fantastic," because a 1080 Ti from 4 years ago runs faster than the M1. My 2070 could run WoW at 144 fps, and it's a 2.5-year-old card.

Yet most people can't get 60 fps: https://www.youtube.com/watch?v=nhcJzKCcpMQ

Edit: thanks for the dates update


Comparing the M1 to a 1080 Ti is ridiculous. The 1080 Ti draws 250+ watts; the M1 draws 10 W in the MacBook Air.

In the current market you can buy a MacBook Air (an entire laptop) for less than just a midrange GPU.


Well, Apple compared themselves to a 3080m, which is faster than a 1080 Ti.


Apple compared the M1 Max to a 3080m. 4x the GPU cores and up to 8x the memory make a difference, and it wouldn't be at all surprising to see that their numbers are accurate.


lol, got ‘em


No one has explained what you got wrong, so in case anyone reading this is still confused: Apple compared an M1 Max to a 3080m, and an M1 Max's GPU is ~4x as fast as an M1's.


The 1080 Ti was made available in March 2017, so it's 4.5 years old. Not 6.


That's not great, especially because I believe WoW runs natively on the M1 and uses the Metal API.

My follow-up would be: what settings were you playing at?


WoW is not a graphically demanding game, even on the highest settings.


In the keynote Apple said the M1 Max should be comparable to the performance of an RTX 3080 Laptop (the footnote on the graph specified the comparison was against an MSI GE76 Raider 11UH-053), which is still quite a bit below the desktop 3080.


No way it can get anywhere near a 3080.


Seems kind of unfair, with the NVIDIA using up to 320 W of power and having nearly twice the memory bandwidth. But if it runs even half as well as a 3080, that would represent amazing performance per watt.


Apple invited the comparison during their keynote. ;)


I believe they compared it to a ~100 W mobile RTX 3080, not a desktop one. And the mobile part can go up to ~160 W in gaming laptops like the Legion 7 that have better cooling than the MSI one they compared against.

They have a huge advantage in performance per watt, but not in raw performance. And I wonder how much of that advantage is architecture vs. manufacturing process node.


I am very confused by these claims about the M1's GPU performance. I build a WebXR app at work that runs at 120 Hz on the Quest 2, 90 Hz on my Pixel 5, and 90 Hz on my Windows 10 desktop with an RTX 2080 driving a Samsung Odyssey+ and a 4K display at the same time. And those are just the native refresh rates; you can't run any faster with the way VR rendering is done in the browser. But on my M1 Mac Mini, I get 20 Hz on a single 4K screen.

My app doesn't do a lot. It displays high-resolution photospheres, does some teleconferencing, and renders spatialized audio. And like I said, it screams on Snapdragon 865-class hardware.


What sort of WebXR app? Game or productivity app?


Productivity. It's a social VR experience for teaching foreign languages. It's part of our existing class structure, so there isn't really much to do if you aren't scheduled to meet with a teacher.


The MSI laptop in question lets the GPU use up to 165 W. See, e.g., AnandTech's review of that MSI laptop, which measured 290 W at the wall while gaming: https://www.anandtech.com/show/16928/the-msi-ge76-raider-rev... (IIRC, it originally shipped with a 155 W limit for the GPU, but that got bumped up by a firmware update.)


The performance right now is interesting, but the performance trajectory as they evolve their GPUs over the coming generations will be even more interesting to follow.

Who knows, maybe they'll evolve solutions that will challenge desktop GPUs, as they have done with the CPUs.


A "100W mobile RTX 3080" is basically not using the GPU at all. At that power draw, you can't do anything meaningful. So I guess the takeaway is "if you starve a dedicated GPU, then the M1 Max gets within 90%!"



