“If only they didn't put the power button on the bottom.”
While I think Apple was off their rocker on this particular decision, I do respect the org structure that allows this type of decision to occur. Believe me, there are companies where a dozen people or more would weigh in and prevent an unpopular choice. Consensus sometimes prevents a desired outcome (for better or worse).
It's a way of signaling how the product should be used. Plug it in, hit the power button, put it down, and never turn it off again. For many users that's probably the only time they will ever interact with that button (or want to).
I actually think it's a really good choice and shows Apple really understands design. And with the relatively low power consumption it makes sense. It's not like it's drawing a ton of power on idle.
I have a Mac Mini and can't remember the last time I had to manually press the button. IIRC it even reboots on its own after a power outage.
I think I shut it down once for an extended vacation just to make sure appliances weren't on while I was gone and when I switched apartments. Otherwise I'd check and post my uptime from the command line.
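For anyone who wants to check the same thing, the usual BSD tools ship with macOS (a trivial sketch):

    # how long since the last boot
    uptime
    # or the raw boot timestamp
    sysctl kern.boottime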
It's a launch M1 mini, so I'd wager I've pressed the power button fewer times than I have fingers on one hand.
Apple Silicon devices turn on automatically from IO, even after being shut down, so the power button is only useful to force a shutdown if the machine is unresponsive, or to execute some boot key combo to enter a recovery mode.
And if you use Bluetooth IO (non-Apple). I do on my Mac Mini M2, and yet I have maybe pressed that button 3 times in the year I've been using it as my main machine, as I never power it off.
I'd never thought about it before reading this comment but I now realise I don't even know where the power button on my Mac Studio is. I used it once when I first set it up and haven't touched it since.
> I actually think it's a really good choice and shows Apple really understands design. And with the relatively low power consumption it makes sense. It's not like it's drawing a ton of power on idle
I use a Mac Mini (older model) in my music studio. It shares a surge protector with approx. $12k worth of audio gear (some of it nearly impossible to replace). I have all the gear + the surge protector switched off anytime I'm not using it. Which is most of the time.
While the weight and form factor would make powering the M4 Mini on a little more than a nuisance, I have a hard time lumping this into one of Apple's great design features.
M1 and newer Mac Minis automatically power on when plugged in/given power. If you're using an external power switch then that basically becomes the power button.
I'd still have liked the button on the side or somewhere similar, looks notwithstanding, but it does seem like a pretty reasonable choice overall.
This is a setting in the control center; I'm not sure what the default is, though. You can definitely make it auto-boot when an external power switch is used, through that setting.
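If you'd rather not dig through the GUI, the same toggle is exposed via pmset on the command line (a sketch; the relevant setting is called "autorestart"):

    # show current power management settings, including autorestart
    pmset -g
    # restart automatically after a power failure, on all power sources
    sudo pmset -a autorestart 1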
Even if it is rarely used there is no benefit of making it hard to access. There is no harm in having an easy-to-access button that is rarely used.
I guess someone thinks the aesthetics are worth it, but even if the power button did notably harm aesthetics (which I doubt) I would take functionality over aesthetics any day.
If there were two models with different power button placements which one do you think people would buy?
Apple could have found a way to put the button somewhere else and make it nearly invisible, but that's expensive and the Mac mini is clearly designed with cost in mind.
If you want cheap and functional, you're in luck because that's pretty much all anybody makes.
Apple makes it difficult to access because they want to make sure you don't use it often, as they believe the experience of waking up the computer from sleep is better than starting it up.
It's a conscious decision based not on design, but on UX, as with the Magic Mouse USB port.
They were successful in annoying every customer of this product and being the laughing stock of even their most die hard supporters.
Even across all the very Apple-oriented publications, almost no one recommends this mouse (even though the touch surface can be useful).
I have the Magic Mouse, and I had the previous version that just had battery swap. The experience on the newer one is much worse: previously you just had to spend 30 sec on a battery swap and you were on your way. Now you need to wait at least 5 min, and you'd better not forget to put it on charge before leaving the computer, otherwise tomorrow the same problem awaits you.
And this is compounded by the fact that it has terrible battery life to begin with, especially considering the extremely mediocre sensor they put in it. Logitech has mice with much better sensors that last much longer on battery, and they don't even have the charge problem.
If anything, the last generation of Magic Mouse is a testament to Apple's utter disdain for its customers and the general lack of care they have around user experience today.
They have the best chips around but it can't be just that.
Zero power draw is still less than a little power draw. A couple million of these babies running on idle is a considerable amount of power. Please, turn off devices when you're not using them.
Any modern computer system uses a lot of power for a few minutes after bootup. If you use the machine a few times per day you're wasting energy (and your own time) by turning it off instead of using sleep mode.
Completely agree! We just 3D printed a base switch that makes it easy to turn your Mini on and off. Here's a link: https://m4button.com/ (if you have a 3D printer, you don't need this).
> It's not like it's drawing a ton of power on idle
Probably even drawing less than what a "normal" PC PSU would just burn off as heat in losses, lol. 3 watts of total idle power consumption - it's nuts how low that is...
Your average PC PSU tops out around 95% efficiency, so even at maximum efficiency, a 600 W unit at full load would burn something like 30 watts in losses (5% of 600 W).
The quoted efficiency on most PSUs is measured around half load (more or less). Also, total system draw figures do not include the power supply - it will have its own losses, especially on the low end, though still likely in the 80-percent range.
This is actually how I've used my M1 Macbook Pro since I got it. I never fully turn it off. It's either sleeping while plugged into my Thunderbolt 3 dock, or it's sleeping on my dining room table on battery power. The efficiency is so good it never dies even if I don't use it for a day.
My work machine is an M3 Macbook Pro. I put it to sleep on a Friday, and after a three-day weekend, it's still ready to go on Tuesday with 95% of battery left.
What's irritating is that a lot of Intel laptops used to be able to get pretty close to this, back when they supported legacy sleep states. I have yet to own a newer Intel laptop that can sleep for more than 24 hours without almost completely draining the battery.
I think it is really bad design. Perhaps it was necessary because of space constraints, and in that case it's understandable. But that is entirely different from good design, and I can't really buy the "use case" explanation.
Many leave their devices on their desk, and Apple has always had a problem with just letting devices turn off completely; there are regularly problems with it. And they do drain power on idle, which is a frequent complaint.
Yes, we are insane enough to use a lot of Apple devices for business in some departments. MDM for phones and iPads is great for baseline administration, but the devices are eccentric, to say the least.
On a related note, the original Macintosh shipped with a physically inaccessible reset button, and the manual cautioned against installing the (bundled) switch that enabled access because "using it the wrong way could cause you to lose information":
After buying one, I actually like it. I know exactly where it is, and can reach for it by feel more easily; I could never tell you whether the power button was on the left or right side of the old Mini/Studio without checking each time.
It's also larger, more satisfyingly tactile/clicky, and concave compared to the old button (which was rounded into the outside curve and not particularly satisfying to press). I think the old one being so small and indistinct-feeling, and so close to the cables, meant you would never try to reach for it blindly. You do have to lift it up a bit, but the device is so light you can do that with the same finger you're using to push the button (of course you need another finger to push the top of the mini _down_).
I think neither the old nor the new button was really meant to be used more than occasionally, since you typically wake your Mac from the keyboard, and both designs reflect that. I do sympathize that the new version could be less flexible in different mounting positions, though.
(that said, I'd bet Jobs/Ive Apple would never have shipped this, unless the height underneath was exactly perfect for even the larger fingers to fit)
The GP's same argument also applies to the Magic Mouse, as it happens:
> It's a way of signaling how the product should be used.
In the Magic Mouse's case, it came out just on the cusp of wireless mice becoming "a thing." Most people, if they were allowed, would have just left the mouse tethered to a computer by its charging cable at all times, since that's what they were used to. But Apple thought you'd be happier once you stopped doing that. So someone (Ive?) decided to make it so that you couldn't charge the Magic Mouse and use it at the same time. This did two things:
1. it forced people to try using the Magic Mouse without any cable connected, so that they would notice the added freedom a wireless mouse affords. It was a "push out of the nest."
2. it made charging annoying and flow-breaking enough that people would put it off as long as possible — which would make people realize that the Magic Mouse's battery lasted for weeks on a charge, and so you really never would need to interrupt your flow to charge; you'd just maybe leave it plugged in to charge when you leave work on a Friday night (and even then, only when it occurs to you), and that'd be it.
---
One could argue that the truly strange thing is that Apple has never changed this design, 15 years and one revision later. That's an entire human generation! Presumably people these days know that peripherals can be wireless and have long battery life.
But consider: Apple's flagship mousing peripheral — the one shown next to the Magic Keyboard in all product marketing photos — is the Magic Trackpad, not the Magic Mouse. The Magic Trackpad is the first-class option for multitouch interaction with macOS; some more-recent multitouch gestures don't even work on the Magic Mouse. (The Magic Mouse never got "3D touch", for one thing.) In other words, the Magic Mouse is basically a forgotten also-ran at this point — something just there on the wall in the Apple Store for those few people who can't stand the idea of using a desktop computer through a giant trackpad.
Which leads to an interesting question: what is the user-profile for the person who buys (or is bought) a Magic Mouse in 2024?
Well, probably one major user-profile is "your grandpa, a retiree from a publishing company, who's been using the same computer he brought home from work 20 years ago, until it broke last week — that computer being a Power Mac G5 with a Mighty Mouse; and who has never had a laptop, and so never learned to use a trackpad."
And if the Magic Mouse user is your grandpa... then said user probably does still need the cord-cutting lesson that the Magic Mouse "teaches"!
> it made charging annoying and flow-breaking enough that people would put it off as long as possible — which would make people realize that the Magic Mouse's battery lasted for weeks on a charge
At a certain point this just reads like Apple apologia. They made a mouse you can’t use while it’s charging as a means to advertise how long the battery lasts? What?
But it’s not an apology, it’s the right design decision. The battery charges to a usable amount extremely quickly, and if you could plug it in all the time most would, which defeats the point.
> if you could plug it in all the time most would, which defeats the point
The point of a mouse is to be a usable mouse. If folks care enough for it to be wireless then they can use it that way, but if they don't what's actually wrong with using it plugged in? Screams iPhone 4 era "holding it the wrong way". Baffles me why you'd want to provide fewer options for your customer to charge their wireless mouse in order to make them do it the "right way".
If you want to always drive a car with the parking brake on you can — it's your car — but if a driving instructor sees you doing it, they'll give you a demerit. Because you're massively hobbling the car vs. its design space.
> in order to make them do it the "right way".
To be clear, Apple likely didn't want to force people to always use the mouse that way; what they were likely aiming for was a "silent tutorial" — like the Super Mario Bros 1-1 "goombas hurt you, while mushrooms are something you want" thing.
It's just that, in a hardware product, there's no good way to force someone to do something a certain way the first time (in order to teach them), without forcing them to always do it that way.
I’m sorry but this is an absurd comparison. Driving a car with the handbrake on has an adverse effect on the primary purpose of the car. Using a wireless mouse with the wire attached still leaves you with an entirely functional mouse. Using it wirelessly is a preference. It is absurd to defend Apple forcing people to use it without a wire because it will “enforce design purpose”. If they need to do so then it’s the wrong purpose.
Why should Apple care if I want to leave the device plugged in all the time? How does this choice remotely affect them?
Same with this power button: why should Apple care whether or not I power off the device when I’m done using it and turn it back on in the morning? This all just seems like pointless behavior control.
> Why should Apple care if I want to leave the device plugged in all the time? How does this choice remotely affect them?
Because the wireless-ness of the mouse (while also being a macOS-compatible multi-touch surface) was the selling point / feature / Unique Selling Proposition of this mouse vs. other mice (and vs. the previous Apple Mighty Mouse.)
I don't know if you've ever had the opportunity to see many "normal" people's home-office desks, but I have — I worked as a call-out computer repair tech as a teen. And it taught me something: a lot of people have a really small or cluttered "mousing area" — often arranged in such a way that, for a wired mouse, the mouse's wire gets in the way of the mousing surface.
Picture, for example, an old 18"-deep sewing desk up against a wall, on which the user has placed their laptop [effectively permanently, as its battery is long dry]; with a bunch of other things like tiny little speakers and an inkjet printer competing for space on that tiny desk, such that there is only an 8"x8" square of free space to the right of the laptop. The user's mouse is then plugged into a USB-A port of the laptop that's also on the right [mouse cable is too short to plug it in on the left!], with the port being at about the center of the laptop's side. This mouse cable now "wants" to lie directly in the center of that clear 8"x8" square of space; and even if you bend it harshly, there's at least two inches of USB-A plug + cable strain-relief that will still be poking you in the hand.
(Why do they use a mouse at all, if they have a laptop, which presumably has a trackpad? Because trackpads on laptops — especially smaller/older/cheaper ones — can be ridiculously awful [tiny, laggy, insensitive, jumpy, etc], such that this cramped mousing experience is still better than the alternative.)
In such setups, "erasing" the mouse's tether to the computer is not just for aesthetics; it's a genuine ergonomic improvement that makes it "feel" better to use the computer.
And that means that any average cramped-desk person who buys one of these new-fangled wireless mice (or a computer that comes with one) — and actually does use it un-tethered — is going to become not only an advocate for wireless mice, but also likely an advocate of whatever brand the mouse/computer was, due to the novelty-capture halo effect. (I.e. the "if you only date awful people, you'll become obsessive about the first romantic partner to be decent to you" effect. Decency [or wirelessness] isn't unique; but if you only know it from one place...)
That viral halo-effect-induced word-of-mouth brand advocacy, created by being at the vanguard of the Bluetooth wireless peripheral transition, is the potential upside that Apple saw when creating the Magic Mouse.
And it wouldn't be one they could capture, if they allowed sheer incuriosity to lead that average cramped-desk user to never even try the mouse without the charging cable attached (or, worse yet, if the Macs that shipped with Magic Mice were set up by people who didn't even know the mouse was supposed to be wireless — thinking instead that the mouse was just a wired mouse with a "modular" cable!)
---
Now, admittedly, Apple had many other ways they could have achieved the same goals.
For example, they could have just detected that you're using a Magic Mouse with one of their computers for the first time, and forced you through a little software tutorial that gets you to unplug it — and use it unplugged — for a bit.
I'm guessing they didn't go with that solution for several reasons:
• it goes against the marketing of Macs as being "ready to use for productivity out-of-the-box". Forcing you through a hand-holding tutorial isn't very "ready." (And mark my word, if there was a skip button, even the people most in need of that tutorial — especially those people — would skip it. People don't read manuals on frickin' home CPAP machines, and then die; you think they're reading that?)
• Apple loves thinking of themselves as a design company first and foremost. (Apple products are all stamped "designed in California" — that's what Apple does there, they design things.) And if you know anything about "design" as an academic discipline, you know it's all about figuring out how to shape products or information in ways that cause people to subconsciously/intuitively make certain choices. The core of Information Design is visual hierarchy — "organizing and formatting text to ensure someone glancing at a poster gets the most critical information before glancing away." The core of Industrial Design is the concept of affordances — "putting push-plates on the push side of a door and pull-bars on the pull side." Apple doesn't want to stop you and tell you how to use their stuff; Apple thinks they are clever enough to design their products such that they afford being used in exactly the intended way. And when the product's design "fights back" from having a positive affordance to idiomatic usage... they just design more forcibly, actively de-affordancing non-idiomatic usage.
• A tutorial that pops up on Macs doesn't help someone who wandered into an Apple Store; bought a Magic Mouse (a perfect "this store is too expensive for me, but I want to buy something" purchase in an Apple Store ca. 2009); went home, and promptly plugged it into... their Windows PC. Yes, people really do sometimes buy Mac peripherals and expect them to upgrade their Windows-using experience, not realizing that Windows doesn't have the particular set of multitouch gestures mentioned on the back of the box (especially not back in 2009.) The "hardware tutorial", meanwhile, is platform-neutral.
Thanks for the really really long reply... You've exhaustively gone over the selling points for a wireless mouse and why people would want and buy one. I don't think any of it is in question. It's great to have a mouse that can work without a cord connected. I bought a wireless mouse (not Apple's) because I agree with you about the selling points. What I don't get is why not also allow it to be used plugged in, if the user wants to, assuming it costs about the same to put the charging port on the front, and can be done without compromising the industrial design? Why deliberately make it useless while plugged in?
Please explain how it's in my best interest that I must use my peripherals wirelessly. The only wireless mouse I have ever owned is in my work bag, so I have one wherever I go, it's not for regular use and I have zero problems with mouse cables, for the actual 30th year this December.
Because it’s the design of the product? Every product is designed with a specific usage in mind. This is designed to be wireless, hence all of the ways in which it enforces and enables that. The battery lasts a very long time, so even in your work bag it should be fine (although are you then plugging into many different computers to associate it?)
If you want a corded mouse (and it sounds like that’s a better fit), there are plenty of options on the market.
The port on the bottom is really the least offensive element of the design. I know people find it fun to clown on, but if any of them had ever used one for 5 minutes they would realize it's a terrible mouse for a bunch of other more important reasons (weight, feet quality, tracking accuracy, polling rate etc.).
Yeah, I hate that people go for the easy fodder, which barely affects real-world use, and ignore the multiple actual issues with it that would make it quite poor even if the charging port was fixed.
There is one thing where you are right: Apple doesn't care much about mice and is all-in on trackpads.
But this also completely disproves your theory about why the charging port is on the bottom of the Magic Mouse. Indeed, both the Magic Trackpad and the Magic Keyboard have charging ports on their back, making it very easy to use them wired, which many people do.
If Apple was so set on forcing people to use their wireless peripherals without the wire, you would find the port on the bottom too. Yes, it would be very dumb and this is exactly what it is with the Magic Mouse.
The only difference is that the Magic Mouse was first designed with a door for swappable batteries, and when they did the refresh they just put a port in its place with no further refinement.
Both the keyboard and trackpad shape changed (they became thinner because there was no need for round battery storage anymore) but they didn't change the mouse.
We can make all kinds of theories about why but the simple answer is that they don't care and they feel like their mouse looks nice and don't want to invest in changing the design.
This is pretty much all there is to it, lots of apathy towards customers and a general lack of care, otherwise they could have made some other upgrades (like the sensor) a long time ago, without having to touch this stupid charging port.
And this is why they get a lot of shit for it, and it's well deserved; if Apple is too lazy to make a mouse, then they shouldn't make one, especially not one that costs $100.
All the apologetic theories about teaching people to use wireless stuff are such nonsense, it's really crazy that people can believe that.
You might be right, but I'm not sure the timeline lines up.
The Magic Trackpad didn't exist when the Magic Mouse 1 (the one with AA batteries) was designed. So they definitely cared about the design of the Magic Mouse at that time.
The Magic Mouse 2 (the one that added the bottom charging port) was released on the same day as the Magic Trackpad 2 and the Magic Keyboard 1. The three devices were almost certainly designed together, likely by the same industrial designer. And, because the Magic Keyboard was a ground-up design at the time, this would have been one of the more-senior designers doing a complete design cycle, aiming to create a coherent "peripheral brand image" to suit the marketing of a new generation of Macs.
If that designer chose to do very little to the (external) design of the Magic Mouse 1 to update it to the Magic Mouse 2, that might be because they were taking operational-logistics concerns into account, e.g. a stock of existing aluminum housings + multitouch-digitizer-laminated plastic covers. But, more likely in my opinion, they just thought that the IXD of the Magic Mouse 1 already achieved its goals (in Apple's conception, not necessarily the consumer's!), and already aligned with the brand image they wanted for the Magic Trackpad 2 and Magic Keyboard 1.
Remember that the Magic Mouse is a mouse. You need to grip it and move it, and it needs to have a certain amount of inertia so that bumping it doesn't shoot it off your table. So it needs to have a certain weight and a certain height.
I have a strong suspicion that, back when developing the MM1, Apple's design team invested into a design-prototyping human-factors-analysis phase, to find an optimal height and weight (and center of gravity!) for the Magic Mouse, so that it would "feel good in the hand" and hit some optimum between "gliding and clicking well as a mouse" and "resisting running away from you when used as a multitouch surface."
If I recall consumer reviews at the time, the MM1 was taken as a step-change in the "prioritization of function over form" of Apple's mice. Before the MM1, Apple's last ergonomically-satisfying mouse had been the Apple Desktop Mouse II back in 1992! In the late 90s/early 2000s — that's the iMac "puck mouse" and Apple Pro Mouse era — many people had been just tossing out the mouse their Macs came with, and buying PC mice instead!
Since the MM1's (very likely) evidence-based design had been so successful, the designer of the MM2 probably wanted to reuse the "backed by the research" numbers the MM1 had arrived at. As long as human hands are human hands, those will still be the right numbers (at least when viewed through the lens of Apple's internal IXD-culture biases.)
There are several ways the MM2's designer could have aimed to hit these same numbers — but the simplest way to do it (provided the old design still "fit" in the new line-up) would be to keep the external form-factor the same (thus keeping the height and grip the same), and add just enough lithium-ion capacity inside the device, in just the right place, to hit the same weight and center of gravity that the MM1 had.
---
> both the Magic Trackpad and the Magic Keyboard have charging ports on their back, making it very easy to use them wired, which many people do
The Magic Keyboard and Magic Trackpad don't need to move. They're supposed to stay where they are, unless you pick them up. So they can maximize thinness and lightness (which looks "sexy", and is better for supply-side materials and shipping costs), while staying in place by just having really grippy feet (made of the most dust-collecting silicone I've ever seen on a device.)
Apple doesn't care if you leave the Magic Trackpad or Magic Keyboard plugged in all the time, because they're stationary. There's no User Experience "magic" you get by unplugging them. Unplugging them is convenient in certain limited-space environments, and de-clutters your desk, and looks good in product photos — but it doesn't make using them better.
---
ETA: I looked into what actually changed between the MM1 and MM2. You might be surprised!
Keeping the same external form factor doesn't mean that the MM2 was just "the MM1 with a lithium cell where the batteries had been."
Here are fully-disassembled images of the MM1 and MM2, c/o iFixit:
So, for starters, there was clearly a complete internal redesign and rework of the board. Different ICs, different layout — even a different sensor, in a different package, from a different vendor, with different optics.
Also take notice of how the charge controller is integrated onto the MM2's mainboard. There are companies that have transitioned pre-manufactured devices to a rechargeable rev, by "just slapping a charging port in where the battery door had been" — and those companies tend to stick the charge controller and charge connector together to form a little floating board, and run flying leads from that floating board to the mainboard in one direction, and to the battery cells in another, so that the whole charging assembly together "presents as batteries" to the mainboard, whether it's being charged or not. This was not that kind of hackjob.
And actually, the external design changed in several subtle ways, too. Note the differently-designed runners that interface into the housing in a different way, for example. Note that even the digitizer connection to the underside of the touch surface is different — which probably implies a different digitizer, and means that they couldn't reuse the existing acrylic top housings (that they almost certainly get shipped to them with digitizers already laminated in.)
In other words: the MM2 didn't reuse anything! These are entirely different parts that just happen to look the same on the outside.
There would have thus been no parts-reuse advantage in putting the charging port where they did. It was a "free choice" — they could have put it anywhere. They were milling out new, different aluminum bottom housings (that have more material than before, so they can't just be reworks of the previous rev's bottom housing) — there was nothing stopping the designer from putting a little hole in that bottom housing on the front side!
(Nothing, that is, other than the designer's likely belief that the MM1 form factor — where the bottom housing of the mouse tapers thinner at the front and back so that the top housing basically meets the mousing surface — was some kind of Good, Evidence-Backed Ergonomics, that they would be sacrificing if they made the whole mouse body a few mm taller to give a front-side port somewhere to extrude from. I repeat what I said before: this was an ideals-driven choice, not laziness.)
This is a really interesting view, and I have to admit this actually makes sense. Wireless mice definitely are nicer to use, and you can usually make them charge fast enough that a five minute charge while you take a short break is enough to get you through the day to a proper charge.
I must admit, in light of that logic I can totally buy placing the charge port like that solely to force users to use the mouse correctly.
The Magic Mouse refresh was done way after Jobs' death.
The Cube had a very annoying design for ports indeed, but at least it was very easy to open/repair, with a handle specifically for it.
But for sure it was a case of looks trumping everything, including practicality, unsurprisingly it didn't sell very well.
I mind this design decision a hell of a lot less than the baffling deliberate decision to map EVERY KEYBOARD BUTTON to be equivalent to the power button (which the damn thing already has) on their laptop line-up.
I love it when my macbook is turned off and I accidentally nudge a single letter on the keyboard and it powers back on - not to mention when you're trying to clean it with a microfiber cloth.
For better or worse, I have a habit of clicking the touchpad or a few keys after I shut down my laptop. Just to make sure it's shut down properly. Back in Windows days with HDDs and hibernate, laptops sometimes took minutes to shut down completely, and I don't like closing the lid before shut down is complete.
Now, I end up restarting with that mere act, and have to long-press to shut down again because the shut down option won't show up on login screen.
With a locked screen, key presses go to the password field. I have twice caused my user account to become disabled due to too many password attempts while cleaning my keyboard.
If you fully shut down a mac laptop, you have to press and hold the power button to turn it back on. Not sure what you’re talking about here and probably why you’re getting downvoted.
I don't know about every macbook, but I just tried this twice on my 2019 macbook pro and pressing any key on the keyboard (or at least the 2 keys I tried, "f" and "8") will power it on when it is powered off (yes, fully shut down, not asleep). Based on some quick googling, this still appears to be the case for M-series macbooks.
That’s a pretty wild definition of “fully shut down” that manufacturers (not just Apple) are pushing. When my device is shut down, I expect it to be fully de-energized and drawing zero current. How can a keyboard action re-apply power if the button itself is not completing the power circuit?
This is one of the reasons I’ve started putting all of my devices on power strips with physical switches that de-energize the AC mains. You can’t even trust devices to power off when they say they are off.
The number of devices in your home that draw current when they are “off” is too damn high.
All Mac devices still come with a headphone jack - and they are even good for higher-impedance headphones (I use a 32-ohm DT770 on my Macbook/MacMini).
For mobile devices, removing the headphone jack was not well received, and it annoyed me too when it happened. Last year I made the switch to AirPods Pro, and I think I was the last person on earth to switch to BT for headphones - never looking back. So much better not to have a cable to untangle.
I value flexibility in a product more than just about anything else. I will quite often choose the product with more features and use cases, even if it means paying a little extra money, just to have the _option_ to use a particular feature, even if I'm quite sure I won't use it on a daily basis.
You don’t press it very often and this makes it harder to press accidentally (eg putting stuff on top of the computer or a curious cat). I very rarely use the power button on a computer but maybe we behave differently.
That's what I was going to say. Do people still use these? Given the low power and the general stability (I often have 150-300 days of uptime on my macbook m1), why not just put it to sleep and wake it up with the keyboard/mouse? I can't even remember the last time I actually rebooted my desktop - maybe last year, and I'm not even sure.
I'm the first to shit on apple but this sounds like a complete non issue
The Mac mini M4 performance is around 4-5x in DaVinci Resolve for me - compared to my HP laptop (i5-1135G7).
Rendering HDR video was around 12fps there on the i5 - the same project in the Mac mini gets 60fps.
The M4 10-core GPU seems on par with or better than a mobile RTX 3060 (65W) for video tests (NR / Deflicker), so I'm also impressed by the M4's efficiency. A lot of performance per watt.
It's becoming a dedicated video rendering machine for me where all the SMB auto mounting issues with macOS seem solvable. Pretty happy so far with the base model price even in the EU. The power button placement is an annoyance for me, though.
When do you turn it off? I have a Mac M1 Studio and I just let it sleep. If things get weird I reboot. I think I recall using the power button about a year ago after returning from vacation after I had shut it down.
Right now I mount up to 7 HDDs to the Mac via SMB, have some Streamdeck / Pedal and the necessary external SSDs for fast storage connected. I will see if the SMB mounts come back OK after sleep (my laptop acts as server) but the Streamdeck and HDDs wake up randomly so overall it's easier to switch everything on and off depending on usage.
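In case it helps: one way people make SMB mounts reappear on demand is autofs, which macOS inherits from BSD. A rough sketch - the server, share, and mount point names here are made up, and putting credentials in a plain-text map is a bad idea, so treat it as illustrative only:

    # /etc/auto_master - add a direct map:
    /-    auto_smb    -nosuid

    # /etc/auto_smb - one line per share (hypothetical names):
    /mnt/media    -fstype=smbfs    ://user:pass@mediaserver.local/media

    # then reload the automounter:
    sudo automount -vc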
Everyone keeps citing idle, which is when the device is on and active but not particularly doing anything.
The standby power draw is 1W or less. I've used Mac Minis for years -- just replaced my M1 with an M4, though the M1 left me wanting for nothing -- and the number of times I've interacted with the power button is so negligible I imagine I've gone over a year without touching it. When I haven't touched it in a while it goes to standby, waking instantly when I engage it again.
Not everyone lives the same way. I am seriously considering a Mac Mini as my next upgrade yet I live in a RV and move frequently. Are there ways that I can keep the Mac mini powered while traveling.. sure, but why would/should I?
Are you not turning off entire circuits to reduce power draw when mobile? I’m actually thinking about one of these for my truck camper and its power draw seems fine, but the stumbling point for me is the additional power draw from the monitor it would require. I think I’m leaning toward an M4 MBP with nano textured screen for maximum power efficiency and ability to work outside when it’s nice, though I have not yet put much effort into researching efficient monitors
My EU mind is blown by these claims. Let’s take the lowest(1W) at sleep mode. With a thousand mac minis at sleep mode, that is already 1kW! In my country, a single person household’s yearly electricity package comes at 1400kW(+100 depending on provider) per year.
Note: intentionally keeping it simple, please don’t nitpick.
No household uses 1400kW, and kW/year doesn't make sense. Do you mean 1400kWh/year? That seems pretty low (NZ is 7000kWh/year), but if so, you're comparing power to energy, which doesn't mean much. 1W 24/7 < 9kWh/year, which is pretty small.
Personal guess from a fellow European citizen: I think they meant to say 1700 kWh/year. According to most German power utilities, the average 2-person household consumes about 2400 kWh/year.
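For anyone following along, the unit conversion goes like this (using the ~2400 kWh/year household figure above):

    1 W continuous = 1 W x 8760 h/year = 8.76 kWh/year
    1000 minis in standby = 1 kW of draw = ~8760 kWh/year
    8760 / 2400 = ~3.7 households' worth

So a thousand sleeping minis use roughly as much energy as three or four average households, not a thousand.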
Not really. Unlike previous mac mini models, the grille on the underside is both the air intake and exhaust. If anything it ought to be better upside-down, since convection now helps the heated air rise out.
It would make it fill up with dust, even while idle. Also, any fluid spill would be much more likely to cause damage.
The normal orientation is fine for most people who want things to be as simple as possible (ie, most Mac users). There is very little reason to ever turn it off. If you still do that frequently for some reason, just leave it in a location where the button is still easy to access.
The case is also going to radiate heat, and turning it upside down will make that less efficient. The base won't radiate heat the same way, due to being plastic and already being used to pass air through.
The case isn't thermally connected to the SoC in any direct way, so while I'm sure it does radiate some heat, I think it's pretty negligible. The PSU sits at the "top" (in regular orientation), and I don't think it runs much of a risk of overheating.
I just arranged a selection of 4K H.264/H.265 clips in a 2x2 grid on an 8K timeline in DaVinci Resolve.
Playback works well - up to 60fps. However, exporting to H.265 creates a lot of swap. Rendering went at 15-18fps. All videos were on an SMB network drive, but the GPU was the bottleneck for rendering.
Swap even reached around 24GB with 5 videos, which I tested first. Using 4x4K it went to 9 GB before stabilizing at around 2GB. No effects or grading whatsoever - plain 4K60 SDR videos.
One single SDR 4K clip renders to 8K at 25fps. Using Superscale 2x makes that 0.5-1fps.
For 8K rendering you may be better off with 32GB RAM minimum or trying the M4 Pro model maybe with 24GB. For 4K/6K editing the base 16/256 M4 Mac mini seems sufficient when all video storage will be on external drives or network.
thanks for checking! I think I'll get the M4 Pro and max out the RAM. I don't have the budget for whatever the future M4 Studio would look like, so this seems like a nice sweet spot.
I had a few OFX plugins and maybe had the browser running which may have impacted RAM. Depending on how much RAM you can keep free and the amount of grading on that 8K video you may be OK with the 24GB of the M4 Pro base model but yeah - more is always better though with Apple it's painful to add more of anything .. on a budget.
>The Mac mini M4 performance is around 4-5x in DaVinci Resolve for me - compared to my HP laptop (i5-1135G7).
You could pick a variety of non-Apple CPUs that easily deliver 4-5x the performance of an 11th-gen i5. Maybe don't be disingenuous: compare the M4 to a more recent CPU like the i5-14600K, which is also 4x the performance. I'm not comparing on power efficiency, since that was not mentioned at all as part of your comment.
So i5-14600K is 1.57x on multi-core, slightly worse on single-core. $235 for the CPU versus $599 for a whole system. Could maybe match the total price, but Intel won't be able to come anywhere close on the power efficiency.
The 120GB/s memory bandwidth of the unified memory helps especially with video, I guess. The M4 CPU isn't really stressed most of the time; it only maxes out on multicam and HLG conversions.
I once patched my old Dell T1650 BIOS for ReBAR support, yet the iGPU of the i5-1135G7 had similar GPU performance for video as the Intel Arc A380 in that desktop PC - the old PCIe 3 speed limited the A380's performance. I heard others report a smoother playback experience with Apple silicon compared to even an RTX 4090.
I get some delays when fast scrubbing through a 9 multicam 720p timeline and just 360p proxies. Still impressed compared to what I was used to. Video editors may be surprised about the performance for the price.
There is a Github project [1] which has detailed instructions. The ancient i5-3570 only allowed 2GB ReBAR, BTW. GPU-Z says ReBAR / 4G is activated and working, Intel Arc Driver does not see it but seems to use it. Some part of the BIOS had to be manually fixed, AFAIR.
The PC was given to me for free, the CPU was €11, yet overall I wouldn't recommend the process just for the result. It's only a little benefit, if any, though it was fun. On that occasion I also added an NVMe driver, which works well, as demonstrated for the similar Dell Optiplex [2][3].
4x vs. the old i5, not the M4. They are trying to say that comparing to a CPU released four years ago is pointless because the newer CPU is obviously much better.
It's not disingenuous to do a real-world comparison to a system you already own when stating the specs. It's actually much more useful to hear these real world anecdotes than to look at geekbench numbers.
I expected a fast M4 package but still was mind blown to see the video editing performance. After all these video renders run for many hours.
My 2 year old i5 laptop - even with 64GB RAM and 2x2TB SSD upgrades - was around the same price as the base M4 Mac mini and uses similar power. The PC surely is way more versatile with these specs and expandability.
Staying mostly in X86-land due to affordable RAM & storage, nothing I currently have comes close to the M4 performance per Watt - and now even performance per $/€ - in my video-editing use case.
It's comparing apples to oranges. If you want to compare computers, compare a macbook with an old i5 to your laptop with an old i5. Comparing an M4 to an old i5 is just silly. Of course it's going to be faster.
It's a comparison of two CPU/iGPU combos I have on my desk with similar power draw. Those iGPUs are very power-efficient for video editing, and I like Intel's QuickSync.
The i5-1135G7 (17W TDP) has 2 Media Engines which I use for proxy generation in parallel for example and pretty versatile so I use it daily (64GB RAM..).
Still, I think it's a notable achievement to get 4x performance with the M4 for video at similar wattage of the i5. I don't have an M4 MacBook but I guess the M4 would perform similar to the one in the Mac mini.
M4 Mac Mini with 16GB RAM is doing a "good enough" job of editing 6k raw footage in Premiere for my team. I'm surprised to say I'm content with the 16GB of ram so far.
Edit: This is in contrast to my M1 Macbook Air with 16GB of ram which would stutter a lot during color grading. So definitely feeling the improvement.
I bought the first MacBook Air M1 with 8GB because it was the only option available in my area. Initially, I had doubts, especially after using notebooks with more than 16GB of RAM in previous years. But I was genuinely surprised by how well the M1 performed. My takeaway is that there’s a lot of room for similar improvements in Linux!
And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.
> And while I'm broadly satisfied with its performance, I do think that the SSD is probably carrying some of that load. And for a machine that often gets used far longer than a PC, I can't see that being great for longevity.
This isn't the early 2010s anymore - SSDs last "long enough" for most people, to the point they are no more consumable than your motherboard or your RAM. (I've actually experienced more RAM failures than SSD failures, but that's an individual opinion here.)
And for the downvoters - do you remember the last time you handed in your Steam Deck, Nintendo Switch, iPhone, or even laptop specifically for a random SSD failure, unrelated to water damage or another external cause? Me neither.
I'm still very happy with my 8GB Air M1 as well. It's incredible how well it still works for a 4 year old entry level laptop. I see all these new M's come out, and I'm sure they're fantastic, but I'm not at all tempted to upgrade.
Yeah, I don’t know why 8gb base models get so much hate online. 8gb is 64 billion bits of memory. If you’re writing everyday software and you need more memory than that, you’re almost certainly doing something wrong.
I also use an 8GB M1. It has firefox with many tabs & windows open in OSX and also a Linux VM in UTM which is running VSCode, vite, and another firefox with lots of tabs. It's performing well! (although swap is currently at 2.3GB, and there's a further 3.5GB of compressed data in RAM)
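If you want to watch those numbers without opening Activity Monitor, both are exposed on the command line:

    # total / used / free swap
    sysctl vm.swapusage
    # system-wide memory pressure summary, including compression
    memory_pressure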
How much RAM should a few browser tabs and a spreadsheet use? Spreadsheets and webpages were both invented at a time when computers had orders of magnitude less ram than they do today. And yet, Excel and Netscape navigator still worked fine. It seems to me that bigger computers have caused chrome to use more memory.
If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?
That doesn't help you right now, but 22gb is ridiculous for a few browser tabs and a spreadsheet.
> If 16gb is considered to be a "bare minimum" for RAM, well, how much ram will all those programs use next year? Or in 10 years?
16gb is the figure for the next 10 years. If you see yourself being content with 8gb of memory shared between your CPU and GPU in 2030, you must have a uniquely passive use-case.
I remember when people said 4gb doesn't need to be the minimum for all Macbooks. Eventually MacOS started consuming 4gb of memory with nothing open. Give Apple a few years to be insecure about the whole AI thing and they'll prove to you why they bumped the minimum spec. Trust me.
It’s not just for tabs and spreadsheets, I also have an ide, containers, etc.
I do think the memory footprint of many applications has gotten out of hand, but I am more than willing to spend the extra money not to have to think about it.
This doesn't necessarily mean that your workload would perform unacceptably on an 8GB model. It just means that fewer optional things would be cached in RAM, more RAM pages would be compressed, and there'd be more swap usage.
I wish the same could be said of the Studio Display, which is quite power hungry. If the Mac is running then the display is using minimum 10 Watts of continuous power usage at all times, fan running, with the screen off.
I guess it takes 10 Watts to maintain the Thunderbolt controller, USB hub, A13 processor, and run the fan.
Power usage does drop to <1 Watt when the Mac is actually sleeping, unless anything is plugged into the USB hub. Even an empty iPhone cable will cause the display to draw 5 Watts. It's disappointing.
Also interesting, the M4 Mini has the flash storage on a replaceable module, instead of being soldered to the motherboard, although the NVMe controller is still integrated into the SoC.
iFixIt and others have already posted videos showing that the flash storage is now upgradable.
> M4 Mac mini Teardown - UPGRADABLE SSD, Powerful, and TINY
"Upgradeable" is too big of a word here, specially considering that they're using different form-factors even between the models released on the same year (e.g. pro vs non-pro) ; and also different from models released on the previous year (e.g. studio). This almost certainly means that next year's model will also use a different interface, so you won't be able to upgrade your storage at all.
You might be able to. You just need to make sure you get a compatible module somehow.
I wonder if 3rd parties will start selling them. If the storage controller is in the CPU, there's no reason for the little board housing the SSD to have any proprietary chips…
Depending on your level of price sensitivity, you can always use a Thunderbolt SSD, an external RAID array of SSDs, or just get the 10 Gigabit Ethernet upgrade and hook into local NAS.
The M4 Pro Minis support higher capacity modules, so it's not too shocking that they are not identical.
We've already seen videos from the usual suspects showing that people who are sufficiently skilled with a soldering iron can replace the flash chips in the modules with higher capacity chips, in addition to replacing the whole module.
As noted above, you can simply replace that module with a higher capacity module with just a screwdriver, as iFixIt did.
However, there is a real opportunity for those who do have soldering skills to make a quick buck here.
You could pretty easily buy the cheap base model M4 and resell it as a custom upgrade build, as long as you were clear that the SSD was no longer stock.
It's not an easy solder job and they are picky about what NANDs they work with and how they're configured. It's better than soldered to the board but not by much.
Sorry, but again this is an abuse of the word upgradeable.
You could do the same with many other laptops and even some phones out there - buy from a 3rd party who has resoldered the corresponding parts. The replaceable modules brought you nothing.
Even the economic motive you mention is actually just because Apple overcharges for storage, and has nothing to do with the replaceable modules. As long as there is no cheap 3rd party source of these, a replacement module ecosystem makes no economic sense (someone will always have to bring in his device for resoldering, or you lose the price of one good working base model).
Apple bought a company that designed enterprise SSD controllers over a decade ago.
> Anobit appears to be applying a lot of signal processing techniques in addition to ECC to address the issue of NAND reliability and data retention. In its patents there are mentions of periodically refreshing cells whose voltages may have drifted, exploiting some of the behaviors of adjacent cells and generally trying to deal with the things that happen to NAND once it's been worn considerably.
I think the reason to make it replaceable/removable is to reduce e-waste at EOL. Lots of companies have policies requiring data storage in decommissioned computers to be physically destroyed, so making it replaceable allows the machines to be repurposed afterwards.
No, it is soldered to the storage module. You have to desolder the flash chips from that module and replace them. You can't just order a bigger storage module from Apple (or anyone else) and plug it in.
I don’t think anything is stopping you buying a second hand / 3rd party module online. It just needs to be physically compatible with your particular generation of hardware.
I don't think this is true. If you watch the videos, dosdude1 specifically says he had to order blank NANDs for this process. Then you DFU restore the system from another mac. I have no proof, but I assume part of this DFU restore process is the new NAND chips being hardware paired in some way.
Again I have no proof, but there must be reasons he claims they have to be blank NANDs
The video at the base of this thread has iFixIt take the 500GB SSD from one Mac Mini and swap it with the 250GB SSD from another, and both recognized and worked with the replacement.
It is a swappable part. Which means much more attainable servicing for flash failure or exhaustion, and possibly even upgrading storage in the future.
Not wanting extensive interactions with Apple's legal team. Charging $400 for $10 of storage means they have a lot of money to harass you with very well-paid lawyers, even if you are in the right.
Maybe they already have, depending on what you need. Settings >> General >> Sharing provides lots of options. "Remote Login" is SSH and SFTP, and last time I used it, "File Sharing" was SMB. "Screen Sharing" and "Remote Management" seem useful, too. I assume that "Media Sharing" is supposed to allow iTunes on your network to see media files, although I've never used it and the information on the dialog is limited.
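Most of those toggles can also be flipped headlessly. For example, "Remote Login" maps to (a sketch; run locally with admin rights):

    # enable SSH/SFTP ("Remote Login")
    sudo systemsetup -setremotelogin on
    # confirm
    sudo systemsetup -getremotelogin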
Yes, but getting it to work requires that you both:
(a) disable FileVault, and
(b) enable automatic login
One option is to automatically log in to an account which has very little access, and have everything sensitive on an encrypted disk/partition, and to use a separate keychain for any credentials you want to protect.
I don't like the idea of enabling automatic login on any machine, so I keep FileVault on and just accept that any rebooted Macs will need physical access on restart.
If it's possible somehow to get screen-sharing access (or even SSH) without automatic login after a reboot, I'm sure lots of users would love to know how.
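There is a partial answer for planned reboots: fdesetup supports an "authenticated restart" that stashes the FileVault key for exactly one boot, so the machine comes back up unlocked and reachable over SSH/Screen Sharing. It doesn't help after a power cut, though:

    # reboot once without requiring physical unlock at the boot screen
    sudo fdesetup authrestart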
Linux support. MacOS is a desktop-first, GUI-based operating system. Linux, on the other hand, is a server-first, CLI/terminal-based operating system. Everything server-related is designed to run on Linux first and foremost, and may or may not incidentally also run on MacOS.
macOS is explicitly designed to not be a server, and the consumer hardware it runs on is also designed that way. Apple even discontinued the Server tools that you could buy on the App Store that used to be called Mac OS X Server.
If you want to run Linux server apps, you should run Linux. Because Apple hardware and macOS isn't giving you any advantages over a generic piece of hardware running a Linux distribution. The hardware costs more and is less upgradable than off-the-shelf hardware.
Servers should not run desktop environments because they are a waste of resources and widen the attack surface due to having more components installed and running.
And even if you want a desktop environment for your Linux server, Linux most certainly has a wide selection of mature stable desktop environments.
If you need to do development work or just achieve the goal of running Linux applications on a Mac, that can be easily done via virtual machines, containers, etc.
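To make that concrete, a minimal sketch, assuming a Docker runtime (Docker Desktop, colima, etc.) is installed:

    # drop into an Ubuntu userland on a Mac in one line
    docker run --rm -it ubuntu:24.04 bash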
If they work on a BSD they should work okay on macOS. (Not because macOS is exactly like FreeBSD, just that it means the project has been tested cross-platform.)
This isn't the market for MacMinis though. Why are people on this forum so bad at understanding market segmentation? Apple made an incredible desktop machine that happens to work pretty damn well as a server if you poke around.
This machine is for people at home editing video. It's great in the field for production, where it goes from pelican case to hotel desk to folding table to pelican case to cargo hold to storage.
Have you ever thought that maybe people understand "market segmentation", but at the same time, they'd like to know how broad a range of computing options one would have on these general purpose computers, with price tags in the many-hundreds to thousands range?
Sure, but to complain that a Mac (which, come on, at this point has been a known quantity for 20 years) doesn't run Linux is just looking to complain. If you want more options there are endless x86 choices, and if you want ARM then demand better from other manufacturers as well. Apple showed it's possible; why doesn't Dell come out with something comparable? I'm not a fanboy, I run systems of all stripes, but Macs aren't designed to be servers (even though they operate perfectly well as one) and people need to stop complaining that they aren't.
In the past, Apple sold at least four generations of the Mac mini that included models literally branded as server models. Continued interest in using more recent models as servers is quite reasonable.
I run full multiple Ubuntu desktop VMs on Parallels on a M1 MacBook Air. You can use Docker for server installs, sure, but QEMU also works great on Macs and with Rosetta you can even get pretty damn close to native x86 execution speeds.
They run through virtualization, which is clunky to interface with across boundaries and introduces overhead. I also don't think it has any hardware acceleration for things that would benefit from using the GPU.
MacOS has built-in file sharing via SMB. It also has built-in VNC for graphically administering the server, built-in ssh/sftp, built-in rsync for backup, etc. etc.
I see, thank you. I dream of setting up something like iCloud, but with open-source software and hosted at home :) Not sure if there is anything like that out there.
MacOS would need syncookies to be a viable tcp server on public IPs, IMHO, but MacOS pulled FreeBSD's TCP stack a couple months before syncookies were added, and they never rebased or otherwise added syncookies later.
I haven't looked into if they pulled any scalability updates over the years, but I kind of assume they haven't, and the stack would have a lot of lock contention if you had more than say 10,000 tcp sockets.
Given that, if I were Apple compatible, I might run a mini as a LAN server, but my home servers provide services for the LAN as well as some public services (of limited value and usefulness, but still public).
Is this something that you can fix by putting the server behind Cloudflare? I assume most "home server" users would do that (or a similar service provided by Apple if they go down that route).
What I look for is: 128GB RAM minimum, a decent number of PCIe lanes because I want two fast NVMe drives, an HBA card (though I guess this could be external), two network ports minimum, ZFS, a sane terminal, native support for containers and VMs, native support for UPS interfacing, and native support for backup of containers and VMs. And lastly, a community of other users doing the same.
Kinda unrelated, but with all this amazing efficiency, I wish Apple would re-introduce the one feature that made me truly love my old MacBook.
My ~2010 MacBook Pro had a series of small green lights on the chassis that acted as a battery indicator. When the laptop went to sleep, they would take on a slow, breathing-like animation. It was beautifully done, and I was sad when it was removed.
The M4 has 6 efficiency cores and 4 performance cores. That's 2 more e-cores than previous generations with the same number of p-cores, i.e. a higher e-to-p core ratio, which can explain a large part of the increase in power efficiency. Not to say that there is nothing otherwise remarkable here, of course there is. But if the author found a 30% increase in efficiency over the M2 when they say they expected 4-10% after two generations of chips, the core mix could be a big part of why.
The M4 Pro has 4 e-cores and 8-10 p-cores, hence I would not expect a similar increase.
That would explain improved performance, but I don't actually understand how that would improve efficiency, particularly at the high end where they quote 6.74 Gflops/W.
In particular, under some ideal, unrealistic assumptions to simplify things, denote by n the number of e-cores, by E the efficiency (work per watt) and by W_e the power consumption of each e-core, and respectively by m, P and W_p the same quantities for each p-core, where E > P. We can then express the overall efficiency EFF of the CPU as the power-weighted average EFF = (n*E*W_e + m*P*W_p) / (n*W_e + m*W_p), which shifts toward E as the e-cores' share of the total power budget grows.
While it may not be the literal fastest CPU ever, it still seems very, very fast, and the efficiency is pretty compelling. I'm not sure how much of those efficiency gains are a product of the design constraints that Apple is not beholden to (external memory, x86 backwards compatibility, other aspects of the AMD64 architecture, etc.), the slightly better process nodes, or superior design. I'm honestly dying to know, but I guess we won't find out, and as far as the products go, it doesn't really matter that much. The end result is a pretty good deal.
As a mainly non-Apple user I see the following caveats for my own uses:
- I'd love to see better Linux support. (As far as I know, Asahi Linux only covers the M1 and M2 lines, and as amazing of a project as it is, last I looked, it's neither upstreamed nor exactly what one might consider first class. Maybe it's getting there now, though...)
- I'm worried about the SSD situation still. It seems like it hasn't amounted to much (yet), but some use cases might be more impacted than others, and once the SSD does finally fail, the machine's dead. This is not how things work in most PCs, even mini PCs, and it's a bit of a hard pill to swallow.
- The pricing is great at the baseline, but it gets progressively worse as you go up. The Apple M4 Pro Mac Mini has a baseline price of $1,399.00, which I think is pretty decent for a high-end computer with 24 GiB of RAM. But, it maxes out at 64 GiB of RAM, which is less than half of what I have in my current main machine, and believe me, I use it. That 64 GiB of RAM upgrade costs $600. For comparison, the most expensive 64 GiB DDR5 RAM kit on PCPartPicker is $328.99. Don't get me wrong either, I understand that Apple's unified RAM is part of the secret sauce of how these things are as efficient and small as they are, but at least for my main computer I really don't need things to be this compact, so it's another tradeoff that's really hard to swallow.
But on the other hand, for people happy to use macOS as their primary operating system, the M4 line of Macs really does look like the best computer Apple has ever produced. (For me, it is rare that I feel compelled to even consider an Apple computer; the last time was with the original M1 Mac Mini, which I did buy, although after some experimentation I mainly just use it for testing things on macOS rather than as a daily driver.) There really aren't many caveats, especially since the base memory configurations this time around are actually reasonable.
I suspect these things could be great on homelab racks if the longevity issues don't wind up being a huge problem.
> - I'd love to see better Linux support. (As far as I know, Asahi Linux only covers the M1 and M2 lines, and as amazing of a project as it is, last I looked, it's neither upstreamed nor exactly what one might consider first class. Maybe it's getting there now, though...)
As I understand it, the M3 is not supported because there's no M3 Mini to run the continuous integration on.
There is now an M4 Mini, so there might be a chance Asahi Linux will eventually support M4.
I don't think there's enough high quality benchmark information to really make a statement like that, but most importantly, I care about both single-core and multi-thread performance. I don't really have any workloads that only use one thread.
Comparing the M4 with PC CPUs will be hard. Typically when comparing two PC CPUs, to make the comparison more realistic, you'd set some reasonably similar constraints, like using the same memory kits and so on. However, even without considering overclocking, the actual performance of a given CPU can vary massively depending on the thermals, power delivery, memory and so forth. (It can vary by over 50%. I didn't check, but you should be able to see this on benchmark charts that allow user submissions.)
(However, for what it's worth, I always do at least a bit of mild overclocking personally. Nothing extreme, but what does fit within the power and thermal budget is basically just free performance at the cost of some efficiency, a trade-off I'm happy to make for my main desktop machine.)
Nah, that's all pointless trivia. It's dark inside the box. Nobody gives a rip whether the Mini is faster because it's got better RAM or because it's got better arithmetic logic. So you do not have to control for things like memory, because you don't have a choice anyway.
You don't really seem to understand the point of benchmarks. You're trying to compare the performance between two devices to quantify which one is better at some specific task in some scenario. The tricky part here isn't that people care whether the CPU is better or not, the problem is that on the PC side you can fix the variables between CPUs so that you can just look at the value of individual CPUs, but you can't do that when comparing across PCs and Mac devices. So what do you pick to compare with? There is no correct answer, but there are some answers that are more sensible than others. e.g. you probably don't want to jump massively into another price class.
If money is no object and you just want ridiculous multicore performance, it's going to be pretty hard to beat EPYC. Yes, the single-core performance is going to be worse; it probably won't be the best even among PC parts, but many use cases gladly take that tradeoff.
The SSD in the new small Mac Mini is replaceable, though it is proprietary (not standard NVMe), and the M4 base version and M4 Pro version use physically different drive sizes and shapes that are not interchangeable.
Are there any benchmarks for these chips doing regular 'data-sciency' CPU grunt work? Dataframe wrangling, inverting matrices, doing large matrix factorisations, fitting decision trees, etc.?
I'm very keen on one of these, but I simply have no idea how good they are at my day to day tasks in R or Python.
It depends. If you're using Python with numpy>=2.0.0 (and macOS>=14), then you should benefit greatly from Apple's Accelerate implementation of the BLAS/LAPACK routines that are behind most linear algebra operations. I'm not aware of any serious public benchmarks, though.
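Not a rigorous benchmark, but a quick way to check whether your numpy build is actually linked against Accelerate, plus a crude matmul throughput number. A minimal sketch; exactly what show_config prints varies between numpy builds:

```python
import time
import numpy as np

print(np.__version__)
np.show_config()  # look for "accelerate" in the BLAS/LAPACK sections on macOS 14+

n = 4096
a = np.random.rand(n, n)
b = np.random.rand(n, n)

a @ b  # warm-up so the timed run isn't skewed by first-call overhead
t0 = time.perf_counter()
a @ b
dt = time.perf_counter() - t0

# A dense n x n matmul is ~2*n^3 floating-point operations
print(f"{2 * n**3 / dt / 1e9:.1f} GFLOP/s")
```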
> If only they didn't put the power button on the bottom.
I can't tell if anyone is being serious about the "Powergate" issue. The thing is 5" wide and weighs 1.5 lbs, it's not exactly a burden to lift it a little. And there are highly practical workarounds: https://www.reddit.com/r/macmini/comments/1gncek7/nailed_the...
I consider it a typical Tim Cook decision: the man leads a company that made one of the fastest CPUs in the world and got it to draw as little power as a Raspberry Pi. Absolutely crazy feats of engineering, design, manufacturing… and yet
there is that ONE detail that would've made it perfect, but it's botched!
I don’t mind it too much, since it’s still 99% close to perfect.
Tim Cook cares about money and efficiency of building and moving product. That’s it. I highly doubt there’s been any important design detail about any product that he made himself.
Hah, "Tim Cook decision" pretty much sums it up; it's the kind of thing that wouldn't have lasted 5 seconds when placed in front of Jobs (although there is a strong chance Jobs would have demanded his own nonsensical addition/subtraction to the design).
Jobs would have removed the power button entirely.
And then when there's a fault requiring a hard reset to fix you have to insert a bent paperclip into a tiny unlabeled hole on the bottom, or spell out a message in morse code by unplugging and re-plugging the power cord with some special timing. (This is not sarcasm)
Was Jobs in charge when they decided to place the power connector on the bottom of the Magic Mouse? But it's fine, because it can fit in a manila envelope.
Jobs would have kept the button on the bottom, as it's not the proper way to use a computer.
Instead, he would have put motion/light sensors on the screen, so it would automatically wake up when you are sitting in front of it. Macs don't shutdown, they just go to sleep and wake up when you need them.
Yeah, he likely would have said no ports, or let's have only one port, or he would have demanded that the Mac mini have dimensions in some multiple of pi…
1. Given the millions of things that are perfect, it only takes one of them being off for HN to lose its mind; the power button happened to be it this time. Cook didn't decide that.
2. How often exactly do people have to turn a Mac that consumes less than a Pi off and on, for them to constantly be reaching for that power button?
It's not like Tim Cook personally decided to put the button there, but saying over many years he's aligned the company to be one that would leave the button there rather than bite the cost of putting it somewhere more ergonomic is something I can buy into. Seems like a way to improve margins generation over generation, which is the kind of thing he's obsessed with.
This is also the same Apple that made the G4 Cube: that felt like this in reverse, with Jobs driving them to make a capacitive touch button because of an obsession with a seamless surface.
Yes, that's it. Jobs's annoyances were always about achieving a better product, a higher level of refinement, or something of the sort. It was mostly "it can be better this way", and he was very often right, even if sometimes not.
On the other hand, with Cook, it's always about cost cutting and corner cutting and the likes. It feels cheap (especially considering the pricing and brand aspirations) but also primitive and unrefined.
Which is why their price escalation is unjustified: if you want to charge a lot, you need to figure out a no-compromise product, and in my opinion they have not been there much recently...
Shame they got rid of the ability to power the computer on and off from the keyboard. I know it's been that way for some time, and I'm sure there's a good reason for it (maybe it doesn't work well over BT, or simply few generic keyboards offered a power button).
The comment in the article is in the context of rack mounting them which is a common thing to do with Mac minis. Having it on the bottom makes it hard to press as you can’t lift them up when they’re secured in a mount.
> Having it on the bottom makes it hard to press as you can’t lift them up when they’re secured in a mount
A hard reboot is the only situation where you should be using the physical power button on a modern Mac. If you're installing Macs in a rack, presumably you can sudo shutdown -r.
The button on the bottom is trying to tell you that the system is built to be well behaved on standby.
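For what it's worth, the soft reboot doesn't need the button at all. A minimal sketch over SSH using paramiko, assuming Remote Login is enabled on the mini, key-based SSH auth is set up, and the account can run shutdown via passwordless sudo; the hostname and username are placeholders:

```python
# Soft-reboot a racked Mac mini over SSH instead of touching the power button.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("mini-01.rack.local", username="admin")  # placeholder host/user

# shutdown needs root; assumes passwordless sudo is configured for it
_, stdout, stderr = client.exec_command("sudo -n shutdown -r now")
print(stdout.read().decode(), stderr.read().decode())
client.close()
```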
I am working on a solution to make it easier to hit the button from the front of a rack shelf, but the fact I have to mess with 3D printing just to hit a power button is silly.
Older Macs also had the power button on the back, which was also annoying, but at least a Mac that's secured to a shelf could have its power button pressed pretty easily.
The Mac mini _requires_ a mechanism to press up from the bottom in any permanent-ish install.
I would have thought that them being slightly higher than 1U would have precluded people from rack mounting them "flat" in the first place. It seems like it would be more efficient to rack mount them standing on their sides, and then the air gap between them would be enough to reach the power button easily.
Because the comment is very specifically talking about rack-mount installations. Granted, no matter where you put the power switch, it's going to be difficult to reach if you install 21 of them on a single shelf.
> Apple VPs Greg Jozwiak and John Ternus explained in an interview to a Chinese content creator on Billibilli (spotted and machine-translated by ITHome) that the main reason the power button is on the bottom of the 2024 Mac Mini is because of the computer’s size. Since it was nearly half the size of the previous generation, the underside was “kind of the optimal stop” for a power button. They also say most users “never use the power button” on a Mac, anyway.
> Apple isn’t wrong here. The Mac mini measures 5 x 5 x 2 inches, compared to 7.75 x 7.75 x 1.4 inches from the last generation; it takes up much less space on your desk, which is great. The trade-off is that you run out of space for some important things, like a power button.
That explanation makes no sense. There are many mini PCs of the same size that have their power button in an accessible location.
The excuse that most users never use the power button is the "you're holding it wrong" of 2024. Stop telling me how to use your devices, Apple.
The explanation mentioned on several forums, that it's a cost-cutting measure to avoid cutting yet another hole in the aluminum case or routing the power cable, makes no sense either. This is a state-of-the-art machine, yet they're cutting costs on such trivialities? Give me a break.
This is unequivocally poor design. Yet Apple will never publicly admit that, and will gaslight everyone to think it's actually good, as they usually do.
They've managed to get people to accept things they'd never accept in the Intel or Android ecosystems: no SD card, no memory expansion, no dual SIM, etc. That gives them the confidence.
I guess once the system shuts down, you can switch off the power at the mains or adapter socket.
> The thing is 5" wide and weighs 1.5 lbs, it's not exactly a burden to lift it a little.
It's the difference between being able to hit the button one-handed or needing two hands. My Mac Mini is sitting at the back of my desk, and the power button is toward the rear end of the Mac, and I definitely find it a bit clumsy to reach back with two hands, flip it over (disturbing any wires/peripherals that might be plugged in), find the button, and press it.
> And there are highly practical workarounds
Not as practical as putting the button on the front or top.
It's certainly not a deal breaker, but I do find it mildly annoying. The ideal for me would be to have the button easily accessible on the front or top, and have it behave like other devices I use: a short press to sleep/wake, and a long press to initiate shutdown. And when I'm getting up from my desk, I could give it a quick tap to put it to sleep and lock it.
My workaround is to use a keyboard shortcut to put it to sleep, which works fine and is not a big deal. But I still think Apple deserves a bit of mockery for this decision.
As with all things regarding power efficiency, you have to consider the wide use of these devices, not just the individual use.
If moving the power button there changes the behavior of thousands of people that would typically shut their computer down when they're not using it, that half glass of orange juice turns into thousands of gallons.
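Back of the envelope, with every input an assumption rather than a measurement, the aggregate numbers do get big quickly:

```python
# All inputs are illustrative assumptions, not measured values.
idle_watts = 5           # assumed idle draw of a mini desktop left on
hours_idle_per_day = 15  # time it would otherwise have been shut down
users = 100_000          # assumed population that stops shutting down

kwh_per_user = idle_watts * hours_idle_per_day * 365 / 1000
print(f"~{kwh_per_user:.0f} kWh/year per user")                # ~27 kWh
print(f"~{kwh_per_user * users / 1e6:.1f} GWh/year in total")  # ~2.7 GWh
```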
I have pressed the power button exactly once, since Friday (the day I got it). All other restarts were "soft" (including a couple of crashes). The keyboard and trackpad do fine, starting a shut-down computer.
It's replacing a docked MBP. That power button was a lot more difficult to reach, and I needed to hit it more often than this.
I spent a few minutes looking up whether a Mac could be booted from a Bluetooth keyboard but couldn't find any documentation of that. Back in the day some(?) Mac models could be booted by a USB keyboard, see https://www.projectgus.com/2023/04/griffin-imate/ for technical details.
I just bang on the spacebar, and it starts up. It's a Bluetooth keyboard, so I guess the system is listening to BT. I did that with the laptop, forever.
The Mini starts up a lot faster than the laptop.
That said, I should actually do a test, to make sure that the system is in real shutdown...
Nah. I'm wrong. The laptop started that way, probably because the keyboard is attached to a CalDigit dock, and tapping on the keyboard probably sent power to the device, which starts it.
That doesn't happen with the Mini, if I actually do a shutdown from the menu.
Apple has something I think they call "Deep Sleep," which is basically a shutdown, and that wakes from the keyboard.
That said, it's not a big deal to reach under the left side, and tap the button. The laptop was a pain, because I had to open it up.
But I've only had this thing a few days, and haven't had a chance to really torture it, yet.
It's probably easier to find and press than the old 27" iMacs. I always had a brief moment of trouble feeling around the back to find that darn button (part of the reason is that you need to press it very infrequently).
I can't imagine it's anything but a silly comment. Macs have the equivalent of wake-on-LAN, plus you can configure them trivially to restart after power loss. The idea that you'd have to press the button often is just silly.
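For reference, both behaviors are plain pmset settings: autorestart is restart-after-power-loss, and womp is wake-on-LAN (magic packet). A trivial sketch if you want to script it rather than click through System Settings; changing the settings requires root:

```python
import subprocess

# Enable restart after power loss and wake-on-LAN in one shot
subprocess.run(["sudo", "pmset", "-a", "autorestart", "1", "womp", "1"],
               check=True)

# Print the current power management settings to verify
subprocess.run(["pmset", "-g"], check=True)
```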
Given my cat, after learning to press a button on his automated feeder, now presses anything that looks like a button with the curious expectation of food, I can only presume he got out while I was in Cupertino.
Button on the bottom isn't a design mistake. It's an opinionated choice.
Sounds vaguely like "you're holding it wrong". Maybe it was always supposed to be placed on its side? Apple should clarify why the button is on the bottom.
Minis are rarely racked in bulk unless you're running a server farm, which is not the use case they're designed for. The MacMini is first and foremost a desktop computer for non-professionals, or at least not sysadmins. If people want to rack them, go ahead, but in that case how often are you hard-rebooting a machine vs. soft-rebooting it anyway? Macs aren't known for freezing up much.
Either way, it works for the use case it's designed for.
Some of the rackmount kits for previous generations already reroute the power button and connectors to the front, like this https://racknex.com/wp-content/uploads/2023/04/with-power-bu.... (Though why not just install it backwards?) I guess they will be able to run a little lever under the M4 model the same way.
Actually, the M4 model is a little taller so it no longer fits in a 1U rack mount. Whereas before you could fit 2 horizontally in 1U, now you'd possibly fit 8 or 9 vertically in 3U. (Edit: This company says 10 per 2U https://www.racksolutions.com/m4-mac-mini-apple-hypershelf.h....)
I think the airflow for more than 3 per 1.33U, or 8-9 per 3U, will necessarily suck.
I have designed for both, I think both have great use cases. 2 x 8 in 6u is really neat and tidy, I just don't love the concept of sitting the fans on their side, though I think they'll still last 5 years.
I have a 5K monitor for the Apple Silicon MacBook which I could easily switch to a MacMini and would be ergonomically better. My other MacBook is also about 10 years old and it's a question of how long I stretch everything out while it's all basically working even if not on the current OS.
Stupid question but why can't it (non Pro particularly) be powered over USB C like a MBP, if it's so power efficient?
Does it have extra performance that makes full use of the 155W input?
What would it have taken for Apple to give us the option of powering it over 100W USB-C (e.g. the ability to throttle power usage to match the power input)?
I just bought an ASRock Deskmini x600 with a Ryzen 7700 to run as low-power Linux server / workstation. Given the trouble I had with this thing due to (I believe) buggy amdgpu drivers and/or buggy firmware, I'm inclined to throw it out and just buy this Mac Mini.
I've been lurking here for more updates on the Mac Mini M4 since I haven't bought mine yet. I also shared some thoughts in previous comments [1][2], as I'm not only impressed by the technical achievements and form factor but also interested in seeing how Apple's business evolves over the next few quarters. I'm curious whether Apple will increase its market share on the desktop side while continuing to dominate in mobile.
What I am wondering right now is how far we can push the M4 chip. I've seen a lot of people using it for LLMs, making clusters with ExoLab. It's amazing how it performs with such efficiency.
I’m contemplating whether this can handle editing 8K video. Does anyone have any idea?
My goal is to venture into 180-degree VR production. With the Canon R5C and the RF 5.2mm f/2.8 Dual Fisheye Lens, I want to produce stereoscopic video at 8K. However, rendering such high-resolution footage demands substantial processing power, and my current setup definitely isn't enough.
Base version, probably not. If you went with one with the M4 Pro CPU and 48GB or greater RAM, you should be OK. Storage isn't as much of an issue, with Thunderbolt 4/5 drives easily reaching speeds similar to the internal storage.
I've done some 8k fisheye footage and converted for Vision Pro / Quest with the prior gen hardware kit and was able to edit and process it on an M2 Max with 96gb ram.
I just sold an M1 Mini and grabbed a Beelink. It's wonderful being able to run whatever OS/distro I want on the Beelink, and it's plenty strong enough for whatever. I love this adoration for Apple, primarily for my investment accounts, but in a cloud world I have no idea why there's such a demand for the M4.
I also have a Beelink, but it skeeves me out a bit. It's probably going to drive me more towards an Apple M{X} for my next home server. Even though I know those parts are also made in China, I trust Apple's sourcing more. Beelink's stuff is so affordable that it makes me wonder why that is.
The globe continuously inching towards war makes me quite paranoid, unfortunately.
Beelink's $299 base model of their compact desktops is enough for most regular home users. Not as powerful as the Mac Mini M4 but half the price and double the storage. Plus you can upgrade your own storage to 2TB for about $100 and it'll be faster than the Mac Mini's.
Yeah. I scratch my head when there is a ton of new models and competition happening in the mini PC market that provides good value for end users, but nobody notices; then Apple releases a new mini PC and suddenly it's like a breakthrough or something. I even see people say "the entry 8-core Ryzen you can get on a desktop PC is the 7700" when discussing this new Mac mini on a tech forum, as if giant towers are the only kind of desktop PCs and the 7840U/7840H(S) doesn't exist. (You would expect these people to know better.)
They've publicly disclosed that they built custom Apple Silicon servers to power Private Cloud Compute.
"The root of trust for Private Cloud Compute is our compute node: custom-built server hardware that brings the power and security of Apple silicon to the data center, with the same hardware security technologies used in iPhone, including the Secure Enclave and Secure Boot."
Why use Macs at all? It's well within Apple's abilities to make a completely custom server motherboard built around Apple Silicon, rather than hacking something together out of Mac parts. That would allow for much better rack density, and they could add proper server amenities like a BMC for remote management.
> The M4 Pro continues to be manufactured using a 3-nm process and on the old M3 Pro (27-28 watts), we measured a lower consumption than on the M2 Pro models (~36 watts), despite its improved performance. In contrast [...] the new M4 Pro can consume up to 46 watts, settling at around 40 watts during the further course—so at its peak, it consumes 60% more.
Assuming they refer to the full chips rather than the binned ones, each generation of pro chips has the following number of p and e cores:
| Model | # p-cores | # e-cores |
|--------|-----------|-----------|
| M2 Pro | 8 | 4 |
| M3 Pro | 6 | 6 |
| M4 Pro | 10 | 4 |
Thus the M3 Pro has more e-cores and fewer p-cores than the M2 Pro, hence the big increase in efficiency, while the M4 Pro has more p-cores than the M2 Pro, hence the increase in consumption. It is all about tradeoffs and, honestly, the result is pretty much expected when you count the cores. I assume there is some improvement per generation, but if the number of cores is not constant, the core mix is going to drive most of the variance generation to generation.
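To make the tradeoff concrete, here's a toy calculation under the simplifying assumptions from earlier in the thread. The per-core efficiency and power numbers are invented for illustration (all that matters is E > P); the core counts come from the table above:

```python
# Power-weighted efficiency: EFF = (n*E*W_e + m*P*W_p) / (n*W_e + m*W_p)
# E, P are per-core Gflops/W and W_e, W_p per-core watts (made-up values).
def eff(n_e, n_p, E=8.0, P=4.0, W_e=1.0, W_p=4.0):
    flops = n_e * E * W_e + n_p * P * W_p  # total throughput
    watts = n_e * W_e + n_p * W_p          # total power
    return flops / watts

for name, n_p, n_e in [("M2 Pro", 8, 4), ("M3 Pro", 6, 6), ("M4 Pro", 10, 4)]:
    print(f"{name}: {eff(n_e, n_p):.2f} Gflops/W")
# M3 Pro comes out highest and M4 Pro lowest, matching the argument above.
```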
That's comparing an M4 Pro (middle level) to an M3 Air. The Air is a lower-power machine with the low-spec processor.
There is no M4 Pro Air. They have to be using a MacBook Pro. That likely has a bigger display, a display capable of getting way brighter, showing more colors, better speakers, all sorts of other stuff.
That’s not a very valid comparison.
If anything, the fact that the M4 Pro gets so close to the M3 is impressive.
The M3 was on a process that was known to run hot. I strongly suspect that every M4 chip is more efficient than the equivalent M3 chip.
I am not sure that is true: doesn't it use more peak power but get more work done, leading to less total energy for, say, exporting a hundred photos, because it finishes quicker?
Any feedback on how it fares for regular software development (backend stuff) and as a common home PC? Looking at the benchmarks, it seems like there shouldn't be any issue, but any first-hand experience would be greatly appreciated.
IMHO, developing with Node.js, Java, Python, Go, etc. on macOS is more convenient than on Windows machines.
Also I can highly recommend using version managers (e.g. nvm, jenv, pyenv, gvm, etc..) for these languages to quickly install and manage different versions.
I'm pretty happy with my M1 MacBook Pro with 16 GB. I'd expect this to be faster. I typically have IntelliJ, VS Code, Slack, a bunch of Docker containers, etc. running. All fine. Get more memory, maybe.
I turn off my Mac almost every night, just like I turn off my television, lights, and other stuff when I don't use them. To me, personally, it makes no sense to waste electricity for about 15 hrs * 365 days per year.
I would put the power on any vertical side of the mini if I were designing it for my use.
I really hope you mean you unplug the power cable from the TV, because none of the modern TVs actually turn off when you ask them to. The TV boot-up takes too long for it to be "off" off.
No, but I most often use the hardwired button on the side of the TV to shut it down, not just the remote into standby. It takes about 15 seconds to turn on, nothing to worry about.
I see a lot of reviews that say things like this but seem to be written by people who aren't testing against commonly available mini PCs that are built on efficiency architectures.
How different is the efficiency of this compared to something like an Intel N100/200/300 or a Ryzen 7 7735HS that you can get in cheap mini PCs from manufacturers like Beelink?
I am not doubting that Apple's processors are class-leading but at the same time it seems like I see a lot of people impressed that a mini PC can idle under 10 watts. That's been common for a long time now.
I have an N100 in that listing on the linked GitHub project, it's the best Intel system I've tested, and it gets around 2.5 Gflops/W (which is a little less than a Raspberry Pi 5, which is not known for being the most efficient Arm system).
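For context on how that figure is derived: HPL (the Linpack benchmark used in those listings) credits a run with the standard (2/3)*N^3 + 2*N^2 floating-point operation count, and efficiency is just that rate divided by average power. A minimal sketch with placeholder inputs, not numbers from my runs:

```python
# Convert an HPL (Linpack) result into Gflops/W.
def hpl_gflops_per_watt(N, seconds, avg_watts):
    flops = (2 / 3) * N**3 + 2 * N**2  # standard HPL operation count
    return flops / seconds / 1e9 / avg_watts

# Placeholder problem size, runtime and power draw
print(f"{hpl_gflops_per_watt(N=40_000, seconds=1_200, avg_watts=15):.2f} Gflops/W")
```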
Specifically, I don’t like them because they show really vague percentages or “scores” where it’s very unclear what specific benchmarks are being used. The benchmark bars for things like single and multi-core performance don’t break down different applications and use cases or even say which benchmarks they are referring to.
This link in particular never goes deeper than Geekbench and has a bunch of comparisons to stuff that shouldn’t guide purchase decisions like:
Newer - released 1 year and 4 months later
More modern manufacturing process – 3 versus 6 nanometers
I much prefer the more in-depth tests that outlets like GamersNexus do, where CPUs are tested using multiple workflows and solutions. For example, you can test specific games and applications, test file compression and video encoding, test simulation speed in CPU-dependent simulation games, and test clock-speed behavior and throttling (does the CPU have jittery boost clocks, or does it settle into a consistent frequency?).
> Specifically, I don’t like them because they show really vague percentages or “scores” where it’s very unclear what specific benchmarks are being used.
I think it's a good overview of various technical data and benchmark results. The benchmarks are linked.
> that shouldn’t guide purchase decisions like ... More modern manufacturing process – 3 versus 6 nanometers
Actually, that does influence buying decisions and gives an idea of what kind of efficiency a CPU might be able to deliver. I have various M-processor machines from Apple, and a CPU on the recent 3-nanometer TSMC process has advantages over one on a 6nm process. For example, the newer M CPUs are more power efficient and smaller, and some of that is what puts the M4 series of chips ahead of the rest in terms of efficiency. Over time other CPUs will move to such 3nm processes. I've seen it with the M4 in the iPad Pro and now with the M4 Pro in the new Macs.
> I much prefer the more in-depth tests
That serves a very different purpose. I like both in-depth tests and tabular comparisons between CPUs.
TL;DR: The M4 blows everything away, but isn't a general-purpose machine. The N100 gets slightly better than 50% of the power efficiency of the remainder. It's also slightly faster despite having fewer cores; single-core speed looks to be twice that of the ARM options.
An N100 box with 16GB of RAM and 1TB of NVME is around the USD$200 mark, which is far cheaper than the Mac or Ampere, but in line with the other ARM options. It comes in more form factors with more customisability options than you can poke a pointed stick at.
All in all, it doesn't fare too badly. The low price, fast single-core speed, and compatibility with everything make up for a lot of sins.
I'm pondering the down votes. My current theory is they are over this:
> The M4 blows everything away, but isn't a general purpose machine.
Perhaps that upset Apple fans.
My definition of a "general purpose machine" is one that I can realistically write software for. I've written operating systems (with a custom network and GUI stack) in the past. Doing that requires hardware that isn't locked down and, somewhat less obviously, is well documented. As in "Intel Architecture Manual" style documented, or failing that, at least open-source drivers like the Linux i915 driver. Apple doesn't come close to meeting those criteria.
That's a shame, because their consumer hardware is stellar. If it were open, I would choose it over anything else.
> That’s about the most niche computing task I can think of.
You must run a lot of niche computers in that case, because every one of them has an OS, mostly specialised to its use case, written by people like me. I'd wager there are more OS's out there than paint programs. Every computing device you have runs one - watches, irrigation controllers, power box meters, high-speed USB cables, modems, disk drives, USB sticks, washing machines, car ECU's (did you know your average car has over 100 computers, each with their own OS?). Hell, even a modern superscalar CPU has under the hood a horde of smaller CPU's you aren't aware of doing their housekeeping.
OS's are a bit like dirt. They are the foundation everything else stands on, but most people dismiss dirt as the thing that makes other things dirty, just as you are dismissing OS's. And just as there is an almost unimaginable number of types of dirt, some that can grow trees, some that can support dams, there is a huge number of OS's out there. Which means there are almost certainly far more programmers out there writing OS's than writing paint programs, yet the paint programmers are the rock stars and we get to toil away in the dark.
You'll get no argument from me about the quality of Apple's hardware. It's stellar. The M series in particular is so good that people are trying to put it to uses Apple didn't have in mind. But so far without much success, because unlike Intel's N100 it isn't designed to be a general-purpose computing platform that can be put to any use you can imagine. Instead, Apple wants its products locked into their ecosystem, and only put to uses they can imagine.
How often are these people powering down their iMacs? Why!?
People love complaining. Apple doubles the base ram and keeps the price the same, people complain that base storage is too low. If that doubled then they would find something new.
> And the system I bought includes 10 Gigabit Ethernet and 32 GB of RAM
I thought it was pretty clear from "the system I bought" that he was not talking about the base model. And I think $100 for 10 GbE is surprisingly reasonable for an upgrade from Apple. For comparison, 10 GbE Thunderbolt adapters typically cost about $200 - and while 10 GbE PCIe cards can be bought for less, they tend to be much less power efficient and generate a surprising amount of heat.
I actually think it's very commendable that Apple even gives the option to upgrade to 10 GbE on a mass market desktop. I was recently looking to buy a non-Apple Mini PC, and while 2.5 GbE is very common now, 10 GbE is still relatively rare. The options I found were to go with a Minisforum MS-01, which is considerably more expensive than the base M4 Mac Mini w/10 GbE upgrade, or to order something slightly sketchy from Aliexpress. So as soon as Apple announced the new M4 Mac Mini, I went with that instead.
> In 1.25U of rack space, you could run three Mac minis
I mean, 1.25U at 5" deep. Lots of cabinets are 35+" deep, if memory serves. So technically it would be 21 Mac Minis in 1.25U of space, so it's more like almost 6 teraflops. Again, button-on-the-bottom and wiring and thermals aside.
Very very cool, but only makes it more disappointing that you can't actually use this for anything innovative, except in the Apple-approved format & use cases.
Can't upgrade any of the internals, doesn't run Linux easily, no way to use any of the internal components separately, or rebuild them into a different form factor. Imagine being able to directly mount these to VESA behind a dashboard. I have an old M1 Mac Mini I'd love to use as a NAS, but the disk is slightly too small and you can't upgrade it, so it's just useless to me instead.
Impressive to see Apple match the Pi for idle power and efficiency, but deeply frustrating to see them take the exact opposite design philosophy.
I use Linux, but I think the cheapest M4 Mini offers an incredible value and efficiency per €. With education discount, it's around €650, including VAT. It's pretty hard to find such a silent and powerful machine for that little. Any comparable options?
A good fanless build with an i3-14100T is more expensive and 40-50% slower on Geekbench. An i5 is a bit closer. Some 2024 Ryzen CPUs can match or exceed its multicore performance, but those are also more expensive and much less energy efficient. Price-wise, things start favoring PCs if you need more RAM, as Mac upgrades are costly.
One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
I don't understand why so many people use the discounted price as reference. Surely very few of us on HN are still in college? So let's use the actual price when making comparisons.
>I don't understand why so many people use the discounted price as reference.
Or when they only use it to make the Apple pricing seem more favorable and ignore it when it comes to PC pricing. Most PC manufacturers also have educational pricing, whether directly or through some portal provided by your institution. I know my son's college had a deal and also had a list of the tax free days in the state so that you could pre-order and then pay and pick up on the day the tax didn't apply.
i'm sorry, tax free days?!? am i too european to understand this? does this apply to everything, like groceries, tech, flowers, wood etc., or just corporate transactions?
I can't say for all states but here in Massachusetts we have an annual tax free weekend where sales tax (6.25%) is not applied for "most retail items of up to $2,500, purchased in Massachusetts for personal use" (https://www.mass.gov/info-details/massachusetts-sales-tax-ho...).
Also groceries never have a sales tax in Massachusetts but, again, that varies by state.
> tax free weekend where sales tax (6.25%) is not applied
that's such a strange concept for me. i wonder what the historic reasoning there is for it, as it seems like one of those legacy things which were started to increase sales during difficult market times :D
> Also groceries never have a sales tax in Massachusetts
also interesting :) what i knew was that some or most states display the prices without tax, so you'll only know the total of your grocery trip at the checkout. never seen this here over the pond, prices always include taxes.
what's common is that different things are taxed differently. food and beverages have lower tax than non-essential things, except of course if the beverages contain alcohol, etc. yada yada blabla.
>also interesting :) what i knew was that some or most states display the prices without tax, so you'll only know the total of your grocery trip at the checkout.
It comes up on /r/askamericans all the time, but it's not realistic to include tax on the prices because there are so many different taxing zones. A large city may have multiple. Most places you can figure it's going to be ~10% and might be pleasantly surprised when it's less. Everyone knows to figure roughly 10% extra, so it's not a chore or anything, even children figure it out.
We don't have tax free weekends in Australia but fresh produce is also exempt from GST (our version of VAT). Anything that has had any "processing" done on it incurs GST though, so oranges are tax free but orange juice is not.
Some places it's anything with sales tax, or maybe just goods in general if they already have low or no tax on food. Other places have it on specific goods that would be considered 'school supplies'. I think where my son is, it's a week or weekend where it's all sales tax is waived. Definitely not a corporate thing, it's to give parents and residents a break and to help stimulate the economy with spending.
All you need is a .edu address, if I recall correctly. You can buy them with alumni addresses.
That said, a fair chunk of HN never completed college, like myself, and lost access to any email accounts of this sort, which only further supports your argument, as the EDU discount isn't universally attainable.
Regardless, if people start to abuse this by getting the discount while not actually being a student or teacher, we can say goodbye to that discount, and real students and teachers will suffer for it.
It's not like they're taking a loss from educational purchases. It's just price discrimination. You might as well say "if we all started using newspaper coupons we can say goodbye to those discounts."
Likewise, the expected percentage of people using the education discount is part of Apple's calculation. Even if they lose money on the device (which I personally think is unlikely), they'll make it back from subscriptions and their cut of app store purchases.
> They almost certainly won’t get rid of it because people are abusing it.
It always depends on the ratio (valid cases vs. abusers); if the number of abusers gets too high, the discount is no longer fulfilling its purpose.
> If they want to end the abuse they will simply toughen the verification procedure.
It also depends on how expensive or difficult it is to maintain such a verification procedure. At some point it is not justified anymore.
I just personally don't like the current attitude. If you can "cheat" your way to the discount, people just keep finding reasons why they are justified in cheating: "they should toughen the verification procedure if cheating is possible".
It happens everywhere. People get praised for finding such cheats. Even in universities, people are encouraged to cheat their way to better grades with less work. Oh, clever boy! He used different LLMs with good context that made the output look like his own writing.
Not much different than saying "get a better lawyer" if you are getting punished for breaking the law. The opposite applies too, and that is why lawyers can be really expensive.
Or, not much different than big tech doing morally questionable things because the law is lagging behind: "Nobody is enforcing the law, so it is perfectly okay. Worst case, we need to pay some fines."
It's not cheating if they intend for you to do it (but just can't explicitly say it's allowed because then everyone would do it and that would collapse their self-assortative price-discrimination strategy.)
> If you can "cheat" on getting the discount, people just keep finding reasons why they are justified to cheat. "They should toughen the verification procedure if cheating is possible".
You seem to misunderstand the argument. It's not "they should toughen the verification procedure if cheating is possible." The argument is that they would toughen the verification procedure if cheating were possible and they cared; which proves, at the very least, that they don't care (and potentially proves that they in fact want you to do it, at least sometimes.)
To be clear, this argument doesn't apply to bureaucracies — governmental, academic, or Enterprise — where there's so much red tape in the way of making changes that it's almost impossible to fix issues like this even if several people care quite a lot.
But this argument very much does apply to a relatively-agile, not-so-Enterprise-y-for-its-size corporation like Apple. In fact, it applies especially to Apple, who has an almost Disney-like obsession with micromanaging all customer interactions as an extended customer-lifecycle marketing opportunity. (For example: you'll never find a rotting out-of-date page on an Apple-owned website.)
Apple know exactly who they're giving this discount out to. They've almost assuredly sat down at least once and done a hand-analysis of one or more months' purchases, to determine the proportion of education-store purchases that are from genuine education customers. (Heck, they probably have gone far beyond this; far lazier corporations than Apple set up heuristics for this kind of "promotion fraud"; run continuous analyses on them; and spit out a weekly reports to mull over in marketing-KPIs progress meetings!)
If Apple's education store gives discounts to group XYZ, then you can assume that that's the intended outcome. At least under the Apple marketing department's current paradigm of thought.
> It's not cheating if they intend for you to do it
It feels like you are proving my point about people finding excuses to buy a Mac with the educational discount when they don't meet the requirements :)
The intent is clearly an educational setting: students and teachers. You dishonor that intent if you still try to claim the discount, whether you are punished or not.
I think you might be suffering from a categorical blindness to a certain type of thing humans do.
Let's say I own a private beach. I want to allow my beach to be enjoyed freely and responsibly by a reasonable number of people, whether friends or strangers. I don't want to constantly be cleaning up garbage on my beach. And I don't want the beach to be overcrowded when I myself use it.
So what do I do? Well, I'm sure not going to hire a bouncer to guard my beach. (How would I even tell them who's allowed in, anyway? Can you recognize "irresponsible people" on sight?)
No, instead, I will probably post a sign outside my beach, saying "NO TRESPASSING".
But I won't enforce it! And if anyone (e.g. my few direct friends who I invite to hang with me at my beach) asks, I'll tell them I won't enforce it! They can bring people to my beach if they like!
Access to the beach is now an open secret. It's something that people can freely tell those they trust about. The number of people visiting the beach will rise slowly over time. Maybe it'll eventually increase to be too much; or maybe it'll level off, due to churn in the population near the beach. (Mostly depends on how hard the beach is to access, and the demographics that live nearby.)
If some tour company tries to drop off a whole busload of tourists at my beach, though, I will most certainly kick them out, pointing at the "NO TRESPASSING" sign. (Since I don't have a bouncer, probably what I would actually do is call the cops on them.)
The cops would ask me about the people already on the beach, of course. To which I would say:
> Those people on the beach right now? They're my "friends." No, I don't exactly know them... but I know people who know them! They're "on the guest list." But these people standing by the bus over here — these are not my friends. These are people brought here by a guy trying to profit off of providing others access to my beach, which I have not granted. They are not allowed in. Nobody brought here by this bus company will ever be allowed in.
This is every underground party ever. This is every travel destination for the rich. Open secrets, with guardians who actively lie by exaggerating the restrictions or conditions in place, to keep a lid on the spread of the secret.
And this is a thing companies do constantly.
• Every store discount code given out to some YouTuber to give to people who watch their thing? Open secret. (Consider: is it "legitimate" for a discount app like Honey to find and publish those audience-targeted codes? No, probably not; Honey would be acting like the tour-bus operator above. But would the online store mind if you personally found the code and used it, despite not being a member of that Youtuber's audience? No, they'd be happy to have your business. Would they even mind if you told three friends, and you all immediately bought something? No. In fact, they'd be overjoyed!)
• The unmentioned (and implied to the contrary!) never-ending-ness of the free trial period for WinRAR? Open secret. (If WinRAR never implied you had to buy it at some point, nobody would have ever bought it; they'd just consider it freeware. But you don't "have" to buy it. It goes on working forever. Some people feel guilty or pressured, and do buy it. Others eventually discover the bomb is a dud. This is WinRAR's intended business model.)
• The CPU binning lottery? Open secret. (Did you know you can keep RMAing retail-purchased CPUs until you get a really highly overclockable one? You do now! And people have been doing this for decades! CPU vendors don't care—in fact, they want these few super-enthusiasts to get their hands on their best CPUs, since they'll probably publish some really nice benchmarks with them. Free advertising! They certainly don't want a company doing this in bulk though. That'd be way more trouble than it's worth; and then what would they do with a huge pile of RMAed known-below-average-binned CPUs?)
• How easy Photoshop was to pirate in the pre-Creative-Cloud era? Open secret. (See my sibling post.)
You can exploit any/all of these if you know (and you're not in a situation legally preventing you from doing so — e.g. corporations can't pirate things.)
And some people know; but most people don't.
This equilibrium state is exactly the point aimed for by the corporations that create these open secrets. They don't want these secrets known by everyone. (If enough people do it, then it's no longer a marketing expense, but a hole in their business model.) But they don't want these secrets known by nobody, either.
The creators of any open secret, want some deserving people to take advantage of the open secret; otherwise they wouldn't have made it an open secret. (In almost all cases, you have to actually do extra work to make something an open secret. It's extra work to carefully design and manage the "virality coefficient" of an open secret so that it'll hit equilibrium, rather than spreading to fixation or dying out. The outbound word-of-mouth advertising required to get an underground party to happen, for example, is way more work than just putting up posters! It would almost always have been easier to just have no secret at all!)
I hope you will agree with me that this dynamic exists in general.
If you do: what then leads you to believe that what Apple has here is a dumb unenforced mistake, rather than an open secret?
---
One extra point, that doesn't have a clean place to insert above: corporations are really careful with the way they structure the wording of the exaggerated-restriction "wards" shrouding their open secrets.
For a person, a "TRESPASSING A-OK" sign would just be a sign. But for a corporation, any positive criteria they give implying that a group does qualify for a certain promotion, can be taken as a legal promise on their part.
If Apple offered an obscure promotion to "anyone who can find it" — some secondary secret version of their online store that just happens to have lower prices, say — and then some bigcorp found it... and if Apple then attempted to refuse to apply those promotional prices to that bigcorp's 100k-seat volume purchase of Mac Studios or whatever they were trying to get away with — then the bigcorp could actually be in their right to sue Apple for breaking the promise they were making by having such a store available without qualification! (a.k.a. promissory estoppel.)
(To be clear, to win such a case, the bigcorp would have to also prove that they then went out and did something under the assumption that they could get those 100k Mac Studios at that price — bought 100k Mac Studio-shaped desk nooks, say — and that by being refused the promotion, this contingent action has resulted in a financial loss for them — e.g. if it turns out the 100k nooks have zero resale value, so they're out the cost of the nooks, and also have a huge pile of useless plastic it'll probably cost money to dispose of. But that's not too uncommon of a problem to have, in a big-enough corp with many async/concurrent/pipelined corporate purchasing negotiations going on. So it's something the legal departments of vendors like Apple are always wary of accidentally getting tangled up in.)
"Students and teachers" is a particularly nice/"safe" wording for open-secret shrouding language for a corporate promotion, because there is no case in which a corporation qualifies as a student or a teacher. And yet literally anyone else can become a student at any time, just by signing up for a zero-tuition-until-you-take-courses online university program and nabbing the resulting .edu email. (By the premise of continuous education/lifelong learning, we are always students!) "Students and teachers" is a group that any price-conscious motivated individual can join trivially (just like clipping a coupon!), but which keeps the corporate-buyer discount-loophole-hunters out.
That is a great write-up. But I think it proves my point even more: people will do anything to make an excuse for cheating :)
I agree that there might be some open secrets, but this particular case is not comparable, simply because it does not make sense. Apple is in the hardware business. They are already giving the discount to the correct user base, where the discount is an actual investment:
* Students, who might then pick the same hardware in the future at work, at a company they found, etc.
* Teachers, who promote the same hardware for students
As for others, why would this be an open secret? The correct user base already gets the discount. There is no benefit to giving the discount to others as well, even in secret; it is just a loss. These same people would likely buy the hardware anyway. I bet this price difference does not stop them from buying the product.
The story you are telling is not comparable in this case.
A comparable comparison would be that you also allow onto the beach some random people you don't trust, but because their number is so small, it does not matter.
However, people then start posting about your beach on social media, or even on Hacker News. Friends of friends of friends tell their friends too. Now the beach is crowded and randoms are there all the time! What would you do? Get a bouncer, or put up a "real" no-trespassing sign? Even your friends can't enjoy the beach anymore.
It is all about statistics and in what direction we let these things go.
> That is a great write up. But I think this proves even more my point that people do anything to make an excuse for cheating :)
You would have an argument (not a good argument) if I ever actually took advantage of the education discount. But I don't!
(I get all my Apple computers as business-lease equipment from my employer, within which I have arbitrary IT equipment purchasing authority. And then, once they've fully depreciated, I buy those computers from my employer for a trivial sum to become my personal computer(s), and also order new current-gen work computer(s). Is this "cheating?" No, Apple loves this — my employer is paying full price, and never gets any sort of discount. And my employer also loves this — they just want me to be productive, and paying a few thousand dollars to buy whatever arbitrary equipment I requisition every two years, is extremely cheap for how much my added productivity will make them over that period. Given the different things each party in this relationship values, this is a win-win-win.)
> For others, why this would be open secret? The correct user base already gets the discount. There are no benefits to give discount for others as well, even in secret.
As roughly seven other people have replied to you: price discrimination. The user base Apple would like to help out are "individual buyers who just barely cannot afford Apple products, with a $100 discount being enough of a difference to prevent them from falling out of the funnel."
Students tend to be central members of this group; but Apple, in practice, seems to actually want to help this group as a whole.
(And why wouldn't they? It's not like they're making a loss on education-discounted sales. They're making money and getting people into the Apple ecosystem, where they'll hopefully dive deeper once they have more money!)
But there's no way to openly offer "anyone who needs a $100 discount to be convinced to buy an Apple product" that $100 discount, without either:
• sounding like you're literally calling people poor (open "means-adjusted pricing"? It's been tried; people hate it! Only ever gets aired out as a TAM-expansion tactic in markets for extremely-inelastic-demand goods with zero competition, e.g. on-patent medications.)
• or leaving a loophole for rich people to find that results in Apple not being able to milk them.
And the one thing that goes against every strand of a luxury consumer product company's DNA is the thought of letting a rich buyer with high willingness-to-pay get away with a low-margin purchase. In Apple's business, milking one rich customer can give you the net profit of dozens of low-margin customers. (Think: convincing some Mr. Moneybags who walks into an Apple Store thinking they want a Mac Mini, that what they really need is a fully-upgraded Mac Studio.)
> However, people start posting about your beach in social media, or even in Hacker News. Friends of friends of friends tell about their friends too. Now the beach is crowded and all randoms are the all the time! What would you do? Get a bouncer or put "a real" Trespassing sign? And even your friends can't enjoy the beach anymore.
How is this comparable? As other sibling replies state, the open secret of the Apple education discount has been widely dispersed for at least a decade now. It is at equilibrium — it clearly isn't spreading to the point that "the beach is overcrowded." Ask a random person off the street — heck, ask the average person on HN five minutes before this thread started — and they would not know that Apple offers an education discount but doesn't verify academic status.
You want to know what an open secret reaching fixation looks like? Picture it being discussed in "money-saving tips" listicle videos put out by popular [i.e. tens-of-millions-of-subs] vloggers. Not even tech vloggers, either — I'm talking gaming vloggers, art vloggers, beauty vloggers, etc.
Some open secrets do run away like this — and yes, this does cause their creators to pull the plug! The Apple Store education-discount open secret is not like this.
> This equilibrium state is exactly the point aimed for by the corporations that create these open secrets.
Not necessarily. Biological evolution is blind, and likewise, thriving in a market environment doesn't require companies to know why what they are doing is successful.
So, e.g., Photoshop (and Windows) used to be really easy for individuals to 'pirate'. And you can argue that this was good for Adobe (and Microsoft), because it's like an informal education discount: youngsters get used to the software at home and train themselves, so that later on it becomes the obvious choice for the office.
But for the mechanism to work, Adobe doesn't have to understand the mechanism. They could just not know at all about the pirating, or conclude that it's too much hassle to chase the pirates (but be completely unaware of the positive effects). Or on the contrary, they could over-estimate the positive effects of piracy etc.
Microsoft products are trivial to pirate thanks to Microsoft Activation Scripts [1] which is on GitHub. It is inconceivable that they aren't aware of it with 102k stars. That can only be deliberate.
I agree: I am sure that people at Microsoft are aware these days.
The first commit in the Microsoft Activation Scripts repository is from 2020. For Microsoft the dynamic I describe goes back all the way to the 1980s (and perhaps even earlier.)
Back in the 1970s and 1980s people at Microsoft might or might not have been aware. (I don't know for sure either way.) But it already worked in their favour.
My point is that the dynamic works whether or not anyone is aware of it.
Having a cloud account is entirely disconnected from the activation state of Windows, and always will be. The activation state of Windows is a property of a Windows installation, because Windows installations — all the ones Microsoft cares about, at least — are managed (including license management!) by the IT departments of organizations; while Windows logins are managed by individual users.
Microsoft would be breaking their own business model in half if they forced each user to have a "Windows subscription" bound to their personal cloud account, instead of being able to just sign a $10MM/yr contract with Oracle or EY or whomever for a 100K-seat volume license.
Remember also that many large-scale deployments of Windows machines aren't of personal computers at all, but of:
1. workstations with non-cloud Active Directory-managed user accounts, with the accounts and data on the machine being backed up to corporate servers and thus the machine itself able to be drop-in replaced overnight without the user even noticing the change;
2. workstations with roaming user profiles configured, where many different people log in and out of the same computer throughout the day (think: computer labs, internet cafes, etc)
3. shared workstations where many employees log in and out of the same computer throughout the day (may overlap with 1) — think of the computers behind the desks at the customer-service wickets at a bank
4. machines with no logged-in users, only an AD administrator remote-managing them through domain privileges — think e.g. digital signage
If licensing status attaches to the logged-in user, then none of these use-cases work! And together, these use-cases form 80+% of how Microsoft makes money from Windows!
I think this is a lot like the situation with oldschool Photoshop: for a long time, people pirated Photoshop, and Adobe really didn't care — didn't bother to do anything to make piracy the least bit challenging.
This was seemingly because they considered the amount of money they could make off of sales to individuals, to be relatively trivial next to the amount of money they could make off of corporate volume licensing; and they knew that corporations wouldn't be pirating Photoshop even if it was trivial (because corporations always have the thought of an acquisition-time assets audit on their minds.)
Apple likely thinks the same way about this education discount: all their material income comes from volume purchases or alternate distribution channels (e.g. cellular carriers for phones), or in-store sales; with online retail sales being a relatively-trivial fraction. So it doesn't really matter if they're "losing" part of their margin on these online retail sales.
(Or, if you think about it another way: this is essentially customer-driven price discrimination. Like coupons are for grocery stores. The discounted price is Apple's true price — the price that builds in a profit margin they're happy with. The higher price is pure gravy if they can convince people to part with it. They put the higher price front-and-center, and make the lower-priced offer a bit obscure. People "spending someone else's money" don't care about hunting for deals; they just want to get the thing and get out. So you can milk the gravy from them. People who hold their bank balance more dearly, hunt for the deal, and find it. Still fine; still made a profit from them!)
> Apple likely thinks the same way about this education discount: all their material income comes from volume purchases or alternate distribution channels (e.g. cellular carriers for phones), or in-store sales; with online retail sales being a relatively-trivial fraction. So it doesn't really matter if they're "losing" part of their margin on these online retail sales.
Exactly. Once the number of abusers gets too high (because this becomes mainstream knowledge and people think it is generally acceptable to dishonor the intention), it will end. Or if they are able to improve the verification process at negligible cost.
So the more people talk about the "educational price", and the more people think it is acceptable to "cheat", the more likely the number of abusers reaches that threshold and the good thing ends.
> (Or, if you think about it another way: this is essentially customer-driven price discrimination. Like coupons are for grocery stores. The discounted price is Apple's true price — the price that builds in a profit margin they're happy with. The higher price is pure gravy if they can convince people to part with it. They put the higher price front-and-center, and make the lower-priced offer a bit obscure. People "spending someone else's money" don't care about hunting for deals; they just want to get the thing and get out. So you can milk the gravy from them. People who hold their bank balance more dearly, hunt for the deal, and find it. Still fine; still made a profit from them!)
You are again finding an excuse to cheat. It is perfectly okay to take advantage of a discount if you are eligible for it. But that was not the case here.
I've seen this FUD repeated for a long time. Hasn't happened yet. Probably the worst that will happen is they'll start requiring some type of verification again.
You don't even need a .edu email address. I logged in with my regular Apple account and made a purchase on the education store expecting them to ask for that or some other verification, and they never did.
They claim the right to audit purchases through the edu store and charge you the difference if you don't qualify, but I've never read anyone online reporting they've been audited/charged.
>which only further supports your argument directly, as the EDU discount isn't universally attainable
Pay someone with an edu account to complete the purchase for you. Also, they are commonly available for community college students, including those taking free classes.
Since the first ARM systems (maybe before), you can't upgrade things on your own. I had an Air whose SSD could be upgraded, but not its memory; memory can only be upgraded at the factory.
One thing that will potentially future-proof the new Mac Mini is that the SSD is on a removable board. It's a custom Apple design, but someone has already hand-made their own upgrade. I wouldn't be surprised if 3rd-party upgrades are commercially available within a year.
> € 230,00 for +8 GB RAM?! There are places you can get that for a tenth of that price.
"Comparing our memory to other system's memory actually isn't equivalent [...] because of the fact that we have such an efficient use of memory, and we use memory compression, and we have a unified memory architecture."
- Bob Borchers, Apple vice president of worldwide product marketing (who apparently never heard of zram)
But what's the point of Borchers' comment? Because there's efficient software use of memory, it's legitimate to put a tenfold price markup on the hardware?
Yeah, that doesn't explain why the Intel Mac Pro cheesegrater wanted $3,000 for 160GB of socketed RAM that OWC would sell from the same manufacturer, same speeds, for $1,000 for 192GB.
Sorry Bob, architecture may be different now, but Apple has always been egregious.
They have variations of the program in some European countries. It's been a long time for me, but in the UK they used to just whitelist university domains (we didn't use .edu TLDs either).
We use .ac.uk though (and much of the non-USA world uses .ac.ccTLD similarly) so no need to whitelist individual university domains. I don't know about Apple, but that's a common approach. (And does irritate some where they don't use either and get missed, Canada for example.)
It may be a local thing in the Bay Area, but usually there's some way to get a discount when making a purchase with Apple - be it via education, or via a corporate discount (just show your badge from another company), or via a friend who works at Apple, or via big retailers that start selling at a good discount (e.g. Amazon easily ends up 5-10% below Apple's price over time).
Anecdotally, last week I visited a local Apple Store with my son who is in middle school. Without any prompting from us, the Apple rep asked my son if he is planning to go to college some day, and applied the college discount to our purchase without my son saying much…
Because when one configures it with a reasonable 32GB RAM/2TB SSD and EU prices, it suddenly becomes £1800, and it's harder to convince anyone of its price superiority.
A lot of people have a relative or someone else still in education; just buy it through them. It's not like this is a government subsidy, just a promotion to increase sales, and maybe the hope of a long-term customer by hooking them at a younger age. Probably much less immoral than blocking ads on YouTube.
I think you are missing the point. The person who mentioned educational pricing was asking if there are any machines with comparable performance and silence for that little a price, and said that the educational price is €650.
Suppose I know of a non-Mac that has similar performance and silence for €1000 non-educational. To decide if that meets the requirement I'd need to either look up the non-educational price of the Mac to compare to €1000 or I'd need to look up the educational price of the €1000 machine (if it has one) to compare with €650.
They are more likely to get useful answers if they post the non-educational price so that people don't have to do extra work to figure out if they should respond.
The typical .edu discount from Apple on largish purchases is about $100, regardless of whether that's a $600 final tag or a $2000 final tag. So, somewhere between 14% and 5%.
If Apple sells 50% of Macs to the .edu discount market, that's a difference to you of somewhere between 2.5% and 7%.
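(A quick back-of-the-envelope check of those percentages, sketched in Python; the $100 flat discount and the price points are the figures above, and "final tag" is taken to mean the post-discount price:)

    flat_discount = 100.0
    for final_tag in (600.0, 2000.0):
        list_price = final_tag + flat_discount
        pct_off = flat_discount / list_price * 100
        # If 50% of Macs sell at edu pricing, the blended effect is half:
        print(f"${final_tag:.0f} final tag: {pct_off:.1f}% off list, "
              f"{pct_off / 2:.1f}% blended across all sales")
    # $600 final tag: 14.3% off list, 7.1% blended across all sales
    # $2000 final tag: 4.8% off list, 2.4% blended across all sales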
Or, you can accept that Apple's prices are not set by the market so much as by their marketing department.
So is paying the full price and signalling to Apple "we can afford it just fine, don't sweat about cutting margins or lowering extra disk/RAM pricing", but I don't see you complaining about that :)
Companies tend to focus on the overall % profit margin for a product. If a higher percentage of sales are for a discounted (edu) SKU with lower margins, they will tend to raise the price of the product to hit their desired profit margin.
e.g. If a company was selling a product at $1000 and wanted to offer a 20% discount for EDU that would be taken by 50% of the market, they would need to raise the list price by about 11% to keep the same average margin. If only 20% of the market bought the discounted SKU, a roughly 4% increase would do it (see the sketch below).
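A minimal sketch of that calculation, assuming the discount comes off the raised list price and "same margin" means the same average revenue per unit:

    base_price = 1000.0
    discount = 0.20  # 20% edu discount off the new list price
    for edu_share in (0.5, 0.2):
        # Solve P * (1 - discount * edu_share) = base_price for P:
        new_list = base_price / (1 - discount * edu_share)
        print(f"{edu_share:.0%} edu buyers -> list {new_list:.2f} "
              f"(+{new_list / base_price - 1:.1%})")
    # 50% edu buyers -> list 1111.11 (+11.1%)
    # 20% edu buyers -> list 1041.67 (+4.2%)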
You are free to purchase as many Apple products as you want to offset any perceived revenue losses from promotional discounts. I'm not so sure why you would want to do this but I keep hearing that behavioral economics is a thing, maybe paying more is your definition of rationality.
It’s basic market economics. More discounted purchases tend to lead to an increase in the non-discounted price. Of course, that’s baked in at the outset. Apple knows x% of sales come with an edu discount, so the non-edu price is offset to account for the edu discount. I don’t have any problem with a vendor doing that. It’s how they forecast a profit margin. Apple, apparently, has allowed “people who know someone in education to also claim an edu discount” to be part of their pricing model, which ultimately leads to increased prices for those who do not know someone with an edu email.
It’s trivially easy to obtain the discount. Anyone working in education, or a student at any level (K-12, higher ed, graduates with access to a uni email) can get it. Apple doesn’t ask any questions or for verification.
They also go on sale at a similar price to the general public relatively frequently.
But also, a ton of people absolutely are in college, at any age, and new people are coming to HN every day; I'd think HN is an easier place to discuss and explore vaguely tech/startup-related topics than school is.
Even Apple Store employees will freely give you the discount. Apple doesn't discount because they aren't a discount brand, but they will give you this discount if you ask.
If you continue reading the sentence, it gets even more bizarre:
> it's around €650, including VAT.
Whatever taxes and discounts apply to the commenter’s own idiosyncratic situation have nothing to do with the price of the product.
A couple of years ago, I might have cared what the price of an M2 with Pasadena sales taxes was since I lived there at the time, but I sure wouldn’t have included them when talking about Apple prices here.
Similarly, VAT costs are between you and whatever jurisdiction you live in that’s levying them. Apple isn’t the one to thank or complain to about them.
I am outside of North America and have been for about 3/4 of my adult life.
The issue with adding VAT to prices on a forum with people living in a lot of different places is that VAT rates vary greatly from place to place.
To get an idea what an Apple product costs, it's more helpful to look at the price charged prior to taxes, tax deductions, educational discounts and other factors that will depend entirely on the specific cases of each reader.
You're saying this to someone who twice took a 24-hour train from Beijing to Hong Kong to buy an Apple computer for 27% less due to HK not having an additional electronics tax.
A lot of my friends in Taiwan used to buy Macs in HK for the same reason.
This is most likely because OP used Euros. In Europe, prices are listed including VAT. So in day to day life, you only see prices with VAT for your country included.
The cheating/fraudulence encouragement in this thread is disgusting. You guys are not stealing a pencil but several hundred dollars. Paying part of it, or Apple being filthy rich, doesn't make it more honest.
I don’t think I’ve ever encountered here such collective encouragement to bypass a law (OK, maybe for jailbreaking, which is not fraud). Not sure if it’s the demographics changing, societal culture changing, or just luck.
Edit: Oh, and yes, I never completed college, don’t own a .edu, and am maybe just subconsciously jealous.
> Pricewise, things start favoring PCs if you need more RAM, as Mac upgrades are costly.
That's the position I'm in, along with some other people I've talked to recently, too.
For our situations, the M4 would likely offer more than enough processing power, and the efficiency and physical size are attractive, but a maximum of 32 GB of RAM definitely isn't sufficient.
The M4 Pro's 64 GB of RAM is somewhat better, but the cost of those upgrades is very hard to justify.
I'd also prefer to use the system for at least 5 years, and likely up to 10 years, if not longer. Even if 64 GB is tolerable now, I can easily see it becoming insufficient for my needs before then.
The lack of reasonably-priced internal storage, while easier to work around than the lack of sufficient and reasonably-priced RAM, doesn't help matters either.
Even if future Studio models, for example, might allow for a more ideal amount of RAM, I have to expect that unjustifiable upgrade costs will likely still be an issue, and then there's the wait on top of that.
I can easily see myself and the others I've talked to settling for PCs, rather than making unjustifiably-expensive Mac purchases.
In the same boat: I have a 5950x with 64gb of memory running PopOS, and there are times I'm hitting swap a lot more than I'd like. 16 or 32gb of memory is just not feasible, and even 2TB of storage would likely cause headaches; I have a 4TB and a 2TB nvme at the moment, which will come with me on my next upgrade.
I'm leaning towards an upgrade next year to the 9950x3d if reviews pan out. Sure, it's going to be a bigger machine with louder components, but the upgrade will likely be half the cost of anything close from Apple since I can take my existing GPU, PSU and storage at the very least along with me.
And "upgrade costs" is highly misleading for most of the components. You are buying a different machine config that you can't change, up or down, later on. I get that most people don't want to bother opening up a PC to swap out components, but the easier they made it, the more people will do it, and Apple is running the other way.
Mine doesn't, but yes, I could move to a mATX or bigger board to unlock that, extending its life. I tend to go for the 'smaller' ITX cases and boards, so I currently have an x570-i setup maxing out at 64GB.
For storage at least, you can pop your existing nvme drive into a Thunderbolt enclosure and use it on a Mac mini. Over TB4 you get roughly 3GB/s of real-world throughput, so a PCIe 3.0 drive runs at close to full speed, though a fast PCIe 4.0 drive will be capped (and you'll want a decent enclosure).
It won't help the RAM situation, but storage at least is upgradable like that.
Be careful to check support for larger RAM amounts on the motherboard as well as the CPU - I’ve got an AM5 setup with 128gb of RAM, but it had to be downclocked to even POST.
Memory usage is not comparable across Linux and Mac. MacOS is much better at avoiding swap, uses memory compression, shared frameworks etc. At the same time it tries to use all the memory available which makes direct system-wide comparisons not accurate. A good rule of thumb is that 8GB on Mac == 16GB on Windows/Linux.
MacOS does seem to “use all the ram” but never falls over itself.
I think the kernel is likely genuinely better under low-memory conditions (it's hard to be worse than Linux here, to be honest) - and that's combined with being aggressive about opportunistically using as much of the RAM as is available (not fully unloading applications when closing them, for example).
“WindowServer” uses 2-3G of ram, and electron apps use lots too; but truthfully my macbook is able to sustain significantly more open programs than my linux laptop, despite my linux laptop actually having more memory. (32G vs 24G for the Mac).
I can't explain it and I am genuinely curious how this is the case, but at least anecdotally, the parent is more correct than not.
For what it's worth, the apple silicon machines are much more efficient on RAM than most - a 16gb m1 absolutely mops the floor with the 32gb of ram I have in my thinkpad with an i7. It's not really even close.
Your comment might win you the argument on a random non tech forum but not here.
much more efficient in what? mops the floor by what? which year's i7?
Don't get me wrong, I 100% believe that what you describe happened, but if you mean "my macbook is faster than my i7 thinkpad" you should use those exact words and not bring RAM into this discussion. If you want to make a point about RAM, you need to be clear about what workflow you were measuring, what methodology you used, and what the exact result was. Otherwise your words have no meaning.
Repeating what I just commented elsewhere, but Mac uses several advanced memory management features: apps can share read-only memory for common frameworks, it will compress memory instead of paging out, better memory allocation, less fragmentation.
Bandwidth for copying things into memory is also vastly higher than what you get on Intel/AMD; for example, on the Ultra chips you get 800GB/s, which is the rough equivalent of 16 channels of DDR5-6400, something simply not available in consumer hardware. You can get 8 channels with AMD Epyc, but the motherboard for that alone will cost more than a Mac mini.
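(For anyone checking that arithmetic, here's the rough math; a DDR5 channel is 64 bits wide, i.e. 8 bytes per transfer:)

    transfers_per_s = 6.4e9      # DDR5-6400: 6.4 GT/s
    bytes_per_transfer = 8       # 64-bit channel
    channel_gb_s = transfers_per_s * bytes_per_transfer / 1e9  # 51.2 GB/s
    print(800 / channel_gb_s)    # ~15.6, i.e. about 16 channels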
Sharing read-only/executable memory and compressed memory are also done on Windows 10+ and modern Linux distributions. No idea what "better memory allocation" and "less fragmentation" are.
800GB/s is a theoretical maximum but you wouldn't be able to use all of it from the CPU or even the GPU.
System design and stability. On MacOS a lot is shared between applications compared to the average Linux app. Dynamic linking has fallen out of favor in Linux recently [1], and the fragmentation in the ecosystem means apps have to deal with different GUI libraries, system lib versions etc, whereas on Mac you can safely target a minimum OS version when using system frameworks. Apps will also rarely use third-party replacements, as the provided libraries cover everything [2], from audio to image manipulation and ML.
People who need 64GB+ RAM are not running 1000 instances of native Apple apps. They run docker, VMs, they run AI models, compile huge projects, they run demanding graphics applications or IntelliJ on huge projects. Rich system libraries are irrelevant in these cases.
This thread started as a question on how MacOS is more memory-efficient, not on the usefulness of more RAM. In any case, you might still benefit from the substantial increase in bandwidth and lower system/built-in-apps memory usage, plus memory compression, making 16GB on a Mac more useful than it seems.
I can run apps with 4 distinct toolkits on Linux and memory usage will barely go past the memory usage of opening one Facebook or Instagram tab in a browser.
Compared to compiling a single semi-large source file with -fsanitize=address, which can cause one single instance of GCC or Clang to easily go past 5G of memory usage no matter the operating system...
I'm talking about memory bandwidth - maybe your workloads don't take advantage of that but most do and that's why apple designed their new chips to take advantage of it.
Video Editing. Backend and Frontend development utilizing docker containers. Just browsing the web with tons of tabs. Streaming video while doing other stuff in the background. Honestly most things I'd rather do on my M1.
So probably nothing that actually needs more than 16GB of RAM then. And realistically comparing M1 to an i7 several years older than it.
Having more RAM doesn't increase memory bandwidth and having more memory bandwidth doesn't necessarily mean better performance. You aren't even able to make use of all of the bandwidth your M1 is capable of in the real world [1].
Apple Silicon has good perf/watt but the gap probably isn't as big as you're thinking.
When did I say having more RAM increased memory bandwidth? Are you having a separate conversation with yourself right now? I feel like you might have misinterpreted what I originally said and just ran with it.
Not sure what you mean by 'efficient'; they are faster for sure (amazing memory bandwidth thanks to on-package memory), but to my knowledge they would be the same for the amount of data stored. So that same thinkpad will likely be faster at tasks that need 24GB, for example; it highly depends on the use case, as always.
Memory requirements for general-purpose desktop usage usually don't come down to a single task with a large working set that needs to fit in RAM in its entirety. It's more often a matter of the aggregate memory usage of many tasks, which means that in practice there's a wide gray area where the OS can make a difference, depending on the effectiveness of its memory compression, swap, signalling memory pressure to applications, suspending background tasks or simply having fewer of them in the first place.
I run Ubuntu on my Thinkpad - I generally notice the biggest difference with video editing, but really multitasking anything is night and day because of the memory bandwidth. I use the same software on both machines for video editing, Davinci Resolve.
Nix works well on Mac, very similar to Nix/Linux for the most part. There are some missing packages, but the common ones tend to be fine. It's worth using the Determinate Systems installer to avoid reinstalling Nix on every macOS update, though.
Nix-darwin is good, and I use it, but it is nowhere close to NixOS. I think there are some options I've set through it that macOS keeps overriding, so the declarative configuration drifts from the real one eventually
I think the only real issue with Nix on macOS is that Nix can eat through storage quite quickly, and storage upgrades are pretty expensive on Macs. This might push the balance back to a fanless Ryzen build.
> I think the only real issue with Nix on macOS is that Nix can eat through storage quite quickly, and storage upgrades are pretty expensive on Macs. This might push the balance back to a fanless Ryzen build.
Only if you want to be able to roll back multiple versions. Otherwise, I think it is fine.
I've been using Nix on macOS for almost a year. The good (and bad) thing about Nix is that it supports many different use cases, so you have to spend some time understanding the options before you can even figure out which flavor to install.
A good way to get started is to start using Nix to replace/supplement Homebrew. You can install Nix in addition to Homebrew and have some packages installed by one and some by the other. You can uninstall a Homebrew package and then reinstall it with Nix. You can even remove it with Nix and go back to Homebrew if you like.
I would wait on nix-darwin until you are sure you need/want it. (I have recently started using it for its support of the `linux-builder` feature, but not everyone needs that.)
As a software developer who uses macOS to develop for Linux, it is a great tool and I cautiously recommend it to those who are willing to deal with some learning curve and frustration.
I haven't yet used nix-darwin enough to make a recommendation one way or another. (But the `linux-builder` feature is compelling if you need it: https://nixcademy.com/posts/macos-linux-builder/)
A comparable option in my opinion would be Minisforum 790S7. They also have a separate mini-ITX motherboard from that one if you want to DIY.
The CPU in it is faster in raw multi-thread performance, single-threaded it's a bit slower, but still quite impressive.
The only problem I had with Minisforum is that they couldn't supply the exact hardware I ordered, and their suggested solution was either to wait 1+ month or get a slightly different configuration. Two times out of two.
Quality-wise they're pretty good though, no complaints there.
I can't find a Minisforum 790S7 for anywhere near the price of the base model mac mini. I am seeing $459.00 USD and that is "BAREBONE (NO OS/RAM/SSD)" [1]. I am comparing this to the M4 Mac Mini base model, that does indeed come with an OS, RAM, and an SSD[2] at $499 USD.
Welp. You're definitely correct. But that's the only machine in my opinion that comes close (and offers some advantage like a whole PCIe 5.0 x16 slot). There are other mini PCs that are cheaper, some other commenter suggested Beelinks which are also quite popular among enthusiasts, SER8 for example: Ryzen 8745HS, 24 GB RAM and 1 TB SSD for 467 Euro. Seems competitive enough.
Maybe it's not performance-comparable, but $284 (BF35 coupon discount from $319 list) for a Ryzen 5, 16GB RAM and 1TB SSD [1] is in my mind a good value trade-off versus the Mac Mini. The only thing that gives me pause is the concern expressed by some that Chinese mini PCs are susceptible to BIOS malware. I've looked into Coreboot, Libreboot and System76 open firmware to mitigate the risk of infected Minisforum firmware, but there's always the possibility of it crippling the device, which would be a big time-loss more than anything.
Other flavors of malware are easily removed with a quick Windows reinstall before use but potential firmware infections are a good reason to pay more for mainstream PCs.
That's a valid concern but I personally avoid going into this rabbit hole just for the sake of my (already fragile) sanity.
Speaking of issues, the other one with these low cost mini PCs is low-quality SSDs. The one that my UN100D was supplied with was pure garboleum in terms of speed so it had to be replaced.
I've recently bought two minisforum PCs on Amazon. I fully expected the SSD to be garbage and to throw them out. To my surprise, they were decent-ish Kingston TLC PCIE 4.0 SSDs. Definitely not the cheapest SSD on the market.
Beelink EQR6 has an internal PSU and is also quite small, a bit smaller in footprint actually. It even comes with two full-size m.2 slots and expandable RAM.
The Mini is great, exceptionally so; I actually just got a rather souped-up one (that's the reason I'm in this thread). But x86 vendors are catching up, and there's a real possibility that more established brands will pick up this segment.
For the base model, but any upgrade on the Mac will kill that advantage instantly. For those keen enough to solder better parts on it the Mac Mini base model is the worst kind of barebone, filled with components that shouldn't even be produced anymore.
> components that shouldn't even be produced anymore
That's kinda harsh. For what it's worth, the base model isn't that bad and the storage can be (theoretically) upgraded down the road, even though it might cost a fair bit more than a standard m.2 SSD.
Sure, in raw compute it's slower than competition, but objectively it's still plenty fast and more trustworthy in terms of reliability.
I can only speak of experience with their UN100D mini PC which I use as a home server in a fairly constrained closet. Not a hot machine by any means, far from that, but the cooling seems pretty decent even for this low power CPU.
The BD790i that I just received (haven't even installed it anywhere yet) has a rather substantial heatsink over the CPU, similar to high-wattage GPUs. The chip itself is rated at 55W TDP, so cooling it shouldn't be much of a problem. The motherboard doesn't come with a fan and I'm definitely not going to spend extra on high-end ones, at least not yet.
I don't know anything about the thermals of 790S7's case though. It looks like they gave it at least some thought judging by the duct over the CPU, but how it actually performs I have no idea.
> One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
Been using that ever since M1 became a thing; nothing worth mentioning, "not supported" is vanishingly rare in practice.
The Beelink (and other mini-PC brands) offer comparable performance.
The fact they offer lots of different configurations lets you choose your own trade-offs.
Assuming you don't have an operating system preference, the base model Mac Mini is tough to beat outright, but as you upgrade it there are other options that get interesting.
> The Beelink (and other mini-PC brands) offer comparable performance.
[0] has the new (base!) M4 at 3859 (single) and 14837 (multi) whilst [1] has the Ultra 5 125H 4500 at ~2200 (single) and ~11500 (multi). "comparable" is doing a lot of heavy lifting in your sentence.
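(Ratios, for the lazy; the Geekbench figures are the ones quoted above, the Ultra 5 numbers being approximate:)

    m4 = {"single": 3859, "multi": 14837}
    ultra5_125h = {"single": 2200, "multi": 11500}  # approximate
    for k in m4:
        print(f"{k}: M4 is {m4[k] / ultra5_125h[k] - 1:.0%} faster")
    # single: M4 is 75% faster
    # multi: M4 is 29% faster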
There are even cheaper mini PCs you can buy that offer great bang for the buck for home users who just want to do web browsing and run Office, but the performance isn't at the same level, even if you spend a lot more.
After experiencing NixOS it's hard to settle for anything less. Nix is only good for running home-manager; nix-darwin is mostly a joke (I don't mean disrespect, I appreciate the work devs are doing, but limitations of the platform cripple the entire experience).
FWIW, Mitchell Hashimoto runs NixOS VM on his Mac for development. And that's the option I'm gonna implement once I get my MBP from repairs.
There was a comparison mentioned on last week’s episode of the Accidental Tech Podcast, I don’t remember if it was pointed out to them or they noticed it themselves.
Base Mac mini: $599, 16/256 GB
Double storage and ram: $600 upgrade.
Price of 32/512 config: $1199
Two 16/256 machines: $1198.
The RAM is (by Apple standards) reasonable at $200, but $400 for doubling the storage is insane.
The differences actually are quite huge. As far as I know, the M series chips all use LPDDR5 RAM, which is indeed more expensive than the DIMM/SO-DIMM modules you would add to your diy build.
Still, you can easily get a good kit of DDR5 DIMMs for 110€/32GB from a retailer. So while LPDDR5 RAM is more expensive, it is most certainly not expensive enough to justify a 230€/8GB price as being driven by BOM costs
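Per gigabyte, using the two figures above (granted, retail DIMMs and soldered LPDDR5 aren't a like-for-like comparison):

    retail = 110 / 32      # ~3.44 EUR/GB for a DDR5 DIMM kit
    apple = 230 / 8        # 28.75 EUR/GB for the upgrade tier
    print(apple / retail)  # ~8.4x premium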
Also, the 256 uses 2x128 drives while the 512 uses a single drive, so you even get slightly slower storage with the upgrade. The base model is a great deal.
This is true for the Mac Studio but not for the M4 Mac Mini -- they all have a single storage slot and the only difference between the 256 and 512 models is the model of the NAND chips.
I use nix-darwin on an M2 as a daily driver. It works great! For a few quirks you need to go with Brew (mostly graphical applications), but other than that (and some OS toggles) my setup is almost identical between my NixOS and nix-darwin machines.
In Germany you can get the cheapest (base version, 16GB RAM/256GB SSD) M4 Mini for 579€ via Unidays edu discount (also including VAT).
I picked mine up from the post office yesterday, it's 50% faster in Geekbench single/multi-core CPU benchmarks than my M1 Pro Macbook Pro and about as fast in GPU performance. Impressive.
Daily nix user across Mac and Linux, though I use Mac for actual development. No problems here moving between the two with my dev env defined on GitHub [0]
How do you handle the different keyboard layouts (cmd and ctrl on Mac, ctrl and superkey on Linux)?
I'm using a Mac at work and Linux at home with a programmable keyboard but I didn't find a solution to "merge" cmd and ctrl on Mac, so I still need to use both on Mac (not a big drama, but slightly annoying).
My half solution is to use a keyboard that physically feels quite different to help my brain use a different mode. The Linux keyboard is a big heavy mechanical keyboard while on the macbook I just use the built-in keyboard.
It's not a perfect solution and I still make mistakes, but it helps.
>One can potentially use Nix on a Mac Mini to keep similar development environments to those used in Linux, but AFAIK some packages are not supported on ARM. Any experiences using Nix and nix-darwin as a daily driver?
Would this run anything Docker/ARM?
My entire home server setup is Linux/Dockerized and the Mac Mini hardware looks so good, but the more I read about MacOS as a server OS the worse it seems to get.
Maybe for a little server or something, but with the hard-to-upgrade 256GB storage... I don't get the appeal. Also, 16GB of memory is extremely limited these days. Again, perfect for a little server, but not for a daily driver.
For large media, sure, but I really don't want to pay a premium and then have to manage where my everyday apps are installed because I only have 256GB of storage I can't upgrade.
Well, missing packages for one. Nix prides itself on having one of the most complete package catalogs of any Linux package manager, and on Mac it leaves quite a bit to be desired. A lot of functionality has to be hooked in via home-manager scripts that are a lot less stable than NixOS modules, and since your system isn't built/upgraded by Nix you can't write overlays for a lot of Mac software. If you only need the versioned Flake management then it might be an okay option, but I found myself pretty frustrated by it in places that aren't an issue on Linux. I can't comfortably use it as a Homebrew replacement.
Also, my Mac is 256gb which feels far too cramped for Nix. I'd really only recommend it if you're working with 512gb or more.
> Yeah, I eventually realized that nix-darwin is only good for managing the list of homebrew casks.
If that's really all you use it for, and you already use Home Manager, the Homebrew module for Nix-Darwin works fine in Home Manager with some small tweaks. I use that with a custom, per-user Homebrew prefix to keep Homebrew out of the way except for the task of installing casks.
Pretty sure only M1/M2 are supported, so none of Apple's new offerings will fly... yet.
Shame, I'd love to use Linux on Apple's latest and greatest MacBooks, but I'll stay with the tried-and-true Dell Precision series until the year of the Linux Apple laptop becomes a reality.
There's an extremely experimental/feature limited 3rd party implementation of macOS native containers. It requires disabling all sorts of security features, though.
macOS simply doesn't work if I want to run this as a home server, which is my primary use case for an Apple silicon Mac. Most server applications are first class citizens on Linux, like Docker and Kubernetes and caddy/nginx (I know ports exist but there's more documentation and experience on Linux). Furthermore, systemd is a lot more documented than launchd and generally speaking it's easier to do things like upgrading headless, setting up NFS, and the like. I wish Apple offered these machines with official Linux support, but that's antithetical to their philosophy.
That's not a thing. Apple silicon doesn't use EFI so you need a completely custom ROM to satisfy the boot process, hence Asahi. And Asahi doesn't support M4 and likely won't for a while.
ARM64 support for GUI apps (via flatpak in the Fedora Asahi Remix) is also pretty poor, though your standard fare of CLI apps is present.
NUC 14th gen with i3 is around 400 EUR with VAT, with no RAM or storage. For the other 250 EUR, surely you can get more RAM than 16 GB and more storage than 256 GB.
I use a NUC as a daily driver. The problem with NUCs is that cooling is suboptimal, the fan is small and thus noisy. It can be fixed with a third-party case, but that's at least €60-100 more for a much slower machine. Plus, you may void the guarantee by transplanting the motherboard.
It’s a shame that ASUS cancelled the NUC Extreme line. I know it’s quite a bit bigger than other NUCs. But the 13 Extreme had expandability, good cooling, and fast CPU options.
4 cores instead of 10 cores, 69W TDP instead of 22W, UHD Graphics 730 versus Apple's 10 core GPU (0.5 TFLOPs vs about 4.3), 23% worse single core performance, 45% worse multicore, and much louder cooling.
Apple is certainly out of their mind on storage. But on a desktop it’s trivial to plug in an external disk that you can buy at an absolutely reasonable price instead of the insane Apple one.
$200 for an extra 8GB of RAM is not fair ($200 to go to 24GB, another $200 for 32GB). 64GB DDR5 kits can be had for less than $200.
And while you can plug in storage, I think it does kind of ruin the appeal of having such a small device by having a bunch of spaghetti cables. And if the boot drive goes it's not easily replaceable and makes the machine a brick until it is.
I bought a really small no-brand mini PC with an older Intel CPU. It runs Debian quite well, but it's noisy and slows down a lot, perceivably so once it gets hot. But it's cheap: around $100-150 on Chinese e-commerce websites, including memory and disk.
Apple does amazing stuff. But it's very pricey in most markets and unaffordable to those on budget.
How does one get “idling” like this? “In 1.25U of rack space, you could run three Mac minis, idling around 10W, giving almost a teraflop of CPU performance.”