Well, it makes sense. If you're going with a dedicated graphics card in your laptop, battery life is already out of the window, so you might as well get as much processing power as the thing can handle.
As a proud owner of a laptop that could double as a self-defense weapon to cause massive blunt trauma (and a charger that falls squarely into the same category) I welcome this decision.
I am however considering getting a lighter notebook with longer battery life in the future. Having the power of a full desktop machine in your backpack comes in incredibly handy when you need it but it can get a bit awkward working with it on the train.
The issue isn't so much battery life as it is heat dissipation. How does that thing handle cranking out that much heat? Granted, the new 1000 series is pretty damn efficient at what it does (my 1070 is amazing for the price), but it's still a lot for a laptop.
From the article: more CUDA cores at a lower clock. Since power consumption doesn't scale linearly with clock speed, doubling the cores and halving the clock (as an example, not the actual ratio they used) leaves you with a net efficiency gain.
At factory settings the card draws 120W and pushes ~110fps in their 1440p test, but throttling the power limit down to just 60W only reduced it to ~90fps.
(As an aside, the AMD RX480 comparison shows why people are disappointed with Apple supposedly using AMD Polaris GPUs in the upcoming Macbook Pro refresh)
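To put rough numbers on that, here's a back-of-the-envelope sketch. The fps and wattage figures are the ones quoted above; the cubic rule is a standard approximation for dynamic power (lower clocks also permit lower voltage), and the core/clock ratios are illustrative, not Nvidia's actual ones:

    # Perf-per-watt from the quoted 1440p test numbers
    fps_stock, watts_stock = 110.0, 120.0
    fps_capped, watts_capped = 90.0, 60.0
    print(fps_stock / watts_stock)    # ~0.92 fps/W at the stock power limit
    print(fps_capped / watts_capped)  # ~1.50 fps/W with the limit at 60 W

    # From the core-count side: dynamic power scales roughly as cores * clock^3,
    # so doubling the cores at ~0.79x clock keeps power about constant while
    # throughput (~ cores * clock) rises ~1.59x. Ratios are hypothetical.
    cores = 2.0
    clock = (1.0 / cores) ** (1.0 / 3.0)   # ~0.794
    print(cores * clock)                   # ~1.59x throughput at equal power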
I just don't understand why Apple seems to prefer AMD. Bad experience with Nvidia's drivers in the old Core 2 Duo MBPs? Does AMD have a better track record?
Apple is a backer of and is invested in OpenCL. OS X itself leverages OpenCL throughout the OS (Quicklook for example uses it to make previews faster) and of course FCP/Motion/etc make heavy use of OpenCL as well.
Nvidia cards are capable of OpenCL but they've never performed as well with it as they do with CUDA. AMD has always been the better option for that.
Of course Apple could implement CUDA support in their software, but they've never been big on vendor-specific standards they had no part in developing.
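For anyone who hasn't written OpenCL: here's a minimal sketch of a kernel launch via the pyopencl bindings (assuming pyopencl and a working OpenCL driver are installed; the Python route is just my pick for brevity, not anything from the article). The point is that the same kernel runs on AMD, Nvidia or Intel silicon, which is the vendor neutrality Apple is betting on, versus CUDA being Nvidia-only:

    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # picks whatever OpenCL device is available
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1024).astype(np.float32)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel itself is plain OpenCL C, portable across vendors.
    prg = cl.Program(ctx, """
    __kernel void square(__global const float *a, __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] * a[gid];
    }
    """).build()

    prg.square(queue, a.shape, None, a_buf, out_buf)
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)   # copy the squared values back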
It doesn't matter which cards are better at OpenCL, because it is a legacy technology on Apple platforms, most likely to never be updated beyond the current version 1.2 (latest is 2.2).
Power consumption, optimization for non-DirectX drivers, and AMD being able to meet Apple's parts demand.
Apple ha(d) really specific requirements for their machines, so I don't doubt the decision came down to some specification being met by AMD and not Nvidia at the time. Nvidia seems pretty happy cornering the high-end market and can barely keep the 1000 series in stock at the moment.
Alternative options: Nvidia is too expensive per unit, or most Apple customers couldn't care less about dedicated graphics. I think the latter is most likely - they don't sell their products on specs.
Still, I've worked with a lot of Apple and Dell laptops and they ALL have some type of overheating issues with GPUs. Whether they've solved all these problems with this, who knows. But I'm skeptical.
There's a slightly lower clock speed, but it's really going to be up to the OEMs to make this all work from a cooling perspective. I suspect your average gaming laptop is so big and heavy that having another big fan in there isn't really going to bother anyone.
It's probably going to be a kludgier solution, but the M-series GPUs are fairly terrible, often a fraction of the performance of their desktop equivalents. I think Nvidia saw the external GPU thing on the horizon and decided this is the better approach. I tend to agree. I'd rather have a little extra weight and girth in a laptop than worry about a whole external enclosure and not having it when I'm out, or dealing with all the wires and such.
From a battery perspective, who cares. Even the M series had to be plugged into the wall for any non-trivial GPU work.
They've had external GPU solutions here and there in the past; not quite sure why the idea didn't catch on more. You get the mobility of a laptop, and then when you're docked at home you get the GPU processing power in a separate form factor.
External GPUs didn't catch on earlier because Thunderbolt is the only viable standardized interface for them, and Intel refused to certify eGPU products based on TB1 or TB2.
TB3 is the tipping point where Intel decided there's enough bandwidth to make it a smooth experience, and put their weight behind the idea.
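For a rough sense of the gap (nominal spec-sheet link rates, not measured throughput; note the eGPU actually rides a PCIe tunnel inside Thunderbolt):

    # Nominal link bandwidths in Gbit/s -- published spec numbers, not benchmarks.
    links_gbps = {
        "Thunderbolt 1": 10,
        "Thunderbolt 2": 20,
        "Thunderbolt 3": 40,           # tunnels up to PCIe 3.0 x4 (~32 Gbps)
        "desktop PCIe 3.0 x16": 126,   # what a desktop GPU slot gets
    }
    for name, gbps in links_gbps.items():
        print("%s: %d Gbps (~%.1f GB/s)" % (name, gbps, gbps / 8.0))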
I recently switched from an 18" luggable gaming laptop to a surface book. Powerful enough to replace it (I got the version with an nvidia GPU in the base that it switches to dynamically), but small and light and good battery life too.
The most exciting thing to me in portable PCs is the 'VR backpack' form factor. As cool as the HTC Vive is, moving around with a fat cable sticking out the back of your head is a big detriment. Putting the PC in a backpack will be so perfect for VR - bigger batteries, more efficient (without the LCD), no cables.
These GPUs will make VR backpacks even more viable, with reduced power consumption, better performance and reduced size. I am super excited to see what will be coming out in this field over the coming years.
If we're just talking current generations: yay for the Vive's wireless Lighthouse base stations and controllers, but nay for their 12V 1.5A power requirement. As for the Oculus: yay for its 5V over USB, but nay for its cable-connected tracking.
Anyone here have any experience with the new generation of TB3 External GPU docks like the Razer Core?
$500 is pretty steep for what's essentially just a tiny case + power supply, but if it works as advertised with no serious pitfalls, I might be tempted to splurge for the extra flexibility.
Though I wish someone would make a smaller, cheaper graphics dock built specifically to house less power-hungry, single-slot cards like the RX 460. Something like that would be more than enough to handle my modest gaming needs for the foreseeable future.
I hadn't bothered to look up the dimensions and weight of the Razer Core until you mentioned it, though, and realized it's barely any more convenient to move around than my current desktop PC case (SilverStone FTZ01, only about 150mm more in a single dimension), which removes a lot of the appeal it had for my use case, unfortunately.
I honestly wouldn't mind a non-upgradable version that uses one of the laptop cards listed in this article if it means they could make it appreciably smaller and more discreet than a slim-SFF desktop case. I mean, the dock itself is a modular component that can be upgraded as a whole anyway.
I also appreciate the review, but I think it's a case of too little, too late and too expensive.
For that price you can simply build a spare HTPC in a box of similar size, or get a powerful laptop with a good GPU built in.
The problem with either of those setups is that you don't get the flexibility of using the same portable, quiet, power-efficient laptop both for work on the go and for gaming at home. With one machine you never have to worry about keeping things in sync (easy enough if you just need files, but often impossible if you also want to sync settings for frequently used applications).
For many people like myself, the TB3 laptop + GPU dock combo is worth a lot more than the sum of its parts.
I think this is a sign they're worried about AMD Polaris. Most people don't realize just how much AMD has turned around the past year. I saw an article recently about people shorting Nvidia.
Polaris is not a serious threat to Nvidia. The GTX 1060 is significantly faster than the RX480 at a similar retail price; the GTX 1060 is considerably more expensive than the 960 was at launch and I expect that Nvidia have retained good margins. The RX480 seriously missed AMD's efficiency targets, hence the PCIe power fiasco. AMD have nothing to compete at the high end and have no serious HPC offering; with PC sales shrinking year-on-year, HPC is a crucial driver of growth.
Polaris is just barely enough to keep AMD in contention. As with their CPU range, AMD are relegated to a value-oriented offering for the low to mid market. This isn't a good place to be. Nvidia can afford to squeeze AMD's margins, because they have a monopoly on the more profitable high end. AMD are also being threatened from below by Intel's increasingly powerful iGPUs.
>The GTX 1060 is significantly faster than the RX480 at a similar retail price
Not from what I saw. Slightly faster or equal in DX11, slightly slower or equal in DX12/Vulkan.
And they're not price-comparable either; there's a $50 difference. If you're pointing to the EVGA models etc., note that those have a single fan and as such are going to throttle quickly.
For a good price comparison I suspect we'll need to wait for the rumoured 1050Ti, which should be actually price-comparable to a 4GB RX 480.
Personally, if I was building an upper-midrange gaming PC right now, that marginal $50 would go to a bigger SSD, not to buying a 1060 over a 480.
I think the current value in the stock is not just the high demand for the new cards fueled by a VR upgrade cycle, but speculation on their gains in the HPC and "deep learning" markets (where CUDA/cuDNN is so far ahead of OpenCL that, from my vantage point, they have a monopoly for now). I think Nvidia can continue to compete on the high end because of the overlap of high-end consumer markets with those other two.
My full disclosure: I am very long Nvidia (and it has been very kind to me so far)
> speculation on their gains in the HPC and "deep learning"
I think this will be the battleground; see Intel MIC for another big silicon player trying to edge in.
I agree that CUDA/cuDNN is far ahead, but do you think they can maintain it? GPGPU paradigms are changing pretty rapidly, so the entire field could be ripe for disruption.
> Most people don't realize just how much AMD has turned around the past year.
I've been recently reading stock reports from some of my banking friends (and actually chatted with some staff at AMD) because I'm curious about the turnaround. Don't own equity yet.
Just last year AMD looked to be in very dire straits, and they're still operating at a loss. Is the consensus that they're going to survive now?
Another good reason for dropping the M is, in my opinion, related to how good another company has gotten at making GPUs: Intel.
Intel first eliminated the whole aftermarket entry-level GPU segment and will probably eliminate the middle tier as well.
As Pluma states
"If you're going with a dedicated graphics card in your laptop, battery life is already out of the window, so you might as well get as much processing power as the thing can handle"
Yes, I gave it a read, but if I remember correctly there was a component that couldn't keep up with the speed of the Thunderbolt link, making the whole solution run at 1/4 speed.
The approach goes back at least as far as using the ExpressCard port on EliteBooks, I think.
On the other hand, it would be incredible to get these solutions as official products from vendors like Nvidia or Asus, without the DIY aspect (not because I'm against DIY, but because official support would improve the state of these solutions).
And they still offer crappy 3D performance, and for a long time had better DirectX drivers than OpenGL ones, which sometimes lied about the actual hardware capabilities.
"And if you can make it look like it's for grown ups too, that'd be great."
THIS. I'm not going to buy a laptop that looks like a prop from a Michael Bay movie. The best current option for a laptop with a GPU and a design that wouldn't be embarrassing to leave the house with is the Microsoft Surface Book, which is expensive, has a mediocre GPU and is actually still kind of embarrassing to own. I'm pretty excited about the day that gaming laptop manufacturers realize that their target audience isn't entirely composed of guys whose main fashion inspiration is Reaper from Overwatch.
I dunno. I've had decent luck with Asus laptops that are at least somewhat gaming-capable. They've still got the same M-class GPUs you'd expect (outside of these eXXXtreme or external options) but you're still looking at a standard brushed aluminum chassis (maybe a plastic base depending on model), i5 or i7, 16GB of RAM, SSD, and $1000-1500 price tag.
Not exactly cheap but certainly not in the same range as the GPU-laden Surface Books. I picked one of the Asus notebooks up a couple years ago when I needed something for occasional use (so I didn't want to go all out on budget) that could still handle some graphics work and live projection visuals which typically need OpenGL support past what you find in an iGPU.
It's not as thin and light as an ultrabook or tablet/convertible but that's the flipside of powerful and not too expensive. Maybe if it was my main or only machine I'd have gone higher end and higher price but there's definitely some options in between the crappy $600-800 general purpose laptops and the $2000-3500 Decepticons and fancy detachable tablet offerings.
I don't see how much room for design there is on a Surface Book besides the keyboard dock assembly, but even that looks pretty spartan, with clean lines and a rather pale gray that could be mistaken for an Apple laptop from the aluminum PowerBook era. So I don't know what's embarrassing about it, except, arguably, the monochrome Microsoft logo attached to it.
I can highly recommend the Dell XPS 15. I too was in the market for a laptop with a mediocre GPU, which didn't look absolutely terrible, was affordable and not too bad on the whole portability thing. So far it has definitely lived up to my expectations.
I can recommend getting the 1080p version (but not the entry-level model, as that one has a smaller battery and no discrete GPU), unless 4K is very important to you and you know what you're getting into wrt. high-res scaling in Windows.
Man. Hopefully these come with some kind of car-like exhaust heat shield between the keyboard and everything else. I tried playing games on my MacBook Pro, and while it ran fine (Guild Wars 2, a few years ago) the machine got so blazing hot that I had to play with an external keyboard.
After a few days of doing that I figured that having the laptop that hot for extended periods of time wasn't a good idea and quit playing. Probably for the better.
So what about that great big "?" in the table: TDP? How hot is it going to run? How well is the laptop going to handle it - does it now require an all-metal body? How well is the essential issue of fan and heatsink maintenance considered in the design of the laptop?
I ask this because I have two GPU laptops both of which gradually degraded in usability over time entirely because of thermal issues.
Did you try removing the heatsink/fans and cleaning them out? I was close to replacing my laptop because it was constantly overheating, and it turned out the problem was a copper heatsink grid densely packed with dust, insulating the entire computer. Since cleaning that out my laptop is as good as new.
Yes, that was my first thought on experiencing the problem. It's often quite hard to do (that's why I mentioned "How well is the essential issue of fan and heatsink maintenance considered in the design of the laptop?" above). One of my laptops requires taking the whole thing apart from the keyboard side in order to get at the heatsinks below, so I gave up.
And of course there are laptops on sale which are glued shut.
Air duster works reasonably well but can force dust further into cracks.
> And of course there are laptops on sale which are glued shut.
Are there? Even Macbook Airs can be opened with nothing more than the appropriate pentalobe screwdriver. I had to do it a while ago to clean out the fan which had developed an annoying clicking noise.
Microsoft Surface. I suppose you could call it a tablet, but it has a keyboard and an i7 processor and runs Real Windows including games. The tablet/laptop distinction is being eroded, and "tablets" are much more commonly held together with glue.
The desktop 1060 has a TDP of 120W, and the notebook one should be within 10% of its performance. So I don't think they will get below 80W. There is also the 1050Ti with a 128-bit bus, though. That should cut TDP significantly.
When people talk about 4K they usually mean (whether they know it or not) UHDTV, which is the standard used in consumer monitors and TVs and in which 4K = 3840x2160.
This makes sense, as the increasing capability of tablets and phones has made portability a lot less of a priority in a laptop, for me at least. It's almost hard to imagine why a MacBook Air was an appealing buy for me in 2011 - a decent laptop used to be superior for a lot of general-use tasks that any recent iPhone can now handle just as comfortably and quickly. When I have to use a full system at home, I expect some sort of major advantage in capability, and I care a lot less about form factor.
I wonder if Apple even has a response in these areas where svelteness has much less of a premium.
NVIDIA laptop GPUs were always the "same chip". Each "chip" in NVIDIA's lineup can be floorswept differently, with parts lasered off and its operating clock range adjusted. This is why the 1080 and the 1070 are the "same chip"--GP104.
Previous mobile chips were actually the "same chip" as the desktop ones too, except more was lasered off and the clocks reduced even further--or sometimes the clocks were reduced a lot and less was lasered off, improving performance by cramming in more hardware.
They just dropped the M to say "oh look, we've come this far!", but in reality none of their policy actually changed--they just floorsweep differently for a lower TDP and hope it's close enough to desktop-tier performance.
edit: what they are, are different SKUs of the "same chip".
edit again: I'm not saying this is a trivial achievement--Pascal has brought a ton of improvements to the design. I'm just saying it's a marketing move that I expect the HN crowd to see through.
Nope. Unlike previous-generation mobile parts, these are functionally equivalent to the desktop parts - same number of cores, shaders and ROPs, same memory bandwidth. Of course clocks are slightly lower and these parts are binned for higher efficiency, but the performance is very close to the desktop parts.
The 980m was substantially cut down compared to the desktop 980 and only had about 55% of the performance.
It's actually a rather refreshing marketing move - in past gens the "M"s were often smaller chips - while the 980M had the GM204 like the non-Ti 980, the GTX 960M had the GM107 (the GTX 750's chip!)
Now for BS, go to a store and look for a laptop i7 that's actually a quad core... outside of a gaming laptop.
Besides gaming, has anyone used these beefy laptops to run GPU-intensive machine learning algorithms? I'm curious whether it's practical to use a gaming laptop vs. a desktop to run TensorFlow calculations.
Personally I just built the desktop and ssh in with an old MacBook. Best of both worlds: the desktop uses the wired connection at home to download stuff, your mobile device just runs a terminal and a browser so it has long battery life, and you can turn off the laptop and let the desktop chew on your model.
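Either way, it's worth a quick check that TensorFlow actually lands its ops on the GPU. A minimal sketch against the current 1.x-era API (log_device_placement is a real ConfigProto flag; the matrix size is just an arbitrary choice to make the GPU placement matter):

    import tensorflow as tf

    # Pin a large matmul to the GPU; this fails loudly if no GPU is visible.
    with tf.device('/gpu:0'):
        a = tf.random_normal([4096, 4096])
        b = tf.random_normal([4096, 4096])
        c = tf.matmul(a, b)

    # log_device_placement prints which device each op was assigned to.
    config = tf.ConfigProto(log_device_placement=True)
    with tf.Session(config=config) as sess:
        sess.run(c)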
Amazing advances in GPU technologies over the past few years.
I love this comment: "...Oh, and don't forget the power adaptor, which—as I saw with some models in performance demos—is literally the size of a brick."
As long as it's not a wall-wart... Whoever seriously considers it a good idea to block off 2-3 outlets in a power strip to run one AC adapter ought to be bludgeoned with that adapter until they see reason.
Absurd as it may sound, one can find extremely short 3-prong extension cords designed specifically to take care of the wall-wart problem. Search Amazon for "1 foot extension cord" for examples.
That setup actually works really well; its predecessor, the GX700, was one of the few previous-gen laptops that could drive the HTC Vive _and_ the Oculus Rift (adapters required) at the same time. https://plus.google.com/+AudreyTang/posts/VGebgzXnefP
I'm sure it does, and I wasn't suggesting otherwise! The new version (the one I linked) has dual GTX1080s in SLI so I'm sure it's a beast, and the liquid-cooling docking station is actually very ingenious. I was just amused by how unconventional it is, given the current tendency towards ultra-thin and/or convertible laptops.