AMD Announces 307% Increase in Q3 Earnings (businessinsider.com)
284 points by baazaar on Oct 24, 2017 | 169 comments



I really don't understand the negative market sentiment on expectations of future cash flow. AMD has been building compelling GPU and CPU chips, and this is a rising-tide market driven by new solutions to longstanding problems (AI/robotics/driverless cars/drug prediction/VR/AR/big data). These technologies will be applied to every industry on the planet, including huge industries like agriculture, which is today the least digitized of all major industries. Exponential growth in IoT devices and AI-to-AI interactions will generate tremendous amounts of data, as well as the metadata that sits on top of it. Then there is bitcoin. I think people are massively underestimating the TAM. This era will require massive new computing power, and especially for big data centers it makes sense to diversify your suppliers. It's a rising tide that will be much higher than AMD's last high-water mark. AMD is a measly $10b company, compared to NVIDIA ($110b) and Intel ($160b). A 10% move by NVIDIA exceeds AMD's entire market cap. Would love to hear some counter-opinions on this.

Disclaimer: I own both NVIDIA and AMD.


Is agriculture really the least digitised industry? At least in Europe and the US, going to a farm these days you'll find robots and machines everywhere.

I would argue construction is probably the least digitised. The Economist had an article recently saying that construction is the only manufacturing sector in the UK that has declined in productivity over the past two decades. I'm not really into the topic, but the construction I see in my city (London) doesn't look like it has adopted any processes developed over the past 1-2 decades. Maybe that's where the next revolution happens.


https://www.nationalgeographic.com/magazine/2017/09/holland-...

After reading this, it is clear how much can still be done. Holland has in the past 3-4 years consistently been innovating and producing much better yields in all sorts of agricultural products. For the first two years most US companies were saying those prices aren't sustainable and will come down soon; it is now clear those better-than-expected yields are no mere coincidence.


While I can't speak for the physical part of construction, I know that the Structural Engineering part was until semi-recently (2007-ish) pretty manual. My flatmate graduated at the same time as me and worked in a local engineering firm in Edinburgh and I was always curious what his work involved. It turns out there were a LOT of extremely manual calculations performed with pen and paper, with datasheets containing commonly used calculations for how structures should behave under different loads and the properties of various materials you could plug into these calcs.


One of my teachers used to say, "What is cheaper: Robert or robot?" There already are constraint-based systems for structural engineering that e.g. calculate supports given structure and loads, select materials, etc. I am not too familiar with the UK market, but in my country people in civil engineering tell me that a significant portion of engineering project costs comes not from actual engineering, but rather from various permits, coordination, and other bureaucracy that does not benefit from a single party automating.

This makes the returns on investment in new software, training, and temporarily lost productivity diminishing. Without the government changing standards on engineering output, the market for flexible projects is not that large (which is a bit of a chicken-and-egg problem).


That might explain why my friend wasn't enthusiastic about trying to automate some of them for him. Tbh this was before I realised startups were a thing you could do so I wasn't thinking about this as a business, just as a way to make him less stressed out so he could leave work at 5pm and come for beers with me


I’m not that familiar either, but there’s loads of building work going on near me in London that looks pretty modern. There’s a hotel nearby that has been built out of prefabricated rooms that came in on lorries, and there are two blocks of flats being built with a platform that rises up a floor at a time, producing seemingly fully completed floors out of the bottom (difficult to describe). Both of these seem relatively modern to me, and are certainly seeming to produce building more quickly than normal.


I saw this all coming from a mile away. I was frankly shocked at how low the price was last year, given they were nearing the end of a major development cycle (both AMD [as a CPU vendor] and ATI->AMD [as a GPU vendor] have a habit of developing a leading flagship product about once every four to five years, while focusing more on making their last flagship more affordable in the years between).

It also cannot be ignored that until the Switch came out, they were making the SoCs for every current-generation console.


Not to mention AAPL keeps using them for all their desktops and laptops with dedicated GPUs


> I think people are massively underestimating the TAM.

While I agree, my only criticism would be that the "old" data center market is effectively consolidating onto commoditized hardware. In other words, we have fewer idle CPUs in data centers, and the diminishing returns of Moore's Law are hitting. So while the new wave is positive, the underlying revenue stream may not hold consistently.


AMD doesn't make money consistently. It's in an incredibly capital-intensive business. Occasionally, one of its chips outmaneuvers the corresponding Intel chips for a generation or two. But in a couple of years, it rolls the dice again. More likely than not, Intel will beat it over the next few cycles. Then AMD will lose money while it tries to develop another hit.


As an investor I think it’s always important to ask: what’s changed and why now? Discussing the past without addressing those two questions may make you miss opportunities that were obvious in retrospect. You might have been able to say the same thing about Apple when Jobs rejoined the company and started launching new products.


But nothing has changed. The mechanics of AMD's business are the same (and indeed the guy who designed Zen is no longer there).


I have no opinion on AMD or how it compares to Intel or Nvidia, but this kind of enthusiasm about exponential growth reminds me of how Cisco was the largest company in the world by market cap in 2000.


I agree. The last three AMD booms have tracked to the last three asset bubble periods (1999-2000, 2005-2006, now). They have no earnings to support the present $14 billion market cap, much less higher. When this latest asset bubble party ends, AMD will tank again. They're supposedly in a boom period due to demand for their products, yet they're struggling to even get back to where their sales were in 2014 (when their stock was 1/5th what it is now). It's nothing more than the latest iteration of stock market mania (where stocks like Activision carry near 50 PE ratios while sporting 5-10% type earnings growth).

Cisco is still one of the more remarkable stock stories I've seen in the last two decades. Something like ten straight quarters of accelerating revenue growth, quarter over quarter, leading up to 68% growth for the third quarter of 2000. Three quarters later their quarter over quarter growth had evaporated down toward zero.

That same shock deceleration basically killed Lucent, Nortel and Sun Microsystems. The three were worth around $600 billion together during the bubble. Sun of course struggled on as their business melted, but they were a shadow of their former self by 2009. Oracle nearly got caught in the same trap (also having a crazy ~$200 billion valuation during the bubble), instead they went on a competitor acquisition spree and created as much of a moat as they could. 17 years later, their stock is finally back to where it was during the bubble (in nominal terms).


Yes, but I am worried AMD's agreement with GF is holding them back. AMD right now is selling as many EPYC, Ryzen, and Vega chips as it can produce. Since they brought Polaris back to GF, the capacity is now shared between their GPU and CPU product lines. It would make sense for AMD to sell more EPYC and Ryzen CPUs than Vega, since those CPUs have higher margins. But Vega also has much higher market volume due to crypto mining.

Navi, the next-gen modular GPU, is going back to TSMC 7nm. But the professional offering appears to be due by the end of 2018, and the consumer offering in 2019.

I.e., there is a capacity limit on how well AMD can do within the next 24 months. That is excluding AMD's APU launching soon, which I think is going to be a big hit.

P.S. That is with GF's Fab 8 20% capacity increase included. It is clear AMD is not the only one using Fab 8; I wonder who these other customers using 14nm at GF are.


> But Vega also has much higher market volume due to crypto mining.

Does it?

I'm genuinely interested in this as Vega appears to have high power usage requirements and less than spectacular mining results. I mean they're fine, but not the multiple of current cards that was rumoured before release. I owned a Vega 64 briefly, but sold it on as I didn't really want the noise or heat. And it still doesn't look like any real partner cards have been released, the lineup is all reference cards.

Have you seen any sales figures that would show the mining popularity of vega?


AMD isn't in nearly as broad a set of markets as Nvidia. Some examples: automotive, supercomputing (which Nvidia dominates thanks to CUDA), and game-streaming services. Also, AMD's driver support on Linux platforms has historically been pretty weak and often broken, further limiting any consideration for uses outside of gaming.


AMD is the clear winner now on Linux. Nvidia will become an underdog unless they open-source and upstream their driver, which is very unlikely. Nvidia was and will be plagued by integration issues, while AMD has already nearly caught up performance-wise and will overtake Nvidia soon enough.


I've heard this annually for about 5 years now. Yet whenever I look at a benchmark of Linux games, about half of them don't work on AMD GPUs.

Similarly for the Steam statistics on Linux: AMD just doesn't compete on anything that pushes the GPU on Linux. Even things like vlc, mplayer, plex, and friends often get AMD GPUs into a wonky mode.

The problem I most often see is new windows failing to map; I just get a black window. Presumably something to do with memory allocations of the backing store. It also frequently munges the console so I can't get to text mode to unwedge the AMD GPU.

So generally, if you want a stable Linux box, I'd recommend Nvidia or Intel GPUs. Things just work, you can easily have month-long uptimes, and things don't freak out just because you are playing 1080p video full screen. No pixel droppings, no failing to map, the console always works, etc.


> Yet whenever I look at a benchmark of linux games about half of them don't work on AMD GPUs.

Not sure what you looked at, but I see quite a rapid progress of bug fixes: https://www.gamingonlinux.com/wiki/Mesa_Broken

Never had problems with vlc or mpv on AMD. It works just fine with vaapi. If anything munges the console, it's Nvidia blob which doesn't even support framebuffer.

Also, it sounds like you aren't using Mesa, but use AMD closed stack instead. Don't do that. In short, if you want a stable experience - AMD is way ahead of Nvidia.


> Also, it sounds like you aren't using Mesa, but use AMD closed stack instead. Don't do that.

This is true, but needs a caveat: a lot of work has gone into improving Mesa and the rest of the Linux graphics stack recently and it's now pretty much caught up (performance-wise) to the proprietary AMD drivers. However, that's only happened in the past 6 months or so, so if you're not using a rolling-release distro like Arch or Gentoo you might still be better off with the proprietary drivers until your distro catches up.


Most stable distros including Ubuntu have repositories for backports, that provide recent Mesa. So I'd say there is no need to use closed AMD driver at all, unless someone needs full OpenCL or compat profile.


You should clarify you mean games. Nvidia has the gpgpu market to themselves on Linux, and nobody cares if the driver is open source in that area.


Yes, I mean games. In GPGPU market AMD is popular as well though. Miners buy Vega GPUs like hot cakes, it even creates a shortage. Also, CUDA lock-in will be undermined by Vulkan becoming the new common basis for compute. And AMD hardware is better for it and has more raw power, there is no question about it.


I haven't noticed a Vega shortage, are you sure?

Vega is loud, power-hungry, and expensive. The last generation of AMD cards was snapped up by miners; I've not heard of it happening with Vega.


There was a Vega shortage for some time, but now it looks better. Huge demand from miners clearly affected it.


AFAICT here in the UK there was a very limited first week release, followed by a restock and since then it's not been an issue.

I am genuinely interested in whether miners are really buying these. Have you any reason to think they are?

I couldn't find much when I looked, beyond speculation about how amazing they might be (but turned out not to be). AFAICT for Ethereum you're better off with nVidia 1070s, which cost about 75% as much, have about 75% of the hash rate, and use under 50% of the power.
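For concreteness, the comparison above works out as a quick back-of-envelope calculation. The absolute Vega figures below (price, hash rate, power draw) are illustrative assumptions, not measured benchmarks; the 1070 figures are derived from the ratios quoted in the comment:

```python
# Back-of-envelope mining value comparison using the ratios quoted above.
# Vega baseline numbers are rough, assumed 2017-era figures.
vega_price, vega_hashrate, vega_power = 500.0, 37.0, 300.0   # USD, MH/s, W

gtx1070_price = vega_price * 0.75        # "cost about 75% as much"
gtx1070_hashrate = vega_hashrate * 0.75  # "about 75% of the hash rate"
gtx1070_power = vega_power * 0.50        # "under 50% of the power"

def efficiency(hashrate, price, power):
    """Hash per dollar and hash per watt: the two ratios miners optimize."""
    return hashrate / price, hashrate / power

vega_eff = efficiency(vega_hashrate, vega_price, vega_power)
gtx_eff = efficiency(gtx1070_hashrate, gtx1070_price, gtx1070_power)

print(f"Vega 64:  {vega_eff[0]:.4f} MH/s per $, {vega_eff[1]:.4f} MH/s per W")
print(f"GTX 1070: {gtx_eff[0]:.4f} MH/s per $, {gtx_eff[1]:.4f} MH/s per W")
```

Under these ratios the two cards come out identical in hash-per-dollar (both price and hash rate scale by 0.75), so the 1070's advantage is entirely in hash-per-watt, i.e. ongoing electricity cost rather than upfront cost.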


That's what I've heard from others about the reason for the shortage. I didn't specifically research the mining topic.


I've pinned my compute hopes on AMD for about a decade now. They've been able to hang in there performance-wise, but their solutions on linux just haven't been as stable as NVIDIA's. I hope that's changing, but I'm pretty reluctant to believe it.


The amdgpu driver really changes that. With NVIDIA you have to choose between an open source driver (nouveau) that has quite some problems or a proprietary driver that does not really integrate well with the rest of the platform. E.g. for a long time you could not use Wayland with the proprietary NVIDIA drivers because they had implemented a different API for device memory allocation (via EGLStreams) [1]. In contrast, AMD is actively contributing to the open source amdgpu driver and builds their proprietary AMDGPU-PRO driver on top of that.

I recently switched from an NVIDIA Quadro card to an AMD FirePro card. I use GNOME/Wayland and the difference is quite big: with NVIDIA on nouveau, there was regular flickering (at random moments), artifacts, and other problems. The FirePro with the amdgpu driver, on the other hand, works completely flawlessly on my machine.

[1] https://www.phoronix.com/scan.php?page=news_item&px=XDC2016-...


> I have recently switched from an NVIDIA Quadro card to an AMD FirePro card. I use GNOME/Wayland and the difference is quite big, with NVIDIA on nouveau, there was regularly flickering (at random moments), artifacts and other problems.

nouveau was always pretty awful, but the nvidia binary drivers always Just Worked, across many models for many years, whereas with AMD it's always been a crapshoot whether a given card would work properly or not.

The binary drivers always kept up with the important things (indeed they were well ahead of AMD in terms of doing accelerated video decoding on linux via Xvmc and later vaapi, their xinerama integration was always better...). Sure, they don't support Wayland, but that's a solution in search of a problem; if and when there's actual value to be had by using it I have confidence that nvidia will support it.


Just worked? The amount of security advisories for Nvidia kernel module or Nvidia X module are staggering.

X is also insecure from the ground up, that alone is enough incentive to move away from it.


> they don't support Wayland, but that's a solution in search of a problem

Not

- if you don't want any GUI application to be able to read keystrokes/mouse events/screen grabs of any other GUI application,

- if you want per-display scaling factors,

- if you want a tear-free desktop experience.


I'm currently building out a CUDA cluster and I can assure you that Nvidia drivers don't "Just Work"

If AMD is worse, they're failing to clear a pretty low bar.


I'm on a 1070 for DL tasks with the proprietary driver and it segfaults Cinnamon every few hours, probably when the acceleration for its graphical effects kicks in. It can be restarted, though. I know it is due to the driver because every time the kernel gets updated, Cinnamon keeps failing to restart itself until I restart the entire system. This never happened before switching from Radeon.


Their new drivers (AMDGPU) are being slowly integrated in the Linux kernel, no closed-source binaries.


Small correction: the amdgpu driver has been integrated and working in the upstream Linux kernel for many years now.

The one chink recently was the new display code, which was a requirement for Vega (but not for earlier generations), and which was a genuine open sourcing of an existing, Windows-focused code base. That process took longer than one might hope for, but we're clearly over the hump now with it going upstream.


AMDGPU beats Nvidia proprietary blobs on every measure. Stability, Performance (Vega 64s hanging with 1080Ti in some games), ease of installation. Especially now that both the kernel developers and AMD are making a concerted effort to integrate the driver into the kernel.


With OpenCL being replaced with Vulkan, things will get in shape.


The future of GPUs is virtualization (i.e., where a single GPU can power multiple VMs at almost-native performance). When the mining profits start to dwindle, AMD is going to need something to fall back on. They should start putting MxGPU on their consumer cards. Intel already sees the future and has enabled GVT-g on all their integrated GPUs.

AMD had slideshows describing the feature on the new Vegas, but decided to disable it before release. This alone stopped me from building a threadripper + vega machine. Now instead I'm waiting for Intel to add more cores, then I will use their integrated graphics between multiple VMs.


What do you use integrated graphics for on VMs? Genuinely interested, I haven't used VMs beyond web dev and occasionally trying out some new distro of Linux.


For those using an IOMMU, integrated GPUs are a wonderful way to separate your host machine (which uses the integrated GPU exclusively) from your virtual machine (which uses the dedicated GPU exclusively), such that you can work on a Linux host and game on a Windows guest with (near?) native performance.
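The host-side setup this describes is usually done with VFIO passthrough. A minimal config sketch, assuming an Intel CPU and the GRUB bootloader; the PCI IDs below are placeholders for a dedicated GPU and its HDMI audio function, so substitute the real ones from `lspci -nn`:

```shell
# 1. Enable the IOMMU and reserve the dedicated GPU for the guest at boot.
#    (In /etc/default/grub, then regenerate the config, e.g. with update-grub.)
GRUB_CMDLINE_LINUX="intel_iommu=on vfio-pci.ids=1002:687f,1002:aaf8"

# 2. Alternatively, bind the card via modprobe config instead of the cmdline:
#    /etc/modprobe.d/vfio.conf:
#      options vfio-pci ids=1002:687f,1002:aaf8

# 3. After reboot, verify the device was claimed by vfio-pci before
#    handing it to the VM (libvirt/QEMU):
#      lspci -nnk -d 1002:687f   # should show "Kernel driver in use: vfio-pci"
```

The host then drives the display from the integrated GPU, while QEMU/libvirt attaches the vfio-pci device to the guest as if it were bare metal.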


I wonder if AMD will enable SR-IOV for their desktop GPUs.


> AMD is a clear winner now on Linux.

It depends what you value. I've consistently bought Nvidia hardware because of the proprietary drivers.


I had an admittedly very old ATI card that I was using in my Linux box while saving my pennies for a newer Nvidia card. I had to give up on it and use an equally old iMac (2008-ish) because the ATI card was locking up my desktop with ring 0 errors nightly (if I ran something that required the 3D hardware, 2D was fine), and no amount of searching could find the solution. That put me off an AMD GPU. I would love to support something else in the ecosystem, but sadly I just don't have the dollars to waste on new hardware that may not work.


Until recently, it was a sensible approach for Linux gamers. Today - not anymore. AMD is just better overall.


The several dozen people playing AAA games on linux must be thrilled.


Your sarcasm is misplaced because you didn't pay attention to the growth of the Linux gaming market. Check contributions to Mesa by Valve, Feral and others.

But more to the point - sure, Linux gamers are quite excited about all the work that's going now into the open graphics stack.


Amusing that you call it sarcasm. Linux gaming has flatlined at 1% or so for years without any noticeable changes.

I just checked the last steam hardware survey, linux is at 0.60%.

The Linux Steam hardware push (Steam Machines) has pretty much been a flop. Ubuntu is ditching Unity.

I sit in front of linux desktops at home and work. I do occasionally game some under linux. I have a linux phone (which has a linux kernel). But I fully realize it's a niche and feel lucky when any game comes to linux. I don't really see any reason for any real optimism.


Ubuntu ditching unity has nothing to do with anything, what am I missing?


It shows that Canonical doesn't have the money to support the desktop like before and rather focuses on server / IoT instead.


It's much more a sign of their failed mobile strategy, which hinged on UI convergence between the desktop and mobile versions of Ubuntu. With all that gone, pooling their efforts with the rest of the Gnome-using world only makes sense (much of the Unity desktop also consisted of appropriated Gnome components).


I think your conclusion is a non sequitur. It does not show anything about Canonical's financial incentives or their strategy; your statement is basically an assumption about what the reason is.

Further I would argue it is irrelevant to the initial argument.

What has Canonical to do with it? Neither "Linux", "Linux Desktop", "Linux Gaming" or "AMD graphics under Linux" are in any way tied solely to Canonical or what Canonical does. Again, am I missing something?


> It does not show anything about Canonical's financial incentives or their strategy, your statement is basically stating an assumption what the reason is.

Fair point.

> What has Canonical to do with it? Neither "Linux", "Linux Desktop", "Linux Gaming" or "AMD graphics under Linux" are in any way tied solely to Canonical or what Canonical does. Again, am I missing something?

Ubuntu is the only officially supported distribution for Steam and GOG (they also support Linux Mint, but it's based on Ubuntu).


RedHat and Valve do way more for Linux gaming than Canonical. So I agree with the above, Canonical's decisions aren't affecting Linux gaming that much.


If that was the case, why doesn't Valve support Fedora or RHEL/CentOS, but Canonical's Ubuntu?


Because Valve assumes the majority of users are on Ubuntu or derivatives. Which doesn't contradict what I said above. Red Hat developers contribute a lot to Mesa; Canonical isn't exactly known to do that.


If the majority of Linux gamers use Ubuntu, that means that Canonical has something to do with gaming on Linux ;)


Something yes, but not improving it directly. Besides, Ubuntu proper isn't even the most used distro probably. Mint is likely more used.

My point is, those who work on Mesa (OpenGL/Vulkan) and the Linux graphics stack have a far more direct impact on Linux gaming (they are fixing bugs that affect games, improving performance, adding new functionality, and so on).


> Linux gaming has flatlined at 1% or so for years

You didn't pay attention either. The growth was well described here: http://boilingsteam.com/linux-gaming-in-2016-the-good-the-ba...

And that was last year, since then it only improved. And please, stop using Steam hardware survey for any Linux market estimation. It's useless.


> The growth was well described here: https://boilingsteam.com/linux-gaming-in-2016-the-good-the-b...

That actually says (if you follow the links to try to find the actual claims): The Linux Market share on Steam is about 1%, Mac is about 4%. They are relatively stable around that. But obviously more people join Steam every day.[1]

So... about 1%, using Steam to estimate (not steam hardware though).

[1] https://boilingsteam.com/our-fifth-podcast-with-feral-intera...


It's referencing the same useless Steam survey for that quote. See above.


Exactly!?

There is no evidence presented anywhere which disproves that. Your claim was that it showed evidence of growth, but if you read it then it turns out it is using the steam survey, and shows no growth in percentage terms at all.


The article above showed evidence of growth, read it again.

And there is actual sales data that comes from developers. It's very different from those survey numbers.

Also, since there is no info on the methodology of that survey, you can't even know what it means. I've heard from many Linux users that they never got the survey while using Steam on Linux, but did get it while using it on Windows, for example. It never even comes up in Valve's own SteamOS, so it's clearly not something Valve put a lot of thought into.

So, I'll stand by what I said. Data from that survey is useless as is and should not be applied for any market evaluation.


> The article above showed evidence of growth, read it again.

Show me a quote. There is a larger number of titles being published on Linux, but there is nothing about sales or usage.

> And there is actual sales data that comes from developers. It's very different from those survey numbers.

In that article or elsewhere? Can you point to it please?


Larger number of titles corresponds to growing demand. That's a natural dependency.

There were several articles with numbers from developers on GOL: https://www.gamingonlinux.com

Search for sales statistics or something of that sort there.


So.. not a single quote, number or anything.

The OP's comment "The several dozen people playing AAA games on linux must be thrilled." seems pretty realistic.

Increased number of games is simply because it is close to zero-effort for most game engines to press the button and deploy to Linux Steam. That doesn't mean anyone is actually playing them.

I had a quick look at the GOL site and I didn't see anything obvious claiming growth. Ironically (given the topic of this HN story) I did see this:

> Get ready to become a neural detective as 'Observer' is now on Linux, AMD not supported.... I spoke with Aspyr Media, who confirmed to me the team has "currently no plans to support AMD at this time for Observer".[1]

Yet, clearly gaming on Linux will take off any day now... (And this is from someone who runs a Linux desktop computer)

[1] https://www.gamingonlinux.com/articles/get-ready-to-become-a...


> Increased number of games is simply because it is close to zero-effort for most game engines to press the button and deploy to Linux Steam.

It's far from zero effort. Besides, engines making Linux support easier is itself driven by demand. But hey, legacy execs would rather talk about how Linux gamers don't buy games, instead of actually making games for Linux.

See https://twitter.com/icculus/status/923201260819550209


"game streaming services" is a non-market, sorry. Maybe in an investor's checklist of "potentially interesting futuristic-sounding stuff", but nothing that compares to, say, having a chip in every Xbox One and PS4, and dominating game development mindshare for the next years.


Depends on the margin. Nvidia pulled out of gaming consoles due to the margin MS and Sony dictated. AMD was fine with fulfilling demand at almost any margin.


The long-term effects are worth much more than the margin on each sold console.


What are these "long-term effects"?


Getting game developers familiar with AMD hardware. And in the Xbox One X's case: getting TV manufacturers to support FreeSync.


Inside of gaming, AMD's open source AMDGPU driver + Mesa outperforms similarly priced Nvidia cards with their closed stack.

I find it odd that AMD can get the drivers right for gaming but that's about it.


Citation? All the comparisons I've seen still have nVidia way ahead


AMD's open-source drivers (new GPUs: amdgpu; old GPUs: radeon) are way better than NVIDIA's open-source driver (nouveau) and are competitive with AMD's own closed-source drivers (new GPUs: amdgpu-pro; old GPUs: Catalyst), while losing to the NVIDIA closed drivers [1].

If you want good performance using open-source drivers, AMD is the way to go with Linux [2].

[1]: An old comparison (Jan 2017): https://www.phoronix.com/scan.php?page=article&item=mesa171-.... Nowadays the performance is even better.

[2]: Open-source drivers are important not only because of the FSF and all, but because they follow the advancements in the Linux graphics stack much more closely. Things like KMS and Wayland work much better with open-source drivers than with the NVIDIA proprietary drivers. Even simple things like Xrandr are randomly broken in NVIDIA drivers.


Open-source-only is an artificial comparison to make; nvidia's open-source drivers get very little development attention precisely because their official drivers on Linux are so good that most people have no reason to use the open-source ones. What matters to most users is performance, stability, and features under the best available driver - and in those terms nvidia still wins on Linux. (At least IME - e.g. my experience is that Xrandr was much more reliable for nvidia than for AMD)


It depends; the open-source driver brings a better desktop experience in general, even considering only NVIDIA's (nouveau), since you don't need the extra performance in 90% of cases. Things like KMS only became available in recent releases of the NVIDIA binary drivers, are still somewhat buggy, and are more difficult to use (installing a binary driver and including the module in the initramfs vs. completely plug-and-play support with open-source drivers; the work can be automated by distro maintainers, but it's still not the same thing). Other things like Wayland work so slowly with the NVIDIA binary drivers that they are simply unusable. Another example is GPU switching: for some time Linux has had very good dynamic GPU switching support using PRIME, but the NVIDIA proprietary drivers still do not support it (maybe they can't?), so NVIDIA ends up reinventing the wheel, and their implementation is really bad (vsync issues and no dynamic switching; you either start your whole X11 session on the iGPU or on the dGPU, so it is kind of useless).

NVIDIA binary drivers really only win on performance. Even the features in NVIDIA drivers tend to be buggier: e.g. I use compton as my compositor. It needs a very old OpenGL version (like 1.1 or 2.0 capability), and the NVIDIA drivers still mess it up, so I need to activate some workarounds in compton to get a usable desktop. Mesa (used by the open-source drivers) has a much better OpenGL implementation, and I can basically get a glitch-free desktop without workarounds. I also remember getting random glitches in Chrome and GNOME Shell, just to cite more examples.

So it is not an artificial comparison. Want performance and CUDA? Yeah, go with NVIDIA. Want just a stable, modern desktop (Wayland and glitch-free compositing)? Open-source drivers are the way to go.


Your older Phoronix link misses out on the large performance increases that were made this year once OpenGL 4.5 support was completed.

https://www.phoronix.com/scan.php?page=article&item=radeonsi...

The AMD open source driver is just about on-par with their closed source Windows driver. One of the benefits of Open Source is also being utilized now. Game companies like Feral Interactive, who develop many Linux ports, have been improving the driver's performance for their games.


I just wanted a comparison between NVIDIA binary drivers and AMD open-source drivers. I know the comparison is old, I even added a footnote explaining this, saying that current performance of open source-drivers is even better.


Comparing amdgpu+radeonsi to nouveau isn't really fair. Nouveau can't use proper reclocking most of the time because Nvidia are being jerks.


Nvidia can play the game however they want. But they were not pushed into that situation. Their choice, they get to own it. I think it’s a fair comparison.


Nvidia isn't involved in nouveau, it's developed completely by the community. Since it can't use hardware in full, there is no point to compare it to radeonsi performance wise.


I don't think anyone would say this with any negativity towards the folks who are developing the nouveau driver. Given the circumstances that they are working under (no useful help from the vendor), they have done heroic work.

But I think you are missing the point here and that is: if you want to buy a GPU now for a Linux machine and you don't want to use a big proprietary blob, AMD is by far the best choice, because the amdgpu driver outperforms the nouveau driver.


> if you want to buy a GPU now for a Linux machine and you don't want to use a big proprietary blob, AMD is by far the best choice, because the amdgpu driver outperforms the nouveau driver.

No doubt, I said so explicitly elsewhere in this thread. I.e. that on Linux Nvidia will lose its current dominance.


Sure, it makes sense, since for most people the graphics driver is a means to an end. Few users are fans of a graphics driver based on how commendable its progress is relative to its circumstances. Though that is of course a good and valid POV too.


Not intended as a slight against nouveau. I’ll spare you a political rant.


In The Witcher 3 under Wine, AMD with Mesa beats the Nvidia blob by a huge margin. Nvidia usually tries to optimize individual titles by cheating, substituting shaders, etc. In conformant OpenGL tests, AMD/Mesa is already close, if not better. The Mesa developers did a great job over the past year, and they are still working on improving performance.


I think he's basing it on price-performance ratio. Phoronix tests show that NVIDIA's closed OpenGL/Vulkan drivers are still ahead of AMD/Mesa's open graphics stack. On the bright side, the relative transparency of AMD's graphics development does make maintaining the FOSS drivers easier as a kernel maintainer (although I rarely send patches to the graphics subsystem, I have friends who do).


Can confirm. Dealing with AMD drivers made me switch to Windows.


They've recently been trying to share a lot of code between the Windows and Linux drivers. That should help them stay closer to parity.


Where did you get the $110B number from? https://en.wikipedia.org/wiki/Nvidia shows something quite different. Nvidia is nowhere near Intel.


Current share price * outstanding shares. Market cap: $119.21B.


Some of that is my cash. I bought into Ryzen recently, breaking from a decade of Intel on all my home machines. The chip is working. It is very fast. Maybe I'd get a few more fps from Intel, but not at the same price.


I've been AMD only since 2004. Feels good.


Do you run Linux with your AMD cards or just Windows? Wondering what the Linux experience is like these days with all the Linux driver improvements AMD has made.


I recently bought an HP laptop with an AMD A10-8700P APU and was very surprised to find that basic display functionality is very glitchy under Linux (the backlight would flicker, or in some cases wouldn't work at all). I tried Debian unstable first, then Ubuntu (17.04 and now 17.10), and none of the vanilla kernels in these distros worked for me with the amdgpu driver, so I'm now running 4.9.51+ from https://github.com/M-Bab/linux-kernel-amdgpu-binaries (the most recent versions crash when I use HDMI output).

You can read more about it in:

* https://www.reddit.com/r/linux/comments/6m2jvn/eli5_of_amd_d...

* https://www.reddit.com/r/linux/comments/754uh6/update_on_dcd...

* https://www.phoronix.com/scan.php?page=news_item&px=AMDGPU-1...

* https://www.phoronix.com/scan.php?page=news_item&px=AMDGPU-D...


I'm running only linux. All seems well.


same. TR 1950X over here.


What OS are you running on it, and how has your stability been?


Ubuntu 17.10. Also running a Nvidia 1080 Ti. Everything is liquid cooled. Works like a charm


(Not OP.) FreeBSD has been rock solid on 1950x for me. I've attempted to reproduce the "kill ryzen" reliability problem reported on some of the desktop CPUs and been unable to.


Is there support for Bhyve?


Yep.


(not OP) I run OpenSuse 42.3 and it works flawlessly.


I don't understand why this good news translates into the stock being down over 12% after hours.


The direction a stock moves after an earnings announcement has nothing to do with how good the earnings are in absolute terms. What matters is where they ended up relative to what the market was expecting. If AMD announced a 300% increase in earnings, but the market was expecting 400%, their stock is going to go down because 400% was already priced in. Similarly, a company's revenue can be down 300%, but if the market was expecting it to be down 400% the stock will rise.
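As a toy sketch of this point (the numbers are purely illustrative), the sign of the move tracks the surprise relative to expectations, not the absolute result:

```python
def price_reaction(actual_growth: float, expected_growth: float) -> str:
    """Direction of a post-earnings move in this simplified model:
    only the result relative to what was already priced in matters."""
    surprise = actual_growth - expected_growth
    if surprise > 0:
        return "up"
    if surprise < 0:
        return "down"
    return "flat"

# A 300% earnings increase against a 400% expectation still disappoints:
print(price_reaction(3.00, 4.00))   # -> down
# A smaller decline than feared can lift the stock:
print(price_reaction(-3.00, -4.00)) # -> up
```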


In this particular case, it has nothing to do with the earnings of the previous quarter. The stock is down after hours due to guidance for the current quarter. Investors were not expecting a 15% drop in revenue in Q4.

Personally, I think most investors are over-reacting, and this is a good thing to keep analyst expectations in check. They are really getting out of hand with some of their forecasts on quarter-to-quarter performance.


That's not it either. Q4 guidance has actually gone up over the previous estimates. Because their business is cyclical, it should be lower than Q3.


I agree. The new guidance is still higher than the last Q4, however, it is still lower than what was forecasted.


Having negative revenue is technically possible, but the example of a company’s revenue being down 300% or 400% seems extremely unlikely. ;-)


Yes but you can at least say that it is equally likely for a company's profits to be down 101% as it is to be down 400%.


Profits can be negative (and often are), that's not the same as revenues being negative.


What do you mean?


In other words: your share price is a function of the emotions of fickle investors and their expectations (regardless of plausibility) rather than how well your company actually did.


Share price is an estimation of future performance, not past or current performance. Specifically it's supposedly the net present value of all future cash flows, discounted appropriately.
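A minimal sketch of that discounting idea (the cash flows and rate below are invented for illustration):

```python
def npv(cash_flows, discount_rate):
    """Net present value: each future cash flow is discounted back to
    today at the given rate (cash_flows[0] is one period out)."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Two identical future cash flows are worth less than face value today:
print(round(npv([100.0, 100.0], 0.10), 2))  # -> 173.55
```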


How do you respond to the comments of stock market investors that suggest the market is frequently irrational?


The market is inherently volume-weighted, and there is much more "smart money" than "dumb money".


To give an example of something a smart investor would do, Warren Buffett supposedly only invests in industries he understands in depth:

http://www.investopedia.com/articles/05/012705.asp

What proportion of investors would you suggest show such restraint? Over half?


The proportion of "investors"? Low. The proportion of market capital allocation, among people doing active allocation for long-term buy and hold strategies? Very high.


No. The change in your share price is a function of the change in how well your company is expected to do. The absolute value still corresponds to how well you're doing/expected to do.


Well, yes, but that's not the point being made: the stock market aggressively prices in expectations ahead of time to avoid losing money. The reason the stock goes down is because the gain is already priced in!


That applies to pretty much all domains. From choosing an employer, restaurant, movie, car to choosing an editor, programming language, keyboard, etc. Humans, some more so than others, trick themselves into thinking they are rational, or that their decisions are firmly grounded.


Because AMD also said this: "For the fourth quarter of 2017, AMD expects revenue to decrease approximately 15 percent sequentially"


The market is all about growth, and they're projecting a pullback next quarter. It's also already priced in: the stock is at about 200% of last year's price ($7.50). Investors already expected this news; they only beat estimates by a little.


A stock price doesn't reflect how profitable you are today, it reflects how profitable you'll be tomorrow.


Markets overreact, especially when there are weak hands that were looking for an earnings pop. I wouldn't be surprised to see it only close down 6% tomorrow.

IMO their guidance was playing it safe, and they will outperform it. They've done that the past few quarters. It's also a sequential decline, not YoY.


The market can remain irrational for much longer than any one individual can remain solvent.


Buy on the rumor, sell on the news.


Short-term investors took the profits they made over the past few months and got out, not expecting even bigger growth over the next few months. But that could change in a few months' time, when AMD prepares to launch Ryzen 2.


People in the stock market care more about short term gains than long term ones.


There are definitely some people that care about short term gains, but there are plenty that are in it for the long term. Virtually everybody with a 401k or mutual fund is in it for long term gains, and I would imagine the amount of people in that basket far exceeds the number of day traders.


Stock price is all about expected future earnings.


Article with figures: http://markets.businessinsider.com/news/stocks/Advanced-Micr...

$110M profit vs. $27M, year on year for the quarter. Revenue is up 27% over the same period.


So are machine learning framework developers going to stop being so subservient to CUDA and start supporting OpenCL now?


I would give the framework developers some credit, they shipped and the field made a lot of progress on their tools. Getting good performance out of GPUs (CUDA or OpenCL) is quite difficult, cuDNN has historically been the only good low-level deep learning library, and NVIDIA makes great hardware. My company is building a fully open source low-level framework called PlaidML bringing OpenCL support and other benefits to the existing frameworks. We're starting with Keras, first code was posted last Friday: https://github.com/plaidml/plaidml


And on top of that, they made Intel up its game. Honestly appreciate that!


I really hope Intel doesn't fix its game before AMD and ARM seriously sandwich them out of the consumer market.

Five years of decline in PC/x86 sales. Intentionally terrible product lines from a monopoly not wanting to cannibalize any higher pricing potential. Insubstantial profits available if they ever get their mobile chips on track after some terrible attempts. CPUs based on the cheery forecasts of the highs of five years ago should be hitting the end of their pipelines. A Microsoft that isn't really sure if/why it still does desktops.

As someone who has been more occupied with ARM CPUs, I wonder what someone paying closer attention to Intel would notice, and I look forward to being able to scale straight up from ARM64 to genuine AMD64.


> Five years of decline in PC/x86 sales. Intentionally terrible product lines from a monopoly not wanting to cannibalize any higher pricing potential.

I see why. I had a three-year-old Retina iMac that was a nightmare to set up (https://jakeseliger.com/2015/01/01/5k-retina-imac-and-mac-os...) and now have a current-gen Retina iMac that was easy to set up, which is a great improvement, but the day-to-day performance increases are small. The screen is better but not by much. The GPU is apparently much better but little that I do is GPU constrained.

Laptops have gotten lighter, at least.


What about battery life? It seems the norm is 8h of usage. I guess most of the effort went into CPU power reduction.


I hope Intel does step up their game, but assuming that there's healthy competition with good ARM and AMD chips. Competition is good for the consumer.

But I also want to see ARM based laptops from both Microsoft and Apple (Though Intel has Apple by the ears with Thunderbolt until USB 3.2 becomes a viable alternative on the low end.)


Dunno about Apple, but I find it unlikely from Microsoft. They have tried again and again to go beyond x86, only to find that they can't escape the weight of Win32.

Our best bet would have been ChromeOS, as it is a platform with no legacy to deal with. But even there, new models are now Intel-based rather than ARM.


Remember that Microsoft made a big bet on ARM based tablet/laptops only a few years ago. It was a massive failure and cost them hundreds of millions of dollars in write offs. I can't see them being very excited to try again.


Consumer devices are mostly ARM and AMD, there's not a lot to be sandwiched out of.

OTOH Intel has


Can someone please update the headline, or update the link? They're out of sync. Alternate link: http://markets.businessinsider.com/news/stocks/Advanced-Micr...


I'm very excited for the upcoming AMD EPYC dedicated servers and what they mean for the market. The number of cores and memory channels is unmatched by Intel.


And PCIe lanes.


AMD vs NVidia novice here.

My untested understanding has been that even though AMD offers better performance per dollar than Nvidia, it is hampered by not having the stellar software support Nvidia enjoys via CUDA (and consequently cuDNN), and by drivers that are not as fully compatible with games as Nvidia's. AMD has OpenCL, but I understand that most vendors/users are not exactly great supporters of OpenCL.

Since the hardware is just the Trojan horse for the software, and Nvidia has solved that problem, won't the lack of software support keep AMD from getting the same valuation as Nvidia? Unless AMD puts in great effort to get its software game as good as its hardware, is it still DoA?


(Disclaimer: I work on AMD's Mesa driver)

There's also ROCm, which is AMD's attempt at an answer to CUDA. It's hard to unseat the incumbent just because it's always hard to get people to change their software environment, but I think we're getting there.


Thank you. This is the first time I heard about ROCm.

I sincerely wish that ROCm adoption takes off - been a AMD / ATI fanboy - so have been wanting to see AMD make a great dent.

Any numbers / etc to get more info on adoption of ROCm? Are there any specific usecases where ROCm well suited / performs on par / or above par cuda?



Vega had a bumpy start: rather limited availability, a lack of custom-design models, etc. I'm still waiting for Sapphire to make one.


I just bought a Sapphire 56... is it just a rebranded AMD board?


Yes. Compare it to their Nitro models for Polaris chips.


I wonder how much of this is due to the huge demand for AMD GPUs for mining cryptocurrencies.


Let's see what they do with 60GHz WiFi.


What's up with the article title?

    AMD says its going to see a big drop in revenue, shares sink (AMD)
It seems to have an anti-AMD bias.


It's 100% factually accurate. After hours trading is down 10%, and drops in revenue were announced. How was it biased?


Well AMD has a cyclical business, and a Q3 to Q4 drop is always expected. But Q3 outperformed and Q4 guidance was raised, while both are massive increases from last year.

Simplifying an announcement that exceeds expectations and raises revenue trends to "a drop in revenue" is absurd.


This is a misleading title, and the poster has a clearly biased stake in AMD. Hopefully no one made any rash decisions to buy stock because of this post, as of trading day close, AMD is down 12%.


I am curious: how do you know that the poster has a clearly biased stake in AMD?

Business Insider did have an article with that exact title.

The results were announced after hours and the stock had already dropped before this posting.


Click the article link. What does the title say?


The correct question is: What did the title say at the time of submission?


No, the reason I asked you to look at the title and see that it had been changed is so you'd understand that the original title, "AMD Announces 307% Increase in Q3 Earnings", was misleading. If it was a good summary of the news, why did the author change it? Because it was fundamentally flawed.

Yes, AMD did see a significant increase in earnings, but the profit was far below expectations. Additionally, AMD during their investor statement lowered their guidance for Q4 profitability by over 10%. Do you see how such a title might be misleading?


What? This was posted (to HN) after trading closed.


Yes, you are correct. Earnings release statements for companies are generally released after market close. That doesn't mean trading stops. After hours trading doesn't end until 6PM EST.


If the earnings bump is really driven by the Threadripper CPUs, then the earnings mean very little, as AMD's costs are much higher due to a much larger die and lower yields. For every $100 worth of CPU sold, Intel is making orders of magnitude more profit.


Threadripper is a multi-chip module composed of two Zeppelin dies (the same 8-core die used in the desktop parts). There is no larger-die or lower-yield problem. This is the whole value proposition of the Zen microarchitecture: high yields and low per-CPU costs, with the typical MCM drawbacks (high inter-die latencies, duplicated functional blocks, etc.).
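The yield argument can be made concrete with a standard (and here purely illustrative) Poisson defect model, Y = e^(-D*A): two small dies that are binned independently yield far better than one monolithic die of the same total area. The defect density and die areas below are assumptions.

```python
import math

def die_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: probability a die of the given area
    contains zero defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.5       # defects per cm^2 (illustrative)
small = 2.0   # one 8-core die, cm^2 (illustrative)
mono = 4.0    # hypothetical monolithic 16-core die of twice the area

# With an MCM, each small die is tested separately, so defective dies
# are discarded cheaply instead of sinking a whole large die.
print(round(die_yield(small, D), 3))  # -> 0.368
print(round(die_yield(mono, D), 3))   # -> 0.135, always lower
```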



