AMD Launches Carrizo (anandtech.com)
56 points by hanifvirani on June 3, 2015 | 57 comments


I wonder if stagnating CPU requirements might make AMD competitive again.

I have a 4-year-old i5 laptop that can handle 100% of general office/home use. Battery life and gaming are the only two weak points. AMD aren't going to compete with Intel on battery life, but with overall CPU/GPU power consumption falling, it's not as big of a problem as it used to be.

Wish they weren't stuck on 28nm still. It seems like Intel will be pushing 10 before they get down to 20/22... Maybe I'm just nostalgic for my overclocked Athlon 64, but I'd love to see AMD make a comeback, and the nm gap between AMD and Intel just seems like kneecapping the underdog.

Realistically they are going to be launching this against Skylake, and I can't help but think all the advantages they are promoting (HEVC and gaming performance) are going to get nullified by that.


It looks like AMD is skipping 20/22:

http://en.wikipedia.org/wiki/Zen_(microarchitecture)

http://www.pcgamer.com/amds-next-gen-zen-cpu-due-in-2016/

Seems like the next chip after Carrizo is targeted at 14nm fin-something.

Which, if it pans out, would put Intel & AMD on the same node for the first time in a while. Supposedly Intel's first 10nm part isn't due until 2017:

http://en.wikipedia.org/wiki/Skylake_%28microarchitecture%29


The issue with TSMC/GF "16/14 nm" is that not all important features have been brought down to that level, so the actual shrink is less pronounced than usual. Furthermore, Intel's FinFETs are likely to be more mature than the competition's, so I'd be wary of calling the two nodes equivalent. Close, certainly closer than in the past, but Intel might retain its lead a bit longer.


Sure, they aren't equivalent, but at least it's closer than 28nm :) Also, while TSMC/GF are fudging the whole "14nm" thing, I would bet Intel is too. Node names are all marketing, and Intel is not above marketing.


nm has been rather misleading for the past few generations; Intel is just as guilty as the other fabs.

Not everything gets shrunk in modern die shrinks.

My limited understanding is that the transistors might be getting shrunk, but the wiring connecting them is not; it's still at some larger number (65nm?).


CPU perf requirements may be stagnating, but perf/watt definitely isn't.

And it seems very unlikely for AMD to ever be competitive on that front.


That being said, that is exactly what Carrizo is about.


> It seems like Intel will be pushing 10 before they get down to 20/22...

Intel seem to be struggling though, with shrinking not working out as they planned. They've missed one upgrade round with the delay to Broadwell, and if the rumours are correct and they're releasing a couple of CPUs and then jumping to Skylake, that's quite a hit to ROI given the investment they've put into an 'unused' chip family IMO.


Broadwell was delayed, but Skylake is well on track. While there are not many Broadwell desktop CPUs worth buying right now (the Core i7-5775C and i5-5675C offer superb perf/watt though, especially the iGPU), the new round of mobile CPUs is still impressive and will certainly sell well before Skylake comes around.


Thanks, that's good to know. Guess I'm a bit sore as I held out for a 97-chipset motherboard for months (when I really needed a new PC) with the aim of using Haswell, then switching to Broadwell and future-proofing. With the lack of Broadwell desktop CPUs I could've saved money, gone for an 87 chipset, and planned a Skylake replacement...


Is Skylake on track for laptop OEMs to ship new devices in 2015?

Lenovo released Broadwell laptops in Q1; it is hard to believe they will release Skylake before 2016.



Guess it's their time for another Pentium 4/Bulldozer mistake?


> AMD aren't going to compete with Intel on battery life

Why not?


Intel's ever-widening process advantage makes that unlikely, regardless of processor design. Unfortunate, but it's the result of being stuck on 28nm for 4 years and counting now.


I wonder what would happen when the process shrinks stop in, say, 5 to 10 years.


Will it? Sure, we're getting close to the physical limits of our current types of designs, but a processor that uses light instead could get significantly smaller and go 3D. Perhaps we will end up transitioning to systems that use light and continue shrinking.

Not sure what happens after that though.


You mean "when" :)


These days when I read news like this, my first response is: does AMD have any chance to survive at all? The technical details of any new chip are secondary.

Dr. Su did not have a good track record and I never got why she was picked as the CEO, but it's not all her fault; AMD has been in decline for years. I just hope someone will buy it before it totally collapses.

Additionally, Intel is battling with ARM/Samsung etc., and the need for AMD as a competitor (i.e. to avoid monopoly litigation) is gone too.

Sigh.


> These days when I read news like this, my first response is: does AMD have any chance to survive at all? The technical details of any new chip are secondary.

I know what you mean. When I started reading my first thought was "ugh, why don't you guys just give up" which is a terrible sentiment for me to have. Carrizo looks promising but I just can't see them gaining a whole lot of market share back from Intel.

When I used to build my own computers I always picked the best processor and graphics card at the time, no brand loyalty at all. It was about the time AMD was kicking Intel's butt and it made me so happy to use the "underdog's" products. Now it's probably been a decade since I've used a single AMD processor. It makes me sad.


I understand completely what you mean. CPUs and graphics cards are hard problems to solve. For a long time, I was very happy to use AMD, because I felt that I was helping to keep the desktop market at least at two players (which is also crazy close to a monopoly). I still use AMD graphics cards for this reason.


If I were a judge, the ARM thing would not excuse them from a monopoly. For me it's a bit like saying you have competition because Intel has GPUs too. You can do general computing on a GPU too, but it's a different playing field.


To see just how little Intel cares that AMD exists in the PC chip market these days, just look at how they price their Atom-based chips. Some of them go up to $160 [1] - for what is effectively only a $30 ARM chip competitor.

Intel can afford to do this because as far as it knows the "AMD competition" doesn't exist at that level. If it did, Intel wouldn't dare to price a $30 chip five times higher or replace "Core"-based Celerons and Pentiums with Atom-based ones to trick 99% of its customers into thinking it's actually an upgrade.

[1] http://www.anandtech.com/show/9125/intel-braswell-details-qu...


I'd have thought that the Xbox One and PS4 GPU deals would have given them quite a boost, but it seems like they're still struggling these days.

Can anyone enlighten me on why those deals might not be helping as much as I thought they would?


> Can anyone enlighten me on why those deals might not be helping as much as I thought they would?

A good explanation can be found here: http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4... "Two years ago, in January 2011, Nvidia CEO Jen-Hsun Huang told reporters that the Sony-Nvidia deal had earned Nvidia $500M in royalties since 2004. The total number of shipped PS3 consoles by March, 2011 stood at 50 million according to data from the NPD group."


Thanks for the link.

I did think margins on any console GPU deal would probably be slim, but I didn't know it could be that slim. Seeing some hard numbers really helped.


Console hardware is always high volume and low margin; MS and Sony usually break even on costs or even eat losses at the start of a new console generation. For AMD, it was the volume they needed just to keep orders going to the fab and a cash stream of some sort, but actual profit is slim by the nature of that market.


My guess is that AMD probably wanted those deals so badly that their margin is not very healthy.


I thought AMD's revenue went up by not investing too much in "being the first with a 5% performance improvement" research... (more revenue in the server market, ...)


Peak 3D performance does not matter.

With Tao3D, I keep pushing graphics cards to their limit, on laptops and desktops. This raymarching example http://www.youtube.com/watch?v=GUMqT9W5BG8 runs fine on my MacBook Pro for about 20 seconds. After that, the heat becomes high enough to throttle the system down, and you end up with a very unpleasant "fast/slow/fast/slow/fast/slow" experience as the system tries to cool down its graphics chip.

So you really don't care about PEAK performance. What you care about is sustained performance, and power consumption in that scenario (not just running idle).


As a programmer, I have high hopes for HSA. I wish Intel would adopt it or join the HSA consortium and get on board. That would push APUs forward for everyone.


Intel's pushing OpenCL, which, it seems to me, gives most of the advantages of HSA.

OpenCL 2.0 includes unified CPU & GPU virtual addresses (SVM - shared virtual memory).
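
For anyone curious what that looks like in practice, here's a rough sketch of coarse-grained SVM with the OpenCL 2.0 host API. It's a toy example (error checking omitted, kernel not shown), assuming a 2.0-capable device and driver, not any vendor's particular stack:

    // Rough sketch of OpenCL 2.0 coarse-grained SVM (illustrative only;
    // assumes an OpenCL 2.0 platform, device and driver are present).
    #include <CL/cl.h>
    #include <string.h>

    int main(void) {
        cl_platform_id platform;
        cl_device_id device;
        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
        cl_command_queue q =
            clCreateCommandQueueWithProperties(ctx, device, NULL, NULL);

        // One allocation, one pointer, visible to both CPU and GPU.
        float *data = (float *)clSVMAlloc(ctx, CL_MEM_READ_WRITE,
                                          1024 * sizeof(float), 0);

        // CPU touches it directly (map/unmap for coarse-grained SVM)...
        clEnqueueSVMMap(q, CL_TRUE, CL_MAP_WRITE, data,
                        1024 * sizeof(float), 0, NULL, NULL);
        memset(data, 0, 1024 * sizeof(float));
        clEnqueueSVMUnmap(q, data, 0, NULL, NULL);

        // ...and a kernel would take the very same pointer via
        // clSetKernelArgSVMPointer(kernel, 0, data) - no explicit copies.

        clSVMFree(ctx, data);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        return 0;
    }

The point being that, as with HSA, the CPU and GPU share one address space instead of shuffling buffers back and forth.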


And so does CUDA, which usually outperforms OpenCL, plus the tooling for CUDA is just so much better with the Visual Studio integration.


Yes, but CUDA is Nvidia only - another trick in their anti-competitive bag.

I always get disappointed when I see something that's CUDA-specific. "Hey, that's an interesting machine learning package! Oh wait, it's Nvidia-specific."

With OpenCL, you can develop on laptop, test on workstation and deploy on actual servers. With CUDA, you're trapped.


> With OpenCL, you can develop on laptop, test on workstation and deploy on actual servers. With CUDA, you're trapped.

GPGPU developers usually prefer hardware with Nvidia GPUs inside, so you can do all this too. I wouldn't call Nvidia anti-competitive, just like I wouldn't use that word for Apple. They saw a niche (HPC/smartphones) when no one thought that market was attractive, jumped in with proprietary technology that others later tried to reproduce (OpenCL/Android), and kept the market leadership in terms of profits by iterating on their product.


Well, I'm developing on a laptop with a GTX 860M, so I can still develop on a laptop, test on a workstation (GTX 970) and deploy to actual servers (Tesla).


AMD has to stop naming their products like this.

I mean, look at Intel: i3, i5, i7. I can at least guess which one's better.

Yes, I do understand that Intel too has many different names that cater to different markets, yet somehow Intel's names are far easier for me to understand, they are far shorter, and if I want more details about the processor I can read what's after the "Intel-i7-xxxx" and figure out what it is.

I really want to see AMD succeed, I want to have more options when I want to buy a processor.


Oh come on, Intel's naming is terrible. "Better" isn't clear cut, but even if it were, i3/i5/i7 still wouldn't be consistent. Features like Hyper-Threading, VT-d, vPro, more cores and better graphics are sort of stochastically more likely to be in the i7 than the i5 than the i3, but it's never consistent.

See for example the bottom table on this page, titled "More Differentiation": http://www.anandtech.com/show/4083/the-sandy-bridge-review-i...

The i3-2100 does have AES-NI; the i3-2120 does not. The i5-2300 does not have VT-d; the i5-2400 does; the i5-2500K does not.

In that generation, i3s have Hyper-Threading, i5s do not, i7s do. That's for desktop chips though; for laptops it's different.

The latest generation, Broadwell, was slightly more consistent. But you can never be sure; I have to look up these suckers every single time. At least Intel ARK is nice.


> I mean look at Intel, i3,i5,i7. I can at-least guess which one's better.

If you're assuming an i7 to be better than an i5, you're wrong. Single-core clock matters. Some games are barely playable on a low-clocked i7, but work perfectly on a high-clocked i5.


Yeah, because games are all that matters to people on HN. Besides, an i7-4790K runs at 4GHz, and up to 4.4GHz (single-core turbo) natively, while having 4 full cores; show me an i5 that can beat it in games (non-overclocked, of course). Sure, some lower-clocked i7s might be slower than some i5s in certain tasks, but those are obviously still better at multi-threaded tasks than an i5.


They are also going to debut their new 300 series GPUs with HBM in two weeks at E3. https://twitter.com/AMDRadeon/status/605938931448737792


The 300 series will be slightly improved, repackaged 200 series chips without HBM, though. They will announce one genuinely new chip (Fiji-based) with HBM memory and a different name to compete with the 980 Ti/Titan X enthusiast cards.


> AMD Secure Processor

Can anyone explain how this works and why a user should want it? It seems like DRM on your device.


You could use it for DRM, yes, but that's not the sole use. The point is to put cryptographic keys and secret information somewhere out of direct reach of the CPU, to keep them more secure. As an example: Apple's iPhone stores all fingerprint data exclusively in its secure module.


Somebody other than the user has to put it there. Because if the user puts it there, he can read it too. Or is there something I don't understand?


No, it's write-only. You can ask "does this fingerprint match the one I stored earlier?", but you can't read out the original fingerprint.


> Somebody other than the user has to put it there.

No, the user can install software on the chip.

> Because if the user puts it there, he can read it too.

Not all communication links are two-way.
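
To make the idea concrete, here's a toy model of that kind of one-way interface. The names and the in-process "enclave" are purely illustrative, not AMD's actual Secure Processor API:

    /* Toy model of a "write-only" secure store: the caller can enroll a secret
     * and ask whether a candidate matches, but there is no call to read the
     * secret back out. Names are hypothetical, not AMD's real PSP interface. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    #define MAX_SECRET 64

    static unsigned char stored[MAX_SECRET]; /* lives "inside" the enclave */
    static size_t stored_len = 0;

    /* Host -> enclave: store a template. Only success/failure comes back. */
    static bool sp_enroll(const unsigned char *tmpl, size_t len) {
        if (len > MAX_SECRET) return false;
        memcpy(stored, tmpl, len);
        stored_len = len;
        return true;
    }

    /* Host -> enclave: "does this candidate match what was enrolled?"
     * Only a yes/no crosses the boundary; there is deliberately no sp_read(). */
    static bool sp_verify(const unsigned char *candidate, size_t len) {
        return len == stored_len && memcmp(stored, candidate, len) == 0;
    }

    int main(void) {
        sp_enroll((const unsigned char *)"fingerprint-template", 20);
        printf("match: %d\n",
               sp_verify((const unsigned char *)"fingerprint-template", 20));
        printf("match: %d\n",
               sp_verify((const unsigned char *)"something-else", 14));
        return 0;
    }

In real hardware the "store" side sits behind a separate processor and its own memory, so even privileged host software only ever sees the yes/no answer.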


I'm afraid some users accept DRM, and some businesses can require it for internal purposes. Personally, I consider its existence a threat.


Lack of VP9 support is bad for YouTube (and Google). I wonder if it would be too hard to support both codecs.


VP9 was always going to be dead in the water. There are just too many influential companies in MPEG-LA to compete with. With Netflix, Amazon and Apple moving to HEVC, and the PS/Xbox refresh rumored to have HEVC support, there is simply too much content, legal and illegal, to bother with adding VP9 support.

Plus, is anyone expecting a lot of YouTube 4K content?


> Plus, is anyone expecting a lot of YouTube 4K content?

Every enthusiast camera, high-end smartphone or the latest GoPro shoots 4K, more and more people have 4K displays, and prices are coming down. So yeah, I think there will be a lot of 4K on YouTube in the near future.


>VP9 was always going to be dead in the water.

What? YouTube has >70% of the online video market, and primarily uses VP9 [0]. To be fair, VP9 isn't commonly used outside of Google, but to discount the market leader in online video as "dead" is disingenuous at best.

[0] http://www.statista.com/statistics/266201/us-market-share-of...


YouTube doesn't primarily use VP9. It uses H.264 on iOS, Safari, IE, Flash, consoles, TVs and a lot of Android devices. Those combined are a sizeable amount of traffic.

And I would argue that the driver of higher-end content (e.g. 1080p, 4K) is not going to be YouTube but Netflix, Amazon, Apple and the illegal content that a lot of people watch on consoles.

Nothing has fundamentally changed to make this anything other than a repeat of what happened with H.264/VP8.


You're right, it is more complicated than I assumed. I use Chrome/Linux, where nearly 100% of videos are served as VP9. According to this [0], 61-69% of browsers support VP9, but I can't find a good source for what percentage of YouTube is actually served as VP9.

However, in the last year 25,000,000,000 hours of VP9 video have been served on YouTube [1]. Maybe this is a minority of web video, but it's hardly dead.

[0] http://caniuse.com/#search=vp9

[1] http://youtube-eng.blogspot.com/2015/04/vp9-faster-better-bu...


YouTube is using VP9 for 720p and 1080p, too. The file sizes are significantly smaller.


VP9 isn't limited to 720p+ resolutions: https://i.imgur.com/rM8WVcm.png

(Right click a youtube video and select "Stats for nerds" to get this info)


Not a word on Linux support, of course.



