Hacker News

As best I can tell, Intel kinda painted themselves into a corner with their 14nm+++ process. It produces really high-quality parts (no surprise), but that also means there is now an expectation of super high clocks on the Intel side. That's not realistic, however, and I highly doubt their 10nm can currently match their 14nm+++ process on clocks (yet). It looks like Intel is thus focusing 10nm on areas where clocks more representative of a new node won't be an issue: server and mobile.

I'll be honest and say I'm very curious to see how Intel tries to escape this trap they've set for themselves. Unless we see rapid gains in 10nm process quality up into the 4.5+ GHz range, I wouldn't expect to see 10nm desktop parts anytime soon.



Maybe that's perfectly okay? I don't care about power efficiency in my large stationary gaming PC, and I don't see any reason I should care.


Haha, obviously the largest CPU market is data centers, which pretty much only care about performance per watt. The gaming market is really only useful for PR and launching new architectures.


A fine point. I guess I'm not clear which audience we're discussing here?

Data centers presumably don't care at all about those super high clocks, so I don't think any downgrade in that area would be a problem for them, as long as overall metrics are good.


Both server and mobile have plenty of 14nm parts going forward. No, the truth is much simpler: the 10nm process is botched and yields very little, but they can't just straight up admit it's not working. They need to limp along with it until 7nm, which was developed independently and actually shows promise of working.



