The typical PC builder (or user) isn't a professional, so they rarely work with data where corruption matters. I edit photos and play games. The photos have backups, so I wouldn't sweat a corrupt bit here and there. The largest structure I use is my file system, but I try never to get too attached to data on a machine (it should be possible to reinstall a machine within an hour - data such as photos should live elsewhere).
People working with sensitive datasets or fragile data structures (large CAD files were mentioned) can certainly use ECC with good reason.
But for most home machines? Sure, if it doesn't cost 10% or even 3% extra, then I'd recommend it. So it would be great if AMD could pressure Intel into bringing ECC to consumer chips.
Otherwise, a normal builder should just put that $100 towards a better graphics card (if gaming) or a better monitor or whatever. The lack of ECC might make your game crash once in three years (it will crash 99 more times due to bugs and bad drivers...)
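For what it's worth, the "once in a few years" figure is roughly the right order of magnitude under some assumptions. A back-of-envelope sketch, where the FIT rate (failures per 10^9 device-hours per Mbit) is an assumed illustrative value, not a measured one - published field studies of DRAM error rates vary by orders of magnitude:

```python
# Back-of-envelope estimate of DRAM bit flips on a non-ECC machine.
# FIT_PER_MBIT is an ASSUMPTION for illustration only; real-world
# measurements differ wildly between studies, vendors, and DIMMs.

FIT_PER_MBIT = 1.0        # assumed failures per 1e9 device-hours per Mbit
RAM_GB = 16               # a typical gaming build
YEARS = 3

mbits = RAM_GB * 8 * 1024        # GB -> Mbit
hours = YEARS * 365 * 24         # total powered-on hours (always-on worst case)
expected_flips = FIT_PER_MBIT * mbits * hours / 1e9

print(f"Expected bit flips over {YEARS} years: {expected_flips:.2f}")
```

With these numbers you get a handful of flips over three years - and most of them land in memory that doesn't matter (free pages, textures, caches), which is why the visible symptom is an occasional crash rather than regular data loss.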
"Not minding" data corruption is a perverted mindset that comes from internalising Intel's market segmentation. It's not a law of home computing that we don't deserve the good technology.
Absolutely, but my point was that much of the time people don't even have important "data" on home-built PCs, so the question is how much added cost is actually acceptable for ECC. For me, on a gaming machine, the acceptable added cost would be in the very low single digits (percent). So while I'd welcome AMD going "ECC for all", I was just trying to rationalize not recommending it to home PC builders at current pricing.
> "Not minding" data corruption is a perverted mindset that comes from internalising Intel's market segmentation.
Disagree. The mindset of "assume my metal box can catch fire at any time" is absolutely the right one to adopt, and the more valuable your data is the more right the mindset becomes.
Intel's nasty market segmentation strategy doesn't make that mindset wrong.
Are you going to get the word out to every owner of Xeon(s) in the world that they're doing it wrong and should be using consumer gear if they truly value their data?
Or, is ECC in fact a good thing that's worth having?
The key here is "truly value", and how much. Intel's pricing forces you to think about how much you value what's in your RAM.
If you are only gaming on a Xeon, you should have put that money towards the GPU. If you are running databases, CAD, etc. without ECC, then the converse is true: you should have put more money towards ECC.
I can't see the controversy in recommending non-ECC to Intel buyers based on the workload in question.