> Recently I looked into AMD's 2400G APU with much superior graphics, but I don't understand what it's for. Sure, it can sort of run some games at mediocre frame rates on last-century 1080p monitors, but not higher.
Historically, AMD/ATI integrated graphics chipsets were the budget-friendly way to sidestep the performance void of Intel's integrated graphics. Everyone gaming on a budget knew this: you looked for AMD/ATI integrated graphics, or, if you had money to spend, NVIDIA offered discrete solutions.
Over the years, Intel stepped up its game and closed the gap. Even so, the Intel Iris platform often performs poorly where it's actually used, because it typically ships in higher-end machines paired with high-resolution displays.
Because pixel fill rate is a bottleneck for every graphics solution, the gains Iris provides are offset by the higher resolutions it tends to be required to drive.
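To see why resolution eats those gains, it helps to put rough numbers on the fill-rate argument. The sketch below just multiplies out pixels-per-frame times frame rate at common resolutions; the resolutions and the 60 fps target are illustrative assumptions, not measurements of any particular GPU:

```python
# Rough illustration of how display resolution scales fill-rate demand.
# Numbers are illustrative assumptions, not measured GPU specifications.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels_per_second(width, height, fps):
    """Pixels the GPU must fill each second at a given resolution and frame rate."""
    return width * height * fps

base = pixels_per_second(*RESOLUTIONS["1080p"], fps=60)
for name, (w, h) in RESOLUTIONS.items():
    load = pixels_per_second(w, h, fps=60)
    print(f"{name}: {load / 1e6:.0f} Mpix/s ({load / base:.2f}x the 1080p workload)")
```

Driving 1440p costs roughly 1.78x the pixel throughput of 1080p, and 4K costs exactly 4x, so an integrated GPU that is comfortable at 1080p can be pushed well past its fill-rate budget simply by the monitor it's paired with.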