I am guessing - is it because of HDMI and its abysmal bandwidth? Before, with VGA, the actual resolution of CRT monitors was limited by the RAMDAC. And 10 years ago I was already able to get 2048x1536@70Hz on a top-end 15" CRT.
I guess that since nowadays all digital signal is routed through a content-protection filter, the technology processing these streams wasn't performing well, causing stagnation in display resolutions for a long time (well, there was also a limit on the DPI of CRTs, and LCD tech was immature). I guess only nowadays, with HDMI 2.0, DisplayPort 1.3 or Thunderbolt, are we getting to the point where the digital signal can be "properly" controlled by content-protection chips, hence allowing 4K+ resolutions (though it's still a mess).
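For reference, a rough back-of-the-envelope of what that CRT mode demanded of the RAMDAC (my own numbers, assuming roughly 25% blanking overhead):

    # Pixel clock a VGA RAMDAC needed for 2048x1536@70Hz
    # (assumption: ~25% blanking overhead on top of the active pixels).
    w, h, hz, blanking = 2048, 1536, 70, 1.25
    pixel_clock_mhz = w * h * hz * blanking / 1e6
    print(round(pixel_clock_mhz))  # ~275 MHz, within reach of the ~400 MHz RAMDACs of that era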
1. DVI has supported higher-resolution displays for a long time: 1920x1200 on single-link since 1999, and 2560x1600 on dual-link. Note that this is more than '10 years ago'.
2. HDMI has supported 4K displays since May 2009.
Most of your post betrays a misunderstanding or underestimation of the technology involved. The bottleneck was never 'content protection filters' so much as the feasibility of building electronics that can handle higher-bandwidth signals over cables with no extra data channels (in a backwards-compatible manner), and the challenge of getting manufacturers to make hardware that would actually support it, at increased cost, for no practical benefit.
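To put rough numbers on point 1, the single- and dual-link limits fall straight out of the 165 MHz TMDS pixel clock per link; a quick sketch (the blanking factor is my own reduced-blanking approximation):

    # Single-link DVI: one TMDS link capped at a 165 MHz pixel clock;
    # dual-link simply doubles the available clock.
    # Blanking factor is an approximate CVT reduced-blanking overhead.
    def fits(w, h, hz, links=1, blanking=1.12):
        max_pixel_clock = 165e6 * links
        return w * h * hz * blanking <= max_pixel_clock

    print(fits(1920, 1200, 60, links=1))  # True  (~155 MHz, just under single-link)
    print(fits(2560, 1600, 60, links=2))  # True  (~275 MHz, needs dual-link)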
For older LCDs, it was simply that new technology is expensive. I remember when everyone I knew had 15" CRT monitors, and having a 15" LCD monitor was a luxury that almost no one could afford. It might be hard to remember, but consumer LCD displays were new once, and new technology is never cheap. On top of that, the cost of an LCD panel doesn't scale linearly with diagonal size, so going from 15" to 17" to 19" was a steep cost curve. Until LCD production was more consistent and people saw the point of buying them over CRTs, the market didn't really heat up, and so sizes and resolutions never grew.
As for DPI, LCD DPI depends on the size of the pixels, which are a mechanical element. In contrast, a CRT is a printed screen of phosphors: paint a smaller, more detailed phosphor grid, adjust the electronics for more precise control, and boom, higher DPI.
20Gbps is certainly a non-trivial amount of bandwidth to push through a cheap, consumer cable. The whole system is hard to make work for cheap; I would think HDCP chips would not be a bottleneck at all.
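A quick sanity check on where a figure in that ballpark comes from (the overhead factors are my rough estimates):

    # Uncompressed 4K60 at 24 bpp, plus blanking and 8b/10b line coding.
    w, h, hz, bpp = 3840, 2160, 60, 24
    raw_bps = w * h * hz * bpp                 # ~11.9 Gbps of pixel data
    on_the_wire = raw_bps * 1.19 * (10 / 8)    # blanking + 8b/10b coding overhead
    print(round(on_the_wire / 1e9, 1))         # ~17.8 Gbps, essentially HDMI 2.0's 18 Gbps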
It was/is because LCDs have fixed resolutions.
High-DPI displays were a niche, and with CRTs you could satisfy that niche without sacrificing the mainstream audience, since you didn't have to force people to run any specific resolution.
High DPI was a niche because of one thing, and one thing only: software. You had to have good eyesight and want a lot of screen real estate to even be able to make use of it.
Software didn't scale well. We had small laptop screens with 1920x1200 in the early 2000s. They barely cost more than lower-resolution displays (back then you typically had three different resolutions to choose from for every laptop; then came Apple). It was really apparent even back then that high DPI wasn't costly at all; it just wasn't mainstream enough for anyone to care.
Fast forward a decade and we have the exact same problem. Software still can't scale for shit. The only reason Apple was first with high DPI was that they controlled the software, and they took the easy way out: they just scaled everything up by 2x in each dimension (4x the pixels) and, look at that, exactly the same screen real estate with just higher DPI. Same thing with the iPhone 4 - the only reason they increased the resolution so much was so that they wouldn't have to bother with scaling ("retina" was just an added bonus). Remember that niche that sought large screen real estate and bought high-resolution CRTs in the 90s? Well, they are still waiting for a decent LCD...
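In sketch form, that "easy way out" is just a fixed 2x mapping from logical points to device pixels (illustrative only, not any actual Apple API):

    # One logical point maps to a 2x2 block of device pixels, so apps see
    # the same layout while the panel shows four times the pixels.
    SCALE = 2

    def logical_to_device(width_pt, height_pt):
        return width_pt * SCALE, height_pt * SCALE

    logical = (1280, 800)                    # what applications lay out against
    device = logical_to_device(*logical)     # (2560, 1600) physical pixels
    print(device, (device[0] * device[1]) // (logical[0] * logical[1]))  # 4x the pixels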
As for HDMI: no. For one thing, HDMI was never intended for computers at all, but the hype, in combination with small laptops, forced it upon us anyway. HDMI also came much later; the battle for high-resolution/high-DPI displays had already been lost (because of software). The real technical hurdle was the OSD and scaling chips. Again, the reason Apple was first with the 30" 2560x1600 display was that they were the first who could ditch both the OSD and all scaling from the monitor. It only had brightness adjustment (no OSD), and all of the scaling was done on the graphics card. That way you couldn't pair it with a regular PC - if you did, you would have needed another monitor to enter the BIOS, install the OS, enter safe mode, or game at any other resolution (which you pretty much had to). Of course, eventually most graphics cards could do the scaling, but Apple were the first who could assume the monitor would be paired with such a card.
That, and the fact that dual-link DVI was quite rare (hardly surprising, since there were no monitors on the market that used it).
Oh, and people, I hope you (not parent, but lots of others) didn't run 1280x1024 on your 4:3 CRTs. The only 5:4 monitors that existed were 17" and 19" LCDs. You should have used 1600x1200 or 1280x960 - that is, if you didn't want to stretch everything.
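The aspect-ratio arithmetic, for the record:

    # 1280x1024 is 5:4 and stretches on a 4:3 tube; 1600x1200 and 1280x960
    # match the 4:3 glass exactly.
    for w, h in [(1280, 1024), (1600, 1200), (1280, 960)]:
        print(f"{w}x{h}: {w / h:.3f}")  # 4:3 = 1.333, 5:4 = 1.250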