I'd gladly buy one, or even four, if they just worked.
Unfortunately, high-res multimonitor setups are horrendously unstable, require specialized graphics cards (which usually sound like rocket jets), and have a mess of incompatible standards. I'd probably need to spend a week or two figuring out what graphics cards I need, upgrading my motherboard to have enough slots to take them, figuring out if there's a way to make it talk to my laptop, etc. Getting four 1080p displays working was enough work.
Basically, not a project I have time for.
Anytime hardware becomes a project, it only reaches 5% of the market. Most of the market is people who just want to get work done. That means word processors, IDEs, spreadsheets, PPT, etc. USB is fine even for multi-monitor 5k for normal work if you update the screen incrementally and downsample/compress video. Video games aren't necessary. 200-watt graphics cards aren't desired. Stability and ease-of-use are critical.
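A back-of-the-envelope sketch of that USB claim, in Python; the changed-screen fraction, compression ratio, and usable USB 3.0 throughput are all illustrative guesses, not measurements:

    # Back-of-the-envelope: can a USB-class link carry a 5K "office" workload
    # if only changed regions are sent? All workload figures are guesses.

    width, height, bytes_per_pixel = 5120, 2880, 3    # 24-bit colour
    full_frame = width * height * bytes_per_pixel     # bytes for a full repaint (~44 MB)

    changed_fraction_per_second = 0.25   # guess: typing/scrolling touches ~1/4 of the screen per second
    raw_update_rate = full_frame * changed_fraction_per_second   # bytes/s of dirty pixels
    compressed_rate = raw_update_rate / 4                        # assume ~4:1 compression

    usb3_payload = 400e6   # assume ~400 MB/s of usable USB 3.0 throughput

    print(f"full frame:      {full_frame / 1e6:.1f} MB")
    print(f"raw updates:     {raw_update_rate / 1e6:.1f} MB/s")
    print(f"compressed:      {compressed_rate / 1e6:.1f} MB/s")
    print(f"fits in USB 3.0: {compressed_rate < usb3_payload}")

Video obviously needs the downsampling/compression mentioned above; the point is just that ordinary desktop work leaves plenty of headroom.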
The trick would be making it Just Work. That means driver support (Linux, Vista, XP, MacOS, etc.). That means testing across a range of hardware and software. That means using a standard cable (USB or similar, not a pair of DisplayPort 1.3 with dual-link cables and a graphics card capable of virtual...). It also means performance testing for the stupid stuff (not issues of 60hz vs 10hz refresh, or 3d gaming, or even video -- just that there aren't funny issues where you wait 1 second for a screen refresh, or the system locks up for 30 seconds thinking).
I used a 39" 4k display for a while (before I moved and decided to use it as a "temporary" TV) and it "Just Worked" as I'd expect it to. While it's only 30Hz at 4k, that more than meets the requirements you put forth of word processors, IDEs, etc. It plugs in over HDMI and doesn't require the newer standard or any special cables to use. I ran that display along with a 1440p display off of a single Radeon 7850, which is neither super expensive nor loud, and for normal use there were no issues to speak of. Games were a little bit iffy at 4k, but 30Hz wasn't great for the most power-hungry games anyway. I just switched over to the other display.
It was great, I could have a ton of terminals open, a browser, text editor, a video playing and it didn't feel cramped. Essentially having 4 19.5" 1080p monitors without the borders.
Writing this out makes me want to take that back as my main monitor, I guess I'll have to find a TV to replace it.
>> Essentially having 4 19.5" 1080p monitors without the borders.
I myself have been debating whether to wait for a 40" 60Hz 4K display or to go with basically 4x1080p monitors.
While I'd just love to have a single cable, single giant monitor setup, I also have developed presbyopia in the past few years, and I have concerns about the abilities of my eyes to focus on the furthest edges of a large screen vs. having four independently tilted monitors. The ergonomic benefits of the 4 monitors might actually outweigh the mess of cables and borders/bezels.
Depends on what OS you use. If you use Linux, yeah, you're screwed. They are still arguing over how to support monitors that expose themselves to the OS as two monitors. The current answer is "that's dumb, maybe they'll go away." It is dumb, but they're not going away.
Over on Windows, Nvidia has some hacks in the driver to work around this. With a $70 current-gen card, I can drive two 4k monitors at 60Hz. One monitor connects via two HDMI cables; the other over DisplayPort MST. (I don't actually have two 4k monitors, but I did convince my computer that my one monitor was two to test this.)
You're right about getting 4 monitors working. Probably possible with an SLI setup, but the current generation of video hardware makes 1-3 monitors easy and 4+ monitors hard. 3 monitors is better than a few years ago, though.
Nvidia limits their consumer cards to 2 monitors on 1 GPU, or 3 monitors across 2 GPUs in a Surround (SLI) setup.
For their professional series of GPUs they offer Premium Mosaic, which allows up to 4 monitors on a single card [1].
AMD allows up to 6 monitors on a single consumer card with Eyefinity[2].
Of course, through the use of splitters and multiple cards it's possible to attach more monitors. I've personally set up a 12-display system off a single card. There are of course performance limitations when you only have one GPU and a large grouped display.
For what it's worth, I've had an easier time setting up multiple monitors in Linux than I have on Windows.
Wait, how is a monitor that appears as two to the OS a problem? We've had multi monitor support for a long time, this is just the panels/pixels stitched closer together, isn't it?
The problem is you'll have two copies of your desktop (or whatever you choose to have 1-per-monitor of) on one physical screen, because the OS thinks it's two monitors when it's actually one monitor.
To make this go away, all that clever support needs to be disabled, which is actually currently impossible in Linux!
Don't know why you are getting downvoted. I have three 3k displays (ASUS PB278Q) and I have frequent problems getting them to come back up after reboots or sleep. I've tried 5 different brands of DisplayPort cables, an MST hub, drivers, mixed config (DVI, onboard Intel, etc.). The AMD driver will report that there is not enough bandwidth; the only thing I can do to get them to come back up is unplug and replug the DisplayPort cable. Research online shows this is a common problem with high-res displays.
Do you (or anyone else) have any idea if this is getting better, and how quickly, if so?
I'm of the same mind that I'm incredibly willing to shell out a bunch of money on super high resolution displays (and even more powerful video cards to drive them) just as soon as I'm sure I'm not going to be on the painful part of the adoption curve.
DisplayPort works well enough for UHD. I'm not sure about this resolution, but it should be OK. Even a lowly Intel integrated GPU might be able to drive this for non-gaming applications. Visual Studio and Xcode already support 2x; I think Office and iWork do also.
No displays, past, present, or future, can run over USB, sorry.
So what you are saying is that we need a new standard, with wide adoption, that supports 2K and 4K (and soon 8K) monitors. I completely agree. And it should be a single cable as well. It would be best if it were a straightforward extension of an existing standard.
It probably uses Multi-Stream Transport, combining the signals from two DP 1.2 ports, something more or less like dual-link DVI but using 2 ports. Regarding graphics card performance, probably only high-end graphics cards from Nvidia and AMD (Quadro/FireGL) can drive a screen like this with decent performance.
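A quick check of why a single DP 1.2 link isn't enough on its own; the link rate is the standard HBR2 figure, and the blanking overhead is an assumed ballpark:

    # Why the panel needs two DisplayPort 1.2 streams (MST) rather than one.
    # Link figures are the standard HBR2 numbers; blanking overhead is a rough assumption.

    payload = 5120 * 2880 * 24 * 60 / 1e9     # Gbit/s of active pixel data (~21.2)
    with_blanking = payload * 1.10            # assume ~10% blanking overhead

    dp12_link = 4 * 5.4 * (8 / 10)            # 4 lanes x 5.4 Gbit/s, 8b/10b coding = 17.28 Gbit/s

    print(f"pixel payload:    {payload:.1f} Gbit/s")
    print(f"with blanking:    {with_blanking:.1f} Gbit/s")
    print(f"one DP 1.2 link:  {dp12_link:.2f} Gbit/s (not enough)")
    print(f"two DP 1.2 links: {2 * dp12_link:.2f} Gbit/s (enough)")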
For anyone else who hadn't heard of it: this is what the (never-defined) "MST" refers to in the article. I'd never heard of it: it's basically daisy-chaining, or using a USB-like hub, for DisplayPort.
Aside from what rando289 said about not reaching the theoretical maximum, you're mixing up units. You'd have to divide by 1000^3, as bits are always measured in proper SI units (1 kbit = 1000 bit).
Thunderbolt 2 does 20 Gbit/s.
And 5120 x 2880 x 24 x 60 bit/s divided by 1000^3 is 21.23 Gbit/s, so even in the best-case scenario (which is likely unrealistic), you wouldn't have enough bandwidth over TB2.
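The same arithmetic as a runnable snippet, using the 20 Gbit/s Thunderbolt 2 figure from above:

    # The figure above, as a runnable check against Thunderbolt 2's 20 Gbit/s.
    required = 5120 * 2880 * 24 * 60 / 1000**3   # Gbit/s, uncompressed 24-bit colour at 60 Hz
    thunderbolt2 = 20.0                          # Gbit/s

    print(f"required: {required:.2f} Gbit/s")    # ~21.23
    print(f"TB2:      {thunderbolt2:.2f} Gbit/s")
    print(f"fits:     {required <= thunderbolt2}")   # False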
DisplayPort 1.3 will solve the issue of course, but that doesn't help me personally much as the hardware I'm using is a 2012 retina MacBook (which I'd be theoretically willing to replace to be able to use the resolution we're talking about here) and a 2013 MacPro (which I won't replace this early), neither of which have DP 1.3 support.
I'd bet against it. Those kind of specs are almost never reliably achieved. Besides, "DisplayPort version 1.3, which was expected to be finalized in Q2 2014, will increase overall transmission bandwidth to 32.4 Gbit/s" wikipedia
I looked at 4K TVs recently, and if you look past the advertising they're all 4K at 30hz (i.e., totally worthless for viewing anything on) due to the limitations of HDMI 1.4.
Movies rely on persistence of vision (or motion blur) to pull off the 24 FPS. I don't want motion blur in anything I use a computer for aside from games - and only then if implemented properly. I haven't seen a good implementation of that yet.
Mostly, yes, but the trend (set by the Hobbit movies) is heading towards higher frame rates (60 and beyond). I can imagine the connectors and graphics cards are racing to get that kind of performance at those resolutions again.
This is not true. The Hobbit demonstration showed why 24fps is superior for Hollywood storytelling and movies will likely stay that way, possibly for decades.
Some reviewers seem to have that opinion, but many, many people use the high-frame-rate-simulators on their home TVs. I've seen the first two Hobbit movies in 48 fps and wish everything was there or higher. Hell, there are projects like SVP [1] to get video smoothed on your computer. People LIKE higher frame rates.
The Hobbit demonstrated that people have different opinions and react differently to change. I saw The Hobbit in both formats, and I thought that the 48 FPS presentation was vastly superior to the 24 FPS one.
That depends on the screen technology. 24p looks great on plasmas. I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content. No idea about this specific TV though.
I don't think you're very familiar with any of this technology.
> 24p looks great on plasmas.
We're talking about the drawbacks of a 30hz refresh rate here. There are no 30hz plasma displays. They run at more like 600hz.
> I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content
Again, that's pretty far removed from anything we're talking about.
Yes, consumer LCDs often run at 120hz these days and yes, one reason for this is so that they can display 24p content (120 is a multiple of 24) without judder.
If Dell had managed to make the world's first 5120x2880 LCD monitor that runs at 120hz or higher, you can be sure they'd be trumpeting that fact.
I think the confusion is that I'm assuming that a 30Hz display might also be able to refresh at 24Hz. I am familiar with consumer televisions, but not with this particular TV. But I see no inherent reason why a 30Hz display cannot also refresh at 24Hz, especially when the 30Hz limitation seems to be due to the display connector's bandwidth.
I apologize for my tone! You made a totally reasonable assumption, but LCDs don't work that way.
A 120hz LCD always refreshes at 120hz, essentially[1]. But it can easily display 60hz content - it just displays each frame twice in a row. Same with 30hz (display each frame four times in a row) or even 24hz (display each frame five times in a row) content.
But suppose you were, say, trying to display 119hz content on a 120hz monitor. You'd have to display each of the first 118 frames for one LCD refresh interval, and then display the 119th frame for two intervals, which causes judder. Or you could blend all of the frames together and simulate a 120hz source (this is "pulldown").
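A small sketch of that repeat-count logic, assuming an idealized fixed-refresh panel (the helper function is purely illustrative):

    # A fixed-refresh LCD shows lower-rate content by holding each source frame
    # for a whole number of panel refreshes. If the rates don't divide evenly,
    # some frames must be held longer than others: that's the judder above.

    def repeat_pattern(panel_hz: int, content_fps: int) -> str:
        if panel_hz % content_fps == 0:
            return f"every frame held {panel_hz // content_fps}x -> smooth"
        base, extra = divmod(panel_hz, content_fps)
        return (f"{content_fps - extra} frames held {base}x, "
                f"{extra} held {base + 1}x -> judder")

    for content in (60, 30, 24, 119):
        print(f"120 Hz panel, {content} fps content: {repeat_pattern(120, content)}")
    for content in (30, 24):
        print(f" 60 Hz panel, {content} fps content: {repeat_pattern(60, content)}")

The 60 Hz / 24 fps row is the uneven 2x/3x hold pattern, which is where the judder-or-pulldown tradeoff comes from.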
> But I see no inherent reason why a 30Hz display cannot also refresh at 24Hz, especially when the 30Hz limitation seems to be due to the display connector's bandwidth.
If Dell's 5K monitor has a 120hz refresh rate, it could indeed display 24p content smoothly because, as you say, the display connector's bandwidth would be sufficient.
But Dell's 5K monitor almost certainly is a 60hz panel (else they would have trumpeted the fact that it was 120hz!) and 60hz isn't evenly divisible by 24, so you get judder or have to do pulldown.
>What I don't understand is why these TVs don't have a DP input so we can use them at 60fps with modern PCs (and already on the market GPUs).
Probably because every TV maker / rebrander and their mother has both an HDMI license and the tooling setup. There's basically zero cost to keep using HDMI. Moving to DP needs licensing and retooling.
Could you integrate the graphics card into the monitor? Surely the bandwidth between the PCI Express card and the motherboard will actually be less than the bandwidth of the cable to the display.
This is exactly why laptops got retina displays before desktops. Need more bandwidth? Add another twisted pair. (LVDS.)
With DisplayPort and HDMI, we have to wait for the world's least competent standards agencies to add more twisted pairs. HDMI only cares about TV, so they're useless. DisplayPort is merely very very slow.
Meanwhile, there is a 5k monitor on the market but no way to connect it to anything reliably. MST is a minor disaster. (I have the Asus 4k display. I did get it working at 60Hz, but only by intentionally triggering a bug in the Nvidia driver, which triggers another bug that makes the monitor look like one monitor instead of two. SIGH. Meanwhile, ChromeOS can't handle this correctly at all, because the Intel video driver does not have that same bug. Remember how multiple monitors never worked with Linux, and they would both appear as one big display and windows would pop up in the middle spanning both screens? They fixed the glitch. Now that's exactly what we want to happen, but all the code to support that has been deleted. SIGH.)
It works fine if you only have one display; then you can run everything display-side.
But now let's say you have 3 displays. Now each built-in GPU has to have the ability to talk to other items on the PCIe bus, not just the controller. This sounds like a blessing: the host PC would only have to upload the window contents once, and then the displays would pass the information to each other. Simple, right?
But now we have to deal with hardware latency and cross-vendor standardization, which are larger problems than bandwidth.
Each of the PCI graphics cards in your 2002 PowerMac was using a long-established standard signal (VGA) to communicate with the monitors, so there was no problem.
If you build the graphics card into the monitor, you'd need a new (non-existent) standard to pipe graphics bitmaps and draw instructions between the PC and the monitor. (Unless X is up to the task... hahah)
OR, you could just extend PCI express over cables to the monitors. Which would work (you can already buy PCI express breakout boxes) but then you have much greater latency between the card and the system, etc etc etc.
Thunderbolt combines PCI Express (PCIe) and DisplayPort (DP) into one serial signal alongside a DC connection for electric power, transmitted over one cable. Up to six peripherals may be supported by one connector through various topologies.
Not enough PCIe lanes to get enough performance for games but it should be fine for desktop applications.
Why exactly should latency be such an issue for graphics cards? If the graphics card only needs to refresh the screen every 15ms, what difference would a few nanoseconds make for the signal to travel down a wire? I'm guessing that a read over PCI Express is already >100ns for large payloads.
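Some rough orders of magnitude behind that question; the cable length and PCIe read latency below are assumptions for illustration:

    # Orders of magnitude: propagation delay over a short cable vs. a display
    # refresh interval. Cable length and PCIe latency are assumed values.

    refresh_interval = 1 / 60          # s, ~16.7 ms per frame at 60 Hz
    cable_length_m = 2.0               # assumed cable length
    signal_speed = 2e8                 # m/s, roughly 2/3 of c in copper
    cable_delay = cable_length_m / signal_speed     # ~10 ns one way

    pcie_read_latency = 500e-9         # s, assumed order of magnitude for a PCIe round trip

    print(f"refresh interval: {refresh_interval * 1e3:.1f} ms")
    print(f"cable delay:      {cable_delay * 1e9:.0f} ns")
    print(f"PCIe read (~):    {pcie_read_latency * 1e9:.0f} ns")
    print(f"cable delay / frame time: {cable_delay / refresh_interval:.1e}")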
You can. You can also build/buy (Kickstarter project?) a PCIe breakout box. It has been done, but it's crazy expensive for what it is. For me this was the biggest thing I loved about TB: the ability to have a laptop, but also a fast external card.
I've been wanting a new monitor. Ideally a large, high-resolution, high-refresh-rate monitor. Everything I've read, though, has led me to hold off for precisely that reason - interconnects are currently completely insufficient to handle that kind of data, so two out of three is the best you can get. Big + 4k+ = terrible refresh rate, big + 120hz = low resolution.
Hacky solutions some models use like having the monitor pretend to be multiple monitors using multiple ports (as this Dell is apparently doing) have a number of issues AFAIK.
Considering they talked about MST during the announcement AND there's not really any other viable way to drive this display, it's a pretty safe bet it will use two inputs and MST.
The snark is because your objections are lazy, and unimportant.
An inferior proportion for monitors if you ask me (compared to 16:10). Useful space simply cut away to reuse the TV format. 16:10 is also closer to the golden ratio.
Small TVs are all 720p or maybe 1080p. Computer monitors stop being 1080p at about 22".
Widescreen is popular because monitors are sold by diagonal, and the more rectangular the screen, the cheaper it is to have a big number. (The other interesting scheme going on these days is lying about the diagonal. My 31.5" monitor is called "32 inch class". My 64" TV is called "65 inch class". WTF!?)
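For the curious, the geometry behind "more rectangular = cheaper big number" (no assumptions beyond the aspect ratios):

    # For the same diagonal, a more elongated rectangle has less area, which is
    # why selling screens by the diagonal favours wider aspect ratios.
    from math import hypot

    def area_for_diagonal(diagonal_in: float, w: int, h: int) -> float:
        scale = diagonal_in / hypot(w, h)    # inches per aspect-ratio unit
        return (w * scale) * (h * scale)     # area in square inches

    for w, h in ((4, 3), (16, 10), (16, 9)):
        print(f'{w}:{h} at a 30" diagonal: {area_for_diagonal(30.0, w, h):.0f} sq in')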
> The other interesting scheme going on these days is lying about the diagonal. My 31.5" monitor is called "32 inch class". My 64" TV is called "65 inch class". WTF!?
This scam has been going on since long before LCDs. CRT marketing also chronically stretched the truth.
What's a "standard"? 16:10 is as much a standard as anything else. The only reason 16:9 is used is because TV screens use it, so mass production reduces cost for monitors as well. It doesn't mean that 16:9 is better, or more of a standard, than 16:10.
While I would love this to be the case, I think Apple is going to jump on the 4K bandwagon. All evidence suggests that Apple is gearing up for 4K, not desktop Retina (look at wallpaper sizing, and their 4K advertising push for the Mac Pro).
My thoughts exactly. I could see Apple using a similar panel for the next 27" iMac and Thunderbolt Display, while using a 4k panel for the smaller iMac.
At $2500 a pop now, I doubt they would use this for another couple of years. Perhaps better to go with a UHD display now and deal with the non-whole-number multiple.
I imagine they'll at least have to support other people's 5K monitors on the Mac Pro, and maybe sell one themselves as a high-end display option. (I assume that 5K will appeal to professionals working on 4K video.)
Is it just me, or are displays starting to have their own "Moore's law" in how fast the resolutions/technology are improving? I wouldn't know where to begin predicting the max resolution of a commercially available display in 2-3 years. It seems like it took a decade for 1080p to become "standard", and now I can buy a decent 4K TV for the price a 1080p TV was just 5-6 years ago.
I'd pass on this one. Resolution is nice; however, its surface area doesn't do justice to that number of pixels. I have a 30" 2560x1600 monitor, and any upgrade from that needs to have both a bigger physical size AND more pixels. Improving one without the other isn't good enough. Yes, more pixels give nicer fonts and better image quality, but a bigger area is needed to actually use them effectively. Considering the downsides of it showing up as two separate displays on Linux, I'd say "These are not the monitors you are looking for."
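For comparison, the pixel densities involved; the 27" size for the 5K panel is an assumption, since it isn't stated in this thread:

    # Pixel-density comparison. The 27" size for the 5K panel is an assumption;
    # the 30" 2560x1600 figures are the commenter's current monitor.
    from math import hypot

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        return hypot(width_px, height_px) / diagonal_in

    print(f'30" 2560x1600:           {ppi(2560, 1600, 30.0):.0f} ppi')
    print(f'27" 5120x2880 (assumed): {ppi(5120, 2880, 27.0):.0f} ppi')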
Looks good except for the stand height, which would actually be a dealbreaker for me - a high stand is fine for the living room floor, but there's no point putting a great monitor on your desk that you can't look at without tilting your head back and giving yourself chronic neck strain. Are there any similar monitors with lower stands?
Looks like Dell's typical adjustable stand, which allows you to slide the monitor almost all the way down to the desk. If not that, then surely it has a VESA mount that you could use with your own stand.
I have a 2011 30" Dell Ultrasharp and a 2014 27" Dell Ultrasharp and they both have excellent height adjustable stands. I don't know if they reserve the nice stands for their Ultrasharps or what.