Dell Previews 27-inch ‘5K’ UltraSharp Monitor: 5120x2880 (anandtech.com)
132 points by ismavis on Sept 5, 2014 | 111 comments



I'd gladly buy one, or even four, if they just worked.

Unfortunately, high-res multimonitor setups are horrendously unstable, require specialized graphics cards (which usually sound like rocket jets), and have a mess of incompatible standards. I'd probably need to spend a week or two figuring out what graphics cards I need, upgrading my motherboard to have enough slots to take them, figuring out if there's a way to make it talk to my laptop, etc. Getting four 1080p displays working was enough work.

Basically, not a project I have time for.

Anytime hardware becomes a project, it only reaches 5% of the market. Most of the market is people who just want to get work done. That means word processors, IDEs, spreadsheets, PPT, etc. USB is fine even for multi-monitor 5k for normal work if you update the screen incrementally and downsample/compress video. Video games aren't necessary. 200-watt graphics cards aren't desired. Stability and ease of use are critical.

The trick would be making it Just Work. That means driver support (Linux, Vista, XP, MacOS, etc.). That means testing across a range of hardware and software. That means using a standard cable (USB or similar, not a pair of DisplayPort 1.3 with dual-link cables and a graphics card capable of virtual...). It also means performance testing for the stupid stuff (not issues of 60hz vs 10hz refresh, or 3d gaming, or even video -- just that there aren't funny issues where you wait 1 second for a screen refresh, or the system locks up for 30 seconds thinking).


I used a 39" 4k display for a while (before I moved and decided to use it as a "temporary" TV) and it "Just Worked" as I'd expect it to. While it's only 30Hz at 4k, that more than meets the requirements you put forth of word processors, IDEs, etc. It plugs in over HDMI and doesn't require the newer standard or any special cables to use. I ran that display along with a 1440p display off of a single Radeon 7850, which is neither super expensive nor loud, and for normal use there were no issues to speak of. Games were a little bit iffy at 4k, but 30Hz wasn't great for the most power-hungry games anyway. I just switched over to the other display.

It was great, I could have a ton of terminals open, a browser, text editor, a video playing and it didn't feel cramped. Essentially having 4 19.5" 1080p monitors without the borders.

Writing this out makes me want to take that back as my main monitor, I guess I'll have to find a TV to replace it.


>> Essentially having 4 19.5" 1080p monitors without the borders.

I myself have been debating whether to wait for a 40" 60Hz 4K display or to go with basically 4x1080p monitors.

While I'd just love to have a single cable, single giant monitor setup, I also have developed presbyopia in the past few years, and I have concerns about the abilities of my eyes to focus on the furthest edges of a large screen vs. having four independently tilted monitors. The ergonomic benefits of the 4 monitors might actually outweigh the mess of cables and borders/bezels.


Agreed. Mine was an inexpensive Seiki. Super nice to have the real estate. Next one will hopefully be the 50" one, tilted like a drafting table.


They have some "pro" 60Hz screens coming in Q1 2015, but I don't think 50" is one of the sizes.

http://www.techpowerup.com/202448/seiki-announces-a-trio-of-...


Yup, that's the one. Surprisingly good looking panel for the price, once you change it to a limited color space and tune the color settings a little.


That's the same one I have - currently $339 on Amazon. I've been very happy with mine, especially for the price.

What video card are you using with it?


Depends on what OS you use. If you use Linux, yeah, you're screwed. They are still arguing over how to support monitors that expose themselves to the OS as two monitors. The current answer is "that's dumb, maybe they'll go away." It is dumb, but they're not going away.

Over on Windows, Nvidia has some hacks in the driver to work around this. With a $70 current-gen card, I can drive two 4k monitors at 60Hz. One monitor connects via two HDMI cables; the other over Displayport MST. (I don't actually have two 4k monitors, but I did convince my computer that my one monitor was two to test this.)

You're right about getting 4 monitors working. Probably possible with an SLI setup, but the current generation of video hardware makes 1-3 monitors easy and 4+ monitors hard. 3 monitors is better than a few years ago, though.


Nvidia limits multiple monitors on their consumer cards to 2 monitors and 1 gpu, or 2 gpus and 3 monitors in a surround (SLI) setup.

For their professional series of gpus they offer premium mosaic which allows up to 4 monitors on a single card[1].

AMD allows up to 6 monitors on a single consumer card with Eyefinity[2].

Of course through the use of splitters and multiple cards it's possible to attach more monitors. I've personally setup a 12 display system off a single card. There are of course performance limitations when you only have one gpu and a large grouped display.

For what it's worth I've had an easier time setting up multiple monitors in Linux than I have on Windows.

1: http://www.nvidia.com/object/quadro-k5000.html

2: http://www.amd.com/en-gb/products/graphics/desktop/5000/5870...


Wait, how is a monitor that appears as two to the OS a problem? We've had multi monitor support for a long time, this is just the panels/pixels stitched closer together, isn't it?


The problem is you'll have two copies of your desktop (or whatever you choose to have 1-per-monitor of) on one physical screen, because the OS thinks it's two monitors when it's actually one monitor.

To make this go away, all that clever support needs to be disabled, which is actually currently impossible in Linux!


I thought Linux supported extended desktop, but come to think of it, I never actually tried it. Windows always worked perfectly fine :-)


Are you saying that there is some kind of dropoff, where current GFX hardware has the horsepower to drive 3 large screens easily but not 4?


I'm saying current-gen hardware puts three ports on the card by default. ATI does it, NVidia does it, and Intel does it.


Ah, I see. My GFX card has something like 5-6 ports (4 mini-DP ports and 1-2 DVI) so I've never had that problem


Are you doing your development on a remote computer via putty then?


Don't know why you are getting downvoted. I have three 2560x1440 displays (ASUS PB278Q) and I have frequent problems getting them to come back up after reboots or sleep. I've tried 5 different brands of DisplayPort cables, an MST hub, different drivers, and mixed configs (DVI, onboard Intel, etc.). The AMD driver will report that there is not enough bandwidth, and the only thing I can do to get them to come back up is unplug and replug the DisplayPort cable. Research online shows this is a common problem with high-res displays.


  I'd gladly buy one, or even four, if they just worked.

  Unfortunately, high-res multimonitor setups are horrendously unstable
Trickle-down effect, friend! Today's bleeding edge is (usually) tomorrow's mainstream.

These bleeding-edge problems are (unfortunately) a necessary first step towards that.


Do you (or anyone else) have any idea if this is getting better, and how quickly, if so?

I'm of the same mind that I'm incredibly willing to shell out a bunch of money on super high resolution displays (and even more powerful video cards to drive them) just as soon as I'm sure I'm not going to be on the painful part of the adoption curve.


DisplayPort works well enough for UHD; I'm not sure about this resolution, but it should be OK. Even a lowly integrated Intel GPU might be able to drive this for non-gaming applications. Visual Studio and Xcode already support 2x scaling, and I think Office and iWork do also.

No displays, past present or future, can run over USB, sorry.



Yup. The quality is terrible though. Colors are washed out and artifacts are easy to see on 1080p.


> No displays, past present or future, can run over USB, sorry.

Tons of USB to HDMI adapters are available though -- maybe that is what he means? Personally I've never tried it: http://www.walmart.com/c/kp/usb-to-hdmi-adapters


So what you are saying is that we need a new standard, with wide adoption, that supports 2K and 4K (and soon 8K) monitors. I completely agree. It should be a single cable as well, and it would be best if it were a straightforward extension of an existing standard.


Assuming the signal is transmitted uncompressed, this would be 5120x2880x32 bits per frame, so 5120x2880x32x60 bits per second at 60fps.

That's about 28 GBit/s - more than what a single TB2 port can handle, and probably also more than what a lower-to-mid-range GPU is able to produce.

I hope the solution to this will neither be reducing FPS nor adding compression, so it looks like we need a new display connection standard.

Please tell me that I got my math wrong as I really, really, really want a display at this resolution for my day-to-day work.
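
A quick sanity check of that arithmetic, for anyone who wants it - a minimal sketch in Python, assuming 60 Hz, 32 bpp and ignoring blanking/protocol overhead:

  # uncompressed bandwidth for 5120x2880 @ 60 Hz, 32 bits per pixel
  width, height, bpp, fps = 5120, 2880, 32, 60
  gbit_per_s = width * height * bpp * fps / 1e9
  print(round(gbit_per_s, 1))  # 28.3 -- well above Thunderbolt 2's 20 Gbit/s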


It probably uses Multi-Stream Transport, combining the signals from two DP 1.2 ports - something more or less like dual-link DVI but using two ports. As for graphics card performance, probably only high-end cards from Nvidia and AMD (Quadro/FireGL) can drive a screen like this with decent performance.


For anyone else who hadn't heard of it: this is what the (never-defined) "MST" refers to in the article. I'd never heard of it: it's basically daisy-chaining, or using a USB-like hub, for DisplayPort.


24 not 32 bpp - the alpha layer is not sent to the display.


If that's right, then we've got:

  5120*2880*24*60/(1024^3): 19.77GB/s
Which might work within the 20gbps offered by TB2.


Aside from what rando289 said about not reaching the theoretical maximum, you're mixing up units. You'd have to divide by (1000^3), as bits are always measured in proper SI units (1 KBit = 1000 Bit).

TB does 20 GBit/s

And 5120x2880x24x60 Bits/s / (1000^3) is 21.23 GBits/s, so even in the best-case scenario (which is likely unrealistic), you wouldn't have enough bandwidth over TB2.

DisplayPort 1.3 will solve the issue of course, but that doesn't help me personally much as the hardware I'm using is a 2012 retina MacBook (which I'd be theoretically willing to replace to be able to use the resolution we're talking about here) and a 2013 MacPro (which I won't replace this early), neither of which have DP 1.3 support.
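
To make that concrete, a rough sketch of the corrected numbers (24 bpp, SI units, still ignoring blanking overhead):

  # 5120x2880 @ 60 Hz, 24 bits per pixel, in SI gigabits per second
  gbit_per_s = 5120 * 2880 * 24 * 60 / 1e9
  print(round(gbit_per_s, 2))  # 21.23 -- over TB2's 20 Gbit/s, under DP 1.3's 32.4 Gbit/s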


Apparently it supports 10 bits per channel input, so raise the bandwidth required by 25% for a 10-bit workflow.

http://www.tftcentral.co.uk/news_archive/31.htm#dell_up2715k


Oh, I didn't know that. Let's wait for DP 1.3 then.


I'd bet against it. Those kinds of specs are almost never reliably achieved. Besides, "DisplayPort version 1.3, which was expected to be finalized in Q2 2014, will increase overall transmission bandwidth to 32.4 Gbit/s" - Wikipedia.


A lot of 4K screens aren't at 60fps, they're at 48fps.

I wouldn't be surprised if this screen is lower than 60fps.


I looked at 4K TVs recently, and if you look through the advertising they're all 4K at 30hz (ie, totally worthless for viewing anything on) due to the limitations of HDMI 1.4.


Do you consider theatre movies at 24hz "worthless"? Or is there some other difference?


One thing that comes to mind is that unless you can pull them down to 24Hz, it will really suck to watch a 24Hz movie on a 30Hz TV.


Movies rely on persistence of vision (or motion blur) to pull off the 24 FPS. I don't want motion blur in anything I use a computer for aside from games - and only then if implemented properly. I haven't seen a good implementation of that yet.


30hz is just fine for a lot of big Emacs windows and terminals!


I work all day on a 30Hz 4k tv. It works for viewing lots of text buffers.


30 Hz is bad for most PC content, but it should be fine for movies, which are mostly shot and distributed at 24 frames per second.


Mostly, yes, but the trend (set by the Hobbit movies) is heading towards higher frame rates (60 and beyond). I can imagine the connectors and graphics cards are racing to get that kind of performance at those resolutions again.


This is not true. The Hobbit demonstration showed why 24fps is superior for Hollywood storytelling and movies will likely stay that way, possibly for decades.


Some reviewers seem to have that opinion, but many, many people use the high-frame-rate simulators (motion smoothing) on their home TVs. I've seen the first two Hobbit movies in 48 fps and wish everything was there or higher. Hell, there are projects like SVP [1] to get video smoothed on your computer. People LIKE higher frame rates.

[1] http://www.svp-team.com/


Many people have the high-frame-rate turned on by default and don't know it. That's not the same as preferring the look.


The Hobbit demonstrated that people have different opinions and react differently to change. I saw The Hobbit in both formats, and I thought that the 48 FPS presentation was vastly superior to the 24 FPS one.


One movie does not make a trend.


I mainly play games rather than watching filmed content, which it seems makes my use case a little bit more sensitive than other people's.


Right, but watching a 24hz source on a 30hz screen is pretty bad - the technical term is "judder" if you want to learn more.


That depends on the screen technology. 24p looks great on plasmas. I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content. No idea about this specific TV though.


I don't think you're very familiar with any of this technology.

  24p looks great on plasmas.
We're talking about the drawbacks of a 30hz refresh rate here. There are no 30hz plasma displays. They run at more like 600hz.

  I'm pretty sure modern LCDs refresh at a multiple of 24, and can natively refresh at 24 FPS when they detect 24p content
Again, that's pretty far removed from anything we're talking about.

Yes, consumer LCDs often run at 120hz these days and yes, one reason for this is so that they can display 24p content (120 is a multiple of 24) without judder.

If Dell had managed to make the world's first 5120x2880 LCD monitor that runs at 120hz or higher, you can be sure they'd be trumpeting that fact.


I think the confusion is that I'm assuming that a 30Hz display might also be able to refresh at 24Hz. I am familiar with consumer televisions, but not with this particular TV. But I see no inherent reason why a 30Hz display cannot also refresh at 24Hz, especially when the 30Hz limitation seems to be due to the display connector's bandwidth.


I apologize for my tone! You made a totally reasonable assumption, but LCDs don't work that way.

A 120hz LCD always refreshes at 120hz, essentially[1]. But it can easily display 60hz content - it just displays each frame twice in a row. Same with 30hz content (display each frame four times in a row) or even 24hz content (display each frame five times in a row).

But suppose you were, say, trying to display 119hz content on a 120hz monitor. You'd have to display the first 118 frames for one LCD refresh interval each, and then display the 119th frame for two intervals, which causes judder; uneven frame repetition like this is what "pulldown" cadences (e.g. 3:2 pulldown) do. Or you could blend the frames together to synthesize an even 120hz source.

  But I see no inherent reason why a 30Hz display cannot also
  refresh at 24Hz, especially when the 30Hz limitation seems to 
  be due to the display connector's bandwidth.
If Dell's 5K monitor has a 120hz refresh rate, it could indeed display 24p content smoothly because, as you say, the display connector's bandwidth would be sufficient.

But Dell's 5K monitor almost certainly is a 60hz panel (else they would have trumpeted the fact that it was 120hz!) and 60hz isn't evenly divisible by 24, so you get judder or have to do pulldown.

_______

[1] Nvidia's G-Sync is one possible solution: http://www.anandtech.com/show/7582/nvidia-gsync-review
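
To illustrate the divisibility point, a tiny sketch assuming a fixed-rate panel that can only show whole refreshes per source frame:

  # refreshes per source frame for a few panel/content combinations
  for panel_hz in (120, 60, 30):
      for content_fps in (30, 24):
          repeats = panel_hz / content_fps
          verdict = "even" if repeats.is_integer() else "judder or pulldown"
          print(panel_hz, "Hz panel,", content_fps, "fps content:", repeats, verdict)
  # 120 divides evenly by both 30 and 24; 60 and 30 do not divide evenly by 24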


For most 2014 TVs, it's not a limitation on the TV because they also support HDMI 2.0, but a limitation on the source.

What I don't understand is why these TVs don't have a DP input so we can use them at 60fps with modern PCs (and already on the market GPUs).


>What I don't understand is why these TVs don't have a DP input so we can use them at 60fps with modern PCs (and already on the market GPUs).

Probably because every TV maker / rebrander and their mother has both an HDMI license and the tooling setup. There's basically zero cost to keep using HDMI. Moving to DP needs licensing and retooling.


As far as I know using DP does not require a license except for using the DP logo.


  4K at 60hz (ie, totally worthless for viewing anything on)

What's wrong with that? Sounds lovely!


Looks like it was a typo; the parent was edited.


I'd prefer 120 hz, at least for games and UI animations.


Sony's 2014 screens all support 3840x2160/60p, HDMI 2.0. Bit laggy for mouse work though.


Dell's 5K display is 60hz, or at least most articles say so.


Could you integrate the graphics card into the monitor? Surely the bandwidth between the PCI Express card and the motherboard would actually be less than the bandwidth of the cable to the display.


This is exactly why laptops got retina displays before desktops. Need more bandwidth? Add another twisted pair. (LVDS.)

With DisplayPort and HDMI, we have to wait for the world's least competent standards agencies to add more twisted pairs. HDMI only cares about TV, so they're useless. DisplayPort is merely very very slow.

Meanwhile, there is a 5k monitor on the market but no way to connect it to anything reliably. MST is a minor disaster. (I have the Asus 4k display. I did get it working at 60Hz, but only by intentionally triggering a bug in the Nvidia driver, which triggers another bug that makes the monitor look like one monitor instead of two. SIGH. Meanwhile, ChromeOS can't handle this correctly at all, because the Intel video driver does not have that same bug. Remember how multiple monitors never worked with Linux, and they would both appear as one big display and windows would pop up in the middle spanning both screens? They fixed the glitch. Now that's exactly what we want to happen, but all the code to support that has been deleted. SIGH.)


Heh, you solved one problem by creating another.

It works fine if you only have 1 display, then you can run everything display side.

But now let's say you have 3 displays. Now each built-in GPU has to have the ability to talk to other items on the PCIe bus, not just the controller. This sounds like a blessing: the host PC would only have to upload the window contents once, then each display would send the information to the others. Simple, right?

But now we have to deal with hardware latency and cross-vendor standardization, which are bigger problems than bandwidth.


I never had any problems running multiple displays on separate PCI graphics cards on my 2002 PowerMac G4, why would that be a bigger challenge now?


Each of the PCI graphics cards in your 2002 PowerMac was using a long-established standard signal (VGA) to communicate with the monitors, so there was no problem.

If you build the graphics card into the monitor, you'd need a new (non-existent) standard to pipe graphics bitmaps and draw instructions between the PC and the monitor. (Unless X is up to the task... hahah)

OR, you could just extend PCI express over cables to the monitors. Which would work (you can already buy PCI express breakout boxes) but then you have much greater latency between the card and the system, etc etc etc.


> OR, you could just extend PCI express over cables to the monitors

We already have this

http://en.wikipedia.org/wiki/Thunderbolt_(interface)

Thunderbolt combines PCI Express (PCIe) and DisplayPort (DP) into one serial signal alongside a DC connection for electric power, transmitted over one cable. Up to six peripherals may be supported by one connector through various topologies.

Not enough PCIe lanes to get enough performance for games but it should be fine for desktop applications.


Why exactly should latency be such an issue for graphics cards? If the graphics card only needs to refresh the screen every 15ms, what difference would a few nanoseconds make for the signal to travel down a wire? I'm guessing that a read over PCI Express already takes >100ns for large payloads.


You can. You can also build/buy (kickstarter project?) a PCIe breakout box. It has been done, but it's crazy expensive for what it is. For me this was the biggest thing I loved about TB: the ability to have a laptop, but also a fast external card.

But no...


There are some Sony screens that do this.


But the latency will probably be huge enough to be a blocker.


I've been wanting a new monitor. Ideally a large, high resolution, high refresh rate monitor. Everything I've read, though, has led me to hold off for precisely that reason - interconnects are currently completely insufficient to handle that kind of data, so two out of three is the best you can get. Big + 4k+ = terrible refresh rate; big + 120hz = low resolution.

Hacky solutions some models use like having the monitor pretend to be multiple monitors using multiple ports (as this Dell is apparently doing) have a number of issues AFAIK.


HDMI and TB are playing a bit of catchup, but I was under the impression DP can do it today.


Technically, in theory, yes. In practice, no - every 4K+ monitor I'm aware of using DP uses the 2-monitor hack.

http://thewirecutter.com/reviews/best-4k-monitor-doesnt-exis...

Hopefully in a year things will have caught up some.


DisplayPort 1.3 is 32Gbps


Maybe they'll use two connectors and present it as two 2560x2880 monitors?


No, they use two connectors bonded together.


That's why this monitor is intended to use 2 separate inputs. If you read the article you'd know that already.


No need for the snark. The article says "if". I don't trust that.


Considering they talked about MST during the announcement AND there's not really any other viable way to drive this display, it's a pretty safe bet it will use two inputs and MST.

The snark is because your objections are lazy, and unimportant.


Why not 5120x3200? 16:9 trend is really annoying.


The history of why 16:9 came about kind of explains why it's not going anywhere any time in the next 40-50 years.

http://en.wikipedia.org/wiki/16:9


Inferior proportion for monitors if you ask me (compared to 16:10). Useful space is simply cut to reuse the TV format. 16:10 is also closer to the golden ratio.


Because video content is usually 16:9, and producing different panels for TVs and PC monitors would mean both would become more expensive.


Not really. Nobody is making 5k 27" TVs.

Small TVs are all 720p or maybe 1080p. Computer monitors stop being 1080p at about 22".

Widescreen is popular because monitors are sold by diagonal, and the more rectangular the screen, the cheaper it is to have a big number. (The other interesting scheme going on these days is lying about the diagonal. My 31.5" monitor is called "32 inch class". My 64" TV is called "65 inch class". WTF!?)


> The other interesting scheme going on these days is lying about the diagonal. My 31.5" monitor is called "32 inch class". My 64" TV is called "65 inch class". WTF!?

This scam has been going on long before LCDs. CRT marketing also chronically stretched the truth.


I use my 16x9 monitors vertically and that way they are each two 4:3(-ish, actually 9:8) stacked virtual screens.


Why a non-standard ratio like 8:5?


16:10 is a standard ratio. 1680x1050, 1280x800, 1920x1200...


Looks like I got my math wrong. I just noticed that 8:5 is the same as 16:10 =p


What's a "standard"? 16:10 is as much a standard as anything else. The only reason 16:9 is used is because TV screens use it, so mass production reduces cost for monitors as well. It doesn't mean that 16:9 is better or a standard more than 16:10.


So according to https://www.sven.de/dpi/ it will be 217.57 PPI, similar to the MacBook Pro Retina 15, which is 220 PPI. See http://en.wikipedia.org/wiki/Retina_Display for the list.
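
For anyone who wants to check that figure, the PPI arithmetic is just the diagonal pixel count over the diagonal size (27 inches assumed):

  # pixels per inch = diagonal resolution / diagonal size in inches
  from math import hypot
  ppi = hypot(5120, 2880) / 27
  print(round(ppi, 2))  # 217.57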


Retina Apple 27" displays should not be far away now.


There's a list to compare against in the article itself.


Which is sadly missing the Nexus 5 @ 445ppi


This is the same size/resolution as what a 27" Retina Thunderbolt Display would be.


While I would love this to be the case, I think Apple is going to jump on the 4k bandwagon. All evidence suggests that Apple is gearing up for 4K, not desktop Retina (look at wallpaper sizing, and their 4K advertising push on the Mac Pro).


My thoughts exactly. I could see Apple using a similar panel for the next 27" iMac and Thunderbolt Display, while using a 4k panel for the smaller iMac.


This is very likely - Apple has been using the same panel as Dell for their 27" 1440p Thunderbolt/Cinema displays.


At $2500 a pop now, I doubt they would use this for another couple of years. Perhaps better to go with a UHD display now and deal with the non-whole-number scaling multiple.


I imagine they'll at least have to support other people's 5K monitors on the Mac Pro, and maybe sell one themselves as a high-end display option. (I assume that 5K will appeal to professionals working on 4K video.)


Right. They can already support UHD, so I think a UHD iMac is imminent. However, I see us quickly moving to QUHD after a bit of time in UHD.


Just to put this in historical context: 14745600 pixels are enough to display 230 CGA (320x200) screenshots simultaneously.
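
(If you want to verify: a one-line sketch, using integer division and ignoring how you'd actually tile them.)

  print(5120 * 2880 // (320 * 200))  # 230 full CGA screens, with some pixels left over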


Is it just me, or are displays starting to have their own "Moore's law" in how fast the resolutions/technology are increasing? I wouldn't know where to begin predicting what the max resolution of a commercially available display will be in 2-3 years. It seems like it took a decade for 1080p to become "standard", and now I can buy a decent 4K TV for the price a 1080p TV was just 5-6 years ago.


I'd pass on this one. Resolution is nice; however, its surface area doesn't do justice to that number of pixels. I have a 30" 2560x1600 monitor, and any upgrade from that needs to have both a bigger physical size AND more pixels. Improving one without the other isn't good enough. Yes, more pixels give nicer fonts and better image quality, but a bigger area is needed to actually use them effectively. Considering the downsides of it showing up as two separate displays on Linux, I'd say "These are not the monitors you are looking for."


Looks good except for the stand height, which would actually be a dealbreaker for me - a high stand is fine for the living room floor, but there's no point putting a great monitor on your desk that you can't look at without tilting your head back and giving yourself chronic neck strain. Are there any similar monitors with lower stands?


Looks like Dell's typical adjustable stand, which allows you to slide the monitor almost all the way down to the desk. If not that, then surely it has a VESA mount that you could use with your own stand.


Ah, Dell monitor stands are height adjustable these days? Good. Looks worth going for, then.


I have a 2011 30" Dell Ultrasharp and a 2014 27" Dell Ultrasharp and they both have excellent height adjustable stands. I don't know if they reserve the nice stands for their Ultrasharps or what.



What do we do for a commercial stunt? We had 720p for 720x1280. We had 1080p for 1080x1920.

And now "5K" for 5120x2880? Is the screen supposed to be used in a "portrait" setting?

It's fine if you work in IT, you see what it does, but for most people those commercial designations are getting confusing.



