As high DPI displays become more common, it seems like the time has come for more advanced file formats (or just using the more advanced features of existing formats). For example, PNG already has an interlacing mode that progressively adds detail to the image (http://en.wikipedia.org/wiki/Adam7_algorithm). It could easily be co-opted such that a normal image contains 6 layers of detail, with an additional "retina extension" image that high DPI browsers grab to refine the quality. This avoids the waste of downloading a low DPI image, then throwing it away to replace it with high DPI.
Even more interesting is http://en.wikipedia.org/wiki/JPEG_2000, which has "truncatable" bitstreams. You can stop getting data at any point, and depending on the encoding choices, you'll just lose fidelity in colour, resolution etc. Encoders can reorder the bitstream to deliver whatever is most useful for the image first. Browsers could then just stop when "enough" has been downloaded to satisfy the demands of the device; high DPI devices would just continue to grab more of the bitstream. It's really useful for devices on low bandwidth links, as you start getting visual results with very little data.
JPEG 2000 hasn't been widely implemented outside of specialised devices, mostly because it's computationally heavy compared to JPEG, and the patent situation is unclear. Although this does mean that it could be implemented in a targeted way, designed to solve these problems (the spec for the entire format is huge). Moreover, one of the patent holders is, IIRC, Apple.
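To make the "stop when you have enough" idea concrete, here's a rough sketch in modern JavaScript of a client fetching only a prefix of a progression-ordered codestream. Nothing here reflects how any browser actually behaves: the byte budgets are invented, it assumes the server honours HTTP Range requests, and decodePartialJ2K() stands in for a hypothetical decoder that tolerates truncated JPEG 2000 streams.

    // Hypothetical sketch: download only a prefix of a JPEG 2000 codestream.
    async function fetchEnough(url, devicePixelRatio) {
      // Invented byte budgets - a real client would derive these from the
      // codestream's progression order and the display's needs.
      const budget = devicePixelRatio > 1 ? 300 * 1024 : 80 * 1024;

      // Ask for just the first `budget` bytes (server must answer with 206).
      const res = await fetch(url, {
        headers: { Range: `bytes=0-${budget - 1}` },
      });
      const prefix = new Uint8Array(await res.arrayBuffer());

      // decodePartialJ2K is not a real API; it stands in for a decoder that can
      // render a truncated, progression-ordered codestream at reduced fidelity.
      return decodePartialJ2K(prefix);
    }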
The idea of compound images is really interesting, although it would require an extra HTTP request to fetch the high-DPI portion of the image. But maybe that's less of a problem over SPDY?
The idea of truncatable bitstreams is fascinating too. I'm not well versed in networking, but wouldn't the latency of a mobile network kill the benefit of this technique? E.g. by the time the server receives the "connection closed" signal, a large part of the extra data would already have been sent, no?
It would need an extra HTTP request, yes. In practice for the PNG solution, one would be better off with sending the DPI in the request headers, so that the "correct" image is the only one sent.
For JPEG 2000, the network characteristics are important, but I don't think it would be too bad on a mobile network. Low DPI devices might get a bit "too much", but it wouldn't be a problem - they can just throw it out (or incorporate more detail).
Headers do seem like the best place for a non-vendor-specific standard to be set. I'd love to see something like
viewport-dpi - the DPI
viewport-max - the maximum possible pixel dimension
and possibly viewport-current - the pixel dimensions at the time of the request
Decent responsive design should deal with the differing viewport sizes, but it might be nice to get a hint, before delivering your page, about which direction to weight that response in.
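As a sketch of what a server might do with headers like these - none of viewport-dpi, viewport-max or viewport-current actually exist, they're just the hypothetical names proposed above - a minimal Node handler could look roughly like this:

    // Hypothetical: pick an image variant based on a proposed "viewport-dpi" header.
    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      const dpi = parseInt(req.headers['viewport-dpi'] || '96', 10);
      const wantsHiRes = dpi >= 192; // crude threshold, purely for illustration

      const file = wantsHiRes && fs.existsSync('./photo@2x.jpg')
        ? './photo@2x.jpg'   // denser variant for high-DPI screens
        : './photo.jpg';     // default for everyone else

      res.writeHead(200, { 'Content-Type': 'image/jpeg', Vary: 'viewport-dpi' });
      fs.createReadStream(file).pipe(res);
    }).listen(8080);

Only one image ever goes over the wire, which is the whole point; the cost is cache complexity, hence the Vary header.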
Something like device-pixel-ratio would be more useful than DPI. Raw DPI is meaningless unless you take distance into account, as is very clear in the case of the projector.
That is certainly the ideal way to reduce the HTTP requests. The problem with this implementation is clear: without a 'src' attribute on your images, if your script fails then your site suddenly has no images at all. Not to mention the overhead of manually modifying all of your existing markup.
Our goal with retina.js was to make it zero-config: no markup changes, no extra element attributes or flags.
The Stripe gallery posted here the other day[1] used src-less img tags for lazy loading. It looks like they handle the script issue by having the normal img tag (with src attribute) nearby in a noscript block.
Doesn't really do much to help with the overhead of modifying existing markup. Perhaps if you're already using something like image_tag in Rails, a retina_image_tag helper might not be too much of a stretch.
Sounds complicated. I think we just need to do something similar to HTML5 'video' tags for the 'img' tag: you simply put a list of file paths in the tag and the browser selects the best one to request, only with different resolutions instead of different file encodings. We could have a standard notation like adding "_x2" to the double-res image filenames, and the browser selects the appropriate one to load (roughly as sketched below).
Then, we can support current image formats in the way you envision.
Also, anything you have vectorized should be in SVG format, including text in non-web fonts.
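No such native selection mechanism exists, but here's roughly what it looks like when faked with a bit of script. The data-src / data-src-x2 attribute names and the _x2 suffix are just the conventions suggested above, not anything standard:

    // Pick the best declared candidate per image, based on the device's pixel ratio.
    document.querySelectorAll('img[data-src]').forEach((img) => {
      const wants2x = (window.devicePixelRatio || 1) > 1;
      // e.g. data-src="logo.png" data-src-x2="logo_x2.png"
      img.src = wants2x && img.dataset.srcX2 ? img.dataset.srcX2 : img.dataset.src;
    });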
That's fine if a) you're willing to maintain two image sets, and b) the only two resolutions you need are "normal" and "2x normal". The advantage of a progressive quality system is that it scales to arbitrary requirements in a single resource. One file provides 1x, 2x, 3x, 3.14x etc.
When $110 buys you 2TB (this morning on Newegg), doing it on the fly and caching the result has about 100x better ROI than trying to solve it with a smart format. Remember, we're speaking about still photos here, where high-quality retina versions rarely take more than 500k each, so 2TB buys you room for 4,000,000 photos (and since you'll be caching the lower-res 200K versions, that actually buys you room for 10,000,000 photos).
That's not the case for video - Netflix, Hulu and friends have multiple copies of each roughly-1GB title, optimized for different devices. On-the-fly conversion is not possible, and the streams are significantly different. For them, a reasonable, universally supported progressive video format would indeed make a huge difference.
I'm not aware of such an existing module, but I'm sure a good one will appear within the next couple of years if it does not yet exist.
Compared to developing and deploying a new progressive format (across users, web servers and web browsers), the effort -- both in developing this module and in configuring it -- is negligible.
The problem with new formats is legacy support. Something that could degrade gracefully on browsers that don't add support would be very welcome. Maybe the best solution is to keep the current 'img' tag notation, but somehow indicate to browsers that double-res assets are likely to be available at something like 'filename_x2.ext', and that they should request those when they come across an img tag, instead of 'filename.ext', falling back with a second request for the latter in case the '_x2' isn't found. Something like:
<meta img-x2-res="_x2" />
Encoding the image twice with a standardized addition to the filename is just an extra bit to add to the export macro. The image storage is irrelevant for most cases (assuming you compress appropriately); it's the bandwidth and request volume that's precious.
If you are going to do that you may as well have a meta tag to set a suffix "to find the jpeg2000 version of the image", and browsers that understand that format can then get the better format.
I'd like to propose a new function for the Images module. This function will allow developers to provide, in a compact manner, multiple variants of the same image at differing resolutions. Using @media pushes the two asset references apart from one another, whereas such a function keeps related asset references together. It also helps keep selectors DRY. We've called it image-set(), and it takes one or more image specifiers.
That's their notation. Imagine having to do that for every image ever, when they're all just filename + 2x resolution key + filename extension. Totally asinine. Developers should just be able to pick a single page-wide filename key for all their 2x assets, and the browser will know to look there.
Yep, a solution is still needed for content images.
You assume the developers can guarantee all image assets will be 2x. For quite some time, I doubt that'll be the case. And, given that, if the browser just naively requested 2x assets, there would be some set of wasted HTTP requests that just add overhead and significantly delay page load.
Explicitly declaring the 1x and 2x images' existence is a far better solution, even if it is more verbose. The UA can't be guessing about the existence of resources if it's also to be efficient.
No, the better solution would be to add support for:
<meta img-x2-res="_x2" />
This would alert the browser to the likely presence of 'filename_x2.ext' when it sees img tags, and it would fall back with a second request for 'filename.ext' in case it's not found. This seems relatively easy to implement (especially for web developers), and degrades gracefully on older browsers.
Then, a more fleshed out version accounting for the scenario you describe would be the following:
<meta img-x2-res="_x2" assume-present="false" />
Then for any img tags for which double-res assets are available, you could add a property to the img tag as such:
<img x2-res="true" />
Or, you could leave the 'assume-present' property off, as 'true' is default, and put '<img x2-res="false" />' on any images for which double-res assets are unavailable. This would avoid a second request when the first fails.
Such a solution would be significantly more convenient for developers, as you could choose whether to assume the presence of 2x and flag ones that don't have it, or assume its absence and flag those that do, saving tons of time.
Sorry, that should be '2x', not 'x2'. The img '2x-res' property could also be used to provide an asset-specific 2x-res filename key as such:
<img 2x-res="@2x" />
If we wanted this to affect the image asset requested by the CSS background-image and border-image values as well, then a new CSS property would be required for flagging 2x-res active or not and an asset-specific 2x filename key. But its use would be vastly superior to -webkit-image-set, as you could just do this:
.class {2x-res: "@2x"}
or:
.class {2x-res: "false"}
The only limitation is that these 2x assets must share the same filename with the 1x, except for the addition of a 2x key at the end of the filename. However, that seems to be what people are already doing simply to keep track of their assets, and it's much easier to sell me on this imposition than on having to redo all my CSS in the most redundant and painful way imaginable.
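None of this exists in any browser, but a shim for the proposed page-wide key might behave something like the sketch below (using a name/content meta pair, since a bare attribute starting with a digit is awkward to select; every name here mirrors the hypothetical proposal above):

    // Hypothetical shim for a page-wide 2x filename key, e.g. <meta name="2x-res" content="@2x">
    const meta = document.querySelector('meta[name="2x-res"]');
    const suffix = meta && meta.getAttribute('content'); // e.g. "@2x"

    if (suffix && (window.devicePixelRatio || 1) > 1) {
      document.querySelectorAll('img').forEach((img) => {
        // Per-image opt-out, mirroring the proposed 2x-res="false" flag.
        if (img.getAttribute('data-2x-res') === 'false') return;

        const hiRes = img.src.replace(/(\.\w+)(\?.*)?$/, `${suffix}$1$2`);
        const probe = new Image();
        probe.onload = () => { img.src = hiRes; };       // logo@2x.png exists: swap it in
        probe.onerror = () => { /* keep the 1x src */ }; // missing 2x asset, no broken image
        probe.src = hiRes;
      });
    }

Note that a missing 2x asset still costs an extra request here, which is exactly the trade-off being debated.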
Regarding the first trick: the value really isn't very much. Storing 2x2-downsampled versions of an image at all levels is only a 33% increase in size (the geometric series 1 + 1/4 + 1/16 + … converges to 4/3). This is the mipmap trick, and can be done right now with nothing but a size check on the client. I don't think 33% is worth mucking with file formats over.
Well to be fair, it's been several years since I've heard anyone call an SUV a Jeep (other than a Jeep), a tissue a Kleenex, or a copy machine a Xerox. But Zipper was a brand name?? What the hell is the generic product name for a zipper?
Doesn't seem to have a consistent one; "clasp fastener" and "separable fastener" are earlier generic terms. BTW, Zipper wasn't trademarked by the creator or manufacturer but by B.F. Goodrich, for use on their zipped-up boots.
Two which surprised me were Ping Pong and Adrenaline.
In fact, they should sue all programmers who ever made a zip() function for trademark violations!
(In fact, even more confusingly, most zip() functions don't interleave elements the way zippers do, but return a sequence of paired tuples - this, naturally, gives zippers a bad name: who wants their pants to pairwise join?)
It might just be my German origin, but I wouldn't call an SUV a Jeep either. We do, however, call all of the off-road-capable bulky cars with the roll-over protection thingy ("Überrollbügel" in German - "roll bar" is probably the closest English term) a Jeep.
I've understood that technically the main idea of "retina" is that the physical display pixels no longer map one-to-one with the logical user interface (CSS) pixels. Instead, the retina display appears as a virtual low-resolution screen that is able to utilize higher-resolution images (which would otherwise be scaled down).
So in this sense, I would claim there is a clear technical difference between just higher-resolution screens (more pixels) and retina displays (same pixels but "better looking").
To me this virtualization of pixels seems like a good idea, since the majority of web pages assume pixels fall within a certain DPI range. Operating systems like OSX treat pixels as floats anyway, so if you need subpixel accuracy, it's still possible.
No, the distinction of device pixels vs. CSS pixels predates the first "Retina" iPhone. The distinction existed in all browsers supporting full-page zoom, including Safari on the first iPhone, and in Opera for years before that.
"Retina" is just a marketing term for "pixels so small your eye can't distinguish them at a normal distance anymore".
The distinction is also in CSS specs[0] since version 2.1: “The reference pixel is the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. <…> 1px thus corresponds to about 0.26 mm (1/96 inch).”
This becomes about semantics, then. If somebody writes an article about "retina displays", I immediately assume it's about this particular way of implementing resolution independence in the browser and in the OS (by doubling/quadrupling the physical resolution while keeping the virtual resolution the same). So as such it serves me better than a generic term that requires additional explanation.
Uh... of course it's about semantics. What josteink was complaining about was the use of a proprietary marketing term to refer to something technical.
Your interpretation is a fine one, but it's not the one that everyone shares. Specifically, it's not the sense that Apple uses the term either. Take a look at Apple's marketing page for the retina display and see if you can find anything about "resolution independence" at all.
So I guess the title of the article should be "Resolution independent high-resolution graphics for your website" then. Retina still communicates the subject better to me, since we all know this is about providing high-res images for iOS devices with Retina displays. Maybe I'm just getting too old to be anal about terminology like this.
There is nothing silly or gimmicky about it.
It is just a catchy name for some technology. There are a lot of similar marketing terms in tech.
Lots more people will understand what you are talking about when you say "retina display" instead of "high-DPI display".
What's high-DPI? What's high-DPI when talking about pentile displays? Not to mention that "retina" is easier to pronounce.
Apple loves simpler names for a reason.
No, it's a gimmick designed expressly to confuse the consumer. Other manufacturers of comparable products, for trademark reasons, can't use the term "retina". So only Apple has "retina".
Intel played a similar trick about 8 years ago with "Centrino". The advertising (for what was essentially just an 802.11b chipset with some added processor/chipset requirements) was so successful that many novice users got fooled into thinking that "Centrino" was wifi, and that all those other manufacturers were just cheap knockoffs of an Intel technology. It did great harm to the market.
> many novice users got fooled into thinking that "Centrino" was wifi, and that all those other manufacturers were just cheap knockoffs of an Intel technology. It did great harm to the market.
Really? Both the claim that many users were "fooled" by the Centrino branding and that this hurt the market for Wi-Fi devices seem highly implausible to me. I don't think I heard anyone use the term "Centrino" outside of Intel PR and reporting thereupon, whereas Wi-Fi had made it into the vernacular at least a decade ago.
The only other trademark for the 802.11 series of standards that got any traction at all was Apple's AirPort, and even their OS calls it Wi-Fi now.
Yes, really. I had to explain to numerous relatives the distinction between "Centrino" and "Wifi". How old are you? Were you engaged with people who were actively buying laptops at the time? Were you engaged on the subject outside the tech community? The Centrino television ads were pervasive, everyone knew what they were about, and no one at the time had ever heard of a laptop that could get on the internet without a wire.
(edit: your remark about AirPort makes this clearer on reflection: you were a mac user at the time, and familiar with Apple products much more than PCs. Outside the mac world, as you might expect, literally no one had heard of an "AirPort". So you were insulated from Intel's nonsense, essentially.)
It's the same thing here. We have "retina.js" being pushed around as a solution for what is clearly a manufacturer-independent problem. Yet on its face it appears to be Apple-only software. You don't think that constitutes harm to the market?
> The Centrino television ads were pervasive, everyone knew what they were about, and no one at the time had ever heard of a laptop that could get on the internet without a wire.
Apple introduced laptops with Wi-Fi (calling it "AirPort") in 1999, five years before your "eight years ago". Starbucks first started rolling out Wi-Fi (calling it "Wi-Fi") in 2001, and had most of their stores offering it by 2003. It was not an obscure technology eight years ago.
I'm sure whenever non-geeks go shopping for a computer, in any era, there are a variety of marketing terms that need to be explained to them. And yes, I'm old enough to have run through that exercise a few times. But that's not really evidence that "Centrino" hurt the growth of Wi-Fi any more than the way-more-prevalent "Pentium" branding hurt the '90-'00's highly competitive CPU market that gave us 2+GHz x86-64's.
Wi-Fi adoption rates were exceptional for a new computing technology, especially one that required infrastructure beyond what could be put "in the box".
Just like Apple was a couple of years ahead of the curve on Wi-Fi and called it AirPort, they're a couple of years ahead of the curve on double-res displays, too. I don't see how them putting the name "Retina" on those displays (while even in the OS they're still calling it Hi-DPI!) is going to harm what's surely going to be an explosion of high resolution screens in the next few years.
Really. I worked as a laptop salesman when Intel introduced the term "Centrino". I had to explain to many customers that yes, this other laptop that happened not to use an Intel chipset also worked wirelessly.
To get a Centrino sticker, a laptop had to use an Intel chipset, Intel wireless adapter and Intel CPU. Because the marketing for Centrino focussed almost exclusively on Wifi capability, there were many laptops with non-Intel chipsets or AMD CPUs that were perceived by customers as not being capable of wireless networking.
This article[1] explains how Wi-Fi wasn't very popular until the Centrino campaign.
That's essentially an Intel press release (follow the author link).
Anyway, I'm perfectly content to believe that Centrino and Intel's multimillion dollar marketing campaign for it really helped the growth of Wi-Fi. The parent post claimed the opposite: That the branding "hurt the market" for Wi-Fi, which seems absurd.
I fail to see your point. All companies give trademarked names to their products, and to specific features of their products (especially the features that are used heavily in marketing).
Apple is welcome to give any name they want to their product. But when they confuse users to the extent that otherwise-smart web developers start producing "retina.js" to manage resolution independent images (hardly an Apple-specific problem!) then the practice has gone too far.
Who else is making high-DPI displays? I haven't really seen companies running with this. It would be nice to see Samsung, for example, start promoting 300 dpi screens so they go mass market.
Sony's Xperia S has 338 PPI, which is even more dense than Apple's "retina display". So there are other companies using high-DPI displays. The Galaxy Nexus is at 316 PPI if I remember correctly. HTC has a phone, the Vigor, that has 342 PPI. So Apple is certainly not the only company out there who likes pixels :)
Even though the term "retina display" may be a gimmick, I think the creator of retina.js specifically designed it with Apple devices, and Apple's developer documentation, in mind (case in point: usage of the @2x modifier).
DPI is just incorrect. Use PPI if you really insist on that name.
That said, there is nothing gimmicky about the retina name. It's a device where distance to the display, resolution and screen size are such that someone with normal vision cannot distinguish between pixels.
High-PPI does not in any way pack the same information. It’s ambiguous and unclear.
The only thing that’s bad about retina is that Apple uses it like a trademark. I would love it if any company could call their HD TVs retina displays (because they are), but that’s not possible. Luckily everyone else is busy taking away that name from Apple (for example by naming their software like that).
It's a device where distance to the display, resolution and screen size are such that someone with normal vision cannot distinguish between pixels.
It is a post-development sales pitch, used to spin as a differentiator the fact that Apple's naive scaling required them to grossly overshoot the mark. Competing products already had excellent displays before Apple decided that they had no choice but to catch up.
I would love it if any company could call their HD TVs retina displays
Why would you love it? "Retina" display is a misnomer -- unless the device is a fixed distance from my eyes and geared to my eyes specifically, it is horse shit to call it a retina display. It is ignorant marketbabble that lowers us all.
Oh come on. Apple’s iPhone and iPad barely hit the mark there. They are overshooting nothing.
It’s a pretty good approximate term that, to my mind, works perfectly well. Sure, things change depending on viewing distance, but I guess nerds will have to survive a term that’s not always exact. The horror!
That was an uninformed rant on your part. What are you suggesting as an alternative? High-PPI certainly does not work.
Apple’s iPhone and iPad barely hit the mark there.
Do you actually believe that the magical "retina" mark just coincidentally happened to be 2x Apple's original resolutions? How convenient!
It’s a pretty good approximate term
It's a marketing term that the stupid embrace. Is a 64Kbps mp3 "eardrum audio" in a standard room with a fan? Is 128Kbps eardrum audio on a crummy mp3 player?
That was an uninformed rant on your part
What was uninformed about it? Desperately curious to hear what.
Wouldn't that mean that the device would first download all of the low-res images, and then, once the script has loaded and run, start loading all of the high-res images? How's the user experience with all of those additional requests and traffic? (especially via GSM/CDMA)
The user experience when viewing webpages is improved by actually showing a lower-resolution image while the crisp high-resolution image is being downloaded (however slowly), similar to progressive JPEGs.
Depending on the data contract, the user experience when viewing their mobile phone bills might be affected negatively though.
The iPhone 4 and 4S also "lie". It's not really a lie, though. The displays are intended to have higher pixel density, not more screen real estate. By reporting the "incorrect" size, websites are actually sized consistently across all generations of iPhones and iPads, but the retina devices render the same content with higher resolution.
PS: you may wish to add a "(-webkit-min-device-pixel-ratio: 2)" clause to your media queries if you want to target retina devices.
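If you need the same check from script rather than in a stylesheet, window.matchMedia (a real API) can evaluate the query directly:

    // True on double-density screens; the min-resolution clause covers non-WebKit browsers.
    const isRetina =
      window.devicePixelRatio > 1 ||
      window.matchMedia('(-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi)').matches;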
Indeed it would. Certainly less than ideal, but this provides an avenue to easily implement higher DPI graphics. Right now, it seems like the easiest method to get a site upgraded without having to make a ton of changes.
Ideally, we'd have something that would deliver only the correct image to the correct device, the first time. I haven't seen a clean method for achieving that yet.
Agree, I think the correct solution is to actually have a request filter on the server that correctly maps the img tag source to a high DPI version. Obviously, this would be a framework or server specific implementation, but this is the only correct way.
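A sketch of that kind of filter, assuming an Express-style server and a cookie that a line of client script sets on first load (the cookie name, the @2x suffix and the paths are all placeholder conventions, and the very first request won't have the cookie yet):

    // Client side (runs early): record the pixel ratio in a cookie.
    //   document.cookie = 'dpr=' + (window.devicePixelRatio || 1) + '; path=/';

    // Server side: rewrite image requests to an @2x file when one exists.
    const express = require('express');
    const fs = require('fs');
    const app = express();

    app.use('/images', (req, res, next) => {
      const match = (req.headers.cookie || '').match(/dpr=([\d.]+)/);
      const dpr = match ? parseFloat(match[1]) : 1;
      if (dpr > 1) {
        const hiRes = req.url.replace(/(\.\w+)$/, '@2x$1');
        if (fs.existsSync('./images' + hiRes)) req.url = hiRes; // serve the 2x variant instead
      }
      next();
    });

    app.use('/images', express.static('./images'));
    app.listen(8080);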
I could see how that's problematic for in page img tags, but for background images, couldn't one just use media queries to grab the higher resolution image?
foresight.js looks more interesting since it takes the perceived connection speed into account when deciding to serve up 2x images. https://github.com/adamdbradley/foresight.js
Wow, the background on that page takes forever to fill in on my iPhone 4. Like over a minute. And the whole Mobile Safari hangs for 30 seconds or so if I switch orientation, I think because it has to resize the massive background image to the changing screen width and the processor isn't up to it.
So I would be cautious of using this technique with huge images.
Yes, despite being quite small (878kb) the @2x background takes a while to load. We wouldn't recommend having a 2800×1867 retina background on your site, but for the sake of the demo we wanted to do some showing off.
Agreed! Potentially, you could be causing greater data charges/slower loading just because you have an iPad 3... Anyone know if there's a setting to turn these image loads off on an iPad 3?
While this script is fine, it adds an extra HTTP request per image to check for a retina version of the image. Also, the browser will start downloading the small version before it loads the big version (adding another extra http request). Check this easy jQuery snippet that does the same thing, but avoids these issues:
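(The snippet itself isn't reproduced in this thread; going by the objections in the reply below, it was presumably something of this shape - the attribute names are guesses, not the original code:)

    // Guessed reconstruction: choose the 1x or 2x asset up front, so only one
    // request is ever made per image. Relies on src-less <img> tags that carry
    // explicit width/height attributes plus two extra data attributes.
    $(function () {
      var dpr = window.devicePixelRatio || 1;
      $('img[data-src]').each(function () {
        var $img = $(this);
        var hi = $img.attr('data-src-2x');
        var lo = $img.attr('data-src');
        $img.attr('src', dpr > 1 && hi ? hi : lo);
      });
    });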
There are several problems with this implementation.
1. The images don't have a src attribute. This means that if the script fails for any reason, none of the images on your site load.
2. You need to have width and height attributes on all of your images. This is a nightmare for both dynamically generated content and for responsive designs.
3. You need to have those two extra attributes on each of your image elements in order for it to work.
Our goal was to make retina.js zero-config. We wanted it to work on existing sites without any changes to the existing markup.
By introducing Retina displays and graphics, Apple has unknowingly disrupted the whole display industry. Suddenly all of us are noticing the pixels on our laptop screens and HDTVs, which have been there all along. I believe this will thrust the internet into a new era of HD content, and computer and TV makers will soon jump into making high-pixel-density displays. It makes sense: until now it was all about resolution, and the larger the resolution, the smaller the icons and windows in the OS. Apple's retina implementation has changed that - you still see things at the same size and aspect ratio, but with four times the sharpness we were used to, which makes the whole experience of consuming content that much better. Kudos to Apple for this. I hope the rest of the web and the display manufacturers are thinking along the same lines.
The main place that pixels were noticeable was in an Apple store. Next to the iPhone 4, the older iPads looked much less sharp, at least for text (particularly Computer Modern). There have been lots of attempts at resolution independence, and I'm sure Windows and various Linux GUIs have hooks for such, but they've not been tested much because the hardware was generally in the 96-120 ppi range. There's no need for HDTVs to change.
I haven't used the code, and I'm sure it works well. However, we've been seeing a lot of unnecessary library usage - a lot of jQuery-based things that really don't need to be, for example. Looking at the .coffee files here, I think the JS they produce would be very simple. Why not just write the JS? Is there really a benefit to CoffeeScript in this situation?
I like the idea; however, there was some discussion that Safari on the new iPad downsamples images even if they are at retina resolution. I have not looked at the code, but I hope this issue is addressed.
This appears to be "as designed" in Safari. The only workaround I'm currently aware of is to save images as progressive JPEGs, which gets around the limitation imposed by Apple.