
Not in the slightest. Speaking categorically, JSON + templates are smaller to transfer over the wire than fully-rendered HTML -- and can be cached in exactly the same way ... even bootstrapped into a single HTTP request.

If you care about optimizing it, you can get your JS UI to render faster than the equivalent pre-rendered HTML would have.
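To make that concrete, here's a rough sketch of the idea (the endpoint and data shape are made up): the server sends a small JSON payload, and a template that's already cached on the client turns it into markup.

    // Hypothetical endpoint and data shape, for illustration only.
    var template = function (item) {
      return '<li>' + item.title + ' - ' + item.price + '</li>';
    };

    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/products.json'); // the JSON is far smaller than the rendered page
    xhr.onload = function () {
      var items = JSON.parse(xhr.responseText);
      document.getElementById('products').innerHTML = items.map(template).join('');
    };
    xhr.send();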




I'm skeptical, but even so, I still need to render it on the server for Googlebot.


You can make your AJAX content crawler-friendly by following a specification Google currently supports. I'm not sure how much of this spec the other search engines support, but it's a good start: https://developers.google.com/webmasters/ajax-crawling/


You still need to generate server-side HTML with this approach.


That's correct. You basically end up serving your content to crawlers dressed up as plain HTML. It's not conceptually difficult, but it does take real work. We already do something similar for blogs, where we serve alternate content to RSS readers -- an apples-and-oranges comparison, but the idea is more or less the same. The takeaway is that there is a way to serve alternate content to search engines, and it involves following Google's AJAX crawling scheme.
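Concretely, the scheme works like this: the crawler takes a #! URL and re-requests it with an _escaped_fragment_ query parameter, and your server answers that request with a pre-rendered HTML snapshot. A minimal sketch, assuming an Express-style Node server (renderSnapshot is a hypothetical helper that produces the static HTML):

    var express = require('express');
    var app = express();

    app.get('/', function (req, res) {
      var fragment = req.query._escaped_fragment_;
      if (fragment !== undefined) {
        // Crawler: serve server-rendered HTML for this state of the app.
        res.send(renderSnapshot(fragment)); // hypothetical helper
      } else {
        // Regular browser: serve the JS app shell and render on the client.
        res.sendFile(__dirname + '/index.html');
      }
    });

    app.listen(3000);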


Don't you still have to render server side for crawlers and screen readers (accessibility)?


Many of these client-side JS-heavy apps are private "workspace"-like pages ... nothing that would ever need to be indexed by the Googlebot.

If you want to serve screen readers, then yes, you need to make sure you're serving only reader-compatible HTML and JavaScript.


Yeah, that makes sense.


Nope. Screen readers have been able to handle JavaScript-generated content for years. According to WebAIM's most recent screen-reader survey, 98.4% of screen reader users have JavaScript enabled: http://webaim.org/projects/screenreadersurvey3/#javascript


Crawlers, less so. The major search engines are getting smarter about it, but for now, you still need some kind of HTML output from the server if you really want to be indexed properly.

It doesn't rule out a pure client-side app at all, but it does involve some extra work to output HTML from the server. Which is why NodeJS will ride on the coat-tails of this approach: the same JavaScript templates and rendering code can run on both the server and the client, so there's less redundancy.
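A sketch of what "less redundancy" looks like in practice (the module layout and data shape are made up): the same template function gets required by the Node server for crawler and first-load rendering, and shipped to the browser for everything after that.

    // render-list.js - shared between server and client (hypothetical module).
    function renderList(items) {
      return '<ul>' + items.map(function (item) {
        return '<li>' + item.title + '</li>';
      }).join('') + '</ul>';
    }

    if (typeof module !== 'undefined' && module.exports) {
      module.exports = renderList;    // Node: res.send(renderList(data))
    } else {
      window.renderList = renderList; // browser: el.innerHTML = renderList(data)
    }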


Hah! "The major search engines" - you mean Google, right? ;)


How effective is JavaScript with screen readers, though? It's been a while since I played with a screen reader (late 2009 or so), but when I did, even really simple things like a pop-up div confused it. It didn't inform the user that something new was being rendered or anything.


Surprisingly effective. A colleague of mine published some techniques to add consistency to JavaScript events across screen readers: https://github.com/ryanfitzer/Accessibility
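For the pop-up case specifically, the usual trick (not necessarily what that repo does) is an ARIA live region, so the screen reader announces content that appears after page load. A rough sketch, with made-up element ids:

    // A polite live region: screen readers read whatever text lands in it.
    var status = document.createElement('div');
    status.setAttribute('role', 'status');
    status.setAttribute('aria-live', 'polite');
    document.body.appendChild(status);

    function showPopup(message) {
      var popup = document.getElementById('popup'); // hypothetical pop-up div
      popup.textContent = message;
      popup.style.display = 'block';
      status.textContent = message; // the live region is what actually gets announced
    }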



