Hacker News

I think Core Web Vitals were incorporated into their approach to prepare for the retirement of AMP. Pushing publishers towards industry-wide best practices is a good thing. Forcing AMP as a solution really wasn't.



I'm not super happy with the Web Vitals, either. They seem to be pushing the Web toward deploying a lot of otherwise-unnecessary JavaScript cleverness that ultimately presents itself to me in the form of a generally more obnoxious experience. Older and non-Chrome browsers may have difficulty rendering it. Total bandwidth consumption can go up, because it's A-OK to load a bunch of enormous resources later, just as long as it doesn't happen on initial page load, or take longer than 4 seconds to happen, as measured by the company with the fastest Internet connection in the world.

Long story short, I don't see it as best practices that serve Internet users as a whole. They seem to be more closely tailored to the interests of Google. And, by extension, the subset of netizens from which they can generate the most revenue.


I just looked into the Search Console for the first time in forever, and I'm not sure how Core Web Vitals would be pushing for unnecessary JavaScript. A website consisting of 100% static content seems to be fully okay according to their metrics - but funnily enough, some months ago they claimed that 50% of the pages were served too slowly on computers, while 100% of pages were OK on mobile. Same static server. Now it's the other way around: they randomly say that a random selection of pages is too slow on mobile, and it changes every day. Maybe it's just their own connection that sucks? It also doesn't help that they keep using acronyms for the broken stuff that aren't explained anywhere on the page itself.


> Maybe it's just their own connection that sucks?

Your Core Web Vitals report in Search Console is based on Chrome User Experience Report data. Meaning that this is data from your real users, not Google running simulated tests of your pages on their own servers. I.e. when someone loads your page in Chrome, Chrome reports back how long it actually took the page to load for that user (it doesn't happen with all users; they have to meet various opt-in criteria [1]). So, if you see that 50% of pages are served too slowly on computers, it means that 50% of your real users actually experienced slow page loads (as measured by the Web Vitals metrics). Perhaps your static site isn't as efficient as you think, or your server is slow, or your users' devices and connections are much worse than you assumed. That's the power of this data: it shows you that the real-world experience isn't as great as you're assuming and encourages you to investigate further.

(For the record) The landing page of the Core Web Vitals report does indicate where the data is coming from. Next to "Source: Chrome UX report" you see a question mark. If you hover over that question mark then click the "Learn more" link it takes you to this page: https://support.google.com/webmasters/answer/9205520?ref_top...

Disclosure: Googler working on https://web.dev. I'm not on the Web Vitals team but interact with them.

[1] https://developers.google.com/web/tools/chrome-user-experien...
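(As background for the "served too slowly" buckets: per the public Web Vitals docs, a metric passes when its 75th-percentile field value is within the published "good" threshold, and fails when it exceeds the "poor" threshold. A minimal sketch of that bucketing - the function and dictionary names here are my own, not from any Google API:)

```python
# Published Core Web Vitals thresholds, applied to the p75 of field data.
# (good_max, poor_min) per metric; names and structure are illustrative.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "FID": (0.1, 0.3),   # First Input Delay, seconds
    "CLS": (0.1, 0.25),  # Cumulative Layout Shift, unitless score
}

def classify(metric: str, p75_value: float) -> str:
    """Bucket a p75 field value the way the Core Web Vitals report does."""
    good_max, poor_min = THRESHOLDS[metric]
    if p75_value <= good_max:
        return "good"
    if p75_value <= poor_min:
        return "needs improvement"
    return "poor"
```

So a page whose real users see a p75 LCP of 3.0s lands in "needs improvement" even though many individual loads were fast - the bucketing is driven entirely by the field distribution.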


It's how the combination of the three encourages you to do things if you want to have a site with rich content. You're supposed to paint the page ASAP, so you don't want to defer loading any large content, but then you're not supposed to have the page layout shift around at all as you dynamically load all that content later, so you've got to do clever things with placeholders and swapping out content and whatnot. You've got up to 4 seconds to load all that stuff, which is enough to load an enormous amount of data over a fast internet connection, so much so that the same amount of content might take minutes to load over a slower connection. Fortunately, they've chosen methods for measuring that metric that are heavily biased toward measuring the experience of people who have 24/7 access to broadband.

So, yeah, Google may want to encourage a nice Web experience, but they don't want to back this with metrics that might discourage people from sending too much business in AdSense's direction, or fail to favor Chrome over alternative browsers.
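(For reference, the "layout shift around" part of the parent comment is scored by CLS: each unexpected shift gets a score equal to an impact fraction - the share of the viewport affected - times a distance fraction - how far things moved relative to the viewport - and the scores accumulate. A rough sketch, assuming those fractions have already been computed by the browser; the function names are mine:)

```python
def shift_score(impact_fraction: float, distance_fraction: float) -> float:
    """Layout shift score: impact fraction (0-1, share of viewport
    affected) times distance fraction (0-1, move distance relative
    to the viewport)."""
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts) -> float:
    """CLS accumulates the scores of all unexpected shifts,
    given as (impact_fraction, distance_fraction) pairs."""
    return sum(shift_score(i, d) for i, d in shifts)
```

This is why placeholders matter: a late-loading hero image that pushes half the viewport down a quarter of its height contributes 0.5 * 0.25 = 0.125 all by itself, already past the 0.1 "good" threshold.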


Whilst there is an element of lab measurement involved, they do use field measurement, so metrics are collated from users rather than their own connection. This means that your data could just as easily be skewed by a browser/OS update that rolls out to a ton of devices at once, as much as anything at your end.

Do agree that the proliferation of acronyms doesn't help with wrapping your head around it all!


Perhaps the mobile loading time tolerances are larger, meaning that if the timings between desktop and mobile are largely the same, they could have different results.


The existence of first-contentful-paint, as well as the speed index, would suggest they do care to prioritize these factors as well, rather than simply deferring everything until after load.

Both of those metrics account for the visual completion of the page relative to its final appearance -- deferred resources would slow that.


As someone who has tried to improve his PageSpeed Insights score for a few days I kind of agree. I also think their metrics scores are harsh.

At least they link to a lot of docs and advice.

Also funnily enough they complain about Google Analytics.


To me, AMP was just Google trying to turn the web itself into its own walled garden (after many failed social media attempts).

And, as I read in the article, it looks like behind this move there is some current "antitrust pressure", plus publishers quite pissed off about losing both control and revenue (as much as 39% fewer conversions, they say).

Clearly AMP figured far more prominently in Google's master plan than as a mere web-performance palliative.

[Edit]

This was the article linked to this story when I commented (now changed to some Google Dev docs):

https://themarkup.org/google-the-giant/2020/11/19/as-antitru...


I never saw any convincing argument that AMP helped push a walled garden, at least not in any commonly accepted sense of that term.

It was open source, it was used by many others including competitors, it was optional and it didn't block access from anyone. Having an AMP version in no way "locked" you to any garden, AMP versions aren't even meant to be the canonical page anyways.

It may have had a lot of issues, but "walled garden" would not be one of them.


All of the antitrust case is about how Google pushed its user searches to its own properties.

And it was as "optional" as something can be when publishers are almost forced to jump on it to stay relevant in news SEO.


Exactly. A walled garden would be requiring publishers to send articles directly to Google if they want to enable instant loading (like Apple News). Instead, publishers publish their articles publicly, in a way that Google's competitors can and do consume.


The carousel setup it enabled was certainly a (soft) walled garden. It hijacked the top portion of a publisher's page, the back button, and swipe actions, resulting in more time spent on Google.


I always viewed AMP as being primarily for Google’s data collection interests, not primarily lock-in.


Is showing full screen images on desktop an industry-wide best practice? How about the faux navigation bars meant to resemble browser chrome?

This "Google was just forcing publishers to fix their pages" meme desperately needs to die. Just consider all the extra standards crap they were pushing to introduce to perfect the deception. This was, as always, about owning the data.


Except... they didn't own the data? AMP was conceived because Google was worried about everything moving into Facebook's walled garden. That's what it was competing against.

Google doesn't need to own the data because Google is the world's gateway to the open web. They don't care who owns the data as long as they can crawl it.


It sounds like AMP will stick around, serving as the easiest way to guarantee a great Core Vitals score (because it's so locked down), but that you're welcome to find other ways to get a great score yourself. Which is how it should have always been.


It's not Google's job to push 'industry-wide best practices' to third parties.


> It's not Google's job to push 'industry-wide best practices' to third parties.

It's absolutely Google’s job to decide what quality signals to incorporate into their search rankings.


I'd argue it's their job to index the internet, not to dictate how people build their web pages in order to determine the order they're displayed in. They want their crawler to be faster and to capture market share/screen time.


Right, but when I search "cookie recipe", there are millions of pages that match in the index. Yes, popularity is a good starting point, but even that has its limits. All things equal, I'd much rather have a page that loads in 1s than one that loads in 10.


I'd also much rather have a page that brings me to a cookie recipe when I search "cookie recipe", but Google ranks pages with tons of extraneous text, images, and ads above your bare-bones recipe site. This has given rise to an immense amount of blog spam on Google, where you have to scroll through paragraphs and paragraphs of inane content created by an army of underpaid writers just to get to the relevant information you searched for.


I'd honestly be curious to see a screenshot of your search result for "cookie recipe" because that's definitely not my experience.


My top result is a blog format recipe — recipe at end of page, though this one has a “jump to” button.

https://joyfoodsunshine.com/the-most-amazing-chocolate-chip-...

The second recipe result is Betty Crocker and as you’d expect — recipe at top, steps with photos after.

I personally find more of the former than the latter when looking for specific recipes (Red wine chocolate cake was my most recent search)


They don't dictate anything - I am perfectly free to try to set up "the last page on the internet" which will be penalized in rank compared to anything else.

They would indeed prefer to be utterly ignored by designers over being gamed, but SEO remains something people optimize for, like clickbait titles and headlines.


When Microsoft was the dominant OS, they enabled ever-harder-to-ignore auto updates because without them, deployments of their OS could be used to harm other people's systems through remote attack exploits.

Google is also at a scale where they can improve the quality of everyone's web experience. It's not so much "their job" as "their obligation."


Microsoft enabled harder-to-ignore auto updates because users didn't want to update. If a user doesn't want to update, they shouldn't have to; it's their computer. It's not their OS being used to harm users, it's users making their own decisions and accepting the associated consequences. Google should not be trying to force people to obey web standards; they should let users make their own decisions.


> If a user doesn't want to update, they shouldn't have to, as it's their computer.

And then they plug that computer into a global multi-user network and their machine is botnetted and used to harm other users. In that context, people are no longer making simple decisions and accepting the consequences; a tragedy of the commons is instead created.

Your thinking works when computers are isolated from each other. When they're not, it's in the same category as "states require annual vehicle inspections." Because when you're sharing the road with other drivers, you owe it to them that your vehicle is unlikely to undergo catastrophic rapid disassembly.


I completely disagree. It's my website, my servers, my bandwidth I pay for. Google has zero right to dictate how others go about creating their own product. They can enforce web standards, sure, but AMP is horrific, and I'm sure the next-gen AMP will also be counterproductive for third parties.


By that reasoning: it's their search engine, their crawlers, their business to route people to the websites likeliest to be useful to those users.

In that sense, their approach is in some way more equitable than Microsoft's: they're not forcing change upon your system by way of mandatory updates, they're simply saying that if you don't play the same game they play, they're unwilling to do business with you.

If you're free to maintain your server to your standards, why should they not be free to maintain their search service to their standards?


Is Google Search there to provide the most accurate information, or the fastest-loading information regardless of content? "The primary goal of Google is to provide users with the most relevant, highest quality results based on user search queries, i.e. their wants and needs when performing a search online."


These are not mutually exclusive. And no, "most accurate" isn't really guaranteed; their goal is basically a closed feedback loop: they have signal on whether people liked the result they were given, and that up-ranks that result for searches in similar contexts in the future.

Speed is valuable because it allows users to more quickly digest whether the result is relevant to them.


That is where I also disagree. Giving a higher rank to an AMP page that people like vs. a page with more accurate information seems like an issue to me. People are largely not smart enough to discern an accurate answer (look at how badly society wants to regulate FB, Twitter, etc. for posting information that does not pass 'fact checks'). At the end of the day it doesn't matter; these decisions are made by what drives advertising dollars, not relevancy or accuracy. Today, faster browsing and crawling equates to faster advertisement display.


I don't think a search engine is the tool you seek; you're looking for an expert-opinionated resource for sifting data, not an automated system to retrieve some data on arbitrary topics.


It's not their job, but it's certainly in their interest (since it gives their users a better experience), and it's not illegal, so it's understandable that they do.


It looks like it's no one's job, though. And the incentives for those third parties are even worse aligned with user interests, in many cases, than Google's incentives.

And I say this as someone that thinks that Google is the scariest company out there, right now.


True but it is in their interest. Google benefits when the web is fast and full of content that can be indexed and worth surfacing.


It is their job to turn their power into money, and AMP is just a power play. It is what tech companies do - if we don't like it, something about the fundamentals has to change.

My only complaint is the smirky, chipper PR front they package with it. Microsoft, at least, wasn't too insecure to show some fang without the Candyland faux-earnest horse shit when they were king.



