You're comparing only the densest places (i.e. expensive places with mostly apartments) to only the least dense places (giant houses on big lots). Look at a pre-1950s / railroad-era neighborhood of basically any smaller/older city in the US and you'll often find a walkable, affordable neighborhood of single-family homes.
Also - cheap land does not mean that it's cheap to build and maintain. Almost all of the infrastructure required to build (sewer, power, roads, etc.) scales with area, not density.
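To make the "scales with area" point concrete, here's a back-of-the-envelope sketch - the lot frontages and per-foot cost are made-up, purely illustrative numbers:

    // Rough per-home cost of linear infrastructure (road + sewer + power
    // run along the street) at two lot widths. All numbers are hypothetical.
    const COST_PER_STREET_FOOT = 500; // assumed combined cost per foot of frontage

    function perHomeCost(lotFrontageFeet) {
      // Each home effectively "pays for" its own frontage of street infrastructure.
      return lotFrontageFeet * COST_PER_STREET_FOOT;
    }

    console.log(perHomeCost(150)); // big exurban lot:    75000 per home
    console.log(perHomeCost(40));  // pre-1950s city lot: 20000 per home

Same pipes and pavement per foot either way - the low-density layout just needs a lot more feet of it per home.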
> And really those issues should be solved at the browser level, not at the OS level that affects every single application that runs on it.
They aren't -- this is exploitable in the browser if not patched at the OS level.
> For a drive-by exploit to work (assuming there is one - just because a site is "shady" doesn't mean it's 100% certain to try to infect your computer with something), it will need to make a TON of assumptions about your setup
If you run these on an ad network, you get access to millions of different setups - you don't need to make any assumptions; you're virtually guaranteed to find someone with a vulnerable setup.
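Back-of-the-envelope, with both numbers made up purely for illustration:

    // Expected number of vulnerable visitors reached through an ad buy.
    const impressions = 10000000;  // 10 million ad impressions served
    const vulnerableRate = 0.001;  // assume 0.1% of visitors run an exploitable setup

    console.log(impressions * vulnerableRate); // 10000 expected vulnerable visitors

Even a tiny vulnerability rate turns into thousands of hits at ad-network scale.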
Yes, but the chances that that someone is you are very low if you're using a recent browser version (not necessarily cutting edge, but recent). Probably far, far lower if you use uBO or similar. On Linux, at home, they're probably infinitesimal unless you're being targeted.
Mostly FUD. It's not really exploitable in a practical, real-world sense. Show me the exploit that can read my password or SSH key, and not just some fixed set of data that's been staged by the PoC.
> It is not a requirement for AMP. CDNs now let you roll your own domains on the AMP standard
All these certificates do is make it so Google's browser (and only Google's browser) will mask the fact you're on Google's domains if you sign the file a certain way.
If anything, this shows more anti-competitive practices -- they're adding features to their browser that specifically benefit features of their search engine.
You don't need lock-in to be anti-competitive. The issue is the extra work required to implement AMP just to get that higher placement on the search results page.
At what point does pushing for new technologies as a private entity become anti-competitive rather than just moving technology forward?
If the criterion is just "needs extra work", then unfortunately almost nothing can change and we're all going to live with the existing technology forever. Change inherently has friction and requires "extra work", with the hope that it's an investment that provides returns long term.
In other words, say you are a large Internet company that is trying to improve web page loading times. You profile why most web pages are slow and identify issues. You publicly report on those issues and develop guidelines and criteria. Nobody bothers because "extra work". You develop new technology that directly addresses those issues, this technology works within the existing environment but it requires both client and server support to be most effective. Do you think anyone cares? No, because of "extra work". That's why there needs to be incentives. Now you have a "penalty" for not doing that "extra work". You can file it under "it's anti-competitive" (maybe it is) but if you do the "extra work" then suddenly the anti-competitive part works for you, not against you. IMO that's why it's not anti-competitive.
Other examples: why do you think so many people complained when the iPhone shipped without Flash support? "Extra work." Similarly when it removed the headphone jack. Change is friction and friction is extra work. But most of the time that's not anti-competitive...
Let me explain based on my 15 years of adtech experience:
HTML is already fast (see HN for an example). HTML is already universal across devices and browsers. HTML is already published and easily cached for billions of pieces of content.
AMP is a fork of HTML that only targets mobile browsers, specifically when linked from Google search results. It's useless on its own, but AMP is required to get higher placement on search results pages, so publishers are effectively forced to spend technical resources outputting an entirely new format just to maintain that ranking.
If Google wanted faster pages, it could do what it always does and incentivize that behavior by ranking results based on loading speed. These signals are already collected and available in your Google webmaster console - there's nothing new to build, just a tweak to the ranking calculation based on existing data. Sites would get faster overnight, and they would be faster for every single user, because HTML is universal.
Do you know why they didn't do that? Because it's the ads and tracking that are slow, not the HTML. Google's DoubleClick and Google Analytics are the biggest ad-serving and tracking systems on the web. The entire AMP project exists to work around Google's own slow systems. It creates more work for publishers while increasing data collection, by running a "free" CDN that never leaves a Google-owned domain and therefore always works with first-party cookies. It's a perfect hedge against anti-tracking browsers, and it's why Chrome will now also block 3rd-party cookies: it won't affect AMP hosted on Google domains.
First-party storage won't be affected without some major AI tech in browsers, so cookies are still the best deterministic connection, especially since most people are already logged into a Google service (Gmail, Chrome, Android, YouTube, etc.).
Probabilistic techniques are used for anonymous users or environments like iOS Safari that are very strict.
Then why are you searching on Google? That's where you would see an AMP page served from a Google AMP cache. If you searched on Bing, you would get AMP pages served from a Bing AMP cache instead.
AMP pages hosted by the publisher, Google's AMP cache, Bing's AMP cache, or some other company's AMP cache? GGP was complaining about sending any information to Google. Only one of those options does so.
It's not always faster. There are plenty of performance and usability issues with AMP pages, not to mention all the extra development effort needed to maintain a different version of the site just for a few mobile browsers.
Passwords are a terrible form of authentication anyway -- if it's something that actually matters, use some form of 2-factor auth.
Requiring a long password on a site where the impact of a breach is minimal is not a good policy; you're just going to end up with people who can't ever log in.
And every time a developer sees that, they're going to google "bar javascript" "single bar javascript" "bitwise or javascript" "bitwise or javascript effect" until they figure out WTF is going on.
All this to save a fraction of a microsecond on an operation that's called 60 times on the page.
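For anyone who hasn't run into it, my guess at the pattern being described is the "| 0" trick - using bitwise OR with zero to truncate to a 32-bit integer instead of spelling out the intent:

    const x = 4.7;

    console.log(x | 0);          // 4 - works, but sends readers off to google "bar javascript"
    console.log(Math.trunc(x));  // 4 - says what it means
    console.log(Math.floor(x));  // 4 - also clear (differs from | 0 for negative numbers)

Same result, but only one of these makes the next developer stop and search.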
I prefer using IDEs to navigate larger projects, and you get better code completion (I use Visual Studio for C and PHPStorm for PHP). I just have a hotkey to open up Vim at the current line/file - I prefer Vim for heavy text editing and heavy reformatting.
"Very rarely" being every single release? You should not be using a constantly changing library in any production code; you're just asking for problems.
If all you're using jQuery for is a document.querySelectorAll polyfill + the most primitive event handling, you're possibly safe, as those don't change terribly often...but at that point, you should probably be using a different library or a custom build of jQuery.
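If it really is just selection plus basic event handling, here's a sketch of the native equivalent (assuming browsers new enough to have querySelectorAll and NodeList.forEach, and a made-up ".item" selector):

    // jQuery:  $('.item').on('click', handler)
    // Native:
    document.querySelectorAll('.item').forEach(function (el) {
      el.addEventListener('click', function (event) {
        console.log('clicked', event.currentTarget);
      });
    });

No build step, and no version churn to track.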