
You're comparing only the densest places (i.e. expensive places with mostly apartments) to only the least dense places (giant houses on big lots). Look at a pre-1950s / railroad neighborhood of basically any smaller or older city in the US and you'll often find a walkable, affordable neighborhood of single-family homes.

Also - cheap land does not mean that it's cheap to build and maintain. Almost all of the infrastructure required to build (sewer, power, roads, etc.) scales up with area, not density.


Where are you?

In NYC and SF, you'd never fill your contract if you were trying to pay < 150.


I was a freelancer in Maryland and then worked at a company in LA.


> And really those issues should be solved at the browser level, not the OS level that affects every single application that runs on it.

They aren't -- this is exploitable in the browser if not patched at the OS level.

> For a drive-by exploit to work (assuming there is one, just because a site is "shady" it doesn't mean it will be 100% sure that it will try to infect your computer with something) it will need to make a TON of assumptions about your setup

If you run these on an ad network, you get access to millions of different setups - you don't need to make any assumptions; you're virtually guaranteed to find someone with a vulnerable setup.


Yes, but the chances are very low that the someone is you if you're using a recent browser version (I'd say not cutting edge, but recent). Probably far, far lower if you use uBO or similar. On Linux, at home, they're probably infinitesimal unless you're being targeted.


Mostly FUD. It's not really exploitable in a practical real world sense. Show me the exploit that can read my password or SSH key, and not some fixed set of data that's been staged by the PoC.


Every license has different goals - that's the whole point of having different licenses.

The goal of this license is to encourage more companies to contribute to open source, not to encourage usage.


> It is not a requirement for AMP. CDNs now let you roll your own domains on the AMP standard

All these certificates do is make it so Google's browser (and only Google's browser) will mask the fact you're on Google's domains if you sign the file a certain way.

If anything, this shows more anti-competitive practices -- they're adding features into their browser that specifically benefit features of their search engine.


That's not true. CDNs also use their own non-Google domains and infrastructure for AMP hosting:

https://amp.cloudflare.com/


Effectively 0 AMP sites are using anything other than Google's CDN.



Yes, sites just host the original copy submitted to Google. You can see all the resources are loaded from https://cdn.ampproject.org

If you visit the page from search results (which is the only place it would be linked) then it would never leave Google's domain.

Here's the actual URL used from search results: https://amp-businessinsider-com.cdn.ampproject.org/v/s/amp.b...
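
For anyone curious, that cache hostname is just a mechanical transform of the publisher's domain. A rough sketch (simplified; the real mapping also handles punycode/IDN domains and length limits, and the function name here is mine):

    // Simplified AMP cache subdomain mapping: existing dashes are doubled, dots become dashes.
    function ampCacheHost(publisherDomain) {
      var subdomain = publisherDomain
        .replace(/-/g, '--')
        .replace(/\./g, '-');
      return subdomain + '.cdn.ampproject.org';
    }

    ampCacheHost('amp.businessinsider.com');
    // => "amp-businessinsider-com.cdn.ampproject.org"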


But as long as it's possible it doesn't qualify as lock-in.


You don't need lock-in to be anti-competitive. The requirement of extra work to implement AMP to get that higher search results page placement is the issue.


At what point does pushing for new technologies as a private entity become anti-competitive rather than simply moving technology forward?

If the criterion is just "needs extra work", then unfortunately almost nothing can change and we're all going to live with the existing technology. Change inherently has friction and requires "extra work", with the hope that it's an investment which provides returns in the long term.

In other words, say you are a large Internet company that is trying to improve web page loading times. You profile why most web pages are slow and identify issues. You publicly report on those issues and develop guidelines and criteria. Nobody bothers because "extra work". You develop new technology that directly addresses those issues; it works within the existing environment but requires both client and server support to be most effective. Do you think anyone cares? No, because of "extra work".

That's why there need to be incentives. Now you have a "penalty" for not doing that "extra work". You can file it under "it's anti-competitive" (maybe it is), but if you do the "extra work" then suddenly the anti-competitive part works for you, not against you. IMO that's why it's not anti-competitive.

Other examples: why do you think there were so many people who complained when the iPhone shipped without Flash support? "Extra work". Similarly when it removed the audio jack. Change is friction and friction is extra work. But most of the time that's not anti-competitive...


Let me explain based on my 15 years of adtech experience:

HTML is already fast (see HN for an example). HTML is already universal across devices and browsers. HTML is already published and easily cached for billions of pieces of content.

AMP is a fork of HTML that only targets mobile browsers, specifically for links from Google search results. It's useless on its own, but AMP is required to get higher placement on search results pages, so publishers are effectively forced to spend technical resources to output an entirely new format just to maintain that ranking.

If Google wanted faster pages, it could do what it always does and incentivize that behavior by ranking results based on loading speed. These signals are already collected and available in your Google webmaster console. There's nothing new to build, just a tweak to the ranking calculation based on existing data. Sites would get faster overnight, and they would be faster for every single user because HTML is universal.

Do you know why they didn't do that? Because it's the ads and tracking that are slow, not the HTML. Google's Doubleclick and Google Analytics are the biggest ad server and tracking systems used on the web. This entire AMP project was created to circumvent their own slow systems. It creates more work for publishers while increasing data collection by running a "free" CDN that never leaves a Google-owned domain and thereby always supports first-party cookies. It's a perfect solution to protect against anti-tracking browsers, and it's why Chrome will now also block 3rd-party cookies: it won't affect AMP hosted on Google domains.


This makes sense. With all that and browser fingerprinting and accounts and "other" mechanisms, do they even need cookies anymore?


First-party storage won't be affected without some major AI tech in browsers, so cookies are still the best deterministic connection, especially since most people are already logged into a Google service (Gmail, Chrome, Android, YouTube, etc.).

Probabilistic techniques are used for anonymous users or environments like iOS Safari that are very strict.


So the user sees your URL, you're getting the revenue from the ads that are shown, sharing will share your URL, your statistics work flawlessly.

In other words: if it behaves exactly as a page hosted on your site (just faster), why do you care?

I'm getting the impression that HN users care a whole lot about seeing the request in the nginx log they are tailing.


Well, as a user, I care about not announcing loudly to Google every single step I take on the web.


Then why are you searching on Google? That's where you would see an AMP page served from a Google AMP cache. If you searched on Bing, you would get AMP pages served from a Bing AMP cache instead.


In the past year, I've increasingly often seen AMP pages linked from all sorts of places (Reddit, FB, here, etc.) besides Google's search results.


AMP pages hosted by the publisher, Google's AMP cache, Bing's AMP cache, or some other company's AMP cache? GGP was complaining about sending any information to Google. Only one of those options does so.


I am obviously not.


It's not always faster. There are plenty of performance and usability issues with AMP pages, not to mention all the extra development effort needed to maintain a different version of the site just for a few mobile browsers.


It's anticompetitive af.


Passwords are a terrible form of authentication anyway -- if it's something that actually matters, use some form of 2-factor auth.

Requiring a long password on a site where the impact of a breach is minimal is not a good policy; you're just going to end up with people who can't ever log in.


https://nypost.com/2018/08/25/why-nyc-is-priciest-city-in-th...

Construction in NYC is far more expensive than anywhere else in the world, largely because of union contracts.


And every time a developer sees that, they're going to google "bar javascript" "single bar javascript" "bitwise or javascript" "bitwise or javascript effect" until they figure out WTF is going on.

All this to save a fraction of a microsecond on an operation that's called 60 times on the page.
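
For anyone unfamiliar, here's a minimal sketch of the kind of single-bar idiom being discussed (assuming the article used "| 0" to truncate numbers, which is the usual reason people end up googling it):

    var x = 4.7;

    Math.trunc(x); // 4 -- the readable way
    x | 0;         // 4 -- bitwise OR coerces to a 32-bit integer, dropping the fraction

    // The trick silently breaks outside the 32-bit range:
    2147483648.5 | 0; // -2147483648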


Actually only 52 :D

But yeah, I agree about the googling part: I remember having googled "tilde javascript" and "pipe javascript"... :)

Tilde is useful with indexOf:

    if (~array.indexOf(item)) {}

...which is equivalent to:

    if (array.indexOf(item) > -1) {}
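
If you can assume an ES2016+ environment, there's a clearer option that avoids the bitwise trick entirely:

    if (array.includes(item)) {}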


This is the sort of thing that makes Perl readable in comparison to Javascript :)


I prefer using IDEs to navigate larger projects, and you get better code completion (I use Visual Studio for C and PHPStorm for PHP). I just have a hotkey to open up Vim at the current line/file -- I prefer Vim for heavy text editing and heavy reformatting.


"Very rarely" being every single release? You should not be using a changing library in any production code; you're just asking for problems.

If all you're using jQuery for is a document.querySelectorAll polyfill + the most primitive event handling, you're possibly safe, as those don't change terribly often...but at that point, you should probably be using a different library or a custom build of jQuery.
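
For what it's worth, here's a minimal sketch of what that subset looks like in plain DOM APIs, assuming modern browsers only (the ".item" selector and the handler body are placeholders):

    // querySelectorAll + primitive event handling without jQuery:
    document.querySelectorAll('.item').forEach(function (el) {
      el.addEventListener('click', function (event) {
        event.preventDefault();
        // ...handle the click...
      });
    });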


Unfortunately, it's the people least qualified to assess the safety of using an always-changing version who are most likely to do so.

