
No. Most NATs handle UDP sessions trivially as long as one end of the connection is not behind a NAT itself. Tricks like UDP hole punching are necessary only between two endpoints that are both behind NATs.


And assuming the protocol can cope with changing source ports. I've seen some UDP protocols that only work if the source port is not translated.
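A minimal sketch of that easy case (names and the port are my own, untested): a public UDP server that just replies to whatever source address/port it actually observed, which keeps working even when the NAT rewrites the client's source port.

    import java.net.{DatagramPacket, DatagramSocket}

    object UdpEchoServer extends App {
      val sock = new DatagramSocket(5000) // assumed public-facing port
      val buf  = new Array[Byte](1500)
      while (true) {
        val pkt = new DatagramPacket(buf, buf.length)
        sock.receive(pkt)
        // Reply to the NAT-translated source, never to a port the client
        // claims in-band; clients should also send periodic keepalives so
        // the NAT's mapping doesn't expire mid-session.
        sock.send(new DatagramPacket(pkt.getData, pkt.getLength, pkt.getSocketAddress))
      }
    }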


One of the authors is known for heavily cherry-picking his facts, making some pretty extreme and outrageous assumptions, and suing fellow academic critics who point this out. Caveat lector.



Oh, that guy. Suing PNAS and the author of a study critical of his work is honestly quite shocking. One would hope the work would stand on its own, or at least that the response would have been another paper rather than a lawsuit.


You find it shocking that he might have a valid case against them, or that he has money to waste on legal harassment?


It is thoroughly inappropriate to respond to legitimate criticism, made in good faith, published in a reputable journal, with a lawsuit.

It doesn't matter if you're right. If you're right, arguments can be made to show that your opponents are wrong (and he was in fact allowed a rebuttal letter in PNAS). Filing a lawsuit breaks any semblance of civil discourse and the dispassionate pursuit of truth, and leaves everyone worse off, as the discussion becomes thoroughly toxic.

It's burning down the entire house because someone disagreed with your analysis.


Whether it was appropriate depends on whether the criticism was legitimate. Determining that is the legitimate purpose of legal proceedings.


I guess I would be the heretic, then: someone who complains that the rich want to pay for renewable energy and can afford to live in a Friedman-esque society in which money buys your civil liberties from the legal system (or even just protection from criticism), while the cost of living, suicide rates, homelessness, and civil liberties continue to deteriorate for the working class.

Edit: maybe the downvotes suggest my guess was right.


It is pretty easy to find him doing just that in this paper.

Here are some examples:

> Nuclear energy is often seen as a fundamental or bridging technology for future low-carbon systems (International Energy Agency, 2015a; Echavarri, 2013). While it is true that electricity production from nuclear energy is characterized by very low CO2 emissions during the operation phase of the plant, its full life-cycle CO2 emissions, including all up- and downstream processes, are typically much more CO2 intensive.

While this is true, the paper does not make an equivalent comparison for its own scenarios. They do not include the upstream and downstream emissions from PV and wind, which should include their energy storage requirements. When addressing storage requirements they hand-wave, pointing to two papers that also hand-wave the requirement. They all but flat-out deny the duck curve. But regardless, they don't include these factors in their own scenarios.

Later, when they present their 6 scenarios, they assume no change in nuclear. If you want to make a good comparison moving forward, you should have a scenario 7 that shuts down the current reactors and replaces them with new Gen III or Gen IV reactors. You might say I'm being facetious, but this is what the nuclear camp actually wants. The nuclear camp does not want scenario 1 (business as usual), and agrees that it is a bad idea. So to make a claim about which direction forward is the cheapest and most environmentally friendly, you need to address the other positions. While I wouldn't be surprised if this were still a more expensive option, by not addressing it the authors are creating a strawman argument. In reality they do even worse than this: they create scenario 6, which is scenario 1 but with decreased efficiency (and they cite no evidence for the number they use).

tldr: I'm not sold on the paper. There's merit to it, but the study was not rigorous enough and did not consider the arguments of the opposing scenarios that it is specifically countering.


Don't miss the assumptions they're relying on:

>Emissions are considered per kWh of produced electricity (kWh_el), including emissions that occur over the complete life-cycle of a technology (cradle to grave). We use the following values (based on Sovacool (2008), Lenzen (2008) and updated values from Jacobson (2009)): nuclear: 66 g-CO2/kWh_el, onshore wind: 10 g-CO2/kWh_el, PV (no difference between utility-scale and rooftop): 30 g-CO2/kWh_el.

That paper, Jacobson (2009), the subject of the dispute mentioned above, factors in emissions from the burning of cities in nuclear war (at the high end, one city every 30 years), as well as the "opportunity cost from planning-to-operation delays." The latter assumes 10-19 years to set up a nuclear plant, during which time emissions from a hypothetical incumbent coal plant are attributed to it. Somehow, PV and wind have no opportunity costs from delays (though section 4b says they should take 2-5 years). Apparently PV production scaling will never bottleneck as they're deployed grid-wide.

http://www.rsc.org/delivery/_ArticleLinking/DisplayHTMLArtic...
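To see how that delay accounting can dominate, here's a back-of-envelope sketch; the coal intensity, delay, and plant lifetime are my own illustrative assumptions, not numbers from the paper:

    object DelayAttribution extends App {
      val coalIntensity = 1000.0 // g-CO2/kWh, assumed incumbent coal plant
      val delayYears    = 13.0   // near the midpoint of the cited 10-19 years
      val plantLife     = 40.0   // assumed nuclear operating lifetime, years
      // Coal emissions during the delay, charged against every kWh the
      // nuclear plant later produces (assuming equal annual output):
      val attributed = coalIntensity * delayYears / plantLife
      println(f"~$attributed%.0f g-CO2/kWh attributed from the delay alone")
    }

Charge nothing analogous to PV and wind, and the comparison tilts however you like.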


Good to know. I posted it here because it'll get a critical review.




"His core defense: That massive, unexplained increase in hydropower capacity wasn’t a modeling error; it was a modeling assumption. He assumed that the U.S. could simply add turbines on existing dams such that they could deliver 15 times more instantaneous capacity, even though total flows for a year would remain constant"

This seems like a pretty big error. Enough for me to be unsure of the paper linked here.


Judge the math, not the man.


Refute the critics; don't sue them.


Regardless of your opinion of the author's character (and despite my polite rebuttal to look at the data instead), you might be biased on the topic [1]. I'm attempting to discuss this from a place of intellectual honesty, and I expect the same from others.

"And it’s the 21st century — its embarrassing that we are talking about wind energy!"

"Simply put, we need nuclear power. We need to spend more time and energy trying to innovate on nuclear energy. That’s what is needed to support the power needs of 10 billion people who will have the energy demands of a modern western nation today. Solar and wind will just not cut it."

[1] https://news.ycombinator.com/item?id=18907686


Hang on. Didn't you just say, "Judge the math, not the man"? What are the ground rules here? Are we allowed to take into account authors' previous bad behavior or aren't we?


> What are the ground rules here?

https://news.ycombinator.com/newsguidelines.html

"Be civil. Don't say things you wouldn't say face-to-face. Don't be snarky. Comments should get more civil and substantive, not less, as a topic gets more divisive."

"When disagreeing, please reply to the argument instead of calling names. "

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."


Sure, I agree with all that. But you're dismissing the parent poster based on his comment history, while castigating him for dismissing your link based on the author's lawsuit history. That seems a little inconsistent to me, is all I'm saying.

Personally, I think a history of suing one's critics does open one's subsequent work up to significantly increased skepticism from the community, because it's a strong signal that something's off. There's some norm-breaking going on. I wish I could read this article but it seems like it's behind a paywall.


Posting rational thoughts in other threads does not mean you are biased in everything else. Unless you mean the author is biased against dogma?


People are excited for RISC-V instead of Raspberry Pi's ARM.


It is explicitly advertised on Adafruit as a level shifter https://www.adafruit.com/product/1787


Here's one theory: https://storagemojo.com/2014/04/25/amazons-glacier-secret-bd... and here's the HN thread: https://news.ycombinator.com/item?id=7647571

Facebook uses Blu-ray discs with a robotic storage system for its cold storage: http://www.datacenterdynamics.com/it-networks/robot-digs-thr...


It would be nice if Facebook released some stats, kinda like Backblaze, about Blu-ray durability.


I thought degradation of optical media was already pretty well understood? (And happens on timescales generally longer than FB has been in operation.)


There is a Blu-ray-compatible archival disc format that the manufacturers claim will last 1,000 years: https://en.wikipedia.org/wiki/M-DISC

I don't know how this claim could be verified in our lifetimes.


I've done accelerated aging on polymers (medical, but I bet it's similar).

We have an accelerated aging experiment, and we sample the part for degradation throughout the test. We also "shelf age" another test group of polymers from the same batch by placing them on the shelf.

Every year or so (or sometimes 5 year spans), we take the shelf samples and look for degradation, and compare it to the accelerated data.

In one polymer's specific case, free radicals in the polymer chain slowly react with oxygen and that's how it breaks down (takes about 10 years if the polymer isn't stable), so our accelerated aging is to put it in a pressure vessel full of high-pressure oxygen and apply a little heat. We can get 10 years of degradation (roughly) within 2 weeks.
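For the temperature contribution alone, the usual arithmetic is the "Q10" rule of thumb from ASTM F1980 (common in medical-device aging); the values below are assumptions for illustration, and the high-pressure oxygen supplies most of the extra acceleration on top of this:

    object AgingFactor extends App {
      val q10      = 2.0  // assumed: reaction rate doubles per 10 C increase
      val tChamber = 55.0 // assumed accelerated-aging temperature, C
      val tShelf   = 25.0 // assumed ambient shelf temperature, C
      val factor   = math.pow(q10, (tChamber - tShelf) / 10.0)
      println(f"acceleration factor ~ $factor%.0fx") // ~8x from heat alone
    }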

This is almost certainly what the archival blu-ray format has done and is doing.


I was too young back when optical media were everywhere to look for actual data, and the public claims were contradictory (5 years, 10 years... more). And that was at CD-ROM density.

Interestingly, http://blog.digistor.com/the-unparalleled-durability-of-blu-... says that there were new design constraints in Blu-rays making them much more durable.


Perhaps Amazon's Glacier center operates lights-out with a reduced-oxygen atmosphere to increase the life span of the discs.


Yev from Backblaze here -> Yea, we always kind of hope that others will jump in and start sharing data as well. We'll see!


parboiled2 [https://github.com/sirthias/parboiled2] offers pretty much everything the built-in parser combinators do, but with fewer bugs and vastly improved performance. Since both use PEG grammars, it should be straightforward to port over. parboiled2 is fast because it generates fast, stateful parsers using macros while only exposing a purely functional interface.
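To give a flavor of the rule DSL, here's a tiny addition-only calculator adapted from the parboiled2 README (a sketch, untested here):

    import org.parboiled2._

    class Calculator(val input: ParserInput) extends Parser {
      def InputLine = rule { Expression ~ EOI }
      def Expression: Rule1[Int] = rule {
        Number ~ zeroOrMore('+' ~ Number ~> ((_: Int) + _))
      }
      def Number = rule { capture(oneOrMore(CharPredicate.Digit)) ~> (_.toInt) }
    }

    // new Calculator("1+2+3").InputLine.run() // => Success(6)

The rules read like ordinary combinators, but the macro expands them into a stateful parser at compile time, which is where the speed comes from.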


It's pretty simple to wire up a 24VAC thermostat to control 120V power. Purchase a transformer [http://www.amazon.com/Honeywell-AT140A1000-40Va-120V-Transfo...] and a 24VAC relay [http://www.amazon.com/dp/B00097BDUA/ref=pe_385040_30332190_T...], get a junction box, and wire it all together.

This doesn't solve your problem of the $250/room cost, but if you'd like to use some other thermostats in your rooms this would work great.

The only complaint I'd have about mine is that the relay is very loud. You might want to search for a quieter one.


There are purpose-built combination units:

http://www.aubetech.com/products/list.php?noLangue=2&noFamil...

They apparently pay attention to how loud it is.


By comparison, Apple just posted $8.22 billion of net income. [http://www.nytimes.com/2012/10/26/technology/apple-profits-r...].

Approximately $5.1B of Samsung's profits came from their phones; this is despite Samsung being an extremely diversified company. I cannot find what proportion of Apple's profits come from sales of iPhone and iPads; they do, however, make up a majority of their revenue.


And we're only speaking of Samsung Electronics here.

The entire conglomerate is on another level and approx. double the size of Samsung Electronics.


As you stated,

"I cannot find what proportion of Apple's profits come from sales of iPhone and iPads; they do [...] make up a majority of their revenue."

it's imo very safe to assume that.

They sold an astonishing 26.9 million iPhones in Q3, 58 percent growth compared to 2011. They also sold 14.0 million iPads (Q3), 26 percent growth.

With 4.9 million Macs sold (only a 1 percent unit increase) and a 19 percent decline in iPods, I guess it's safe to assume that most of their revenue (especially the additional revenue compared to last year's Q3 earnings) comes from iPhones/iPads.

All numbers from official press release:

http://www.apple.com/pr/library/2012/10/25Apple-Reports-Four...

Anyway: Good job Samsung :)


They make roughly half their profit from the iPhone. The iPad, I think, may be around a quarter now.


Here's Horace Dediu's estimate for Q3, I doubt the most recent quarter is much different in terms of iOS share:

http://www.asymco.com/2012/08/21/the-interlopers/

Looks like 90%-ish. Really astounding to see Apple and MS profits side by side like that. From ~10% of MS's profit to nearly double it, in less than five years.


If you're curious how this handles memory management, the largest impedance mismatch between Java and Objective-C, see http://code.google.com/p/j2objc/wiki/MemoryManagement.

The preferred method, for iOS apps at least, seems to be to use reference counting. This implies that any objects with cyclical references will never be released; for example, a parent and child holding strong references to each other can never be freed, which is why j2objc provides a @Weak annotation to break such cycles. The semantics of this approach are therefore subtly different from those of the JVM, whose garbage collectors can detect cyclical references. Hence the documentation's suggestion to use "runtime and tool support to detect memory leaks".


"runtime and tool support to detect memory leaks".

So... a garbage collector?


Well, if it's just detecting, that's not GC. That's just, "Hey! There's trash here!"


Once you know where the trash is, the easy part is deallocating it.


It's not automatic; you need to explicitly release. They are referring to the tools built into Xcode, which you have to use manually to debug memory problems in Objective-C, then carry the fixes back into your Java code as annotations.


I think that's referring to "development runtime", not "production runtime". I.e. run a memory leak detector while you are debugging and profiling. But once you fix all/most of the leaks, you don't need to deploy a GC.


This is probably meaningless for most developers: it is only going to be made available "for use in development, qualification, and testing". Further, there's no information on what license this will be released under.

It's a shame, but given the amount that Azul has poured into research it is not surprising.

At least we can look forward to some interesting benchmarks and code analyses, perhaps?


As for the licensing: did Azul obtain a TCK license from Oracle/Sun to certify their JVM as Java? If they did, it is unlikely they will open it up under a permissive license (e.g. MIT, BSD, or Apache).

One of Apache Harmony's big reasons for not obtaining and using the Java TCK was that its terms wouldn't have allowed them to release their JVM under the Apache license.

