
There is obviously a trade-off here, but categorically speaking, new releases introduce new bugs and security exploits too.


But they do patch the known security exploits that are likely to be actively used. I'm happier with a security exploit (almost) nobody knows about than with a published one that appears in hacking tutorials from 10 years ago.


There are two degrees of separation here though: the software vendors and then the Linux distros.

If you sell software that requires your clients to upgrade their system-wide security stack, they might not do it. If it is statically linked, they don't need to.


Those are basically a contradiction in terms. Orienting the program structure around the data necessarily requires control over memory layout and how it is interpreted.


I've worked on several. You haven't heard of them because none of them shipped due to issues with commercial engines (in particular Unreal Engine 4).

They are very difficult to use if the gameplay semantics are complicated and require lots of interaction with world state or world geometry. If you're making a common FPS, they are great.


> They are very difficult to use if the gameplay semantics are complicated and require lots of interaction with world state or world geometry.

Could you give an example? I'm trying to understand what the limits look like. They may be obvious to you, but as someone outside of game design, I can't imagine what they might be.


We almost never hit technical limits in the renderer, streaming systems, etc. Instead, we found that pushing gameplay systems beyond the prototype stage would require more and more effort, as we'd encounter deep engine bugs or find that the tooling simply did not cater to our use case.

We ended up implementing more and more tooling outside the engine, and there came a point where UE4 became little more than an IO/rendering system. We'd have been happier if the engine had been modular in design from the get-go.


Sorry, I think we're stuck in a loop.

> we found that pushing gameplay systems beyond the prototype stage would require more and more effort

I understand that you're saying that some systems exist that can't work. I'm trying to understand what those systems would look like, and how the user would see them as being different. Do you have an example of a system or mechanic that can't work?


Not at a high level that is easy to express in an HN post, sorry. It's one of those "the devil is in the details and there are a lot of details" situations.


There is always a minimum cost of moving data from one place to another. If you're computing on the GPU, the data must arrive there. The problem is that PCIe bandwidth is often a bottleneck, and so if you can upload compressed data then you essentially get a free multiplier of bandwidth based on the compression ratio. If transferring the compressed data and decompressing it is faster than sending the full uncompressed dataset, then you win.

But yeah, direct IO to the GPU would be great, but that's not feasible right now.
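To make the trade-off concrete, here is a minimal back-of-the-envelope sketch. All of the numbers (dataset size, PCIe throughput, compression ratio, GPU decompression rate) are illustrative assumptions, not figures from the comments above:

    # Rough model: the compressed path wins when shipping size/ratio bytes over PCIe
    # plus decompressing on the GPU beats shipping the full dataset uncompressed.
    # Every number here is an assumption for illustration, not a measurement.

    def transfer_time_s(size_gb, bandwidth_gb_s):
        """Seconds to move size_gb gigabytes over a link running at bandwidth_gb_s GB/s."""
        return size_gb / bandwidth_gb_s

    dataset_gb = 8.0         # uncompressed dataset size (assumed)
    pcie_gb_s = 16.0         # assumed host-to-GPU bandwidth, roughly PCIe 3.0 x16
    ratio = 2.5              # assumed compression ratio
    gpu_decomp_gb_s = 100.0  # assumed on-GPU decompression throughput

    uncompressed = transfer_time_s(dataset_gb, pcie_gb_s)
    compressed = (transfer_time_s(dataset_gb / ratio, pcie_gb_s)
                  + dataset_gb / gpu_decomp_gb_s)  # decompress back to full size on the GPU

    print(f"uncompressed upload:                {uncompressed:.2f} s")
    print(f"compressed upload + GPU decompress: {compressed:.2f} s")

With these made-up numbers the compressed path takes roughly 0.28 s versus 0.50 s for the raw upload, an effective bandwidth multiplier close to the compression ratio; if the decompressor can't keep up, or the data doesn't compress, the advantage disappears.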


>The problem is that PCIe bandwidth is often a bottleneck, and so if you can upload compressed data then you essentially get a free multiplier of bandwidth based on the compression ratio.

Agreed! The history of computers is sort of like this: at any given point in historical time, there's always a bottleneck somewhere...

It's either the speed of a historical CPU running a specific algorithm, a historical type of RAM, a historical storage subsystem, or a historical type of bus or I/O device... and once one is fixed by whatever novel method or upgrade, we invariably run into another bottleneck! <g>

>But yeah, direct IO to the GPU would be great but that's not feasible right now.

Agreed! For consumers, a "direct direct" (for lack of better terminology!) CPU-to-GPU completely dedicated I/O path (as opposed to the use of PCIe as an intermediary) isn't (to the best of my knowledge) generally available at this point in time...

If we are looking towards the future, and/or the super high end business/workstation market, then we might wish to consider checking out Nvidia's Grace (Hopper) CPU architecture: https://www.nvidia.com/en-us/data-center/grace-cpu/

>"The fourth-generation NVIDIA NVLink-C2C delivers 900 gigabytes per second (GB/s) of bidirectional bandwidth between the NVIDIA Grace CPU and NVIDIA GPUs."

Or, we could check out the Cerebras WSE-2:

https://www.cerebras.net/product-chip/

>"Unlike traditional devices, in which the working cache memory is tiny, the WSE-2 takes 40GB of super-fast on-chip SRAM and spreads it evenly across the entire surface of the chip. This gives every core single-clock-cycle access to fast memory at extremely high bandwidth – 20 PB/s. This is 1,000x more capacity and 9,800x greater bandwidth than the leading GPU."

Unfortunately it's (again, to the best of my limited knowledge!) not available for the consumer market at this point in time! (Boy, that would be great as a $200 plug-in card for consumer PCs, wouldn't it? -- but I'm guessing it might take 10 years (or more!) for that to happen!)

I'm guessing in 20+ years we'll have unlimited bandwidth, infinitely low latency optical fiber interconnects everywhere... we can only dream, right? <g>


I don't know how much bias the fine structure constant exerts on the function of cognition, but I think we can all agree that constants incompatible with higher-level biological functions like cognition would never produce arguments in favor of their typicality.


The answer is: a lot. When looking at fusion processes that create carbon, if the strength of the electric force (quantifiable as the fine structure constant) were just 4% different, our universe would never have produced enough carbon to create life as we know it. The limit is even tighter for the strong nuclear force: less than one percent. If you pick arbitrary constants, you'd very likely just end up in a universe that contains only protons and no higher elements. And it gets even weirder when you start looking at gravity, because most universes should actually have collapsed again long ago or expanded so fast that no elements could form. Some of these fine-tuning problems can even be looked at in the absence of intelligent life, because even for a liveable universe the constants seem ridiculously fine-tuned to support what we actually see in the sky.
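For reference (standard physics, nothing specific to the comment above), the fine structure constant is the dimensionless combination

    \alpha = \frac{e^2}{4 \pi \varepsilon_0 \hbar c} \approx \frac{1}{137}

so "the strength of the electric force being 4% different" means this single dimensionless number shifting by about 4%.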


I carefully chose my word "function" instead of "existence".

Give me two different universes where cognition exists but where the fundamental constants differ. Would you expect the ability to perform syllogisms to be fundamentally biased to reflect the constants which brought about their existence?


Easy. I present you with two universes, one with our cosmological constant and one with a slightly smaller but still nonzero one. The differences would only become apparent over distances greater than a few billion light years or so. Our planet, solar system, or even the entire galaxy would be virtually indistinguishable. But inhabitants of both would wonder, when studying the sky, how the cosmological constant got cancelled out so precisely against the QFT vacuum over so many orders of magnitude. No bias required.


Mathematicians I personally know well are staunchly religious. They describe the world of number and form and structure with the same kind of language the clergy use to describe God, and so there is some sense in which these are felt as aspects of the same thing.

I also know a number of engineers who are deeply religious, perhaps encouraged by recognizing "design" in nature, seemingly requiring a designer.

In my conversations with these people, while their faith is, socially speaking, Christian, the specifics have hardly anything to do with traditional or orthodox theology. It's, as I perceive it, the only acceptable social structure available to them for living out these deep feelings of beauty and harmony in community.


I always found Donald Knuth’s professed Christianity somewhat curious. Then I watched a small bit of his interview with Lex Fridman, and happened to catch the part where he speculates that maybe God is a big computer. So, OK.


Knuth had serious doubts in his college days, but ultimately decided it was "OK to believe in something unprovable." Which to me speaks to his humility.


> where he speculates that maybe God is a big computer.

I cringe at phrases like that. It's empty theologically and technologically. It represents a complete inability to imagine what God could be, and a naive, childish pride in our little creations, these computers. While in his internally driven world view it might make sense, to me it has no more or less truth than "maybe God is a big steam engine".


Please don’t judge DEK based on my hasty summary; check out the interview for the context. My only point was that the Christianity in the head of someone like Knuth is probably a bit more fluid and creative than whatever’s in the heads of the people who come to your door on Sunday morning to save your soul.


Yeah - indeed I probably was a little hasty and uncharitable. I have a ton of respect for Knuth both as a computer scientist and as a religious believer. I will try to look up the interview, because I am curious to get the context around the quote.


> It represents a complete inability in imagining what God could be, and a naive, childish pride in our little creations [...]

Aren't all religious statements just like that?


If an engineer at a FAANG company reduced their global link times by even 10%, they'd be promoted into early retirement.

Mold is an order of magnitude faster.


If that were true, the author would have retired, because they were a Google engineer when they wrote lld, which is bonkers fast compared to gold, Google's previous linker.


Early retirement and, to use rui's word choice, the ability to "earn a comfortable income" are not inconsistent.

But to your credit it does sound that rui saved Google (vastly) more money than they ever paid him.


There are hundreds of people in platforms, language tools, ads, search, and probably everywhere in the company that have saved Google way more than they ever got paid. The thing to understand is that Google's scale is not your scale, as an employee. You make a change that saves the company a million dollars per month. The number is large because the company is large, not because you're amazing.


I think a single employee netting even a single-digit percentage improvement in performance for all of Google with a year's worth of effort is an astonishing accomplishment. Trivializing a "million dollars per month" improvement just says so much about how little programmers here realize their worth to technology companies, and how they intellectualize ways to justify it.


The guys that improved dependency download time by up to 80% at my company (Amazon) sure as hell are enjoying the 3% raise they got like everyone else.

I agree with you that Mold is an order of magnitude faster than the competition (sometimes). However, in that case the absolute value is more representative than the relative number. Your linking time going from 10 seconds to 1 or 2 seconds is only a tiny improvement.


Nobody spends only 10s on linking. For any reasonably sized binary (in the range of 100 to 200 MiB), it is somewhere around 50s to 60s before mold. People regularly link 1 GiB binaries for a living during their development cycle. If you use small binaries or primarily use Go, you're probably not the target audience of mold.

Otherwise I actually agree mold is amazing but has limited commercialization potential unless the author expands the scope (like RAD Game Tools) or takes some VC money (much like Emerge Tools).


10s is an extreme link time with lld. lld can link clang in 6s. Mold cuts that in half, but it's still true that you save 3s, not a minute.


I disagree. Our lld link times for the Linux build of our project are over 3 minutes, with LTO disabled.


I sort of want to hear about this project but also I'm sort of afraid to find out. How long does it take with mold?


I work in video games, and all the game projects I've worked on for the last 10 years or so have had multi-minute link times. We predominantly iterate on Windows with MSVC, and it's usually "server" binaries that are produced for Linux.

Afraid I don't know how long it takes with mold, sorry!


Incredibly wrong take.


Deeply and needlessly cynical. rui tried to fund a high-value software infrastructure project on its own merit and it's not feasible long term. So a change is required.

Hardly a bait and switch 'extortion'.


Well, maybe going all-in on building a "high-value software infrastructure project on [your] own" is not a good idea if you don't have any concrete plans for funding right from the start? Changing the license to a non-FOSS one is basically the most knee-jerk reaction you can have in that case. Find a maintainer. Give the project to the community. Become a regular contributor in your own spare time instead of the main (and sole) developer.

There are so many better ways to handle such a situation.

If you replace "building software" with "building a road" (or some other similar real-world infrastructure project), everyone would agree that it's a self-made problem and that there are other ways out rather than just "from tomorrow on this will be a toll road". You can turn the road over to your city/municipality. Or you could let others take over the maintenance. If the road is useful, there will be others. If it isn't, well, everyone moves on.


> Give the project to the community.

Why? Did the community give to him? It certainly sounds like it hasn't. What obligations does one have to a community that doesn't give back?

> There are so many better ways to handle such a situation.

Better for who? You? Or him?

If open-source communities want open source, they're going to need to come to grips with the need for people to eat, and to do that they are at minimum going to need to pass the hat. If they're not going to do that, this happens, and telling somebody who isn't you what they should do for "the community" when "the community" doesn't support them is, frankly, wrong verging on immoral.


You are 100% free to fork the last permissively licensed commit and maintain it yourself. Looking forward to seeing you put in some work with the same enthusiasm that you use to tell people how to run their projects.


> Changing the license ... Give the project to the community

Wrong - this is a false dichotomy. "Changing" the license is not retroactive. Older releases are still available to the community and anybody can fork the project and carry on a new release train.


> Find a maintainer. Give the project to the community.

The code is out there for the community to pick up if they wish; anyone can step up as a maintainer.

Besides, how does the above help the author get paid at all?


Large web companies like Google implement their own encryption stack anyway.

On the BSDs I've used, LibreSSL is a standard kernel configuration option. I'll note that on FreeBSD, LibreSSL lacks the in-kernel fast path, last I checked.


> Large web companies like Google implement their own encryption stack anyway.

Google uses BoringSSL[1], which is another OpenSSL fork. I believe AWS uses a mix of OpenSSL and BoringSSL (someone can correct me!).

So it's "their own encryption stack," but that stack was at least originally composed of OpenSSL's code. They've probably done an admirable job of refactoring it, but API and ABI constraints still apply (it's very hard to change the massive body of existing code that assumes OpenSSL's APIs).

[1]: https://boringssl.googlesource.com/boringssl/


AWS maintains their own TLS stack: https://github.com/aws/s2n-tls


Is this an argument for GPL?

Seems like the big players came, saw, borrowed, and then did their own thing without contributing back.

If this were my project, I would be inclined to archive it and do a GPL fork.


None of what happened with OpenSSL or its forks is incompatible with the GPL.


Forgive my ignorance, but are all of these forks also still open source? My impression was that patches and improvements were made in closed-source, private repositories to the benefit of the companies, without paying anything back.

Otherwise, couldn't some openssl contributors just crib fixes from the forks?


As far as I know, all of the major ones are. I don't believe anybody has attempted to make a closed fork of OpenSSL, at least not one that has gained any real traction.

> Otherwise, couldn't some openssl contributors just crib fixes from the forks?

They do! But I assume it gets balanced with their own feature development time, and it becomes harder as the codebases drift. OpenSSL probably hasn't done itself many favors with the recent (3.x) "providers" refactor.


This piece accusing Watts of laziness is just as guilty.

> It’s true that Buddhism, and particularly Zen Buddhism, teaches that we are perfect just as we are, we have merely forgotten our true nature.

As someone who has studied Zen Buddhism for a few decades, I can say you will be very hard-pressed to find anyone practiced in it use the words 'true' and 'perfect' so casually.

True is only meaningful with respect to an abstract system of rules, and this extends likewise to perfection. You need an external metric to determine what is, and by the same token, is not, perfect.

Zen Buddhism teaches people how to experience the world independent of that part of the mind that actively categorizes the world into true, false, imperfect, perfect, etc. To experience your own experience of life as directly as possible, without mediation through your linguistic centers or moral philosophies. Since we're programmers, one analogy would be to reduce all those needless abstractions in the call stack down to the essential Turing-complete read/write, add/sub, and jump instructions.

So Alan is forcing an important point on the Yogi. You can only know or define enlightenment with respect to an abstract system of thought. Remove the abstraction, and there is a complete liberation. No meaningful way to form distinctions. "Doesn’t he see the Brahman everywhere, and in all people, all beings". All becomes one. There is no difference between the enlightened and the non-enlightened in the non-conceptual world. Which is the world that Zen teaches how to experience, if for no other reason than to reveal that it is possible, and provide a renewed perspective on the seemingly ordinary miracle of conscious life.


Wish I could upvote this one ten times, or a hundred. Thanks for taking a little time to type it up.

