
Engineers want some kind of regulation because they feel like computer systems, which they nominally control, are out of control because of business people's demands. They want the right to say no without having to bear the consequences of saying no. But then when regulations come in, they're not about regulating business, they're about regulating interactions between people and business. And whereas the idealist sees a regulation as a chance to change things for the better, a regulator sees a regulation as a chance to preserve things as they were just before they became bad. (It takes a politician, not a regulator, to change things.)


> Fixing this is difficult, not just because people are resistant to change, but also because the variations in accents.

The relevance of accents is greatly overstated. The argument amounts to letting the perfect be the enemy of the good: because we can't fix every word for every accent, supposedly we can't fix anything. There are a great many words in English whose pronunciation is irregular: these are the ones we should fix. For these, accent is irrelevant; you can pronounce your r's hard or your a's broad, and it doesn't matter: "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been, from Old English (ic byrge vs myrge) on. You could just fix 100 words like "bury" and "could" and "are" whose spellings are either wrong or etymological but don't reflect extant variants, and the spelling would be reformed, children's lives would be improved, and it wouldn't be a problem from any perspective of accent variation or etymology or anything.


> "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been

I've definitely heard speakers for whom "bury" rhymes with "furry", and that's without the "Merry–Murray merger" (i.e., the same person would pronounce "berry" to rhyme with "merry" and quite distinctly from "bury".)

> You could just fix 100 words like "bury" and "could" and "are" whose spellings are either wrong or etymological but don't reflect extant variants, and the spelling would be reformed, children's lives would be improved, and it wouldn't be a problem from any perspective of accent variation or etymology or anything.

In many cases it would take existing homophones and turn them into additional meanings of the same spelling, which would actually reduce clarity and comprehensibility of written text.


> "bury" is pronounced to rhyme with "merry" in probably every accent of English that's ever been

Bury rhymes with hurry around Philadelphia (NJ, Maryland, some parts of NY).


A certificate authority is an organisation that pays good money to make sure that their internet connection is not being subjected to MITMs. They put vastly more resources into that than you can.

A certificate is evidence that the server you're connected to holds a secret (the private key) tied to whatever server the certificate authority validated when it issued the certificate. This means that whether or not you're generally subject to MITMs, at least you don't seem to be getting MITMed right now.

The importance of certificates is quite clear if you were around on the web in the last days before universal HTTPS became a thing. You would connect to the internet, and sooner or later you would somehow notice that the ISP you were connected to had modified the website you were accessing.
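
To make the "right now" part concrete, here's a minimal sketch, using Python's standard ssl module, of the check a client performs (this is about the client side, not anything the CA itself runs): the handshake only succeeds if the server proves possession of the private key for a certificate that chains to a trusted CA and matches the hostname. "example.com" is just a placeholder.

    import socket
    import ssl

    def peer_certificate(host: str, port: int = 443) -> dict:
        # The default context verifies the certificate chain and the hostname.
        context = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            # Raises ssl.SSLCertVerificationError if verification fails --
            # that's the "you seem to be getting MITMed" signal.
            with context.wrap_socket(sock, server_hostname=host) as tls:
                return tls.getpeercert()

    print(peer_certificate("example.com")["subject"])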


> pays good money to make sure that their internet connection is not being subjected to MITMs

Is that actually true? I mean, obviously CAs aren't validating DNS challenges over coffee shop Wi-Fi, so a CA's connection is probably less likely to be MITMed than your laptop's, but I don't think the BRs (the Baseline Requirements) require any special precautions to ensure that the CA's ISP isn't being MITMed, do they?


A local minimum is a point in the design space from which any change is an improvement (but there's other designs which would be worse, if they make several larger changes). I think it's hard to make that claim about Git. You're probably referring to a local maximum, a point in the design space from which any change makes it better (but there's other designs which would be better, if they make several larger changes).

In my career, I've used SVN, Git and something, I think it was called VSS. Git has definitely caused fewer problems, and it's also been easy to teach to newbies. And I think the best feature of Git is that people really, really benefit from being taught the Git model and data structures (even bootcamp juniors on their first job), because suddenly they go from a magic-incantation perspective to a problem-solving perspective. I've never experienced any other software which has such a powerful mental model.
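
If it helps to see how small that model is, here's a minimal sketch in Python of reading a loose object straight out of .git/objects (packed objects need more work; the repo path and sha are just whatever you point it at):

    import pathlib
    import zlib

    def read_loose_object(repo: str, sha: str):
        # Loose objects live at .git/objects/<first 2 hex chars>/<rest>,
        # zlib-compressed, with a "<type> <size>\0" header before the body.
        path = pathlib.Path(repo, ".git", "objects", sha[:2], sha[2:])
        raw = zlib.decompress(path.read_bytes())
        header, _, body = raw.partition(b"\x00")
        kind, size = header.split()
        return kind.decode(), int(size), body

    # kind is "commit", "tree", "blob" or "tag"; a commit body is plain text
    # naming a tree, zero or more parents, an author and a message.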

That of course doesn't mean that Mercurial is not better; I've never used it. It might be that Mercurial would have all the advantages of git and then some. But if that were so, I think it would be hard to say that Git is at a local maximum.


> something I think it was called VSS

Hmm, maybe Microsoft Visual SourceSafe? I remember that. It was notorious for multiple reasons:

* Defaulted to requiring users to exclusively 'check out' files before modifying them. Meaning that if one person had checked out a file, no one else could edit that file until it was checked in again.

* Had a nasty habit of occasionally corrupting the database.

* Was rumored to be rarely or not at all used within Microsoft.

* Was so slow as to be nearly unusable if you weren't on the same LAN as the server. Not that a lot of people were working remotely back then (i.e. using a dial-up connection), but for those who were it was really quite bad.


> it's also been easy to teach to newbies

The number of guides proclaiming the ease of Git is evidence that Git is not easy. Things that are actually easy don't involve countless arguments about how easy they are.

I can teach an artist or designer who has never heard of version control how to use Perforce in 10 minutes. They’ll run into corner cases, but they’ll probably never lose work or get “into a bad state”.


> A local minimum is [...]

Unless you're in ML, in which case it's a minimum of the loss function, not the utility function...


> You're probably referring to a local maximum, a point in the design space from which any change makes it better (but there's other designs which would be better, if they make several larger changes).

I think you meant "worse" for that first "better."


Git being easy to teach to newbies is an uncommon opinion. It wasn't clear whether you meant easier than Subversion, but that would be an even more uncommon opinion.


> I've never experienced any other software which has such a powerful mental model.

I hate to be that guy, but you should spend some time with jj. I thought the same, but jj takes this model, refines it, and gives you more power with fewer primitives. If you feel this way about git and give jj an honest try, I feel like you'd appreciate it.

Or maybe not. Different people are different :)


Reinforcement learning, maximising rewards? Rewards work because rabbits like carrots. What does an LLM want? Haven't we already committed the fundamental error when we say we're using reinforcement learning and that they want rewards?


I think it's your responsibility to control the LLM. Sometimes I worry that I'm beginning to code myself into a corner, and I ask it if this is the dumbest idea it's ever heard, and it says there might be a better way to do it. Sometimes I'm totally sceptical and ask that question first thing. (Usually it hallucinates when I'm being really obtuse though, and in a bad case that's the first time I notice it.)


> I think it's your responsibility to control the LLM.

Yes. The issue here is control, and natural language is a poor interface for exercising control over a computer. Code, on the other hand, is a great one. That is the whole point of the skepticism around LLMs in software development.


> GNU did not have a working system until Linus released Linux in 1992. They had pieces and components which were worthless on their own.

People were installing GNU onto existing Unix systems because the GNU tools were better than the ones those systems shipped with. Maybe they did that with components of BSD Net/1 too - no one has ever told me they did, but it probably happened - but that was definitely post-GNU.

Anyway, I'm not sure this matters so much to the debate. Stallman was reacting to a change. He rambled politically and wrote some code to back it up, because he used to be able to do things, and now he could only do them if he wrote some code and won some allies.


Linking against GPL code on a backend server which is never distributed - neither in source nor binary form. (Because what might happen tomorrow? Maybe you'll want to offer an enterprise on-prem version.)


This makes sense, thanks!


My favorite one I think is the Internet Explorer/Google Chrome "Same shit different - " one, because it's obviously recent and somehow iconic of the sort of person who reminisces about the old web, and clearly narrowcasting to such people.


I'm trying to brainstorm an answer. My best guess is that SSH is obsoleted by disposable instances. You can spin up a new instance for every version of your configuration, transition to it, and dispose of the original (or set it aside, or whatever). That way, you could probably have a reasonably complete tech career and only ever use SSH as an implementation detail of git.
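
For what that workflow looks like in practice, here's a rough sketch using boto3 against AWS EC2 (the AMI, instance type and the traffic-switching step are placeholders, not a recommendation): every configuration change becomes a fresh instance, and the old one is terminated rather than SSHed into and mutated.

    import boto3

    ec2 = boto3.client("ec2")

    def roll_forward(new_ami: str, old_instance_id: str) -> str:
        # Launch a new instance from the image baked with the new configuration.
        resp = ec2.run_instances(ImageId=new_ami, InstanceType="t3.micro",
                                 MinCount=1, MaxCount=1)
        new_id = resp["Instances"][0]["InstanceId"]
        ec2.get_waiter("instance_running").wait(InstanceIds=[new_id])
        # ...point the load balancer / DNS at new_id here...
        # Dispose of the old instance instead of reconfiguring it over SSH.
        ec2.terminate_instances(InstanceIds=[old_instance_id])
        return new_id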

