Hacker News | LPisGood's comments

I was still in middle school in the early 2010s, and I remember thinking how lucky I was to want to be a computer programmer for a career AND have it happen to pay a lot of money.

Unfortunately many people today got into it for the money and not the passion (or at least the passion and the money). Those people look for shortcuts and are generally unpleasant to work with, in my opinion.


Those are the exact people who are most excited about AI today.

They just want the code but they don't enjoy the coding, so they're trying to find something that will give them the former while sparing them the latter.


I don't think that's true at all. I think people who enjoy coding, but don't like AI, like to believe that's true, because it makes it easy to write off AI as useless, and to look down on people who use it ("they're not real programmers anyway. Not like me."). But my experience is that there's a ton of talented people who love programming who also find AI super exciting and useful.

For a hobby I'm writing a little videogame in C using Raylib. I write a lot of the code myself, but sometimes there's an annoying refactor that won't be any fun. I have limited time and motivation for hobby programming, so if an AI can save me 10 minutes of joyless drudgery when I only have 30 mins to work on my project, that's fantastic. Then I get 10 more minutes to work on coding the stuff that's actually interesting.

Not to mention it's an invaluable source of information for how to do certain things. Asking Claude to give me guidance on how to accomplish something, without it writing the code explicitly, is a big part of what I use it for.


Saying those people are only there for the money is a little bit reductive IMHO.

I like computers but I actually don't like programming that much as an action.

Programming is just a tool I use and try to master because it allows me to do what I like and that's building things.

I'm happy that AI is there to help me reduce the friction in building things.

I'd also argue that people who see programming as an end and not as a means are going to either dislike working in most software companies or be pretty negative contributors despite their mastery because, in my experience, those people tend to solve nonexistent problems while having a hard time understanding that what pays their salary is boring CRUDs calling tangled ORM queries.


I said absolutely nothing about money.

> I actually don't like programming

Right, that's my point exactly.

> in my experience, those people tends to solve inexistant problems

Yes, that's definitely a risk with people who really just love writing code. Fortunately, most people who like to write code also like to have useful code written, so in my experience, the folks who will go off and yak-shave a thing no one needs for months are fairly rare.


Telling an AI to implement CRUDs calling tangled ORM queries sure sounds like an exciting profession!

Really? That hasn't been my experience at all. There are a lot of brilliant developers who love programming and are excited about AI. (Just read Simon Willison's blog, or any number of other people.) Conversely, my sense is that a lot of the people who are just in it for the money don't want AI to change the industry because they don't like learning new things and they feel it threatens the six-figure salaries they get for churning out boilerplate code without having a deep understanding of architecture or systems.

And they may be right to be worried! If you are in the game out of love and you like learning new things about computers you are well-positioned to do well in the AI era. If you just want to get paid forever to do the same thing that you learned to do in your bootcamp in 2018 when the job market was hot, not so much.


..but but... I've never had the opportunity or a helpful environment to focus on learning languages, and instead focused on other skill sets.

Other efforts to try and coordinate the time, finances and a team to accomplish the projects that I have in mind also failed miserably..

Am I (for example) so bad for believing that I could possibly accomplish some of my dreams with the help of LLMs, as another attempt to be an accomplished human being?

(partly /s but partly not)


I made no moral claims about whether it's bad to want to have code without wanting to go through the experience of writing it.

I want a nice lawn, but I don't want to mow it myself. I pay a landscaper. I don't think that makes me a bad person.

But it does make me a different person from someone who enjoys the process of manicuring a lawn.


The more powerful the tool, the more responsible its wielder should behave.

Ideally. In reality, that's impossible to enforce.


> The more powerful the tool, the more responsible its wielder should behave.

I will argue that this is a false pretense, partly because, as you say, it's impossible to enforce, but also because it simply does not happen in reality.

Anyone with a will toward an objective will utilize any tools at their disposal; only the observer from another perspective will judge whether this is 'good or not'. To the beholder, this has become the only way to achieve their goals.

There's an anecdote about a lady who used a WW2-era hand grenade to crush spices in her kitchen for decades without anything happening. Her goals were met and nothing bad happened, but the general consensus is that this is bad for many reasons, even though nothing ever happened.

Maybe it's not only responsibility, but the capability for one to understand the situation one is in and what is at their disposal. ..and a hint of 'don't be evil' that leads to good outcomes despite what everyone thinks.


> Maybe it's not only responsibility, but the capability for one to understand the situation one is in and what is at their disposal. ..and a hint of 'don't be evil' that leads to good outcomes despite what everyone thinks.

This understanding, and hint of broader/benevolent perspective, is what I meant by responsibility.

I'm not so naive as to expect it in general but I have known it to exist, that there are people who respect the responsibility implicit in proper use of their tools. The world is a labyrinth of prisoners' dilemmas so I get that there's a reasonable argument for being "irresponsible" whatever that means in the context.


For the greater good, would you go as far as saying that the act of responsibility comes from the top, the people who lead, and those in the public spotlight? Or would this ideology need to be indoctrinated in educational systems? Or do we have to hope and pray that each and every human born would need to go through the same process of learning and understanding to reach this level of responsibility?

I've taken to having a hobby car, and I'm pretty sure I could have been solving automotive problems for a living rather than computer problems. These days, automotive problems are computer problems, but my hobby car is my age and only has one computer in it, and it's not serviceable ... it just works and runs the fuel injectors, or it stops working and I get a used replacement or a MegaSquirt. Computer problems are nicer, because I don't smell like car for 2 days afterwards, but if there were no money in computer problems, I might have been redirected into car problems or other similar things.

Incredibly lucky, honestly. It's a rare thing to have a passion line up with a healthy income.

I went through that in the late 90s and saw the writing on the wall of the 2010s. Hoping it's not too cyclic

The analogy is not about safety and correctness, but about who is producing and who is assessing/analyzing/poking & prodding.

Formal verification and case analysis can be done automatically anyway, without a mathematician hand-checking each case.

For an old example that predates LLMs, see the four color theorem.


I think fines are very reasonable. If you can’t safely do the thing, you should be punished for doing it. If you can safely do the thing, then there is no issue.

You should think of the “programming” in dynamic programming the same way you think of it in linear programming, integer programming, and constraint programming.

Indeed even in layman’s terms, thinking of it as in television programming is more accurate than thinking it is related to computer programming (as is mentioned in TFA)


> linear programming, integer programming, and constraint programming

Can't think of a universe in which you'd learn about these things before learning "computer programming".


There was a universe before digital computers were popular or a thing at all.

Computing predates *computers*.


Before digital computers existed, there were still computers. They were people. A baker is a person who bakes. A computer was a person who computes (computing mathematical results).

Sure, that's why I had computers italicized to indicate the specific modern definition, not everything that computes.

I know two people who are experts in linear programming who have never written a single line of code.

In the UK at A-level (age 16-18) you may still be taught linear and dynamic programming before ever touching a line of code! (Indeed, that was the same for me!)

Did your school use the specific terms "linear programming" and "dynamic programming" to refer to those topics? My original comment didn't phrase this clearly, but I was thinking less about the techniques themselves, and more about encountering them under those specific labels.

As an analogy, it's not unusual to learn a good chunk of calculus before learning that "calculus" is a thing they're part of - for example, by having an over-eager physics teacher who teaches you some of it so you can understand the physics material more deeply, but without ever mentioning the branch of math you're now using.


I meant that in particular you should _not_ think of the “programming” part of “dynamic programming” in the sense of computer programming

I think of "programming" in "dynamic programming" the exact same way I think of it in "linear programming", "integer programming" and "constraint programming": it's probably some kind of software development that some computer scientists came up with once and that I don't need to think about, because my normal programming has worked out pretty well so far

(Except, well, I guess I understand what "dynamic programming" is more than I understand what the other forms of programming you mention are; "dynamic programming" is solving certain classes of recursive problems by using arrays, sometimes nested arrays, to avoid re-computation, and somehow that's supposed to be more "dynamic" than not doing that)
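To make that concrete, here's a minimal sketch of the "arrays to avoid re-computation" idea in C (the choice of Fibonacci and the names are just for illustration):

    #include <stdio.h>

    /* Top-down memoization: fib(90) still fits comfortably in a signed
       64-bit integer. */
    static long long memo[91];

    static long long fib(int n) {
        if (n <= 1) return n;
        if (memo[n] != 0) return memo[n];        /* already computed: reuse it */
        return memo[n] = fib(n - 1) + fib(n - 2);
    }

    int main(void) {
        /* Without the memo array this call would take exponential time. */
        printf("fib(90) = %lld\n", fib(90));
        return 0;
    }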


I always find it funny how many meanings "programming" has.

Television programming isn't a separate meaning from computer programming. Programming means creating a program. The program, whether you're talking about a computer, a television channel, or a live staged performance, is a list of what's going to happen in what order.

That's what I was hinting at. Pre-mechanical programs were 'prewritten' sequences. Mostly linear, like a (literary) script. A musical score would fit that definition too. Then you can have slightly more complicated structures like DAGs and data flow, still linear in a way. When you add state, loops, and feedback it gets interesting though.

Some of these just seem to be using Python out of spec and being surprised that implementation details exist, and misunderstanding boolean expressions.

Python doesn't have a spec. It barely even has documentation nowadays. (Sad. Twenty five years ago it was the gold standard of documenting software.)

Python's documentation today[1] is clearly more expansive, better formatted and more readily comprehensible than its documentation from 2001[2] (or even 2008[3]). There are multiple entire sections of documentation now that didn't exist before. Standards were just lower back then, partly because a larger percentage of "programmers" were accustomed to wrestling with beasts like C, and partly because systems were much smaller.

[1] https://docs.python.org/3/

[2] https://docs.python.org/2.0/

[3] https://docs.python.org/2.5/


Good documentation cannot be "expansive". Good documentation must be a) thorough and b) minimal.

Modern Python documentation is absolutely horrible - there's a shitload of irrelevant rambling while absolutely crucial details are omitted.


> Good documentation cannot be "expansive". Good documentation must be a) thorough and b) minimal.

stealing this


and too casual a tone of voice

> Python's documentation today[1] is clearly more expansive, better formatted and more readily comprehensible than its documentation from 2001[2] (or even 2008[3]).

Documentation is not a specification. Specifications cover all behavior that should be expected, and specify which behavior is implementation-defined or undefined. If something isn't defined then this is a failure in the specification that requires fixing. The point of a specification is to allow independent parties to do clean-room implementations that can be used interchangeably.


The majority of the comment I was replying to was about documentation, and I was responding to that.

https://docs.python.org/3/reference/grammar.html and https://docs.python.org/3/reference/index.html look pretty comprehensive to me, and they're backed up by a thorough collection of PEPs: https://peps.python.org/pep-0000/

As a relatively recent example, here's the language reference documentation for the match statement https://docs.python.org/3/reference/compound_stmts.html#the-...


That seems uncalled for; the docs are great, and the various guides are usually a good starting point to understand and then use the stdlib.

I don’t understand what you mean by “doesn’t have a spec”

The existence of PyPy and CPython as separate but compatible entities shows that there is one


CPython is the de-facto reference implementation of Python.

PyPy was developed by reverse-engineering CPython, and their automated tests feature explicit comparisons with CPython.

You made the opposite point you thought you were making.


in raku, the test suite (ROAST) is the spec

any compiler that can pass ROAST is valid


> Twenty five years ago it was the gold standard of documenting software.

That was PHP. Though Python was a close second


BB(748) is a natural number, and _all_ natural numbers are computable.


For every n ∈ ℕ, there is definitely a function f such that f() = n.

But there is also a function g for which you cannot prove whether g() = n.

Important distinction.

This means that somebody could claim that the value of BB(748) = n but you cannot be sure if they are correct (but you might be able to show they are wrong).
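A tiny C sketch of that distinction (the constant below is a hypothetical placeholder, not the actual value of BB(748)):

    #include <stdio.h>

    /* For any fixed natural number n, the constant function returning n is
       trivially computable. The subtlety with BB(748) is not computing *a*
       constant, it's proving *which* constant is the right one. */
    static unsigned long long claimed_bb_748(void) {
        return 4096ULL;   /* made-up placeholder, NOT the real value */
    }

    int main(void) {
        /* Someone could claim this is BB(748); you might refute the claim
           (e.g. by exhibiting a 748-state machine that halts after more
           steps), but you cannot verify it in general. */
        printf("claimed value: %llu\n", claimed_bb_748());
        return 0;
    }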


Noncompliant, but what could this reasonably impact?


Pointers are frequently used as keys for map-like data structures. This introduces collisions that the programmer can't check for, whereas NULL is very often special-cased.
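As a rough sketch of that concern, assuming a program that uses zero-size allocations as distinct sentinel keys (the names are made up for illustration):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* Two logically distinct sentinels, e.g. keys in a pointer-keyed map. */
        void *sentinel_a = malloc(0);
        void *sentinel_b = malloc(0);

        if (sentinel_a == NULL || sentinel_b == NULL) {
            puts("this implementation returns NULL for malloc(0) (also allowed)");
        } else if (sentinel_a == sentinel_b) {
            puts("collision: the two keys silently alias each other");
        } else {
            puts("distinct non-NULL pointers: usable as distinct keys");
        }

        free(sentinel_a);   /* free(NULL) is a no-op, so this is safe either way */
        free(sentinel_b);
        return 0;
    }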


> Noncompliant, since `malloc(0)` is specified to return a unique pointer if it's not `NULL`.

I know I've seen that somewhere, but may I ask what standard you're referring to?


It's POSIX.

> Each [...] allocation shall yield a pointer to an object disjoint from any other object. The pointer returned points to the start (lowest byte address) of the allocated space. If the space cannot be allocated, a null pointer shall be returned. If the size of the space requested is 0, the behavior is implementation-defined: either a null pointer shall be returned, or the behavior shall be as if the size were some non-zero value, except that the behavior is undefined if the returned pointer is used to access an object.

https://pubs.opengroup.org/onlinepubs/9799919799/functions/m...


Not just POSIX, also the ISO C standard itself. https://en.cppreference.com/w/c/memory/malloc


That doesn't say the pointer has to be unique.


cppreference isn't the standard, and while the text they write looks like it's the same verbiage that would be authoritative, it's not. (And there's some criticism of it from standards committee members in that regard).

The current C standard text says:

> The order and contiguity of storage allocated by successive calls to the aligned_alloc, calloc, malloc, and realloc functions is unspecified. The pointer returned if the allocation succeeds is suitably aligned so that it can be assigned to a pointer to any type of object with a fundamental alignment requirement and size less than or equal to the size requested. It can then be used to access such an object or an array of such objects in the space allocated (until the space is explicitly deallocated). The lifetime of an allocated object extends from the allocation until the deallocation. Each such allocation shall yield a pointer to an object disjoint from any other object. The pointer returned points to the start (lowest byte address) of the allocated space. If the space cannot be allocated, a null pointer is returned. If the size of the space requested is zero, the behavior is implementation-defined: either a null pointer is returned to indicate an error, or the behavior is as if the size were some nonzero value, except that the returned pointer shall not be used to access an object.

So yeah, the allocations are required to be unique (at least until they're freed).


> Each such allocation shall yield a pointer to an object disjoint from any other object.

The phrasing could be slightly clearer, to prevent someone from arguing that -1 is disjoint from all objects because it does not point to an object.


And if you use 0 as the value of the NULL pointer, then -1 can't ever point to an object (because adding 1 to a pointer to it must yield a valid, non-NULL one-past-the-end pointer, so that pointer comparisons are not UB).

So yeah, C implementations have to reserve at least two addresses, not just one. By the way, the standard to this day allows NULL, when converted to a pointer type, to be something other than the all-bits-zero pattern (and some implementations have indeed taken that opportunity).
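A small sketch of that last point; on mainstream platforms both checks print 1, but only the first is guaranteed by the standard:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int *p, *q;

        p = NULL;                  /* guaranteed to compare equal to NULL */
        memset(&q, 0, sizeof q);   /* all-bits-zero representation: the standard
                                      does NOT require this to be a null pointer */

        printf("%d\n", p == NULL); /* always 1 */
        printf("%d\n", q == NULL); /* 1 on mainstream platforms, not required */
        return 0;
    }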


But adding 1 to a pointer will add sizeof(T) to the underlying value, so you actually need to reserve more than two addresses if you want to distinguish the "past-the-end" pointer for every object from NULL.

--

While it's rare nowadays to find a platform that uses something other than a zero bit pattern for NULL as a normal pointer type, it's extremely common in C++ for pointer-to-member types: 0 is the first field at the start of a struct (offset 0), and NULL is instead represented with -1.


> so you actually need to reserve more than two addresses if you want to distinguish the "past-the-end" pointer for every object from NULL.

Well, yes and no. A 4-byte int cannot reside at -4, but a char could; and no object at all can reside at -1. So implementations need to take care that one-past-the-end addresses never compare equal to whatever happens to serve as the null pointer, but this requirement only makes address -1 completely unavailable for C-native objects.
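A minimal illustration of the guarantee in question; the comparison below must hold on any conforming implementation:

    #include <stdio.h>

    int main(void) {
        char buf[16];
        char *end = buf + 16;   /* one-past-the-end: valid to form and compare */

        /* The implementation must ensure this never compares equal to NULL,
           which is why the address right after any object can't be whatever
           representation the null pointer uses. */
        printf("%d\n", end != NULL);   /* prints 1 */
        return 0;
    }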


On a system where -1 points to an object, I don't think comparing that to null(0) would be UB, because null is one past an object.


It is in ANSI 89, under memory management functions.


Thanks, I'd missed that.


The first sentence of the article says:

> These credentials weren’t recycled from old hacks or reposted from public breaches. They’re new, undocumented, and highly dangerous


Yeah they say that. They also write "they’ve spent months digging through the mess". Wait, what? They spent months not telling anyone. Something is fishy here.


Objectively the Vision Pro was innovative.


I don't see it.

To me, it seems to have the same problems as the Google Glass, in that it's far too expensive and doesn't have a clear idea what its own USP is.

That said, while I've played with a few different VR headsets, I've not had a chance to play with the VP, so perhaps there's something in the quality that would become visible if not for the prohibitive price.


The 128K Mac was not too far off from being a promising toy (it wasn't until the Apple LaserWriter became available that the true promise/utility was made real) --- you have to build the expensive first version for early adopters so that you can later figure out how to make the affordable version folks will actually buy and use.


Content is indeed key.

But you don't have to start expensive: cheaper headsets already existed for several years before it came out, and those are perfectly adequate games consoles; and of the reviews I've seen for home cinema and virtual monitor uses, nobody seems to prefer AVP over other headsets half the price.


It has a lot of drawbacks; I agree completely. I don’t own one, I don’t like it as a product, and I wouldn’t buy one, but the tech was undeniably innovative.


I do not own any AAPL (neutral feelings; don't know enough about their direction).

Unfortunately, my mid-sized US city still doesn't have an Apple Store... so I haven't demoed the Vision Pro yet; but I want to be able to see if it's endorsable for my friend with Parkinson's (whose eyes still work well); I suspect if I demoed it I'd also purchase one too.

Absolutely seems innovative.

