I've been leaning on ChatGPT about 1/4 of the time for programming personal projects. It's a great way to get boilerplate out of the way, and to quickly introduce me to things I didn't know about. Once I get started, though, I stop using the AI. I'd use it for code reviews, but it isn't smart enough to thoughtfully make comments about how data is pipelined through a complete application (it falls short in other ways too, but that's the big thing I noticed).
> it isn't smart enough to thoughtfully make comments about how data is pipelined through a complete application
My take is that data-centric programming requires too much context for GPT, and we're going to see a move back to doing things in a more OO way, hopefully with better languages than exist now. The ability to reason locally about objects helps both human and AI developers, so we can build larger systems. Data-centric/functional programs are akin to the hand-crafted, artisanal goods that were slowly overtaken by standardized parts and division of labor during the industrial revolution, and software is in the middle of its own industrial revolution [0]. By the end of it, "software engineering" may no longer be an oxymoron.
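Roughly what I mean by local reasoning, as a toy Python sketch (the Account class is made up for illustration, not from any real codebase):

```python
# An object that owns its invariant can be understood (by a human or an LLM)
# from this class alone, without reading the rest of the application.
class Account:
    """Balance can never go negative; every caller gets that guarantee."""

    def __init__(self, balance: int = 0) -> None:
        if balance < 0:
            raise ValueError("initial balance must be non-negative")
        self._balance = balance

    def withdraw(self, amount: int) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> int:
        return self._balance


# Contrast: a bare dict flowing through a pipeline. Any step anywhere in the
# program can push balance negative, so checking the invariant means reading
# every function that ever touches the dict -- i.e. the whole application.
account = {"balance": 100}
```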
The reason I mentioned data pipelines is that I've adopted a compiler-type view of programming, where data is mutated through serial and/or parallel steps until a satisfactory side effect or output is reached. I find it's a lot easier to define a problem when I think about it this way.
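Concretely, something like this toy pipeline of passes (Python, all names invented for the example):

```python
from functools import reduce
from typing import Callable, Iterable

# A "pass" takes the data in one state and returns it in the next state,
# the same way a compiler runs a source file through a series of stages.
Pass = Callable[[list[str]], list[str]]

def tokenize(lines: list[str]) -> list[str]:
    return [tok for line in lines for tok in line.split()]

def lowercase(tokens: list[str]) -> list[str]:
    return [t.lower() for t in tokens]

def drop_stopwords(tokens: list[str]) -> list[str]:
    stop = {"the", "a", "an"}
    return [t for t in tokens if t not in stop]

def run_pipeline(data: list[str], passes: Iterable[Pass]) -> list[str]:
    # Thread the data through each pass in order until the output pops out.
    return reduce(lambda acc, p: p(acc), passes, data)

print(run_pipeline(["The quick Brown fox"], [tokenize, lowercase, drop_stopwords]))
# ['quick', 'brown', 'fox']
```

Defining the problem then becomes defining the passes and the shape of the data between them, which is a much smaller thing to hold in your head (or in a model's context) than the whole program.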