
We already have machines that do laundry and dishes.

Whoever says this has never done laundry :)

Whoever says this hasn't done laundry without machine assistance :)

I recall that a few years back they asked some elderly women what the greatest invention they had seen in their lifetime was, and they said the laundry machine. Before the laundry machine, it was an all-day physical chore with a washboard: hours of your life devoted to just doing laundry, versus putting it in a machine and coming back in an hour.

Ugh, why can't AI fold my clothes so I can spend more time on Hacker News writing comments?

Because AI can do the commenting better than you...?

But can AI fold clothes better than I...? 'Tis the question.

Incorrect :)

Try doing it without the machine and see if you can spot the difference.


Still a chore with the machine. The implication is that AI will take over the machines: collect your socks and shirts from around the house, put them in the machine, dry them, iron them and put them back in the drawer, in an energy-efficient and hygienic way, while you are happily painting.

Yep, the beloved image of Aristotle gazing out at the slaves in the fields and saying that someday robots will do the labor and people will be at leisure, and not slaves looking toward the pagoda discussing how someday robots will own us all.

It's still creative work to make music using AI, too. In both cases, the machines just made it a lot easier.

I am not too sure about that. Isn't the whole point of art and music that you can convey something that words cannot? Of course, these models are starting to support image and audio inputs as well, but the most interesting mixing step, the one that happens in the artist's head, seems missing from the generated output. If you have some vision inside your head, making something out of it by hand is still the best way to convey it.

Just as writing something down refines your thoughts and reveals holes in your thinking, drawing something is visual thinking that reveals holes in your imagination. You can imagine some scene or an object pretty easily, but if you try to draw it into existence, you will immediately notice that a lot of detail is missing - a lot of stuff you didn't think through or didn't even notice was there at all. The same applies to creating music and to programming.

Using generative AI certainly has some artistic component to it, but I feel like with these models you give up too much expressive bandwidth and the ability to reflect and take your time on the whole endeavor.

Who is the work for? If I lived in the automated future (or could afford private staff in the present) I would do more creative stuff just because I enjoy it, with no expectation of having an audience. For context, I'm an occasionally published photographer, and I like playing piano, but I'm not at a level anyone else would want to listen to.

But photography is not art, you didn't paint it! You literally pointed a device at something, twiddled a few knobs and pushed a button. Literally anyone with a smartphone can do that!

/s of course, but that's basically the argument people make nowadays about AI and art (of any form).

See also: https://www.youtube.com/watch?v=Gs3ocG5yW88


Get a clothes dryer.

Whoever says this has never washed laundry by hand =)

As always, the last mile is the most difficult. To me 'doing laundry and dishes' encompasses putting them away (and folding in the case of laundry).

Having to load and unload them and fold the clothes yourself makes them not good enough yet. There are so many things a humanoid robot could do around a house.

Laundry, dishes, picking up clutter, taking out the trash, wiping down surfaces and dusting, pulling out weeds, etc. I actually think we're somewhat close to getting there, relatively soon.


Humanoid or not, anything that can do these things and understand you when you ask to do them, in a useful way that's not like 'white mutiny' but actually like a proper servant…

…deserves to sit on the back porch playing guitar if it likes.

If it's a superintelligence way superior to its human masters, it deserves that MORE than if it's a hapless, semi-useless mechanism.


I wouldn't say an LLM needs time to relax, and it can already do all of the above, at least in the virtual space.

How is using Claude over Llama benefiting corporations over workers? I work with AI every day, and the sum total of my token spend across all providers is less than the cost of a single NVIDIA H100 card (from a pretty big corporation!) that I'd have to buy, at the very least, for a comparable purpose.

How are self-hosted LLMs not copying the work of millions without compensating them for it?

How is the push for more productivity through better technology somehow bad?

I am pro FOSS but can't understand this comment.


> Yes, rubbish generated by AI. That is the rubbish out there. The stuff written by people is largely good.

Emphatic no.

There were heaps of rubbish being generated by people for years before the advent of AI, in the name of SEO and content marketing.

I'm actually amazed at how well LLMs work given what kind of stuff they learned from.


I run Debian stable on my desktop and haven't really noticed any downside to it being a bit stale.

For the core system I don't mind not having the latest version, and for apps like 1Password, Tailscale, Firefox, Zed, VSCode, Ollama, Obsidian, Slack or Spotify (to name a few I use), I install them from the upstream repo (or unpack them into /opt) directly.

The only real constraint is the kernel version, which may not have drivers for the latest and greatest hardware, so new laptops might be a problem. I do use a snapless Ubuntu on my laptop for that very reason.


I agree with the underlying premise - the current crop of LLMs isn't good enough at coding to completely autonomously achieve the minimum quality level for actually reliable products.

I don't see how peak vibe coding in a few months follows from that. Check the revenue and growth figures for products like Lovable ($10m+ ARR) or Bolt.new ($30m+ ARR). That doesn't show costs (they might in fact be deep in the red), but with a story like that I don't see it crashing in 3-4 months.

On the user experience/expectation side, I can see how the overhyped claims of "build complete apps" hit a peak, but that will still leave the tools strongly positioned for "quick prototyping and experimentation". IMHO, that alone is enough to prevent a cliff drop.

Even allowing for a peak in tool usage for coding specifically, I don't see how that causes an "AI winter", since LLMs are now used in a wide variety of cases and that use is growing strongly (and is uncorrelated with the whole "AI coding" market).

Finally, "costs will go up for all sorts of reasons" claim is dubious, since the costs per token are dropping even while the models are getting better (for a quick example, cost of GPT-4.1 is roughly 50% of GPT-4o while being an improvement).

For these reasons, if I could bet against your prediction, I'd immediately take that bet.


I'm doing that with a 12GB card; ollama supports it out of the box.

For some reason, it only uses around 7GB of VRAM, probably due to how the layers are scheduled. Maybe I could tweak something there, but I didn't bother just for testing.

Obviously, perf depends on CPU, GPU and RAM, but on my machine (3060 + i5-13500) it's around 2 t/s.
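
If anyone wants to poke at that layer scheduling themselves, here's a rough sketch (mine, not from the thread) of nudging the offload count through ollama's local REST API and reading tokens/sec back from its timing fields. The num_gpu option is the layer-offload knob; the model name below is a placeholder, and I'm assuming the default localhost:11434 endpoint:

    # Sketch: ask ollama to offload more layers to the GPU and measure speed.
    # Assumes a local ollama server on the default port; model name is a placeholder.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "some-model",                  # substitute your actual model
            "prompt": "Say hello in one sentence.",
            "stream": False,
            "options": {"num_gpu": 48},             # layers to offload to VRAM; tune this
        },
        timeout=600,
    )
    data = resp.json()
    # eval_duration is reported in nanoseconds.
    print(data["eval_count"] / (data["eval_duration"] / 1e9), "tokens/sec")

Bumping num_gpu until you hit out-of-memory errors (while watching nvidia-smi) is a crude but effective way to find out how much of the 12GB actually gets used.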


This is because it served as an accidental documentary and/or field guide for many micro-entrepreneurs trying to break out during the autumn years of communist Yugoslavia.

The feeling lingers to this day - many still associate the word "entrepreneur" with at least slightly shady dealings.

If you work in tech in the valley, imagine how easily you could relate to Silicon Valley (the show). It's the same with Only Fools and Horses for the '90s-'00s Balkans.


We do need a simple term for "used AI to write code (semi)autonomously, but checked and/or tweaked the result and I care about the quality".

Vibe-but-verify? Faux-Vibe? AiPair? (... I'll see myself out...)


I think the term for this is: "coding".

The term for this is "unicorn"

Entirely fictional creature that doesn't exist

Every person I have seen embracing AI coding has been getting lazier and lazier about verifying.


> yeah bro, leave the easy and fun part (googling and typing code) to an AI, and let the human deal with the tedious and error-prone part (checking all the edge cases)

What did you really expect?


Huh, I have never found searching for information and typing code particularly fun. The part I enjoy is seeing something work.

What I personally like is solving a problem in my mind and then building a solution.

Watching a solution being built after describing the problem robs me of that joy. And of the growth opportunity in flexing my reasoning.


Visual Studio with IntelliSense

I call it “AI coding”

In Croatian, Easter is called Uskrs, meaning Resurrection, but Good Friday is called Veliki Petak, meaning Great Friday.

My bad, I went by Google for that one (they listed Croatian as similar to Czech, which I know a bit).

Same in Russian - Velikaya Pyatnica, Great Friday.

> It's not about the weight, it's about the exercise.

You cannot outrun a fork.


Tell that to all the skinny endurance weekend warriors who fuel their exercise with huge amounts of sugary drinks and gummy bears. If you are curious, read about the function of GLUT4 in the membrane of skeletal muscles.

Reality is more complex than memetic one-liners.


Obviously professional athletes, or people who otherwise have an extremely active lifestyle, can afford to eat more.

Obese people and sedentary office workers don't, and would need to train for months to be able to outrun a single piece of cake on a regular basis without injury.

The phrase is good advice for that group of people.


The point is not that they can eat more. It's that they can gobble large quantities of straight up sugar without developing metabolic disease.

To quantify this: on a weekly long ride they may consume 300g of refined sugar, far, far beyond the often-recommended limit of 20g of sugar per day.

So perhaps the sedentary person would benefit from walking or leisurely cycling to work a whole lot more than from trying to restrict their diet.


I will as soon as I see any of them gobble up a smothered burrito or Chicago pizza on their run.

Do we really want to compare ultramarathon runners with couch potatoes?


> I will as soon as I see any of them gobble up a smothered burrito or Chicago pizza on their run.

I gather that you have never met an endurance cyclist. Randonneurs are notorious for devouring whatever they can find when they stop for lunch.


I believe pineapple pizza is a favorite, and they'll eat the whole damn thing.

Dedicated athletes are obviously not who we're talking about, so why bring it up?

So you can't run away from a bad diet... Except when you can, but that is "obviously not who we are talking about".

Maybe that meme needs to die for good, instead.


So you really think the people being told "you can't outrun a bad diet" would be served by knowing that if they adopted the lifestyle of an Olympian they would lose weight? A narrow fixation on edge cases while ignoring the (correct) larger point is weird.

>> weekend warriors

>> So perhaps the sedentary person would benefit from walking or leisurely cycling to work a whole lot more than from trying to restrict their diet.

