Hacker News | new | past | comments | ask | show | jobs | submit | login

I just don't get the point of Ladybird. They have full-time engineers and are soliciting donations, so it's clearly more than a hobby project. Maybe my assumptions are off, but I just can't imagine they could ever become competitive with the big engines in features, security, and performance. Blink is setting the pace, WebKit is barely able to keep up, and Gecko is slowly falling behind. All of those teams are orders of magnitude larger than the Ladybird team. If you think Blink's dominance is a threat to the web, it's not enough to have an alternative engine; you need enough adoption of that engine that web devs make sure their sites are compatible with it. Most of this also applies to Servo, but their technical project goals (embeddable, modular, parallel, memory safe) at least sound moderately compelling. Maybe Ladybird has similar goals, but its website doesn't really state any technical goals.




It is donation-funded with no reliance on outside parties. They don't have to inject ads into pages like Brave did, or sell out to Google and compromise their independence on web standards.

They're ahead of Servo already anyway, and better funded.


In the last 24h alone, Chromium merged almost 900 CLs (their equivalent of a pull request) into the src/Chromium repo; Ladybird had 7. Yes, a project that started fresh a couple of years ago with decades of hindsight can be more efficient than one started 16 years ago as the fork of a fork, but if I had to guess, they'll sooner or later reach a point where they've implemented the low-hanging fruit and Chromium moves away faster than they can catch up.

I'm not even adjacent to webdev; can someone explain why people keep making pages using new stuff? I get why Google keeps adding things, but why do people use them? Well over half the pages I go to look better without JS. HN looks identical.

Many of the newer standards added to the web platform boil down to supporting development of web apps, not only web pages. For example, the current iteration of the File System API in Chrome (https://developer.mozilla.org/en-US/docs/Web/API/File_System...) allows web apps to request permission to read and write to the user's file system. This is great for many tools.
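As a rough illustration (not from the thread), writing a file through that Chrome API looks something like the sketch below. The entry points (`showSaveFilePicker`, `createWritable`) are the real API; the function name `saveNote` and its arguments are made up for the example.

```javascript
// Illustrative sketch of the File System Access API shipped in Chrome.
// Must run in a secure context and be triggered by a user gesture.
async function saveNote(text) {
  // Prompt the user for a save location; returns a FileSystemFileHandle.
  const handle = await window.showSaveFilePicker({
    suggestedName: "note.txt",
    types: [{ description: "Plain text", accept: { "text/plain": [".txt"] } }],
  });
  const writable = await handle.createWritable(); // FileSystemWritableFileStream
  await writable.write(text);
  await writable.close(); // Changes are only committed on close.
}
```

The browser mediates every step with a permission prompt, which is exactly the kind of capability (and attack surface) that plain web pages never needed.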

Of course, this can be fairly controversial: More app-like capabilities lead to more complex standards and harder-to-secure browsers. There's also overengineering where people use web app techniques to develop web pages. You don't need (much) JS or even any of the new standards for HN, but for something like Google Docs or Figma, it's a different story.


> Chromium merged almost 900 CLs ... Ladybird had 7

Imagine being in your annual review and your boss has a chart of your performance compared to your peers and it's just the count of PRs you merged. You begin to protest that merged PRs is not a good metric for contributions, then he switches to the next slide which is just this comment you made on HN...


Depends on what's in those CLs, I guess. Google stuff like automatic sign-in, passkeys, prefetching? Developer tools? Very little of it is core web.

Not being funded by Google money is a pretty big deal. Some of the developers are former WebKit devs, so they have a good foundation to start from. It remains to be seen if they can pull it off.

Orion adding Windows support (getting WebKit running on Windows again) would be pretty good too.


WebKit runs on Windows, the Windows port just needs work to bring it up to the level of the Linux port. I got every JIT tier enabled in JavaScriptCore [1] and enabled libpas (the memory allocator). The Windows port is moving to Skia in line with the Linux port.

Really just needs more people (and companies) pushing it forward. Hopefully Kagi will be contributing improvements to the Windows port upstream.

[1] https://iangrunert.com/2024/10/07/every-jit-tier-enabled-jsc...


Ladybird has better results in web rendering tests than Servo, and is slowly gaining on Firefox.

They are already quite competitive.


In the update videos posted on the Ladybird YouTube channel, they say they have exhausted most of the low-hanging fruit in terms of correctness. Browsers and the web standards have a very long tail of odd behavior that you need to implement. I could be wrong, but if I had to guess, they'll stall at a point where it's just good enough that some people can make it work, but not really useful for general use.

True, but the longer the tail, the less likely you are to be affected by it.

As of a couple of weeks ago, it can display all the major commercial news websites I use in my country.

If it works for most websites and you only have to reach for Chrome sometimes, it would still be perfectly usable.


Ladybird is extremely slow, it's far from being competitive at all.

They're prioritizing correctness to the spec over speed and are still 'officially' in pre-alpha. It's still to be determined how well they can bridge the gap there.

For casual web browsing it's plenty fast enough already to do a lot of things, but they're a relatively small team fighting against decades of optimization.


All browsers are fast enough once you block all the useless web bloat.

But Ladybird's explicit goal is to work on the "real web", i.e. without blocking all that bloat

What? No one is expecting Ladybird to be fast at this stage. No one is claiming that it is. Ladybird is competitive because of the speed at which it is improving.

Very unfair to look at Ladybird and call it slow when it's not even alpha and shouldn't be used yet.

Remember, you're experiencing a debug build of pre-alpha software.

Larger teams do not necessarily get stuff done faster. If anything, past a certain point a large team can find it hard to get things moving and has tons of communication issues.

Well, Andreas Kling has worked on Safari and WebKit and (obviously) has talked to a lot of browser people. He knows what he is doing, and he frequently says that no one who has actually worked on a browser thinks it's impossible to create a new one, even with a small team (...of highly motivated and skilled people).

I think there are two things to keep in mind.

1) Apple and Mozilla have enough resources to implement the most recent web standards. When you see a feature that goes unimplemented for too long, it's almost surely because nobody was even working on it due to internal resourcing fights.

2) Devs aren’t created equal. It’s possible for a team of 8 people to be 10x more productive than another team of 8.


> When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.

Or because they are reluctant to implement it for technical reasons? Not every "standard" that gets thrown on the table and implemented by Google is a brilliant idea.


What's wrong with WebKit? It's super fast. I tested the Orion browser recently.

Good implementations are hard to come by outside Apple's ecosystem.

If anything, Ladybird is an independent implementation of the web standards, and the devs have identified and helped solve quite a few bugs and ambiguities in the standards, which benefits everyone, from browser devs (including the big guns) to web developers and users.

I think these browser attempts are all going about it wrong by trying to "cover all edge cases"; they should focus on being able to transform any and all dark patterns down into something simpler.

Hell, start out by extracting text, pictures, and video from all the nightmare sites. Then slowly add whatever features don't actually lead to dark patterns.


There was a similar sentiment around 0xide.


