Hacker News

I really want to see the people who have performance issues with Zod, and what their use cases are.

I mean it.

I've been parsing (not just validating) runtime values for a decade (io-ts, Zod, effect/schema, tcomb, etc.), and I find the performance penalty irrelevant in virtually any project, frontend or backend.

Seriously, people will fill their websites with Google tracking crap, 20,000 libraries, and React crap for a simple CRUD, and then complain about millisecond differences in parsing?



We use it heavily for backend code, and it is a bit of a hot path for our use cases. However, the biggest issue is how big the types are by default. I had a 500-line schema file that compiled into an 800,000-line .d.ts file, occupying a huge proportion of our overall type-checking time.


That sounds absolutely absurd.

Are you using a lot of deeply nested objects + unions/intersections?


A fair number of unions, yeah. Which also means some of the tricks for keeping the types small don't work, i.e. taking advantage of interface reuse.


Yup. I maintained an e-commerce site where the products came from a third-party API. The products often had 200+ properties, and we often needed certain combinations of them to be present to display them. We created schemas for all of them, had to transform the data quite a bit, and used union types extensively. When displaying a product list with hundreds of these products, Zod would take some time (400+ ms) to parse them; Valibot took about 50 ms. Editor performance was also noticeably worse with Zod, taking up to three seconds for code-completion suggestions to pop up or type inference to complete, though truth be told, Valibot was not significantly better here at the time.

I agree though, that filling your website with tracking crap is a stupid idea as well.


Zod is the default validator for https://github.com/gajus/slonik.

Zod alone accounts for a significant portion of the CPU time.


> In the context of the network overhead, validation accounts for a tiny amount of the total execution time.

> Just to give an idea, in our sample of data, it takes sub 0.1ms to validate 1 row, ~3ms to validate 1,000 and ~25ms to validate 100,000 rows.


I’ve used it on the backend to validate and clean up tens of thousands of documents from Elasticsearch queries, and the time spent in Zod was very much noticeable.


My issue is TS server performance, not so much runtime.



