Request for the bun team: please provide a clear support policy/EOL timeline. Bonus points for clarity on the stability guarantees that are offered between versions and modules.
I don't think the bun devs are currently parcelling things up like that so doubt this would be worth the effort at this time, i.e. I don't think they're backporting any fixes from 1.1 into a 1.0.x release.
As far as I'm aware, Rust has been doing this for 9 years now (77 new versions). I'm not sure if that's been "expensive", but people seem to like it and it's working well so far.
I use both Deno and Bun in production (albeit, on different projects).
They are both great upgrades from Node, in particular with the first-class support for TypeScript.
Bun is great for large projects, with enhanced DX over any Node-based environments I have worked on - I use it for a monorepo project with several frontends and a GraphQL backend. Involved test suites run in 5 seconds, etc.
Deno seems to work really well in lambda-style environments (I use it with Supabase) due to its module approach, where scripts are entirely stand-alone. This is great for small scripts to glue things together.
Bun has -i/--install=fallback which I thought was pretty similar to Deno but I haven't used Deno much to compare. I was thinking about starting to write my scripts with `#!bun -i` but haven't fiddled with it much yet.
The API for the shell function is kind of neat, in that it seems to prevent you from accidentally creating shell injection vulnerabilities by calling it without properly quoting the arguments.
It uses types to get quoting right? Or it quotes everything (regardless if it's already quoted)?
Ironically, the first time I saw the former was in a Python templating library (in the early 2000s -- from distant memory I think it might have been the work of the MemsExchange team?)
Formatters basically differentiate the literal parts of the string and the template arguments. There's also a neat postgres library that does the same for sql quoting.
Tagged templates[0], the language feature that enables this, were introduced in ECMAScript 2015 apparently – arguably at least somewhat new in the lifespan of JavaScript. :)
Java is getting a similar feature with template processors[1], currently in preview.
It would be nice to have it in Python as well – i.e. not just f-strings, but something that (like tagged templates) allows a template function process the interpolated values to properly encode them for whatever language is appropriate (e.g. shell, SQL, HTML, etc.). Apparently someone is working on a proposal[2], although there doesn't seem to be much recent progress.
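A minimal sketch of the idea in TypeScript (illustrative only; this is not Bun's actual implementation, and the `sh` tag name is made up). The tag function receives the literal string parts separately from the interpolated values, so it can quote only the values:

```typescript
// Single-quote each interpolated value, escaping embedded single quotes
// for POSIX shells; literal parts pass through untouched.
function sh(strings: TemplateStringsArray, ...values: unknown[]): string {
  const quote = (v: unknown) => `'${String(v).replace(/'/g, `'\\''`)}'`;
  return strings.reduce(
    (out, part, i) => out + part + (i < values.length ? quote(values[i]) : ""),
    ""
  );
}

const userInput = "foo; rm -rf /";
const cmd = sh`echo ${userInput}`;
// The injection attempt is neutralized: echo 'foo; rm -rf /'
```

Because the tag sees which parts came from interpolation, safe quoting is the default rather than something the caller has to remember.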
Wow that’s pretty neat. Because of that I also learned about import attributes (https://github.com/tc39/proposal-import-attributes) which is probably going to be quite useful and make the 50 lines of imports in some of my files look even dumber.
Seems like a good release.
I watched their video, and some charts were a bit unclear, as in, I didn't know if they were comparing with the previous Bun version or Node.js.
My experience with using Bun in side projects has been good. The built-in APIs work well in my experience, and I hope popular runtimes adopt at least a subset of them. The hashing and the SQLite bindings come to mind as APIs that I wish were available in Deno and Node.js as well.
They collect some telemetry by default. I don't think the install script tells you that. The only mention of it that I've found is in the bunfig documentation: https://bun.sh/docs/runtime/bunfig#telemetry
I'd prefer if it was opt-in, and that users were given instructions for disabling it if they want to during installation.
They offer an option to create a dependency-free executable for your project, which bundles the runtime with your .js entrypoint. That works great if you want a single binary to distribute to users, but at the moment, the file size is still pretty big (above 90MB on GNU/Linux for a small project). Not terrible, but nothing comparable to Go or QuickJS yet. I wonder if in the future, Bun would offer an option to compile itself with certain features disabled, so we'd get a smaller binary.
I have been playing with using Bun as a Haxe target. It works pretty well, and IMO it's a choice to consider if you like Haxe more than TypeScript, or if you want to add a web server to an existing Haxe project without adding another programming language. You can also do things like generating validation code at compile time, which seems hard to do with just TypeScript.
Note that we do not currently send telemetry anywhere. The extent of what we track right now is a bitset of which built-in modules were imported, a count of how many HTTP requests were sent with fetch(), and a few things like that. This is used in the message printed when Bun panics, so we can have a better idea of what the code was doing when it crashed.
Sorry, I didn't understand your post. From the last two sentences, it seems that you do collect some data.
So, when you say
> Note that we do not currently send telemetry anywhere.
You mean that Oven doesn't send that data to someone else, right?
I still see value in having a privacy policy so that users can find out what is collected by Oven in a concise way, and how to opt out of it. As far as I know, the fact that any data is collected at all, and that there's a flag to disable it, is only mentioned in a documentation page for Bun's TOML config file.
That is unrelated. This is a code signing / notarization issue because we don't distribute Bun via the Mac App Store, and likely the way it was installed was via npm or something other than the .zip file we distribute. Code signing is necessary due to the JIT entitlement in Bun (otherwise Bun would be a whole lot slower).
This is dead code from before Bun 1.0. This code exists but is not run and probably stripped from the final executable (Zig is great at dead code elimination). We do get the Linux kernel version to detect if syscalls like pidfd_open are supported and enable fast paths.
> I'd prefer if it was opt-in, and that users were given instructions for disabling it if they want to during installation.
So you mean an informed opt-out, right?
Thank you for the interesting hands-on experiences and insights regarding Bun. I've followed the coverage and posts by Jarred here on HN for quite a while, I think since the initial alpha release, but haven't used it.
Will keep your examples of helpful added platform APIs in mind for when I hopefully come around to doing something with Bun!
It sounds like a great platform for JS scripting as well - I also think that could be a good and easy way to test the waters.
Really, kudos to Bun and Jarred Sumner for living up to the promise he made when the first version was announced!
> So you mean an informed opt-out, right?
I worded it wrong - I meant I'd prefer if it was opt-in, or at least informed opt-out, like you said. Thanks for pointing it out. :)
> It sounds like a great platform for JS scripting as well - I also think that could be a good and easy way to test the waters.
It is. Some other features you might enjoy are the built-in TypeScript support and test runner. It works well for one-off scripts too, if you'd prefer not using Bash.
For me, it was refreshing coming from Node.js. Hope it is an enjoyable experience for you as well.
Now that Discord will start showing ads, is there any chance Bun will support a communication platform that is open & ad-free like IRC, XMPP, or Matrix?
What kind of question is this? He is working on a JavaScript/TypeScript runtime not building a communication product. Why would you run to him to solve your pet grievance with Discord?
I imagine the complaint is around Bun's use of Discord as community coordination tooling -- linked in the site's header. The post isn't implying Bun should be involved in the creation of an alternative.
(I'm not here to throw shade at using Discord for OSS "communities" - I do as well - And am concerned about the path forward. Just want to clarify the question's intent.)
If open source, free software is a good enough ethos for your code base, it should be good enough for your community communications. Supporting only a proprietary platform locks out a swath of users, and Discord in particular is an information black hole—and now with ads!
Users that need special clients (accessibility, hardware, etc.). Users blocked by US sanctions. Users that have been moderated off the platform for something not in your community (account bans even happen accidentally). Users with privacy/anonymity concerns about the data collection (& now ads)—especially the chat rooms that require a SIM card. Users that take their FOSS or otherwise ethical software views or anti-corporate views to heart & want something built on those principles—from wanting to use free software to make free software, to wanting to outright avoid what some now call enshittification, where a free (gratis) account clashes with the idea of freedoms, etc.
I think they used the word "support" to mean "use", rather than "build". That is - they weren't asking Bun developers to _build_ an alternative to Discord, but rather to stop _using_ Discord.
The opt-out telemetry is worrisome. Along with the fact that there doesn't appear to be a way to disable it for single-file executables if you plan to distribute a Bun cli app to users: https://github.com/oven-sh/bun/issues/8927
Are there plans for adding concurrent or parallel execution to the test runner? I recently tried looking at the code base to maybe implement it myself, and it looks like it wouldn't be easy without some reworks.
We need to do some form of this, but I'm not exactly sure what yet. I suspect same process but multiple globals might work well. A lot of tests spend time sleeping or waiting for things. They might benefit from that kind of parallelism (like async/await, except between things it runs a whole other global object).
Threads could also work, but the problem is you have to re-parse & evaluate all the code. That's a lot of duplicate work. It's probably still worth it for large enough apps.
Isn't there some way of cloning a loaded VM after loading a module? I would imagine that should be possible somehow; that way you could parse once, then execute in multiple threads.
I find it hilarious that we now present runtimes and other programming stuff like it was Apple presenting a new iPhone. This would be satire 15 years ago. No disrespect to Bun tho, I love Bun.
I feel like such a downer when I ask this about Bun and Deno, but: why should I use them instead of Node?
I don’t mean to take away from the obviously impressive engineering effort here. But VC funding always gives me pause because I don’t know how long the product is going to be around. I was actually more interested in Deno when it promised a reboot of the JS ecosystem but both Bun and Deno seem to have discovered that Node interoperability is a requirement so they’re all in the same (kinda crappy) ecosystem. I’m just not sure what the selling point is that makes it worth the risk.
We could drastically simplify the building and deployment process of our services. By far the greatest advantage is that it runs TS natively. Dropping the compilation stage simplifies everything. From docker imaging to writing DB migrations to getting proper stack traces.
You don't need source maps. You don't have to map printed stack traces to the source. Debugging just works. You don't need to configure directories because src/ is different than dist/ for DB migrations. You don't have to build a `tsc --watch & node --watch` pipeline to get hot reloading. You don't need cross-env. No more issues with cjs/esm interop. Maybe you don't even need a multi-stage Dockerfile.
That's for Bun. Deno might have a similar story. We did not opt in to the Bun-specific APIs, so we can migrate back if Bun fails. Maybe we could even migrate to something like ts-node. Shouldn't be that hard in that case.
IMHO the API of Bun, as well as the package manager, sometimes tries to be _too_ convenient or is too permissive.
Kind of. When you do try to run bun in production you'll find out that it has significant differences to node -- like not handling uncaught exceptions: https://github.com/oven-sh/bun/issues/429
We’ll add the uncaught exceptions handler likely before Bun 1.2 and fix the issue with sourcemaps. Sourcemaps do work (with some rough edges) at runtime when running TS/JSX/JS files, but are in a worse state with “bun build” right now.
We’ve been so busy with adding Windows support that it’s been hard to prioritize much else
What Prisma issues are you running into? We just installed Node alongside Bun in our Docker container and then ran Prisma with Node… was there something else?
Curious: does it run TS natively or does it just transpile for you? Because the former suggests exciting opportunity for better compiling or JITting if it can actually commit to holding on to the typing.
It does not do any type checking. You have to run tsc with noEmit separately. If you run `bun run foo.ts`, it just ignores all type annotations. It is transpiled to JS internally by removing the types (or it skips the types while parsing). While doing that, it keeps track of the original source locations. If you see some stack trace, you get the original location in the ts source.
Running tsc with noEmit is pretty much the standard in the frontend as well, as the TS is bundled by esbuild/rollup directly.
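To make that concrete (a sketch of the general idea, not Bun's exact transform): the annotations below exist only for `tsc --noEmit`; a type-stripping runtime executes the same logic with the types removed.

```typescript
// TypeScript source: the generic signature guides the type checker
// but has no effect at runtime.
function pick<T, K extends keyof T>(obj: T, keys: K[]): Pick<T, K> {
  const out = {} as Pick<T, K>;
  for (const k of keys) out[k] = obj[k];
  return out;
}

// What a type-stripping runtime effectively executes:
// function pick(obj, keys) {
//   const out = {};
//   for (const k of keys) out[k] = obj[k];
//   return out;
// }

const user = { id: 1, name: "ada", email: "ada@example.com" };
const partial = pick(user, ["id", "name"]); // { id: 1, name: "ada" }
```

This is also why a typo in a type annotation never changes program behavior: by the time the code runs, the annotation is gone.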
Does it support editing the source-files while in the debugger?
I've been hesitant to move to TypeScript because I'm unsure how well the debugger works in practice.
My current platform is Node.js + WebStorm IDE as a debugger. I can debug the JavaScript and I can modify it while stopping at a breakpoint or debugger-statement. It is a huge time-saver that I don't have to see something wrong with my code while in the debugger and then find the original source-file to modify and recompile and then restart.
Just curious, do Deno and Bun support edit-while-debug, out of the box? Or do I need to install some dependencies to make that work?
I'm not sure how difficult it would be for Node.js to support .ts files natively. But if that's the main reason to use Bun, I'd be worried about its long-term viability. Node could announce native .ts support at any time, and then Bun might not look so good.
As many people have commented, Bun is all the tools in a single dependency: a test runner with an in-memory DB included? Shell support for Windows? Single-file executable packaging? With macros? Code scratchpad that auto-installs dependencies? Programmatic APIs for transpiling/loading jsx (not tsx)? ...and so on.
The JVM bytecode has been designed to be bytecode from day 1.
JS is TS's bytecode, but it has been designed to be a language to develop in, which causes impedance mismatches as tools and people get confused about the usage context.
The most compelling argument for Deno is the permission system in my opinion. Node added a permission system recently, but it's much more coarse grained than Deno's. Being able to limit a script to listening on a specific hostname and port, or only allowing it to read a specific environment variable is pretty cool, and makes me less paranoid about trusting third party dependencies. Both Bun and Deno are also more performant than Node in many cases, and add a bunch of little quality of life improvements.
The real question is how much you can trust this. Those kinds of permission systems have been tried before - e.g. .NET used to have something called "Code Access Security". It was retired largely because the very notion of VM-enforced sandbox was deemed inadequate from experience. IIRC SecurityManager in Java was something similar, also deprecated for similar reasons. I'm afraid that Deno will just be a repeat of that.
I definitely wouldn't make the Deno sandbox my only line of defense — I'm a strong proponent of defense in depth. Now having said that, there's definitely a precedent for trusting V8's sandboxing capabilities. Cloudflare is running untrusted user code across their entire network and relying on V8 isolates as a sandboxing mechanism for Cloudflare Workers. I'm not sure I would go that far, but I do think we should be taking advantage of the strides browser developers have been making from a security perspective. When I re-watched Ryan Dahl's original conference talk where he introduced Deno, the sandboxing aspect was the part that resonated the most with me. But again, it's always best to have multiple layers of security. You should sandbox your applications and audit your dependencies, those mitigation techniques aren't mutually exclusive.
> The people who designed those things ultimately threw in the towel and said that if you want that kind of security, use containers or VMs.
I can see why they chose that route. It's a huge maintenance burden. I can't imagine Google throwing in the towel when it comes to securing their browser's JS engine though.
It's much easier to worry about locking down the few server-side modules which allow access to the underlying OS, than it is to have to worry about securing V8's JIT compiler. Node's module-based permission system literally just bans certain standard library modules from being imported (Deno's is more fine grained thankfully). That's a much smaller attack surface area to worry about compared to securing the underlying JS engine.
Also, with Deno it becomes very easy to write a typed CLI. A .ts file can be run as a script very easily, with permission access defined at the top of the script, such as:
#!/usr/bin/env -S deno run --allow-net
Then one can just run ./test.ts if the script has +x permission.
Also, projects such as https://cliffy.io have made writing CLIs way more enjoyable than with Node.
It is a good idea to be wary of the VCs. So it is a good idea to support projects such as Hono (projects that conform to modern web standards and are runtime-agnostic for JS).
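As a sketch, the shebang approach above scales up to a small typed CLI. Everything here is illustrative: the host in the allow-list and the `--name`/`--verbose` flags are made up, and a library like Cliffy would replace the hand-rolled parser.

```typescript
#!/usr/bin/env -S deno run --allow-net=api.example.com
// The `env -S` trick splits the line so deno receives its permission flags.

interface Options {
  name: string;
  verbose: boolean;
}

// Tiny hand-rolled flag parser with typed defaults.
function parseArgs(args: string[]): Options {
  const opts: Options = { name: "world", verbose: false };
  for (let i = 0; i < args.length; i++) {
    if (args[i] === "--name") opts.name = args[++i] ?? opts.name;
    else if (args[i] === "--verbose") opts.verbose = true;
  }
  return opts;
}

// Only run the CLI body under Deno, where Deno.args exists.
if (typeof (globalThis as any).Deno !== "undefined") {
  const opts = parseArgs((globalThis as any).Deno.args);
  if (opts.verbose) console.log("options:", opts);
  console.log(`hello, ${opts.name}`);
  // A fetch to api.example.com would be allowed by the shebang; anything else is denied.
}
```

After `chmod +x`, it runs as `./hello.ts --name bun --verbose`, with Deno enforcing exactly the permissions declared in the shebang.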
> Also, with Deno it becomes very easy to write a typed CLI. A .ts file can be run as a script very easily, with permission access defined at the top of the script
I do this all the time. I used to use `npx tsx` in my hashbang line to run TS scripts with Node, but I've started using Deno more because of the permissions. Another great package for shell scripting with Deno is Dax, which is like the Deno version of Bun shell: https://github.com/dsherret/dax
> Also, projects such as https://cliffy.io have made writing CLIs way more enjoyable than with Node.
This looks cool. I've always used the npm package Inquirer (which also works with Deno), but I'll have to compare Cliffy to that and see how it stacks up in comparison.
> Hono (projects that conform to modern web standards and are runtime-agnostic for JS)
Hono is awesome. It's fast, very well typed, runs on all JS runtimes, and has zero dependencies.
What do you think of WebAssembly modules? It looks to me like the shared memory support that came with the threading proposal (which seems somewhat widely supported) should allow libraries to be isolated in their own modules and exchange data through explicitly shared memory even if they run on the same thread.
With secure isolation being a requirement for web browsers, and with the backing of multiple big companies, it seems like there should be enough momentum to make it work properly. Or maybe the browsers rely entirely on process isolation between different origins and won't care about the security of isolating individual modules?
Wasm is better positioned for this in that, as a lower-level spec, the attack surface is inherently much smaller. But also in many ways it is essentially a VM, just for an architecture that has no dedicated hardware outside of VMs.
Wasm shared memory semantics are very similar to process isolation on OSes, so presumably the same techniques can be used there (and if those techniques are faulty, then so are containers and VMs).
Imho those permission systems are still too rudimentary and too non-automated. Instead of CLI flags, I would like to see permissions enforced at dependency boundaries, e.g.
import foo from 'foo' with {permissions: ['fs', 'net']}
Enforcing permissions at dependency boundaries would be the ultimate goal, but trying to separate first-party code from third-party code within the same thread is a herculean task (if I pass a callback to a dependency, which permissions does it run with for example), and you can't really lean on JS engines to do the heavy lifting, because they weren't designed with that threat model in mind.
The best you can do currently is run your dependencies in a Worker, and enforce permissions programmatically for the worker [1]:
This isn't perfect by any means, and you shouldn't rely on it like a silver bullet, but if given the choice I'd rather have permissions in my security toolbox.
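To make the Worker approach concrete, here is a sketch using Deno's worker-level permission option (Deno-specific; the module name, host allow-list, and `denyAllExcept` helper are all hypothetical):

```typescript
// A default-deny permission set: everything off unless explicitly allowed.
type WorkerPermissions = {
  net: string[] | boolean;
  read: boolean;
  write: boolean;
  env: boolean;
  run: boolean;
};

function denyAllExcept(overrides: Partial<WorkerPermissions>): WorkerPermissions {
  return { net: false, read: false, write: false, env: false, run: false, ...overrides };
}

// Allow network access to a single (made-up) host, nothing else.
const permissions = denyAllExcept({ net: ["api.example.com"] });

if (typeof (globalThis as any).Deno !== "undefined") {
  // Deno-only: run the dependency in its own sandboxed worker.
  const depUrl = new URL("untrusted-dep.ts", "file:///project/").href;
  new Worker(depUrl, { type: "module", deno: { permissions } } as any);
}
```

The callback problem from above doesn't go away, but at least the dependency's own code can't touch the filesystem or arbitrary hosts.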
The speed increases are nothing to sneeze at; I've moved a few Vite projects over to Bun and even without specific optimizations it's still noticeably faster.
A specific use case where Bun beat the pants out of Node for me was making a standalone executable. Node has a very VERY in-development API for this that requires a lot of work and doesn't support much, and all the other options (pkg, NEXE, ncc, nodejs-static) are out-of-date, unmaintained, support a single OS, etc.
`bun build --compile` worked out-of-the-box for me, with the caveat of not supporting native node libraries at the time—this 1.1 release fixes that issue.
Bun's standalone executables are great, but as far as I'm aware unlike Deno and Node there's no cross compilation support, and Node supports more CPU/OS combinations than either Deno or Bun. Node supports less common platforms like Windows ARM for example (which will become more important once the new Snapdragon X Elite laptops start rearing their heads [1]).
We'll add cross-compilation support and Windows arm64 eventually. I don't expect much difficulty from Windows ARM once we figure out how to get JSC to compile on that platform. We support Linux ARM64 and macOS arm64.
It also helps avoid a node/v8 monoculture, just like with web browsers. I'm sure the ecosystem as a whole will get better because of it, even if you decide not to use it.
Banned? Is that why you had to post this on a green text account? Because that sounds immature. If you really have so many repos it sounds annoying that there isn't room for team level experimentation.
Devil's advocate: Deno and Bun are not yet fully backwards compatible with Node. I myself have run into a _ton_ of pain trying to introduce Bun for my team.
This can become a big time sink on bigger teams. That time could be saved by just not allowing it until a full team initiative is agreed on.
It's not immature, it's pragmatic. You do have to weigh the benefits of being able to use non-standard tools against the cost of not being able to reuse the same tooling, linters, compilers, and what-not for all projects.
When you have a lot of projects to support, it's rare for the benefits to outweigh the costs.
> If you really have so many repos it sounds annoying that there isn't room for team level experimentation.
For what it's worth, I'll say that I can understand such top down governance: you'd have an easier time around moving across projects that you work on within the org, there'd be less risk of a low bus factor, BOM and documentation/onboarding might become easier.
Same as how there are Java or .NET shops out there, that might also focus on a particular runtime (e.g. JDK vendor) or tooling (an IDE, or a particular CI solution, even if it's Jenkins).
On the other hand, if the whole org would use MySQL but you'd have a use case for which PostgreSQL might be a better fit, or vice versa, it'd kind of suck to be in that particular situation.
It's probably the same story for Node/Deno/Bun, React/Vue/Angular or anything else out there.
No reason why that mandated option couldn't eventually be Bun, though, for better or worse.
I can give a bit of perspective here.
I'm currently porting the Vanilla Forums frontend (~500k lines of TypeScript) from Node, Yarn (we adopted it back before npm supported lockfiles), and Webpack, to building with Bun and Vite.
There are a few notable differences:
- The out-of-the-box TypeScript interoperability is actually very nice, and much faster than using `ts-node` as we were before.
- Installations (although rare) are a fair bit faster.
- With bun I don't have to do the frankly crazy song and dance that node now requires for ES modules.
- Using bun is allowing us to drop `jest` and related packages as a dependency entirely and it executes our test suite a lot faster than jest did.
For my personal projects I now reach for bun rather than node because
- It has Typescript support out of the box.
- It has a nice test runner out of the box.
- It has much better runtime compatibility with browsers (`fetch` is a good example).
- The built-in web server is sufficient for small projects and avoids the need to pull in various dependencies.
My old-dog experience has proven multiple times that staying with the main reference tool for the platform always pays off long term, as most forks or guest languages eventually fade out after the hype cycle is over.
The existing tools eventually get the features that actually matter, and I avoid rewriting stuff twice; in the meantime I gladly help some of those projects backport into the reference tooling for the platform.
The only place I really haven't followed this approach was in regards to C++ in UNIX, which at first sight might feel I am contradicting myself, however many tend to forget C++ was born at Bell Labs, on the same building as UNIX folks were on, and CFront tooling was symbiotic with UNIX.
ESM interop is inarguable. But these days Node has a test runner and compatibility with browsers (it implements fetch)… I guess I feel like Node is likely to catch up with most of this stuff over the lifetime of any long running project.
One of the things that makes me more bullish on Bun rather than Deno is that Bun is intentionally aiming for compatibility with Node and the npm ecosystem, while Deno doesn't seem to be.
I really like Deno for small scripts and small side projects - it's just fast to get started with. And it allows me to use web standards, like URL imports to grab packages from CDNs instead of having a config file. There's just less to think about, like oh what was Node's crypto thing? Node is making strides in web compatibility, and building in things like a test runner. And I don't have much interest in migrating company projects away from Node. But Deno feels really fresh and light when I just need to run some JS.
AFAIK, Pnpm monorepos do not follow standard npm. Bun does follow standard npm monorepos.
Pnpm's feature to override dependency versions is nice for legacy projects with many 3rd party dependencies. Not sure if Bun has the same feature. I mostly use it on greenfield projects with dependencies that I control.
Node is a safe choice, IMO. I tried Deno and I think it's cool, but I'm staying on Node for the time being. The things Deno makes easier are not that hard with Node, and stability matters to me. For example, I had to spend a few hours rewriting my tiny service after a Deno API change. I don't have any experience with Bun, though.
If there are any specific places where Deno didn't support a node API, please file a bug -- we definitely shoot for as close to node.js as possible, and if you have to modify your code that's most often on us.
No, I wrote my service some time ago (basically a GitHub webhook which does some crypto to validate the payload and invokes kubectl) using the Deno API, and a few months ago I took some time to update dependencies but found out that with the new standard library version some APIs were either deprecated or removed, I don't really remember, but I felt the need to rewrite it.
Don't take it as a criticism, I totally understand that you need to iterate on API and nobody promised me that it'd be stable till the end of time, but still work is work.
https://github.com/denoland/deno/issues is the ideal place -- we try to triage all incoming issues, the more specific the repro the easier it is to address but we will take a look at everything that comes in.
I asked myself the same question a couple of weeks ago and decided to use Node for some side stuff, simply because Node is the most mature, boring choice. Still, I like the DX improvements of both Bun and Deno a lot. We'll see how it all plays out in a few years.
And UX is pretty great: integrated fetch, simplified fs api, integrated test runner (I miss good old TAP style assertions though), ESM/CJS modules just work, some async sugar.
I think if they offer me a paid *worker solution, with sqlite, that's something I'm willing to pay for.
I don't know how you're using Node and not thinking "I wish there was a better option than this". I can't wait to jump ship but Bun/Deno aren't quite there yet, for my needs.
I haven’t used them yet for full sized apps, but they are both fantastic for scripting and small CLIs. Between the ease of running scripts, nice standard libraries, npm ecosystem, and excellent type system, I now feel TypeScript is a better scripting language than Python or Ruby.
I used to think so too, but that was because I had never really used Python. I still think Ruby is a mess, but it's so amazing how easy it is to manipulate data in Python, and so much faster.
I recently wrote a Node/Bun/Deno app that parses a 5k line text file into JSON.
The JavaScript on any runtime takes 30-45 seconds.
The Python implementation is sub 1 second.
I would not have been able to finish the tool so quickly if I were stuck relying on JS.
I still love Typescript but I'm not as blind about it now.
That runtime doesn't make any sense. A script that creates a 1,000,000-line CSV string and then parses it into JSON takes 700ms with Bun, and that's doing both things the slow way: creating the string with a giant array map and join, and parsing with a giant split('\n') on newlines and a map.
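Roughly the kind of script being described, as a sketch (the column names are invented); even at a million lines, each step finishes far faster than the 30-45 seconds reported:

```typescript
// Build a 1,000,000-line CSV string the "slow" way: array map + join.
const rowCount = 1_000_000;
const header = "id,name,value";
const csv =
  header +
  "\n" +
  Array.from({ length: rowCount }, (_, i) => `${i},item${i},${i * 2}`).join("\n");

// Parse it the "slow" way: one giant split, then map each line to an object.
const [head, ...lines] = csv.split("\n");
const cols = head.split(",");
const records = lines.map((line) => {
  const vals = line.split(",");
  return Object.fromEntries(cols.map((c, i) => [c, vals[i]] as [string, string]));
});

console.log(records.length); // 1000000
console.log(records[1]); // { id: "1", name: "item1", value: "2" }
```

If a 5k-line file takes 30+ seconds, the bottleneck is almost certainly in the parsing logic (e.g. accidental O(n²) string concatenation), not the runtime.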
It's a valid question, but does it matter for anyone except the dev team? Bun is open source, so VC-backing is mostly a helpful jumpstart. If they find a viable business model – great, development can be funded in perpetuity. If they don't, development was funded for a while by someone else's money and then Bun is just like any other open source project that lacks direct funding (most of them).
I think it does matter. Open-source software can still suffer from "enshittification" when there is constant need to generate profit. Fortunately it is open source, so it can be forked when things get bad, but even then there still may be lots of tech debt to undo.
Right...but if you're going to fork it and create fragmentation, then we might as well go back to Node which has at least been stable for the past few years.
Came for the ts interoperability, stayed for the performance.
Also, it seems like the most sensible project in the space - I tried Deno and it was... rough. Bun on the other hand was easy to integrate and a very pleasant experience.
I started using Bun by default for small personal projects. Having to set up Node with TypeScript and reloads always took the fun out of quickly prototyping something.
Is it me, or does this project try to do too many things at once? "Bun is an npm-compatible package manager", and an HTTP server, and a WebSocket server, and a test library, and a bundler, and... why?
I think it's that modern languages like Go and Rust ship with all the tools you need: formatter, linter, test runner, etc. (there are surely others, but I don't have experience with any other language that ships with all the tools). Go goes even further than Rust and ships a very complete stdlib focused on the web, and JavaScript is used primarily on the web, so it makes sense to ship with all the libraries needed to build a web server. Also, WebSocket is a standard, so it's easy to implement and make work with the browser.
Nowadays if you start a new JavaScript project you need to set up vite/esbuild/webpack, eslint/oxlint/biome, prettier, typescript, etc. That is a ton of dependencies that YOU need to maintain for years. If they're part of the tool you are using, then you don't; ideally there shouldn't be breaking changes. Let's see how bun manages that when the time comes.
I am waiting for bun, or any tool, that has everything I need to bundle my frontend app. I'm very tired of fiddling with all the dependencies and trying to make every dependency work with every other one. I have a legacy project that I work on; I would have migrated to another tool a long time ago, but none of them would fix the actual problem, which is managing the project's build, test, formatter, and lint dependencies. After using Rust I feel super frustrated with the state of the JavaScript ecosystem.
Also, you asked why? Because I want to work on the project itself; keeping dependencies up to date is already a lot of work, and the tooling should not be part of that work.
> Nowadays if you start a new Javascript project you need to setup, vite/esbuild/webpack, eslint/oxlint/biome, prettier, typescript, etc, that is a ton of dependencies that YOU need to maintain for years
That's a choice YOU make, not everyone makes that choice, especially because they want to be able to continue working on a project for months/years without accruing automatic technical debt as all those projects move forward without actually thinking about backwards compatibility.
Right, but that’s why languages like Rust have an excellent DX. There is no complex choice about which linter, formatter, test runner, doc builder, or package manager to use. These are all such common requirements for building software at scale with lots of contributors that the language tooling just includes them. It’s not hard to maintain, because the language toolchain is so foundational that it needs to be stable.
JS doesn’t have anything like that, which is why projects like Deno, Bun, Biome, etc are interesting. These projects explore how JS can also get a great out of the box experience without requiring the complex setup and maintenance steps that so many existing tools require.
Besides, professionally, you normally don’t make the choice in a vacuum. Linters, package managers, testing, bundling/building, type safety, and even formatting are all very useful in big projects with lots of people. So you often don’t get to say “ah we just won’t have unit tests because jest doesn’t care enough about backwards compatibility.”
No one likes to set up or update things. But when everything is coupled (why not even include a framework in Bun?), you are even more dependent on choices made by them and will be forced to upgrade no matter what. For example, if they decide that their implementation of the test component wasn't good enough in version X and they completely reimplement it in version X+1, you will have to upgrade your code, but maybe at that point you don't want to rewrite all your test suites, you just want to get the new http3 request handler, but you still must rewrite all your test suites...
When things are not that coupled, independent components you bring together, you can update the webserver, and/or some components, and keep the old version of the test library for now, until you decide it's time to upgrade.
Language-level tooling tends to be more stable than the myriad of random npm packages out there. Hopefully Bun remains pretty stable.
I'm currently fighting PHP + Laravel. I want to upgrade PHP but I can't because the version of Laravel I'm using depends on an older version of PHP. So I have to upgrade them both in lockstep anyway.
Because that’s what people want. That’s how you can get a really good developer experience similar to golang or other languages. Just install one tool to build, lint, format, run tests, run your local project. No time spent trying to setup a bundler when what you want is to build a new project.
Regarding runtime libraries, it’s similar to the battery-included approach of go or python, you get what you need to get started out of the box and only reach for dependencies when you want to go further. Testing library, an http server, a websocket server that’s perfectly reasonable to have as core library of a runtime developed to run web servers.
Looks like it; it seems the 2% are mostly odd platform-specific issues that the authors did not deem very important (my assumption for why the release happened anyway). AFAIK this[1] PR tries to fix them.
That's not what's happening. Bun has tests that are supposed to work on the platform but currently don't.
Skipping tests from a Linux suite that don't make sense to run on Windows is very common. Skipping tests that should pass on a platform but don't, just in order to cut a release, isn't as common.
Every growing project has things the maintainers would like to work but that currently do not. The job of the maintainer is to balance the value of releasing a cut of the software as it stands (and learning about other unknown issues while continuing to work on the known ones) against holding the release until the known issues are finished. Neither way is strictly good or bad.
What does that even mean? Bun is run by a fictional French character capable of interplanetary travel? Or are you calling them a spoiled brat with money? Why does that affect code organisation? Or is it code quality?
If you’re going to criticise the project and the people who run it, some clarity of communication and specifics on the issues would be appreciated so others can evaluate your claims. Otherwise it’s just empty insults which do not advance the discussion.
> The goal of Bun is to run most of the world's server-side JavaScript and provide tools to improve performance, reduce complexity, and multiply developer productivity.
Bun is still pretty young and experimental, and not really production ready, though it’s getting there fast. If it grows enough to force node to improve, or if it takes over node, that would be a success based on their own goal
There have been some polls on social media: the overall picture was 80-90% using Node, then Bun, then Deno. I'd bet in the real world it's 99% Node for production. If in 3 years 5% were using Bun, that would be a great success (Node usage is huge). I think they're on track, but I would not recommend Bun for production backends as of now.
Hooray for Windows support! That was keeping me from using Bun since I'm on Windows a fair bit. My experience with Bun has been excellent so far and I'm looking forward to using it more.
I just tried using Bun to run one of our more complex projects. I did the same with Deno a week ago and too many things weren't working. With Bun everything loaded perfectly, and I could immediately drop ts-node and nodemon because they're essentially redundant when using Bun. Great stuff!
Is Bun executing TS or is it also compiling down to JS and executing that?
Edit:
The docs mention:
> Because Bun can directly execute TypeScript, you may not need to transpile your TypeScript to run in production. Bun internally transpiles every file it executes (both .js and .ts), so the additional overhead of directly executing your .ts/.tsx source files is negligible.
Transpiling TS is a really easy task, because the TS developers made a huge effort to make it possible. You basically remove all the types and that's about it. It could probably even be done with a simple character-streaming algorithm.
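To illustrate what "removing types" means - a toy example only; this regex is NOT a real transpiler (it breaks on generics, object types, strings containing colons, etc.), real tools like tsc, esbuild, or Bun use a proper parser:

```javascript
// Toy illustration: stripping the type annotations from one specific line
// of TypeScript yields valid JavaScript with identical runtime behavior.
const tsSource = "const add = (a: number, b: number): number => a + b;";
const jsSource = tsSource.replace(/:\s*number/g, "");
console.log(jsSource); // const add = (a, b) => a + b;
```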
At this point, IMO, it should just be implemented within V8. Would make things much simpler for everyone.
It is more compatible than not, and there are workarounds for most of the things that wouldn't work - many of which you shouldn't be using in Typescript today anyway, have known workarounds (enums), and/or you probably already have lint rules warning against them (namespaces). (There are more details elsewhere in that proposal document, outside of the directly linked FAQ question.)
Great point. It is 100% possible to survive without enums and namespaces.
And in fact: I bet the Typescript team itself would deprecate them (or at least add extra checks to TS to avoid them) if the TC39 proposal above passed.
Yeah, if this type annotations proposal gets to higher stages I expect the associated ES target to get far more strict warnings/errors on those features.
Though I'd suggest that the Typescript team also doesn't seem to be waiting on that to try to deprecate them, either. They've made it somewhat clear that the only reason enums and namespaces survive is a commitment to deep backward compatibility (both go all the way back to 1.0); neither is a feature they would add today (without waiting on TC39 proposals for those features in JS first). As an interesting side note: up until very recent versions, the Typescript code base itself was one of the biggest users of both enums and namespaces. (They managed to finally do a lot of namespace removal in the very recent ESM rewrite. I haven't checked where they are at in enum removal.) It is always fascinating to me how important backward compatibility can be when you bootstrap your compiler in its own language.
Why would doing work at run time that can be done at build time be a good idea? I have a CI, I'd rather have it so the build work rather than delegating that to my production servers.
(Note that you still have to run tsc on everything in CI anyway to check the types. So when you ship TS files to production, your CI does the hard work but then doesn't finish the easy part, so your prod server has to do it? Why?)
You can also ask why every node server traverses the file system to load dependencies at runtime instead of build time. Packaging your server with bun build can be the answer to both.
"TypeScript & JSX support. You can directly execute .jsx, .ts, and .tsx files; Bun's transpiler converts these to vanilla JavaScript before execution."
It’s not running TS directly, it’s just preconfigured to transpile TS to JS without the user having to bring extra tooling. Neat, but you’ll see the docs still recommend tsc for type checking at build.
I wonder what's the benefit of TS if there's no type-checking? If types are not checked that means the TS type-declarations could be totally wrong and nobody would know. In other words they could be misleading.
Why incur the type-declaration overhead if they are not used after all?
This is how typescript is run today. Typescript types never exist at runtime regardless of how typescript is run. There is no overhead defining types because they are deleted at runtime. The purpose of typescript is to make the editor experience better (autocomplete, error highlight). Typically typechecking is run in addition to tests to make sure there aren't a bunch of errors no one saw in editor.
It could be, but even today without Bun, a common approach is to do type checking in a separate step from the build. This is because tsc doesn’t parallelize well, so type checking will slow down the build a lot. So you can put the type check step in a separate CI job, and have it fail like unit tests would. Then the main build can be a lot faster since it just has to strip the annotations.
Plus, for local dev, iteration and watch/rebuild is more important than failing with invalid types on every change. Sometimes it’s helpful to circle back to fix/update types after you’ve tried a couple approaches. (TS can still be finicky at times!) On top of that, your IDE should report type errors as you work anyways.
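For example, a common split looks something like this (hypothetical package.json scripts; the script names and esbuild arguments are illustrative, but `tsc --noEmit` is the standard way to type-check without producing output files):

```json
{
  "scripts": {
    "build": "esbuild src/index.ts --bundle --outfile=dist/index.js",
    "typecheck": "tsc --noEmit",
    "test": "jest"
  }
}
```

CI can then run `build`, `typecheck`, and `test` as independent jobs, so a slow type check never blocks the artifact build.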
I would still prefer, though, that Bun did it for me, perhaps in a separate process, so I wouldn't need to configure a separate CI job or manually run the tsc command. I read that Bun has its own test runner too, so why not its own type-checker as well?
On Node.js I just edit the source-code then re-start the debugger on it, and edit it while in the debugger then rinse and repeat.
I use runtime assertions to catch errors in argument-types etc. as needed.
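A minimal sketch of that runtime-assertion style (the helper and function names here are hypothetical, just to show the pattern):

```javascript
// Hand-rolled runtime type check: unlike TS annotations, this actually
// runs in production and fails loudly on bad input.
function assertNumber(value, name) {
  if (typeof value !== "number" || Number.isNaN(value)) {
    throw new TypeError(`${name} must be a number, got ${typeof value}`);
  }
  return value;
}

function scale(x, factor) {
  assertNumber(x, "x");
  assertNumber(factor, "factor");
  return x * factor;
}

console.log(scale(3, 2)); // 6
```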
The only time you run the type-checker is in CI. The majority of the time you only need the code to run, and your editor/IDE should already have its own bundled type-checker. Unless bun ships its own type-checker, which means playing catch-up with tsc (if that's even possible; TypeScript's type system is very complex), I don't see a lot of benefit in Bun merely calling tsc for me.
If it runs TS "the same way" it runs JS then how is that different to what I described? It would be a JS engine that strips the types before execution.
There is a persistent fantastical hope that TS can somehow be a compiled language. It can't, not without breaking compatibility with JS. Until, of course, someone manages to compile JS - but at that point TS would be irrelevant.
I say this as someone who loves TS and wouldn't want to be without it: TS is a fancy linter. JS defines the execution semantics of the language.
We've been using bun for a while now. We love the speed, but we love the integration even more. No need to use node, npm, nodemon, tsx, esbuild and jest.
Not sure I see the benefit of the bun shell. I use shell scripts when I know that the other people using the script will be able to run it in a similar shell to me, in order to cut down on dependencies. If I need it to be cross platform I just use a scripting language like JS.
Bun shell keeps the more esoteric syntax of Unix-like shells but also requires a dependency (Bun itself). If you already have Bun installed why wouldn't you just write a JS script?
I mandate WSL when working with windows developers, and GNU coreutils for mac developers so that I can assume some things about their dev environments. This solves that problem for shell scripts, there are still other ways where it's useful but a scripting environment is probably the biggest one.
The velocity of writing filesystem and file-manipulation scripts in shell is many times greater than in javascript. Bun Shell lets you leverage the power of shell for that stuff while also being able to leverage your javascript code, with fairly minor downsides in exchange.
> The velocity of writing filesystem and file-manipulation scripts in shell is many times greater than in javascript.
I think you're right, after achieving a level of expertise. I've never gotten there myself which probably explains my bias, but I appreciate hearing your perspective.
I can definitely see the value of not wanting to write the boilerplate kind of stuff you need for more shell-like scripting. I can also see wanting to emulate the traditional syntax, while abstracting away whether you are on a Unix system, and still allowing traditional JavaScript syntax in between the shell-y parts.
On the dependency side it'd be slick if you could bun --compile these like normal bun apps.
It's really nice when I need to do shell stuff - much nicer to be able to use js to write a shell script than to either go look up shell syntax again or use js with child_process.spawn().
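The general mechanism behind this kind of injection-safe API is a tagged template that quotes interpolated values. Here's a plain-JS sketch of the idea (NOT Bun's actual implementation; the `sh` helper is made up for illustration):

```javascript
// Sketch of tagged-template shell quoting: the literal parts of the template
// pass through untouched, while every interpolated value gets single-quoted,
// with embedded single quotes escaped POSIX-style ('\'' ).
function sh(strings, ...values) {
  return strings.reduce((out, str, i) =>
    out + `'${String(values[i - 1]).replace(/'/g, "'\\''")}'` + str);
}

console.log(sh`echo ${"hello; rm -rf /"}`);
// echo 'hello; rm -rf /'   <- the injection attempt becomes a literal argument
```

Because quoting is applied at the template boundary, there is no way to accidentally splice untrusted input into the command structure.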
So has Deno, but Bun's feels more evolved, as it comes bundled with tools and sufficient examples for working with pointers. The only thing Bun is missing right now is the Deno equivalent of non-blocking FFI calls, i.e. `await mylib.myFunc()`.
Another one on my wishlist for Bun: embedded Bun. A library distribution would be nice, so that we could call into Bun as, e.g., "libbun.so" from other languages. It would be more useful than just embedding WebKit/SpiderMonkey/v8, as these lack any real capabilities besides running vanilla JS.
One thing I miss in Node.js is the ability to run an HTTPS server in a simple way, without having to muddle with generating and installing the correct type of certificate. I understand there can be a "self-signed certificate", but there doesn't seem to be any npm module I could install to take care of that.
Since Bun is a "server-side" JavaScript platform it would be great if it could support https out of the box too.
More precisely, it runs its own local CA to issue certs for itself. Not exactly the same as self-signed certs (which is a cert whose key was used to sign itself), but better because leaf certs are short-lived and easily cycled. This allows for setting up trust easily, just need to trust the one root CA cert and every leaf cert for any domain served will be trusted.
Faster this, faster that. Is it finally segfault free?
I've tried it maybe 3 times in the span of the last year with different projects, only to find out it segfaults at runtime or when installing a package.
I only use bun for tests/builds/storybook, but I haven't had it segfault at all. I suspect that you've got a dependency that is hitting an undocumented node API that isn't fully implemented. They talk about those in the blog post, they're a known thing.
I admire your video production values :) Nice job there. One question: what gear do you use? The audio was really good for what was probably a shotgun mic?
Can you tell I don't use bun yet? Because none of my questions are about the release. But I tried it out a few weeks ago on my NextJS project and it didn't work, but will try bun 1.1.
Even before this past week's XZ backdoor revelation, checking binaries into source control rather than building them from source seemed quite questionable. In fairness to the Bun developers, they have a comment in their build.zig file acknowledging that this shim should be built normally rather than being checked in.
For no discernible reason, it uses a bunch of undocumented Windows APIs. The source cites this Zig issue as one reason why they think it is OK to use undocumented APIs:
I don't see any good reasons cited here for using undocumented, unstable interfaces. For Zig's part, there seems to be some poorly explained interest in linking against "lower level" libraries without any motivating use case (just some hand-waving about security and drivers, neither of which makes much sense: Onecore.lib exists if you want a documented way of linking an executable that runs on a diverse set of Windows form factors, and compiling drivers may as well be treated as a separate target, since the function names are different). For Bun, I assume they are trying for a low binary size. But targeting NTDLL vs. Kernel32 should not make a big difference, especially when the shim is just doing basic file IO. As an example of making a small executable with the standard API, you can get hello world down to 4kb using MSVC just by using /NODEFAULTLIB and /ENTRY:main with link.exe and this program:
So it should be possible to make a .bunx shim of small size without resorting to undocumented APIs (the current exe is 12kb). But even if the shim exe were 100kb, that would still be a more acceptable tradeoff for me than having to debug any problem that results from using non-standard APIs.
The motivation behind zig#1840 is that while the functions in ntdll aren't as well documented as the kernel32 functions, they're not unstable. Not having our binaries depend on kernel32.dll would lead to faster startup times and would also allow us to do things like use more performant algorithms for UTF-8 <-> UTF-16 conversion, on top of the things mentioned in the issue, like having APIs with more powerful features.
For Bun's shim, it links against kernel32 anyway. And there is nothing special about its use of NtCreateFile, NtReadFile, NtWriteFile, and NtClose that would preclude it from using the standard functions.
I'm not sure it's possible to not have kernel32 loaded into your process anyways. Even if you create an EXE that imports 0 DLLs, kernel32 gets loaded into the process by NTDLL. The callstack from main:
There are valid reasons to use APIs from NTDLL. Where I disagree with zig#1840 is the idea that it is always better to use the NTDLL versions of APIs. Every other software ecosystem uses the standard Win32 APIs, and diverging from that without a good reason seems like a good way to get unexpected behavior. One concrete example: most users and programmers expect Windows to redirect some file system paths when running on WOW64, but this is implemented in Kernel32, not ntdll.
On one hand I like the work Bun is doing, on the other hand the Bun module is starting to look like a messy "utils" module, aka random junk drawer. All of them useful of course, just... incoherent.
So they're able to optimize Node.js and be $x times faster, but not able to calculate the percentage of speedup correctly?
1.3s vs 2s is not a 58% speedup - it's 35%...
Such an approach makes those projects look like unicorns, not production stuff - even though they are.
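For concreteness, the two ways of expressing a 1.3s-vs-2s comparison (the headline figure may also have been computed from unrounded timings, hence the discrepancy):

```javascript
const before = 2.0;  // seconds, old runtime
const after = 1.3;   // seconds, new runtime
const timeSaved = (before - after) / before;  // fraction of wall time eliminated
const speedup = before / after - 1;           // extra throughput per unit time
console.log(`${Math.round(timeSaved * 100)}% less time`); // 35% less time
console.log(`${Math.round(speedup * 100)}% faster`);      // 54% faster
```

Calling a 35% time reduction a "54% speedup" is defensible; calling it a "58% speedup" is not, unless the raw timings were rounded.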
https://github.com/oven-sh/bun/issues/7990 (Via https://github.com/endoflife-date/endoflife.date/pull/4382)