Bun Retrospective 2026 — 1.0 GA Promises vs Reality: Runtime, Bundler, Package Manager, Test Runner, Shell (An Honest Take from the Bun 2 Era)
Author: Youngju Kim (@fjvbn20031)
Prologue — The 2023 promise, the 2026 invoice
I remember the air around September 2023, when Bun 1.0 shipped GA. The demos were striking. bun install was 25x faster than npm install, bun test was 13x faster than Jest, bun run started 4x faster than Node. Those charts were on every other Twitter timeline. Half of the room said "Node is over." The other half said "this is just another Yarn 1.0 moment."
It is May 2026. Time to read the invoice.
Bun kept about half of its promises. But it was not the half we expected.
bun install really is fast — promise kept. bun test is fast. Bun.serve really is lightweight and quick. But the scenario of "Bun replaces Node in production" is rare. The honest picture as of mid-2026 looks like this:
- Bun-as-a-toolkit is widely adopted — bun install, bun test, Bun shell. CI times cut in half.
- Bun-as-a-runtime remains niche — small API gateways with Bun.serve, edge workers, internal tools. Stories of a large monolith migrating to Bun are scarce.
- Node 22+ played a surprisingly good defense — node --experimental-strip-types, node:test, node --watch, native fetch. Node returned Bun's serve from its own side of the court.
This post is an honest retrospective of the two and a half years after 1.0 GA. What over-delivered, what was a broken promise, and where Bun is the right call versus where Node still is.
1. The Bun runtime — Zig + JavaScriptCore, the original bet
The promise
Bun made three bets from day one.
- Rewrite the runtime in Zig — unlike Node which sits on V8 plus Node's own C++ glue, Bun is written in Zig to keep startup time and memory footprint small.
- JavaScriptCore (JSC), not V8 — the engine Safari uses. The general view is that JSC is lighter on startup and memory than V8.
- Node API compatibility — fs, http, path, crypto, and friends are supported as-is. The promise being "drop a Node project into Bun and it runs."
The combination was meant to be a drop-in replacement that simply ran Node code, faster.
Reality — runtime speed
Benchmarks consistently show Bun beating Node. But where it beats Node matters.
Cold start ("console.log"):
- Node 22: 35~60ms
- Bun 1.2: 10~20ms
- Deno 2: 25~40ms

HTTP throughput (simple echo, single core):
- Node 22 (http): 80k req/s
- Node 22 (uWebSockets): 280k req/s
- Bun.serve: 250~310k req/s
- Hono on Bun: 220~280k req/s
- Deno serve: 200~240k req/s

SQLite query (1M rows, simple select):
- better-sqlite3 (Node): 4.2s
- bun:sqlite: 1.6s
Fast, yes. But look at the shape of the gap.
- Cold start, 2~3x faster: meaningful in AWS Lambda or other cold-start-sensitive workloads. Irrelevant in a long-running server.
- HTTP throughput: when Node is not on its default http module — using uWebSockets or Fastify — the gap narrows substantially. A fair comparison is "optimized vs optimized," not "default vs default."
- SQLite: a genuine win. bun:sqlite is a direct native binding into SQLite, so the per-call overhead drops out.
The point: Bun is fast not because "JS executes faster," but because "I/O and native interfaces are lighter." For CPU-bound JS code, the gap against Node is often small.
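That claim is easy to probe yourself. A minimal, hypothetical CPU-bound micro-benchmark (pure recursion, no I/O) that runs unchanged under bun run and node:

```typescript
// cpu-probe.ts — hypothetical CPU-bound micro-benchmark (no I/O).
// On code like this the JS engine dominates, so the Bun-vs-Node gap
// tends to be small compared to the I/O benchmarks above.
function fib(n: number): number {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

const start = performance.now();
const result = fib(30); // 832040
const elapsed = performance.now() - start;
console.log(`fib(30) = ${result} in ${elapsed.toFixed(1)}ms`);
```

Run it under both runtimes; the ratio matters more than the absolute numbers.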
Reality — Node compatibility
Bun advertised "90%+ Node compatibility" at 1.0. Mid-2026 reality, by area:
| Area | Compat status | Usable? |
|---|---|---|
| fs, path, os, util | Near complete | Yes |
| http, https, url | Near complete | Yes |
| crypto | Mostly OK, some legacy API gaps | Yes |
| child_process | Mostly OK | Yes |
| worker_threads | OK | Yes |
| cluster | Partial, edge cases differ | Caution |
| Native addons (N-API) | Broad support, a handful missing | Yes (mostly) |
| vm module | A few options missing | Caution |
| inspector (debugger protocol) | Partial | Caution |
| async_hooks | Partial — affects APM tools | Caution |
| process.binding internals | Not supported (intentional) | No |
Small services rarely hit a wall. But APM (Application Performance Monitoring) agents, OpenTelemetry auto-instrumentation, and legacy native addons still break. New Relic, Datadog and similar landed official Bun support only by late 2025, and some instrumentation remains partial.
"Node code drops in and runs" is true in small codebases and half-true in big ones. What breaks is rarely business code — it is the surrounding infrastructure (tracing, profiler, debugger).
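When one codebase has to run on both runtimes, a guard around Bun-specific paths helps. A small sketch, using the real process.versions.bun key, which Bun sets and Node does not:

```typescript
// runtime-detect.ts — branch around tooling known to misbehave on Bun
// (e.g. an APM agent whose instrumentation there is only partial).
const isBun: boolean =
  typeof process !== "undefined" && typeof process.versions?.bun === "string";

if (isBun) {
  console.log(`running on Bun ${process.versions.bun}; skipping the APM agent`);
} else {
  console.log(`running on Node ${process.version}; loading full instrumentation`);
}
```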
2. bun install — the clearest win
Of all the areas, the least debatable win is bun install.
Benchmark
Hot cache, mid-size repo (~800 deps):
- npm install: 18s
- yarn install: 9s
- pnpm install: 5s
- bun install: 1.6s

Cold cache, same repo:
- npm install: 65s
- yarn install: 35s
- pnpm install: 18s
- bun install: 7s
bun install brought Cargo-like, Go-modules-like speed into the JS world. This is a real DX difference — bun install finishes in five to ten seconds on CI every time, and locally a lockfile change does not test your patience.
A short history of the lockfile war
When 1.0 first shipped, bun.lockb was a binary lockfile. The goal was fast parsing, but PR reviewers could not see a diff. Through 2024 and 2025 the community pushed back hard, and Bun eventually introduced bun.lock (a text, JSON-superset format). As of 2026, new projects default to the text lockfile. Some older projects still keep bun.lockb.
Lesson: A DX bet that optimizes one side (speed) at the cost of another (diffability, code review) will be sent back by the community. Bun acknowledged this and corrected quickly. That deserves credit.
Where it is actually used
bun install can be adopted without switching the runtime. CI pipelines using bun install followed by npm run build or node server.js are common.
- Vercel and Netlify offer bun install as a build option.
- Next.js, SvelteKit, Astro and other major frameworks build cleanly when deps are installed with Bun.
- Docker images combining the oven/bun base for builds with a Node runtime cut build time roughly in half.
bun install is the safest first step into Bun adoption — and by far the most common one.
3. bun test — fast, but adoption is mixed
The promise
bun test claims Jest-compatible (or close-to-Jest) API, runs much faster than Jest, and executes TypeScript directly without configuring a transform.
Reality
Benchmarks show bun test running 3~10x faster than Jest. Against Vitest, the gap is 1~3x in either direction depending on the suite. Adoption is mixed.
| Tool | Mid-2026 adoption (impression) | Strength |
|---|---|---|
| Jest | Still #1 (legacy weight) | Ecosystem, plugins, stability |
| Vitest | Rapidly passing into #2 | Vite integration, ESM-native, watch UX |
| Bun test | #3, common in new projects | Speed, zero-config TS |
| Node test (node:test) | Slowly growing | No deps, official |
Bun test has not displaced the main stack for a few reasons.
- Jest's plugin / ecosystem: snapshot serializers, custom matchers, jest-axe, jest-extended, and many more. Porting all of them takes time.
- Mocking differences: jest.mock() compatibility keeps improving, but edge cases differ.
- Monorepo integration: Turborepo and Nx integrate more smoothly with Jest and Vitest.
Bun test is most common in new single-package projects. In large monorepos, Vitest still tends to win.
4. The Bun bundler — aimed at esbuild's seat, but fell short
The promise
Bun ships a built-in bundler — bun build. It openly cites esbuild as inspiration and claims to be faster.
Reality — the 2026 bundler landscape
The JS bundler landscape became much more crowded than it was in 2023.
| Bundler | Position | Notes |
|---|---|---|
| esbuild | Still dominant | Smallest surface, most stable |
| Vite (Rollup) | The dev-server standard | HMR, plugin ecosystem |
| Rolldown | The Vite next-gen | Rust-based, can beat esbuild |
| Turbopack | Next.js default | Vercel's bet, Webpack successor |
| Bun bundler | Niche, mainly inside the Bun ecosystem | Fast, zero-config TS, integrates with Bun runtime |
| swcpack | swc's bundler branch | An earlier Vercel-side experiment |
The Bun bundler is fast, that is true. But the plugin ecosystem is far smaller than esbuild's or Rollup's. For PostCSS, SVGR, MDX and similar areas, esbuild plugins do not always work as-is (Bun does try to be esbuild-plugin-compatible).
The Bun bundler is a sensible default inside the Bun ecosystem. Small SPAs served by Bun.serve, the dist of a Bun-only CLI tool. It did not become an industry-wide replacement for esbuild. That seat is being chased by Rolldown.
5. Bun.serve() — fighting for the web-framework seat
The promise
Bun.serve is "build a fast HTTP server without express." Fastify-level throughput from the standard library.
A 30-second look
// server.ts
import { serve } from "bun";
serve({
port: 3000,
async fetch(req) {
const url = new URL(req.url);
if (url.pathname === "/hello") {
return new Response("hi");
}
if (url.pathname === "/echo" && req.method === "POST") {
const body = await req.json();
return Response.json({ ok: true, body });
}
return new Response("not found", { status: 404 });
},
});
console.log("listening on http://localhost:3000");
The web-standard Request/Response API matters. Same paradigm as Cloudflare Workers and Deno — Bun lined itself up there.
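One practical consequence, sketched below: a web-standard handler is just a function from Request to Response, so it can be exercised without binding a port, and the same function can be handed to Bun.serve, Deno.serve, or a Workers export. The handler here is illustrative, not from any framework:

```typescript
// handler.ts — a plain fetch-style handler, unit-testable without a server
type FetchHandler = (req: Request) => Response | Promise<Response>;

const handler: FetchHandler = (req) => {
  const url = new URL(req.url);
  if (url.pathname === "/hello") return new Response("hi");
  return new Response("not found", { status: 404 });
};

// No port, no runtime-specific API — just call it:
const res = handler(new Request("http://localhost/hello")) as Response;
console.log(res.status); // 200
```

Request and Response are global in Node 18+ as well, so even the test harness stays runtime-agnostic.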
Reality — the rise of Hono
Bun.serve is quick. But you still need a router, middleware, and validation on top of it before you have a real app. The de-facto standard in 2026 is Hono.
// hono on bun
import { Hono } from "hono";
const app = new Hono();
app.get("/hello", (c) => c.text("hi"));
app.post("/users", async (c) => {
const body = await c.req.json();
return c.json({ id: 1, ...body });
});
export default { fetch: app.fetch, port: 3000 };
Hono runs the same code on Bun, Deno, Cloudflare Workers, Vercel Edge, and Node. That means picking Hono lets you defer or change the runtime choice. That is a big selling point. Hono-on-Bun is operationally safer than code tightly coupled to Bun.serve.
Bun.serve is good. But when people actually build, they pick Hono more often. Bun does not fight this — the Bun docs themselves point at Hono as the recommended router on top of Bun.serve.
6. bun:sqlite — the biggest over-delivery
If forced to pick the single feature that exceeded its billing, bun:sqlite is the answer.
Why it is good
On Node, using SQLite means installing better-sqlite3 (native addon) or sqlite3 (async). Both build native code at install time and occasionally hit cross-platform issues. bun:sqlite ships inside the runtime — no build step, just import and use.
Code
// sqlite.ts
import { Database } from "bun:sqlite";
const db = new Database("app.db");
db.run(`
CREATE TABLE IF NOT EXISTS posts (
id INTEGER PRIMARY KEY,
title TEXT NOT NULL,
created_at INTEGER NOT NULL
)
`);
const insert = db.prepare(
"INSERT INTO posts (title, created_at) VALUES (?, ?)"
);
const tx = db.transaction((rows: { title: string; ts: number }[]) => {
for (const r of rows) insert.run(r.title, r.ts);
});
tx([
{ title: "first", ts: Date.now() },
{ title: "second", ts: Date.now() },
]);
const recent = db
.query("SELECT id, title FROM posts ORDER BY id DESC LIMIT 10")
.all() as { id: number; title: string }[];
console.log(recent);
Prepared statements, transactions, named parameters — all synchronous. Bun and SQLite share a process, so there is no FFI overhead.
Where it is used
- Local data caches — metadata stores for CLI tools, scratch storage for tiny services.
- Embedded analytics — small kiosk dashboards, mini BI.
- Test databases — integration tests against in-memory SQLite instead of Postgres.
- Agent memory — AI agents frequently store their context/memory in SQLite.
bun:sqlite makes Bun into a "next-generation scripting environment." It brought the role of Python's sqlite3 standard library into the JS ecosystem.
7. Bun shell ($) — another above-promise feature
The promise
A type-safe shell, inspired by zx (Google's JS shell tool). The difference is that zx runs on top of Node as a third-party tool, while Bun shell is built into the runtime.
Code — a pipeline
// shell.ts
import { $ } from "bun";
const files = await $`ls -1 src`.text();
console.log(files);
// Pipes and redirects
const wc = await $`cat package.json | wc -l`.text();
console.log("lines in package.json:", wc.trim());
// Safe interpolation (shell-injection-safe)
const target = "components";
await $`mkdir -p dist/${target}`;
// Throws on non-zero exit
try {
await $`exit 1`;
} catch (e: any) {
console.log("caught exit code:", e.exitCode);
}
// Environment variables
const home = (await $`echo $HOME`.text()).trim();
console.log("home:", home);
Two things matter most.
- Shell-injection safety — JS interpolation like ${target} is automatically escaped. The os.system(...) class of footguns is blocked by construction.
- TypeScript integration — your IDE gives autocomplete and type info.
Reality — who uses it
Bun shell climbed quickly to the de-facto standard for build scripts and dev tools. The long incantations that used to live in package.json "scripts" move into scripts/build.ts and are written with Bun shell.
- It took some of make's seat.
- It solved the readability problems of shell scripts (quoting hell, special-character escaping).
- It is cross-platform — Bun shell behaves the same on Windows (POSIX emulation).
Bun shell is the feature that makes people install Bun even if they do not run Bun. When a shell script grows long and fragile, bun + $ shows up.
8. bunfig.toml, executable bundles, and the rest
bunfig.toml
Bun's configuration file. Registry, install behavior, test runner options, JSX transform options — all in one file.
# bunfig.toml
[install]
registry = "https://registry.npmjs.org/"
production = false
[install.cache]
dir = "~/.bun/install/cache"
[test]
preload = ["./test/setup.ts"]
[run]
silent = false
[loader]
".ts" = "tsx"
It does not invade the package.json (scripts, deps) and groups only Bun-specific settings. Clean separation.
Executable bundles — bun build --compile
bun build --compile --target=bun-linux-x64 ./cli.ts --outfile mycli
Package your JavaScript into a single executable. The Bun runtime is packaged in alongside it. The output file lands around 70MB, but it runs without any external dependency.
Compare with:
- Node 22: node --experimental-sea-config (Single Executable Application) — similar idea, more manual.
- Deno 2: deno compile — stable equivalent.
- pkg (Vercel): legacy tool, no longer actively maintained.
bun build --compile gives you Bun's fast startup plus no-dependency distribution in one move. CLI authors use it often.
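A sketch of the kind of entry point this suits: a dependency-free CLI, so the compiled binary stays self-contained. The flag parsing below is hypothetical, not a Bun API:

```typescript
// cli.ts — minimal no-dependency CLI, a good fit for `bun build --compile`
function parseArgs(argv: string[]): { name: string } {
  const i = argv.indexOf("--name");
  const value = i >= 0 ? argv[i + 1] : undefined;
  return { name: value ?? "world" };
}

const { name } = parseArgs(process.argv.slice(2));
console.log(`hello, ${name}`);
```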
The rest
- Bun watch (bun --watch) — equivalent to Node's --watch, with a faster restart.
- Bun macros — run JS at build time and inline the result. Useful for static data baking.
- Bun lockfile auditing — integrated security advisories.
9. The Node 22+ counter-punch — Node fielded the ball Bun served
One of Bun's largest contributions was accelerating Node's evolution. If at 1.0-time Node felt "old, slow, heavy," 2026 Node is genuinely fast and modern. Some of that improvement plugged the very gaps Bun targeted.
| What Bun showed off | Node's answer |
|---|---|
| Zero-config TS | node --experimental-strip-types (Node 22) → stable in Node 24. Run .ts files directly. |
| Built-in test runner | node:test module + node --test. Stable since 2024, mocks included. |
| Native fetch | Stable since Node 18. No more node-fetch. |
| Watch mode | node --watch. No more nodemon. |
| Env-file flag | node --env-file .env. No more dotenv. |
| Single executable app | node --experimental-sea-config. |
| Core module imports | node:fs, node:path prefix. |
Bun's biggest impact was not "people installing Bun," but "pressuring Node into getting better." Every JS developer benefits, Bun-user or not.
Node 22+ still has slower cold start than Bun and still ships npm for install (much slower), but the dev-experience gap shrank dramatically.
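What "strip-types" means in practice deserves one concrete look. Type stripping deletes annotations without transforming code, so only erasable syntax is allowed; constructs that generate runtime code (enum, namespace, parameter properties) need --experimental-transform-types instead. A minimal sketch:

```typescript
// erasable.ts — runs under `node --experimental-strip-types` because every
// TypeScript construct here erases to plain JS (nothing emits runtime code).
interface User {
  id: number;
  name: string;
}

function greet(u: User): string {
  return `hello ${u.name} (#${u.id})`;
}

// enum Role { Admin, User }  // ✗ not erasable — an enum emits runtime code
const Role = { Admin: 0, User: 1 } as const; // ✓ erasable alternative

console.log(greet({ id: Role.Admin, name: "ada" }));
```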
10. Deno 2 — the security-first alternative that loosened Node-compat
Deno also pivoted with Deno 2 in 2024. Node compatibility mode was turned on aggressively, package.json is supported natively, and npm modules can be imported.
A simplified mid-2026 view of Bun vs Deno 2:
| Axis | Bun | Deno 2 |
|---|---|---|
| Runtime speed | Fastest | A touch behind Bun, plenty fast |
| Node compat | Broad, some gaps | Big jump in Deno 2, stable |
| Permission model | None by default (like Node) | Deny-by-default, explicit permission flags |
| Standard library | Relatively small | Rich and curated (@std) |
| TS support | Zero-config, fast | Zero-config, with type-check |
| Security stance | Speed first | Safety first (sandboxing) |
| Package manager | bun install | deno install (jsr.io or npm:) |
Deno 2 sits in the security-first corner; Bun sits in the speed-first corner. In practice they have been converging — Node compat is broad on both sides. Deno's --allow-net, --allow-read and similar flags give a real edge in production security but add friction in dev.
If your operating environment is high-security prod (government, finance), Deno 2 has an edge. For dev tooling, CI, and general web services, Bun wins on developer experience. For a company-wide standard runtime, Node remains the default.
11. Broken promises — where Bun fell short
The core of an honest retrospective: where did Bun not deliver.
11.1 The "drop-in Node replacement" narrative
That was the launch line. In practice, legacy enterprise codebases with native addons, APM agents, custom debuggers, or complex cluster usage often still do not work cleanly. "Drop-in" is true in small codebases.
11.2 Windows support
At 1.0 this was the biggest broken promise. For the first year, only Linux/macOS had first-class support, and the official advice for Windows was WSL. Native Windows support landed in late 2024, but even in mid-2026, some modules still hit edge cases on Windows.
11.3 Stability (the months right after 1.0)
In the six months after 1.0 GA, the field saw a non-trivial number of reports — "weird memory leaks," "OOM kills," "race conditions." Some early adopters running Bun in production rolled back to Node. Stability has improved a lot through 2025, but the shock of those first months left a lasting mark on reputation.
11.4 Observability and the APM gap
A persistent weak spot: APM instrumentation. The deep instrumentation New Relic, Datadog, and Dynatrace built over years on Node took 2+ years to reach parity for Bun. This alone keeps operations teams away — "production you cannot see is not production."
11.5 The occasional Node API rough edge
The exact callback signature of fs.readFile, internal use of process.binding, edge cases in worker_threads — narrower than before but not zero. Large libraries (including some paths in the TypeScript compiler) occasionally behave slightly differently on Bun.
12. Over-delivered — where Bun exceeded expectations
For balance, the things Bun did better than promised.
12.1 Bun shell
As section 7 covered, Bun shell exceeded its mandate and became an industry default tool.
12.2 bun install speed and stability
The original 25x number varied by scenario, but consistently fast installs kept the promise. Cold cache, hot cache, lockfile diffs — all reasonable.
12.3 bun:sqlite
Section 6. A feature not in the original headline that turned into a hit.
12.4 Cold start
Cold-start advantage on Lambda and Cloudflare Workers became a bigger operational use case than the original framing.
12.5 The Bun team's responsiveness
When issues are reported they move quickly. Lockfile format pivot, Windows support, Node API additions — they took community feedback seriously. That deserves credit.
13. Who actually runs Bun in production — mid-2026 snapshot
The patterns I see in companies that have actually deployed Bun.
Case A — small API gateways / edge functions
- Tiny Bun.serve instances on fly.io handling routing, auth, and rate limiting.
- Cloudflare Workers uses its own V8 isolates, not Bun, but using Bun.serve locally as a mock server in place of Wrangler is a common dev pattern.
Case B — CLI tools
- Build a CLI in Bun, then ship a single executable via bun build --compile. End users do not need Bun installed.
- Some Prisma CLI components are written in Bun.
Case C — internal tools / dev tools
- Company dashboards, deploy automation, CI helpers. Small ops tools made with Bun.serve plus bun:sqlite are very common.
Case D — AI agent runtimes
- A growing pattern: running AI agents inside a Bun sandbox. Fast cold start + bun:sqlite for memory + Bun shell for tool execution is a strong combo. Some AI infra providers (e2b, Modal) offer a Bun runner option.
Case E — almost never — large monoliths
- Very rare. Public stories of a traditional large monolith migrating to Bun can be counted on one hand. The risk-to-reward is not attractive, and APM/debugger pressure is strong.
Named cases (per public posts)
- Vercel: parts of internal build tooling and some edge-function workflows.
- Mintlify: pieces of the docs builder.
- Codeium / Replit / Glitch: Bun as an option in user runtime environments.
- Many startups: early single-instance APIs on Bun.serve plus Hono.
14. Decisions — where Bun fits, where Node still fits
The honest take.
Where Bun fits
- Dev workflows where CI speed matters — bun install + bun test alone is a real win.
- CLI tools — bun build --compile for single-file ship, fast startup.
- Edge and serverless — cold-start advantage.
- Scripting / dev tooling — Bun shell.
- SQLite-heavy small services — bun:sqlite performance.
- New projects with few external deps — compatibility issues unlikely.
Where Node still fits
- Large enterprise monoliths — APM, custom debuggers, native addon ecosystem.
- Legacy codebases where operational stability matters — change risk exceeds reward.
- Production where deep observability is mandatory — Datadog/NR auto-instrumentation maturity.
- Heavily regulated environments — Node's audit/CVE pipelines are well established.
- Company-standard runtime — easier to hire and train for.
Where Deno 2 fits
- High-security environments — the permission model is operationally meaningful.
- Development around a curated standard library — @std stability.
- New projects that still need Node interop — Deno 2's npm compat is good now.
The three runtimes are no longer zero-sum. A growing number of companies run Bun (dev tooling/CI/CLI), Node (main service), and Deno (high-security pieces) side by side. The narrative of "one runtime wins all" does not hold in 2026.
Epilogue — promises, the invoice, and the next round
Two and a half years after Bun 1.0 GA. To summarize:
Bun did not keep its promise to "replace Node in one swoop." But it did keep its promise to "make the JS ecosystem faster" — both inside Bun and by pressuring Node from the outside. The result: every JS developer is using better tools today, whether they adopted Bun or not.
What to remember:
- Think of Bun as a toolkit, not just a runtime. Pick bun install, bun test, Bun shell, and bun build --compile individually. Adopt the runtime later, slowly.
- Node 22+ is genuinely better. If your image of Node is the old one, revisit it.
- Do not forget Deno 2. It is the right option for security/standard-first operations.
- APM compatibility still lags. Confirm support before going to production.
- Windows keeps improving, but first-class status is recent.
Adoption checklist (before adopting Bun)
- Does our APM (NR/DD/Sentry/...) officially support Bun? At what level of instrumentation?
- Do our native addons (sharp, node-canvas, ...) run on Bun?
- Can our CI accept Bun in the build stage? What about Docker base images?
- Do our people have time to adapt to Bun debugging? (Understand the limits of Chrome DevTools protocol compatibility.)
- Where exactly do we adopt Bun? (Phases: lockfile only → install only → test/scripts → CLI → small service → main service.)
- What is the rollback story? Cost of moving back from Bun to Node?
Common anti-patterns
- Deploying Node code on Bun and forgetting non-functional requirements (observability).
- Letting CI speed lure you into changing the production runtime in one step. Those two changes can be separated, and should be.
- Overusing Bun shell — for a single command, a plain child_process.spawnSync is fine.
- Using bun:sqlite as the production primary database. SQLite is single-node data; do not put it in Postgres's seat.
- Trusting the marketing benchmarks as-is. Measure on your own workload. "Fast at 1k qps" and "fast at 100k qps" are different questions.
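The spawnSync point above, sketched: for one command with no pipes, node:child_process works on both runtimes, and since no shell is involved there is no quoting or injection surface to manage.

```typescript
// one-command.ts — plain spawnSync instead of a shell for a single command
import { spawnSync } from "node:child_process";

const out = spawnSync("echo", ["hello"], { encoding: "utf8" });
if (out.status !== 0) throw new Error(`echo exited with ${out.status}`);
console.log(out.stdout.trim()); // hello
```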
Coming next
- "Hono Guide 2026 — runtime-agnostic web framework (Bun/Deno/Workers/Vercel/Node)" — one codebase across five runtimes.
- "Node 22~24 — how strip-types, node:test, --watch, and --env-file changed dev experience" — the benefits even non-Bun-users get.
- "Deno 2 retrospective" — one year after the Node-compat bet, how did it play out.
References
Bun official
- Bun homepage — https://bun.sh/ — latest version, changes.
- Bun docs — https://bun.sh/docs — runtime, bundler, test, shell, all in one place.
- Bun on GitHub — https://github.com/oven-sh/bun — issues, release notes.
- Bun blog (releases) — https://bun.sh/blog — per-version changes.
Node.js
- Node.js homepage — https://nodejs.org/
- node:test documentation — native test runner from Node 22+.
- The Node strip-types proposal (TC39 type annotations) and its relationship to TypeScript erasable syntax.
Deno
- Deno homepage — https://deno.com/
- Deno 2 announcement — the Node-compat bet and npm: specifier support.
Benchmarks and comparisons
- "Bun vs Node vs Deno: HTTP performance" — multiple community benchmarks (workload-dependent, measure yours).
- TechEmpower Web Framework Benchmarks — the standard reference for Bun/Node/Deno framework comparisons.
Web frameworks / runtime-agnostic
- Hono — https://hono.dev/ — runs on Bun, Deno, Workers, Node.
- itty-router, h3 — other options in the same category.
SQLite
- bun:sqlite documentation — the API for Bun's built-in SQLite.
- better-sqlite3 — the de-facto Node comparator.
Bun shell, zx
- Bun shell documentation — full $ API usage.
- Google zx — https://github.com/google/zx — the inspiration for Bun shell.
Production adoption (per public writing)
- Vercel engineering posts on Bun adoption.
- Many startup retrospectives — "why we did (not) pick Bun."
Critical perspectives
- Early-days production issue retrospectives on Bun.
- Ops-team writing on the limits of APM auto-instrumentation.
- Analyses of areas where Node has closed the gap.