Slow Server Start
When cold-starting the dev server, a bundler-based build setup has to eagerly crawl and build your entire application before it can be served.
Vite improves the dev server start time by first dividing the modules in an application into two categories: dependencies and source code. Dependencies are mostly plain JavaScript that does not change often during development; Vite pre-bundles them using esbuild, which is written in Go and is significantly faster than JavaScript-based bundlers. Source code, on the other hand, often contains non-plain JavaScript that needs transforming (e.g. JSX, CSS, or framework components) and is edited very often.
Vite serves source code over native ESM. This is essentially letting the browser take over part of the job of a bundler: Vite only needs to transform and serve source code on demand, as the browser requests it. Code behind conditional dynamic imports is only processed if actually used on the current screen.
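As a sketch of what "on demand" means here, code behind a dynamic import() is not fetched or evaluated until the import call actually runs. The data: URL below stands in for a real feature module (e.g. a route component) so the example is self-contained:

```javascript
// Sketch: code behind a dynamic import() is only loaded when the call
// actually runs. The data: URL stands in for a hypothetical feature
// module such as './Dashboard.js'.
const featureModule = 'data:text/javascript,export const answer=42;';

async function loadFeature() {
  // With Vite, the module would be transformed and served at this
  // point, on demand — not at server start.
  const mod = await import(featureModule);
  return mod.answer;
}
```

Modules that are never imported on the current screen are never requested, so the dev server does no work for them.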
Slow Updates
When a file is edited in a bundler-based build setup, it is inefficient to rebuild the whole bundle for an obvious reason: the update speed will degrade linearly with the size of the app.
In some bundlers, the dev server runs the bundling in memory so that it only needs to invalidate part of its module graph when a file changes, but it still needs to re-construct the entire bundle and reload the web page. Reconstructing the bundle can be expensive, and reloading the page blows away the current state of the application. This is why some bundlers support Hot Module Replacement (HMR): allowing a module to "hot replace" itself without affecting the rest of the page. This greatly improves DX - however, in practice we've found that even HMR update speed deteriorates significantly as the size of the application grows.
In Vite, HMR is performed over native ESM. When a file is edited, Vite only needs to precisely invalidate the chain between the edited module and its closest HMR boundary (most of the time only the module itself), making HMR updates consistently fast regardless of the size of your application.
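For example, a module becomes an HMR boundary by opting in via Vite's import.meta.hot API. The counter module below is illustrative:

```javascript
// counter.js — illustrative module that accepts its own hot updates,
// making it an HMR boundary: edits to this file are swapped in without
// invalidating anything further up the import chain.
export let count = 0;

export function increment() {
  count++;
}

if (import.meta.hot) {
  // Called with the newly evaluated module whenever this file is edited.
  import.meta.hot.accept((newModule) => {
    console.log('counter updated, new count:', newModule.count);
  });
}
```

The `import.meta.hot` guard also lets the conditional block be tree-shaken away in production builds.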
Vite also leverages HTTP headers to speed up full page reloads (again, let the browser do more work for us): source code module requests are made conditional via `304 Not Modified`, and dependency module requests are strongly cached via `Cache-Control: max-age=31536000,immutable` so they don't hit the server again once cached.
Once you experience how fast Vite is, we highly doubt you'd be willing to put up with bundled development again.
Why Bundle for Production
Even though native ESM is now widely supported, shipping unbundled ESM in production is still inefficient (even with HTTP/2) due to the additional network round trips caused by nested imports. To get the optimal loading performance in production, it is still better to bundle your code with tree-shaking, lazy-loading and common chunk splitting (for better caching).
Ensuring optimal output and behavioral consistency between the dev server and the production build isn't easy. This is why Vite ships with a pre-configured build command that bakes in many performance optimizations out of the box.
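The build is still configurable when needed. For instance, chunk splitting can be tuned through the underlying Rollup options; the vendor-chunk grouping below is illustrative, not a recommendation:

```javascript
// vite.config.js — illustrative: split all node_modules code into a
// separate 'vendor' chunk so it can be cached independently of app code.
import { defineConfig } from 'vite'

export default defineConfig({
  build: {
    rollupOptions: {
      output: {
        manualChunks(id) {
          if (id.includes('node_modules')) {
            return 'vendor'
          }
        },
      },
    },
  },
})
```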
Why Not Bundle with esbuild?
Vite's current plugin API isn't compatible with using esbuild as a bundler. In spite of esbuild being faster, Vite's adoption of Rollup's flexible plugin API and infrastructure heavily contributed to its success in the ecosystem. For the time being, we believe that Rollup offers a better performance-vs-flexibility tradeoff.
esbuild has progressed a lot in the past years, and we won't rule out the possibility of using esbuild for production builds in the future. We will keep taking advantage of new capabilities as they are released, as we have done with JS and CSS minification, where esbuild allowed Vite to get a performance boost while avoiding disruption for its ecosystem.
How is Vite Different from X?
You can check out the Comparisons section for more details on how Vite differs from other similar tools.