Michal Zalecki

Optimize React build for production with webpack

This guide is a way of writing down a few techniques that I have been using, with ups and downs, for the past two years. Optimizations highly depend on your goals and on how users experience your app, for example whether you care more about time to interactive or overall size. It should not come as a surprise that, as always, there is no silver bullet. Consider yourself warned. Although you have to optimize for your own use cases, there is a set of common methods and rules to follow. Those rules are a great starting point for making your build lighter and faster.

TL;DR

  1. Minify with UglifyJS
  2. Remove dead code with Tree shaking
  3. Compress with gzip
  4. Routing, code splitting, and lazy loading
  5. Dynamic imports for heavy dependencies
  6. Split across multiple entry points
  7. CommonsChunkPlugin
  8. ModuleConcatenationPlugin
  9. Optimize CSS class names
  10. NODE_ENV="production"
  11. Babel plugins optimizations

Minify with UglifyJS

UglifyJS is a truly versatile toolkit for transforming JavaScript. Despite the humongous amount of configuration options available, you only need to know about a few to effectively reduce bundle size. A small set of common options brings a major improvement.

module.exports = {
  devtool: "source-map", // cheap-source-map will not work with UglifyJsPlugin

  plugins: [
    new webpack.optimize.UglifyJsPlugin({
      sourceMap: true,   // enable source maps to map errors (stack traces) to modules
      output: {
        comments: false, // remove all comments
      },
    }),
  ]
};

You can take it from there and fine-tune it: change how UglifyJS mangles function and property names, or decide whether you want to apply certain optimizations. While you are experimenting with UglifyJS, keep in mind that some options put restrictions on how you can use particular language features. Make yourself familiar with them, or you, or rather your users, will come across a few hard-to-debug problems present only in the production bundle. Obey the rules of each option and test extensively after each change.
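For example, a possible next step in tuning might look like the sketch below. Treat the option values as illustrative rather than a recommendation; drop_console and pure_getters in particular come with the restrictions on language usage mentioned above.

const webpack = require("webpack");

module.exports = {
  devtool: "source-map",

  plugins: [
    new webpack.optimize.UglifyJsPlugin({
      sourceMap: true,
      mangle: {
        screw_ie8: true,     // assume no IE8, allows more aggressive mangling
      },
      compress: {
        screw_ie8: true,
        warnings: false,     // silence warnings about dropped code
        drop_console: true,  // remove console.* calls; only if you do not rely on them
        pure_getters: true,  // assume getters have no side effects (a restriction!)
      },
      output: {
        comments: false,
      },
    }),
  ],
};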

Remove dead code with Tree shaking

Tree shaking is a dead code, or more accurately, unused-export elimination technique which relies on ES2015 module import/export. In the old days, like 2015, if you imported one function from an entire library you would still have to ship a lot of unused code to your users. Well, unless the library supports cherry-picking methods like lodash does, but that is a story for a different post. Webpack introduced support for native imports and Tree shaking in version 2. This optimization was popularized much earlier in the JavaScript community by rollup.js. Although webpack offers support for Tree shaking, it does not remove any unused exports on its own. Webpack just adds a comment with an annotation for UglifyJS. To see the effects of marking exports for removal, disable minimization.

module.exports = {
  devtool: "source-map",

  plugins: [
    // disable UglifyJS to see Tree shaking annotations for UglifyJS
    // new webpack.optimize.UglifyJsPlugin({
    //   sourceMap: true,
    //   output: {
    //     comments: false,
    //   },
    // }),
  ]
};

Tree shaking only works with ES2015 modules. Make sure you have disabled transforming modules to commonjs in your Babel config. Node does not support ES2015 modules, and you probably use it to run your unit tests, so make sure the transformation stays enabled in the test environment.

{
  "presets": [["es2015", { "modules": false }], "react"],
  "env": {
    "test": {
      "presets": ["es2015", "react"],
    }
  }
}

Let's try it out on a simple example and see whether the unused export is marked for removal. From math.js we only need the fib function and doFib, which is called by fib.

// math.js
function doFact(n, akk = 1) {
  if (n === 1) return akk;
  return doFact(n - 1, n * akk);
}

export function fact(n) {
  return doFact(n);
}

function doFib(n, pprev = 0, prev = 1) {
  if (n === 1) return prev;
  return doFib(n - 1, prev, pprev + prev);
}

export function fib(n) {
  return doFib(n);
}

// index.js
import { fib } from "./common/math";

console.log(fib(10));

Now you should be able to find the following comment in the bundle code, with the entire module still below it.

/* unused harmony export fact */

Enable UglifyJS back and fact, along with doFact, will be removed from the bundle. That is the theory behind Tree shaking. How effective is it in a more real-life example? To keep the proportions, I am going to include React and lodash in the project. I am importing the 10 lodash functions I use most often: omit, pick, toPairs, uniqBy, sortBy, memoize, curry, flow, throttle, debounce.

Such a bundle weighs 72KB after being gzipped. That basically means that in spite of using Tree shaking, webpack bundled the entire lodash. So why do we bundle all of lodash but only the used exports from math.js? Lodash is meant for both the browser and Node. That is why by default it is available as a commonjs module. We can use lodash-es, which is the ES2015 module version of lodash. Such a bundle with lodash-es weighs... 84KB. That is what I call a failure. It is not a big deal, as this can be fixed with babel-plugin-lodash. Now it is "only" 60KB. The thing is, I would not count on Tree shaking to significantly reduce the size of bundled libraries. At least not out of the box. The most popular libraries have not fully embraced it yet, and publishing packages as ES modules is still a rare practice.
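For reference, enabling babel-plugin-lodash is a one-line addition to the Babel config (a minimal sketch, assuming the plugin is installed; it rewrites member imports into cherry-picked ones during transpilation):

{
  "presets": [["es2015", { "modules": false }], "react"],
  "plugins": ["lodash"]
}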

Compress with gzip

Gzip is a small program and a file format used for file compression. Gzip takes advantage of redundancy. It is so effective at compressing text files that it can reduce response size by about 70%. Our gzipped 60KB bundle was 197KB before gzip compression!

Although enabling gzip for serving static files seems an obvious thing to do, only about 69% of pages are actually using it.

If you are using Express to serve your files, use the compression package. There are a few options available, but level is the most influential. There is also a filter option allowing you to pass a predicate which indicates whether a response should be compressed. The default filter function takes into account the response size and content type.

const express = require("express");
const compression = require("compression");

const app = express();

function shouldCompress(req, res) {
  if (req.headers["x-no-compression"]) return false;
  return compression.filter(req, res);
}

// compression has to be registered before express.static to compress static files
app.use(compression({
  level: 2,               // set compression level from 1 to 9 (6 by default)
  filter: shouldCompress, // set predicate to determine whether to compress
}));
app.use(express.static("build"));

When I do not need the additional logic Express makes possible (like server-side rendering), I prefer to use nginx for serving static files. You can read more about the configuration options in the nginx gzip docs.

gzip on;
gzip_static on;
gzip_disable "msie6";

gzip_vary on;
gzip_proxied any;
gzip_comp_level 6;
gzip_buffers 16 8k;
gzip_http_version 1.1;
gzip_min_length 256;
gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript application/vnd.ms-fontobject application/x-font-ttf font/opentype image/svg+xml image/x-icon;

OK, but what level of compression should you choose? It depends on the size of your files and your server CPU. Just as a reminder, the bundle I was working on before is a 197KB JavaScript file. I did some stress tests with wrk using 120 connections and 8 threads for 20 seconds on a MacBook Pro 15" mid-2015:

wrk -c 120 -d 20s -t 8 http://localhost:8080/app.8f3854b71e9c3de39f8d.js

I am focusing on the border values and the default one, which I use most of the time.

                 level 1   level 6   level 9
gzipped size     68.3 KB   57.9 KB   57.7 KB
requests (wrk)   2493      2435      2284

I do not have any brilliant conclusion here. It is what I would expect, and using the default level 6 makes the most sense to me in this setup. Static files have the advantage that they can be compressed at build time.

const CompressionPlugin = require("compression-webpack-plugin");

module.exports = {
  plugins: [
    new CompressionPlugin(),
  ]
};

A file compressed this way is 57.7KB and does not take processor time at request time, thanks to gzip_static on; being set. Nginx will just serve /app.8f3854b71e9c3de39f8d.js.gz when /app.8f3854b71e9c3de39f8d.js is requested. The other takeaway about gzip is to never enable it for images, videos, PDFs, and other binary formats, as those are already compressed by their nature.

Routing, code splitting, and lazy loading

Code splitting allows for dividing your code into smaller chunks in such a way that each chunk can be loaded on demand, in parallel, or conditionally. Easy code splitting is one of the biggest advantages of webpack and is what gained it great popularity. Code splitting made webpack the module bundler. Over two years ago, when I was moving from gulp to webpack, I was using angularjs for developing web apps. Implementing lazy loading in angular required a few hacks here and there. With React and its declarative nature, it is much easier and much more elegant.

There are a few ways you can approach code splitting and we will go through most of them. Right now, let's focus on splitting our application by routes. The first thing we need is react-router v4 and a few route definitions.

// routes.jsx

import React from "react";
import { Route, Switch } from "react-router-dom";

import App from "./common/App/App";
import Home from "./routes/Home/Home";
import About from "./routes/About/About";
import Login from "./routes/Login/Login";

const Routes = () => (
  <App>
    <Switch>
      <Route exact path="/" component={Home} />
      <Route exact path="/about" component={About} />
      <Route exact path="/login" component={Login} />
    </Switch>
  </App>
);

export default Routes;

// Home.jsx

import React from "react";
import { pick, toPairs, uniqBy } from "lodash-es";

const Home = () => { ... }

// About.jsx

import React from "react";
import { sortBy, memoize, curry } from "lodash-es";

const About = () => { ... }

// Login.jsx

import React from "react";
import { flow, throttle, debounce } from "lodash-es";

const Login = () => { ... }

I have also split my 9 favorite lodash utility functions "equally" across the routes. Now, with routes and react-router, the application size is 69KB gzipped. The goal is to make loading each route faster by excluding the other pages' code. You can check code execution coverage in Chrome DevTools. Approximately, over 47% of the bundled code is not used when entering a given route.

React Router v4 is a full rewrite of the most popular router library for React. There is no simple migration path. The upside is that the new version is more modular and declarative. The downside is that you need a few additional packages to match the functionality of the previous version, like query-string or qs for parsing query params and react-loadable for lazy loading components.
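For example, reading query params in a route component now takes a few extra lines. A sketch with query-string (SearchResults is a hypothetical component; location is injected by Route):

// SearchResults.jsx

import React from "react";
import queryString from "query-string";

// location is injected by <Route component={...} />
const SearchResults = ({ location }) => {
  // location.search is a raw string in v4, e.g. "?q=webpack&page=2"
  const { q, page } = queryString.parse(location.search);

  return <div>Results for {q}, page {page || 1}</div>;
};

export default SearchResults;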

To defer loading a page component until it is really needed we can use react-loadable. The Loadable HOC expects a function which lazily loads and returns a component. I am not keen on the idea of adding this code to each route. Imagine the next version is a breaking change and you have to go through every route to change the code. Instead, I am going to create a LazyRoute component and use it in my routes definition.

// routes.jsx

import React from "react";
import { Route, Switch } from "react-router-dom";
import Loadable from "react-loadable";

import App from "./common/App/App";

const LazyRoute = (props) => {
  const component = Loadable({
    loader: props.component,
    loading: () => <div>Loading&hellip;</div>,
  });

  return <Route {...props} component={component} />;
};

const Routes = () => (
  <App>
    <Switch>
      <LazyRoute exact path="/" component={() => import("./routes/Home/Home")} />
      <LazyRoute exact path="/about" component={() => import("./routes/About/About")} />
      <LazyRoute exact path="/login" component={() => import("./routes/Login/Login")} />
    </Switch>
  </App>
);

export default Routes;

After implementing dynamic imports for route components, loading any given route takes 65KB instead of 69KB. You can say it is not much, but keep in mind that we have just installed react-loadable. Down the road, this improvement pays off.

Before webpack 2, when imports were not natively supported, you used require.ensure to code split and load code dynamically. The disadvantage of using webpack 2 imports is that they do not allow you to name your chunks. It is not really webpack 2's fault, it is just not part of the import() proposal. Instead of app.[hash].js, home.[hash].js, about.[hash].js and login.[hash].js, the bundle contains app.[hash].js, 0.[hash].js, 1.[hash].js, 2.[hash].js. This is not very helpful, especially when you are trying to tackle regression issues. For example, after some change you have noticed that the bundle has grown in size, but adding a new dynamic import can change the names of unrelated chunks. Fortunately, webpack 3 already addressed that issue with so-called "magic comments":

<LazyRoute exact path="/" component={() => import(/* webpackChunkName: "home" */ "./routes/Home/Home")} />
<LazyRoute exact path="/about" component={() => import(/* webpackChunkName: "about" */ "./routes/About/About")} />
<LazyRoute exact path="/login" component={() => import(/* webpackChunkName: "login" */ "./routes/Login/Login")} />

// -rw-r--r--  1 michal  staff    62K Aug  6 11:07 build/app.ca30de797934ff9484e2.js.gz
// -rw-r--r--  1 michal  staff   3.2K Aug  6 11:07 build/home.a5a7f7e91944ead98904.js.gz
// -rw-r--r--  1 michal  staff   6.0K Aug  6 11:07 build/about.d8137ade9345cc48795e.js.gz
// -rw-r--r--  1 michal  staff   1.4K Aug  6 11:07 build/login.a68642ebb547708cf0bc.js.gz

Dynamic imports for heavy dependencies

There are multiple ways to benefit from dynamic imports. I have covered dynamic imports for React components already. The other way to optimize is to find a library or another module of significant size which is used only under certain conditions. An example of such a dependency can be libphonenumber-js for phone number formatting and validation (70-130KB, varying depending on the selected metadata) or zxcvbn for a password strength check (a whopping 820KB).
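The same rule as with route components applies: pull the library in the moment it is needed. A sketch of on-demand phone validation (validatePhone is a hypothetical helper; isValidNumber reflects the libphonenumber-js API at the time of writing, so double check it against the version you install):

// validatePhone.js - libphonenumber-js is fetched only when this helper is first called
export function validatePhone(value, country = "US") {
  return import("libphonenumber-js").then((lib) =>
    // isValidNumber(text, country) - adjust if your version exposes a different API
    lib.isValidNumber(value, country)
  );
}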

Imagine you have to implement a login page which contains two forms: login and signup. You do not want to load either libphonenumber-js or zxcvbn when a user only wants to log in. This naive example shows how you can introduce better, more refined rules for on-demand, dynamic code loading. We want to show password strength only when the user focuses the input, not sooner.

class SignUpForm extends Component {
  state = {
    passwordStrength: -1,
  };

  static LABELS = ["terrible", "bad", "weak", "good", "strong"];

  componentWillReceiveProps = (newProps) => {
    if (this.props.values.password !== newProps.values.password) {
      this.setPasswordStrength(newProps.values.password);
    }
  };

  showPasswordStrength = () => {
    if (this.state.passwordStrength === -1) {
      // import on demand
      import("zxcvbn").then((zxcvbn) => {
        this.zxcvbn = zxcvbn;
        this.setPasswordStrength(this.props.values.password);
      });
    }
  };

  setPasswordStrength = (password) => {
    if (this.zxcvbn) {
      this.setState({ passwordStrength: this.zxcvbn(password).score });
    }
  };

  render() {
    const { onSubmit, onChange, values } = this.props;
    const { passwordStrength } = this.state;

    return (
      <form onSubmit={onSubmit}>
        <div>
          <label>
            Email:{" "}
            <input type="email" name="email" value={values.email} onChange={onChange} />
          </label>
        </div>
        <div>
          <label>
            Password:{" "}
            <input type="password" name="password" value={values.password} onChange={onChange} onFocus={this.showPasswordStrength} />
            {passwordStrength > -1 && <div>Password is {SignUpForm.LABELS[passwordStrength]}</div>}
          </label>
        </div>
        <input type="submit" />
      </form>
    );
  }
}

Split across multiple entry points

As you probably know, a single webpack build can have multiple entry points. This feature can be used to very effectively reduce the loaded code for particular parts of the application. Imagine that your app works similarly to the Heroku frontend. You have a homepage which introduces the service, but most of the features, and so most of the code, are meant for logged-in users (apps management, monitoring, billing etc.). Maybe you do not even need to use React for your homepage and the entire JavaScript code required comes down to displaying a lame popup. Let's write some VanillaJS!

// home.js

const email = prompt("Sign up to our lame newsletter!");

fetch("/api/emails", { method: "POST", body: JSON.stringify({ email }) });

We are going to use a separate HTML file for the homepage. HTMLWebpackPlugin is used for generating HTML files with the entry point paths injected. We need two separate files: home.html for our new homepage and index.html for the rest of the pages. To generate two separate files, use two instances of HTMLWebpackPlugin with different configs. You want to explicitly specify chunks for each file.

const path = require("path");
const HTMLWebpackPlugin = require("html-webpack-plugin");

module.exports = {
  entry: {
    app: [path.resolve("src/index.jsx")],
    home: [path.resolve("src/home.js")],
  },

  plugins: [
    new HTMLWebpackPlugin({
      filename: "home.html",
      excludeChunks: ["app"],
      template: path.resolve("src/home.html"),
    }),
    new HTMLWebpackPlugin({
      excludeChunks: ["home"],
      template: path.resolve("src/index.html"),
    }),
  ]
};

The last thing is to customize your server so that GET / serves home.html. Add a handler for GET / before the express.static middleware so that home.html takes precedence over index.html.

app.get("/", (req, res) => {
  res.sendFile(path.resolve("build", "home.html"));
});

app.use(express.static("build"));

app.get("*", (req, res) => {
  res.sendFile(path.resolve("build", "index.html"));
});

This way we went down from loading over 70KB of JavaScript to only 3.5KB! Using multiple entry points requires good planning and an understanding of the business requirements. On the other hand, the implementation itself is really simple.

CommonsChunkPlugin

The CommonsChunkPlugin can be used to create a separate file (a chunk) consisting of modules which are used across multiple entry points and their children. The advantage of having one file for common modules is the lack of repetition, which is always a good thing. Moreover, a chunk with a hash calculated from its content in the name can be aggressively cached. Once a file is downloaded, it can later be served from disk until it changes.
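To benefit from that caching, name output files after their content. A minimal sketch (the exact filename pattern is up to you; chunkhash changes only when the chunk content changes):

const path = require("path");

module.exports = {
  output: {
    path: path.resolve("build"),
    filename: "[name].[chunkhash].js",      // e.g. app.8f3854b71e9c3de39f8d.js
    chunkFilename: "[name].[chunkhash].js", // lazy loaded chunks get content hashes too
  },
};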

There are a few ways to use CommonsChunkPlugin and you can combine them for the best result. The provided configuration creates two instances. The simpler one just creates a separate file, of a given name, with modules reused across entry points (app and home). The second one makes sure that modules reused across children (about and login) are also exported to a separate, asynchronously loaded file and do not add size to each child chunk. With minChunks you can set the least number of chunks that have to reuse a module before the module is extracted.

module.exports = {
  entry: {
    home: [path.resolve("src/home.js")],
    app: [path.resolve("src/index.jsx")],
  },

  plugins: [
    new webpack.optimize.CommonsChunkPlugin({
      name: "commons",
    }),
    new webpack.optimize.CommonsChunkPlugin({
      children: true,
      async: true,
      minChunks: 2, // the least number of chunks reusing a module before module can be exported
    }),
  ],
};

ModuleConcatenationPlugin

ModuleConcatenationPlugin, or actually Scope Hoisting, is the main feature of webpack 3. The basic idea behind Scope Hoisting is to reduce the number of wrapper functions around the modules. Rollup does that too. In theory, it should speed up the execution of the bundle by reducing the number of closures that have to be called. I might sound skeptical, but I am really excited that the webpack team is pursuing performance! The talk I linked to is just very good and sheds some light on micro-optimizations.

Although some users report a significant improvement in bundle size, such an improvement would mean that webpack glue code is the majority of their bundled code. From my experience, it may save you a few kilobytes here and there (for an average bundle of ~350KB), but that is it.

module.exports = {
  plugins: [
    new webpack.optimize.ModuleConcatenationPlugin(),
  ],
};

Similar to Tree shaking, this optimization works only with ES modules.

Optimize CSS class names

Whether styles splitting applies to you or not depends on how you handle your styles. If you use a CSS-in-JS kind of solution, you just ship the JS code and the CSS strings styling your components together with the components themselves.

On the other hand, if you prefer to use css-loader and ExtractTextPlugin, there is a way in which you can affect the size of the shipped CSS code. I have recently been testing how class names influence the size of the bundle. As you may know, css-loader allows for specifying a pattern by which a selector is mapped to a unique ident. By default, it is a 23-character hash. I did a few tests, tried a few patterns in one of the projects, and I was more than pleased with the results. At first glance, the less code the better, so the shortest class names should give the best result.

                       default [hash:23]   [hash:12]   [local]-[hash:5]   [name]-[hash:5]
bundle (with assets)   8540 KB             8504 KB     8524 KB            8532 KB
gzip                   3936 KB             3904 KB     3829 KB            3892 KB
savings (gzip, KB)     N/A                 32          107                44

Due to the nature of how compression works, making idents more similar to each other results in a smaller gzipped bundle. Selector names are used in both the CSS file and the components, so the size savings are doubled. If you have a lot of small components and many generic, similar class names (wrapper, title, container, item), your results will be similar to mine. If you have a smaller number of components but a lot of CSS selectors, you might be better off with [name]-[hash:5].
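For reference, here is a sketch of such a setup with css-loader and ExtractTextPlugin (webpack 2/3 loader syntax; [local]-[hash:base64:5] corresponds to the [local]-[hash:5] pattern from the table above):

const ExtractTextPlugin = require("extract-text-webpack-plugin");

module.exports = {
  module: {
    rules: [
      {
        test: /\.css$/,
        use: ExtractTextPlugin.extract({
          fallback: "style-loader",
          use: [
            {
              loader: "css-loader",
              options: {
                modules: true,
                sourceMap: true, // see the note below about source maps
                localIdentName: "[local]-[hash:base64:5]",
              },
            },
          ],
        }),
      },
    ],
  },

  plugins: [
    new ExtractTextPlugin("[name].[contenthash].css"),
  ],
};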

To use ExtractTextPlugin, source maps, and UglifyJsPlugin together, remember to enable the sourceMap option. Otherwise, you can come across some not so obvious to debug issues. By not so obvious I mean some crazy error messages which do not tell you anything and do not seem to be related in any way to what you have just done. Love it or hate it, but sometimes that is the price of tools which do the heavy lifting for you.

NODE_ENV="production"

This one seems pretty obvious! I bet when you read the title "Optimize React build for production with webpack" you could think of at least these two: NODE_ENV and UglifyJS. Although it is pretty common knowledge, confusion happens.

How does it work? It is not a React-specific optimization and you can quickly try it out. Create an entry point with the following content:

if (process.env.NODE_ENV !== "production") {
  alert("Hello!");
}

Create a development build. This should be the content:

webpackJsonp([1],{10:function(n,o,e){n.exports=e(11)},11:function(n,o,e){alert("Hello!")}},[10]);
//# sourceMappingURL=app.fc8e58739d91fe5afee6.js.map

As you can see, there is not even an if statement left, but let's move along. Make sure that NODE_ENV is set to "production". You can do it with:

module.exports = {
  plugins: [
    // make sure that NODE_ENV="production" during the build
    new webpack.EnvironmentPlugin(["NODE_ENV"]),
  ],
};

// or

module.exports = {
  plugins: [
    new webpack.DefinePlugin({
      "process.env": {
        NODE_ENV: JSON.stringify("production"),
      },
    }),
  ],
};

Now this is the production build:

webpackJsonp([1],{10:function(n,o,c){n.exports=c(11)},11:function(n,o,c){}},[10]);
//# sourceMappingURL=app.3065b2840be2e08955ce.js.map

Here is how we get back to UglifyJS and dead code elimination. During the build, process.env.NODE_ENV is replaced with a string value. When UglifyJS sees

if ("development" !== "production")

it drops the always-true condition and keeps the body. The opposite happens when the minimizer comes across

if ("production" !== "production")

This condition is always false, so UglifyJS drops the dead code. It works the same with prop types!

Babel plugins optimizations

Babel opens new opportunities not only for writing cutting-edge JavaScript for production but also for a wide variety of optimizations. I have mentioned babel-plugin-lodash already; there is a lot more to explore.

You know that React does not check props in production. You may also notice that despite this optimization, prop types are still present in the components' code. It is dead code. I know that, you know that, but UglifyJS does not know that. You can use babel-plugin-transform-react-remove-prop-types to get rid of those calls.

{
  "presets": [["es2015", { "modules": false }], "stage-2", "react"],
  "env": {
    "production": {
      "plugins": ["transform-react-remove-prop-types"]
    }
  }
}

Wrap up

I have gone through a few possible optimizations but did not mention one of the most important things in the entire process. Whatever optimization you are applying, make sure to extensively test and measure the outcome. There are a few good tools which can help you do that, e.g. BundleAnalyzerPlugin. If you prefer to do it the old way:

tar -czf build.tar.gz build && du -k build.tar.gz
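And if you prefer the interactive report, a minimal BundleAnalyzerPlugin setup might look like this (assuming the webpack-bundle-analyzer package):

const { BundleAnalyzerPlugin } = require("webpack-bundle-analyzer");

module.exports = {
  plugins: [
    // writes a static report.html next to the bundle instead of starting a local server
    new BundleAnalyzerPlugin({ analyzerMode: "static" }),
  ],
};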

Minimizing the data users download is only the first step to improving performance. Stay tuned, and if you like the article, follow me on twitter to learn more performance optimization techniques!
