Upgrade your Lerna Workspace - Make it Fast and Modern!

Juri Strumpflohner · Oct 28 '22 · Dev Community

Is your Lerna workspace slow? Does it feel old, and are you looking for alternatives? Keep reading, because all it might take to get a fast and modern experience is upgrading your lerna package.

Lerna has come a long way:

  • created 6 years ago to solve the specific problem of managing the Babel repo packages
  • then became the de-facto tool for managing & publishing packages in JS-based monorepos
  • got declared "not actively maintained" in March 2022
  • got revived again in May 2022 by passing on stewardship to the Nx team.

Since May 2022, a lot has changed! Lerna became fast & modern, getting features you'd expect from a 2022 monorepo management tool. Version 6 just got out a couple of weeks ago. If you missed that, check out my blog post "Lerna Reborn - What's new in v6", which goes into the details of all the cool features that were part of that release.


Prefer a video version of this post instead? I've got you covered.


Step 1: Still using "old Lerna"? Upgrade!

Upgrading your Lerna workspace should be as easy as installing the latest lerna npm package.

The team bumped the major version to be cautious and to clearly communicate the possibility of breaking changes. Most workspaces won't experience any, though.
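In practice, the upgrade can be sketched like this (assuming npm; use the equivalent for your package manager of choice):

```shell
# install the latest Lerna version as a dev dependency
npm install --save-dev lerna@latest

# optionally, let Lerna migrate outdated configuration
# (available in recent Lerna versions)
npx lerna repair
```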

Step 2: Configure Caching & Task Pipeline

To get the most out of your Lerna workspace, enable caching. You can do that by creating an nx.json file and defining the cacheable operations. In addition, you can specify which tasks need to run in topological order (e.g. due to dependencies between build tasks).

The easiest way to set this up in a mostly automated fashion is to run the following command:

npx lerna add-caching

This generates a new nx.json, which can be fine-tuned further.

// nx.json
{
  "tasksRunnerOptions": {
    "default": {
      "runner": "nx/tasks-runners/default",
      "options": {
        "cacheableOperations": [
          "build",
          "test"
        ]
      }
    }
  },
  "targetDefaults": {
    "build": {
      "dependsOn": [
        "^build"
      ],
      "outputs": [
        "{projectRoot}/build"
      ]
    }
  }
}
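With this in place, a repeated run can be served from the cache. A quick way to see it in action (the exact output wording may vary between versions):

```shell
# first run: executes the build for all packages
npx lerna run build

# second run: should be replayed from the local cache, near-instantly
npx lerna run build
```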

Step 3: Fine-tune Cache Inputs

When it comes to caching, you can go even a step further. On each run, the caching mechanism computes a hash of all the project's files and uses that hash to query the cache for an existing run.

You might want to fine-tune that, though. Example: you don't want to invalidate your test task's cache just because you updated a package's README.md. To avoid that, you can define inputs for your operations:

// nx.json
{
  ...,
  "targetDefaults": {
    ...
    "test": {
      "inputs": [
          "{projectRoot}/**/*", 
          "!{projectRoot}/**/*.md"
      ]
    }
  }
}

This simple configuration first includes all project files by default, and then, with a second glob pattern, excludes Markdown files in particular (notice the leading !).

You can optimize these expressions and even reuse them across different targets. For more details, consult the Lerna docs.
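As a sketch of such reuse, glob patterns can be declared once under namedInputs and then referenced from multiple targets (the names default and noMarkdown here are purely illustrative):

```json
// nx.json
{
  "namedInputs": {
    "default": ["{projectRoot}/**/*"],
    "noMarkdown": ["!{projectRoot}/**/*.md"]
  },
  "targetDefaults": {
    "build": { "inputs": ["default", "noMarkdown"] },
    "test": { "inputs": ["default", "noMarkdown"] }
  }
}
```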

Step 4: Add Remote Caching

The real benefit of caching comes when you distribute it and share it with co-workers and your CI. Leveraging previously cached runs (especially on CI) results in huge time savings.

Since Lerna uses Nx as a task runner, the remote cache setup is based on Nx Cloud.

Running the following command sets up your workspace:

npx nx connect-to-nx-cloud

Important! Remote Caching on Nx Cloud is completely free for open source projects! Reach out!

More on this in the Lerna docs.

Running into Roadblocks when Upgrading?

Join the Nrwl Community Slack, in particular the #lerna channel, to ask your questions and get help.


Learn more

Also, if you liked this, click the ❤️ and make sure to follow Juri and Lerna on Twitter for more!

#nx
