Back to Basics: A Possible Future Without JavaScript Build Tools

As someone who spends the majority of their time working in a modern front-end JavaScript tech stack, I may surprise you by admitting that I'm often disillusioned by the rate of churn in our development workflows. It's not the near-daily introduction of new libraries; in fact, I find that perpetual inventiveness inspiring. It's that we push ourselves further and further from anything resembling a shared baseline of knowledge. With each new paradigm we layer into our stack, we too easily overlook its broader human impact: a barrier to entry, a prerequisite of knowledge unique to our project.

I've worked in this space long enough to recall something of an explosion of tooling and modernized development workflows some five to seven years ago: a messy, beautiful, exciting advent of new JavaScript syntaxes, shared package libraries, and bundlers to bring it all together. At the time, we treated these bundlers and transpilers as necessary evils, since the platforms we build for lagged behind the bleeding edge. But as the years have progressed, isn't it time we stepped back and reevaluated the status quo?

Recently, I undertook a new project to be distributed as a browser extension for Chrome and Firefox. Knowing that my target audience was limited to users of modern evergreen browsers, I was curious to use this as an opportunity to explore how far the native platform has advanced, and to peel back the layers of abstraction we have come to see as necessary. Could I still achieve my goals?

What's wrong with build tools?

Aside from the thought experiment for its own sake, it's fair to ask why any of this matters. While I'm sure anyone who has spent time in front-end development can attest to their own experiences of how these tools impact our workflows, at a high level they raise the barrier to entry for newcomers, add a maintenance burden of their own, and put distance between the code we write and the code we ultimately debug in the browser.

Modules: A brief history

Back when JavaScript web apps began to take off, the options for organizing one's code were limited. For the most part, developers were constrained to a single JavaScript file. While some patterns emerged to try to rein in the chaos, more often than not these files would devolve into many thousands of lines of "spaghetti code". To break our code down into independent logical groupings, and to define dependencies between them, new tools like RequireJS emerged to help organize our browser applications.

In parallel, Node.js was continuing to grow in popularity. Node had adopted the CommonJS module system as a simple means of defining dependencies between files in a project and of reusing shared libraries published to npm. Tools like Browserify emerged to bridge the gap between Node.js and the browser.

But in the meantime, the language itself has evolved to describe a common pattern for declaring and consuming modules. You may already be accustomed to writing in this syntax:

import React from 'react';

One responsibility of the bundling tools we've used over the past several years has been to convert that syntax into the single bundled JavaScript file(s) browsers once required us to ship.

And yet, times have changed. All modern browsers have supported this module syntax for almost two years. And just days ago, with its v13.2.0 release, Node.js removed the experimental flag from ES modules, marking the first stable release in which you can choose to author your Node.js code using standard ES module semantics.
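As a minimal sketch of what this looks like in the browser (the file names here are my own), a single script tag with type="module" is enough to load an entire module graph:

<!-- index.html -->
<script type="module" src="./main.js"></script>

// math.js
export function double( number ) {
	return number * 2;
}

// main.js
import { double } from './math.js';

console.log( double( 21 ) ); // Logs: 42

No transpiler, no bundler: the browser fetches math.js on its own because main.js declares the dependency.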

So as more organizations begin to drop support for Internet Explorer 11, you should consider whether native modules are a real option for organizing your browser code.

Managing dependencies with Pika, a different sort of tool

Of course, even if you choose to adopt the modules syntax into your own code, this still leaves open the question of how you might pull in shared code, most notably common modules published to npm.

There's nothing to stop you from importing directly from a URL, such as that of a CDN where the module is hosted. In fact, I think this is great for prototyping, and I'm thrilled to see some projects starting to recommend this option in their quick-start guides. (Aside: while hardly a secret, I think part of the success of projects like Vue.js can be attributed to demonstrating the benefits of their approach in as few up-front steps as possible.)
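For example, a prototype can bootstrap itself with nothing but a module script and a CDN URL (the specific URL and version below are only illustrative):

// main.js — loaded with <script type="module" src="./main.js"></script>
// Pulls Preact straight from a CDN; no install or build step required.
import { h, render } from 'https://unpkg.com/preact@10.0.0/dist/preact.module.js';

render( h( 'h1', null, 'Hello from a CDN import' ), document.body );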

If you'd rather avoid relying on a third-party network, you could instead download these files locally into your project. But depending on how many dependencies you use, this can become difficult to manage over time. In fact, this tedium was one of the original reasons a project like Bower was created: even then, it was a pain point to maintain dependencies on jQuery, Modernizr, and the like, few as they were.

And, unfortunately, while we can use npm in our front-end projects to download dependencies locally, the files as they exist in node_modules are often not formatted in a way that can be used directly in the browser.

For this reason, I was happy to find a tool like Pika Web (published to npm as @pika/web). Unlike most build tools, it runs at install time, meaning you'd typically run it once for each dependency you add and then never think about it again. Under the hood, Pika pre-applies the necessary transforms to ensure the code can run in the browser.

With a tool like Pika, you can download the dependencies you want to use into your local environment with the assurance that they will run in the browser.
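In my understanding, the workflow looks roughly like this: run npm install preact as usual, then npx @pika/web, and a browser-ready copy of each dependency is written to a web_modules directory which your own modules import by relative path (the path below assumes that default output location):

// main.js
// Import the single-file ES module that @pika/web generated from node_modules.
import { h, render } from './web_modules/preact.js';

render( h( 'p', null, 'Installed once, imported natively' ), document.body );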

Is there a future without any of these tools?

I'm sure there is a palpable irony in promoting a tool like Pika in a post otherwise devoted to discouraging tooling. As more projects begin to distribute ES modules directly, the problems a tool like Pika addresses become fewer and fewer. That said, there are still open questions about how we define requirements on the platforms our code will run in, and about how our dependencies declare their own dependencies.

Consider a project like Preact, which is designed to work quite well out of the box using ES modules. Even so, if you inspect its module code, you can see how an import from 'preact' is impossible for a browser to resolve, because the bare "preact" specifier has no intrinsic meaning outside of npm. And unfortunately, as consumers of this code we currently have no control over how that specifier is resolved.
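To make the problem concrete, compare the two imports below: the first relies on a bare specifier that only a bundler (or Node) knows how to resolve, while the second is a URL the browser can actually fetch (the web_modules path assumes a tool like Pika has already produced a browser-ready copy):

// In a browser, this fails: "preact" is a bare specifier with no URL meaning.
// import { h } from 'preact';

// This works: the browser resolves the relative URL like any other fetch.
import { h } from './web_modules/preact.js';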

Thankfully, this is an area of active development in the form of the import maps specification. The "basic idea" section of that page gives a sense of how we might resolve these named imports in the future, by controlling how they map to remote or local files:

<script type="importmap">
	{
		"imports": {
			"lodash": "/node\_modules/lodash-es/lodash.js"
		}
	}
</script>

<script type="module">
	import {uniq} from 'lodash'; const nums = uniq( \[ 2, 1, 2 \] );
</script>

With this, you can imagine how there might be a future where the dependencies we install using npm are referenced directly in our browser code.

What's the consensus?

In my browser extension project, I was happy to find how few obstacles there were to using modules directly. The code I wrote closely resembled what I might write in an environment rife with build tools, including third-party dependencies like React, a JSX-like syntax, and TypeScript-like type checking (see example). It was a joy to simply open my editor and start coding, knowing that I was only a page refresh away from seeing my latest changes, and that the code I was debugging in the Chrome debugger was the exact code I had written.
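As a rough sketch of that kind of combination (htm bound to Preact and JSDoc annotations checked via // @ts-check stand in here as illustrative choices, as do the web_modules paths):

// @ts-check
// main.js — JSX-like templating without a compile step, plus editor type checking.
import { h, render } from './web_modules/preact.js';
import htm from './web_modules/htm.js';

// Bind htm's tagged-template function to Preact's h() for JSX-like syntax.
const html = htm.bind( h );

/**
 * @param {{ name: string }} props
 */
function Greeting( { name } ) {
	return html`<h1>Hello, ${ name }!</h1>`;
}

render( html`<${ Greeting } name="world" />`, document.body );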

Are there downsides? For my purposes, I'd say few to none. There are trade-offs in a not-quite-JSX syntax, or in choosing JavaScript-only type checking. But I found that these trade-offs often challenged me to consider whether I needed these things at all. In retrospect, I probably would have been fine avoiding JSX altogether, since the "raw" form is not all that difficult to work with. TypeScript is great, but I can get along nearly as well with JavaScript alone.

I also recognize that I was privileged to be able to target the latest versions of Firefox and Chrome, and that this experience doesn't translate to all web projects. But the state of browsers in 2019 is much better than it was in 2014, in that we can rely on "evergreen" browser auto-updates to take advantage of new language features. And while the JavaScript standard continues to add new features year over year, these revisions haven't been nearly as dramatic as the first transition from ES5 to ES2015.

So, for your next project, I would encourage you to consider ways to subtract from your layers of tooling as eagerly as you consider bringing in the newest and greatest. I sense that you might be surprised to find how capable the native environments can be.