Recently I came across this post saying that CommonJS is hurting the JavaScript world. And while I do agree that the CommonJS specification was not a good one, I also disagree with a lot of the article. As someone who has strong opinions on the subject, I decided to write a post about it.
So, here’s the problem: suppose you have a legacy system that’s years old and that is basically modular, meaning that people write extensions, or libraries, for it – like, for example, NodeJS’ CommonJS. If that legacy system is flawed and you need to change it somehow, and your solution is “let’s do something that is better in every way than the older model, it’s faster, etc, but also completely incompatible with the old stuff” – meaning that we have to rewrite everything to be usable in the new format – then you’re doing it wrong.
And this happens in the JavaScript ecosystem all the time. The new, shiny thing is completely incompatible with everything that was ever written, and requires you to rewrite everything, down to the last dependency, to conform to the new standard.
Now, imagine if the same behavior was applied to basically anything else. Say you buy a new electronic device for your home, and it only accepts 145 volts. And you’re like, “that’s not a thing. We either have 110/120 or 220/240 volts”. And here’s the problem: the manufacturer will say, “well, that’s the new standard. Now every electronic device, and everything that uses power, will be 145, because the new standard is better, safer, and uses less power”.
The thing is – it doesn’t matter if it’s better, because nothing in your house conforms to it. So you’ll have to use some kind of adapter for everything, one that converts between whatever voltage you had and the new standard, and as soon as your house adopts the new standard you will still need an adapter for the old stuff.
Now you see how absurd that behavior is in the JavaScript world. Actually, JavaScript is worse than that – because not only do they adopt a new standard, they make the new standard completely incompatible with the older one, to the point that there’s no “adapter”, no “I can use the old thing in new code” – and that’s where the problem resides: as soon as the new standard is incompatible with the older one, but the older one is compatible with the newer one (CommonJS can be used from ESM, but not the opposite), that immediately translates to “people will not migrate to the new standard”.
How to make a standard that people will adopt
So, in this specific case of ESM versus CommonJS, how would I, specifically, draft this new standard? Honestly, by adapting things – if you know that CommonJS is incompatible with ESM, put something in the Javascript standard and force implementations to adopt it. In this specific case, if you’re importing a module, and that module tries to import something that was written in CommonJS, make an exception in the interpreter and let it become “async”, but avoid processing new events until that one `require` finishes loading the file it was supposed to require (basically, simulating a synchronous require). Yes, it’s ugly, and yes, it requires changes to the JS virtual machine, changes that I am sure are not easy to implement. You could even say “we’ll support this behavior for 10 years, then we’ll remove it completely”; but by doing this, you’re giving people years to migrate to the new standard, and it’s actually easier to use the new standard, because it’s backwards compatible and requires no change to legacy code – not harder, or, as in this specific ESM case, literally impossible.
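Today, the asymmetry goes the other way: an ESM file can import CommonJS directly, but a CommonJS file cannot synchronously `require()` an ES module in most Node versions to date (it throws `ERR_REQUIRE_ESM`). The only escape hatch is dynamic `import()`, which forces the calling code to become async – roughly the opposite of the simulated-synchronous behavior proposed above. A minimal sketch (the `data:` URL stands in for a hypothetical ESM file, a trick Node’s ESM loader supports):

```javascript
// From CommonJS code, the only way to load an ES module is dynamic
// import(), which is asynchronous. The data: URL below stands in for
// a hypothetical .mjs file on disk.
async function loadEsmFromCjs() {
  const mod = await import('data:text/javascript,export default 42');
  return mod.default;
}

loadEsmFromCjs().then((value) => {
  console.log(value); // 42
});
```

So every CommonJS call site that touches an ESM dependency has to be rewritten into async code – which is exactly the migration burden the post is complaining about.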
Another issue is that the original post claims CommonJS was invented because of the needs of server-side Javascript, when in reality people had been asking for ways to split their JS files into different modules for ages – in fact, JS first appeared in 1995, and the first implementation of `require` and friends came in 2009.
Let that sink in – the only language we can use to write scripts and code in the browser did not have a way to separate modules, libraries, etc into different directories for fourteen years! Then somebody did it, and that was basically ignored for another six years, until we finally got the official standard – one that is basically incompatible with everything people had ever written.
Also, you don’t need to believe me when I say that people were asking for modules for ages – just look at the first release of RequireJS, a library that implements “modules” in Javascript (the only difference being that you can `require` code from it; not ideal, but easier to migrate to than ESM – which is wild: something in user-space integrating better with the ecosystem than a feature in the core of the language). Its first release dates from 2010 – yes, five years before ESM landed in Javascript.
If that’s not a huge facepalm, I don’t know what is – you literally have two different standards for separating code into modules (well, actually three, if you count Babel + Webpack/Browserify/Parcel/ESBuild/Microbundle/… as a different way of handling modules) and you create a third (or fourth) one – and expect people to migrate everything to it, by offering…
… well, nothing. No “auto conversion” tool, no “compatibility layer”, literally nothing.
No build steps
The author also mentions a future where we won’t need a build step, and… this:
as the standard and the focus shifting towards cloud primitives — the edge, browsers, and serverless compute — and CommonJS simply doesn’t cut it
I… really don’t have words. It’s so… weird that I don’t even know where to start, but I’ll try anyway. First – lots of new frameworks require a build step by default: SolidJS, Svelte, Vue, React… But let’s ignore that, and imagine that we can pay the performance hit of not using a build step (in the case of Svelte and SolidJS), and we decide not to use JSX, not to use any library that emits JSX, and to migrate everything to `h('div', {className: "foo"}, [...])`, and that we somehow normalize this (wow, that’s already a lot to assume we would compromise on, right? It shows how absurd the argument is); now, let’s install React…
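To make the compromise concrete: JSX is just sugar for nested function calls, so “no build step” means writing those calls by hand. A minimal hyperscript-style sketch (a stand-in for illustration, not React’s actual `createElement` implementation):

```javascript
// A minimal hyperscript-style h(): it just records the element type,
// props, and children as a plain object tree (what JSX compiles into).
function h(type, props, children) {
  return { type, props: props || {}, children: children || [] };
}

// What <div className="foo"><span>hello</span></div> becomes by hand:
const tree = h('div', { className: 'foo' }, [
  h('span', null, ['hello']),
]);
```

Every component in the app, and every dependency that renders anything, would have to be written in this style.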
… and we have 460kb of code in our `node_modules`. But let’s say you also need react-dom – that makes things go up to 5.0mb. Adding Redux makes it 6.3mb, and so on. Chlorine, my other side project, has a `node_modules` that contains 108mb! When I bundle things, everything becomes 12mb.
That’s almost a 90% reduction in size! But that’s not all – I bundle things without minifying, because this code is server-side and because Node.JS is a horrible runtime where the stacktraces sometimes show the wrong file and line of my code when using source-maps (and the source-maps end up bigger anyway than just not minifying things). If I do minify my code, the actual size is 3.1mb!
So, the “focus shifting between edge, browser, serverless” (I actually don’t know what they meant by “edge”, but let’s hope it’s not the browser from a specific company) will not change much – you don’t deploy a `node_modules` in a webpage, because it’s simply too big; you don’t deploy a `node_modules` in a serverless environment, because, again, it’s too big, deployment is more complicated, and AWS actually asks you to limit the size of your lambda code, for example; finally, some people pointed out that tree-shaking is easier with ESM because the compiler can detect all `import` clauses, and that ESM doesn’t allow for “conditional requires”…
…doesn’t it? Look at this code fragment and tell me what you see. The standard does allow for conditional, client-side, user-space loading. In fact, it needs to – because ESM loading is asynchronous, you need a way to “await” for something to load before continuing, if you do some “side-effect” load (it shouldn’t be needed, but JS is JS – people do all kinds of things in the language)…
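The kind of conditional, user-space loading ESM allows looks roughly like this sketch – dynamic `import()` chosen at runtime (the `data:` URLs stand in for hypothetical module files; in a real ES module you would typically use top-level `await` on the result):

```javascript
// Conditional loading with dynamic import(): which module gets loaded
// is decided at runtime, exactly like a "conditional require".
// The data: URLs below stand in for hypothetical files on disk.
async function loadBackend(useLegacy) {
  return useLegacy
    ? import('data:text/javascript,export const name = "legacy"')
    : import('data:text/javascript,export const name = "modern"');
}

loadBackend(false).then((mod) => {
  console.log(mod.name); // "modern"
});
```

So the “ESM forbids conditional requires” claim only holds for static `import` declarations, not for the language as a whole.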
So much time lost!
I’m saying that as someone who’s working on Pulsar, a side project that’s a fork of Atom – a fork that is a compatibility nightmare: not only do we have to deal with incompatibilities between Node.JS versions, we also need to handle Electron incompatibilities between versions, native module incompatibilities across Node.JS and Electron versions (and operating systems), web standards that changed a lot (and that packages depend on), etc.
You know what I would love to be doing? Making Pulsar better, faster, more reliable; making it handle big files better; experimenting with new libraries and data structures to speed things up; adding more UI elements and more LSP support; making the best editor I could possibly make (and want). You know what I am actually doing? Converting code to the new standard, sometimes to the point of slowing things down – because the new standard comes with a performance hit – for the exact same feature I had originally.
I have to run as fast as I can just to stay in the same place… and I honestly can’t be bothered to run “twice as fast” when this problem only exists in the JS world.