As a developer who has been in this ecosystem so long that it makes me shrug to think about it, I loved the humble writing style, and I loved that she could still teach me a thing or two.
Being able to say "I don't know how this works" requires a lot of experience too, not just confidence.
Especially in an interview, saying "I don't know anything about what you asked" can, depending on the requirements, end the interview, but it can also earn the candidate a lot of points.
All the (great) recommendations in this comment section are a good example of how the js ecosystem works: there are so many great new technologies available for building apps, but standardization and discoverability are far from solved.
There are new ESM-based CDNs like Skypack (similar things have been around since the browserify days), and awesome new build tools like esbuild and swc that are faster and/or simpler than webpack. New language features have been standardized that reduce the need for various kinds of libraries, etc.
For anyone who doesn't spend an unreasonable amount of time following these developments, discovering all these cool new things is very tough.
On the opposite side: as the developer of a library I think is really useful, I have no idea where to start when it comes to helping people find it.
What a fantastic post! I rarely do frontend web development. Perhaps only once every year or two. I don't follow frontend development much, and every single time I need to write some javascript the preferred tech stack of the day is completely different. There are so many tools, so many different ways of doing things. It's completely overwhelming.
So I end up doing the same thing the author was doing: just use <script> tags. If I need a library I try to download a minified version of it and put it in a script tag. This is usually good enough for me as I typically use JS for very small things. “pretend it’s 2005”. It works and if something breaks I know how to solve it.
For the kind of JS projects that I work on, learning the tooling can easily take 3X the time I need to write the project itself. I know that there are several meta-tools that will automatically configure all other tools for you, but I really dislike them. They usually end up downloading half of the entire internet, and when you move an inch from their preferred defaults everything becomes a nightmare.
Esbuild seems like the perfect tool for people like me. It's simple, fast and it allows me to import things in JS! I'm definitely going to look at it next time I'm writing some frontend code.
> problem 1: libraries that tell you to npm install them
Agreed. I wish `npm install` were just one option for those who wanted it. But no! Somehow it became the de facto standard, and everyone just expects everyone else to already be on board.
I got on board reluctantly, because that's the only way to use the libraries I cared about.
> problem 2: I don’t understand frontend build tools
Absolute and total agreement.
I can't agree enough.
I think no one really understands them.
> “webpack” which I will not try to explain much because I don’t understand it, I think it’s a system with a million plugins that does a million things
I haven't worked with anyone who understood webpack. They just use a scaffold-builder to set it up in a pre-configured way and then you just never touch it.
> Recently I learned about esbuild which is a more Unix-style tool.
esbuild is the best thing that happened to frontend development. It finally freed us (well, me anyway) from the messy js build systems.
But I'd like to strongly object to describing it as a "unix-style" tool. Unix style tools to me are tools that deal with plain text and expect you to pipe them with other tools to get the job done. esbuild is nothing like that. It's the entire build pipeline in one tool. You don't work with it by piping its inputs and outputs to other tools. It provides you a programming interface and you write code to make it do what you want.
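To make the "programming interface" point concrete, here is a sketch of what such a build script can look like, using esbuild's documented JS API (the entry point and output paths are illustrative, and `esbuild` has to be installed from npm first):

```javascript
// A minimal esbuild build script: instead of a config file, you call
// build() with an options object from ordinary code.
let esbuild;
try {
  esbuild = require('esbuild');
} catch {
  esbuild = null; // not installed; skip the actual build below
}

const options = {
  entryPoints: ['src/app.js'], // where the app starts
  bundle: true,                // follow imports and inline them into one file
  minify: true,
  sourcemap: true,
  outfile: 'dist/app.js',
};

if (esbuild) {
  esbuild.build(options).catch(() => process.exit(1));
}
```

Because it's just a Node script, any extra steps (copying assets, cache-busting, environment switches) are plain JS around that one call rather than plugin configuration.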
EDIT:
To clarify, I think it's a very good thing that esbuild is not a unix-style tool. I know the common wisdom on HN is in favor of the unix philosophy, but I don't buy into that.
Unix tools should do one thing well on the CLI, and be transparent enough that a complex process can be deconstructed into a series of commands. Piping is not always practical (see make).
> Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
But as you can see, esbuild is nothing like that. It doesn't expect itself to take input from other programs, nor does it expect its output to be piped to other programs.
Another formulation:
- Write programs that do one thing and do it well.
- Write programs to work together.
- Write programs to handle text streams, because that is a universal interface.
> “webpack” which I will not try to explain much because I don’t understand it, I think it’s a system with a million plugins that does a million things
Is webpack really considered such a difficult beast? I was an early adopter of TypeScript so I think I started with grunt, then gulp, then finally webpack 3, 4 and now 5.
I mean, it's not my main job or anything, and I don't really enjoy configuring things, but spending a handful of days reading the docs once a year or so is not really a problem. I have a couple of files, one for local and one for production, about 50-70 lines each, and considering they handle compiling TypeScript, linting, minifying, SCSS, testing and so on... I find I get a lot of bang for my buck. It's not like the previous tooling was really any easier to deal with.
The production build takes about 50 seconds, and the local build takes about 3 seconds. It's honestly the least of my problems.
> Is webpack really considered such a difficult beast
For whatever it's worth, I'm the sort of person who writes makefiles for fun, and I regularly work with the build systems of go, C, rust, java, ruby, python, and javascript.
webpack is easily the most complicated, slow, frightening build system I have ever encountered and I will go far out of my way to avoid it.
TIL. It's not really something I can claim to be an expert on, although I am really surprised by where others set the bar. Hopefully I will be pleasantly surprised when I try some new tooling in the future!
It's also completely possible that I had my brain warped in weird ways by old software, and you'll have your brain warped in your own weird ways that work better in the future. Anyway, I know I'm always glad to have somebody who knows their way around webpack on the team :)
> It's not like the previous tooling was really any easier to deal with.
If you really understood the previous tooling, that probably explains why you have the mental models in place to understand what webpack is doing with increasing automation. Say, if you were already using babel without webpack.
For people who skipped that or came to JS in the webpack era (myself included), it's hard to understand what is going on. The universe of what webpack does seems inconsistent and random, and involves concepts underlying concepts underlying concepts, where it seems like you need to understand the mysterious base to understand what the top-level tool is doing, before forgetting it and letting the tool do it for you.
But webpack isn't just insane; it is what it is for a reason, solving specific problems that existed in previous ways of doing things. If you had those problems personally... it's a lot easier to make sense of it.
I assume/sense. I don't know this from personal experience with webpack... but it's a familiar sort of situation in software development technology.
> But webpack isn't just insane; it is what it is for a reason, solving specific problems that existed in previous ways of doing things. If you had those problems personally... it's a lot easier to make sense of it.
I think this is a big issue for non-js / frontend people, particularly in the python webdev community.
People don't know the story of the front end and the user demand for refined interfaces. The demand has been high even though browsers only recently offered consistent support for modern js / css features.
I suspect this will come back to bite backend people to some extent, because the dust is settling on frontend. So the anti-everything-frontend attitude might blind people to the rapid advance of JS and node.
At two places I’ve worked, we had performance issues with webpack where builds would take tens of minutes. Every frontend guru would come in with their “oh, it’s probably X”, spend half a day debugging, and then give up.
Moreover, webpack does all sorts of things I don’t care about: support for a dozen different module systems, javascript versions, polyfills, scss, JSX, etc etc. I just want to issue a few requests and update the DOM.
I guess I’m spoiled by Go and Rust where I can basically just throw a list of dependencies at the build tool and it spits out an executable (almost immediately in the case of go).
The problem with webpack, besides being very slow, is that it's based on config files.
Config files are brittle and very hard to understand.
Spending several days sifting through the webpack docs and maintaining 70 lines of config files sounds like a nightmare.
Now, for esbuild, I have more lines than that in the build script, but the thing is, it's just code. The configuration part is about 10 lines. The rest is just programming logic.
Navigating documentation to understand what parameters something accepts is ok.
Spending time editing code is ok. It's what I do most of the time anyway.
The reason I moved away from gulp (v3?) at the time was that I considered configuration (especially configuration with intellisense) really simple, as opposed to the god-awful pipeable javascript files gulp went for.
Not sure how much that plays to my strengths, though, because everything I have ever used has had json or object-based configuration.
Anyway, I am definitely going to kick the tires on esbuild, but I will wait a while, so thanks for the additional insight.
I never understood the point of gulp and other libraries that claim to help you write "commands". I just never used them and instead just wrote the code directly.
Back in the day I was using amd.js, and it had a scriptable command line utility, so I just wrote a javascript file that collected all the required inputs and built up the build command (or something like that... I don't remember the details).
> I have a couple of files, one for local one for production. About 50-70 lines each
I don't think you realize how hellish this sounds. I haven't had >100 lines of build configuration in any other environment since my configure.in/Makefile.am days, and even that was more transferable to other tools than Webpack, which is only good for Webpacking.
Modern webpack is also much simpler than when it started, as they’ve followed their zero-config northstar (#0CJS) for a while now. Many simple projects can literally run without a config file at all. The config file only grows in relation to the complexity you want to take on, like adding Typescript or wanting to run WebWorkers, but that seems fair? It’s not like you can just start adding Kotlin or Scala to a Java project without changing your setup.
And there are plenty of higher-level tools than webpack that are pre-configured to handle most of your variations anyway (eg CRA) and are a much better starting point if you don’t need to fine tune your build or do complicated/exotic things (which isn’t a breeze in any language I’ve experienced).
What would be the alternative in today's projects? TypeScript and SCSS need to be compiled; production builds are minified with source maps while local builds aren't; then people want browserslist support, SCSS linting and so on. I don't see how configuration can be escaped, but I would be happy to hear a path forward if there is one.
Oh no, I agree it's absolutely too late for existing projects. Good sense needed to prevail at least 4-5 years ago, but who wanted to deal with being seen as "elitist" or "gatekeeping" for daring to suggest the entire ecosystem made no sense?
Webpack is definitely a difficult beast because it's sort of an everything tool. It's supposed to be just a bundler but because it supports plugins people use it as a full build system. It doesn't really work that well as a full build system. It's also quite buggy in my experience, e.g. the multi-config feature doesn't really work at all.
I love reading about JS in a context where I’m not expected to understand everything about every single obscure abbreviation from the past 15 years. Great post.
Great read, kudos for using strace to understand what esbuild does! I should do that more often.
This is another reason to favor small and simple tools that do one thing: you can “take it apart” and understand it by using another simple tool. (Simple in a relative sense, e.g. compared to Webpack).
By the way for those who don’t know what strace is, it’s a command line utility that prints out system calls of another process, like opening files or network connections.
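As a sketch of how that works (assuming a Linux box with strace installed; the traced command here is just a stand-in, you would point it at your build tool instead):

```shell
# Trace the file-open syscalls of a command: -f follows child
# processes, -e filters to openat, -o writes the (verbose) log
# to a file instead of cluttering the terminal.
# (Swap `cat /etc/hostname` for e.g. `./esbuild app.js --bundle`.)
strace -f -e trace=openat -o trace.log cat /etc/hostname

# Then see which files were actually touched:
grep openat trace.log | head
```

This is what makes it possible to answer questions like "which node_modules files does my bundler actually read?" without reading the bundler's source.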
Esbuild is great. That said, I find its one major weakness is the inability to alias/dedupe libraries, at least in a pnpm monorepo. Even if it's the same version and, I assume, the same link, it will bundle the library twice, and you can get some pretty obscure errors when you run your app.
I did try various setups but none worked. With Vite at least I was able to alias libraries to same import thus avoiding importing them twice/x times.
But I'd kinda prefer using esbuild, as Vite can get funky too. In particular, I've noticed that if you build a local library with multiple chunks (eg just using tsc), the HMR will get out of hand. When I just tried it, it basically seemed to go on indefinitely. Maybe there's a config for that, but it's not nice, I can tell you that.
I've recently been helping a friend who is starting with Node.js, and I strongly recommended that he (and anyone else getting started now) use `import` instead of `require`.
I would say it takes a bit more work, because the tutorials are normally written using CommonJS (require), but otherwise it's just a different kind of work. Both of them have their peculiarities and issues, and CommonJS might also bite you big time. So if you have to choose, and both give you a bit of pain (for now), I'd suggest choosing the one that is more future-proof.
About the article on Esbuild and ESM vs CommonJS, some other personal notes:
- Node.js' `import` is more strict than frontend tools' `import`. You can do `import myfile from './myfile'` on the front-end, but you need to spell out the ".js" on the backend.
- You can no longer use __dirname with ES6 modules. That's a feature, and I find that I normally have enough control over where the script is run from that I just prefer to write relative paths instead of bothering to shim it.
- The esbuild setup from the author is great, and I use something similar for the simpler projects and libraries I develop. However, for building WebApps, I prefer to also have Hot Module Reload. It's like Prettier: at the beginning it feels unnatural and overbearing, but after you get used to it you realize what a huge improvement it is over the way you worked before.
FWIW, you can replace __dirname with information from import.meta (and 2 lines of JS). It’s not quite as automagic but can yield the exact same info. You’re also right that it might just be a code-smell in general though.
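Those two lines look roughly like this (a sketch; this must live in an ES module, since `import.meta` doesn't exist in CommonJS):

```javascript
// Recreating __dirname inside an ES module from import.meta.url,
// using only Node's stdlib.
import { fileURLToPath } from 'node:url';
import path from 'node:path';

const __dirname = path.dirname(fileURLToPath(import.meta.url));
console.log(__dirname); // the directory containing this module
```

`fileURLToPath` (rather than hand-slicing the URL string) matters, as the sibling comment about "%20" illustrates.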
Yeah, I hit 2 layers of deception there. You think it might be those 2-3 lines from the popular StackOverflow answer, but after hunting a bug for a while I realized it was url-encoding the space in the path name and saying "directory doesn't exist" when the path in the console looked obviously correct.
Drove me crazy until I noticed that early in the full path there was a single "%20", and my JS senses told me it doesn't make sense for spaces to be url-encoded in filesystem paths.
In the sense that some libraries have been updated to be ESM-only, so if you follow a tutorial to the letter and then try to require() one of those libraries, it'll fail with a cryptic error message (at least cryptic for new users).
I’ve gotten so tired of the near infinite complexity of the angular / react / webpack / typescript / sass projects at work that for my home projects I’m now exploring no build tool solutions. Modern browsers are now so powerful that a lot of the things we needed build tools for no longer apply:
- http/2 multiplexing has made it unnecessary to limit the number of script or css files loaded at once. Faster connections together with http/2 mean payload splitting is no longer necessary for js and css (which means everything can load up front from index.html). Gzipping removes the need for minification.
- Any npm package can be loaded via script tags or through direct es6 module import from unpkg.com. I use a single 'externals.js' which re-exports those dependencies, so I can just do "import { foo } from './externals.js'"
- sass is mostly unnecessary thanks to widespread css3 support, @import for modularity (see also http/2), bem notation for namespacing, and css variables for reuse
- es6 is supported in all browsers, so you can write modern code. ES6 modules “just work” if you import from a file path.
- vue and preact are designed to be used without build tools. An app deployed as a static site can use hash routing to need no builds and no server-side router.
Experimenting with this stuff has really made me wonder why we use all of these build tools.
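A minimal page in this style might look like the following sketch (the pinned unpkg URL and the tiny render call are illustrative, not recommendations):

```html
<!doctype html>
<html>
  <body>
    <script type="module">
      /* Version-pinned ESM import straight from a CDN -- no bundler,
         no build step; the browser resolves the module itself. */
      import { h, render } from 'https://unpkg.com/preact@10.19.3/dist/preact.module.js';

      render(h('h1', null, 'hello, no build tools'), document.body);
    </script>
  </body>
</html>
```

In practice you would route CDN imports through one `externals.js` file, as described above, so there is a single place to bump versions.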
I wrote a generic loader that's compatible with old school scripts, amd, node, and es imports that does just that. It's zero config and would be fully automated, except that given a local npm/yarn env, I still need some way of going from "foopkg" in "import foo from 'foopkg'" to node_modules/foopkg/entrypoint.js
How can you resolve a module name to its path without external dependencies?
Well, the libraries included at runtime from a cdn are still minified. It's only the app code that is gzipped and not minified, and it is only in the very largest projects that minifying the app's own (not library) code makes a meaningful difference. There is usually no point to it.
As for tree shaking: when using lightweight libraries meant for the no-build path, it isn't necessary, because those libraries aren't pulling in lots of dependencies. Preact is tiny and is a good enough react for my needs.
I wouldn’t recommend this approach for large web apps, but most web apps are small enough that a no build tools approach seems viable to me.
> - vue and preact are designed to be used without build tools. An app deployed as a static site can use hash routing to need no builds and no server-side router.
So is React. JSX and React are 2 different things. But honestly, React without JSX is a pain to write.
Minification and hashing for secure script embedding are still needed, but they could be provided directly by server middleware, depending on the language one uses.
Julia is one of my favorite examples of learning in public as a developer. Her technical chops are unquestioned so when she says things like "I have no idea what import does" it really drives home how much the JS community has to do to bring along the rest of the developer population who just wants to make web stuff occasionally without going to JS Build Tool University.
I love this and maybe it would be great for someone who works on the spec/bundlers to write a "Dear Julia" letter explaining things and giving context to specific parts on why things are the way they are, and pointing out clear ways that we as a community should try to improve.
I don't understand why people are holding JS to a higher standard than other languages, is it just because it used to be easier?
Knowing nothing about Java, I wouldn't expect to open a Java file and know what every line means. There's a million reasons to find modern web dev complicated but not taking the time to google "js import" is not one of them. Also I wouldn't expect to "just know" how to take a Java program and build it in a way that I could host it somewhere, I'd expect to have to work at it.
I think it's more that depending on your point in time, build process and environment, `import` has changed meaning every couple of years.
I consider myself having deeper knowledge about JS and node than any other language, having spent more total time in it than anything else (except maybe bash).
I have a clearer understanding of what import/require really does in most other languages.
The flow-chart complexity and possible outcomes of what `import` really does in JS/TS are certainly greater than anything else I've encountered.
(I do find go's repository-based approach quite frustrating to work with when forking dependencies but at least it's straight-forward)
I still find the whole gopath/gomodules thing unfortunate, but it is what it is at this point (and frankly that's a bit out of scope, since we're talking imports here... if we're talking about the state of packaging, then JS absolutely has competition; e.g. https://xkcd.com/1987/)
On top of this, a lot of the tooling is just shit. I don't just mean it makes design decisions I disagree with (which is also true of e.g. Maven, or Bundler) or that they're lacking in niceties (e.g. every C development workflow), I mean it's just absolutely poorly implemented CADT-except-they're-actually-25-and-overcapitalized trash.
Yarn blew NPM away on speed with more features and nicer developer workflow. Webpack is on major version 5 and still everyone just uses CRA rather than try to configure it by hand, but then nobody really knows how to debug CRA if something breaks. The entire babel stack is ridiculous. The dominant tools have just been so awful, for so long.
> Yarn blew NPM away on speed with more features and nicer developer workflow.
I dunno if it's still true, but for years after people started saying "LOL use yarn, it's a better replacement for NPM", it was still really easy to find packages that broke under Yarn because it lacked some feature or other, or because it skipped some obviously-a-good-idea sanity check that NPM did (and so crashed rather than adjusting its approach and proceeding). Venturing slightly off the happy path of "yarn install" meant running into features that yarn didn't support but NPM did. A trip into the issue tracker in that period was enlightening, and I don't just mean the sheer count: digging into some of the issues and why they were happening was, too.
[EDIT] in fact, in the agency I was at at the time, a kind of joke developed that a project wasn't fully underway until you'd been forced to replace Yarn with NPM.
> Within the actual language, there isn't a universal syntax just to use a package.
There is. The issue is one of mindset. You recognize Maven and Gradle as belonging to a sort of "parastandard" set of technology, offering proprietary (albeit free/open source) glimpses of how one could conceivably solve the set of problems that they're meant to be used for, but when it comes to JS, you're elevating the parastandard stuff to the level of being part of "JS".
Neither CommonJS nor NodeJS's `require` nor package.json nor TypeScript nor esbuild are part of JS. You shouldn't give them any privileged status that you aren't willing to give to Maven or Gradle when you think of Java.
I think you're misunderstanding what I'm saying. You're still talking about package managers and build systems. I'm talking only about syntax. TypeScript, NPM, esbuild, Gradle, and Maven are completely separate from what I'm talking about.
I'm not misunderstanding. (The difference between parastandard tech like package managers vs base-level stuff in e.g. the language itself was actually the point of my message, so I understand the distinction.) You're saying that unlike Java, JS at the language level doesn't have imports, but what you're saying is untrue.
It's entirely possible that the projects you're most familiar with aren't using them, but the JavaScript language has native modules, including standardized syntax and semantics for import statements.
Yes, I know JavaScript has native modules at the language level, but they are not universally supported.
Most importantly, there can't/won't be a breaking change for the previous iterations of module import, so we will continue to have many different ways to import.
It's not an issue of whether JS has a "blessed" way of doing it (and it didn't even have that for a long time). It's that there's more than 3 ways to do it, and many of them look similar and have confusing conflicts with compilers/packagers.
Yes, imports in JS are very frustrating. A big part of it is that your imports might be interpreted in, say, four different environments, and you have to consider how those four different environments work, how they can be configured, and pick some way of importing your packages that works in all four.
The four environments I often end up with are: Browser, Node.js, Rollup.js, and TypeScript.
For example, for a long time now, it has been sensible to just use "import" statements in code you intend to run in the browser. For a debug build, you don’t have to bundle, since browsers support import, but the imports have to be relative and have the full pathname. Node.js has a configuration option that allows you to use "import", but it is finicky and it can be surprising how many things will break when you turn it on—the alternative is to use ".mjs" as an extension. TypeScript has specific requirements about file extensions as well, and does not modify them. Finally, Rollup.js is hopelessly configurable.
Consider a simple requirement, “This code runs in the browser, I want to write a test using Jest.” I know it can be done, but I’m spending a bunch of time digging through forum posts to make it work, and understand how to make Jest load the code that already runs in the Browser, because the Node.js environment is so different… and most of the time, I end up writing code just to get something to build.
So the more than 3 ways, off the top of my head:
- Import paths: non-relative? relative? relative with leading slash?
- Import paths: .js suffix? no .js suffix?
- Files: use .mjs / .js suffix? use .js / .cjs suffix?
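For what it's worth, the Node configuration option referred to above is a single package.json field; with it set, plain `.js` files in the package are parsed as ES modules (and `require` goes away), and without it you need the `.mjs` extension per file:

```json
{
  "type": "module"
}
```

The finicky part is that this flips the interpretation of every `.js` file in the package at once, which is exactly when the "how many things break when you turn it on" surprises show up.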
You wrote, "I was talking about import in JavaScript itself, not build tools or package managers," and keep mentioning things like "the import syntax of the language", so this feels like retconning.
The confounding details you bring up are entirely the result of that idiosyncratic tooling. JS itself has `import`, thus there is no problem of the sort that you describe, at least not as you've described it. (I.e., this isn't to say that there are no problems, only that your characterization of the problem as being one with "JS itself" doesn't match the problem that actually exists.)
To say it again, if you see a problem with JS, that's because you're elevating the parastandard stuff to a higher level than it deserves. If you're able to meaningfully separate "Java" from "Maven and Gradle" in your mind, then you should be able to do the same with "JS" and the NodeJS derpitude.
Maven and Gradle have been around for years, and they are pretty much understood.
Moreover, you can skip them entirely and roll your own in a couple of hours [1]
Meanwhile in JS world? Build tools now change faster than frameworks, and not a single person is concerned with upgrade paths, or working in tandem with other tools.
After struggling to comprehend JS import and modules as much as I did as a single dev, I concluded that it’s my fault - that there’s something wrong with me.
I had to read the MDN guides multiple times and still need to refer to them.
Setting the build system part for my new projects is still the most challenging task.
Why do I have to learn how to build a grand piano every time I want to play a simple melody?
require ‘thing’ in Ruby requires the thing and that’s it.
require in JS should be used in Node but not in the browser, if I remember right? Or maybe not? I don't know, I guess I'll just try things until it works.
Python's packaging infrastructure is a mess, but even the _environment_ surrounding import, let alone the module semantics, largely hasn't changed since the introduction of venv.
For all the other failings of the Python 3 migration, the Python community strongly and rightly rejected 2to3 which would've eventually produced a babelesque mess if continued.
It's holding JavaScript to its own higher standard: <script src="..."></script> is dead simple, and trivial to understand and get working.
NPM, import, etc. seem mindbogglingly complex in comparison, with an enormous number of ways to do each thing, most of which require at least one build step that <script> just doesn't, plus enormous numbers of third party dependencies; and if you go away for a few months and come back, there are good odds things will be different and broken.
Like Julia, I don't understand JS build systems and packaging etc., which no doubt contributes to the above experience, but I'm comparing to <script>, whose simplicity is almost unbeatable.
> It's holding JavaScript to its own higher standard: <script src="..."></script> is dead simple, and trivial to understand and get working.
The most simple example is almost never a representative example that can be used as a standard to hold the whole language to.
In the same vein, one could say: "See how easy it is to build a C program with `gcc main.c`? Why should I have to figure out how to use make?" And the answer is simple: most projects that go beyond a toy example have much more elaborate requirements (or wishes), like "I'd like to reuse code that another person has written in the past so I don't have to write it myself".
C also lacks a sane build system, so if you're trying to make the point that JS isn't so bad, comparing it with C isn't the way to go. Consider that no other language needs a bundler with all of the weird stuff it supports. You might argue that it's good that JS supports all of these things, and that's fine, but what you're really saying is that complexity is good, which is also fine but incompatible with the "JS is no more complex than other languages" position.
Most languages also don't have to worry about shipping dozens of plaintext source code files over the wire that then get executed in an environment you have no control over, one that may not even support the code you wrote (there are still people running old IE versions, and losing 1% of customers can be really costly at scale). I'm not saying that the current ecosystem isn't an overly complex mess, but it does actually solve some problems.
I empathize with the unique difficulties of the web, and I don't doubt that the complexity solves real problems (e.g., old IE versions); however, I take issue with treating all problems as equal irrespective of how niche they are. The C and C++ folks make the same kind of argument to justify their abusive build systems: if the build systems made sane assumptions, they would exclude certain niche use cases. Of course, we all agree that sane tooling excludes niche use cases, but we disagree about whether excluding ~1% of use cases for a dramatically improved experience for the ~99% is worthwhile (and we may even disagree about some of the qualifiers I used in this sentence).
Webpack sure has problems (for example speed) but most of the more niche things that people do aren't actually supported by it but implemented through the plugin system, at which point any category of niche/sane kind of flies out of the window. If you enable people to do weird things people will do weird and maybe unwise things.
The best solution might be to take note of how people tend to develop JS nowadays, scrap JavaScript entirely, and replace it with something new that doesn't share its issues. But that seems rather unrealistic at this point.
There is no problem JS transpilers+bundlers tackled that earlier compiler writers had not tackled, better, before.
> shipping dozens of plaintext source code files over the wire that then might get executed in an environment you have no control over that doesn't actually support the code you wrote
What does "plaintext source code" or "over the wire" do to distinguish this from "compiling a binary targeting a minimum supported ABI"? Bundled JavaScript doesn't even have to deal with dynamic linking!
> There is no problem JS transpilers+bundlers tackled that earlier compiler writers had not tackled, better, before.
I'm not sure what I am supposed to do with that statement/answer? It's obviously very easy to just assume that every js developer must be an idiot but that is hardly a fruitful discussion to have.
> What does "plaintext source code" or "over the wire" do to distinguish this from "compiling a binary targeting a minimum supported ABI"? Bundled JavaScript doesn't even have to deal with dynamic linking!
Correct me if I'm wrong on that, but it doesn't really matter if something is written in Go, Rust 2015, Rust 2018, Rust 2021, Zig, D or whatever else comes to mind, assuming static linking of course. I can compile it, I can ship it, and the binary will work. I can't just ship TypeScript out; browsers don't understand it. I can't just ship modern JS out, as I have no idea if the user's browser understands the code.
Bundled javascript doesn't have to deal with dynamic linking because it is, in essence, static linking. The whole dynamic linking thing was sort of tried with CDNs shipping js libraries; it didn't really work out all that well in practice, because relying on a different service to be available for your dependencies is only a good idea until that service has downtime.
In this scenario, the binary is your ES3 or ES5 or what have you.
Compiling Go to an x86 ABI, Rust to the same x86 ABI, C to the same x86 ABI, etc. is an analogous problem to "shipping dozens of plaintext source code files over the wire that then might get executed in an environment you have no control over". It's harder, by an order of magnitude and on every axis, than compiling TS, ES2015, whatever, all down to some lesser ES version. And yet these compilers are faster, more effective, and easier to use.
You've missed my point about static vs. dynamic linking entirely. The point is that C compilers did "bundling" for years while supporting dynamic linking, and now you're trying to say JavaScript is tackling some way more difficult problem when it has a much simpler language-to-"ABI" translation process and doesn't even need to support that?
As you say, this maybe isn't a fruitful discussion - but it's a true one. JS tooling developers got seriously deluded about the novelty of their problems and quality of solution at some point ca. 2014, and we will be dealing with the results for another decade or more. At the very least let's start being honest about it.
I tend to agree with the overall thrust of your argument and the conclusion you're driving toward, but it's at least interesting to note that the distribution model of the web puts unique pressure on tools to ship dramatically smaller payloads. In Go or Rust, we ship debug symbols in production builds by default (or whatever it is that gives us stack traces on panic), but in JS, sourcemaps typically aren't shipped in production builds because they are too bulky. Similarly, there is a litany of caching concerns which apply in the browser world but not natively.
I also think that invoking "C compilers did X" weakens your argument because the C build ecosystem is quite a lot worse than the JS ecosystem and without any good excuses (its ecosystem is not rapidly evolving, there is no distribution model putting comparable pressure on latency, etc).
But in general, I agree that there is a lot the JS world could learn from the native world.
> the C build ecosystem is quite a lot worse than the JS ecosystem
Other than module packaging (and that's admittedly a big "other than"), I don't agree. Esbuild is the first tool I used that's even close to e.g. Make in comprehensibility (~ ease of use x flexibility x ability to debug when something goes wrong). That's a pretty low bar, and I don't know how long esbuild will stay that way.
Agree to disagree I guess. I don't want to go to bat for JS, but I've done more than my share of C and C++ and Make, CMake, Autotools, etc are just fucking nuts, and that's ignoring the fact that they completely punt on package management. JS is painful by the standards of modern programming languages, but C and C++ are on the next level.
"Stockholm Syndrome" is the only way I can rationalize people vouching for C or C++ (or JS, to a significantly-lesser-but-still-not-negligible extent) build tooling.
I'm not trying to argue that it is harder, just that it's different in some ways, and ignoring that for the sake of "javascript bad" isn't going to change the point that there are challenges involved. Nevertheless, I understand your point a lot better now, thanks for the explanation.
I think we are both on the same page that the JS ecosystem overall is not in a good shape. But, you know, maybe THIS wave of new shiny tools is finally solving the issues once and for all (not holding my breath though...)
I do, but it's not well supported by a lot of packages. I dig around manually for URLs on unpkg and other JS CDNs, but most libraries' instructions just say "run npm ... or yarn ...". Maybe that complexity's worth it, but understanding what's going on is a barrier for me to use it, just like Julia describes.
I'm very sympathetic to Julia's confusion and your yarn/npm frustration. But consider whether those libraries themselves might also be part of the complexity that is not worth the cost. Julia's trying to use Vue. Vue (and Webpack, and Babel, and the entirety of NPM) nominally exists because it's supposed to improve the developer experience. If instead it contributes negatively in other ways and you have to spend so much time wading through the tangled mess it confronts you with, what's the point?
Ignoring the nullified value proposition, there are other reasons besides (like security) to reject what the NodeJS/NPM world considers standard practice.
I mainly use big standalone ones that provide enough to be worth it, and far more than I'd build myself. Things like MathJax, JSXGraph, Plotly, and D3 when I properly learn it. Not coincidentally, those aren't too hard to use without NPM.
I feel like JS gets thrown under the bus in discussions, but nobody talks about the arcane file structures in Java, or implicit pathing lookups, which have the same kind of "historical path dependence" behind why they exist as JS has for require/import.
All languages have their warts and 99% of the time they're all doing the same thing just in slightly different ways.
My primary frustration with frontend development is that I have gone to JS Build Tool University... Roughly 10 times now. I'm not sure I've ever set up two separate frontends the same way. Even when I avoid the shiny new tools and stick with crusty slow webpack, there's a new version that behaves entirely different than the previous version. I'm enjoying the speed and simplicity of ESBuild.
I've been poking fun at JS for a long time (friendly though, I'm consistently amazed at what those people manage to do with little more than straws and duct tape ;-)
Just be glad Gradle hasn't entered the JS scene yet:
With JS there are at least some standards now: both Angular and React have systems for getting things running in a standard way and keeping it somewhat aligned.
With Gradle however I haven't seen two similar setups so far. Edit: I might have seen two similar setups.
Gradle is capable of very powerful things, so on larger projects there is generally a project-specific Gradle dialect in play. I can understand why this frustrates newcomers, but it's an important capability for large projects and monorepos.
I would say most small projects (especially OSS) tend to stick to a simple subset of Gradle even if it's multiproject build etc.
The amount of churn in the JS build tool space easily outstrips any one time learning you need to do to surpass the Gradle knowledge cliff where all this complex stuff just makes sense.
I find working with Gradle infuriating. It's layers upon layers of magic. When something breaks, you get a baroque error message, a usually useless stack trace, and a suggestion to re-run the possibly very long build with -info or -debug. At that point, instead of not having enough output, you're usually drowned in megabytes of irrelevant messages.
This is what I was referring to as the knowledge cliff.
Gradle is incomprehensible until it isn't. It has a very steep but short learning curve and understanding is completely binary. It's very unfortunate but it is how it is.
I still take it over the JS stuff, but I do acknowledge its shortcomings. I also don't do Android development, so I don't know if it's particularly bad there.
The knowledge cliff doesn't end with Gradle because it's infinitely extensible. So you have to understand all the plugins as well. Because there is so much implied magic happening behind the DSL, when it breaks, you don't have a clue why. Reading the documentation is often insufficient, you have to track down the source code for the version of the plugin you're using.
In general, I find DSLs make my life harder, not easier.
Gradle is my least favorite build tool ever. It's just too much magic in too many places.
This is not an endorsement of the JS ecosystem either. It also sucks, but it has wasted less of my time than Gradle (again, in the context of Android).
My thoughts exactly! I'm mostly a backend developer and also fine with doing occasional frontend stuff as long as the build system works, but when npm croaks with one of its cryptic error messages, I wouldn't know where to begin to fix it, and that's extremely frustrating...
If you click on the link and read the header (not the actual article or its title) you will see that it's posted on Julia Evans' blog. That should answer your question.
Esbuild is great. We couldn't fully migrate to esbuild due to some hard dependencies on webpack but we are currently using esbuild-loader which reduced our build times considerably.
> there are 2 standards (CommonJS and ES6 modules), and that require is CommonJS and import is ES6 modules
Kind of, but not really. There are JavaScript modules as defined by the ECMAScript standard and implemented in Web browsers and anyone else who cares about implementing the spec, and then there is NodeJS, which has its own way of doing things, along with people who refuse to give it up and insist on dragging those headaches into places where you should not otherwise expect to see them. People call the latter "CommonJS modules" (including e.g. the TypeScript team), but in fact CommonJS is a whole thing on its own that NodeJS doesn't actually follow. The thing that CommonJS and NodeJS-style imports have in common is that both used the `foo = require("bar")` convention.
To understand the mess that is the current JS ecosystem (and hopefully how to get out of it), people really need to come to grips with recognizing NodeJS's role as an IE-like entity:
- 800-pound gorilla
- not under the control of the best and the brightest
- implements its own share of proprietary (vendor-specific) shenanigans
- the people who shamelessly target it and see nothing wrong with doing things this way are the source of the worst problems within the ecosystem or on a given project
- responsible for a massive corpus of legacy junk that you are expected to be familiar with and that it will take years to clean up
Using ES modules everywhere seems to be picking up steam recently. It is annoying jumping between frontend files and backend files in the same project using different module systems.
I think in about 6 months, not touching CommonJS modules from front to back for a whole project will be viable without major quirks.
To my knowledge what node implements is nearly a superset of CommonJS. It is true that a pure CommonJS implementation cannot handle many node packages due to additional features that node has added to the module descriptor, or due to use of node specific built-in packages/globals/extra properties on require, etc.
But I was under the impression that node could generally import and handle strictly conformant CommonJS packages just fine in practice.
Of course, I won't argue that many npm packages not being conformant CommonJS packages makes things difficult for non-node server-side javascript environments, since they either need to add additional node compatibility, or might need their own package repository.
I can't decide if the root cause of this problem is in esbuild or in Vue. The fact that you can't do "import vue from 'Vue';" with esbuild means either its module resolver is terrible, or Vue is doing something super weird with the way it bundles artifacts. Either way, something is seriously wrong.
FWIW, "import react from 'React';" works fine with esbuild with no config, so I'm inclined to think this is a problem in Vue.
> The fact you can't do "import vue from 'Vue';" with esbuild
You misunderstood, they say this is exactly how it works. But they wanted a specific build instead of the default one of vue (the one with the runtime compiler, not default because it has performance penalties).
It could be argued the default was wrong (when originally chosen).
If you don't have the runtime compiler, then that means either you: 1. handwrote the compiled templates, which is rather unlikely, or 2. are using additional tooling to precompile.
If you are using additional tooling to precompile, then that tooling ought to handle aliasing to the runtime-only version (or, if not using a CLI-style tool, the instructions for setting up the bundler to use a loader should include setting up the alias).
However it is probably too late to fix things, since existing projects will assume the default is the compiler-less runtime, and would perform worse if things were changed, since existing projects would not have the new alias to run-time only version in place.
If you're going to have a build step at all, instead of hand rolling your build commands and configs, just use Vite[1] already. It uses esbuild for development and rollup for production (since esbuild is not featureful enough for a lot of production use cases), you get instantaneous startup, hot module replacement, static asset imports, etc. for free. It's from Evan You, the creator of Vue, but it also supports React, which works pretty well with it in my experience. Btw SvelteKit also uses Vite.[2]
> But I stopped using those tools and went back to my old <script src="script.js"> system because I don’t understand what those vue-cli-service and vite are doing and I didn’t feel confident that I could fix them when they break. So I’d rather stick to a setup that I actually understand.
She's doing things the right way here - applying the beginner's mindset, trying to understand things rather than cargo-culting by copy-pasting magical incantations from github. There's just so much incidental complexity in JS tooling, and her instinct to steer clear of that insanity by using Unix tools like esbuild is spot on.
I am fairly good at avoiding cargo culting and only using things I understand. I am uncomfortable developing any other way.
With JS build tools, I eventually gave up and accepted being uncomfortable.
(And to be clear, we're talking "levels of abstraction" here. Or interface/implementation. I don't need/want to understand the implementation of the tools/libraries I use. I need/want to understand how they work at the "interface" level, the mental model for what they offer and how to make them do what I want, and what choices are available on what axes. I normally achieve this. With webpacker, I give up and just copy and paste magic phrases).
I mean, there is a certain amount of buy-in into cargo culting. She could've used something like mithril.js and that would've worked with a script tag only setup without having to worry about template language compilers or anything of the sort. The tutorial even walks you through how to do exactly that.
Esbuild is a fantastic tool, but the nature of the problem domain is indeed inherently complex. "I don't know what import does" may sound like admitting ignorance, but it is also borderline clairvoyant. Resolving a module involves the incredibly convoluted node resolution algorithm, semantics of doppelganger packages, semantics of peer dependencies, architecture-specific optional dependencies, other ad-hoc standards and spin-offs such as package.json `browser` fields and import maps, esm vs cjs vs umd woes... any of those could break and the error messages are impenetrable when they do.
On the flip-side this approach sounds kind of like premature optimization. You don’t have to understand the entire path from pressing buttons on a keyboard to a program causing photons to exit your monitor. You instead trust that large parts of that chain just work.
If what you’re doing works via script tag, the odds of Vite/CRA/etc not working for you are extremely low, so why worry about problems you might have before you ever have them. It’s perfectly fine to opaquely trust your tools while focusing on the details you actually care about.
In another context, no Java course ever starts off with explaining “public static void main(String[] args) {…}” and students just “cargo-cult” it for a while until through other exposures to various bits of functionality through natural use they build a context through which to eventually circle back and easily understand what that crucial bit of code actually means. And that’s long before someone would consider trying to get a deeper understanding of what the compiler is doing! “Welcome to CS101, we’re going to start off talking about byte-code and intermediate representations” would have a success rate of literally 0%.
First, esbuild is meant to be a complete JavaScript bundling platform. It already packs a lot of features, and is in the process of acquiring more. What it doesn't pack natively can be bolted on via the plugin system. It's an end-to-end solution, or almost one (it doesn't give you a dist folder you can just deploy, you still need to add an index.html), not "doing one thing and one thing well", unless you define your "one thing" to be very broad.
Secondly, here's an analogy: cargo is too much magic, people learning Rust should apply a beginner's mindset by downloading dependency crates themselves and finding the appropriate rustc command lines rather than using cargo build. Sounds unconvincing?
Seems right to me, too, but perhaps it's not the beginner's mindset. My colleague, who is an actual beginner, started the frontend part of a project, read up to the vue-cli scaffolding incantation, copy-pasted it into his terminal and checked in the result. The consequence was that I have to fix the unavoidable breaks and other npm and webpack issues, because he has no clue.
Apart from that, debugging the builds is a bad experience.
> instead of hand rolling your build commands and configs, just use Vite[1] already.
For greenfield projects? Certainly. For existing projects? Oh, you're in for a world of pain.
I'm all for breaking with the old and ushering the new era in, but there should be an upgrade path for as many possible combinations of the old as possible.
As luck would have it I decided to convert our project to vite to see if it's feasible.
So far it's:
- Absolute imports don't work and are broken in any number of subtle ways
We use typescript with a setup that resolves `/our-stuff` to `./src`.
Good luck making this work with vite. Somehow it insists that you should replace all absolute imports with `/@/` and setup an alias for that. All other options (like using vite-tsconfig-paths) will not work because vite's `base` config will break this.
I "solved" it by listing all directories in `src` and converting them into vite aliases.
- Now I'm stuck with `[vite] Internal server error: failed to resolve "extends":"../../tsconfig.json" in <some-path>/mui-theme/tsconfig.json`
How to fix it? Hell if I know.
At this point I'm just giving up, and looking into how to maybe upgrade to webpack 5.
Your best bet is to try and remove as much config as possible first. Simplify and then migrate or upgrade.
Webpack allows you to paint yourself so far into a corner that it becomes impossible to move anywhere. We should see this for what it is: an existential threat for your app. It eventually grinds everything to a halt.
Point taken, but it's not like switching the average existing webpack project (let alone a webpack 4 project) to esbuild is just a walk in the park. We're talking about whether you should roll your own esbuild commands here.
If you can roll your own esbuild commands to build your codebase without change, good for you. Once you’ve done that, good luck figuring out a dev server with HMR, since esbuild certainly doesn’t offer that.
Because when it actually works, it's a very refreshing development experience; once you get used to it, you never want to return to the crappy "let me refresh this page to see my changes" workflow.
But hey, tons of people swear by the simple setup they claim to understand; good for them. As someone who doesn't work on frontend stuff professionally, I wouldn't pretend I "understand" esbuild and can always fix it when it breaks. Ironically, TFA presents an issue with this simple setup that the author could not figure out without help from experts:
> I was really baffled by this, but I was eventually able to figure it out with some help from people who understood Javascript.
The sane option would be to use a codemod to rewrite your code to not need bespoke tooling configurations. You are locking yourself into legacy tooling.
> Forcing someone to mindlessly and absolutely needlessly re-write thousands of import statements isn't really a solution
This is how progress is measured in the web world. The impact of your code/project/patch is measured by how many lines have to be rewritten to adopt it. The higher the number, the better your web wizardry rank is.
If you're using any ES6 features - default params, template literals, arrow functions, let/const, spread, etc, your cut-off for browser support will be ~2017 (= no support for IE 11).
swc (https://swc.rs) does transpilation and it's as fast as esbuild. Setup is slightly more complex. I'm hoping esbuild will add ES5 transforms at some point, would much rather have a Go tool underneath.
I love swc, it's close to a drop-in replacement for babel, but with a native executable and doesn't download half of the internet in JavaScript dependencies. You can use your existing build system, but get substantial speedups over using babel.
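For anyone curious what that setup looks like, a minimal `.swcrc` along these lines is usually all it takes (values are illustrative; check the swc docs for your case):

```json
{
  "jsc": {
    "parser": { "syntax": "typescript" },
    "target": "es5"
  },
  "module": { "type": "commonjs" }
}
```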
What about web projects, though? Isn’t that what the discussion is about (script tags and all)? Node projects aren’t too bad because you don’t need anything beyond local fs module resolution.
What I do is run esbuild and tsc (with no emit) in parallel in watch mode. That way, I get fast rebuilds and iterations, but still eventually see the errors spit out.
My code editor catches 90% of the errors, but it’s definitely nice to have tsc catch the rest earlier than the CI process.
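A sketch of that parallel setup as package.json scripts (entry point and output paths are made up, and `concurrently` is just one of several tools that can run the two watchers side by side):

```json
{
  "scripts": {
    "watch:bundle": "esbuild src/main.ts --bundle --outfile=dist/main.js --watch",
    "watch:types": "tsc --noEmit --watch",
    "dev": "concurrently \"npm:watch:bundle\" \"npm:watch:types\""
  }
}
```

esbuild gives you the fast rebuilds; tsc with `--noEmit` never writes output and only exists to surface type errors.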
Sadly it's a lot more than old IE versions; there are a lot of mobile devices with surprisingly old/buggy browsers around.
Of course this is highly dependent on the target audience but at my last job we did some runtime feature detection (basic stuff like arrow functions, let/const...) and shipped a modern build or a legacy build. And something like 10% of users got the legacy build, most of them on mobile browsers.
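A rough sketch of that kind of runtime feature detection (this is my guess at the approach, not the poster's actual code): compile a tiny modern-syntax snippet and fall back to the legacy bundle if compilation throws.

```javascript
// Old engines throw a SyntaxError when new Function() compiles the snippet,
// so the try/catch doubles as a syntax-support probe. The function is only
// compiled here, never called.
function supportsModernJs() {
  try {
    new Function("let f = (a) => a; const y = f(1); return y === 1;");
    return true;
  } catch (err) {
    return false;
  }
}

// A loader script could then pick the right bundle, e.g.:
//   const src = supportsModernJs() ? "/app.modern.js" : "/app.legacy.js";
```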
IE usage could be 50% in your industry, or your company may guarantee IE support for one reason or another. Global stats aren't always representative of your business.
The purpose of vite is not to be fast for builds, but to be a live-reload instant dev platform. So you almost never run build. You code against vite's dev server, and it lets you see the result instantly in the browser. You can use modules, modern JS, and transpiled languages transparently.
Then when you put in prod, you build once. 30s once a day doesn't matter.
I mean, notepad.exe is arguably a better text editor for most people compared to anything else most of us are using.
I'll happily take some minor overhead in build configuration over adding 30s+ to every automated build if it's not a personal or small-scale community project.
Stop wasting people's time with slow tools and telling them it doesn't matter. That's not a way to treat your users.
Your fancy hot module reloading is not what I want. What I want is to reload the page and see my changes.
I've never seen a "hot reload" that works properly. The one some of my co-workers are using right now works like this:
- You edit a file
- You wait 5 seconds for the "instant" rebuild
- The browser reloads the page 5 times
- After a total of over 10 seconds, you can finally see the effects of your changes.
Maybe vite's HMR is better, I will never know because I don't care.
The link you sent has the following to answer why not esbuild:
> some of the important features needed for bundling applications are still work in progress - in particular code-splitting and CSS handling.
But this is not true. It handles CSS and does code splitting. Now, you have to be very explicit about code splitting: you have to define other entry points that share the same libraries as the main program, which forces those shared libraries into separate chunks. To get more chunks, you add more entry points, where each entry point uses just one or two libraries. Yes, it's not perfect, but it's not a blocker for getting to production.
Whereas the 30 second build is a total blocker for development iteration.
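For reference, the explicit-entry-points setup described above boils down to an invocation like this (file names are hypothetical, and note that esbuild's `--splitting` currently requires `--format=esm` output):

```shell
# Two entry points that share dependencies; esbuild emits the
# shared code into separate chunks under dist/.
esbuild src/main.js src/admin.js \
  --bundle --splitting --format=esm --outdir=dist
```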
Sample size of 1 here: My vite project hot reloads in a few milliseconds, keeps state, and shows me all recent changes with no issues. Sounds like your co-workers are using webpack-based hmr, that can become slow very fast.
> What I want is to reload the page and see my changes.
And what other people want is not having to reload everything, lose all state on the page, and make API requests all over again when they add a 2px margin.
> I've never seen a "hot reload" that works properly.
Yeah, so because you've never seen a hot reload that works properly, someone else can't possibly develop something new that does work? And whatever crappy workflow you enjoy must be great for everyone else?
Anyway, talking to close minded individuals is pointless, so this is it.
I read that as yet another 30 seconds for a CI or production build, on top of every other inefficient tool shipping with an inflated sense of necessity and no feel for devops concerns. Accepting such into your toolset remains a primary source of creeping bloat.
The JS ecosystem is riddled with such pests, and (as with webpack) eradication from our workflows always proves by far the best policy.
Sure, please contribute to esbuild so that it solves more of people's actual needs. It's not like Evan doesn't want to switch to esbuild in Vite completely. My link to "Why Not Bundle with esbuild?" should be a good start for where you should look.
The fact of the matter is, most of these projects came along for justifiable reasons, and represent some improvements over the existing tech: speed, ease of configuration, better dev experience, better use of platform features, and so on. I'll certainly grant that there's an aspect of chasing the new shiny here, but among the senior FE folks I know, the costs of migration are always being weighed against the actual benefits you get.
Additionally, the cost of adoption/migration is very often a high-priority concern by the people writing these libraries. The pendulum is swinging back from configuration to convention.
With regards to the pace: it mainly has to do with the fact that the Web-as-application-platform has only come into its own in the past 10-15 years or so (i.e. since the death of Flash.) The tooling is just now catching up. Another HUGE factor is the growing adoption of TypeScript- for the past several years, TypeScript tooling and existing JS build/bundle systems have gradually adapted to each other, which has driven change. (Trust me: trying to figure out how to set up TypeScript compilation under webpack was confusing and miserable for a long time.)
If you're just glancing into front-end OSS every once in awhile, or if you're not heavily using the features available in these tools, you might not notice or value these improvements. And that's perfectly OK- you can certainly continue to use whatever's comfortable for you. But there's more here than meets the eye.
Yes. I've used them all. I think Julia's approach is the best: code like it's 2005 until you feel like you hit roadblocks, then pick the simplest and most focused tool for the job (which at this point, I think esbuild is the best pick for its scope and speed).
I'm dealing with a complex webpack/babel setup at work. It is rife with issues, and it amazes me that these are issues that we created ourselves. Endless dev time is wasted on debugging overly complex toolchains.
If you're already familiar with JavaScript, it takes about a day's worth of Node.js API knowledge to write your own build scripts with esbuild.
I think Julia had the right attitude. Instead of taking the leap of faith with a popular tool that you don't understand, instead, choose a smaller, focused tool that you can understand. As you build your application, you can more clearly and organically identify your painpoints, and then you re-evaluate if a more complex tool would be a better fit.
I think going the other way around (as you seem to suggest) is a common mistake made by many entering the JS dev world.
EDIT: For the record, I use Vite, esbuild, and rollup across several projects.
Would you help me set it up to process SCSS with dart-sass and load stimulusjs controllers as well as possibly jquery and some other packages in conjunction with 11ty, middlemanapp or rails?
All I need is processing of scss and mostly vanilla js.
As long as your code doesn't need transpiling, there's also the possibility of importing modules using the type attribute in script tags: <script type="module" src="script.js"> – which is very well supported: https://caniuse.com/es6-module
The drawback of this, IIRC, is that if you want any sort of third-party library using this approach, you'll trigger a million HTTP requests, one for each part of the transitive closure of dependencies. Maybe things have improved.
"Introducing people to strace" is one of the most effective ways I've found to improve their debugging experience.
It's glorious.
In case it's handy, the two options I find myself telling people about most often are: '-f' to keep tracing across forks, and '-s' to increase the number of characters of strings that get shown when you're trying to debug complete reads/writes.
(filtering for specific syscalls and stuff is also really neat but most of the time I just throw a full strace into a file and chop it up with my usual unixy tools from there)
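Putting those flags together, a typical session looks something like this (program name is hypothetical):

```shell
# Follow forks (-f), show up to 500 chars of each string argument (-s),
# and dump everything to a file for later grepping (-o).
strace -f -s 500 -o trace.log ./myserver
grep openat trace.log | less
```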
Yes, you don't need Parcel, Vite, Webpack, Rome or Rollup to code with Javascript.
I never understood how to configure webpack, and you don't really know what those tools are doing.
Just keep things simple. Write your own bundler with esbuild. Here is a script I wrote to build a Svelte Typescript Tailwind project:
https://github.com/Tazeg/svelte-typescript-esbuild-tailwind-...
My biggest feeling about learning JavaScript build tools and ecosystems is that I learned nothing! I learned a bunch of ways to navigate man-made, unnecessary complexity, and ended up with no new knowledge of computing. The bright side is that it increased my tolerance for bloatware, which is a valuable skill to have nowadays.
esbuild is one of the few good tools, it has a ton of features and it's still fast and lean.
I found esbuild via Snowpack quite nice until it wasn't. I hit issues around package resolution and usage of libraries like MobX, which was deeply displeasing. I have moved back to Webpack because 50ms vs 400ms refresh-after-save latency doesn't matter to me when once a month I have to spend a day fixing those things I have long since taken for granted.
I think you should be able to do something similar to the webpack config that Vue ships by using https://github.com/igoradamenko/esbuild-plugin-alias and passing a similar `runtimeCompiler` option to your build script.
Sounds like she should consider just importing her script.js as a module and then importing the rest from there. Probably no need for a build step and bundling at all. She might also want to use custom elements instead of Vue, since the browser supports those out of the box.
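A custom element really is just a class registration away — a minimal browser-only sketch (tag and class names are illustrative):

```html
<!-- No framework, no build step: the platform's own component model -->
<script type="module">
  class HelloWorld extends HTMLElement {
    // Called when the element is inserted into the document
    connectedCallback() {
      this.textContent = `Hello, ${this.getAttribute("name") ?? "world"}!`;
    }
  }
  // Custom element names must contain a hyphen
  customElements.define("hello-world", HelloWorld);
</script>

<hello-world name="HN"></hello-world>
```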
I can relate to the article so much! Even though I'm familiar with node.js and npm, my approach to frontend is also index.html and script.js with no build tools. This way I can at least know how it all works and fix it when something breaks.
If you want to keep things simple while using packaged libs, there's no need for a build step. You can import ES modules directly in a script by using a URL as the module name. Use e.g. esm.sh as the registry; most npm packages are available.
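In practice that means an import statement pointing straight at the CDN — a sketch (the package is illustrative; esm.sh rewrites the package's own dependencies to CDN URLs, so one import pulls in the whole graph):

```html
<script type="module">
  // No npm install, no bundler: the CDN serves an ESM build of the npm package
  import confetti from "https://esm.sh/canvas-confetti";
  confetti();
</script>
```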
Skypack looks interesting. However, as I come from a Nix background, I'd like to a) pin my dependencies and b) do a hash check at install/download time to ensure the pinned package hasn't been modified behind the scenes.
Based on https://docs.skypack.dev/skypack-cdn/api-reference/pinned-ur..., it appears that you can do (a) with Skypack, but this requires a manual step: look up the package in the CDN with curl or your browser and copy-paste the URL into your JS import statement. Is there any tooling to automate this?
Also, there appears to be no way to fail the build if the contents of the pinned URL change. Are Skypack users relying on Skypack to ensure that can't happen?
I know it's just piling on at this point, but I've been doing a lot of software builds and deployment over the last 20 years, mostly in a scientific computing context. Nothing I've dealt with is as bad as JavaScript.
I've generated and hacked dozens of autotools builds and written m4 macros of my own; I've hacked CMake builds; written RPM spec files; fixed Rcpp package installs; written and modified innumerable Makefiles; setup Tomcat webapps; and use Python for most of my daily work. I've even hacked around with Boost's build thing.
Complexity is not the issue with JavaScript. Stability is the problem.
Most devs do not want to "know" build systems. You want it to work most of the time and when it doesn't, you want to be able to find the answer to a problem. You want that knowledge to accumulate over time so that the answer you found the last time still works.
Autotools is an insane system, but it's been the same insane for 30 years.
Currently using vue-cli with Vue 2 and Vuetify, a configuration depending on a lot of precisely pinned versions to get functional. There is a pile of deprecated library warnings during the lengthy build process that I have no idea how long it will take me to address. Could be a couple of hours, could be several days, could be impossible.
Would like to try Vite or esbuild because God knows we could use the speed up, but after investing a couple of weeks to figure out the precise balance of versions that will work and propagating to our 8 or 9 applications, where are we going to be next year? I saw a Tweet yesterday about vite on swc. Is that going to be the winner?
I half wish Richard Stallman would take over. How desperate is that?
You really prefer autotools or CMake to esbuild just because they are older and don't change as much? I will take esbuild over CMake every day of the week and twice on Sunday. C++ build tools are an abject disaster. (And CMake does change anyway)
Not the parent commenter. I spent time using autotools long ago and developed a grudging respect for what they did and how they worked. Their design made sense, even though the system itself was quite cursed. CMake also—I never choose it for my own projects, and I hate using it, but I respect the consistency and portability.
Building JavaScript is just a fucking disaster. I’ve used a fair number of different bundling tools—Browserify, Webpack, Rollup, Esbuild, and Closure (the compiler). I’ve also used old-school concatenation… anything from <script> tags to something like “cat”. It’s just fucking awful. Stay on the straight, narrow path and you’ll survive. Deviate slightly and you’re doomed.
Making things worse, you might have two different environments you run your code in: Node.js and the browser. Surprisingly, the browser lurches ever forwards, and Node.js is the anchor keeping us in the past. Making things worse, you might try to use TypeScript. Making things worse, you might use JSX.
Nearly every JS project I work on needs a ton of babysitting w.r.t. the build system. At various places I’ve worked, there might be a team handling that for me, but if I’m working on a side project, it is very difficult to keep the complexity of the build system down to a reasonable level. Most guides on how to use frameworks will tell you to do something like “oh, just use create-react-app” or similar, and you end up with a couple dozen new dependencies. Your build system will be a mix of templated code pasted into your repo and third-party libraries. Integrating with anything else often requires various "adapter" dependencies, but it's a roll of the dice whether those libraries are built reasonably.
My basic desire is often a fairly simple list… I want front-end TypeScript code to be type-checked and bundled, I want a dev server that serves the bundle, I want to see build errors quickly and easily, I want to run tests without a browser environment. I know this is possible, but every time I’ve gotten it, it’s taken an unreasonable amount of effort.
Not your parent commenter. I've worked both with CMake/autotools and Java build tools (not esbuild specifically), and the point (which I agree with) is: yes, C/C++ build tools are complicated, but once you set them up they just run most of the time without any problem. With things like webpack you are always one upgrade away from things breaking down in spectacular fashion, and the troubleshooting involves hunting GitHub issues, blogs, and sometimes looking through the source code to figure out why this combination of npm package versions gives you trouble.
Somebody once described configuring fvwm2 (which is still my WM of choice, and indeed was theirs at the time of writing) as "like training a brain damaged hyena."
The thing is, as the same somebody noted, that then it stays trained.
To quote from upthread, "the same insane for 30 years" is in and of itself a huge feature that makes up for quite a lot of insane.
I must be crazy because I still prefer regular Make to CMake. It seems like CMake tried to fix autotools' problems (which were not Make's problems) but then created new problems of its own. Why is setting an install prefix so complicated when it's the ONE option that almost everyone wants? Why the case-sensitive name ending in .txt?
And CMake still seems extremely C/C++ focused. I don't want a language-specific build tool anymore. Heck, I'm not even sure I necessarily need a build tool so much as a generic task runner with patterns and complex/dynamic dependencies.
I think CMake is a real pain and Autotools is an older pain. But I can figure them out and things I learned two blue moons ago are still basically true. The churn in JS makes it really hard for me to accumulate knowledge and solutions.
I think you missed the point. Churn is a real problem. Even if the tools are improving, it needs to be balanced against stability. They admitted autotools was insane, but it was a consistent, well-known insanity.
Things are getting better, but JS has definitely had more churn, and reinvented more wheels, than anything else I know of.
I think you missed my point. Churn is obviously less desirable in the abstract, but the state of C++ build tools in particular is deplorable and the lack of churn doesn't come close to compensating for how terrible it is.
esbuild is the gold standard. Its creator has done a phenomenal job (and it's still early in its lifecycle). It's the first time I used a JS build tool and didn't scream at the monitor when dealing with nonsensical APIs and error output.
The "early in its lifecycle" part is the source of GP's worry. esbuild _seems_ great now, but based on the history of JS tooling, who knows what will happen in a year or two. Look at the recent example of the JS community's multi-year embrace and then rejection of Webpack. I would be nervous about migrating too.
I share a similar fear (generally speaking), but in this particular case I think the risk is worth it because the cost of setup is next to zero and the speed of the thing is unreal. If something catastrophic happens with the project years down the road, at most I'll have spent a few hours getting it wired up (worth it considering the positive impact it's had on my productivity).
It's not the usual JS personality screaming "wEbPaCk iS dEaD!!!lol"; it's clear the author is thoughtful (observable via the project itself, the docs, and how he interacts with people in GitHub comments).
One advantage of autotools is that at least a lot of the skills are transferable. Shell scripts and Makefiles (and OK, M4 isn't so popular these days, but it was for a reasonably long time) are things I can learn and use everywhere, not just this project, not just C projects, not even just development.
I don't learn any transferable skills writing Webpack or Babel config. They don't even necessarily transfer to the next generation of the same tool.
I would also take a stable autotools over an unstable autotools, and I consider most of JS-land roughly equal in usability to unstable autotools. Autotools was also not that bad on the happy path where you were targeting, say, the top five POSIX systems over ten years - and maybe you didn't even need it at all, if you didn't need the performance from distinguishing each platform's best supported fd polling variant. Vs. JS, which is still bad even if you're only targeting evergreen browsers, because maybe you're stuck trying to integrate CSS modules with TypeScript or something. That kind of problem simply doesn't exist in autotools world.
You're not totally wrong; I'm getting into that age range.
It's not that I prefer those tools; they are really obtuse. It's just that I can't handle the moving target of JavaScript. I'm sure I can figure out esbuild just fine, but in a year or two it could be abandoned in favor of swc or Vite or a Vite-esbuild-swc uber package.
I find it fascinating that there are people still using script tags. I have personally become so tied to npm and typescript that it feels that I'm programming in a completely different language and ecosystem, with a build process and IDE-like type checking and linting, a more procedural data-oriented C-like paradigm.
I've never touched the old JS guts: `module.exports`, the prototype chain, ... Babel (and later TypeScript) allowed me to live in ESM land from the beginning. I expect any tool/library to be available exclusively on npm; in reality, I've been constantly fighting with it over the past few years.
I usually get a shock when I see libraries still offering minified bundles that assign to the global namespace; to me this seems like an outdated practice, but I guess it is still really prevalent in JS, even in 2021. I sometimes want to go back to it, though: with the amount of configuration and boilerplate necessary to set up a project with npm, Prettier, ESLint, TypeScript, Webpack/Parcel/Rollup, SPA vs MPA, React vs Vue vs $SPAFRAMEWORK, it was a lot easier back in the day.
I think the biggest failure in the JS system is the lack of standardisation; it's so versatile that almost anything is possible. A number of different module systems were invented to solve the initial growing pains (the first would literally replace a `require()` function call with the contents of the given file), and we never escaped them as they became the foundations of Node.js/npm. Then we needed a non-blocking variant, so we just called it `import()`, it returns a promise. Then we wanted proper imports, so we invented the `import ... from ...;` ESM syntax. We made a named import and default import variant (with most bundlers offering a compatibility layer with CJS-type exports, so imports now have a hidden `__esModule: true` key...). In order to stay compatible with legacy CJS, we also added the `import * as ... from ...;` syntax as an escape hatch.
When TypeScript came along, in order to be compatible with JS it became common practice to distribute compiled TS code along with the source code, with an adjacent declaration `.d.ts` file to provide type information. The source code was never leveraged in any way, and so people just stopped shipping the `src` directory, only `dist` and a `package.json`. It makes debugging a pain, but compiling TS libraries would have been tricky, considering the widely varying feature flags (async, DOM, ...) and strictness options. Perhaps Deno will help with this in some regard. The great thing about Go or Rust (or any other language, really) is that you import source code, not illegible compiled code (unless you're dynamically linking). I believe that Rust is especially good in this regard: it remains compatible with older code by requiring crates to specify a Rust "edition".
The big difference is Rust and Go are new and learned from their predecessors. If they live long enough, they too will become complicated (well, maybe not Go, since its central design paradigm is simplicity) and seem crusty compared to newer counterparts. Java started simple, but has grown and “forked” and now there’s Kotlin or Scala etc. All that variation was because Java was successful and people wanted to use it but in a slightly different way. JavaScript is the same: people keep evolving it _because_ it is so popular.
Thanks for the write-up!