Am I using devDependencies wrong or is everyone bad at explaining it?
Official docs say it's for
Packages that are only needed for local development and testing.
Umm, okay. Not 100% clear there. Some articles mention things like ESLint or Jest (k, I'm onboard there) but others mention Babel or webpack. I get that you don't need webpack itself loaded in the browser, but how the hell do you bundle your code without it? When you use npm ci or npm install you get all dependencies, but isn't it good practice (in a CI/CD environment) to use --omit=dev or --only=prod?
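For reference, these are the install variants I mean (as far as I can tell, --only=prod and --production are just older spellings of --omit=dev):

```sh
# Full install from package-lock.json: dependencies + devDependencies
npm ci

# Skip devDependencies (newer spelling)
npm ci --omit=dev

# Older spellings of the same idea
npm ci --only=prod
npm install --production
```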
If you’re bundling code, you’re doing development work, and you’d have devDependencies installed:
For a library, once you’ve shipped your code, someone using it wouldn’t need your bundler/testing libraries/type definitions/compilers/etc. installed, so they wouldn’t install your devDependencies.
For an application, if you’re building it on the server, you’re probably doing it wrong, but in that case you would want to install devDependencies. If you’ve built it locally or in a pipeline and you’re running it on the server from your artifact, then you probably don’t need devDependencies, as those are only useful during dev and build.
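As a rough sketch of the application case (the script names and paths here are just placeholders for whatever your project actually uses):

```sh
# Build/test stage: needs the full install, devDependencies included
npm ci
npm run build     # bundle/compile into dist/ (or wherever your build writes)
npm test

# Artifact you actually run: production dependencies only
npm ci --omit=dev
# ...then copy dist/ plus this pruned node_modules into the image/artifact
```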
Good points. I never build libraries, only websites, so it didn't really occur to me that the dependency types were mostly intended for that use case.
I use a pipeline to build, and right now there's one stage that just installs everything, then I have separate build and test jobs. The two main issues I'm trying to fix are that npm takes ages to install dependencies (even with npm ci), and that I'm subject to security scans and don't want to be held up because of a vulnerability in my testing tools.
I think it makes more sense if you think about backend applications: if you write a web server with ExpressJS in TypeScript, you need TypeScript only to compile it (dev dependency), but once compiled, you only need ExpressJS in your node_modules for the app to be able to run ("regular" dependency).
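A minimal sketch of that split (assuming a normal tsconfig, and dist/server.js is just a made-up entry point):

```sh
# Runtime dependency: has to be in node_modules on the server
npm install express

# Dev dependencies: only needed to compile and type-check
npm install --save-dev typescript @types/express

# Build with the full install, then swap to a production-only node_modules
npm ci
npx tsc                  # emits compiled JS, e.g. into dist/
npm ci --omit=dev        # node_modules now has express, but no typescript
node dist/server.js
```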
Frontend development is a bit strange in that respect, because often everything gets bundled into your dist/ directory, so technically there are no runtime dependencies? In that case it's more of a hint to let you know "this goes into the bundle" vs. "this is part of the compiler toolchain".
Dependencies should be what you need to run it in production, so most frontends really should only declare dev dependencies.
Take jQuery for example: all dev dependencies. They ship a bundle, so you don't care about any of the libraries and tooling they use to build it.
If your frontend ever becomes a dependency of a bigger thing, or gets imported by the server as the dependency that provides your UI, the consumer also doesn't need to download all of the runtime dependencies you bundled together; they really only need the bundle.
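Put differently, for a typical bundled frontend the whole toolchain can sit in devDependencies and deployment never touches npm at all (the script name is just whatever your bundler is wired up as):

```sh
# Everything (framework, bundler, loaders, test tools) lives in devDependencies
npm ci
npm run build        # bundler writes self-contained static files to dist/

# Deploying is just shipping dist/ to a static host or web server image;
# no npm install, no node_modules on the server
```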
For frontend I use the dev dependencies for testing tools, e.g. Cypress. That way my build pipeline doesn't need to download a bunch of things that won't be used.
You only need to worry about devDependencies vs dependencies if you are going to publish the project you are working on as an npm package. If you are making a webapp or something else that you will run, then it doesn't matter.
This isn't exactly the case but yes, I would prefer to keep the dependency list as small as possible, mainly because I'm subject to security scans and I don't want things to get held up because there's a vulnerability in my linter.
Exactly. If nobody ever runs npm install on your package, don't worry about it. (Like, literally, you can put half your dependencies in dependencies and half in devDependencies and it will be fine.)
If somebody does, then every dependency that person doesn't actually need goes into your devDependencies.
Are you using Node.js on the backend, or are you just using JS for the frontend?
In frontend JS, everything should be a dev dependency. At runtime, you only use the JS bundles generated by your bundler, which is a build-time process. That means you don't actually directly use Node modules at runtime, just the bundled versions, so there should not be any non-dev dependencies.
For backend JS, anything only used in dev (unit testing, mocking, bundlers, etc) should be a dev dependency, and stuff you need at runtime (like Express, etc) should be a regular dependency.
I don't think it's accurate to say that for frontend, everything should be a devDependency. It's more a matter of personal taste what goes where. I've had good experiences with using devDependencies for anything that doesn't end up in the bundle, and everything else as a normal dependency. That seems more useful than having everything in one category.
Sure, that's fine. I think I didn't word my comment well. What I meant to say was that you only ever do dev installs for frontend apps, so there's no difference between dev dependencies and regular dependencies. You can split things however you want.
Deployment is part of dev, so no, we don't typically use --omit=dev or --only=prod in the deployment pipeline.
It is usual (and actually pretty important) to have --omit=dev or --only=prod in the test pipeline.
This often means we do need separate build environments in CI/CD for test and for deployment.
Specifically, what is important is that some part of the CI/CD process will fail if a production dependency is missing. Normally, that's the unit test run, supported by a --omit=dev or --only=prod flag to be sure that no upstream dev dependency causes a false success.
This outcome can be achieved in various other ways, but is usually achieved by separating the test and deployment environments.
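One concrete way to get that failure signal (just a sketch, not the only way; here I'm assuming the tests run on Node's built-in node --test runner (Node 18+) so they don't need a dev-installed framework, and test/ is a made-up path):

```sh
# Verification job: production dependencies only
npm ci --omit=dev

# Run the suite against the built output. A runtime package that was
# accidentally listed under devDependencies fails to resolve here,
# instead of blowing up in production.
node --test test/
```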
Edit: to cloud matters further, there are plenty of cases where not bothering with --omit=dev or --only=prod during the test run is fine. So you will still see healthy, well-run projects that do not separate test from deploy, and that's not necessarily a mistake.
But the best practice is a best practice for good reasons, so naturally we stick with it when unsure.