npm / npm

This repository is moving to: https://github.com/npm/cli

Home Page: http://npm.community


Add "globalDependencies" option in package.json for installing global dependencies.

martinheidegger opened this issue · comments

I found myself more than once now in the situation that I needed a module I depended on globally. I usually solved it by creating a separate script that would additionally call npm install -g. However, this becomes unreadable and at some point just messy to handle, so I thought to formally ask: would it be possible to get a globalDependencies property that acts pretty much like the dependencies property but installs the referenced dependencies globally?

commented

What exactly is the use case for this?
Binaries that would otherwise be globally installed can be found in node_modules/.bin for non-global modules.
Also see the npm FAQ: Why can't npm just put everything in one place, like other package managers?
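The node_modules/.bin mechanism is easy to sketch in plain shell (the demo directory and the mytool binary below are made up for illustration): npm run prepends ./node_modules/.bin to PATH, which is why locally installed binaries work inside npm scripts without any global install.

```shell
# Sketch: emulate how `npm run` prepends ./node_modules/.bin to PATH
# so locally installed binaries resolve without a global install.
mkdir -p demo/node_modules/.bin
printf '#!/bin/sh\necho hello-from-local-bin\n' > demo/node_modules/.bin/mytool
chmod +x demo/node_modules/.bin/mytool
cd demo
PATH="$PWD/node_modules/.bin:$PATH"
mytool   # prints: hello-from-local-bin
```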

My use case is coffee-script (and other command-line tools): I want them to be installed so that users of my package can continue working with it. As it is now, users need to read a readme and set up coffee-script themselves before they can run the unit tests, deploy tasks, or whatever, rather than having all those global dependencies installed with a simple "(sudo) npm install". That would make the work within our team a lot easier (and less frustrating).

commented

Seems like a clear-cut devDependencies case, since that's what they are - development dependencies.

Actually, it seems to me like a case where your prepublish script should be compiling the CoffeeScript code so that users of your package don't need to either have CoffeeScript installed, or compile your package themselves. If someone wants to develop your package (i.e. work with the source), I think it's fair to require them to install CoffeeScript - and they probably already have it, if they're a CoffeeScript developer in the first place.
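As a sketch of that suggestion (the package name, paths, and version ranges below are illustrative, not from the thread), the prepublish approach might look like:

```json
{
  "name": "my-coffee-package",
  "scripts": {
    "prepublish": "coffee -c -o lib/ src/"
  },
  "devDependencies": {
    "coffee-script": "~1.6.0"
  }
}
```

Consumers then receive the compiled lib/ output, and only developers of the package need coffee-script, pulled in locally via devDependencies rather than globally.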

Yeah, we're never going to do this.

What's the recommended way to lock versions for global packages (like bower and grunt) in source control?

They should not be global; you should locally depend on them in your package. For e.g. grunt see this example: https://github.com/domenic/client-side-packages-demo/blob/with-grunt-etc/package.json#L7-L29

Gotcha, thanks.

Note: I'm happy doing this to get the benefits of versioning, but this is officially unsupported by Grunt CLI and gives warnings:

https://npmjs.org/package/grunt-cli

commented

I was looking for that kind of solution myself. I coded something for my project and finally extracted the code into an npm package - you can use it: npm install package-script

What's the officially recommended way to ensure you have something installed globally from package.json?
preinstall?

http://stackoverflow.com/questions/14657170/installing-global-npm-dependencies-via-package-json

package.json isn't meant to be used for global dependencies.

On Monday, December 9, 2013, Kevin Suttle wrote:

What's the officially recommended way to ensure you have something
installed globally from package.json?
preinstall?

http://stackoverflow.com/questions/14657170/installing-global-npm-dependencies-via-package-json



Right, I realize that, but sometimes you want to check for things like nodemon. Just wondering if it should be handled in package.json's preinstall or grunt. Probably grunt I guess.

http://stackoverflow.com/a/14657796

There's no blessed official way (and this is arguably a bad idea anyway), so you should do what works best for your situation/environment.

Actually, I will need a mechanism just like this for https://github.com/mcrio/bashsh. The use case is interpreted interpreters:

If you have an interpreter in an already-interpreted language (in the Bashsh case it is Bash, but the same applies to JavaScript / CoffeeScript), then a Unix environment will not give you a clean way to do a relative shebang:

#!/usr/bin/env ./node_modules/.bin/my-interpreter

tl;dr: THIS DOES NOT WORK

The relative path will be interpreted as being relative from the caller's working directory (pwd), instead of relative to the script's own location.
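This cwd-relative behavior is easy to demonstrate (all paths and the my-interpreter name below are made up for illustration):

```shell
# A relative interpreter path in a shebang is resolved against the
# caller's working directory, not the script's own location.
mkdir -p proj/bin other
printf '#!/bin/sh\necho interpreted: "$1"\n' > proj/bin/my-interpreter
chmod +x proj/bin/my-interpreter
printf '#!./bin/my-interpreter\n' > proj/script
chmod +x proj/script
( cd proj && ./script )              # works: ./bin/my-interpreter exists relative to cwd
( cd other && ../proj/script ) || echo "bad interpreter"  # fails: resolved relative to other/
```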

#!/usr/bin/env my-interpreter

works

Thus, if I ever have a package that depends on Bashsh, its installation needs to ensure that it is present via npm install -g bashsh somehow.

Of course if you go down that path, you'd have to ensure that whatever interprets my-interpreter (bash, coffee, node) is also installed globally:

#!/usr/bin/env coffee or bash or node

# This is `my-interpreter`.
# `coffee` or `bash` or `node` above must also be absolute (and installed of course).

The good thing here (in my use case) is that it is safe to not have any global installation mechanism. Sure, it is frustrating for the user to separately have to run npm install -g bashsh, but at least if the step gets forgotten, the invalid interpreter will cause the script to fail fast:

$ echo '#!/usr/bin/env invalid-interpreter' > test-script
$ chmod +x test-script
$ ./test-script
env: invalid-interpreter: No such file or directory

Btw in case you wonder: I start providing bashsh as an npm package first, as:

  • it is the package manager I'm most familiar with
  • most scripts that I have already written with its predecessor simplify my daily git and npm workflow considerably (and transparently)
commented

Can anyone point me to a definitive answer about why the node community prefers modules being in the local directory by default?

And skip the hand-wavy "disk space is cheap" or "yeah this is never going to happen" stuff.

What is a serious reasoned answer for this? I honestly don't understand

From the technical point of view, that's just how node.js works.

The reason it works that way is that software packages change, not always in a compatible way: one project might depend on an older version, while another project might depend on a newer version. If you tried to use both of those projects together, it would require some kind of hackery to make them both work if the two package versions were incompatible.
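A minimal sketch of why nesting sidesteps that conflict (the package name dep and the versions are made up): each project carries its own copy under its own node_modules, so incompatible versions coexist.

```shell
# Two projects pin incompatible versions of the same package; each
# resolves its own nested copy under its own node_modules.
mkdir -p app-a/node_modules/dep app-b/node_modules/dep
echo '{ "name": "dep", "version": "1.0.0" }' > app-a/node_modules/dep/package.json
echo '{ "name": "dep", "version": "2.0.0" }' > app-b/node_modules/dep/package.json
grep '"version"' app-a/node_modules/dep/package.json   # 1.0.0
grep '"version"' app-b/node_modules/dep/package.json   # 2.0.0
```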

commented

Still, I don't see any problem, since in the package.json the version can be specified for each project, and all the required versions can be stored globally - pretty much like Maven handles its packages.

commented

If you want them deduplicated, use a filesystem with block-level dedupe or something. It's not really worth complicating node for.
One of the huge conveniences this setup gives you is that you can just fiddle with the files in node_modules while debugging or whatever and you don't have to worry about affecting anything else.

If you want them deduplicated, use a filesystem with block-level dedupe or something.

Filesystems with block-level dedupe work well when you have a big file copied somewhere else. For example, VM images.

Node.js packages have a lot of small files, and deduping almost doesn't help with them. FS still needs a place to store references to all of those files. That's what symlinks are for.
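The symlink idea can be sketched as a content-addressed store shared by several projects (the layout below is illustrative; it is roughly the approach pnpm later built on):

```shell
# One physical copy of a package lives in a store; each project links
# to it, so many small files are stored once instead of per-project.
mkdir -p store/dep@1.0.0 app1/node_modules app2/node_modules
echo 'module.exports = 42' > store/dep@1.0.0/index.js
ln -s "$PWD/store/dep@1.0.0" app1/node_modules/dep
ln -s "$PWD/store/dep@1.0.0" app2/node_modules/dep
cat app1/node_modules/dep/index.js   # same underlying file as app2's copy
```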

From a "newbie" perspective...

I'm not a new developer, just new to NodeJS.

I have 4 projects I’m building on my dev server to help me learn NodeJS.

Prior to starting these projects, I had preinstalled (via npm -g) all the module dependencies, because it just makes sense to me to centralize and standardize all my generic dependencies like Express, Jade, Bower... These are not products that will be changed by me - ever - only updated from time to time.
Each project has only between 5 and 10 files and represents less than 1 MB per project, but because I need the node_modules in the project folders (even symlinks duplicate a lot), the projects each have their own sub-folder structure of node_modules.
This makes the total number of files for all 4 projects 4,316 files and about 400MB.

I think it's an issue when a few little projects take up hundreds of times more disk space than the actual project files - just to duplicate the same files over and over.

It makes sense to allow development dependencies to be installed globally (Bower, for example). If you were to install these locally within a package, you'd be unable to use them from the command line as they're not in the PATH. This means that whilst my package depends on Bower, I can't list it as a dependency to be installed with npm install as it must be installed globally to be used via command line out of the box.

Docker Development Use Case

Because npm install always puts the node_modules dir in a subdir under package.json, you now have a directory containing non-source files in a directory containing (at least one) source file (package.json).

When developing using Docker, you'd (I'd ;-) like to keep the source tree completely separate from the non-source tree (generated files, downloaded/cached files). If the source tree were completely separate from the non-source tree then I could COPY my source tree into the Docker image and RUN npm install to cache the packages in the image (outside the source tree). Later I could run the image: docker run --volume $PWD:/usr/src/app ... to have the container mount my source files from the native filesystem.

This would let me benefit from cached NPM modules and would also let my Docker container see the latest code changes. Tools like gulp-watch could run inside the container and efficiently add packages and recompile sources as needed in development.

An easy solution might be to allow a -g option directly on npm install e.g. npm install -g could mean: install every dependency in package.json globally. That would be ideal for Docker development because a container is a microcosm. No need for the local/global distinction inside there.

@Bill There is a set of options for how a package system can look up & store dependencies.

Store all dependencies at one place, using the same folder. i.e. /var/lib/npm

This has the problem that two separate applications started on the same machine basically have access to the same executable, which in turn has access to the same folder system. A call like fs.readFile(path.join(__dirname, 'tmp.txt')) could open potential security leaks.

Specify a relative sub-folder to lookup in a file (let's call it package.json)

That way you would trade a common way of doing things for a confusing one: every project could look different.

Specify a relative sub-folder to lookup in a file and allow ..

If you allow .. as an option for folders, you could run into a conflict if you install two packages in the same parent folder. Also, you would be limited in the places where you can install those packages; i.e. / would be out.

Specify a different path to lookup packages globally:

I.e. there could be an npm config setting for where to look up and install packages. The problem with this is that it's global and would be applied to every npm install command. If that folder is a relative child folder, i.e. ./libs instead of ./node_modules, then you might run into the issue that some package has its source code in that folder and would accidentally break. If you had ../node_modules instead of ./node_modules, then you would again run into the "can not be installed in /" sort of issue, and if you allow an absolute path like /var/lib/npm, then you would again run into the security issue.

Specify it as a strictly defined subfolder for every system

That is the status quo. And there is really no downside to it except the one you just wrote about. The fix for this would be simple though, think of following structure:

- package.json
- node_modules
- server

A Dockerfile could look like this:

FROM my_nice_image

RUN mkdir /code
COPY ./package.json /code/package.json
RUN cd /code; npm i --production
COPY ./server /code/server

WORKDIR /code
CMD npm start

Got really sidetracked when I ran into the same problem as @Bill on my Docker system. There's no need for a local/global distinction as I'm isolating the entire app with Docker anyway. I'd also like to be able to mount my source code during development, and similarly would like it if there were an npm install -g that just installed all my dependencies globally.

@martinheidegger your proposed solution doesn't work as it doesn't fully support the development case. In my case, I'd want to mount all my source code, including the package.json, so that inside the container I could run something like npm install --save <package> and my host's package.json would contain the package version (as it's mounted).

With your proposed solution, it's not possible to do this, as if I were to mount all my source code to /code it would overwrite /code/node_modules as my host has no node_modules directory. If I tried doing an npm install after mounting, a node_modules directory would appear on my host, which is not what @Bill or I want.

Currently, my solution is to mount all of my subdirectories separately, instead of mounting the parent directory (which includes package.json), and then individually mounting package.json as well as any other files needed - example of my compose file below. In this way, node_modules remains only on the container, but this solution doesn't really scale as I add more directories and files...

frontend:
  build: .
  volumes:
    - ./css:/code/css
    - ./fonts:/code/fonts
    - ./images:/code/images
    - ./js:/code/js
    - ./package.json:/code/package.json
    - ./webpack.config.js:/code/webpack.config.js
  command: npm run watch

@agilgur5 I should probably mention I opened this issue three years ago and that I am feeling okay that it is closed ;)

@agilgur5 Did you try to "link" the node_modules folder to a system path before running npm install?

RUN mkdir /node_modules
RUN ln -s /node_modules /code/node_modules

Just wanted to update this for anyone who comes by as I did find a great solution after 3 months of trying different things: here's my StackOverflow answer

Basically, it involves creating a volume for the node_modules folder, but not having that mounted to the source directory, so my Compose file looks like this:

volumes:
  - ./:/code
  - /code/node_modules

@martinheidegger I realize you're okay with it, but other people, like me and the ones before me, comment on old closed issues all the time when there are still problems. I had tried getting around this with mounting files separately (causes package.json EBUSY), symlinking node_modules and just mounting the whole directory (causes cross-OS symlink problems and an infinite loop in npm when you disable symlinks), and then changing NODE_PATH (not quite what that's for), to no avail; having a global install would have made all of that entirely unnecessary. The advent of containers means that global installs should be the default (no separate folders needed when they're isolated), or at the very least always be an option, hence my comment and frustration on this issue. Thank you for offering a fix; it didn't work, as I stated on SO, but it got me closer to the solution!

At least now I have a simple workaround to this for all my node apps (would be better if a workaround wasn't always necessary though...) and hopefully other people can use it too!

Correct me if I am wrong but does this fix your problem

{
    "name": "Meh",
    "description": "Some Description",
    "scripts": {
        "preinstall": "npm list someDependency -g || npm install someDependency -g"
    }
}

Should check if someDependency is installed globally and if not then install it globally

edit: for multiple dependency support use:
"preinstall": "(npm list someDependency -g || npm install someDependency -g) && (npm list otherDependency -g || npm install otherDependency -g)"

@RedSparr0w how about requiring a specific version?

I am writing a yeoman-generator. This generator requires a minimum yo-cli (yo) version, since only then can I use some new features like promises for this.prompt.

I have a similar need. The "typescript" module is installed globally, where it makes the most sense to install it since it is general and used for all my projects. If I npm install "tslint" locally, then tslint breaks with an error. Therefore I need a way to require that "tslint" be installed at the same global/local scope as typescript.

@davidreher
For a specific version you could use

{
    "name": "Meh",
    "description": "Some Description",
    "scripts": {
        "preinstall": "npm list someDependency@1.4.5 -g || npm install someDependency@1.4.5 -g"
    }
}

@mmc41
something like this should work? Maybe (can't test as I don't really have a project it would work for, but let me know!)

{
    "name": "Meh",
    "description": "Some Description",
    "scripts": {
        "preinstall": "(npm list someDependency && npm install otherDependency) || (npm list someDependency -g || npm install someDependency -g) && npm install otherDependency -g"
    }
}

This checks if someDependency is installed locally; if so, it installs otherDependency locally.
Otherwise it checks if someDependency is installed globally (installing it globally if not), and either way installs otherDependency globally, because someDependency is not installed locally.

or do it the other way around check if installed globally first then if not install both locally.

@RedSparr0w It gives me a false positive 😭 (e.g. for rimraf). Any other workaround?

[screenshot from 2016-10-29: npm list reporting rimraf as installed globally when it is not]

@dwiyatci Try using rimraf -h || echo "pkg not installed globally"
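Checking for the binary itself rather than relying on npm list avoids that false positive. A sketch of the pattern (ensure_global is a helper name I made up, and it only echoes the install step here; replace the echo with a real npm install -g in practice):

```shell
# ensure_global: hypothetical helper that tests whether a command is on
# PATH; here it only echoes what it would do instead of installing.
ensure_global() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1 already available"
  else
    echo "would run: npm install -g $1"
  fi
}
ensure_global sh                 # sh exists on any POSIX system
ensure_global no-such-tool-xyz   # missing, so it would trigger the install
```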

I have a project I'm syncing over Dropbox, and I had a small problem when running Electron from node_modules. It was randomly breaking on my other systems unless I reinstalled it. I'm assuming something is breaking or not initializing correctly during Dropbox's sync operations, so I decided to install Electron globally on each of my systems to work around the problem.

Here's my new npm start line. When invoked, it tries to run Electron. If that fails, it will install it globally and then start it.

"scripts": {
  "start": "/usr/local/bin/electron .; if [ $? -eq 127 ]; then npm install electron@1.4.15 -g; /usr/local/bin/electron .; fi"
},

Thankfully, syntax errors in my Electron init script don't exit with code 127, so this works for me. Hope it helps you too.
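The exit-code trick above relies on the shell convention that status 127 means "command not found"; a tiny self-contained check (the binary name is deliberately fake):

```shell
# Exit status 127 distinguishes a missing binary from a script that ran
# and failed, which is what the `npm start` line above keys on.
code=0
no-such-binary-xyz 2>/dev/null || code=$?
if [ "$code" -eq 127 ]; then
  echo "binary missing: would npm install -g here"
fi
```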

Welcome to the age of containers, where the operating system is configured to run only one app.