I have a folder of dynamically loaded asset files that I want to include in my Parcel output directory. How can I include unreferenced static asset files (.json, .jpeg, .txt, etc.) with my parcel build command?
With Parcel v2 there's a different plugin: https://github.com/elwin013/parcel-reporter-static-files-copy
yarn add parcel-reporter-static-files-copy --dev
Then you need to create a .parcelrc file, or add the following to your existing one. (Note: "..." is literal, not something you need to fill in):
{
"extends": ["#parcel/config-default"],
"reporters": ["...", "parcel-reporter-static-files-copy"]
}
Now any files (and sub-directories) in a directory named static will be automatically copied to the build output (typically your dist folder) when you run the regular parcel build.
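By default the plugin looks for a directory named static; its README also describes a staticFiles section in package.json for pointing it somewhere else. A minimal sketch (the folder name public here is just an example; double-check the plugin's docs for the exact options):
"staticFiles": {
  "staticPath": "public"
}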
Note: This answer is for Parcel v1
There is a parcel plugin for that:
https://www.npmjs.com/package/parcel-plugin-static-files-copy
Install it:
yarn add parcel-plugin-static-files-copy --dev
Or
npm install -D parcel-plugin-static-files-copy
Then, in package.json, add:
"staticFiles": {
"staticPath": ["path/to/a/staticFolder"]
}
It should copy your files into the build output folder when Parcel runs.
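For context, the staticFiles key lives at the top level of package.json, alongside scripts, not inside it. A hypothetical minimal package.json might look like this:
{
  "name": "my-app",
  "scripts": {
    "build": "parcel build src/index.html"
  },
  "staticFiles": {
    "staticPath": ["path/to/a/staticFolder"]
  }
}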
Stay safe!
The best way to handle this is to keep control of it yourself; npm already provides the tools needed for the job. In package.json, when you chain commands with &&, the first command runs, and only if it finishes without an error is the second command executed. With &, each command runs in the background independently, regardless of what happens to the other.
In other words:
Use && for sequential execution.
Use & for parallel execution.
For example:
project/
|dist/
|...
|src/
|assets/
|text.txt
|memos.txt
|info.ini
|css/
|style.css
|img/
|a.png
|b.jpg
|c.jpeg
|data.json
|not-to-copy.json
|not-to-copy.conf
|index.js
|index.html
|package.json
If you have a project structure like this, add some scripts to the package.json:
{
...
"source": "src/index.html",
"scripts": {
"clean-dist": "rm -rf dist && mkdir dist",
"copy-img": "cp -vR ./src/img ./dist",
"copy-data": "cp -r src/data.json dist",
"copy-assets": "cp -r src/assets/* dist",
"copy-files": "npm run copy-img & npm run copy-assets & npm run copy-data",
"init": "npm run clean-dist && npm run copy-files",
"start": "npm run init && parcel",
"build": "npm run init && parcel build"
},
...
}
This configuration runs clean-dist and copy-files sequentially. clean-dist deletes the dist directory and recreates it. Then copy-files copies src/img -> dist/img, src/assets/* -> dist/*, and src/data.json -> dist/data.json in parallel. Finally, parcel is executed.
You can edit your package.json scripts to copy the files after the build has executed. This is how I added a .htaccess file to the dist folder:
"build": "rm -rf dist && parcel build src/index.html -d dist --public-url ./ && cp src/.htaccess dist"
Related
I have a project with two folders, client and server, and I also have a package.json at the root of the project to manage starting and installing on the client and server from one place. The problem occurs when I try to install packages from the root using one of the install scripts I have:
"scripts": {
"install-server": "npm install --prefix server",
}
When I run the above script, it not only creates a node_modules folder but also downloads additional files into the server folder, which I think are supposed to go into the node_modules folder.
But the following script works fine
"scripts": {
"install-server": "cd server && npm install",
}
Why is that?
I am trying to set up my project as a monorepo, and for that I have used Lerna. As far as I know, Lerna works as follows.
For Dev
Create lerna.json and add all package and workspace details to package.json and lerna.json.
Run yarn or npm as you usually do.
Any folder inside packages/ is treated as a node module, so you can import it directly.
Running my application in dev works fine, but when I build or transpile my ES6 code with Babel for production, lerna.json doesn't work right. The following problems confuse me about how to use it:
Do I have to publish all my packages to npm to use them in production?
Running lerna bootstrap links the packages, but when I view a package inside node_modules it still contains ES6 code. Because of this, the Node application throws an error on the import statement, which Node doesn't understand unless you use an experimental flag.
Here is an example:
lerna.json
{
"packages": [
"packages/*"
],
"version": "independent",
"npmClient": "yarn",
"useWorkspaces": true
}
packages Directory
packages/
context/
dyna_modules/
www/
npm scripts
"clean": "lerna clean --yes && rimraf node_modules && rm -rf package-lock.json yarn.lock",
"build::js": "babel ./packages --out-dir ./build/packages/ --ignore node_modules",
"build::nonjs": "babel package.json prisma.yml Dockerfile docker-compose.yml .env --out-dir ./build --copy-files && babel ./packages/routes --out-dir ./build/packages/routes --copy-files",
"build": "rm -rf ./build && mkdir ./build && npm run build::js && npm run build::nonjs",
"dev:nodemon": "DEBUG=*,-babel,-babel:*,-express:*,-nodemon:*,-nodemon,-snapdragon:*,-finalhandler,-follow-redirects nodemon -L --exec babel-node --inspect index.js",
"dev": "yarn dev:nodemon --ignore-engines",
"prestart": "npm run build && lerna bootstrap",
"start": "cd ./build && node ./packages/www/index.js",
Problem
Running yarn start always fails with an error about the import statement. On inspection, I found that all my packages/modules inside node_modules contain ES6 syntax instead of transpiled code.
I am trying to structure my project based on Lerna. As per the documentation, everything is set up and works fine when run as a development server. The directory structure is as follows:
index.js
packages/
package1
package2
index.js => contains: import Package1 from 'package1' <--- the Package1 path etc. is managed by Lerna.
package3
DEV & BUILD Command
# Build
"build::js": "babel ./packages --out-dir ./build --ignore node_modules",
"build::nonjs": "babel ./package1 --out-dir ./build/package1 --copy-files",
"build": "rm -rf ./build && mkdir ./build && npm run build::js && npm run build::nonjs"
# Dev
"dev:nodemon": "DEBUG=*,-babel,-babel:*,-express:*,-nodemon:*,-nodemon,-snapdragon:*,-finalhandler,-follow-redirects nodemon -L --exec babel-node --inspect index.js",
"dev": "yarn dev:nodemon",
Now, running the dev command on my project works fine, but when the files are built for production, Babel doesn't change the Package1 path to its absolute location the way it does for a regular module.
...
var _Cache = _interopRequireDefault(require("package1")); // <<---- how to get rid of this problem? Node tries to locate the module, but there is no package1 in node_modules; it's just a regular module inside the packages/ directory. Running it in dev simply works.
I'm using TypeScript in my project and I can successfully watch + compile .ts files and output them to the dist folder.
Here is the scripts part of my package.json:
"start": "npm run build && npm run watch",
"build": "npm run build-ts && npm run tslint",
"test": "cross-env NODE_ENV=test jest --watch",
"watch": "concurrently -k -p \"[{name}]\" -n \"Typescript,Node\" -c \"cyan.bold,green.bold\" \"npm run watch-ts\" \"npm run serve\"",
"serve": "nodemon dist/server.js",
"build-ts": "tsc",
"watch-ts": "tsc -w",
"tslint": "tslint -c tslint.json -p tsconfig.json"
The problem is that I want to use a JS templating engine (Nunjucks), and I need to watch the view files inside the views folder and copy them to the dist folder.
Is there a way to do this using just npm scripts or Node.js?
Or do I need to use other tools like gulp or webpack?
I had the same requirement for a CRUD GraphQL back-end server, but didn't want to use gulp or webpack, to keep things simple.
I see that you use nodemon, like me. According to the docs at https://github.com/remy/nodemon, it can be used to monitor changes to any kind of file, not just the default .js. Moreover, nodemon can run and restart servers other than Node.
The first task is detecting changes to the files you care about: in my case I want to copy the *.gql files in my src/schema folder to the build/schema folder. For that, you can use the ext option for the file extensions and the watch option for the source folders to monitor.
The second task is copying the files. Naturally, you can use the copy command of your host OS; in my case that's the xcopy command of the Windows shell (cp on Unix-like systems). nodemon has an event hook via the events option that can execute a command line when an event occurs; we just need the restart event, which fires when nodemon detects changes.
You can configure this via command-line options, a global config file, or your local package.json. I'll show the last one, using the nodemonConfig section of package.json:
"nodemonConfig": {
"watch": [
"./src/schema",
"./build"
],
"ext": "js,gql",
"events": {
"restart": "xcopy .\\src\\schema\\*.gql .\\build\\schema /Y /O /R /F /I /V /E"
}
}
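nodemon picks up nodemonConfig from package.json automatically, so the only thing left is a script that starts it; the entry file below is just a placeholder for whatever your server's entry point actually is:
"scripts": {
  "dev": "nodemon ./src/index.js"
}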
Ozkr's answer is great; I just want to add what worked for me. I had to change it a bit, as nodemon otherwise ran into an infinite restart loop:
"nodemonConfig": {
"watch": [
"./views",
"./public"
],
"ext": "hjs,js",
"events": {
"restart": "cp -r views dist \n cp -r public dist"
}
}
copy-and-watch does just that:
I use this code to copy html files during development:
"copy_html": "yarn copy-and-watch src/mail_templates/* prod/mail_templates --watch --clean",
Hi, I used npm install jquery to install jQuery for my project, but I find that it is located in node_modules\jquery along with many unwanted files.
I just want to put node_modules\jquery\dist\jquery.min.js into a static\jquery folder.
What is the best and most common way to do this? Copying and pasting manually?
You can use npm to do this. In your package.json, add the following to the scripts key:
...
"scripts": {
"build:jquery": "cp node_modules/jquery/dist/jquery.slim.min.js static/jquery/"
},
...
Then you can run: npm run build:jquery
You can add more build tasks to this section as you need them, such as copying images and minifying scripts and CSS, then chain them together in a single command with npm-run-all:
$ npm install npm-run-all --save-dev
And...
...
"scripts": {
"build:jquery": "cp node_modules/jquery/dist/jquery.slim.min.js static/jquery/",
"build:images": "cp -R src/assets/images/ static/images/",
"build": "npm-run-all -p build:*"
},
...
Then run npm run build
npm is a great build tool and often removes the need for an additional build framework such as Gulp or Grunt. It can also handle file watchers and the like, so things rebuild automatically when they are modified.
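As a sketch of that watching idea, building on the image-copy script above and assuming the onchange package (one watcher option among several; it isn't used anywhere else in this thread), you could rerun a copy task whenever a source file changes:
"scripts": {
  "build:images": "cp -R src/assets/images/ static/images/",
  "watch:images": "onchange \"src/assets/images/**\" -- npm run build:images"
}
Install it with npm install -D onchange; everything after the -- is the command onchange runs on each change.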