Gulp Copy content from one file to another file - javascript

I am trying to copy the content from one file to another. I am using the following code, but it's throwing an error.
gulp
.src('core/core.config.local.tpl.js')
.pipe(gulp.dest(core/core.config.js));
Error: EEXIST: file already exists, mkdir 'C:\Users\krish\Documents\test\app\core\core.config.js'
at Error (native)
Is there any other process that I could use to copy content?

gulp.dest expects a directory as an argument. All files are written to this destination directory. If the directory doesn't exist yet, gulp tries to create it.
In your case gulp.dest tries to create a directory core/core.config.js, but fails since a regular file with the same name already exists.
If your goal is to have the regular file at core/core.config.js be overwritten with the contents of core/core.config.local.tpl.js on every build, you can do it like this:
var gulp = require('gulp');
var rename = require('gulp-rename');

gulp.task('default', function() {
  gulp.src('core/core.config.local.tpl.js')
    .pipe(rename({ basename: 'core.config' }))
    .pipe(gulp.dest('core'));
});

I think you were only missing the quotes when entering the question here, right?
gulp.dest() expects a folder into which all the files in the stream are copied, so you cannot use a single file name here (as the error says: "mkdir failed"). See: https://github.com/gulpjs/gulp/blob/master/docs/API.md#gulpdestpath-options
When you call gulp.src() you can set a base path from which the relative path is built, so gulp can copy your single file to the given output folder. See: https://github.com/gulpjs/gulp/blob/master/docs/API.md#optionsbase
As you also want to rename the file, you need something like gulp-rename: https://www.npmjs.com/package/gulp-rename
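Putting those pieces together, here is a minimal sketch (the task name 'copy-config' is just a placeholder, and the explicit base is only there to illustrate the option):

var gulp = require('gulp');
var rename = require('gulp-rename');

gulp.task('copy-config', function() {
  // base controls which part of the source path is preserved under dest
  return gulp.src('core/core.config.local.tpl.js', { base: 'core' })
    .pipe(rename('core.config.js'))   // new file name, relative to base
    .pipe(gulp.dest('core'));         // dest is always a folder
});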

Related

require.resolve not finding file even though fs can

I'm having a problem with Node.js's require.resolve: its behaviour makes no sense given the fs check I run first.
So, my code:
let checkPluginCanLoad = (pluginName, pluginFile) => {
  return new Promise((res, rej) => {
    // build the plugin's class file path
    console.log(`looking for file at ${path}/${pluginName}/${pluginFile}`);
    let req = `${path}/${pluginName}/${pluginFile}`;
    // check it exists (blocking)
    let fileState = fs.existsSync(`${req}.js`);
    if (fileState) {
      console.log(`File ${req}.js ${fileState ? "exists." : "does not exist."}`);
      // it exists, try to load it
      let plugin = require.resolve(`${req}`);
      res(plugin);
    } else {
      // could not find or load the plugin
      rej(`Plugin is invalid can't find or load class for ${pluginFile}`);
    }
  });
};
The vars are set to path = "Plugins", pluginName = "TestPlugin", pluginFile = "TestPlugin".
My output is:
Plugins from 'Plugins' have been loaded. index.js:36
looking for file at Plugins/TestPlugin/TestPlugin Plugins.js:133
File Plugins/TestPlugin/TestPlugin.js exists. Plugins.js:138
failed to load plugin 'TestPlugin' with error 'Error: Cannot find module 'Plugins/TestPlugin/TestPlugin''
The final line, "failed to load plugin 'TestPlugin' ...", comes from the system above this one catching the rejection.
So the file exists according to fs, but the resolver can't find it.
I have tried prepending the path with ../ in case it resolves relative to the file that is running it rather than the application directory,
and I have also tried prepending it with ./ in case it resolves from the application directory.
You need to add ./ to tell Node that you are targeting a local file:
let plugin = require.resolve(`./${req}`);
If you forget it, Node will search in the steps described here.
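An alternative sketch (my own addition, not part of the original answer): build an absolute path so the lookup no longer depends on the current working directory. The Plugins/TestPlugin names are taken from the question and assumed to sit next to the resolving file.

const path = require('path');

// resolve against this file's own directory instead of a bare relative path
const absolute = path.join(__dirname, 'Plugins', 'TestPlugin', 'TestPlugin');
const plugin = require.resolve(absolute);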
So this is a bug caused by symlinks on Windows with junctioned directories.
My dev environment junctions the directories from my Git repository directory, but Node runs the code through the junction directory while seeing the original directory.
E.g. the Git repo is in D:\Projects\NodeAppRepo
and the dev environment is running inside D:\Project\NodeApp\
I have index.js, Core/ and node_modules/ all running via a junction inside NodeApp that points to the paths in the git repo.
So I'm running node index.js inside D:\Project\NodeApp\, which in turn loads D:\Project\NodeApp\Core\Core.js; this then imports ./Plugins.js, and this is where the break happens, because the path Node has for Plugins.js is not D:\Project\NodeApp\Core\Plugins.js, it is D:\Projects\NodeAppRepo\Core\Plugin.js.
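One way to confirm this kind of mismatch (a small sketch of my own, not from the original answer) is to compare the paths Node reports with their resolved real paths; if they differ, the junction is being resolved to the repository location:

const fs = require('fs');

console.log('cwd       :', process.cwd());
console.log('real cwd  :', fs.realpathSync(process.cwd()));
console.log('__dirname :', __dirname);
console.log('real dir  :', fs.realpathSync(__dirname));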

aws-lambda Cannot find module

I keep getting this error in the AWS Lambda console when uploading code from a zip file. I have tried uploading other zip files and they work correctly. The .js file is named "CreateThumbnail.js" in the zip file. I believe the handler is also named properly: "CreateThumbnail.handler". The node_modules subdirectory is also set up. Anyone have any idea?
{
  "errorMessage": "Cannot find module 'CreateThumbnail'",
  "errorType": "Error",
  "stackTrace": [
    "Function.Module._resolveFilename (module.js:338:15)",
    "Function.Module._load (module.js:280:25)",
    "Module.require (module.js:364:17)",
    "require (module.js:380:17)"
  ]
}
The way I was able to get this to work was:
Name the file exports.js
Name the handler, within the file, exports.handler
Set the handler in the lambda config to exports.handler
Zip up only the contents of the folder, not the folder itself (as mentioned above) and rename the zip file exports.zip
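For reference, a minimal exports.js matching that configuration might look like the sketch below (the body is just a placeholder):

// exports.js: the Lambda handler setting "exports.handler" maps to
// <file name>.<exported function>, i.e. exports.js + exports.handler
exports.handler = function(event, context, callback) {
  console.log('event:', JSON.stringify(event));
  callback(null, 'done');
};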
OK, I did this myself. Just make sure that you build the zip so that the .js file doesn't end up inside a folder, because AWS unzips the file you upload and tries to find a .js file matching the handler name you gave; if it's inside a folder, it won't help you.
One possible problem is if you upload the lambda as a zip file created via PowerShell's Compress-Archive. Compress-Archive has a bug which causes AWS to extract the files into a flat tree (no subdirectories), with backslashes in the file names.
This exact error can show up if your zipped file(s) do not have world-wide read permission (chmod -R ugo+r).
Check the file permissions before they are zipped. Unfortunately this is not emphasized enough by AWS, and it has caused a lot of headaches for many.
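If you want to spot such files before zipping, a rough sketch like the one below can help (my own addition; it assumes a POSIX filesystem and only checks the world-read bit):

const fs = require('fs');
const path = require('path');

function findUnreadable(dir) {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      findUnreadable(full);
    } else if ((fs.statSync(full).mode & 0o004) === 0) {
      console.log('missing world-read permission:', full);
    }
  }
}

findUnreadable('node_modules');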
If you are using AWS Lambda layers, you need to check that your directory structure matches the structure required for a layer.
For example, for the moment.js Node.js module you need the following structure:
aws-lambda-layer.zip
│ nodejs
│ nodejs/node_modules
└ nodejs/node_modules/moment
So to create a layer zip file with the correct structure we can use the following command on the root of our project:
mkdir -p nodejs && cp -r node_modules nodejs/ && zip -r aws-lambda-layer.zip nodejs
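Once a layer with that structure is attached to the function, the module can be required as usual. A minimal sketch (moment is just the example module from above; as I understand it, layers are extracted under /opt and /opt/nodejs/node_modules is on Lambda's module search path):

const moment = require('moment'); // provided by the layer, not the function package

exports.handler = async () => {
  return moment().format('YYYY-MM-DD');
};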
Some library files might not have global read permission, so Lambda will not be able to read their content.
Make sure all files in node_modules are readable before packaging:
chmod -R +r node_modules
Then zip and upload.
These are the instructions from https://docs.aws.amazon.com/lambda/latest/dg/nodejs-package.html that I followed, and they work.
To update a Node.js function with dependencies
Open a command line terminal or shell. Ensure that the Node.js version in your local environment matches the Node.js version of your function.
Create a folder for the deployment package. The following steps assume that the folder is named my-function.
Install libraries in the node_modules directory using the npm install command.
npm install the_package_that_is_missing
Create a .zip file that contains the contents of your project folder. Use the -r (recursive) option to ensure that zip compresses the subfolders.
zip -r function.zip .
Upload the package using the update-function-code command.
aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip
Now your function is ready to run!
I had this problem on a custom module I had built that was in the node_modules dir. Everything ran fine in testing on my Win10 machine, but when uploaded I kept getting that same "Cannot find module 'modulename'" error.
It turns out that I had a mismatch; here's the package.json line from the module that couldn't be found:
"main": "./build/modulename.js",
and here's the actual filename:
Modulename.js
Case sensitivity matters: Windows isn't case-sensitive, but Linux (and thus AWS) is.
This is unrelated, but Google brought me here, so:
AWS will give you the error:
Unable to import module '<myfile>': Error
What was really happening for me was that I was requiring a non-existent JS file. The error is a bit misleading.
I ran into this same scenario, solved it by using these specific steps to create a Layer, then hook that up to the Lambda function.
make a new empty directory:
mkdir newdir && cd newdir
install whatever npm things:
npm install --save xyz
make a directory skeleton that matches the expected Lambda structure for Node14 (there's a different structure for Node12, or various other languages; see https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html?icmpid=docs_lambda_help):
mkdir -p nodejs/node14
copy the "node_modules" directory into that newly made directory skeleton:
cp -R node_modules nodejs/node14
zip the whole thing up (name it whatever you want):
zip -r custom-drivers-node14.zip nodejs
from there, go to AWS console, Lambda, then "Layers" and create a new layer. In the dialog, upload your .zip file ("custom-drivers-node14.zip").
finally, edit your Lambda function in AWS console, and add a new Layer – the interface might change, but as of now, this is under the main screen for a single function, then scroll way down to the bottom. Follow the "Add a layer" flow, choose the Layer you made, and then try your code.
One final note, this code structure worked:
const xyz = require('xyz');

exports.handler = async (event) => {
  xyz.doSomething();
};
AWS Lambda uses the name of the file and the name of the handler function, so if you defined your handler like this: exports.myHandler = function(event, context) in a file named index.js, your handler is index.myHandler.
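For example, a minimal illustration of that naming rule (the file and handler names below are only illustrative):

// index.js, deployed with the handler setting "index.myHandler"
exports.myHandler = function(event, context, callback) {
  callback(null, 'hello from index.myHandler');
};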
This turned out to be a simple one for me.
I was getting "cannot create index". In my case, my main lambda file containing exports.handler had to be called index.js.
Try calling your main file CreateThumbnail.js.
The tutorial tells you to include the following items in your zip file:
CreateThumbnail.js
/node_modules/gm
/node_modules/async
What it fails to consider is that there are dependencies of the two packages (gm, async) that also need to be part of the package.
So here's what you have to do:
Change directory to the node_modules folder in your project folder and run 'npm install gm async'. This will install gm, async and all their dependencies in this folder.
Now package the 'CreateThumbnail.js' file and the complete 'node_modules' folder into a zip file and upload it. It should work now.
So your complete package should look something like this:
CreateThumbnail.js
/node_modules/.bin
/node_modules/array-parallel
/node_modules/array-series
/node_modules/async
/node_modules/cross-spawn
/node_modules/debug
/node_modules/gm
/node_modules/isexe
/node_modules/lodash
/node_modules/lru-cache
/node_modules/ms
/node_modules/pseudomap
/node_modules/which
/node_modules/yallist
File Name:
app.js
Lambda Function in "app.js":
exports.handler = function(event, context)...
Lambda Handler on Amazon Console:
app.handler ({app}.js + exports.{handler} = app.handler)
When you unzip the folder, you should see:
app.js
node_modules

Can not understand behavior of gulp.src()

Here is my code:
'use strict';

var gulp = require('gulp'),
    $ = require('gulp-load-plugins')();

module.exports = function(options) {
  gulp.task('test', function () {
    gulp.src('external/bower_components/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss')
      .pipe(gulp.dest('dest/'));
  });
};
This line, gulp.dest('dest/'), will save the file _bootstrap.scss in the 'dest/' folder.
If I change the line
gulp.src('external/bower_components/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss')
to
gulp.src('external/*/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss')
then
gulp.dest('dest/')
will save _bootstrap.scss in the dest/bower_components/bootstrap-sass-official/assets/stylesheets folder.
Can you explain, and give me a link to read about, why in the first case there are no nested folders inside the dest folder, while in the second case there are?
If I understand correctly, in the second case the glob pattern will be converted to an array of full file paths. That means that in my example the node glob module will convert the glob pattern 'external/*/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss' to the array ['external/bower_components/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss']. So why do I have lots of nested folders inside 'dest/' in the second case, but none in the first case?
Well, it is indeed a little weird, because of the very implicit behaviour of node-glob. But in fact it is what you would expect. For example, think of the case where you have multiple folders in external which each contain bootstrap-sass-official/assets/stylesheets/_bootstrap.scss. Then you couldn't save two files with the same name in your dest.
Without having looked into the node-glob source, I think it mitigates this case by chopping off the file's path before the /*/ segment and automatically attaching the remainder to your dest path.
That said, since it is implicit behaviour, you can easily avoid it by assigning the array yourself, gulp.src(['firstfile.js']), if you really need the array or want to add files in the future.
Since the Gulp docs don't offer too much info on globbing patterns, I found these links to be the best help: read up on node-glob, and on Gulp globbing on Smashing Magazine.
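As I understand it, the rule is that the base defaults to everything before the first glob segment, and gulp.dest() re-creates the path relative to that base. A small sketch of both cases, plus an explicit base override (the override value is an assumption for this example, and it only works here because bower_components is the only match):

var gulp = require('gulp');

// Case 1: no glob. The base defaults to the file's own directory, so the
// relative path is just "_bootstrap.scss" and it lands directly in dest/.
gulp.src('external/bower_components/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss')
  .pipe(gulp.dest('dest/'));

// Case 2: glob. The base defaults to "external/", so everything after it
// ("bower_components/bootstrap-sass-official/...") is re-created under dest/.
// An explicit base keeps only the part of the path below it:
gulp.src('external/*/bootstrap-sass-official/assets/stylesheets/_bootstrap.scss',
    { base: 'external/bower_components/bootstrap-sass-official/assets/stylesheets' })
  .pipe(gulp.dest('dest/')); // writes dest/_bootstrap.scss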

Grunt - get current calling folder, and not gruntfile current folder

If I have Grunt installed in some folder /foo, but my current folder is /foo/bar/baz and I run "grunt sometask" from within my current folder, how can I get Grunt (or Node.js, for that matter) to determine my current path? That is to say, how can I programmatically get the folder I was in when I called grunt?
When I use process.cwd(), I get the path of the Gruntfile, i.e. "foo", which is not what I want.
I don't have to do this in Grunt specifically, any nodejs-based solution would work.
According to the source code:
By default, all file paths are relative to the Gruntfile
And, voilà, this line of code shows how Grunt actually changes the current directory to the path of the Gruntfile:
process.chdir(grunt.option('base') || path.dirname(gruntfile));
However, the --base option exists for just that. See the docs: http://gruntjs.com/api/grunt.file
If you don't need to do it from inside the Gruntfile, simply run a script that captures process.cwd() and then execs grunt.
See: https://www.npmjs.com/package/exec
var exec = require('exec');

var cwd = process.cwd(); // will have your current path

exec(['grunt', 'mytask'], function(err, out, code) {
  if (err instanceof Error)
    throw err;
  process.stderr.write(err);
  process.stdout.write(out);
  process.exit(code);
});
On Mac or Linux you can get this with
process.env.PWD
On Windows: unknown.
You can edit grunt-cli to accomplish this. In grunt-cli/bin/grunt:
require(gruntpath).cli({ _originDir: basedir });
Then in the Gruntfile.js you can use:
grunt.option('_originDir')

Javascript: get package.json data in gulpfile.js

Not a gulp-specific question per se, but how would one get info from the package.json file within the gulpfile.js? For instance, I want to get the homepage or the name and use it in a task.
This is not gulp specific.
var p = require('./package.json')
p.homepage
UPDATE:
Be aware that require will cache the read results, meaning you cannot require, write to the file, then require again and expect the results to be updated.
Don't use require('./package.json') in a watch process, as require resolves the module to the result of the first request.
So if you are editing your package.json, those edits won't be picked up until you stop your watch process and restart it.
For a gulp watch process it is best to re-read and re-parse the file each time your task is executed, using Node's fs module:
var fs = require('fs')
var json = JSON.parse(fs.readFileSync('./package.json'))
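For example, a small sketch of a task that picks up package.json edits on every run (the task name and fields are just examples):

var fs = require('fs');
var gulp = require('gulp');

gulp.task('banner', function(done) {
  // re-read and re-parse on every invocation so edits are picked up
  // without restarting the watch process
  var pkg = JSON.parse(fs.readFileSync('./package.json', 'utf8'));
  console.log(pkg.name + ': ' + pkg.homepage);
  done();
});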
This is a good solution, @Mangled Deutz. I myself first did that, but it did not work (back to that in a second), so I then tried this solution:
# Gulpfile.coffee
requireJSON = (file) ->
  fs = require "fs"
  JSON.parse fs.readFileSync file
Now you should see that this is a bit verbose (even though it worked). require('./package.json') is the best solution.
Tip: remember to add './' in front of the file name. I know it's simple, but it is the difference between the require method working and not working.
If you are triggering gulp from npm, e.g. with "npm run build" or similar (this only works when gulp is run via an npm script), you can use
process.env.npm_package_<property>
where deeper object paths are separated by underscores.
So if you want to read some specific config from package.json, for example a config object you have created there:
{
  "scripts": {
    "build": "gulp"
  },
  "config": {
    "isClient": false
  }
}
then you can use
process.env.npm_package_config_isClient
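In the gulpfile that could look like this small sketch (note the value arrives as a string, and the variable is only defined when gulp is started via an npm script):

var isClient = process.env.npm_package_config_isClient === 'true';
console.log('isClient:', isClient);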
