I keep getting this error in the AWS Lambda console when uploading code from a zip file. I have tried uploading other zip files and they work correctly. The .js file in the zip is named "CreateThumbnail.js", and I believe the handler is also named properly: "CreateThumbnail.handler". The node_modules subdirectory is also set up. Anyone have any idea?
{
"errorMessage": "Cannot find module 'CreateThumbnail'",
"errorType": "Error",
"stackTrace": [
"Function.Module._resolveFilename (module.js:338:15)",
"Function.Module._load (module.js:280:25)",
"Module.require (module.js:364:17)",
"require (module.js:380:17)"
]
}
The way I was able to get this to work was:
Name the file exports.js
Name the handler, within the file, exports.handler
Set the handler in the lambda config to exports.handler
Zip up only the contents of the folder, not the folder itself (as mentioned above) and rename the zip file exports.zip
OK, I did this myself. Just make sure you build the zip so that the .js file doesn't end up inside a folder: AWS unzips the file you upload and tries to find a .js file matching the handler name you gave, and if it's inside a folder it won't be found.
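For example, a minimal sketch (assuming the project folder is named CreateThumbnail):
cd CreateThumbnail
zip -r ../CreateThumbnail.zip CreateThumbnail.js node_modules
This puts CreateThumbnail.js at the root of the zip, rather than inside a CreateThumbnail/ folder.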
One possible problem is if you upload the Lambda as a zip file created via PowerShell's Compress-Archive. Compress-Archive has a bug which causes AWS to extract the files into a flat tree (no subdirectories), with backslashes in the filenames.
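One workaround, assuming 7-Zip is installed and on your PATH (a sketch; file names assumed), is to build the archive with a tool that writes forward slashes into the entry names:
7z a -r function.zip CreateThumbnail.js node_modules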
This exact error can show up if your zipped file(s) do not have world-readable permissions (chmod -R ugo+r).
Check the file permissions before they are zipped. Unfortunately, this is not emphasized enough by AWS, and it has caused a lot of headaches for many.
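A minimal sketch of that fix, run from the project folder before packaging (zip name assumed):
chmod -R ugo+r .
zip -r function.zip .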
If you are using AWS Lambda Layers, you need to validate that your directory structure matches the structure required for a layer:
For example for the moment.js node.js module you need the following structure:
aws-lambda-layer.zip
│ nodejs
│ nodejs/node_modules
└ nodejs/node_modules/moment
So to create a layer zip file with the correct structure we can use the following command on the root of our project:
mkdir -p nodejs && cp -r node_modules nodejs/ && zip -r aws-lambda-layer.zip nodejs
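If you prefer the CLI over the console, the layer can then be published with something like this (the layer name and runtime here are assumptions):
aws lambda publish-layer-version --layer-name moment-layer --zip-file fileb://aws-lambda-layer.zip --compatible-runtimes nodejs14.x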
Some library files might not have global read permission, so Lambda will not be able to read their content.
Make sure all files in node_modules are readable before packaging:
chmod -R +r node_modules
Then zip and upload.
These are the instructions from https://docs.aws.amazon.com/lambda/latest/dg/nodejs-package.html that I followed, and they work.
To update a Node.js function with dependencies
Open a command line terminal or shell. Ensure that the Node.js version in your local environment matches the Node.js version of your function.
Create a folder for the deployment package. The following steps assume that the folder is named my-function.
Install libraries in the node_modules directory using the npm install command.
npm install the_package_that_is_missing
Create a .zip file that contains the contents of your project folder. Use the -r (recursive) option to ensure that zip compresses the subfolders.
zip -r function.zip .
Upload the package using the update-function-code command.
aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip
Now your function is ready to run!
I had this problem on a custom module I had built that was in the node_modules dir. Everything ran fine in testing on my Win10 machine, but when uploaded I kept getting that same "Cannot find module 'modulename'" error.
It turns out that I had a mismatch; here's the package.json line from the module that couldn't be found:
"main": "./build/modulename.js",
and here's the actual filename:
Modulename.js
File names are case-sensitive on Linux (and thus on AWS); Windows isn't.
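A minimal sketch of the fix (paths assumed): make the package.json "main" entry match the on-disk file name exactly, including case.
"main": "./build/Modulename.js"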
This is unrelated but google brought me here, so:
AWS will give you an error:
Unable to import module '<myfile>': Error
What was really happening for me was that my code was requiring a non-existent JS file. The error is a bit misleading.
I ran into this same scenario, solved it by using these specific steps to create a Layer, then hook that up to the Lambda function.
make a new empty directory:
mkdir newdir && cd newdir
install whatever npm things:
npm install --save xyz
make a directory skeleton that matches the expected Lambda structure for Node14 (there's a different structure for Node12, or various other languages; see https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html?icmpid=docs_lambda_help):
mkdir -p nodejs/node14
copy the "node_modules" directory into that newly made directory skeleton:
cp -R node_modules nodejs/node14
zip the whole thing up (name it whatever you want):
zip -r custom-drivers-node14.zip nodejs
from there, go to AWS console, Lambda, then "Layers" and create a new layer. In the dialog, upload your .zip file ("custom-drivers-node14.zip").
finally, edit your Lambda function in AWS console, and add a new Layer – the interface might change, but as of now, this is under the main screen for a single function, then scroll way down to the bottom. Follow the "Add a layer" flow, choose the Layer you made, and then try your code.
One final note, this code structure worked:
const xyz = require('xyz');

exports.handler = async (event) => {
  xyz.doSomething();
};
AWS Lambda uses the name of the file and the name of the handler function, so if you defined your handler like this: exports.myHandler = function(event, context) in a file named index.js, your handler is index.myHandler.
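A minimal sketch of that mapping (the names come from the sentence above):
// index.js, configured in Lambda as "index.myHandler"
exports.myHandler = function (event, context) {
  // handler code here
};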
This turned out to be a simple one for me.
I was getting "Cannot find module 'index'": in my case, my main Lambda file (the one containing exports.handler) had to be called index.js.
Try calling your main file CreateThumbnail.js
The tutorial tells you to include the following items in your zip file:
CreateThumbnail.js
/node_modules/gm
/node_modules/async
What it fails to consider is that there are dependencies of the two packages (gm, async) that also need to be part of the package.
So here's what you have to do:
Change directory to the node_modules folder in your project folder and run 'npm install gm async'. This will install gm, async, and all their dependencies in this folder.
Now package the 'CreateThumbnail.js' file and the complete 'node_modules' folder into a zip file and upload it. It should work now.
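A rough command sketch of those steps (zip name assumed; here npm install is run from the project root, which also pulls in the transitive dependencies):
npm install gm async
zip -r CreateThumbnail.zip CreateThumbnail.js node_modules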
So your complete package should look something like this:
CreateThumbnail.js
/node_modules/.bin
/node_modules/array-parallel
/node_modules/array-series
/node_modules/async
/node_modules/cross-spawn
/node_modules/debug
/node_modules/gm
/node_modules/isexe
/node_modules/lodash
/node_modules/lru-cache
/node_modules/ms
/node_modules/pseudomap
/node_modules/which
/node_modules/yallist
File Name:
app.js
Lambda Function in "app.js":
exports.handler = function(event, context)...
Lambda Handler on Amazon Console:
app.handler ({app}.js + exports.{handler} = app.handler)
When you unzip the folder, you should see:
app.js
node_modules
Related
I keep getting this error when I try to publish a site to GitHub pages.
Conversion error: Jekyll::Converters::Scss encountered an error while converting 'assets/css/style.scss':
No such file or directory @ dir_chdir - /github/workspace/docs
When I set the publishing folder to root, it publishes the readme.md file. When I change it to /docs, I get this error.
The file runs fine locally. I don't know what the problem is. I searched online but did not find a solution that helps. Any help works.
The error is on my main branch when I try to publish it.
Links to the repo and the error on GitHub:
https://github.com/Rsmdo/dadport2
https://github.com/Rsmdo/dadport2/tree/main
I tried making a new repo but this did not solve the issue, also I went through the code to see if there were any parsing errors but there were none.
https://talk.jekyllrb.com/t/cannot-deploy-site-via-github/6883/11 says that "Jekyll can’t find the files the theme uses". The page also suggests using root instead of docs or any folder.
There are options to set the directories Jekyll reads files from and writes files to, for example: bundle exec jekyll s -s /docs. In my case this leads to errors because that absolute path does not exist (the path can also be relative, I guess).
See Source: /docs in the output below; the other paths do not reflect the docs path, though.
PS C:\Users\User\usr.github.io> bundle exec jekyll s -s /docs
Configuration file: none
Source: /docs
Destination: C:/Users/User/usr.github.io/_site
Incremental build: disabled. Enable with --incremental
Generating...
Error reading file C:/Users/User/usr.github.io/_layouts/archive.html: No such file or directory @ rb_sysopen - C:/docs/Users/User/usr.github.io/_layouts/archive.html
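Using a relative source path instead may avoid the doubled path (a sketch, run from the site root):
bundle exec jekyll s -s docs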
This may help:
https://jekyll.one/pages/public/manuals/jekyll/user_guide/configuration/
https://github.com/burtlo/jekyll-core/blob/master/site/docs/configuration.md
I am creating a VSCode extension, and following the getting started guide (https://code.visualstudio.com/api/get-started/your-first-extension) have used yeoman scaffold to get started. I created a new file, newModule.js in the same directory and want to import it as a module for use in the main extension.js script. I then do:
const newModule = require('./newModule.js');
This throws an error:
cannot find module 'newModule' require stack: -
This problem disappears if I copy my file to the node_modules folder created by default. I would like to know what is going on here, and what the best way of handling imports is when working with javascript/Node.js/vs-extensions.
I also notice that node_modules folder is not pushed to github by default, why?
The node_modules folder is for storing all the code from the libraries and packages you are using. It is excluded from git because it is a waste of space and a distraction to store it all in your version control, as you can just re-download the packages at any time.
Just put your module in the same /src directory, and use the import syntax to import it, instead of require.
import newModule from './newModule';
For example, see how it is done in this sample code.
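A minimal sketch of the module side (names assumed, and assuming the project is set up for ES module syntax):
// newModule.js
export default function greet() {
  return 'hello from newModule';
}

// extension.js
import greet from './newModule';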
Instead of
const newModule = require('./newModule.js');
try this:
import newModule from './newModule';
// ^^ Do not use the file extension
Also make sure that the file you are importing is in the same directory.
I have a Webpack-templated Vue project, initiated through vue-cli.
I have created a simple 'vue.config.js' file stored in the root folder (where package.json is at) containing the following:
// vue.config.js
module.exports = {
productionSourceMap: false
}
However, when building the project using "npm run build", it is ignored.
I have tried different configurations to check if the problem is with the file or the setting, and the problem is with the file.
I am using webpack#3.12.0, vue#2.6.11, #vue/cli 4.2.3 and npm#6.9.0.
Make sure your build configuration (in your case the webpack build config) includes your file.
Generally, you will have a source folder (often src) and the builder will build all the files in that dir only. Then you have your destination directory (often dist or build) where your build files will be stored.
Two solutions:
add your config file to the build source.
move your vue.config.js file into your source directory
For some reason, I did not manage to get vue.config.js to work.
Alternatively, I edited my webpack config, which, as my build files indicated, was located at /config/index.js.
Then I passed my build configuration to the build parameter that already appears in that file.
build: {
...
}
And it worked. I assume it may be because I used npm run dev instead of vue-cli-service, so webpack did not go through the vue.config.js file.
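For reference, a rough sketch of where that setting ended up in /config/index.js (the surrounding keys are assumed from the default webpack template):
// config/index.js
module.exports = {
  build: {
    // ...existing build settings...
    productionSourceMap: false
  }
}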
I have an installation file created by electron-builder and I can't find a way to run a PowerShell script after installation.
The idea is to make some changes in the Windows registry and set permissions for the application folder.
As far as I understand, it should be configured in the build section of package.json. In the API I found that an afterPack method exists, but I can't figure out how to execute a PowerShell file through it.
Thank you.
First, create an afterPack.js file, containing this code:
exports.default = async function () {
  const { exec } = require('child_process');
  // spawn PowerShell to run the script
  const bat = exec('powershell "& ""path/to/powershell/file.ps1"""');
};
Edit the path to the .ps1 file accordingly. You may need to add a '.\' to the start of the path. The console readout will tell you if you do.
Next, edit your package.json file to include an afterPack parameter in your build settings, like so:
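A minimal sketch, assuming afterPack.js sits next to package.json:
"build": {
  "afterPack": "./afterPack.js"
}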
Now when you run your build script, it will run your .ps1 file.
I would like to bundle a largish node.js cli application into a single .js file.
My code is structured as follows:
|- main.js
|--/lib
|----| <bunch of js files>
|--/util
|----| <bunch of js files>
...etc
I can use browserify to bundle the whole thing into one file using main.js as the entry point, but Browserify assumes the runtime environment is a browser and substitutes its own libraries (e.g. browserify-http for http). So I'm looking for a browserify-for-node command.
I tried running
$ browserify -r ./main.js:start --no-builtins --no-browser-field > myapp.js
$ echo "require('start')" >> myapp.js
but I'm getting a bunch of errors when I try to run $ node myapp.js.
The idea is that the entire application with all dependencies except the core node dependencies is now in a single source file and can be run using
$ node myapp.js
Update
=============
JMM's answer below works but only on my machine. The bundling still does not capture all dependencies, so when I try to run the file on another machine, I get dependency errors like
ubuntu#ip-172-31-42-188:~$ node myapp.js
fs.js:502
return binding.open(pathModule._makeLong(path), stringToFlags(flags), mode);
^
Error: ENOENT, no such file or directory '/Users/ruchir/dev/xo/client/node_modules/request/node_modules/form-data/node_modules/mime/types/mime.types'
You can use pkg by Zeit and follow the below steps to do so:
npm i pkg -g
Then, in your Node.js project's package.json, include the following:
"pkg": {
"scripts": "build/**/*.js",
"assets": "views/**/*"
}
"main": "server.js"
In the main parameter, write the name of the file to be used as the entry point for the package.
After that run the below command in the terminal of the NodeJS project
pkg server.js --targets node12-linux-x64
Or you can remove the targets parameter from the command above to build the package for Windows, Linux, and Mac.
After the package has been generated, you have to make it executable:
chmod 777 ./server-linux
And then you can run it in your terminal by
./server-linux
This method will give you an executable file instead of a single .js file.
Check out the --node option, and the other more granular options it incorporates.
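For example (a sketch; the bundle name is assumed):
browserify --node main.js > myapp.js
node myapp.js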