convert nodejs function to AWS Lambda compatible - javascript

I am pretty new to both Node.js and AWS Lambda. I have written a small Node.js function that works fine locally. Now I need to run it on AWS Lambda, but it looks like Lambda has a handler requirement that I don't fully understand.
Below is the Node.js function I need to run on Lambda. What changes do I need to make to execute it on AWS? Thanks
(async function () {
  const DOMAIN = "abc.xyz.com";
  const KEY = "***";
  const mailchimpClient = require("@mailchimp/mailchimp_transactional")(KEY);

  const run = async () => {
    const response = await mailchimpClient.senders.addDomain({
      domain: DOMAIN,
    });
    console.log(response);
  };

  run();
})();

Basically you just need to export a function with a specific name: handler
exports.handler = async function(event, context) {
  console.log("EVENT: \n" + JSON.stringify(event, null, 2));
  return "foo.bar";
};
In this handler, you just need to return something to mark the invocation as a success, or throw an error to mark it as a failure.
In your case, this should work:
const DOMAIN = "abc.xyz.com";
const KEY = "***";
const mailchimpClient = require("@mailchimp/mailchimp_transactional")(KEY);

exports.handler = async function(event, context) {
  const response = await mailchimpClient.senders.addDomain({
    domain: DOMAIN,
  });
  console.log(response);
  return "success";
};
Here are more examples and advanced configurations:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-samples.html
https://github.com/awsdocs/aws-lambda-developer-guide/blob/main/sample-apps/blank-nodejs/function/index.js
https://github.com/awsdocs/aws-lambda-developer-guide/blob/main/sample-apps/nodejs-apig/function/index.js

It depends on whether you are using a framework to create your AWS serverless code.
Without one, your code would usually look something like this:
exports.handler = function(event, context) {
  console.log('Lambda A Received event:', JSON.stringify(event, null, 2));
  // context.succeed() is the legacy (pre-async) way to signal success
  context.succeed('Hello ' + event.name);
};
If you want an easier way to work with AWS serverless code such as Lambdas, look at arc.codes.
Also, here is a link to the AWS docs https://docs.aws.amazon.com/lambda/latest/dg/nodejs-handler.html

Nestjs Interceptor not being invoked

I am using NestJS with a serverless app (deployed to AWS Lambda). I now need to use middleware, or Interceptors as they are called in Nest, but I'm struggling to get them to work. I have changed from using NestFactory.createApplicationContext to NestFactory.create because, per the docs, that is what wraps Controller methods with enhancers, e.g. Interceptors.
I am registering the Interceptor in a module, so it should be globally available
const loggingInterceptorProvider = {
  provide: APP_INTERCEPTOR,
  useClass: LoggingInterceptor,
};
My bootstrap looks like so
export async function bootstrap(Module: any) {
  if (app) return app;
  app = await NestFactory.createApplicationContext(Module);
  return await app.init();
}
Now the non-standard bit: because I am using a generic "builder" (library code), the builder is passed the controller name as a string, which is then invoked like this:
// the Module is accessible in the bootstrap via a closure, not shown in this code
const app = await bootstrap();
const appController = app.get(Controller);
// functionName is a string
const controllerFunction = appController[functionName];
const boundControllerFunction = controllerFunction.bind(appController);
const result = await boundControllerFunction(body);
I am not seeing any of my Interceptor logging output. Am I doing something wrong? Or is it the way I am invoking the Controller that is not working with Interceptors?
EDIT:
For completeness, this is the correct bootstrap function I use
let cachedApp: INestApplication;

export async function bootstrap(Module: any) {
  if (cachedApp) return cachedApp;
  cachedApp = await NestFactory.create(Module, {
    bufferLogs: true,
    logger: ['error', 'warn'],
  });
  await cachedApp.init();
  return cachedApp;
}
It happens because you've called the controller method directly, bypassing the NestJS request lifecycle. When the NestJS server handles a request, it applies its internal mechanisms for running interceptors, validation pipes, and exception filters. If you call a class method directly, none of that machinery is used.
In your case you can follow this section of nestjs documentation:
https://docs.nestjs.com/faq/serverless#example-integration
import { NestFactory } from '@nestjs/core';
import serverlessExpress from '@vendia/serverless-express';
import { Callback, Context, Handler } from 'aws-lambda';
import { AppModule } from './app.module';

let server: Handler;

async function bootstrap(): Promise<Handler> {
  const app = await NestFactory.create(AppModule);
  await app.init();

  const expressApp = app.getHttpAdapter().getInstance();
  return serverlessExpress({ app: expressApp });
}

export const handler: Handler = async (
  event: any,
  context: Context,
  callback: Callback,
) => {
  server = server ?? (await bootstrap());
  return server(event, context, callback);
};
The "standalone application feature" from the docs is useful if you want to call some service code, not a controller.
By the way, note the server variable in the snippet: it is intentionally declared outside the handler function, because in AWS Lambda anything at module scope can be cached between requests on a warm instance.
I found a way to do it, using the very poorly documented ExternalContextCreator feature. The last code snippet I posted above becomes this:

import { ExternalContextCreator } from '@nestjs/core/helpers/external-context-creator';

// the Module is accessible in the bootstrap via a closure, not shown in this code
const app = await bootstrap();
const appController = app.get(Controller);
// functionName is a string
const controllerFunction = appController[functionName];
const extContextCreator = app.get(ExternalContextCreator);
const boundControllerFunction = extContextCreator.create(
  appController,
  controllerFunction,
  String(functionName),
);
const result = await boundControllerFunction(body);

How to fix Cloud Function error admin.database.ref is not a function at exports

I'm currently trying to modify my Cloud Function and move it under https.onRequest so that I can use it to schedule a cron job. However, I'm getting the following error in the logs:
TypeError: admin.database.ref is not a function
    at exports.scheduleSendNotificationMessageJob.functions.https.onRequest (/user_code/index.js:30:20)
    at cloudFunction (/user_code/node_modules/firebase-functions/lib/providers/https.js:57:9)
exports.scheduleSendNotificationMessageJob = functions.https.onRequest((req, res) => {
  admin.database.ref('/notifications/{studentId}/notifications/{notificationCode}')
    .onCreate((dataSnapshot, context) => {
      const dbPath = '/notifications/' + context.params.pHumanId + '/fcmCode';
      const promise = admin.database().ref(dbPath).once('value').then(function(tokenSnapshot) {
        const theToken = tokenSnapshot.val();
        res.status(200).send(theToken);
        const notificationCode = context.params.pNotificationCode;
        const messageData = { notificationCode: notificationCode };
        const theMessage = {
          data: messageData,
          notification: { title: 'You have a new job reminder' }
        };
        const options = {
          contentAvailable: true,
          collapseKey: notificationCode
        };
        const notificationPath = '/notifications/' + context.params.pHumanId + '/notifications/' + notificationCode;
        admin.database().ref(notificationPath).remove();
        return admin.messaging().sendToDevice(theToken, theMessage, options);
      });
      return null;
    });
});
You cannot use the definition of an onCreate() Realtime Database trigger within the definition of an HTTP Cloud Function.
If you switch to an HTTP Cloud Function "so that (you) can use it to schedule a cron job", the trigger becomes the call to the HTTP Cloud Function itself. In other words, you will no longer be able to trigger an action (or the Cloud Function) when new data is created in the Realtime Database.
What you can very well do is read data from the Realtime Database, for example as follows (simplified scenario of sending a notification):
exports.scheduleSendNotificationMessageJob = functions.https.onRequest((req, res) => {
  // get the desired values from the request
  const studentId = req.body.studentId;
  const notificationCode = req.body.notificationCode;
  // Read data with the once() method (note: admin.database(), not admin.database)
  admin.database().ref('/notifications/' + studentId + '/notifications/' + notificationCode)
    .once('value')
    .then(snapshot => {
      // Here just an example of how you would get the desired values
      // for your notification
      const theToken = snapshot.val();
      const theMessage = ....
      //......
      // return the promise returned by the sendToDevice() asynchronous task
      return admin.messaging().sendToDevice(theToken, theMessage, options)
    })
    .then(() => {
      // And then send back the result (see video referred to below)
      res.send("{ result : 'message sent'}");
    })
    .catch(err => {
      //........
    });
});
You may watch the following official Firebase video about HTTP Cloud Functions: https://www.youtube.com/watch?v=7IkUgCLr5oA&t=1s&list=PLl-K7zZEsYLkPZHe41m4jfAxUi0JjLgSM&index=3. It shows how to read data from Firestore, but the concept of reading and sending back the response (or an error) is the same for the Realtime Database. Together with the two other videos of the series (https://firebase.google.com/docs/functions/video-series/?authuser=0), it also explains how important it is to correctly chain promises and to indicate to the platform when the work of the Cloud Function is finished.
For me, this error happened when writing admin.database instead of admin.database().
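The distinction is easy to see with a stripped-down mock of the firebase-admin shape (this is not the real SDK, just its rough structure): admin.database is a function that returns the Database service, so .ref only exists on its return value.

```javascript
// Rough mock of the firebase-admin shape (not the real SDK)
const admin = {
  database: () => ({ ref: (path) => ({ path }) }),
};

console.log(typeof admin.database.ref);       // "undefined": .ref is not a property of the function
console.log(admin.database().ref('/x').path); // "/x": .ref lives on the returned service object
```

So admin.database.ref(...) throws "is not a function", while admin.database().ref(...) works.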

Graphql Yoga Playground with Lambda - "Server cannot be reached"

I'm in the process of setting up a GraphQL endpoint with Serverless/Lambda and am receiving an error when trying to connect to the GraphQL playground that comes with graphql-yoga. When I go to the route that serves the playground (/playground), the playground interface launches, but it just says:
Server cannot be reached
in the top right of the playground. It's worth noting I'm using the makeRemoteExecutableSchema utility to proxy to another GraphQL endpoint (my CMS, Prismic). I don't believe this is the issue, as I have successfully connected to it with the playground when testing on a normal Express server.
Here is the code in my handler.js
'use strict';

const { makeRemoteExecutableSchema } = require('graphql-tools');
const { PrismicLink } = require("apollo-link-prismic");
const { introspectSchema } = require('graphql-tools');
const { ACCESS_TOKEN, CMS_URL } = process.env;
const { GraphQLServerLambda } = require('graphql-yoga');

const lambda = async () => {
  const link = PrismicLink({
    uri: CMS_URL,
    accessToken: ACCESS_TOKEN
  });
  const schema = await introspectSchema(link);
  const executableSchema = makeRemoteExecutableSchema({
    schema,
    link,
  });
  return new GraphQLServerLambda({
    schema: executableSchema,
    context: req => ({ ...req })
  });
};

exports.playground = async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const graphQl = await lambda();
  return graphQl.playgroundHandler(event, context, callback);
};
I have followed this guide to get it running up to this point, and I'm fairly sure I've followed similar steps for what applies to my case, but I can't figure out where I've gone wrong.
Thanks,
Could you check which version of the graphql-yoga package you are using?
I had a similar problem using Apollo Server in combination with the Kentico Cloud headless CMS, and I found this issue:
https://github.com/prisma/graphql-yoga/issues/267

Decrypt multiple env. variables nodejs - AWS Lambda

I'm having difficulty decrypting multiple environment variables in nodejs for an AWS lambda. I've looked at the code sample supplied in the console and the following two related questions:
Question 1,
Question 2
I have been able to successfully decrypt a single environment variable using their code sample; however, when I try to apply a cleaner approach through the use of promises (the methods outlined in the questions above), I get this error when testing the Lambda function in the console:
TypeError: First argument must be a string, Buffer, ArrayBuffer,
Array, or array-like object.
I was wondering if anyone has had this issue before and how I could go about resolving it?
Edit:
I've added some samples from my code below
const AWS = require('aws-sdk');
const mysql = require('mysql');

let connection;

const encrypted = {
  username: process.env.username,
  password: process.env.password,
  database: process.env.database,
  host: process.env.host
};

let decrypted = {};
const encryptedEnvVars = [process.env.username, process.env.password, process.env.database, process.env.host];

exports.handler = (event, context, callback) => {
  if (isEnvVarsDecrypted()) {
    processEvent(event, context);
  } else {
    Promise.all(encryptedEnvVars.map(decryptKMS))
      .then(decryptEnvVars)
      .catch(console.log);
  }
};

function decryptKMS(key) {
  return new Promise((resolve, reject) => {
    const kms = new AWS.KMS();
    kms.decrypt({ CiphertextBlob: new Buffer(key, 'base64') }, (err, data) => {
      if (err) { reject(err); }
      else { resolve(data.Plaintext.toString('ascii')); }
    });
  });
}

var decryptEnvVars = data => {
  return new Promise((resolve, reject) => {
    console.log(data);
    decrypted.username = data[0].Plaintext.toString('ascii');
    decrypted.password = data[1].Plaintext.toString('ascii');
    decrypted.database = data[2].Plaintext.toString('ascii');
    decrypted.host = data[3].Plaintext.toString('ascii');
    resolve();
  });
};

var isEnvVarsDecrypted = () => {
  return decrypted.username && decrypted.password && decrypted.database && decrypted.host;
}
If key is null, then new Buffer(key, 'base64') will fail with the error you describe.
When I ran your code myself:
If any environment variable was missing, the error occurred
When all environment variables were declared, the error ceased
So, you should confirm that the environment variables you reference are actually defined.
A couple of other pointers:
Make sure you are always calling the lambda callback, regardless of success/failure; this is how you signal to the lambda environment that execution has ended.
After calling decryptEnvVars, you should call your processEvent function
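One more subtlety worth a sketch: decryptKMS already resolves plain strings, so decryptEnvVars receives an array of strings, not KMS responses, and indexing .Plaintext on them would fail. A corrected, locally runnable version with the KMS call stubbed out (base64-decoding stands in for real decryption here):

```javascript
// Stub for decryptKMS: base64-decode stands in for the real AWS.KMS decrypt call
const decryptKMS = (key) =>
  Promise.resolve(Buffer.from(key, 'base64').toString('ascii'));

const names = ['username', 'password', 'database', 'host'];
const decrypted = {};

function decryptEnvVars(values) {
  // values is already an array of decrypted strings, in the original order
  names.forEach((name, i) => { decrypted[name] = values[i]; });
  return decrypted;
}

// base64 of "user", "pass", "db", "host" (made-up sample values)
const encryptedEnvVars = ['dXNlcg==', 'cGFzcw==', 'ZGI=', 'aG9zdA=='];
Promise.all(encryptedEnvVars.map(decryptKMS)).then(decryptEnvVars);
```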

How should I handle a db connection in Javascript / AWS Lambda

In my JS Lambda function I have something along the lines of the following...

// index.js
import utils from './utils';

export const handler = () => {
  return utils.initDB()
    .then(function() {
      return utils.doSomething();
    });
};

utils.js:

var dbConfig = null;
var knex = null;

function initDB() {
  dbConfig = require('../db');
  knex = require('knex')(dbConfig);
}

Basically, how should I pass around the knex object? Is it okay to have it as a global variable in the utils file? Should I return it to the handler and pass it into every utils.doX call? I think this might be causing problems with db connection/pooling, but I don't know how to find out.
For anyone who stumbles on this in the future (i.e. me when I'm googling how to do this again in a year):
http://blog.rowanudell.com/database-connections-in-lambda/ explains connection reuse in Lambda. It should look something like this:

const pg = require('pg');
const client = new pg.Client('postgres://myrds:5432/dbname');
client.connect();

exports.handler = (event, context, cb) => {
  client.query('SELECT * FROM users', (err, users) => {
    // Do stuff with users
    cb(null); // Finish the function cleanly
  });
};
