Send CloudWatch logs to database - javascript

I am attempting to save our CloudWatch logs in an on-premises Postgres database. Currently I export the logs to S3 and save them in DynamoDB. My requirement now is to persist them in our DB, using Node and the AWS JS SDK. I'm not very strong with Node or the SDK, so I'd greatly appreciate any ideas.
I tried a simple implementation.
const pools = require('../src/common/db');
const AWS = require('aws-sdk');

// set the cwl
let cwl = new AWS.CloudWatchLogs({
  region: 'us-east-1',
  accessKeyId: 'ABCD1234',
  secretAccessKey: 'MNBV76543',
  Bucket: 'My_bucket'
});

// Get the events
cwl.getLogEvents({
  logGroupName: 'OurLogGroupname',
  logStreamName: 'specifiedLogstream'
}, (error, success) => {
  if (error) {
    console.log(error);
  }
  console.log(success);
});

// Try saving to db
let sql = '';
pools.query_db('abc', 'INSERT INTO logging.aws_logs(request_id, duration, billed_duration) VALUES (?,?,?)', function (err, res) {
  if (err) return callback(err);
  callback();
});

If you really want to store all messages from CloudWatch in a database, I would prefer the following approach:
Add a subscription to your Cloudwatch LogGroup
This subscription can be configured to trigger a Lambda
The Lambda will have the following logic:
extract the message from the event variable
prepare your SQL statement
connect to database (retry if not possible)
execute the SQL statement (retry if not possible)
done
A good example of how to extract the message from a CloudWatch Logs subscription invocation is the one for sending those logs to OpenSearch (search the blueprints).

Related

Why is my AWS Dynamo Lambda function returning null or timing out completely?

I'm using AWS DynamoDB to store a string associated with a user. Photo of my DynamoDB table.
In my Lambda function, I have the partition key (username) and I am trying to get the string associated with that username. It is returning null and not even getting to my console.log instruction.
// I don't think you need to see the top part, I'm showing it just in case
let dynamo = new AWS.DynamoDB({ apiVersion: '2012-08-10' });

exports.handler = async function (event) {
  let username = 'adminDebug'; // test parameter
  return await listSongs();
};

async function listSongs() {
  var params = {
    ExpressionAttributeValues: {
      ":v1": {
        S: "adminDebug"
      }
    },
    KeyConditionExpression: "username = :v1",
    TableName: "name_of_my_table"
  };
  console.log("about to launch"); // appears before the null return
  dynamo.query(params, (err, data) => {
    console.log("function has returned"); // does NOT fire off before query ends
    if (err) {
      console.log(err, err.stack);
    }
    else return data;
  });
}
I have also tried to do this with promises, like so:
// main snippet
return await listSongs();

async function listSongs() {
  return dynamo.query(params).promise();
}
and that will make it throw an error "errorMessage": "2022-08-21T16:24:17.5Z 9740-720-48a3e-5b0d0e Task timed out after 3.01 seconds"
Additional notes:
My lambda has the AmazonDynamoDBFullAccess role
My db does not have a sort-key. I just want to link a string to a primary-key.
I am using query() because getItem() requires me to have a sort-key.
I am using the AWS lambda web-app to code this.
Lambda is hooked up to an API gw but for testing, I am using the standard 'test' feature.
Thank you so much for your help. I wouldn't ask if I hadn't spent 3+ hours on this already.
A common reason for AWS Lambda functions to time out is that the function makes an outbound network request (e.g. to a website, or to an AWS service such as DynamoDB or S3) but has no network route to the internet or to an AWS service endpoint. The connection attempt then hangs until it fails, or until the Lambda service times the function out, which happens after 3 seconds by default.
This typically happens because the Lambda function was incorrectly configured to attach to a VPC.
If you don't need the Lambda function in a VPC, then don't configure it to attach to any VPC.
If you do need to attach to a VPC, then attach it to a private subnet (specifically not a public subnet) and ensure an outbound network path to the internet via a route table to a NAT device/gateway and IGW. Alternatively, if your connection is to an AWS service only then you can add a VPC Endpoint.
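Separately, once the networking is fixed, the promise-based variant is the one that can work from an async handler; the callback variant resolves to undefined before the query completes. A minimal sketch with the client passed in (so the assumption is only the SDK's standard `query(...).promise()` shape):

```javascript
// Sketch: await the promise form of query instead of passing a callback.
// The client is injected for testability; in the real Lambda it would be
// `new AWS.DynamoDB({ apiVersion: '2012-08-10' })`.
async function listSongs(dynamo, username) {
  const params = {
    ExpressionAttributeValues: { ':v1': { S: username } },
    KeyConditionExpression: 'username = :v1',
    TableName: 'name_of_my_table',
  };
  const data = await dynamo.query(params).promise();
  return data.Items; // the matching rows, not the raw response
}

// In the handler:
// exports.handler = async (event) => listSongs(dynamo, 'adminDebug');
```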

Amazon Cognito Registration Confirmation

So, I am using (or at least trying to use) Amazon Cognito with Lambda functions for auth.
Here's the flow: I send request, it goes to API Gateway, which directs it to a specific Lambda function.
I am using Node JS with amazon-cognito-identity-js library.
I am able to register user.
The thing is that Cognito sends an email with a confirmation code after registration. I am unable to create another Lambda (API endpoint) function for confirmation, since it requires a CognitoUser object (which you receive after registering or logging in). Here is the code from the AWS documentation:
cognitoUser.changePassword('oldPassword', 'newPassword', function (err, result) {
  if (err) {
    alert(err);
    return;
  }
  console.log('call result: ' + result);
});
So basically it's not designed for Lambda functions, since it requires saving state, namely the user object from the registration.
Am I getting it wrong? Is there a way?
Oh, ok, my bad.
Apparently you can create a CognitoUser object using only a username and the user pool:
const poolData = {
  UserPoolId: process.env.COGNITO_USER_POOL_ID,
  ClientId: process.env.COGNITO_CLIENT_ID
};
const userPool = new AmazonCognitoIdentity.CognitoUserPool(poolData);
...
const userData = {
  Username: email,
  Pool: userPool
};
const cognitoUser = new AmazonCognitoIdentity.CognitoUser(userData);
and then you can call
cognitoUser.confirmRegistration(confirmationCode, true, function (err, result) {
  if (err) {
    alert(err);
    return;
  }
  alert(result);
});
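Since Lambda handlers are usually async, the callback-style confirmRegistration is easier to use if you wrap it in a Promise. A small sketch (the wrapper name is my own; the CognitoUser is built per request from the pool and username as shown above):

```javascript
// Sketch: promisify confirmRegistration so it can be awaited from an
// async Lambda handler. `cognitoUser` is any object exposing the
// callback-style confirmRegistration(code, forceAliasCreation, cb) API.
function confirmUser(cognitoUser, confirmationCode) {
  return new Promise((resolve, reject) => {
    cognitoUser.confirmRegistration(confirmationCode, true, (err, result) => {
      if (err) return reject(err);
      resolve(result);
    });
  });
}

// Handler sketch:
// exports.handler = async (event) => {
//   const cognitoUser = new AmazonCognitoIdentity.CognitoUser(userData);
//   return confirmUser(cognitoUser, event.confirmationCode);
// };
```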

Testing a Postgres Database on Heroku

I am currently building an API with Swagger on Heroku, and I would like to verify that the endpoints create the correct entries in the Postgres database. However, when I try to connect to Heroku's Postgres from my testing environment, the connection is rejected. I think this is because in the continuous integration environment Heroku creates a sandbox and does not accept connections to the real DBs.
I tried to create a backup database as suggested here: https://devcenter.heroku.com/articles/heroku-postgres-backups
but I couldn't find the information needed to access it!
Any help is appreciated!
Thank you.
Here is my code test:
'use strict';

var should = require('should');
var request = require('supertest');
var assert = require('assert');
var server = require('../../../app');
var pg = require('pg');
var knex = require('knex')({
  client: 'postgresql',
  connection: process.env.DATABASE_URL
});

describe('controllers', function () {
  describe('testing cenas', function () {
    it('test1', function (done) {
      request(server)
        .get('/hello')
        .set('Accept', 'application/json')
        .expect('Content-Type', /json/)
        .expect(200)
        .end(function (err, res) {
          assert.equal(res.body, 'Hello,Hugo Pereira!');
          done();
        });
    });
  });
});
it gives an error:
Unhandled rejection Error: Unable to acquire a connection
at Client_PG.acquireConnection (/app/node_modules/knex/lib/client.js:332:40)
at Runner.ensureConnection (/app/node_modules/knex/lib/runner.js:233:24)
at Runner.run (/app/node_modules/knex/lib/runner.js:47:42)
at Builder.Target.then (/app/node_modules/knex/lib/interface.js:39:43)
at testPg (/app/api/controllers/hello_world.js:27:44)
at swaggerRouter (/app/node_modules/swagger-tools/middleware/swagger-router.js:407:20)
at swagger_router (/app/node_modules/swagger-node-runner/fittings/swagger_router.js:31:5)
at Runner.<anonymous> (/app/node_modules/bagpipes/lib/bagpipes.js:171:7)
at bound (domain.js:301:14)
at Runner.runBound (domain.js:314:12)
at Runner.pipeline (/app/node_modules/pipeworks/pipeworks.js:72:17)
at Runner.flow (/app/node_modules/pipeworks/pipeworks.js:223:19)
at Pipeworks.flow (/app/node_modules/pipeworks/pipeworks.js:135:17)
at Pipeworks.siphon (/app/node_modules/pipeworks/pipeworks.js:186:19)
at Runner.<anonymous> (/app/node_modules/bagpipes/lib/bagpipes.js:98:22)
at bound (domain.js:301:14)
0 passing (2s)
The heroku pg:psql command should connect to your database. You can also pass it parameters to connect to alternative/backup databases.
If you are using a Heroku Postgres URL, then one thing you need for external access is the query string ?ssl=true appended to the URL,
e.g.
"DATABASE_URL" : "postgres://user:pass@ec2-54-235-206-118.compute-1.amazonaws.com:5432/dbid?ssl=true"
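A small helper makes that rule explicit when the URL comes from an environment variable (the function name is my own, and it assumes the URL has no query string yet):

```javascript
// Sketch: append ssl=true to a Heroku-style DATABASE_URL unless a query
// string is already present (in which case the URL is returned unchanged).
function withSsl(databaseUrl) {
  if (databaseUrl.includes('?')) return databaseUrl;
  return databaseUrl + '?ssl=true';
}

// Usage with knex (untested sketch):
// var knex = require('knex')({
//   client: 'postgresql',
//   connection: withSsl(process.env.DATABASE_URL)
// });
```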
I managed to solve this problem using the Heroku in-dyno database (https://devcenter.heroku.com/articles/heroku-ci-in-dyno-databases). Once I had that database, I called an endpoint I made myself in my dev environment that returns a dump file of the schemas of my real DB. In the test environment I import this file, and voila, I have all the schemas and tables I need for my test environment.

How do I maintain multi-tenant database connections in nodejs

Let's say I have a service app.js. Whenever a new client connects to the service, we check whether a MongoDB connection has already been established for this client.
If it is not available, we fetch the server IP, database name, and collection name from a configuration file, connect to the DB, and reply to the user.
Note: we can add a new client and the corresponding info to Client Info at any time (dynamically).
Client Info: ClientId : ServerIp : Database Name : Collection Name
I have tried to store the Mongo objects in an array so I can reuse them based on the database name from the user's session data, but I keep running into a circular JSON error. How do I store multi-tenant database connections?
async.eachSeries(conf.clientDbs.clientsList, function (clientDetails, callback) {
  console.log(clientDetails);
  mongodb.MongoClient.connect(conf.clientDbs.connection + clientDetails.dbName, function (err, database) {
    if (err) {
      console.log(err);
      process.exit(1);
    }
    // Save database object from the callback for reuse.
    var tempdbobj = {};
    tempdbobj["obj"] = database;
    allDbs[clientDetails.team_id] = tempdbobj;
    console.log("Database connection ready for " + clientDetails.team_id);
    allDbs[clientDetails.team_id].obj.collection('collection_name').find({ "ref_id": "111" }, function (dberr, testDoc) {
      if (dberr) {
        console.log(dberr);
        callback();
      }
      else {
        console.log(testDoc);
        callback();
      }
    });
  });
});
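One way to keep the reuse logic separate from the driver is a small per-tenant cache; this is a sketch under the assumption that the circular JSON error comes from serializing the driver's client object (e.g. in a log or response), not from storing it. The connect function is injected so the cache itself has no mongodb dependency:

```javascript
// Sketch of a per-tenant connection cache keyed by team_id. Store the raw
// driver object in the map and never JSON.stringify it -- serializing the
// client is what typically produces the "circular JSON" error.
function makeDbCache(connect) {
  const cache = new Map(); // team_id -> promise of an open connection
  return async function getDb(clientDetails) {
    if (!cache.has(clientDetails.team_id)) {
      // Cache the promise itself so concurrent requests for the same
      // tenant share a single connect call instead of racing.
      cache.set(clientDetails.team_id, connect(clientDetails.dbName));
    }
    return cache.get(clientDetails.team_id);
  };
}

// In app.js you would pass the real driver:
// const getDb = makeDbCache((dbName) =>
//   mongodb.MongoClient.connect(conf.clientDbs.connection + dbName));
```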

How to connect MySQL with nodejs controllers?

I have a server on Sails (Node.js) and I am trying to connect my controllers to my MySQL DB through a wrapper file that creates the connection pool. My intent is to use that pool every time a function in any controller needs to interact with the DB, so that a connection is opened when the interaction starts and closed when it is over. For this, I have created a wrapper file db.js
db.js
var mysql = require('mysql');
var connection = mysql.createConnection({
  host: "localhost",
  port: '3306',
  user: "ye_old_username",
  password: "ye_old_password",
  database: "ye_old_schema"
});
module.exports = connection;
Now, I am creating a connection pool called ConnectionPool.js
ConnectionPool.js
var mysql = require('mysql'),
  config = require("./db");

/*
 * @sqlConnection
 * Creates the connection, makes the query and closes it to avoid concurrency conflicts.
 */
var sqlConnection = function sqlConnection(sql, values, next) {
  // It means that the values haven't been passed
  if (arguments.length === 2) {
    next = values;
    values = null;
  }
  var connection = mysql.createConnection(config);
  connection.connect(function (err) {
    if (err !== null) {
      console.log("[MYSQL] Error connecting to mysql:" + err + '\n');
    }
  });
  connection.query(sql, values, function (err) {
    connection.end();
    if (err) {
      throw err;
    }
    next.apply(this, arguments);
  });
};

module.exports = sqlConnection;
I have followed the method from the answer to this question to create the connection pool: How to provide a mysql database connection in single file in nodejs
And finally, I am trying to run a function from a controller using the wrapper and the connection pool. The code inside the Controller is
var connPool = require('./ConnectionPool');

module.exports = {
  testConn: function (req, res) {
    connPool('SELECT * from user where ?', { id: '1' }, function (err, rows) {
      if (err) {
        sails.log.debug(err);
      } else {
        console.log(rows);
      }
    });
  }
};
All the three files, the wrapper, the connection pool, and the controller are in the same Controllers folder.
Now, when I send a request through my client that invokes the testConn function inside the controller, I get the following response in the server log:
[MYSQL] Error connecting to mysql:Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'localhost' (using password: NO)
This error is coming from the line connection.connect(function(err) { in the connection pool file.
When I log in to my MySQL DB with the same credentials on the command line, it works. Therefore I believe the db.js file has some format-related issue that prevents a proper connection from being initiated. There can be other reasons as well, but this one seems the most likely to me.
I need some guidance on solving this issue. Any help will be appreciated.
I need some guidance on solving this issue. Any help will be appreciated.
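One possible reading of that error, offered as an assumption rather than a confirmed diagnosis: db.js exports a live Connection object, but ConnectionPool.js passes that export to mysql.createConnection() as if it were an options object, so the user and password come through empty ("Access denied for user ''@'localhost'"). A sketch of the fix would be to export the plain options instead (values are the question's placeholders, not real credentials):

```javascript
// db.js -- export the plain connection options, not a Connection object,
// so ConnectionPool.js can hand them to mysql.createConnection()/createPool().
const dbConfig = {
  host: 'localhost',
  port: 3306,
  user: 'ye_old_username',
  password: 'ye_old_password',
  database: 'ye_old_schema',
};
module.exports = dbConfig;

// ConnectionPool.js would then become (untested sketch):
// const mysql = require('mysql');
// const config = require('./db');
// const pool = mysql.createPool(config);
// module.exports = (sql, values, next) => pool.query(sql, values, next);
```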
