Redis connection is lost after multiple calls to function - javascript

The program I am writing is a status display screen for alarms, each of which is represented by a channel.
When the server is started (run on a Vagrant virtual machine), an Influx database is accessed, the data (comprising 1574 'channels') is processed and put into a Redis database. This runs fine, and the GUI is displayed with no issues when the webpage is refreshed, although it takes a long time to load (up to 20s), and nearly all of this time is spent in the method below.
However, after a few refreshes/moving around the site, it often crashes with the following error:
{ AbortError: Redis connection lost and command aborted. It might have been processed.
    at RedisClient.flush_and_error (/vagrant/node_modules/redis/index.js:362:23)
    at RedisClient.connection_gone (/vagrant/node_modules/redis/index.js:664:14)
    at RedisClient.on_error (/vagrant/node_modules/redis/index.js:410:10)
    at Socket.<anonymous> (/vagrant/node_modules/redis/index.js:279:14)
    at emitOne (events.js:116:13)
    at Socket.emit (events.js:211:7)
    at onwriteError (_stream_writable.js:417:12)
    at onwrite (_stream_writable.js:439:5)
    at _destroy (internal/streams/destroy.js:39:7)
    at Socket._destroy (net.js:568:3)
  code: 'UNCERTAIN_STATE',
  command: 'HGETALL',
  args: [ 'vista:hash:Result:44f59707-c873-11e8-93b9-7f551d0bdd1f' ],
  origin:
   { Error: Redis connection to 127.0.0.1:6379 failed - write EPIPE
       at WriteWrap.afterWrite (net.js:868:14)
     errno: 'EPIPE',
     code: 'EPIPE',
     syscall: 'write' } }
This error is displayed 1574 times (once for each channel), and occurs when the program reaches this function:
Result.getFormattedResults = async function (cycle) {
  const channels = await Channel.findAndLoad()
  const formattedResults = await mapAsyncParallel(channels, async channel => {
    const result = await this.findAndLoadByChannel(channel, cycle)
    const formattedResult = await result.format(channel)
    return formattedResult
  })
  return formattedResults
}
mapAsyncParallel() is as follows:
export const mapAsyncParallel = (arr, fn, thisArg) => {
  return Promise.all(arr.map(fn, thisArg))
}
findAndLoadByChannel() finds the channel and loads it with this line:
const resultModel = await this.load(resultId)
And format() takes the model and outputs the data in JSON format.
There are two 'fetch(...)' calls in the front end (both are needed and cannot be combined), and the problem rarely occurs when I comment out either one of them. This makes me think it could be a max-memory or max-connections problem (increasing maxmemory in the config file didn't help), or a problem with using so many promises (a concept I am fairly new to).
This has only started to occur as I have added more functionality, and I assume the function needs optimizing, but I have taken over this project from someone else and am still quite new to Node.js and Redis.
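If the parallel fan-out is the culprit, one way to test that theory is to cap how many Redis commands are in flight at once. A minimal sketch of a drop-in alternative to mapAsyncParallel(), assuming the same channels array and per-channel callback as above (the function name and chunk size are my own choices, not part of the project):
export const mapAsyncChunked = async (arr, fn, size = 50) => {
  const results = []
  for (let i = 0; i < arr.length; i += size) {
    // Only `size` promises (and so only `size` Redis commands) run at a time.
    const chunk = arr.slice(i, i + size)
    results.push(...await Promise.all(chunk.map(fn)))
  }
  return results
}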
Versions:
Vagrant: 2.0.1
Ubuntu: 16.04.5
Redis: 4.0.9
Node: 8.12.0
npm: 5.7.1

I've now moved all the 'getting' of the data (from Redis) to the server-side channels.controller file.
So, where before I would have:
renderPage: async (req, res) => {
  res.render('page')
},
I now have a method like:
renderPage: async (req, res) => {
  const data1 = await getData1()
  const data2 = await getData2()
  res.render('page', { data1, data2 })
},
(Don't worry, these aren't my actual variable names)
Where the two 'data' variables were previously retrieved using the 'fetch' method.
I export the data once it's loaded into redis, and import it in the controller file, where I have the getters to combine it all into one return array.
The pages now take milliseconds to refresh, and I haven't had any crashes.
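If the two getters are independent, they could also run concurrently instead of one after the other, a small variation on the controller above using the same placeholder names:
renderPage: async (req, res) => {
  // Fetch both datasets from Redis at the same time.
  const [data1, data2] = await Promise.all([getData1(), getData2()])
  res.render('page', { data1, data2 })
},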

Related

Why am I getting a timeout error when exporting a mariadb connection pool in Node.js?

EDIT
I found the error. The mistake was very obvious: I did not include require("dotenv").config(); in the connection.js file. Without this, the database connection simply fails after a timeout because it does not have any connection details.
I found an update log from the MariaDB Node.js connector team stating that in a few cases the connector does not provide sufficient error messages (it sometimes only offers a "timeout" without further information), so I changed what I was looking for, and found the mistake.
For anyone getting a similar error message, this can mean anything, so check all parts of your code!
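For reference, the fix described above is a one-line addition at the top of connection.js (a sketch; the rest of the file stays as posted below):
// load connection details from .env into process.env before the pool is created
require("dotenv").config()
// import mariadb library
const mariadb = require("mariadb")
// ...rest of connection.js unchanged...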
Original Post
I am trying to get familiar with Node.js and Express, but I ran into an issue that I can't seem to solve:
When creating a MariaDB database pool in a separate file and exporting the pool using module.exports, I am having trouble using the same pool in another file: I get a timeout error when trying to use the pool to query a database.
If I use the exact same code in the same file instead of two separate files, the query works perfectly, so I think there is something going wrong during module.exports = pool.
Am I missing something? Thanks in advance!
I have two files:
index.js:
// import express web framework
const express = require("express");
// create an express application
const app = express();
const pool = require('./database/connection')
const cors = require('cors');
// middleware
app.use(cors())
app.use(express.json())
const getData = async () => {
  const data = await pool.query("call stored_procedure")
  console.log(data)
}
getData()
app.listen(3001, () => {
  console.log('Server running on port 3001')
})
and connection.js:
// import mariadb library
const mariadb = require("mariadb")
// function that creates a mariadb connection pool for the database
const createPool = () => {
  try {
    return mariadb.createPool({
      connectionLimit: 10,
      host: process.env.MARIADB_HOST,
      user: process.env.MARIADB_USER,
      password: process.env.MARIADB_PASSWORD,
      database: process.env.MARIADB_DB,
      port: 3306
    })
  }
  catch (err) {
    console.error('Failed to connect to database: ')
    console.error(err)
  }
}
const pool = createPool()
// export database connection pool
module.exports = pool
Running this app results in the following error (after some time):
path_to_dir/node_modules/mariadb/lib/misc/errors.js:57
  return new SqlError(msg, sql, fatal, info, sqlState, errno, additionalStack, addHeader);
  ^
SqlError: (conn=-1, no: 45028, SQLState: HY000) retrieve connection from pool timeout after 10001ms
    (pool connections: active=0 idle=0 limit=10)
    at Object.module.exports.createError (path_to_dir/node_modules/mariadb/lib/misc/errors.js:57:10)
    at Pool._requestTimeoutHandler (path_to_dir/node_modules/mariadb/lib/pool.js:345:26)
    at listOnTimeout (node:internal/timers:557:17)
    at processTimers (node:internal/timers:500:7) {
  text: 'retrieve connection from pool timeout after 10001ms\n' +
    '    (pool connections: active=0 idle=0 limit=10)',
  sql: null,
  fatal: false,
  errno: 45028,
  sqlState: 'HY000',
  code: 'ER_GET_CONNECTION_TIMEOUT'
}

Node.js 'fs' throws an ENOENT error after adding auto-generated Swagger server code

Preamble
To start off, I'm not a developer; I'm just an analyst / product owner with time on their hands. While my team's actual developers have been busy finishing off projects before year-end, I've been attempting to put together a very basic API server in Node.js for something we will look at next year.
I used Swagger to build an API spec and then used the Swagger code generator to get a basic Node.js server. The full code is near the bottom of this question.
The Problem
I'm coming across an issue when writing out to a log file using the fs module. I know that the ENOENT error is usually down to just specifying a path incorrectly, but the behaviour doesn't occur when I comment out the Swagger portion of the automatically generated code. (I took the logging code directly out of another tool I built in Node.js, so I'm fairly confident in that portion at least...)
When executing npm start, a few debugging items write to the console:
"Node Server Starting......
Current Directory:/mnt/c/Users/USER/Repositories/PROJECT/api
Trying to log data now!
Mock mode: disabled
PostgreSQL Pool created successfully
Your server is listening on port 3100 (http://localhost:3100)
Swagger-ui is available on http://localhost:3100/docs"
but then fs throws an ENOENT error:
events.js:174
      throw er; // Unhandled 'error' event
      ^
Error: ENOENT: no such file or directory, open '../logs/logEvents2021-12-24.log'
Emitted 'error' event at:
    at lazyFs.open (internal/fs/streams.js:277:12)
    at FSReqWrap.args [as oncomplete] (fs.js:140:20)
Investigating
Now normally, from what I understand, this would just mean I've got the paths wrong. However, the file has actually been created, and the first line of the log file has been written just fine.
My next thought was that I must've set the fs flags incorrectly, but it was set to 'a' for append:
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', { flags: 'a' }, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
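As an aside (not the fix that was eventually needed here, but worth noting): if an ENOENT like this is caused by the logs folder not existing, creating the directory up front avoids it, since createWriteStream does not create intermediate folders. A small sketch:
const fs = require('fs');
const path = require('path');
const __logdir = path.join(__dirname, './logs');
// Create the logs directory (and any missing parents) before opening streams.
fs.mkdirSync(__logdir, { recursive: true });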
Removing Swagger Code
Now here's the weird bit: if I remove the Swagger code, the log files write out just fine and I don't get the fs exception!
This is the specific Swagger code:
// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();
// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
When I comment out this code, the log file writes out just fine.
The only thing I can think that might be happening is that somehow Swagger is modifying (?) the app's working directory so that fs no longer finds the same file?
Full Code
'use strict';
var path = require('path');
var fs = require('fs');
var http = require('http');
var oas3Tools = require('oas3-tools');
var serverPort = 3100;
// I specifically tried using path.join, which I found when investigating this issue, and referencing the app path, but to no avail
const __logdir = path.join(__dirname, './logs');
// These are date and time functions I use to add timestamps to the logs
function dateNow() {
  var dateNow = new Date().toISOString().slice(0, 10).toString();
  return dateNow
}
function rightNow() {
  var timeNow = new Date().toTimeString().slice(0, 8).toString();
  return "[" + timeNow + "] "
};
console.info("Node Server Starting......");
console.info("Current Directory: " + __dirname)
// Here I create the WriteStreams
var logsFile = fs.createWriteStream(__logdir + "/logEvents" + dateNow() + '.log', { flags: 'a' }, (err) => {
  console.error('Could not write new Log File to location: %s \nWith error description: %s', __logdir, err);
});
var errorsFile = fs.createWriteStream(__logdir + "/errorEvents" + dateNow() + '.log', { flags: 'a' }, (err) => {
  console.error('Could not write new Error Log File to location: %s \nWith error description: %s', __logdir, err);
});
// And create an additional console to write data out:
const Console = require('console').Console;
var logOut = new Console(logsFile, errorsFile);
console.info("Trying to log data now!") // Debugging logging
logOut.log("========== Server Startup Initiated ==========");
logOut.log(rightNow() + "Server Directory: " + __dirname);
logOut.log(rightNow() + "Logs directory: " + __logdir);
// Here is the Swagger portion that seems to create the behaviour.
// It is unedited from the Swagger Code-Gen tool
// swaggerRouter configuration
var options = {
  routing: {
    controllers: path.join(__dirname, './controllers')
  },
};
var expressAppConfig = oas3Tools.expressAppConfig(path.join(__dirname, '/api/openapi.yaml'), options);
var app = expressAppConfig.getApp();
// Initialize the Swagger middleware
http.createServer(app).listen(serverPort, function () {
  console.info('Your server is listening on port %d (http://localhost:%d)', serverPort, serverPort);
  console.info('Swagger-ui is available on http://localhost:%d/docs', serverPort);
}).on('error', console.error);
In case it helps, the question originally included a screenshot of the project's file structure. I am running this project within a WSL instance in VSCode on Windows, same as I have with other projects using fs.
Is anyone able to help me understand why fs can write the first log line but then break once the Swagger code gets going? Have I done something incredibly stupid?
Appreciate the help, thanks!
Found the problem with some help from a friend. The issue boiled down to a lack of understanding of how the Swagger module works in the background, so this will likely be eye-rollingly obvious to most, but I'm keeping this post around in case anyone else comes across this down the line.
So it seems that as part of the Swagger initialisation, any scripts within the utils folder will also be executed. I would not have picked up on this if it hadn't been pointed out to me that in the middle of the console output there was a reference to some PostgreSQL code, even though I had taken all reference to it out of the main index.js file.
That's when I realised that the error wasn't actually being generated from the code posted above: it was being thrown from code in that folder.
So I guess the answer is don't add stuff to the utils folder, but if you do, always add a bunch of console logging...
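One way to keep a script in the utils folder from running its side effects when Swagger merely requires it is Node's standard require.main check. A sketch (the file and function names are made up for illustration):
// utils/startupChecks.js (hypothetical file)
function runStartupChecks() {
  // ...side-effecting code, e.g. opening a PostgreSQL pool or a log stream...
}
// Only execute when this file is run directly (node utils/startupChecks.js),
// not when another module (such as the Swagger initialisation) requires it.
if (require.main === module) {
  runStartupChecks();
}
module.exports = { runStartupChecks };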

TypeError: key must be a string, a buffer or an object at typeError with GCP file exists

I'm simply trying to test for the existence of a file in our Google Cloud Platform (GCP) storage. I'm using GCP buckets on Express.js servers. Below is essentially a very simple example copied from https://googleapis.dev/nodejs/storage/latest/File.html#exists
EDIT: This is how I authenticate the GCP key:
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({
  projectId: 'my-cloud',
  keyFilename: process.env.GOOGLE_APPLICATION_CREDENTIALS,
});
const bucketName = 'my-ci';
(With small changes, I realise you are supposed to return data[0])
const bucket = storage.bucket(bucketName);
const file = bucket.file(path);
const exists = await file.exists().then(data => {
  return data
})
But when I try to run this, I get the error:
[nodemon] starting `node --inspect server/server.js`
Debugger listening on ws://127.0.0.1:9229/9a677766-4a93-4499-b57c-55f5f05096d7
For help, see: https://nodejs.org/en/docs/inspector
Server listening on port 4000!
/opt/node_modules/jwa/index.js:115
  return new TypeError(errMsg);
  ^
TypeError: key must be a string, a buffer or an object
    at typeError (/opt/node_modules/jwa/index.js:115:10)
    at checkIsPrivateKey (/opt/node_modules/jwa/index.js:61:9)
    at Object.sign (/opt/node_modules/jwa/index.js:147:5)
    at Object.jwsSign [as sign] (/opt/node_modules/jws/lib/sign-stream.js:32:24)
    at JWTAccess.getRequestHeaders (/opt/node_modules/google-auth-library/build/src/auth/jwtaccess.js:87:31)
    at JWT.getRequestMetadataAsync (/opt/node_modules/google-auth-library/build/src/auth/jwtclient.js:76:51)
    at JWT.getRequestHeaders (/opt/node_modules/google-auth-library/build/src/auth/oauth2client.js:238:37)
    at GoogleAuth.authorizeRequest (/opt/node_modules/google-auth-library/build/src/auth/googleauth.js:593:38)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
Since the error message did not give a useful traceback, I did some digging on my own. Through putting console.log statements everywhere, I narrowed it down to the line
const exists = await file.exists().then(data => {
  return data
})
and tried various approaches, from removing the .then(...) clause to removing the await (which does work, until the promise is resolved). None of these seemed to work.
What may be potential causes of this?
Eventually figured it out - it was due to the GCP key being an older version of the key. If you get an error like the one above, check that your key is correct.
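For reference, once the key is valid, the exists check from the docs linked above boils down to destructuring the first element of the resolved array (a minimal sketch; the object path is hypothetical):
// file.exists() resolves to an array whose first element is the boolean.
const [exists] = await storage
  .bucket(bucketName)
  .file('path/to/object.txt') // hypothetical object path
  .exists();
console.log(exists ? 'File exists' : 'File not found');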

Google cloud functions error on redis createClient when using in async mode

I have a few Google Cloud Functions which make use of the Redis memory store, and every time any function is deployed I get this error: Redis connection to :6379 failed - read ECONNRESET at TCP.onread. Previously I shared the createClient() code with all of the functions by creating a separate util file and including it in the CFs; I thought that was the issue. But please note that the Redis cache is working as expected apart from this error.
Then I tried putting the util code inside each of the Google Cloud Functions that use the Redis client to create the client. But I'm still getting this error from every cloud function whenever I deploy any cloud function, even when deploying functions that do not use Redis.
Here's how I create a client :
const bluebird = require('bluebird');
const redis = bluebird.promisifyAll(require('redis'));
const cache = redis.createClient({ port: REDIS_PORT, host: REDIS_HOST });
cache.on("error", (err) => {
  console.log("API One - Redis cache error : " + err);
});
const list = async (data) => {
  // Do something with data.
  let cachedData;
  if (cache.connected) {
    cachedData = await cache.hgetAsync(key); // Get cached data.
  }
  // Do something with cachedData if available.
  if (cache.connected) {
    await cache.hsetAsync(key, data); // Set some data.
  }
  return data;
}
module.exports = functions.https.onCall(list);
Why am I seeing this error in every cloud function's logs?
Sample error logs I get:
API One - Redis cache error : Error: Redis connection to <Ip Address>:6379 failed - read ECONNRESET
API Two - Redis cache error : Error: Redis connection to <Ip Address>:6379 failed - read ECONNRESET
Have you tried closing the Redis connection before the function finishes?
The redis module may have background callbacks active during the life of the client; not closing the connection before the function returns may cause the connection to time out when the cloud function terminates. Make sure that all asynchronous operations finish before the function terminates.
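For example, a minimal sketch (my own reconstruction, assuming the bluebird-promisified client from the question, where promisifyAll exposes quit as quitAsync):
const list = async (data) => {
  let cachedData;
  if (cache.connected) {
    cachedData = await cache.hgetAsync(key);
  }
  // ...work with cachedData / data as before...
  // Close the connection before returning so no background socket
  // activity outlives the invocation.
  await cache.quitAsync();
  return data;
}
Note that if the client is created at module scope, quitting it means a later invocation would need to create a fresh client first.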
Let me know if this works for you.

How to use credentials to work with nodegit.push on Windows

Edit: I'm changing the question to suit my current understanding of the problem, which has changed significantly.
Original Title: Nodegit seems to be asking for wrong credentials on push
When trying to push using nodegit, nothing seems to work on Windows (while it works fine on Linux).
Using SSH
sshKeyFromAgent - error authenticating: failed connecting agent
sshKeyNew - the credentials callback is called repeatedly (looks like an infinite loop, but I can't be sure)
sshKeyMemoryNew - the credentials callback is called twice, and then node exits with no diagnostic (the exit and beforeExit events on process aren't signalled)
Using HTTPS
userpassPlaintextNew - [Error: unknown certificate check failure] errno: -17
Original question follows.
I'm trying to get nodegit to push, and the following question seems to address this situation. However, I'm not able to get it to work.
I've cloned a repository using SSH and when I try to push, my credentials callback is being called with user git and not motti (which is the actual git user).
try {
  const remote = await repository.getRemote("origin");
  await remote.push(["refs/head/master:refs/heads/master"], {
    callbacks: {
      credentials: (url, user) => {
        console.log(`Push asked for credentials for '${user}' on ${url}`);
        return git.Cred.sshKeyFromAgent(user);
      }
    }
  });
}
catch (err) {
  console.log("Error:", err);
}
I get the following output:
Push asked for credentials for 'git' on git@github.[redacted].net:motti/tmp.git
Error: { Error: error authenticating: failed connecting agent errno: -1, errorFunction: 'Remote.push' }
If I try to hardcode motti to the sshKeyFromAgent function the error changes to:
Error: { Error: username does not match previous request errno: -1, errorFunction: 'Remote.push' }
This is my first time trying to use git programmatically, so I may be missing something basic...
Answer for some questions from comments:
I'm running on windows 10
node v8.9.4
git version 2.15.0.windows.1
nodegit version 0.24.1
the user running node is my primary user, which works correctly when I use git from the command line
Instead of using git.Cred.sshKeyFromAgent, you could use git.Cred.sshKeyMemoryNew and pass your username / keys along.
const fs = require('fs');
// ...
const username = "git";
const publickey = fs.readFileSync("PATH TO PUBLIC KEY").toString();
const privatekey = fs.readFileSync("PATH TO PRIVATE KEY").toString();
const passphrase = "YOUR PASSPHRASE IF THE KEY HAS ONE";
const cred = await Git.Cred.sshKeyMemoryNew(username, publickey, privatekey, passphrase);
const remote = await repository.getRemote("origin");
// note: the source ref should be "refs/heads/..."; the question's "refs/head/..." looks like a typo
await remote.push(["refs/heads/master:refs/heads/master"], {
  callbacks: {
    credentials: (url, user) => cred
  }
});
You need to run an ssh agent locally and save your password there. Follow these steps to make it work:
Enable the ssh agent locally (automatically runs on OS X): https://code.visualstudio.com/docs/remote/troubleshooting#_setting-up-the-ssh-agent
Run 'ssh-add' in the same CLI as you're running your nodegit actions and enter your passphrase
I hope this helps because I also struggled a lot with it and it can be very frustrating.
