"mongoError: Topology was destroyed" when trying to delete/update a document - javascript

I am trying to make a Discord bot in Node.js that uses MongoDB for its database. When I try to delete or update a document, it sometimes returns `MongoError: Topology was destroyed`. I have read up on this error, and it indicates that the connection was interrupted.
Here is the code for my Database Handler:
class DatabaseHandler {
  constructor(client) {
    this.client = client;
  }

  async connect(callback) {
    try {
      await this.client.connect();
      await callback(this.client);
    } catch (err) {
      console.error(err);
    } finally {
      this.client.close();
      console.log("CLIENT CLOSED");
    }
  }
}

module.exports = DatabaseHandler;
Here is the place that the error occurs:
DB.connect(async (client) => {
  console.log(ObjectId(this._id));
  let DBList = await client.db("Giveaways").collection("giveawayData");
  let delVal = {
    _id: ObjectId(this._id)
  };
  await DBList.deleteOne(delVal); // error occurs here
})
I do not think it is caused by `this.client.close()`, because that executes after all of the operations have finished.
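A common cause of `Topology was destroyed` is an operation running against a client that has already been closed, for example when two overlapping `DB.connect` calls share one client and the first one's `finally` closes it mid-flight. A minimal, runnable sketch of the alternative pattern (connect once, share the promise, close only at shutdown); `fakeClient` is a stand-in I've made up so this runs without a real database:

```javascript
// Sketch: share one connection for the life of the process instead of
// closing it after every operation. `fakeClient` simulates a MongoClient.
const fakeClient = {
  connected: false,
  async connect() { this.connected = true; return this; },
  async close() { this.connected = false; },
};

let connectPromise = null;

// Every caller shares the same pending/resolved connect() promise,
// so concurrent operations never race against a close().
async function getClient() {
  if (!connectPromise) connectPromise = fakeClient.connect();
  return connectPromise;
}

async function withCollection(work) {
  const client = await getClient();
  return work(client); // note: no close() here
}

// Close once, on process shutdown only.
async function shutdown() {
  if (connectPromise) await (await connectPromise).close();
}
```

With this shape, concurrent operations can never see a half-closed topology, because nothing closes the client while work is in flight.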

Related

MongoDb Mongoose GridFs Problem Connecting After close

If I turn my backend on and call the getImage API first, it works fine. But if I call any other API before it, I get this error: uncaughtException: MongoNotConnectedError: Client must be connected before running operations.
I suspect this is because I'm closing the connection after each API call in the finally {} block. But I call dbConnect() at the start of the getImage function, so why is it not working properly?
It used to work before I started closing the Mongoose connection after each API call.
How can I make it work and still close the connection?
This is the dbConnect function:
export async function dbConnect(): Promise<mongoose.Connection> {
  if (cached.conn) {
    return cached.conn
  }
  if (!cached.promise) {
    const opts = {
      bufferCommands: false,
      useNewUrlParser: true
    }
    cached.promise = mongoose.connect(MONGODB_URI, opts).then((mongoose) => {
      return mongoose
    })
  }
  cached.conn = await cached.promise
  return cached.conn
}
This is the close function I'm calling at the end of each API:
export async function closeConnection() {
  mongoose.connection.close();
}
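One likely culprit (an assumption, since the full caching code isn't shown): `closeConnection()` closes the socket but leaves `cached.conn` and `cached.promise` pointing at the now-dead connection, so the next `dbConnect()` returns a closed connection instead of reconnecting. A runnable sketch of clearing the cache on close, with `fakeMongoose` standing in for the real mongoose module:

```javascript
// Sketch of a connection cache that is reset on close, so a later
// dbConnect() reconnects instead of returning a dead connection.
const fakeMongoose = {
  async connect() {
    return { readyState: 1, close: async function () { this.readyState = 0; } };
  },
};

const cached = { conn: null, promise: null };

async function dbConnect() {
  if (cached.conn) return cached.conn;
  if (!cached.promise) cached.promise = fakeMongoose.connect();
  cached.conn = await cached.promise;
  return cached.conn;
}

async function closeConnection() {
  if (cached.conn) await cached.conn.close();
  cached.conn = null;    // without these two lines, the next dbConnect()
  cached.promise = null; // hands back the already-closed connection
}
```

In a long-running server you would usually skip the per-request close entirely and let the pooled connection live for the process lifetime.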
And lastly, this is the API I'm using to get an image from GridFS:
//This is the GetImage Function
//This is the GetImage function
apiRoute.get(async (req: NextApiRequest, res: NextApiResponse) => {
  try {
    await dbConnect();
    if (!req.query.id) {
      return res.status(400).json({ success: false, msg: "Bad Request" })
    }
    let gfs = new mongoose.mongo.GridFSBucket(mongoose.connection.db, { bucketName: "files" })
    const readStream = gfs.openDownloadStream(new mongoose.mongo.ObjectId(String(req.query.id)));
    readStream.pipe(res)
  } catch (err) {
    return res.status(500).json({ success: false })
  }
});

Timeout acquiring a connection when streaming results using Express

We use the following code to stream the results of a query back to the client:
app.get('/events', (req, res) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_user: 'foo' })
      .stream()
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
While the code seems to have an excellent memory usage profile (stable/low memory usage) it creates random DB connection acquisition timeouts:
Knex: Timeout acquiring a connection. The pool is probably full. Are
you missing a .transacting(trx) call?
This happens in production at seemingly random intervals. Any idea why?
This happens because aborted requests (i.e client closes the browser mid-request) don't release the connection back to the pool.
First, ensure you're on the latest knex, or at least v0.21.3+, which introduced fixes to stream/pool handling.
From there on, you have a couple of options:
Either use stream.pipeline instead of stream.pipe, which handles aborted requests correctly, like so:
const { pipeline } = require('stream')

app.get('/events', (req, res) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    return pipeline(stream, JSONStream.stringify(), res, err => {
      if (err) {
        return console.log(`Pipeline failed with err:`, err)
      }
      console.log(`Pipeline ended successfully`)
    })
  } catch (err) {
    next(err)
  }
})
or listen to the close event on req and destroy the DB stream yourself, like so:
app.get('/events', (req, res) => {
  try {
    const stream = db('events')
      .select('*')
      .where({ id_session: req.query.id_session })
      .stream()
    // Not listening to this event will crash the process if
    // stream.destroy(err) is called.
    stream.on('error', () => {
      console.log('Stream was destroyed')
    })
    req.on('close', () => {
      // stream.end() does not seem to work, only destroy()
      stream.destroy(new Error('Aborted request'))
    })
    stream.pipe(JSONStream.stringify()).pipe(res)
  } catch (err) {
    next(err)
  }
})
Useful reading:
knex Wiki: Manually close streams. Careful, the stream.end mentioned here doesn't seem to work.
knex Issue: stream.end() does not return connection to pool

My Node Script Hangs after functions are finished

I'm calling three functions; after they complete I want my script to close on its own, but it just hangs.
I've tried making the functions async/promise-based, closing the database after each MongoDB function, and calling process.exit() in a callback from the last function.
Connecting to the (local - not Atlas) Database:
MongoClient.connect(local, { useNewUrlParser: true, useUnifiedTopology: true }, function(err, db) {
  if (err) {
    console.log(err)
  } else {
    console.log('Connected to MongoDB...')
    // Read in data from JSON files and store each file's contents into the database:
    // this is where the functions are being called, within a successful connect to MongoDB
    insertJSON(db, jsonfiles, 'requests', jsonfilesSource)
    insertJSON(db, issuedfiles, 'issuedLicenses', isssuedfilesSource)
    insertLicenses(db)
  }
  db.close()
})
Function 1:
function insertJSON(db, dirBuf, collection, sourceFolder) {
  var database = db.db('license-server')
  var collection = database.collection(collection)
  fs.readdir(dirBuf, function(err, files) {
    if (err) {
      console.log(err.message)
    } else {
      files.forEach(function(filename) {
        var text = fs.readFileSync(sourceFolder + filename);
        var filecontents = JSON.parse(text)
        //collection.insertOne(filecontents)
        collection.findOne({ "DisplayTitle": filecontents.DisplayTitle, "NodeInformation": filecontents.NodeInformation, "Date": filecontents.Date })
          .then(function(result) {
            if (result) {
              console.log(`An item could already be in the database: a file is unique if its display title, node information, and date are different.
The item's display title is ${result.DisplayTitle}`)
              return
            } else {
              collection.insertOne(filecontents)
              console.log(`Added ${filecontents.DisplayTitle} to database`)
            }
          })
          .catch(function(error) {
            console.log(error)
          })
      })
    }
  })
}
Function 2:
function insertLicenses(db) {
  // Set up GridFS to import .lic and .licx files into the database
  var database = db.db('license-server')
  var collection = database.collection('fs.files')
  var bucket = new mongodb.GridFSBucket(database)
  var dirBuf = Buffer.from('../license-server/private/licenses')
  fs.readdir(dirBuf, function(err, files) {
    if (err) {
      console.log(err.message)
    } else {
      files.forEach(function(filename) {
        collection.findOne({ "filename": filename })
          .then(function(result) {
            if (result) {
              console.log(`The file $(unknown) is already in the database`)
              return
            } else {
              fs.createReadStream('./private/licenses/' + filename)
                .pipe(bucket.openUploadStream(filename))
                .on('error', function(error) {
                  assert.ifError(error)
                })
                .on('finish', function() {
                  console.log(`Uploaded $(unknown)`)
                })
            }
          })
      })
    }
  })
  // I tried calling db.close() here since this is the last function to be called. No luck.
}
I'm guessing it has something to do with the MongoDB functions having their own way of closing themselves, but I couldn't find what I was looking for in previous attempts to resolve this issue.
The expected result is the script closing itself; the actual result is a hanging script.
All of these database calls are asynchronous -- the net effect of this code is to call db.close immediately and only then do the work in insertJSON and insertLicenses. If you rewrote this to use async/await (and you'd need to update the other functions as well), the db.close call would run last, close the db, and allow the script to exit:
await insertJSON(db, jsonfiles, 'requests', jsonfilesSource)
await insertJSON(db, issuedfiles, 'issuedLicenses', isssuedfilesSource)
await insertLicenses(db)
db.close()
https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Asynchronous/Introducing
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
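The ordering problem above can be shown without any database at all. In this runnable toy (names like `insertJSON` are borrowed from the question for illustration; the "DB work" is simulated with a timer), the fire-and-forget version records `db.close` before either insert, while the awaited version records it last:

```javascript
// Toy illustration of the ordering problem: without await, close()
// runs before the async work; with await, it runs after.
async function run(useAwait) {
  const order = []
  const insert = async (name) => {
    await new Promise((r) => setImmediate(r)) // simulates a DB round trip
    order.push(name)
  }
  if (useAwait) {
    await insert('insertJSON')
    await insert('insertLicenses')
  } else {
    insert('insertJSON')     // fire-and-forget: still pending
    insert('insertLicenses') // when the next line runs
  }
  order.push('db.close')
  return order
}
```

This is exactly why the original script hangs: `db.close()` fires on a connection whose queued work has not started, and the driver's still-pending operations keep the event loop alive.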

How to connect to MongoDB with Async/Await in Nodejs?

I have a Connection class with static methods that lets me create a singleton-style object for MongoDB connections. Using async along with await, I never get the connection to 'fire' before the rest of my code executes.
Using the traditional Promise / .then, this Connection class works. I'm on the latest Node.js and MongoDB versions.
static connectDb() {
  // If MongoDB is already connected, return db object
  if (this.dbClient) {
    //const currDbClient = Promise.resolve(this.dbClient);
    console.log(`MongoDB already connected!`);
    return this.dbClient;
  }
  // Otherwise connect
  else {
    async () => {
      try {
        const newDbClient = await MongoClient.connect(this.url, this.options);
        console.log(`DB is connected? ${newDbClient.isConnected()}`);
        ConnectMeCore.dbClient = newDbClient;
        return newDbClient;
      } catch (error) {
        console.error(`MongoDB connection failed with > ${error}`);
      }
    };
  }
}
I expect the await to 'wait' for the DB to connect, or at least resolve the promise.
Thanks to @JaromandaX for helping find the answer!
The calling code can use a Promise.then to execute code once the DB connection is established.
DbConnection.connectDb().then(() => {
  console.log("Is it connected? " + DbConnection.isConnected());
  // Do CRUD
  DbConnection.closeDb();
});
You can import this method (as part of the 'Connection' class) into any class that needs a DB connection: a singleton for one DB connection. The working method fragment is as follows.
static async connectDb() {
  // If MongoDB is already connected, return db object
  if (this.dbClient) {
    const currDbClient = Promise.resolve(this.dbClient);
    console.log(`MongoDB already connected!`);
    return currDbClient;
  }
  // Otherwise connect using 'await'; the whole method is async
  else {
    try {
      const newDbClient = await MongoClient.connect(this.url, this.options);
      console.log(`DB is connected? ${newDbClient.isConnected()}`);
      this.dbClient = newDbClient;
      return newDbClient;
    } catch (error) {
      console.error(`MongoDB connection failed with > ${error}`);
      throw error;
    }
  }
}
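One subtlety worth noting about the working version: it caches the resolved client, so two calls made before the first connect finishes will both open a connection. Caching the promise itself closes that window. A runnable sketch with a counting fake in place of MongoClient (the fake and its `id` field are invented here for illustration):

```javascript
// Cache the connect *promise*, not the resolved client, so concurrent
// early callers share one in-flight connection attempt.
let connectCount = 0
const fakeMongoClient = {
  connect: async () => { connectCount++; return { id: connectCount } },
}

let clientPromise = null

function connectDb() {
  if (!clientPromise) clientPromise = fakeMongoClient.connect()
  return clientPromise // every caller awaits the same promise
}
```

With the client-caching version, `Promise.all([connectDb(), connectDb()])` at startup can open two connections; with promise caching it opens exactly one.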

Call Magento SOAP inside Meteor method invoked by the client

I'm using the zardak:soap package in Meteor to connect with the Magento SOAP v2 API. I've created a file inside the 'server' folder where I create a soap connection on Meteor.startup. Then I run a ticker that invokes a random soap method every 30 seconds just to keep the connection alive.
let soapConnection;

Meteor.startup(() => {
  soapConnection = createAPIConnection('http://magento.site.com/api/v2_soap/?wsdl=1', { username: 'user', apiKey: 'password' });
});

function createAPIConnection(url, credentials) {
  try {
    let client = Soap.createClient(url);
    let loginResult = client.login(credentials);
    let sessionId = loginResult.loginReturn.$value;
    return {
      conn: client,
      sessionId: sessionId
    };
  } catch (e) {
    if (e.error === 'soap-creation') {
      console.log('SOAP Client creation failed');
    }
    return null;
  }
}

function tick() {
  try {
    soapConnection.conn.catalogCategoryInfo({
      sessionId: soapConnection.sessionId,
      categoryId: 1
    }, (err, result) => { });
  } catch (e) { }
}
Then I have a Meteor method that is called from the client. When it is called, the soap method call fails and I'm getting a 'soap error' message in console.
Meteor.methods({
  'createMagentoCustomer'(customer) {
    try {
      soapConnection.conn.customerCustomerCreate({
        sessionId: soapConnection.sessionId,
        customerData: customer
      }, (err, res) => {
        if (err)
          console.log('soap error');
        else
          console.log(res);
      });
    } catch (e) {
      console.log('SOAP Method <customerCustomerCreate> call failed');
    }
  },
});
So, the ticker works with no problems, but when I call soap via the Meteor method, it fails. Note that soapConnection is not null, and I do receive an error in the soap method callback.
Any suggestions?
Meteor version 1.3.4.1
