The below code works, but I would like to only use async/await, so my question is: how can I turn
cat.save().then(() => console.log('Saved in db'));
into using await instead?
The reason I have mongoose.connection.once() is to only send commands once MongoDB is connected. If this could use await as well, that would be really great =)
import mongoose from 'mongoose';
import { connectDb } from './modules/connectDb';
const { Schema } = mongoose;
const catSchema = new Schema({ name: String });
(async () => {
  connectDb('testDB');
  mongoose.connection.once('open', () => {
    console.log('MongoDB is connected');
    mongoose.connection.db.listCollections().toArray(function (err, names) {
      console.log(names);
    });
    const catModel = mongoose.model('testColl', catSchema);
    const cat = new catModel({ name: 'Zildjian' });
    cat.save().then(() => console.log('Saved in db'));
  });
})();
connectDb.ts
import mongoose from 'mongoose';
import { strict as assert } from 'assert';
import { readToml } from './readToml';
const db = readToml('./config/database.toml');
export function connectDb(
  database: string = db.database,
  uri: string = db.uri,
  username: string = db.username,
  password: string = db.password,
) {
  assert(typeof uri === 'string');
  assert(typeof database === 'string');
  assert(typeof username === 'string');
  assert(typeof password === 'string');
  const URI = `mongodb+srv://${username}:${password}@${uri}/${database}?retryWrites=true&w=majority`;
  try {
    mongoose.connect(URI);
  } catch (err) {
    console.error(err);
  }
}
Try the code below, and keep in mind that any function containing the await keyword must itself be an async function (the async keyword goes before the function declaration). In your case the callback function is already defined as an async function.
Just change the saving part to the code below and you are good to go.
try {
  const catModel = mongoose.model('testColl', catSchema);
  const cat = new catModel({ name: 'Zildjian' });
  const response = await cat.save(); // if error while saving, catch will get executed
  console.log(response); // saved record
  // return success response
} catch (err) {
  console.log('err' + err);
  // return error response
}
First you need to make connectDb async, and then get rid of mongoose.connection.once(), as otherwise all your Mongoose code would need to live inside that callback.
main().catch((err) => console.log(err));

async function main() {
  await connectDb('testDB');
  const catSchema = new mongoose.Schema({ name: String });
  const catDocument = mongoose.model('testColl', catSchema);
  const catObj = new catDocument({ name: 'Zildjian' });
  await catObj.save();
}
connectDb.ts
import mongoose from 'mongoose';
import { strict as assert } from 'assert';
import { readToml } from './readToml';
const db = readToml('./config/database.toml');
export async function connectDb(
  database: string = db.database,
  uri: string = db.uri,
  username: string = db.username,
  password: string = db.password,
) {
  assert(typeof uri === 'string');
  assert(typeof database === 'string');
  assert(typeof username === 'string');
  assert(typeof password === 'string');
  const URI = `mongodb+srv://${username}:${password}@${uri}/${database}?retryWrites=true&w=majority`;
  const options = {
    bufferCommands: false,
    autoCreate: false,
  };
  try {
    await mongoose.connect(URI, options);
  } catch (err: any) {
    throw err.message;
  }
}
In order to use await you have to provide a promise to await.
cat.save() returns a promise, so this should work. Nevertheless, you can only use await inside an async function, so you should also declare the callback function for the open event as async:
(async () => {
  connectDb('testDB');
  mongoose.connection.once('open', async () => {
    console.log('MongoDB is connected');
    mongoose.connection.db.listCollections().toArray(function (err, names) {
      console.log(names);
    });
    const catModel = mongoose.model('testColl', catSchema);
    const cat = new catModel({ name: 'Zildjian' });
    await cat.save();
    console.log('Saved in db');
  });
})();
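If you would also like to await the connection becoming open, as the question wishes, one option (a sketch, not part of the original answer) is to wrap the 'open' event in a Promise inside the async IIFE:
// Sketch: await the 'open' event instead of passing a callback
// (assumes this runs inside the async IIFE, after connectDb('testDB'))
await new Promise((resolve, reject) => {
  mongoose.connection.once('open', resolve);
  mongoose.connection.once('error', reject);
});
console.log('MongoDB is connected');
// ...the model and save code can now follow linearly, with plain await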
Where am I going wrong here?
Using mocha, chai, sinon and proxyquire for an express server and sequelize ORM linked with a postgres database.
I am trying to test a login controller route from my express server.
Before I show the file which I want to run my test on, here is what the "../services/authService.js" file looks like:
../services/authService
const UserService = require("./userService");

module.exports = class AuthService extends UserService {
};
// so UserService will have the method findByEmail
// UserService class looks like this and it is coming from another file require("./userService.js") as stated above
/*
class UserService {
  async findByEmail(email) {
    try {
      const user = await User.findOne({ where: { email: email }});
      if (user) {
        return user;
      }
      throw new Error("User not found");
    } catch (err) {
      err.code = 404;
      throw err
    }
  }
}
*/
And here is the auth-controller.js file which I want to run the test on
auth-controller.js
const bcrypt = require('bcryptjs');
const AuthService = require("../services/authService"); // is a class which extends from another class, see the code above
const authService = new AuthService();
const jwtGenerator = require('../utils/jwtGenerator');
const createError = require("http-errors");

exports.loginRoute = async (req, res, next) => {
  try {
    req.body.password = String(req.body.password);
    // db query trying to force a sinon.stub to resolve a fake value. But code won't pass here, hence 500 error
    const userQuery = await authService.findByEmail(req.body.email);
    const compare = await bcrypt.compare(req.body.password, userQuery.password);
    if (!compare) throw createError(401, 'Incorrect password.');
    const user = {
      id: userQuery.id, role: userQuery.is_admin ? "Administrator" : "User", email: userQuery.email, Authorized: true
    }
    const token = jwtGenerator(user);
    return res
      .cookie("access_token", token, {
        httpOnly: true,
        secure: process.env.NODE_ENV === "production",
      }).status(200).json({ message: "Logged in successfully", user, token });
  } catch (error) {
    next(error);
  }
}
This code works in production but I cannot seem to test it. I used proxyquire to require the modules that the function uses. I have a big problem making proxyquire work with my class AuthService; here is my test file. Proxyquire is somehow not working with classes: it is not using my AuthServiceMock at all, and I can't figure out why.
First off, these are my helper variables which I will use in the test file.
../test-utils/user-helper
const jwt = require("jsonwebtoken"); // assumed import; jwt and TOKEN are used below but not shown in the original

const createAccessToken = (payload) => jwt.sign(payload, TOKEN, { expiresIn: "1h" });

let loginDetail = {
  email: "admin@test.com",
  password: "123456"
};

let loginAdminUser = {
  id: 1,
  email: "admin@test.com",
  password: "123456",
  is_admin: true
}

const loginUser = {
  id: 1,
  email: "admin@test.com",
  password: "123456",
  is_admin: true
}

const adminUser = {
  id: 1,
  email: 'admin@test.com',
  password: '123456',
  is_admin: true,
  first_name: 'john',
  last_name: 'doe',
  created_at: "2020-06-26T09:31:36.630Z",
  updated_at: "2020-06-26T09:31:49.627Z"
}

module.exports = {
  createAccessToken,
  loginDetail,
  loginAdminUser,
  loginUser,
  adminUser
}
And here is the test file. I placed comments especially around proxyquire, since it is giving me some issues when used with classes. Also, it is not calling the mocked/stubbed npm modules for some reason.
auth-controller.spec.js
"use strict";
const _ = require("lodash");
const path = require("path");
const proxyquire = require("proxyquire").noCallThru().noPreserveCache();
const chai = require("chai");
const { expect } = chai;
const sinon = require("sinon");
const sinonChai = require("sinon-chai");
chai.use(sinonChai);
// const AuthServiceOriginalClass = require("../../services/authService"); If i use this directly in proxyquire it calls the original class
const { createAccessToken, loginDetail, loginAdminUser, loginUser, adminUser } = require("../test-utils/user-helper");
const controllerPath = path.resolve('./controllers/authController.js');
describe("login route", () => {
let proxy, authService, bcryptStub, fakeCallback, fakeReq, fakeRes, fakeNext, resolveFn, token;
let result, bcryptStubbing, response;
class UserServiceMock {
async findByEmail(email) {
try {
if (email) {
return loginAdminUser;
}
} catch (error) {
throw error;
}
}
}
class AuthServiceMock extends UserServiceMock {};
bcryptStub = {
compare: function() { return true }
};
let tokeen = (kk) => {
return createAccessToken(kk);
}
// token = sinon.mock(createAccessToken(loginAdminUser)); // ?? which 1 to use?
token = sinon.spy(createAccessToken); // ?? which 1 to use?
// token = sinon.stub(createAccessToken) ?? which 1 to use?
proxy = proxyquire(controllerPath, {
"../services/authService.js": AuthServiceMock, // seems like this is not called at all
// "../services/authService.js": AuthServiceOriginalClass, // commented out if use this instead it calls the original class instant
"bcryptjs": bcryptStub,
"../utils/jwtGenerator": token,
// "#noCallThru": true // keep on or off?
});
before("Stub my methods", () => {
authService = new AuthServiceMock();
// If I call the entire loginRoute I want this stub authTry to be called inside of it and resolve that object value
authTry = sinon.stub(authService, "findByEmail").withArgs(loginDetail.email).resolves(loginAdminUser);
sinon.stub(bcryptStub, "compare").resolves(true); // force it to return true as that seems to be like the code of authController.js
// sinon.stub(token, "createAccessToken")
});
before("call the function loginRoute", async () => {
// fakeCallback = new Promise((res, rej) => {
// resolveFn = res
// });
fakeReq = {
body: {
email: loginDetail.email,
password: loginDetail.password
}
};
fakeRes = {
cookie: sinon.spy(),
status: sinon.spy(),
json: sinon.spy()
}
fakeNext = sinon.stub();
await proxy.loginRoute(fakeReq, fakeReq, fakeNext).then((_result) => {
result = _result;
});
console.log("result")
console.log(result) // undefined
console.log("result")
});
it("login route test if the stubs are called", async () => {
expect(authService.findByEmail).to.have.been.called // never called
// expect(bcryptStubbing).to.have.been.called // never called
// expect(response.status).to.deep.equal(200); // doesn't work
}).timeout(10000);
after(() => {
sinon.reset()
});
});
Where am I going wrong here in the test?
I'm trying to stream JSON from MongoDB to S3 with the new version of @aws-sdk/lib-storage:
"@aws-sdk/client-s3": "^3.17.0"
"@aws-sdk/lib-storage": "^3.34.0"
"JSONStream": "^1.3.5",
Try #1: It seems that I'm not using JSONStream.stringify() correctly:
import JSONStream from 'JSONStream'; // assumed import; JSONStream appears in the dependency list above
import { MongoClient } from 'mongodb';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const s3Client = new S3Client({ region: env.AWS_REGION });

export const uploadMongoStreamToS3 = async (connectionString, collectionName) => {
  let client;
  try {
    client = await MongoClient.connect(connectionString);
    const db = client.db();
    const readStream = db.collection(collectionName).find('{}').limit(5).stream();
    readStream.pipe(JSONStream.stringify());
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: 'extracted-data/benda_mongo.json',
        Body: readStream,
      },
    });
    await upload.done();
  }
  catch (err) {
    log.error(err);
    throw err.name;
  }
  finally {
    if (client) {
      client.close();
    }
  }
};
Error #1:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of
type string, Buffer, ArrayBuffer, Array, or Array-like Object.
Received type object
    at Function.from (buffer.js:305:9)
    at getDataReadable (/.../node_modules/@aws-sdk/lib-storage/src/chunks/getDataReadable.ts:6:18)
    at processTicksAndRejections (internal/process/task_queues.js:94:5)
    at Object.getChunkStream (/.../node_modules/@aws-sdk/lib-storage/src/chunks/getChunkStream.ts:17:20)
    at Upload.__doConcurrentUpload (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:121:22)
    at async Promise.all (index 0)
    at Upload.__doMultipartUpload (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:196:5)
    at Upload.done (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:88:12)
Try #2, using the variable jsonStream:
const readStream = db.collection(collectionName).find('{}').limit(5).stream();
const jsonStream = readStream.pipe(JSONStream.stringify());
const upload = new Upload({
  client: s3Client,
  params: {
    Bucket: 'test-bucket',
    Key: 'extracted-data/benda_mongo.json',
    Body: jsonStream,
  },
});
Error #2:
ReferenceError: ReadableStream is not defined
    at Object.getChunk (/.../node_modules/@aws-sdk/lib-storage/src/chunker.ts:22:30)
    at Upload.__doMultipartUpload (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:187:24)
    at Upload.done (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:88:37)
Try #3: use stream.PassThrough:
client = await MongoClient.connect(connectionString);
const db = client.db();
const readStream = db.collection(collectionName).find('{}').limit(5).stream();
readStream.pipe(JSONStream.stringify()).pipe(uploadStreamFile('benda_mongo.json'));

...

const stream = require('stream');

export const uploadStreamFile = async (fileName) => {
  try {
    const pass = new stream.PassThrough();
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: 'extracted-data/benda_mongo.json',
        Body: pass,
      },
    });
    const res = await upload.done();
    log.info('finished uploading file', fileName);
    return res;
  }
  catch (err) {
    return;
  }
};
Error #3:
TypeError: dest.on is not a function
    at Stream.pipe (internal/streams/legacy.js:30:8)
Try #4: mongodb.stream({transform: doc => JSON.stringify...}) instead of JSONStream:
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';
import { env } from '../../../env';

const s3Client = new S3Client({ region: env.AWS_REGION });

export const uploadMongoStreamToS3 = async (connectionString, collectionName) => {
  let client;
  try {
    client = await MongoClient.connect(connectionString);
    const db = client.db();
    const readStream = db.collection(collectionName)
      .find('{}')
      .limit(5)
      .stream({ transform: doc => JSON.stringify(doc) + '\n' });
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: 'extracted-data/benda_mongo.json',
        Body: readStream,
      },
    });
    await upload.done();
  }
  catch (err) {
    log.error('waaaaa', err);
    throw err.name;
  }
  finally {
    if (client) {
      client.close();
    }
  }
};
Error #4:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of
type string, Buffer, ArrayBuffer, Array, or Array-like Object.
Received type object
    at Function.from (buffer.js:305:9)
    at getDataReadable (/.../node_modules/@aws-sdk/lib-storage/src/chunks/getDataReadable.ts:6:18)
    at processTicksAndRejections (internal/process/task_queues.js:94:5)
    at Object.getChunkStream (/.../node_modules/@aws-sdk/lib-storage/src/chunks/getChunkStream.ts:17:20)
    at Upload.__doConcurrentUpload (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:121:22)
    at async Promise.all (index 0)
    at Upload.__doMultipartUpload (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:196:5)
    at Upload.done (/.../node_modules/@aws-sdk/lib-storage/src/Upload.ts:88:12)
Try #5: using stream.PassThrough() and returning pass to pipe:
export const uploadMongoStreamToS3 = async (connectionString, collectionName) => {
  let client;
  try {
    client = await MongoClient.connect(connectionString);
    const db = client.db();
    const readStream = db.collection(collectionName).find('{}').limit(5).stream({ transform: doc => JSON.stringify(doc) + '\n' });
    readStream.pipe(uploadStreamFile());
  }
  catch (err) {
    log.error('waaaaa', err);
    throw err.name;
  }
  finally {
    if (client) {
      client.close();
    }
  }
};

const stream = require('stream');

export const uploadStreamFile = async () => {
  try {
    const pass = new stream.PassThrough();
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: 'extracted-data/benda_mongo.json',
        Body: pass,
      },
    });
    await upload.done();
    return pass;
  }
  catch (err) {
    log.error('pawoooooo', err);
    return;
  }
};
Error #5:
TypeError: dest.on is not a function
    at Cursor.pipe (_stream_readable.js:680:8)
After reviewing your error stack traces, the problem probably has to do with the fact that the MongoDB driver provides a cursor in object mode, whereas the Body parameter of Upload requires a traditional stream whose chunks can be processed as Buffers.
Taking your original code as reference, you can try providing a Transform stream to deal with both requirements.
Please consider, for instance, the following code:
import { Transform } from 'stream';
import { MongoClient } from 'mongodb';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const s3Client = new S3Client({ region: env.AWS_REGION });

export const uploadMongoStreamToS3 = async (connectionString, collectionName) => {
  let client;
  try {
    client = await MongoClient.connect(connectionString);
    const db = client.db();
    const readStream = db.collection(collectionName).find('{}').limit(5).stream();
    // We are creating here a Transform to adapt both sides
    const toJSONTransform = new Transform({
      writableObjectMode: true,
      transform(chunk, encoding, callback) {
        this.push(JSON.stringify(chunk) + '\n');
        callback();
      }
    });
    readStream.pipe(toJSONTransform);
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: 'extracted-data/benda_mongo.json',
        Body: toJSONTransform,
      },
    });
    await upload.done();
  }
  catch (err) {
    log.error(err);
    throw err.name;
  }
  finally {
    if (client) {
      client.close();
    }
  }
};
In the code, toJSONTransform defines the writable part of the stream as object mode; in contrast, the readable part will be suitable for being read by the S3 Upload method... at least, I hope so.
Regarding the second error you reported, the one related to dest.on: I initially thought (and wrote to you about the possibility) that the error arose because uploadStreamFile returns a Promise, not a stream, and you were passing that Promise to the pipe method, which requires a stream; basically, that you returned the wrong variable. But I hadn't realized that you are passing the PassThrough stream as a param to the Upload method. Please be aware that this stream doesn't contain any information, because you never write anything to it: the contents of the readable stream obtained from the MongoDB query are never piped into the PassThrough, nor given to the Upload itself.
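To make that last point concrete, here is a minimal sketch (not part of the original answer) of the missing piece: the MongoDB stream has to be piped into the PassThrough before the PassThrough is handed to Upload. The names (readStream, s3Client) are taken from the question's code:
// Sketch: the PassThrough only carries data if something writes into it
const pass = new stream.PassThrough();
readStream.pipe(JSONStream.stringify()).pipe(pass); // actually feed the PassThrough
const upload = new Upload({
  client: s3Client,
  params: { Bucket: 'test-bucket', Key: 'extracted-data/benda_mongo.json', Body: pass },
});
await upload.done();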
I found an additional solution using stream.PassThrough; using JSONStream will stream an array of objects instead of one after the other:
export const uploadMongoStreamToS3 = async (connectionString, collectionName) => {
  let client;
  try {
    client = await MongoClient.connect(connectionString);
    const db = client.db();
    const passThroughStream = new stream.PassThrough();
    const readStream = db.collection(collectionName)
      .find('{}')
      .stream();
    readStream.on('end', () => passThroughStream.end());
    readStream.pipe(JSONStream.stringify()).pipe(passThroughStream);
    await uploadStreamFile('benda_mongo.json', passThroughStream);
  }
  catch (err) {
    log.error(err);
    throw err.name;
  }
  finally {
    if (client) {
      client.close();
    }
  }
};
export const uploadStreamFile = async (fileName, stream) => {
  try {
    log.info('start uploading file', fileName);
    const upload = new Upload({
      client: s3Client,
      params: {
        Bucket: 'test-bucket',
        Key: `${fileName}`,
        Body: stream,
      },
    });
    const res = await upload.done();
    log.info('finished uploading file', fileName);
    return res;
  }
  catch (err) {
    log.error(err);
    return;
  }
};
I have a file index.js as below, where I am trying to call an async function getConn inside another function, createThumbnails. But I am getting the error "failed to connect to DEDC: 1433 - self signed certificate" in the catch block.
const sharp = require('sharp');
const sql = require('mssql')

// CONNECTION CONFIGURATION OF BASE DB
async function getConn() {
  try {
    const config = {
      user: 'sa_user',
      password: '*******',
      server: 'DEDC',
      database: 'DEMO_BASE'
    }
    const pool = await new sql.ConnectionPool(config)
    const req = await pool.connect()
    const conn = await req.request()
    return conn;
  } catch (err) {
    return err;
  }
};

const createThumbnails = async () => {
  try {
    var conn = await getConn();
    const query = `exec DBBASE.get_client_info`
    var clientusers = await conn.query(query);
  } catch (err) {
    return err;
  }
}
createThumbnails()
How exactly do I call the function getConn inside createThumbnails? Please help. Thanks in advance.
It's because you are using a variable with the same name as the function.
Try a different name:
var conn = await getConn();
const query = `exec DBBASE.get_client_info`
var clientusers = await conn.query(query);
You've encountered what's called hoisting. Kyle Simpson has a great explanation of this topic.
var getConn = await getConn();
means that getConn will be initialized first, before the assignment, which is equivalent to
var getConn // initialized
getConn = await getConn() // assignment
and then it turned out that you got the error.
The solution here is to store the result in a different variable name, like
var conn = await getConn();
async function getConn() {
  return {
    query: async () => {
      console.log("query called");
    },
  };
}

const createThumbnails = async () => {
  try {
    var conn = await getConn();
    const query = `exec DBBASE.get_client_info`;
    var clientusers = await conn.query(query);
  } catch (err) {
    console.log(err);
  }
};

createThumbnails();
We need to use trustServerCertificate: true in the DB configuration, i.e. in const config.
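For illustration, a sketch of how that option might look in the question's config (with mssql, connection options go under an options key; the other values are copied from the question):
// Sketch: mssql config that accepts the server's self-signed certificate
const config = {
  user: 'sa_user',
  password: '*******',
  server: 'DEDC',
  database: 'DEMO_BASE',
  options: {
    trustServerCertificate: true, // trust the self-signed certificate
  },
};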
I created a function to send email with nodemailer, but after running it my console throws:
TypeError: cb is not a function
    at tryHandleCache (C:\Users\Maciek\Desktop\GoParty\backend\node_modules\ejs\lib\ejs.js:226:12)
    at Object.exports.renderFile (C:\Users\Maciek\Desktop\GoParty\backend\node_modules\ejs\lib\ejs.js:437:10)
    at Object.fn (C:\Users\Maciek\Desktop\GoParty\backend\api\controllers\user\create.js:47:28)
    at <anonymous>
    at process._tickDomainCallback (internal/process/next_tick.js:229:7)
My function in sendEmails.js:
const transporter = require('nodemailer').createTransport(sails.config.custom.email)

module.exports = {
  inputs: {
    to: { type: 'string', required: true },
    subject: { type: 'string', required: true },
    html: { type: 'string', required: true }
  },
  exits: {
    success: {
      description: 'All done.'
    }
  },
  fn: async function (inputs, exits) {
    const options = {
      from: sails.config.custom.email.auth.user,
      to: inputs.to,
      subject: inputs.subject,
      html: inputs.html
    }
    transporter.sendMail(options, (err, info) => {
      if (err) {
        return exits.error(err)
      } else return exits.success(info.response)
    })
  }
}
My create.js, where I must send the email with the correct variables:
const ejsVariable = {
  activeCode: inputs.activateCode
}

// const html = await ejs.renderFile(templatePath, ejsVariable)
// const subject = 'EventZone - potwierdzenie rejestracji'
// const res = await sails.helpers.email.sendEmail(inputs.email, subject, html)
// if (!res) {
//   return this.res.badRequest('Confirmation email has not been send.')
// }
Thanks for any help.
ejs.renderFile takes 4 parameters, the last one is a function. Example usage:
ejs.renderFile(filename, data, options, function (err, str) {
  // str => Rendered HTML string
});
It doesn't return a promise, so you can't await it.
Try replacing
const html = await ejs.renderFile(templatePath, ejsVariable)
const subject = 'xxx'
const res = await sails.helpers.email.sendEmail(inputs.email, subject, html)
with
ejs.renderFile(templatePath, ejsVariable, async (err, html) => {
  const subject = 'xxx'
  const res = await sails.helpers.email.sendEmail(inputs.email, subject, html)
})
UPDATE
You can use util.promisify to make the ejs.renderFile function return a promise and thus work with async/await, like so:
const util = require('util') //first import `util`
....
const asyncEjsRenderFile = util.promisify(ejs.renderFile)
const html = await asyncEjsRenderFile(templatePath, ejsVariable)
const subject = 'xxx'
const res = await sails.helpers.email.sendEmail(inputs.email, subject, html)
FIXED: use storageEngine: "wiredTiger"
I use Mocha / Chai / Supertest and mongodb-memory-server to test my app, but I received the error: Transaction numbers are only allowed on storage engines that support document-level locking
With the real database, tested via Postman, it works well.
My code:
In database.js
const mongoose = require('mongoose')
const { MongoMemoryReplSet } = require('mongodb-memory-server')
mongoose.set('useFindAndModify', false);
const connect = async () => {
try {
let url = process.env.MONGO_URL
let options = {
//Something
}
if (process.env.NODE_ENV === 'test') {
const replSet = new MongoMemoryReplSet();
await replSet.waitUntilRunning();
const uri = await replSet.getUri();
await mongoose.connect(uri, options)
//log connected
} else {
await mongoose.connect(url, options)
//log connected
}
} catch (error) {
//error
}
}
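For reference, the fix noted at the top (storageEngine: "wiredTiger") would be applied where the replica set is created; a sketch against the mongodb-memory-server options (double-check against your installed version's API):
// Sketch: request a replica set backed by wiredTiger, which supports the
// document-level locking that transactions require
const replSet = new MongoMemoryReplSet({
  replSet: { storageEngine: 'wiredTiger' },
});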
I have two models: Company and User. I made a function to add a member to a company, using a transaction. My code:
const addMember = async (req, res, next) => {
  const { companyId } = req.params
  const { userId } = req.body
  const session = await mongoose.startSession()
  try {
    await session.withTransaction(async () => {
      const [company, user] = await Promise.all([
        Company.findOneAndUpdate(
          //Something
        ).session(session),
        User.findByIdAndUpdate(
          //Something
        ).session(session)
      ])
      //Something if... else
      return res.json({
        message: `Add member successfully!`,
      })
    })
  } catch (error) {
    //error
  }
}
Here's the router:
router.post('/:companyId/add-member',
  authentication.required,
  company.addMember
)
Test file:
const expect = require('chai').expect
const request = require('supertest')
const app = require('../app')

describe('POST /company/:companyId/add-member', () => {
  it('OK, add member', done => {
    request(app).post(`/company/${companyIdEdited}/add-member`)
      .set({ "x-access-token": signedUserTokenKey })
      .send({ userId: memberId })
      .then(res => {
        console.log(res.body)
        expect(res.statusCode).to.equals(200)
        done()
      })
      .catch((error) => done(error))
  })
})
And I received the error: Transaction numbers are only allowed on storage engines that support document-level locking.
How can I fix this?
Add retryWrites=false to your database uri. Example below:
mongodb://xx:xx#xyz.com:PORT,zz.com:33427/database-name?replicaSet=rs-xx&ssl=true&retryWrites=false