Await isn't waiting for promise to resolve - javascript

Good evening all!
I have been stuck on this issue for a while and can't seem to solve it through sheer Googling, so I am reaching out to you all.
Context:
I am writing a small application that handles the calendars and basic project information for all the interns at our company. My boss is constantly asking me what they're up to, and I wanted to give him something he could look at, so I decided to solve it with code whilst also learning a new framework (Express) in the process.
Right now I have my routes, controllers, and DB cursor all set up. When I hit the route I have defined, it runs the getAllUsers() controller function, which in turn calls the getAllUsers() function on the DB cursor. I want the code to wait for the DB cursor to return its result before continuing, but it isn't, and I can't work out why. The DB cursor code itself does work, because it fetches the data and logs it out fine.
Any help would be greatly appreciated. I have put the three pieces of code in question below; let me know if you need to see more.
P.S. Ignore the 'here 1', 'here 2', etc. logs; they are how I have been working out what's happening at any point in time.
routes.ts
import express from 'express';
import controllers from './controller.js';

export default (app: express.Application) => {
  // Users
  app.route('/users').get(controllers.getAllUsers)
  app.route('/users').post(controllers.postNewUser)
  app.route('/users').delete(controllers.deleteUser)
  app.route('/user/:emailAddress').get(controllers.getUser)
  app.route('/user/:emailAddress').put(controllers.updateUser)
}
controllers.ts
import express from 'express';
import dbcursor from '../services/dbcursor.js';

// Interfaces
import { Project, User } from '../services/interfaces.js'

const controllers = {
  // Users
  getAllUsers: async (req: express.Request, res: express.Response) => {
    try {
      const dbRes = await dbcursor.getAllUsers();
      console.log('here 3', dbRes)
      res.status(200).json({
        message: 'Users fetched successfully!',
        dbRes: dbRes
      });
    } catch (err) {
      res.status(400).json({
        message: 'Failed to get users.',
        dbRes: err
      });
    }
  },
}

export default controllers;
dbcursor.ts
import dotenv from 'dotenv';
import mongodb from 'mongodb'

dotenv.config();

// Interfaces
import { User, Project } from './interfaces'

// DB Client Creation
const { MongoClient } = mongodb;
const uri = process.env.DB_URI || ''
const client = new MongoClient(uri, { useNewUrlParser: true, useUnifiedTopology: true });

const dbcursor = {
  // Users
  getAllUsers: async () => {
    let dbRes;
    try {
      await client.connect(async err => {
        if (err) throw err;
        console.log("here 1", dbRes)
        const collection = client.db("InternManager").collection("Users");
        dbRes = await collection.find().toArray()
        console.log("here 2", dbRes)
        return dbRes;
      });
    } catch (err: any) {
      return err;
    }
  },
}

export default dbcursor;

It's generally not a good idea to mix callbacks and promises. Try not passing a callback to the client.connect method, and you should be able to await the promise as expected:
getAllUsers: async () => {
  let dbRes;
  try {
    await client.connect();
    console.log("here 1", dbRes)
    const collection = client.db("InternManager").collection("Users");
    dbRes = await collection.find().toArray()
    console.log("here 2", dbRes)
    return dbRes;
  } catch (err: any) {
    throw err; // If you're just catching and rethrowing the error, it would be okay to drop the try/catch entirely
  }
},
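As a follow-up, connecting inside every request is also relatively expensive. A minimal sketch of one common alternative, assuming the same client and collection names as above (this is not part of the original answer): connect once at module load and let every query share that one connection.

// Sketch: kick off the connection once; every query awaits the same promise.
const connected = client.connect();

const dbcursor = {
  getAllUsers: async () => {
    await connected; // resolves once at startup, then is a no-op for later callers
    const collection = client.db("InternManager").collection("Users");
    return collection.find().toArray();
  },
}

export default dbcursor;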

Related

Looking for a solution to stream dynamic changes in JSON data onto the browser without refreshing

I have a Node.js / Next.js API that essentially does a bunch of stuff after the user submits text into a form on the frontend. One of the things it does is periodically write stage-completion messages to a JSON file to signify the completion of certain stages.
My API looks something like this:
import dbConnect from '../../../lib/dbConnect'
import Demo from '../../../models/Demo'
import fs from 'fs'
import shell from 'shelljs';

export default async function handler(req, res) {
  const {
    method,
    body,
  } = req
  await dbConnect()
  switch (method) {
    case 'GET':
      try {
        const demos = await Demo.find({})
        res.status(200).json({ success: true, data: demos })
      } catch (error) {
        res.status(400).json({ success: false })
      }
      break
    case 'POST':
      try {
        const initialjson = '[]'
        const timestamp = Date.now();
        // stage 1
        if (shell.exec('./initial_checks.sh').code !== 0) {
          shell.echo('Sorry stage failed');
          shell.exit(1);
        };
        const objSuccess1 = JSON.parse(initialjson);
        objSuccess1.push("Stage 1 complete", + timestamp);
        const finalJSONSuccess1 = JSON.stringify(objSuccess1);
        fs.writeFileSync('success-stage.json', finalJSONSuccess1);
        // stage 2
        if (shell.exec('./secondary_checks.sh').code !== 0) {
          shell.echo('Sorry stage failed');
          shell.exit(1);
        };
        const objSuccess2 = JSON.parse(initialjson);
        objSuccess2.push("Stage 2 complete", + timestamp);
        const finalJSONSuccess2 = JSON.stringify(objSuccess2);
        fs.writeFileSync('success-stage.json', finalJSONSuccess2);
        const demo = await Demo.create(
          req.body
        )
        res.status(201).json({ success: true, data: demo })
      } catch (error) {
        res.status(400).json({ success: false })
      }
      break
    default:
      res.status(400).json({ success: false })
      break
  }
}
I am using socket.io; my server.js file is:
server.js
const app = require("express")();
const server = require("http").Server(app);
const io = require("socket.io")(server);
const next = require("next");
const dev = process.env.NODE_ENV !== "production";
const nextApp = next({ dev });
const nextHandler = nextApp.getRequestHandler();
let port = 3000;
const fs = require('fs')
const data = fs.readFileSync('success-stage.json', 'utf8')
io.on("connect", (socket) => {
socket.emit("now", {
message: data
});
});
nextApp.prepare().then(() => {
app.all("*", (req, res) => {
return nextHandler(req, res);
});
server.listen(port, (err) => {
if (err) throw err;
console.log("> Ready on port: " + port);
});
});
and here is the pages/index.js file
import { useEffect, useRef, useState } from "react";
import io from "socket.io-client";

export default function IndexPage() {
  const socket = useRef();
  const [hello, setHello] = useState();
  useEffect(() => {
    socket.current = io();
    socket.current.on("now", (data) => {
      setHello(data.message);
    });
  }, []);
  return <h1>{hello}</h1>;
}
So at this point, the 2nd message from my JSON file matches what is rendered on the frontend when I build my application. It looks like this:
["Stage 2 complete",1664289144513]
I am wondering how I can stream this data to the frontend without the client having to refresh the page. I need it to show the current stage's success message. There are 5 total stages, so I guess I am looking for a way to either stream the data or revalidate the browser window every second or so without a full refresh. Is this possible?
Any help would be greatly appreciated... Thanks in advance for your time everyone...
You've already got a solution implemented that can handle this. What you're describing is exactly what sockets are for -- bidirectional communication between the client and server without refreshing the page.
Just create a new socket listener on the frontend for a new topic, maybe "stageStatus", and then emit messages to that topic on the backend at various stages in the process. That's it!
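A rough sketch of what that could look like, based on the server.js and pages/index.js from the question; the "stageStatus" topic name and the emit call sites are assumptions, not something the original code contains:

// server.js: emit to every connected client whenever a stage completes
// (call this from the backend wherever a stage result becomes available;
// getting the io instance to that point is left to the app's structure)
function reportStage(io, message) {
  io.emit("stageStatus", { message });
}

// pages/index.js: subscribe to the same topic inside the existing useEffect
useEffect(() => {
  socket.current = io();
  socket.current.on("stageStatus", (data) => {
    setHello(data.message); // updates the <h1> without a page refresh
  });
}, []);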

MongoDB reusable custom javascript module

I would like to create a local Javascript module I can "require" in other files to handle all MongoDB CRUD operations.
I wrote something like this:
-- dbConn.js file --
require('dotenv').config()
const MongoClient = require('mongodb').MongoClient
const ObjectID = require('mongodb').ObjectID

let _connection

const connectDB = async () => {
  try {
    const client = await MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
    console.log('Connected to MongoDB')
    return client
  } catch (err) {
    console.log(err)
  }
}

exports.findOne = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.findOne(query);
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}

exports.findAll = async () => {
  let client = await connectDB()
  if (!client) {
    return;
  }
  try {
    const db = client.db("Test_DB");
    const collection = db.collection('IoT_data_Coll');
    const query = {}
    let res = await collection.find(query).toArray();
    return res;
  } catch (err) {
    console.log(err);
  } finally {
    client.close();
  }
}
Then in another file (not necessarily inside an Express app), say:
-- app.js ---
const findAll = require('./dbConn').findAll
const findOne = require('./dbConn').findOne
findAll().then(res => JSON.stringify(console.log(res)))
findOne().then(res => JSON.stringify(console.log(res)))
I wonder if this is correct?
Do I have to close the connection after each method/CRUD operation?
I was trying to use an IIFE instead of ".then", as:
(async () => {
  console.log(await findOne())
})()
But I receive a weird error saying that findAll is not a function.
What's wrong with it?
Thanks.
It really depends on your use case, which isn't clear: whether you are using Express or just a standalone script, and how frequently you plan to run app.js.
Either way, your code is expensive: each time you call into dbConn.js you are opening a new connection to the database.
So you can fix app.js by requiring dbConn.js once and reusing a single connection, as sketched below.
The best practice is of course to use connection pooling: https://www.compose.com/articles/connection-pooling-with-mongodb/
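A minimal sketch of that idea, assuming the same env variable and names as the question's dbConn.js; the client promise is created once and cached, so every helper shares the driver's built-in connection pool instead of opening and closing its own connection:

require('dotenv').config()
const { MongoClient } = require('mongodb')

let clientPromise // cached; created on first use, reused afterwards

const getClient = () => {
  if (!clientPromise) {
    clientPromise = MongoClient.connect(process.env.MONGO_DB_URI, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
  }
  return clientPromise
}

exports.findAll = async () => {
  const client = await getClient()
  return client
    .db('Test_DB')
    .collection('IoT_data_Coll')
    .find({})
    .toArray()
  // note: no client.close() here; the pool stays open for the app's lifetime
}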

Promises in JS: using Axios to write to mongoDB

I am struggling to get my head around Promises. I think I understand the concept, but I am unable to get them to work on the backend.
I have read several Stack Overflow posts, and I still see a few that are only months old, so I guess I am not the only one :)
Specifically, I need help with how to pass the result of a resolved promise along within my code. In the code below, I fetch JSON from the Star Wars API and want to write it to a MongoDB Atlas collection.
I use axios.get, which returns a promise. I then resolve it using .then and use insertOne on the MongoDB collection.
On the frontend, e.g. in React, this pattern works as expected: you change the state by calling setState within the .then function.
I don't understand why it doesn't work on the backend.
Could you please tell me what I need to change so I can get it to write to MongoDB Atlas?
var axios = require("axios");
const MongoClient = require("mongodb").MongoClient;
var db;
const getData = () => {
return axios
.get("https://swapi.co/api/people/1")
.then(response => {
if (!response.data) throw Error("No data found.");
console.log(JSON.stringify(response.data)) **//This returns the data as expected.**
return JSON.stringify(response.data);
})
.catch(error => {
console.log(error);
throw error;
});
};
console.log(getData()); **// This returns {Promise <pending>}**
const client = new MongoClient(process.env.MONGODB_URL, {
useNewUrlParser: true,
useUnifiedTopology: true
});
// Connect to database and insert default users into users collection
client.connect(err => {
console.log("Connected successfully to database");
let d = {
name: "Luke Skywalker",
height: "172",
mass: "77",
hair_color: "blond",
skin_color: "fair",
eye_color: "blue"
};
db = client.db(process.env.DB_NAME);
db.collection("macroData").insertOne(d); //this works
db.collection("macroData").insertOne(getData); // this doesn't work as it still appears to be a promise
});
getData() returns a Promise, as you are well aware, so you have to wait on that promise to resolve. A straightforward approach would be to perform the insert once the data is available:
client.connect(err => {
  // ...
  getData().then(data => {
    db.collection('macroData').insertOne(data)
  })
})
Or, if you can use async/await:
client.connect(async err => {
  // ...
  const data = await getData()
  db.collection('macroData').insertOne(data)
})
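For completeness, a variant that avoids passing a callback to connect at all, so everything stays in promise-land (a sketch, assuming the same client, env vars, and getData as above):

async function run() {
  await client.connect() // connect() without a callback returns a promise
  const db = client.db(process.env.DB_NAME)
  const data = await getData()
  await db.collection('macroData').insertOne(data)
}

run().catch(console.error)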
Your MongoDB call needs to be within the axios promise; that way the resolved value can be used to feed your database. That's what held me up for a while...
The code below is for Nick's comment; I couldn't post it in a comment as it was too long. You should have a database named test (or an appropriate name) here: let datab = client.db('test'). I think it comes by default when you create a MongoDB Atlas cluster.
If you change the user and password in the Mongo URL, you should be good to go.
Hope that helps. This creates a Star Wars entry under test.starWarsData.
let axios = require("axios");
let MongoClient = require("mongodb").MongoClient;
let mongoParams = { useNewUrlParser: true, useUnifiedTopology: true };
let mongoUrl =
"mongodb+srv://user:password#cluster0-hnc4i.azure.mongodb.net/test";
//try feeding just the object e, and see if it works, in case your axios error catching is not great.
let e = {
a: "this is a",
b: "this is b",
c: "this is c",
d: "this is d"
};
let newUrl = "https://swapi.co/api/people/1";
console.log(newUrl);
let client = new MongoClient(mongoUrl, mongoParams);
client.connect(err => {
if (err) {
console.log(err.message);
throw new Error("failed to connect");
}
let datab = client.db("test");
console.log("db connected");
try {
axios.get(newUrl).then(res => {
try {
datab.collection("starWarsData").insertOne(res.data);
console.log("insert succeeded");
} catch (err) {
console.log("insert failed");
console.log(err.message);
}
});
} catch (err) {
throw Error("axios get did not work");
}
});

Issues with scope in try/catch while using async/await

My issue is that (seemingly) things are going out of scope, or the scope is being polluted when I enter my catch block in the function below:
export const getOne = model => async (req, res, next) => {
  let id = req.params.id
  let userId = req.user
  try {
    let item = await model.findOne({ _id: id, createdBy: userId }).exec()
    if (!item) {
      throw new Error('Item not found!')
    } else {
      res.status(200).json({ data: item }) // works perfectly
    }
  } catch (e) {
    res.status(400).json({ error: e }) // TypeError: res.status(...).json is not a function
    // also TypeError: next is not a function
    // next(e)
  }
}
Interestingly enough, using res.status(...).end() in the catch block works just fine, but it bothers me that I am not able to send any detail back with the response. According to the Express Documentation for res.send() and res.json I should be able to chain off of .status(), which, also interestingly enough, works just fine in the try statement above if things are successful - res.status(200).json(...) works perfectly.
Also, I tried abstracting the error handling to middleware, as suggested on the Express documentation, and through closures, I should still have access to next in the catch statement, right? Why is that coming back as not a function?
Why does res.status(...).json(...) work in my try but not catch block?
Why is next no longer a function in the catch block?
Thanks in advance!
Edit
This is failing in unit tests; the following code produces the errors described above:
describe('getOne', async () => {
  // this test passes
  test('finds by authenticated user and id', async () => {
    expect.assertions(2)
    const user = mongoose.Types.ObjectId()
    const list = await List.create({ name: 'list', createdBy: user })
    const req = {
      params: {
        id: list._id
      },
      user: {
        _id: user
      }
    }
    const res = {
      status(status) {
        expect(status).toBe(200)
        return this
      },
      json(result) {
        expect(result.data._id.toString()).toBe(list._id.toString())
      }
    }
    await getOne(List)(req, res)
  })
  // this test fails
  test('400 if no doc was found', async () => {
    expect.assertions(2)
    const user = mongoose.Types.ObjectId()
    const req = {
      params: {
        id: mongoose.Types.ObjectId()
      },
      user: {
        _id: user
      }
    }
    const res = {
      status(status) {
        expect(status).toBe(400)
        return this
      },
      end() {
        expect(true).toBe(true)
      }
    }
    await getOne(List)(req, res)
  })
})
Why does res.status(...).json(...) work in my try but not catch block?
Seems like you're passing a non-Express object that only has status and end methods when running the unit tests. That's why it fails to find the json method.
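In other words, the controller's catch block calls res.status(400).json(...), but the mock res in the failing test only implements status and end. A minimal sketch of the fix, under the assumption that the test should assert on the error payload:

// give the mock res a json method so the catch block has something to call
const res = {
  status(status) {
    expect(status).toBe(400)
    return this
  },
  json(result) {
    expect(result.error).toBeDefined() // the controller sends { error: e }
  }
}
await getOne(List)(req, res)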

Mongoose query blocks Node.js

Mongoose blocks Node.js while it is fetching data. I thought it was supposed to be completely non-blocking: when the callback fires, execution should simply resume there.
The problem is with:
Container.find({}, function (err, documents) {
  res.status(200).send(documents);
});
When I run this route in Express, it freezes Node.js for around 10 seconds, and no one else can make a connection in the meantime.
I open a connection to MongoDB at startup using Mongoose and don't do anything else with it later on. What's the problem? Is it supposed to work like that?
UPDATE:
So this is how I initialize Mongoose:
function initialDb() {
  seed();
  seedStructure();
  startApplication();
}

database.connect();
database.loadModels(initialDb);
and this is where I connect and initialize the models:
import mongoose from 'mongoose';
import chalk from 'chalk';
import config from '../config';

export default {
  loadModels(callback) {
    require('../../models/site');
    require('../../models/page');
    require('../../models/container');
    require('../../models/template');
    require('../../models/theme');
    if (typeof callback === 'function') {
      return callback();
    }
  },
  connect() {
    mongoose.connect(config.db.uri, function (err) {
      if (err) {
        console.error(chalk.red('Could not connect to MongoDB!'));
        console.log(err);
      }
    });
  },
  disconnect(callback) {
    mongoose.disconnect(function (err) {
      console.info(chalk.yellow('Disconnected from MongoDB.'));
      callback(err);
    });
  }
};
and the model
var mongoose = require('mongoose');
var Schema = mongoose.Schema;

var container = new Schema({
});

let model = mongoose.model('container', container);
module.exports = model;
It returns around 26k documents.
OK, so basically I found out that if I stream the results instead of getting them in one callback, it works much better (other requests can get through in the meantime), like this:
var stream = Container.find({}).stream();
var array = [];
stream.on('data', (doc) => {
  array.push(doc);
}).on('error', (err) => {
}).on('close', () => {
  res.send(array);
});
It solves the problem. So this is how I would fetch large result sets from MongoDB, but why does it slow down so much when I get everything in one callback? Is it the 12 MB of data? A big JSON payload that needs to be parsed, or what?
The reason for the slowdown is quite mysterious to me.
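A plausible explanation (not from the original thread): the freeze is CPU work, not I/O. When all ~26k results arrive in one callback, hydrating them into full Mongoose documents and serializing ~12 MB of JSON happens in one long synchronous stretch on the event loop, whereas streaming chops that work into many small ticks that other requests can interleave with. A sketch of the same idea with a cursor and lean(), assuming a reasonably recent Mongoose; lean() skips document hydration entirely, which is usually the biggest cost:

// inside an async route handler
const docs = [];
// .lean() yields plain objects instead of full Mongoose documents;
// .cursor() delivers them in batches instead of one giant array callback
for await (const doc of Container.find({}).lean().cursor()) {
  docs.push(doc);
}
res.send(docs);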
