I am struggling to get my head around Promises. I think I understand the concept, but I am unable to get them to work on the backend.
I have read several Stack Overflow posts. I still see a few that are only months old, so I guess I am not the only one :)
Specifically, I need help with how to pass the result of a resolved promise within my code. In the code below, I fetch JSON from the Star Wars API and want to write it to a MongoDB Atlas collection.
I use axios.get, which returns a promise. I then resolve it using .then and call insertOne on the MongoDB collection.
On the frontend, e.g. in React, this works as expected: you call setState inside the .then function to update the state.
I don't understand why it doesn't work on the backend.
Could you please tell me what I need to change so that it writes to MongoDB Atlas?
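For reference, the frontend pattern I mean looks roughly like this (a simplified, hypothetical component method, just to illustrate):
componentDidMount() {
  // hypothetical React example: setState is called inside .then, once the data exists
  axios.get("https://swapi.co/api/people/1")
    .then(response => this.setState({ person: response.data }));
}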
var axios = require("axios");
const MongoClient = require("mongodb").MongoClient;
var db;
const getData = () => {
return axios
.get("https://swapi.co/api/people/1")
.then(response => {
if (!response.data) throw Error("No data found.");
console.log(JSON.stringify(response.data)); // This returns the data as expected.
return JSON.stringify(response.data);
})
.catch(error => {
console.log(error);
throw error;
});
};
console.log(getData()); // This returns Promise { <pending> }
const client = new MongoClient(process.env.MONGODB_URL, {
useNewUrlParser: true,
useUnifiedTopology: true
});
// Connect to database and insert default users into users collection
client.connect(err => {
console.log("Connected successfully to database");
let d = {
name: "Luke Skywalker",
height: "172",
mass: "77",
hair_color: "blond",
skin_color: "fair",
eye_color: "blue"
};
db = client.db(process.env.DB_NAME);
db.collection("macroData").insertOne(d); //this works
db.collection("macroData").insertOne(getData); // this doesn't work as it still appears to be a promise
});
getData() returns a Promise, as you are well aware, so you have to wait on that promise to resolve. A straightforward approach would be to perform the insert once the data is available:
client.connect(err => {
// ...
getData().then(data => {
db.collection('macroData').insertOne(data)
})
})
Or, if you can use async/await:
client.connect(async err => {
// ...
const data = await getData()
db.collection('macroData').insertOne(data)
})
Your MongoDB call needs to be inside the axios promise chain; that way the resolved value can be used to feed your database. That's what held me up for a while...
The code below is for Nick's comment; I couldn't post it in the comment as it was too long. You should have a database named test (or an appropriate name) here: let datab = client.db('test'). I think it comes by default when you create a MongoDB Atlas cluster.
If you change the user and password in the Mongo URL, you should be good to go.
Hope that helps. This creates a Star Wars entry under test.starWarsData.
let axios = require("axios");
let MongoClient = require("mongodb").MongoClient;
let mongoParams = { useNewUrlParser: true, useUnifiedTopology: true };
let mongoUrl =
"mongodb+srv://user:password#cluster0-hnc4i.azure.mongodb.net/test";
//try feeding just the object e, and see if it works, in case your axios error catching is not great.
let e = {
a: "this is a",
b: "this is b",
c: "this is c",
d: "this is d"
};
let newUrl = "https://swapi.co/api/people/1";
console.log(newUrl);
let client = new MongoClient(mongoUrl, mongoParams);
client.connect(err => {
if (err) {
console.log(err.message);
throw new Error("failed to connect");
}
let datab = client.db("test");
console.log("db connected");
try {
axios.get(newUrl).then(res => {
try {
datab.collection("starWarsData").insertOne(res.data);
console.log("insert succeeded");
} catch (err) {
console.log("insert failed");
console.log(err.message);
}
});
} catch (err) {
throw Error("axios get did not work");
}
});
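For what it's worth, the same flow reads a bit more directly with async/await. A rough sketch (same URL, database, and collection names as above); note that a plain try/catch wrapped around a .then() chain won't catch a rejected promise, whereas awaiting inside try/catch will:
client.connect(async err => {
  if (err) {
    console.log(err.message);
    throw new Error("failed to connect");
  }
  let datab = client.db("test");
  console.log("db connected");
  try {
    // wait for the API response, then wait for the insert
    let res = await axios.get(newUrl);
    await datab.collection("starWarsData").insertOne(res.data);
    console.log("insert succeeded");
  } catch (err) {
    console.log("request or insert failed: " + err.message);
  }
});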
I'm trying to update the document but the error says the query has already been executed.
MongooseError: Query was already executed: footballs.updateOne({ date: 'January 4' }, {})
app.post('/api/bookslot', async (req, res) => {
console.log(req.body);
try {
const token = req.headers['x-access-token'];
const decoded = jwt.verify(token, 'secret123');
const email = decoded.email;
const user = await UserModel.findOne({ email: email });
let sportname = req.body.selectedSport.toLowerCase();
const time = req.body.slotTime;
const seats = req.body.availableSeats - 1;
if (!sportname.endsWith('s')) {
sportname = sportname.concat('s');
}
const NewSlotModel = mongoose.model(sportname, slotSchema);
var update = {};
update[time] = seats - 1;
console.log(update);
const a = await NewSlotModel.updateOne(
{ date: req.body.slotDate },
{ $set: update },
function (err, success) {
if (err) return handleError(err);
}
);
return res.json({ status: 'ok' });
} catch (e) {
console.log(e);
res.json({ status: 'error' });
}
});
where am I going wrong?
You are using both async/await and callbacks in your code, causing mongoose to throw an error.
The actual effect of using them both is exactly the error type that you are receiving:
Query was already executed
Mongoose v6 does not allow duplicate queries.
Mongoose no longer allows executing the same query object twice. If
you do, you'll get a Query was already executed error. Executing the
same query instance twice is typically indicative of mixing callbacks
and promises, but if you need to execute the same query twice, you can
call Query#clone() to clone the query and re-execute it. See gh-7398
Duplicate Query Execution
To fix the issue, just remove the third argument from the await NewSlotModel.updateOne call, making it:
const a = await NewSlotModel.updateOne(
{ date: req.body.slotDate },
{ $set: update }
);
Mongoose v6 no longer supports callbacks.
const productCount = await Product.countDocuments((count) => count); // BAD
const productCount = await Product.countDocuments(); // GOOD
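For completeness, if you genuinely do need to execute the same query twice, the docs quoted above point to Query#clone(); a minimal sketch:
// only needed if the same query object really must run twice
const query = NewSlotModel.updateOne({ date: req.body.slotDate }, { $set: update });
await query;         // first execution
await query.clone(); // clone() re-executes a copy instead of the same query instance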
Good evening all!
I have been stuck on this issue for a while and I can't seem to solve it through sheer Googling and so I am reaching out to you all.
Context:
I am writing a small application that handles the calendars and basic project information for all the interns at our company. My boss is constantly asking me what they're up to, and I wanted to give him something he could look at, so I decided to solve it with code whilst also learning a new framework (Express) in the process.
Right now I have my routes, controllers, and DB cursor all set up. When I make a call to the route I have defined, it runs the getAllUsers() controller function, which in turn calls the database using the getAllUsers() function on the DB cursor. I want the code to wait for the DB cursor to return its result before continuing, but it isn't, and I can't work out why. The DB cursor code does work, because it fetches the data and logs it out fine.
Any help would be greatly appreciated. I have put the three bits of code in question below; let me know if you need me to show more.
P.S. Ignore the 'here 1', 'here 2', etc. calls; this is how I have been working out what's happening at any point in time.
routes.ts
import express from 'express';
import controllers from './controller.js';
export default (app: express.Application) => {
// Users
app.route('/users').get(controllers.getAllUsers)
app.route('/users').post(controllers.postNewUser)
app.route('/users').delete(controllers.deleteUser)
app.route('/user/:emailAddress').get(controllers.getUser)
app.route('/user/:emailAddress').put(controllers.updateUser)
}
controllers.ts
import express from 'express';
import dbcursor from '../services/dbcursor.js';
// Interfaces
import { Project, User } from '../services/interfaces.js'
const controllers = {
// Users
getAllUsers: async (req: express.Request, res: express.Response) => {
try {
const dbRes = await dbcursor.getAllUsers();
console.log('here 3', dbRes)
res.status(200).json({
message: 'Users fetched succesfully!',
dbRes: dbRes
});
} catch (err) {
res.status(400).json({
message: 'Failed to get users.',
dbRes: err
});
}
},
}
dbcursor.ts
import dotenv from 'dotenv';
import mongodb from 'mongodb'
dotenv.config();
// Interfaces
import { User, Project } from './interfaces'
// DB Client Creation
const { MongoClient } = mongodb;
const uri = process.env.DB_URI || ''
const client = new MongoClient(uri, { useNewUrlParser: true, useUnifiedTopology: true });
const dbcursor = {
// Users
getAllUsers: async () => {
let dbRes;
try {
await client.connect(async err => {
if (err) throw err;
console.log("here 1", dbRes)
const collection = client.db("InternManager").collection("Users");
dbRes = await collection.find().toArray()
console.log("here 2", dbRes)
return dbRes;
});
} catch(err: any) {
return err;
}
},
}
It's generally not a good idea to mix callbacks and promises. Try not passing a callback to the client.connect method, and you should be able to await the promise as expected:
getAllUsers: async () => {
let dbRes;
try {
await client.connect();
console.log("here 1", dbRes)
const collection = client.db("InternManager").collection("Users");
dbRes = await collection.find().toArray()
console.log("here 2", dbRes)
return dbRes;
} catch(err: any) {
throw err; // If you're just catching and throwing the error, then it would be okay to just ignore it
}
},
I am developing the backend of an application using Node.js, Sequelize, and a Postgres database.
When a course is registered, the user must specify which organizations, companies, and teachers will be linked to it.
The organization IDs are passed to the backend in an array, and I am trying to check that the passed IDs exist.
What I've done so far is this:
const { organizations } = req.body;
const organizationsArray = organizations.map(async (organization) => {
const organizationExists = await Organization.findByPk(organization);
if (!organizationExists) {
return res
.status(400)
.json({ error: `Organization ${organization} does not exists!` });
}
return {
course_id: id,
organization_id: organization,
};
});
await CoursesOrganizations.bulkCreate(organizationsArray);
This link has the complete controller code, I believe it will facilitate understanding.
When !organizationExists is true, I get the response saying the organization does not exist. The problem is when the organization does exist: then I get the following error message.
Array.map() returns an array of promises, which you can resolve to an array of values using Promise.all(). Inside the map you should throw new Error() to break out - this error will be raised by Promise.all(), and you can then catch it and return an error to the client (or swallow it, etc.).
This is a corrected version of your pattern, resolving the Promise results.
const { organizations } = req.body;
try {
// use Promise.all to resolve the promises returned by the async callback function
const organizationsArray = await Promise.all(
// this will return an array of promises
organizations.map(async (organization) => {
const organizationExists = await Organization.findByPk(organization, {
attributes: ['id'], // we only need the ID
raw: true, // don't need Instances
});
if (!organizationExists) {
// don't send response inside the map, throw an Error to break out
throw new Error(`Organization ${organization} does not exists!`);
}
// it does exist so return/resolve the value for the promise
return {
course_id: id,
organization_id: organization,
};
})
);
// if we get here there were no errors, create the records
await CoursesOrganizations.bulkCreate(organizationsArray);
// return a success to the client
return res.json({ success: true });
} catch (err) {
// there was an error, return it to the client
return res.status(400).json({ error: err.message });
}
This is a refactored version that will be a bit faster by fetching all the Organizations in one query and then doing the checks/creating the Course inserts.
const { Op } = Sequelize;
const { organizations } = req.body;
try {
// get all Organization matches for the IDs
const organizationsArray = await Organization.findAll({
attributes: ['id'], // we only need the ID
where: {
id: {
[Op.in]: organizations, // WHERE id IN (organizations)
}
},
raw: true, // no need to create Instances
});
// create an array of the IDs we found
const foundIds = organizationsArray.map((org) => org.id);
// check to see if any of the IDs are missing from the results
if (foundIds.length !== organizations.length) {
// Use Array.reduce() to figure out which IDs are missing from the results
const missingIds = organizations.reduce((missingIds, orgId) => {
if (!foundIds.includes(orgId)){
missingIds.push(orgId);
}
return missingIds;
}, []); // initialized to empty array
throw new Error(`Unable to find Organization for: ${missingIds.join(', ')}`);
}
// now create an array of courses to create using the foundIds
const courses = foundIds.map((orgId) => {
return {
course_id: id,
organization_id: orgId,
};
});
// if we get here there were no errors, create the records
await CoursesOrganizations.bulkCreate(courses);
// return a success to the client
return res.json({ success: true });
} catch (err) {
// there was an error, return it to the client
return res.status(400).json({ error: err.message });
}
If you have an array of IDs and you want to check whether they exist, you should use the in operator (Op.in). This way you hit the DB only once and get all the records in one query (instead of fetching them one by one in a loop). After you get these records, you can compare the lengths to determine whether they all exist, as in the sketch after the query below.
const { Op } = require("sequelize");
let foundOrgs = await Organization.findAll({
where: {
id: {
[Op.in]: organizationsArray,
}
}
});
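A minimal sketch of the length check mentioned above (the response shape here is just an assumption):
// every requested ID must have been found, otherwise at least one doesn't exist
if (foundOrgs.length !== organizationsArray.length) {
  return res
    .status(400)
    .json({ error: 'One or more organizations do not exist!' });
}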
My Lambda takes in an SQS message containing an ID and address. It parses out those fields, and updates the record associated with that ID in a DynamoDB table.
The parameters for this update contain the following logic:
1. Where the record has an ID equal to the ID sent by SQS
2. And where the SortKey has a value equal to “null” (note that null here is specifically a String with the value “null”)
3. Update the address field with the new address
I'm seeing the following issues with this function:
The function is not updating the DynamoDB table.
I am not receiving any kind of feedback from the update call. Looking over the code, there are several console.logs that should execute but do not - see the try/catch/finally block after the update. Looking at the logs, you can see that these never reach the console. Something is very wrong here. The finally not executing looks like undefined behavior; my only guess is that the call to DynamoDB is not being awaited.
I also need to implement the following functionality. This is for bonus points; if you have an idea of how to do it, please feel free to comment!
Right now the update only changes the address field from one value to another. Instead, I need the record to contain a set of addresses associated with that record. To do this we need to implement the following logic:
If a set of addresses does not exist on the record, create a set with the address as the only element
If a set does exist on the student record, update that set with the address. Duplicate addresses should not be added
The code for this function is below. I’ve also attached the most recent CloudWatch log for this function, and the record I am trying to update (the address field on the record was added manually). You’ll notice that we aren’t getting any console.logs after console.log("starting upload"), and the promise has a state “PENDING” when it is examined. We also don’t get any feedback from the dynamodb update. Right now the function is not updating the record, and not giving me any feedback for why it is failing to do so.
const util = require('util')
const aws = require('aws-sdk');
const docClient = new aws.DynamoDB.DocumentClient();
exports.handler = async(event) => {
event.Records.forEach(async record => {
const { body } = record;
const test = JSON.parse(body);
console.log(test);
const message = JSON.parse(test["Message"]);
console.log(message);
const id = message.id;
const name = message.Name;
const address = message.address;
console.log("parameters parsed");
console.log("record being processed is " + id);
const params = {
TableName: "My_Records",
Key: {
"ID": ":id",
"SortKey": ":sortKey"
},
//KeyConditionExpression: 'SortKey = :sortKey',
UpdateExpression: "set info.address = :address",
ExpressionAttributeValues: {
':id': id,
':address': address,
':sortKey': "null"
},
ReturnValues: "UPDATED_NEW"
};
console.log(params)
console.log("starting upload")
try {
let putObjectPromise = docClient.update(params).promise();
console.log(util.inspect(putObjectPromise, {showHidden: false, depth: null}))
putObjectPromise.then(function(data) {
console.log("UpdateItem succeeded:");
}).catch(function(err) {
console.log("Unable to update item. Error JSON:" + err);
}).finally(() =>
console.log("done with upload")
);
return putObjectPromise
}
catch (err) {
console.err(err)
}
});
};
CloudWatch log of the most recent execution of this function
INFO {
Type: 'Notification',
MessageId: 'ID',
TopicArn: 'ARN',
Subject: 'DB updated',
SignatureVersion: '1',
INFO { id: '11111111', Name: 'Jerms Macgee', address: '102 homeslice lane' }
INFO parameters parsed
INFO record being processed is 11111111
INFO {
TableName: 'my_table',
Key: { ID: ':id', SortKey: ':sortKey' },
UpdateExpression: 'set info.address = :address',
ExpressionAttributeValues: {
':id': '11111111',
':address': '102 homeslice lane',
':sortKey': 'null'
},
ReturnValues: 'UPDATED_NEW'
}
INFO starting upload
INFO Promise { <pending> }
END RequestId
And here's an example of the record I'd expect to be updated
{
  "address": "test",
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
The updated record should be
{
  "address": "102 homeslice lane",
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
And for bonus points I'd really like to do something like
{
  "address": {"102 homeslice lane"},
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
where addresses is a set that can hold other records
First, forEach won't wait for an async callback. The example below is from https://codeburst.io/javascript-async-await-with-foreach-b6ba62bbf404:
const waitFor = (ms) => new Promise(r => setTimeout(r, ms));
[1, 2, 3].forEach(async (num) => {
await waitFor(50);
console.log(num);
});
console.log('Done');
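By contrast, a plain for...of loop (or Promise.all with map, as in the other answer) does wait for each await; a minimal sketch of the same example:
const waitFor = (ms) => new Promise(r => setTimeout(r, ms));

(async () => {
  // for...of awaits each iteration before moving on, unlike forEach
  for (const num of [1, 2, 3]) {
    await waitFor(50);
    console.log(num);
  }
  console.log('Done'); // now logs last, as expected
})();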
Second, if you wrap a rejected promise in a try/catch block and that promise already has a .catch() attached, the catch block will never be executed.
const waitFor = (ms) => new Promise((resolve, reject) => {
setTimeout(() => {
reject(123);
}, ms)
});
try {
waitFor(2000).catch(e => { console.log(e) })
} catch (error) {
console.error('error');
}
You're mixing up your async/promise stuff.
First, event.Records.forEach isn't going to wait for the async function you're passing it to resolve. You can change it to:
await Promise.all(event.Records.map(async record => {
  // ... the rest of your function body
}));
This way your main handler function will actually wait for them all to resolve.
Next, all this stuff:
try {
let putObjectPromise = docClient.update(params).promise();
console.log(util.inspect(putObjectPromise, {showHidden: false, depth: null}))
putObjectPromise.then(function(data) {
console.log("UpdateItem succeeded:");
}).catch(function(err) {
console.log("Unable to update item. Error JSON:" + err);
}).finally(() =>
console.log("done with upload")
);
return putObjectPromise
}
catch (err) {
console.err(err)
}
is weird: you're using .then and callback functions, but you're in an async function, so you can just await them. E.g.:
try {
  const putObjectResponse = await docClient.update(params).promise();
  console.log("UpdateItem succeeded:");
  console.log(JSON.stringify(putObjectResponse));
}
catch (err) {
  console.log("Unable to update item. Error JSON: " + err);
  console.error(err);
}
console.log("done with upload")
By awaiting update(params).promise() the return value becomes what the promise resolves to, not the promise. If the promise rejects, it is thrown and caught in your catch block.
This also fixes your weird logging messages because you're now logging the resolved value from the promise rather than the promise itself.
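Putting both fixes together, the handler body ends up looking roughly like this (params built exactly as in the question, elided here):
exports.handler = async (event) => {
  // map + Promise.all so the handler actually waits for every record
  await Promise.all(event.Records.map(async (record) => {
    const message = JSON.parse(JSON.parse(record.body)["Message"]);
    const params = { /* built from message, as in the question */ };
    try {
      // await the resolved value instead of logging a pending promise
      const data = await docClient.update(params).promise();
      console.log("UpdateItem succeeded:", JSON.stringify(data));
    } catch (err) {
      console.log("Unable to update item:", err);
    }
    console.log("done with upload");
  }));
};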
I have another app which uses Express and routes, but I was slimming this new app down. I know the connection string is correct.
script.getQuestions(connection);
script.getQuestions = function(connection,req, res){
console.log(connection);
}
I have read online that switching to a promise fixes this async issue... the problem is that since my function takes req and res, I don't know how to pass those in when I try to refactor with a promise.
"ConnectionError: Connection is closed"
"(module.js:487:32) code: 'ECONNCLOSED', name: 'ConnectionError' }"
What I call (the script module) is:
var sql = require('mssql');
exports.getQuestions = function(connection, req,res){
console.log(connection);
var request = new sql.Request(connection);
var query = 'select * from Question'
request.query(query).then(function(resultset){
res.json(resultset.recordset);
}).catch(function(err){
console.log(err);
//res.json(err)
})
}
It's a bit hard to understand what you're doing there, but here is a promise example using mssql:
const sql = require('mssql')
sql.connect(config).then(pool => {
// Query
return pool.request()
.input('input_parameter', sql.Int, value)
.query('select * from mytable where id = @input_parameter')
}).then(result => {
console.dir(result)
// Stored procedure
return pool.request()
.input('input_parameter', sql.Int, value)
.output('output_parameter', sql.VarChar(50))
.execute('procedure_name')
}).then(result => {
console.dir(result)
}).catch(err => {
// ... error checks
})
sql.on('error', err => {
// ... error handler
})
source: https://www.npmjs.com/package/mssql#promises
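To tie that back to the question: since the query returns a promise, req and res don't need to be "passed into" the promise at all; the route handler can simply use them when the promise resolves. A rough sketch along the lines of the asker's getQuestions, assuming config is the same connection config used elsewhere:
const sql = require('mssql');

// connect once; the promise resolves when the pool is actually open
const poolPromise = sql.connect(config);

exports.getQuestions = function (req, res) {
  poolPromise
    .then(pool => pool.request().query('select * from Question'))
    .then(resultset => res.json(resultset.recordset)) // respond once the query resolves
    .catch(err => {
      console.log(err);
      res.status(500).json({ error: 'query failed' }); // assumption: surface the failure to the client
    });
};
Waiting on the connect promise before running any query is what usually avoids the "Connection is closed" (ECONNCLOSED) error, since nothing touches the pool until it is actually open.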