I am developing an app to order food online. As the backend service I am using Firestore to store the data and files. The user can order dishes, and stocks are limited, so every time a user orders a dish and creates a basket I update the stock of the corresponding ordered dishes. I am using a Firebase Function to perform this action. To be honest, it is the first Firebase Function I am creating.
In the Basket object there is a list of ordered Dishes with the corresponding database DishID. When the basket is created, I go through the DishID list and update the Quantity in the Firestore database. On my local emulator it works perfectly and very fast, but online it takes minutes to perform the first update. I can deal with some seconds; even if it takes a few seconds (like for a cold start) it's okay. But sometimes it can take 3 minutes, and someone else can order a dish during this time.
Here is my code:
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
try{
//Get the created basket
const originalBasket = snap.data();
originalBasket.OrderedDishes.forEach(async dish => {
const doc = await db.collection('Dishes').doc(dish.DishID);
console.log('Doc created');
return docRef = doc.get()
.then((result) =>{
console.log('DocRef created');
if(result.exists){
console.log('Result exists');
const dishAvailableOnDataBase = result.data().Available;
console.log('Data created');
const newQuantity = { Available: Math.max(dishAvailableOnDataBase - dish.Quantity, 0)};
console.log('Online doc updated');
return result.ref.set(newQuantity, { merge: true });
}else{
console.log("doc doesnt exist");
}
})
.catch(error =>{
console.log(error);
return null;
});
});
}catch(error){
console.log(error);
}
});
I have a couple of log outputs to debug on the server. It's the doc.get() call that takes 2 minutes to execute, as you can see in the logger below:
Firebase logger
Thanks for your help,
Thanks for your help. I just edited your code a little bit to make it work; I am posting my edited code below. Thanks a lot, now it takes just 4 seconds to update the quantities.
Kind regards
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
try {
//Get the created basket
const originalBasket = snap.data();
const promises = [];
const quantities = [];
originalBasket.OrderedDishes.forEach(dish => {
promises.push(db.collection('Dishes').doc(dish.DishID).get());
quantities.push(dish.Quantity);
});
const docSnapshotsArray = await Promise.all(promises);
console.log("Promises", promises);
const promises1 = [];
var i = 0;
docSnapshotsArray.forEach(result => {
if (result.exists) {
const dishAvailableOnDataBase = result.data().Available;
const newQuantity = { Available: Math.max(dishAvailableOnDataBase - quantities[i], 0) };
promises1.push(result.ref.set(newQuantity, { merge: true }));
}
i++;
})
return Promise.all(promises1)
} catch (error) {
console.log(error);
return null;
}
});
You should not use async/await within a forEach() loop, see "JavaScript: async/await with forEach()" and "Using async/await with a forEach loop".
And since your code executes, in parallel, a variable number of calls to the asynchronous Firebase get() and set() methods, you should use Promise.all().
You should refactor your Cloud Function along the following lines:
//Update the dishes quantities when a basket is created
exports.updateDishesQuantity = functions.firestore.document('/Baskets/{documentId}').onCreate(async (snap, context) => {
try {
//Get the created basket
const originalBasket = snap.data();
const promises = [];
originalBasket.OrderedDishes.forEach(dish => {
promises.push(db.collection('Dishes').doc(dish.DishID).get());
});
const docSnapshotsArray = await Promise.all(promises);
const promises1 = [];
docSnapshotsArray.forEach((result, index) => {
    if (result.exists) {
        const dishAvailableOnDataBase = result.data().Available;
        const orderedDish = originalBasket.OrderedDishes[index];
        const newQuantity = { Available: Math.max(dishAvailableOnDataBase - orderedDish.Quantity, 0) };
        promises1.push(result.ref.set(newQuantity, { merge: true }));
    }
});
return Promise.all(promises1)
} catch (error) {
console.log(error);
return null;
}
});
Note that instead of looping and calling push() you could use the map() method for much more concise code. However, for SO answers, I like the clarity brought by creating an empty array, populating it with a forEach() loop, and passing it to Promise.all()...
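For illustration, here is a sketch of the map() version of the read phase, using the same names as the function above:
// The same reads as the forEach/push version above, expressed with map().
const docSnapshotsArray = await Promise.all(
    originalBasket.OrderedDishes.map(dish => db.collection('Dishes').doc(dish.DishID).get())
);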
Also note that since you are updating quantities in a basket, you may need to use a Transaction so that two concurrent orders don't both read the same stock value.
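A minimal sketch of what the per-dish update could look like inside a transaction (same collection and field names as the code above; dish stands for one entry of originalBasket.OrderedDishes):
// Sketch: decrement one dish's stock atomically with a Firestore transaction,
// so two concurrent baskets cannot both read the same stock value.
const dishRef = db.collection('Dishes').doc(dish.DishID);
await db.runTransaction(async (transaction) => {
    const dishDoc = await transaction.get(dishRef);
    if (!dishDoc.exists) return;
    const available = dishDoc.data().Available;
    transaction.update(dishRef, {
        Available: Math.max(available - dish.Quantity, 0),
    });
});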
I'm trying to add a delete button to my page. The event listener callback is working properly, except for the updateDoc function.
const deleteBook = document.getElementsByClassName('deleteBook');
for (let i = 0; i < deleteBook.length; i++) {
deleteBook[i].addEventListener('click', async () => {
//book to delete
const bookToDelete = deleteBook[i].parentElement.firstElementChild.textContent
// collection title to delete the book from
const bookCol = deleteBook[i].parentElement.parentElement.parentElement.firstElementChild.textContent
// get a snap of the database
const docRef = doc(dataBase, 'users', `${auth.currentUser.uid}`)
const docSnap = (await getDoc(docRef)).data();
// loop over the collections and get a match with the bookCol
for (const col in docSnap) {
if (docSnap[col].title === bookCol) {
console.log('col to delete from found')
console.log(`book to delete ${bookToDelete}`)
await updateDoc(doc(dataBase, 'users', `${auth.currentUser.uid}`), {
[`${col}.books`]: arrayRemove(`${bookToDelete}`)
}).then(()=>{
// fulfilled
console.log('book deleted')
}, ()=>{
// rejected
console.log('promise rejected')
})
}
}
})
}
Col is the object that contains the books array. In the console it always prints book deleted, but in the Firestore console nothing changes. This is a screenshot of the database:
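In case the screenshot doesn't load, this is roughly the shape of the user document as far as it can be inferred from the code above (the field and book names are made up):
// Hypothetical user document: each top-level field is a "collection" object
// holding a title and a books array.
{
    col1: { title: 'Fantasy', books: ['Book A', 'Book B'] },
    col2: { title: 'Sci-Fi', books: ['Book C'] }
}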
I would really appreciate any help and thank you.
I have replicated the behavior that you're experiencing. I tried changing the content of ${bookToDelete} to any word or even an ID. It always logs book deleted whether the book was actually deleted or not. The lines of code below should be changed in order to get the correct output.
.then(()=>{
// fulfilled
console.log('book deleted')
}, ()=>{
// rejected
console.log('promise rejected')
})
I have created a workaround for your use-case with this kind of issue. See snippet below:
const db = getFirestore();
const colName = "users";
const arrayName = "books";
const usersCol = collection(db, colName);
const userRef = doc(db, colName, `${auth.currentUser.uid}`);
const arrayRef = `${col}.${arrayName}`;
const q = query(usersCol, where(arrayRef, "array-contains", `${bookToDelete}`));
try {
    // Removal of the book will not proceed if the query snapshot is empty.
    const querySnapshot = await getDocs(q);
    if (querySnapshot.empty) {
        console.log("No object found!");
    } else {
        // Proceed to remove the book from the array.
        await updateDoc(userRef, {
            [arrayRef]: arrayRemove(`${bookToDelete}`)
        });
        // Query again to check whether the book was actually deleted.
        const checkSnapshot = await getDocs(q);
        if (checkSnapshot.empty) {
            console.log("Book Deleted!");
        } else {
            console.log("Failed!");
        }
    }
} catch (error) {
    // Catch any Firebase errors.
    console.log("Failed!", error);
}
The workaround that I created queries the array for the book, then removes it if it exists. After removing it, it queries again to check whether the book has been deleted and logs Book Deleted!. Conversely, if the book doesn't exist on the first query, it will not proceed with the removal and logs No object found!.
The workaround itself can still be improved; you can add any logic you want for your use case.
I'd also recommend creating a Feature Request if you want to have this kind of feature together with the arrayRemove method.
I'm guessing this problem is because I don't know how to use async/await effectively. I still don't get it, and I've been trying to understand it for ages. Sigh.
Anyway, here's my function:
app.post("/declineTrades", async (request, response) => {
//---------------------------------------------
const batch = db.batch();
const listingID = request.body.listingID;
const tradeOfferQuery = db
//---------------------------------------------
//Get trade offers that contain the item that just sold
//(therefore it cannot be traded anymore, I need to cancel all existing trade offers that contain the item because this item isn't available anymore)
//---------------------------------------------
.collection("tradeOffers")
.where("status", "==", "pending")
.where("itemIds", "array-contains", listingID);
//---------------------------------------------
//Function that gets all trade offers that contain the ID of the item.
async function getIdsToDecline() {
let tempArray = [];
tradeOfferQuery.get().then((querySnapshot) => {
querySnapshot.forEach((doc) => {
//For each trade offer found
let offerRef = db.collection("tradeOffers").doc(doc.id);
//Change the status to declined
batch.update(offerRef, { status: "declined" });
//Get the data from the trade offer because I want to send an email
//to the person who just got their trade offer declined.
const offerGet = offerRef.get().then((offer) => {
const offerData = offer.data();
//Check the items that the receiving person had in this trade offer
const receiverItemIds = Array.from(
offerData.receiversItems
.reduce((set, { itemID }) => set.add(itemID), new Set())
.values()
);
//if the receiver's item IDs include the item that just sold, I know that
//I can get the sender ID (users can be sender or receiver, so I need to check which person is which)
if (receiverItemIds.includes(listingID)) {
tempArray.push(offerData.senderID);
}
});
});
});
//With the IDs now pushed, return the tempArray
return tempArray;
}
//---------------------------------------------
//Call the above function to get the IDs of people that got declined
//due to the item no longer being available
const peopleToDeclineArray = await getIdsToDecline();
//Update the trade offer objects to declined
const result = await batch.commit();
//END
response.status(201).send({
success: true,
result: result,
idArray: peopleToDeclineArray,
});
});
I'm guessing that my return tempArray is in the wrong place? I have tried putting it in other places, and it still returns an empty array. Is my logic correct here? I need to run the forEach loop and add to the array before the batch.commit happens and before the response is sent.
TIA, guys!
As #jabaa pointed out in their comment, there are problems with an incorrectly chained Promise in your getIdsToDecline function.
Currently the function initializes an array called tempArray, starts executing the trade offer query and then returns the array (which is currently still empty) because the query hasn't finished yet.
While you could throw in await before tradeOfferQuery.get(), this won't solve your problem as it will only wait for the tradeOfferQuery to execute and the batch to be filled with entries, while still not waiting for any of the offerRef.get() calls to be completed to fill the tempArray.
To fix this, we need to make sure that all of the offerRef.get() calls finish first. To get all of these documents, you would use the following code to fetch each document, wait for all of them to complete and then pull out the snapshots:
const itemsToFetch = [ /* ... */ ];
const getAllItemsPromise = Promise.all(
itemsToFetch.map(item => item.get())
);
const fetchedItemSnapshots = await getAllItemsPromise;
For documents based on a query, you'd tweak this to be:
const querySnapshot = /* ... */;
const getSenderDocPromises = [];
querySnapshot.forEach((doc) => {
const senderID = doc.get("senderID");
const senderRef = db.collection("users").doc(senderID);
getSenderDocPromises.push(senderRef.get());
});
const getAllSenderDocPromise = Promise.all(getSenderDocPromises);
const fetchedSenderDataSnapshots = await getAllSenderDocPromise;
However, neither of these approaches is necessary, as the documents you are requesting with these offerRef.get() calls are already returned by your query, so we don't even need to use get() here!
(doc) => {
let offerRef = db.collection("tradeOffers").doc(doc.id);
//Change the status to declined
batch.update(offerRef, { status: "declined" });
//Get the data from the trade offer because I want to send an email
//to the person who just got their trade offer declined.
const offerGet = offerRef.get().then((offer) => {
const offerData = offer.data();
//Check the items that the receiving person had in this trade offer
const receiverItemIds = Array.from(
offerData.receiversItems
.reduce((set, { itemID }) => set.add(itemID), new Set())
.values()
);
//if the receiver's item IDs include the item that just sold, I know that
//I can get the sender ID (users can be sender or receiver, so I need to check which person is which)
if (receiverItemIds.includes(listingID)) {
tempArray.push(offerData.senderID);
}
});
}
could be replaced with just
(doc) => {
// Change the status to declined
batch.update(doc.ref, { status: "declined" });
// Fetch the IDs of items that the receiving person had in this trade offer
const receiverItemIds = Array.from(
doc.get("receiversItems") // <-- this is the efficient form of doc.data().receiversItems
.reduce((set, { itemID }) => set.add(itemID), new Set())
.values()
);
// If the received item IDs includes the listed item, add the
// sender's ID to the array
if (receiverItemIds.includes(listingID)) {
tempArray.push(doc.get("senderID"));
}
}
which could be simplified to just
(doc) => {
//Change the status to declined
batch.update(doc.ref, { status: "declined" });
// Check if any items that the receiving person had in this trade offer
// include the listing ID.
const receiversItemsHasListingID = doc.get("receiversItems")
.some(item => item.itemID === listingID);
// If the listing ID was found, add the sender's ID to the array
if (receiversItemsHasListingID) {
tempArray.push(doc.get("senderID"));
}
}
Based on this, getIdsToDecline actually queues declining the invalid trades and returns the IDs of the affected senders. Instead of using the batch and tradeOfferQuery objects from outside the function, which makes this even more unclear, you should roll them into the function and pull the whole thing out of the Express handler. I'll also rename it to declineInvalidTradesAndReturnAffectedSenders.
async function declineInvalidTradesAndReturnAffectedSenders(listingID) {
const tradeOfferQuery = db
.collection("tradeOffers")
.where("status", "==", "pending")
.where("itemIds", "array-contains", listingID);
const batch = db.batch();
const affectedSenderIDs = [];
const querySnapshot = await tradeOfferQuery.get();
querySnapshot.forEach((offerDoc) => {
batch.update(offerDoc.ref, { status: "declined" });
const receiversItemsHasListingID = offerDoc.get("receiversItems")
.some(item => item.itemID === listingID);
if (receiversItemsHasListingID) {
affectedSenderIDs.push(offerDoc.get("senderID"));
}
});
await batch.commit(); // generally, the return value of this isn't useful
return affectedSenderIDs;
}
This then would change your route handler to:
app.post("/declineTrades", async (request, response) => {
const listingID = request.body.listingID;
const peopleToDeclineArray = await declineInvalidTradesAndReturnAffectedSenders(listingID);
response.status(201).send({
    success: true,
    idArray: peopleToDeclineArray,
});
});
Then, adding the appropriate error handling, swapping out the incorrect use of HTTP 201 Created for HTTP 200 OK, and using json() instead of send(), you now get:
app.post("/declineTrades", async (request, response) => {
const listingID = request.body.listingID;
try {
    const affectedSenderIDs = await declineInvalidTradesAndReturnAffectedSenders(listingID);
response.status(200).json({
success: true,
idArray: affectedSenderIDs, // consider renaming this key to affectedSenderIDs
});
} catch (error) {
console.error(`Failed to decline invalid trades for listing ${listingID}`, error);
if (!response.headersSent) {
response.status(500).json({
success: false,
errorCode: error.code || "unknown"
});
} else {
response.end(); // forcefully end corrupt response
}
}
});
Note: Even after all these changes, you are still missing any form of authentication. Consider swapping the HTTPS Event Function out for a Callable Function, where this is handled for you, though that requires using a Firebase Client SDK.
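For reference, here is a minimal sketch of the same endpoint as a Callable Function, reusing the declineInvalidTradesAndReturnAffectedSenders helper defined above (the error message is made up; production code would still want a try/catch that rethrows an HttpsError):
// Sketch only: Callable Functions verify the Firebase Auth token for you,
// so context.auth is populated when the caller is signed in.
exports.declineTrades = functions.https.onCall(async (data, context) => {
    if (!context.auth) {
        throw new functions.https.HttpsError("unauthenticated", "You must be signed in.");
    }
    const affectedSenderIDs = await declineInvalidTradesAndReturnAffectedSenders(data.listingID);
    return { success: true, idArray: affectedSenderIDs };
});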
I am using Firebase Authentication to store users. I have two types of users: Manager and Employee. I am storing the manager's UID in a Firestore document along with the employee's UID. The structure is shown below.
Firestore structure
Company
|
> Document's ID
|
> mng_uid: Manager's UID
> emp_uid: Employee's UID
Now I want to perform a query like "Retrieve employees' info which is under the specific manager." To do that, I tried to run the code below.
module.exports = {
get_users: async (mng_uid, emp_uid) => {
return await db.collection("Company").where("manager_uid", "==", mng_uid).get().then(snaps => {
if (!snaps.empty) {
let resp = {};
let i = 0;
snaps.forEach(async (snap) => {
resp[i] = await admin.auth().getUser(emp_uid).then(userRecord => {
return userRecord;
}).catch(err => {
return err;
});
i++;
});
return resp;
}
else return "Oops! Not found.";
}).catch(() => {
return "Error in retrieving employees.";
});
}
}
The above code returns {}. I tried to debug by returning data from specific lines, and I found that the issue is in retrieving the user's info with the Firebase Auth function that I used in the forEach loop. But it is not returning any error.
Thank you.
There are several points to be corrected in your code:
You use async/await together with then(), which is not recommended; only use one of these approaches.
If I understand your goal correctly ("Retrieve employees' info which is under the specific manager"), you do not need to pass an emp_uid parameter to your function; instead, for each snap you need to read the value of the emp_uid field with snap.data().emp_uid.
Finally, you need to use Promise.all() to execute all the asynchronous getUser() calls in parallel.
So the following should do the trick:
module.exports = {
get_users: async (mng_uid) => {
try {
const snaps = await db
.collection('Company')
.where('manager_uid', '==', mng_uid)
.get();
if (!snaps.empty) {
const promises = [];
snaps.forEach(snap => {
promises.push(admin.auth().getUser(snap.data().emp_uid));
});
return Promise.all(promises); //This will return an Array of UserRecords
} else return 'Oops! Not found.';
} catch (error) {
//...
}
},
};
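For illustration, a usage sketch of the corrected function (the module path and manager UID are placeholders):
const { get_users } = require('./get_users'); // assumed module path
get_users('someManagerUid').then(result => {
    if (Array.isArray(result)) {
        // Promise.all() resolved: one UserRecord per employee
        result.forEach(user => console.log(user.uid, user.email));
    } else {
        console.log(result); // 'Oops! Not found.'
    }
});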
I'm looking for a good and correct way to set/initialize values with an HTTP trigger.
What I did is get a ref to the node in Firebase, get the data, and then update it.
module.exports.initializeAnswers = functions.https.onRequest(async(req,res)=>{
try{
// I want to initialize each key
await firebase.ref('/CurrentGame').once('value',snapshot=>{
snapshot.forEach((childSnapshot)=>{
if(childSnapshot !=null){
childSnapshot.update(0)
}
return false
});
})
}catch(e){
console.info(e)
return res.status(400).send({error:0})
}
})
I'm looking for the right way, not via the update() function.
I want to initialize each value to zero with an HTTP trigger.
I understand that you will have several children (a variable number) under the "answers" node and only one "rightAnswer" node. If my understanding is correct, the following code will do the trick:
exports.initializeAnswers = functions.https.onRequest((req, res) => {
admin.database().ref('/CurrentGame/answers').once('value', snapshot => {
const updates = {};
snapshot.forEach((child) => {
updates['/answers/' + child.key] = 0;
});
updates['/rightAnswer'] = 0;
return admin.database().ref('/CurrentGame').update(updates);
}).then(() => {
res.status(200).end();
}).catch((err) => {
console.log(err);
res.status(500).send(err);
});
});
You can use Firebase Cloud Functions to initialize the values on an HTTP trigger. Check this link.
I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was using node-mysql previously, which allows multiple statements. While using that, I simply built a multiple-statement query string and sent it through the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but there is nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately, but that's bad because it means lots of round trips from the server to the DB.
I can use Knex's raw() and probably do something similar to what I do with node-mysql. However, that defeats the whole purpose of Knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want partial updates in case something went wrong).
I resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
const queries = [];
users.forEach(user => {
const query = connection('users')
.where('id', user.id)
.update({
lastActivity: user.lastActivity,
points: user.points,
})
.transacting(trx); // This makes every update be in the same transaction
queries.push(query);
});
Promise.all(queries) // Once every query is written
.then(trx.commit) // We try to execute all of them
.catch(trx.rollback); // And rollback in case any of them goes wrong
});
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
return knex.transaction(trx => {
const queries = collection.map(tuple =>
knex(table)
.where('id', tuple.id)
.update(tuple)
.transacting(trx)
);
return Promise.all(queries)
.then(trx.commit)
.catch(trx.rollback);
});
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
return knex.transaction(trx => {
const queries = collection.map(tuple =>
knex(options.table)
.where(options.column, tuple[options.column])
.update(tuple)
.transacting(trx)
);
return Promise.all(queries)
.then(trx.commit)
.catch(trx.rollback);
});
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version:
const batchUpdate = async (options, collection) => {
const { table, column } = options;
const trx = await knex.transaction();
try {
await Promise.all(collection.map(tuple =>
knex(table)
.where(column, tuple[column])
.update(tuple)
.transacting(trx)
)
);
await trx.commit();
} catch (error) {
await trx.rollback();
}
}
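To call it (from an async context, since this version is awaited):
await batchUpdate({ table: 'user', column: 'user_id' }, [...]);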
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes, you can run them in parallel, but your bottleneck becomes the time it takes the DB to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one object for each row we want to update) whose property names line up with the columns in the database you want to update:
var knex = require('knex'),
_ = require('underscore');
function bulkUpdate (records) {
var updateQuery = [
'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
_.map(records, () => '(?)').join(','),
'ON DUPLICATE KEY UPDATE',
'col2 = VALUES(col2),',
'colN = VALUES(colN)'
].join(' '),
vals = [];
_(records).map(record => {
vals.push(_(record).values());
});
return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
    { primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
    // ...another record with the same properties
];
Please note that if your record has additional properties beyond the ones you specified in the query, you cannot do:
_(records).map(record => {
vals.push(_(record).values());
});
Because you will hand too many values to the query per record, Knex will fail to match the property values of each record with the ? placeholders in the query. Instead, you will need to explicitly push the values from each record that you want to insert into an array, like so:
// assume a record has an additional property `type` that you don't want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
vals.push(record.primaryKeyCol);
vals.push(record.col2);
vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references, but this is just an example.
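For instance, a sketch (assuming the column list matches the INSERT statement above) that names the columns once and maps over them:
// Name the columns once, in the same order as in the query,
// and push only those values for each record.
var cols = ['primaryKeyCol', 'col2', 'colN'];
_(records).map(record => {
    cols.forEach(col => vals.push(record[col]));
});
Hope this helps!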
The solution works great for me! I just included an ID parameter to make it dynamic across tables with custom ID tags. Chenhai, here's my snippet, including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
return knex.transaction((trx) => {
const queries = collection.map(async (tuple) => {
const [tupleId] = await knex(table)
.where(`${id}`, tuple[id])
.update(tuple)
.transacting(trx)
.returning(id);
return tupleId;
});
return Promise.all(queries).then(trx.commit).catch(trx.rollback);
});
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, i.e. 1000 rows per batch.
And as long as it is done in batches, bluebird's map can be used.
For more information on bluebird map: http://bluebirdjs.com/docs/api/promise.map.html
const Promise = require('bluebird'); // for Promise.map

const limit = 1000;
const totalRows = 50000;
// Pages 0, 1, ..., ceil(totalRows / limit) - 1
const seq = count => [...Array(Math.ceil(count / limit)).keys()];

const updateTable = async (dbTable, page) => {
    const offset = limit * page;
    return knex(dbTable).pluck('id').limit(limit).offset(offset).then(ids => {
        return knex(dbTable)
            .whereIn('id', ids)
            .update({ date: new Date() })
            .then((rows) => {
                console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
            })
            .catch((err) => {
                console.log({ err });
            });
    })
    .catch((err) => {
        console.log({ err });
    });
};

// Run the batches one at a time (dbTable is the table name, defined elsewhere)
Promise.map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Where pluck() is used to get the ids in array form.
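For comparison, a quick sketch of what pluck() returns versus select() (my_table is a placeholder name):
// pluck('id') resolves to a plain array of values rather than row objects:
knex('my_table').pluck('id');  // -> [1, 2, 3]
knex('my_table').select('id'); // -> [{ id: 1 }, { id: 2 }, { id: 3 }]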