I created a function to communicate with the Firestore database.
First it checks whether there is already something in the relation; if not, it adds an entry.
If something already exists, it should use the data and then delete the entry in the queried relation (that part still has to be added in the function's else section). And now the question arises: what happens when two users perform the function simultaneously?
Is there a way to put the second user in a queue until the first user's request is finished?
let ref = db.collection('relation1').doc('test').collection('user');
var checkForAdd = ref.get().then(snapshot => {
  if (snapshot.size < 1) {
    db.collection('relation1').doc('test').collection('user').add({
      user: 'Test',
      createdAt: Date.now()
    }).catch(err => {
      console.log(err)
    })
  }
});
Cloud Firestore supports atomic operations for reading and writing data. In a set of atomic operations, either all of the operations succeed, or none of them are applied.
https://firebase.google.com/docs/firestore/manage-data/transactions
// Create a reference to the user doc you want to create if it doesn't exist.
const userCollectionRef = db.collection('relation1').doc('test').collection('user');
const userDocRef = userCollectionRef.doc('documentID');
return db.runTransaction(transaction => {
// This code may get re-run multiple times if there are conflicts.
return transaction.get(userDocRef).then(userDoc => {
if (userDoc.exists) {
// If something already exists then use the data and
// then delete the entry in the queried relation.
} else {
  // update() fails for a document that doesn't exist yet, so create it with set()
  transaction.set(userDocRef, {
    user: 'Test',
    createdAt: Date.now()
  });
}
});
}).then(() => {
console.log("Transaction successfully committed!");
}).catch(error => {
console.log("Transaction failed: ", error);
});
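For the branch where the document does exist, a minimal sketch of the "use the data, then delete the entry" step could look like the following (what you do with the data is up to you; the field name is taken from the question, the console.log is just a placeholder):

// Hypothetical sketch of the if (userDoc.exists) branch inside the same transaction.
if (userDoc.exists) {
  const existingUser = userDoc.data();          // use the data however you need
  console.log('Found user:', existingUser.user);
  transaction.delete(userDocRef);               // delete the entry atomically with the read
}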
I'm creating an API that checks a counter value, increments it by one, and after that creates a new document in a document's subcollection. For that purpose I'm using runTransaction().
And I'm running into a problem. The transaction works as expected: it checks the counter value and increments it by one. But after that, when I try to set the new document, I get an error (an empty object) and can't do it. I think my logic may be wrong, so I need your advice on where my mistake is.
const db = admin.firestore()
const couterRef = db.collection('invoices').doc('invoices_doc')
let counterValue = 0
// check last invoice counter number and increment by 1
return db.runTransaction((transaction) => {
return transaction
.get(couterRef)
.then((counter) => {
counterValue = counter.data().counter + 1
// Update counter
return transaction.update(couterRef, { counter: counterValue })
})
.catch((error) => response.status(200).send({ status: 'TRANSACTION ERROR', error }))
})
.then(() => {
// Create new invoice document
couterRef.collection('invoices').doc(counterValue).set({
series: 'SS',
series_nr: counterValue
})
.then(() => response.status(200).send({ status: 'OK' }))
.catch((error) => response.status(200).send({ status: 'DOCUMENT SET ERROR', error }))
})
.catch((error) => {
response.status(200).send({ status: 'RUN TRANSACTION ERROR', error })
})
I've not tested your code but the problem most probably comes from the fact that you modify the application state inside of your transaction functions, which is something you must avoid. There is a specific section about this problem in the documentation.
You need to pass the new value of counterValue out of your transaction function, as follows:
const db = admin.firestore();
const couterRef = db.collection('invoices').doc('invoices_doc');
// let counterValue = 0 Remove this line
return db
.runTransaction((transaction) => {
return transaction.get(couterRef).then((counter) => {
const counterValue = counter.data().counter + 1;
// See the changes below !!
transaction.update(couterRef, { counter: counterValue }); // Actually, this is not an asynchronous operation
return counterValue; // We pass the new value of counterValue out of the transaction function
});
})
.then((counterValue) => {
// Create new invoice document
couterRef.collection('invoices').doc(counterValue.toString(10)).set({
series: 'SS',
series_nr: counterValue,
});
})
.then(() => response.status(200).send({ status: 'OK' }))
.catch((error) => {
response.status(200).send({ status: 'RUN TRANSACTION ERROR', error });
});
Also, in your HTTPS Cloud Function, don't send the response back to the client from WITHIN the Transaction: this is also an application state modification and shall not be done from within the transaction.
Similarly, don't include catch blocks in the then blocks: add a catch block only once at the end of the promise chain. If you want to deal with different error types in this unique catch block, just throw errors with different error messages and decide, in the catch block, what to do depending on the message. Alternatively you can create some subclasses of the Error class.
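As a rough illustration of that single-catch pattern (the error class and the helper function here are made up for the example, not part of your code), it could look like this:

// Hypothetical sketch: one catch at the end of the chain, using a custom Error
// subclass to decide which kind of failure we are dealing with.
class DocumentSetError extends Error {}

function createInvoiceDoc(counterValue) {
  // ...write the invoice document; reject with a DocumentSetError on failure...
  return Promise.reject(new DocumentSetError('could not write invoice ' + counterValue));
}

createInvoiceDoc(42)
  .then(() => console.log('OK'))
  .catch((error) => {
    if (error instanceof DocumentSetError) {
      console.log('DOCUMENT SET ERROR:', error.message);
    } else {
      console.log('RUN TRANSACTION ERROR:', error.message);
    }
  });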
Having said all of that, since your transaction only impacts one document and only increments a counter, you could very well use the FieldValue.increment() method, which is atomic. See this Firebase blog post for more details.
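For reference, a minimal sketch of that FieldValue.increment() alternative, reusing the couterRef document from the question, might look like the following. Note that, unlike the transaction, this does not hand the new counter value back to you, so it only fits cases where you don't need the incremented number afterwards:

// Hypothetical sketch: atomically increment the counter field without a transaction.
const admin = require('firebase-admin');
const couterRef = admin.firestore().collection('invoices').doc('invoices_doc');

couterRef
  .update({ counter: admin.firestore.FieldValue.increment(1) })
  .then(() => console.log('Counter incremented'))
  .catch((error) => console.log('Increment failed:', error));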
I am using Firebase Authentication to store users. I have two types of users: Manager and Employee. I am storing the manager's UID in a Firestore document along with the employee's UID. The structure is shown below.
Firestore structure
Company
|
> Document's ID
|
> mng_uid: Manager's UID
> emp_uid: Employee's UID
Now I want to perform a query like "retrieve the info of the employees who are under a specific manager". To do that, I tried to run the code below.
module.exports = {
get_users: async (mng_uid, emp_uid) => {
return await db.collection("Company").where("manager_uid", "==", mng_uid).get().then(snaps => {
if (!snaps.empty) {
let resp = {};
let i = 0;
snaps.forEach(async (snap) => {
resp[i] = await admin.auth().getUser(emp_uid).then(userRecord => {
return userRecord;
}).catch(err => {
return err;
});
i++;
});
return resp;
}
else return "Oops! Not found.";
}).catch(() => {
return "Error in retrieving employees.";
});
}
}
The above code returns {}. I tried to debug by returning data from specific lines, and found that the issue is in retrieving the user's info with the Firebase Auth function used in the forEach loop. But it does not return any error.
Thank you.
There are several points to be corrected in your code:
You use async/await together with then(), which is not recommended; only use one of these approaches.
If I understand your goal correctly ("Retrieve employees' info which is under the specific manager"), you do not need to pass an emp_uid parameter to your function; instead, for each snap you need to read the value of the emp_uid field with snap.data().emp_uid
Finally, you need to use Promise.all() to execute all the asynchronous getUser() method calls in parallel.
So the following should do the trick:
module.exports = {
get_users: async (mng_uid) => {
try {
const snaps = await db
.collection('Company')
.where('manager_uid', '==', mng_uid)
.get();
if (!snaps.empty) {
const promises = [];
snaps.forEach(snap => {
promises.push(admin.auth().getUser(snap.data().emp_uid));
});
return Promise.all(promises); //This will return an Array of UserRecords
} else return 'Oops! Not found.';
} catch (error) {
//...
}
},
};
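As a hedged usage sketch (the module path is an assumption), the caller would await the result and read the fields it needs from each UserRecord:

// Hypothetical usage of the corrected get_users function.
const { get_users } = require('./users'); // assumed module path

async function listEmployeeEmails(mng_uid) {
  const result = await get_users(mng_uid);
  if (Array.isArray(result)) {
    // result is an array of UserRecords with fields such as email, uid, displayName
    return result.map((userRecord) => userRecord.email);
  }
  return []; // the "Oops! Not found." case
}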
I have a database-triggered function that fires when a team administrator adds new members to his team. The function is supposed to create an authentication account in Firebase and an object where the new user can store his personal settings.
My problem is that when a lot of members are added simultaneously via an import feature, my function doesn't always complete. Since the functions seem to be triggered correctly when I look at the log, I suspect my implementation of chained promises is the cause of the error. Here is a copy of the code. Please help me correct the errors.
// When a team adds a new member, we should also create authentication and a record for his user data...
exports.createNewUserAndAuthOnNewMember = functions
.database
.ref('/Members/{team}/{memberId}/createNewUser')
.onCreate(event => {
const memberRef = admin.database().ref('/Members/'+event.params.team+'/'+event.params.memberId);
memberRef.once('value')
.then((memberSnap) => {
const memberEmail = memberSnap.child('email').val();
const preferredLanguage = memberSnap.child('preferredLanguage').val();
// Creating authentication for new system user...
//since we want to update the /user object later on even if the authentication creation fails (because user already exists), this promise is inside the top promise chain
admin.auth().createUser({
uid: event.params.memberId,
email: memberEmail,
emailVerified: false,
password: '[random password generated]',
disabled: false
})
.then(function(userRecord) {
console.log("Successfully created new user:", userRecord.uid);
return preferredLanguage;
})
.catch(function(error) {
console.log("Error creating new user:", error);
return preferredLanguage;
});
})
.then(preferredLanguage => {
// Creating the personal user object in the database
admin.database().ref('/users/'+event.params.memberId).update({'team': event.params.team, 'preferredLanguage': preferredLanguage});
})
.then(() => {
//we did the job and should remove the trigger from the member object in the database
memberRef.child('createNewUser').remove();
})
.then(() => {
console.log('Created /users/'+event.params.memberId);
return true;
});
});
This should work:
exports.createNewUserAndAuthOnNewMember = functions
.database
.ref('/Members/{team}/{memberId}/createNewUser')
.onCreate(event => {
let preferredLanguage;
const memberRef = admin.database().ref('/Members/' + event.params.team + '/' + event.params.memberId);
return memberRef.once('value')
.then((memberSnap) => {
const memberEmail = memberSnap.child('email').val();
preferredLanguage = memberSnap.child('preferredLanguage').val();
// Creating authentication for new system user...
//since we want to update the /user object later on even if the authentication creation fails (because user already exists), this promise is inside the top promise chain
return admin.auth().createUser({
uid: event.params.memberId,
email: memberEmail,
emailVerified: false,
password: '[random password generated]',
disabled: false
})
})
.then(() => {
// Creating the personal user object in the database
return admin.database().ref('/users/' + event.params.memberId).update({'team': event.params.team, 'preferredLanguage': preferredLanguage});
})
.then(() => {
//we did the job and should remove the trigger from the member object in the database
return memberRef.child('createNewUser').remove();
})
.catch(error => {
console.log(error);
//...
});
});
You have to return the promise in each then() when chaining them, and you only need one catch at the end of the chain.
In addition, note that you are using the "old" syntax for Cloud Functions. Since version 1.0.+ there is a new syntax, see https://firebase.google.com/docs/functions/beta-v1-diff
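As a rough sketch of that newer syntax (untested, and only illustrating the changed handler signature), the single event object is replaced by a snapshot and a context:

// Sketch of the post-1.0 Cloud Functions signature: (snapshot, context) instead of (event).
exports.createNewUserAndAuthOnNewMember = functions
  .database
  .ref('/Members/{team}/{memberId}/createNewUser')
  .onCreate((snapshot, context) => {
    const { team, memberId } = context.params; // was event.params.team / event.params.memberId
    const memberRef = admin.database().ref('/Members/' + team + '/' + memberId);
    // ...the same promise chain as above, using team and memberId...
    return memberRef.once('value');
  });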
I'm trying to make a Cloud Function that triggers on an HTTP request (which is sent on a timer) and removes all children with a specific value.
The database node looks like this:
activities
4GI1QXUJG0MeQ8Bq19WOdCQFo9r1 //uid
activity: "hammer"
id: some ID
note: "some note"
timestamp: some timeintervalsince1970
7IDUiufhiws8939hdfIUHiuhwdi5
etc....
I want to look through all the activities, and if the activity value is "hammer", I want to remove the child.
This is what I have so far
exports.deleteHammerNotifications = functions.https.onRequest((req, res) => {
admin.database().ref('activities').once('value', (snapshot) => {
console.log(snapshot.val())
});
});
which prints:
{
'4GI1QXUJG0MeQ8Bq19WOdCQFo9r1':
{ activity: 'nn',
id: '4GI1QXUJG0MeQ8Bq19WOdCQFo9r1',
note: 'Blank note...',
timestamp: 1498032472 },
M6xQU5XWTEVbSqBnR3HBAEhA9hI3:
{ activity: 'hammer',
id: 'M6xQU5XWTEVbSqBnR3HBAEhA9hI3',
note: 'some note here...',
timestamp: 1497973839 },
}
My problem is I don't know how to cycle through the DataSnapshot and look for all the children that have the activity: "hammer" value.
I have done a similar function in my Xcode project with arrays, but I don't know how to do it in JavaScript.
Any help is appreciated!
To cycle through the matching child nodes, use snapshot.forEach():
exports.deleteHammerNotifications = functions.https.onRequest((req, res) => {
admin.database().ref('activities').once('value', (snapshot) => {
snapshot.forEach((childSnapshot) => {
console.log(childSnapshot.val())
});
});
});
But you're still missing a query here to select the correct nodes. Without such a query you might as well call admin.database().ref('activities').remove().
To most efficiently delete a number of items from the database and write a single response back to the user, use this function (which I modified from something I needed recently):
exports.cleanup = functions.https.onRequest((req, res) => {
var query = admin.database().ref("activities").orderByChild("activity").equalTo("hammer");
query.once("value").then((snapshot) => {
console.log("cleanup: "+snapshot.numChildren()+" activities");
var updates = {};
snapshot.forEach((child) => {
updates["activities/"+child.key] = null;
});
admin.database().ref().update(updates).then(() => {
res.status(200).send(snapshot.numChildren()+" activities deleted");
}).catch((error) => {
res.status(500).send(error);
})
});
});
Learn more:
Firebase documentation on querying
Firebase blog post on multi-location updates
I am not sure if it's possible, but IF you can start a "child_added" listener once the HTTPS trigger has run, you can do it like this.
ref.on('child_added', function (snapshot) {
  if (snapshot.child("activity").val() === 'hammer') {
    var value = snapshot.child("activity").val();
    // ...do something with the matching child here...
  }
});
I am doing the exact same thing to see if people are still subscribed to my mailing list or not, and IF they are, they will receive a mail.
Hope that helps :-)
I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was using node-mysql previously, which allows multiple statements. While using that, I simply built a multiple-statement query string and sent it over the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately. That's bad because it means there will be lots of round trips from the server to the DB.
I can use Knex's raw() and probably do something similar to what I did with node-mysql. However, that defeats the whole purpose of Knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want to have partial updates in case something went wrong).
I've resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
const queries = [];
users.forEach(user => {
const query = connection('users')
.where('id', user.id)
.update({
lastActivity: user.lastActivity,
points: user.points,
})
.transacting(trx); // This makes every update be in the same transaction
queries.push(query);
});
Promise.all(queries) // Once every query is written
.then(trx.commit) // We try to execute all of them
.catch(trx.rollback); // And rollback in case any of them goes wrong
});
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
return knex.transaction(trx => {
const queries = collection.map(tuple =>
knex(table)
.where('id', tuple.id)
.update(tuple)
.transacting(trx)
);
return Promise.all(queries)
.then(trx.commit)
.catch(trx.rollback);
});
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
return knex.transaction(trx => {
const queries = collection.map(tuple =>
knex(options.table)
.where(options.column, tuple[options.column])
.update(tuple)
.transacting(trx)
);
return Promise.all(queries)
.then(trx.commit)
.catch(trx.rollback);
});
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version:
const batchUpdate = async (options, collection) => {
const { table, column } = options;
const trx = await knex.transaction();
try {
await Promise.all(collection.map(tuple =>
knex(table)
.where(column, tuple[column])
.update(tuple)
.transacting(trx)
)
);
await trx.commit();
  } catch (error) {
    await trx.rollback();
    throw error; // rethrow so the caller knows the batch failed
  }
}
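To call it (and since the function is now async, remember to await the returned promise):
await batchUpdate({ table: 'user', column: 'user_id' }, [...]);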
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes you can run them in parallel, but your bottleneck becomes the time it takes for the db to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one object for each row we want to update) whose property names line up with the columns in the database you want to update:
var knex = require('knex')({ client: 'mysql', connection: { /* your connection settings */ } }),
    _ = require('underscore');
function bulkUpdate (records) {
var updateQuery = [
'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
_.map(records, () => '(?)').join(','),
'ON DUPLICATE KEY UPDATE',
'col2 = VALUES(col2),',
'colN = VALUES(colN)'
].join(' '),
vals = [];
_(records).map(record => {
vals.push(_(record).values());
});
return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
{ primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
{ /* some other record, same props */ }
];
Please note that if your record has additional properties beyond the ones you specified in the query, you cannot do:
_(records).map(record => {
vals.push(_(record).values());
});
Because you will hand too many values to the query per record, and knex will fail to match the property values of each record with the ? characters in the query. Instead, you will need to explicitly push the values of each record that you want to insert into an array, like so:
// assume a record has additional property `type` that you dont want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
vals.push(record.primaryKeyCol);
vals.push(record.col2);
vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references (one possible variant is sketched below), but this is just an example. Hope this helps!
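One such less repetitive variant, just as a sketch and assuming the same per-record array binding as the first snippet, keeps an explicit column whitelist and maps over it:

// Hypothetical sketch: whitelist the columns once, then pick them from every record.
var columns = ['primaryKeyCol', 'col2', 'colN'];
_(records).map(record => {
  vals.push(_(columns).map(col => record[col])); // one array of values per record
});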
The solution works great for me! I just include an ID parameter to make it dynamic across tables with custom ID tags. Chenhai, here's my snippet including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
return knex.transaction((trx) => {
const queries = collection.map(async (tuple) => {
const [tupleId] = await knex(table)
.where(`${id}`, tuple[id])
.update(tuple)
.transacting(trx)
.returning(id);
return tupleId;
});
return Promise.all(queries).then(trx.commit).catch(trx.rollback);
});
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, i.e. 1,000 rows per batch.
And as long as it is done in batches, bluebird's map can be used.
For more information on bluebird map: http://bluebirdjs.com/docs/api/promise.map.html
// Assumes an initialized knex instance (as in the answers above) and bluebird for Promise.map.
const Bluebird = require('bluebird');

const limit = 1000;
const totalRows = 50000;
const dbTable = 'my_table'; // placeholder table name
const seq = count => [...Array(Math.ceil(count / limit)).keys()]; // one page index per batch

const updateTable = (dbTable, page) => {
  const offset = limit * page;
  return knex(dbTable).pluck('id').limit(limit).offset(offset).then(ids => {
    return knex(dbTable)
      .whereIn('id', ids)
      .update({ date: new Date() })
      .then((rows) => {
        console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
      })
      .catch((err) => {
        console.log({ err });
      });
  })
  .catch((err) => {
    console.log({ err });
  });
};

Bluebird.map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Here pluck() is used to get the ids in array form.
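For clarity, a small illustrative snippet (reusing the knex instance and dbTable from the block above): pluck('id') resolves to a plain array of values rather than row objects.

// Hypothetical illustration: pluck('id') yields e.g. [1, 2, 3]
// instead of [{ id: 1 }, { id: 2 }, { id: 3 }].
knex(dbTable).pluck('id').limit(3).then((ids) => console.log(ids));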