How to update hasMany associations when updating model? - javascript

Is there a way to update an object (organization) and its associations (tags) in a single call? In my case I have Orgs -> tags. One org can have many tags.
I can't figure out how to update the tags as well as the organization in one simple call.
function updateOrganization(db, stats) {
  return function (req, res) {
    let myOrg
    db.organization.findOne({
      where: {
        id: req.params.id
      },
      include: ['tags']
    })
      .then(org => {
        myOrg = org
        let promises = []
        if (req.body.tags) {
          req.body.tags.forEach(tag => {
            promises.push(org.createTag({ name: tag }))
          })
        }
        return Promise.all(promises)
      })
      .then(tags => {
        console.log('tags = ', tags)
        return myOrg.setTags(tags) // <-- DOES NOT SEEM TO BE WORKING
      })
      .then(updatedOrg => {
        console.log('updatedOrg.get() = ', updatedOrg.get()) // <-- DOES NOT CONTAIN NEW TAGS
        console.log('myOrg final = ', myOrg.get()) // <-- DOES NOT CONTAIN NEW TAGS
        return res.status(HttpStatus.OK).send(myOrg)
      })
      .catch(err => {
        req.log.error(err)
        return handleErr(res, HttpStatus.INTERNAL_SERVER_ERROR, err.message)
      })
  }
}
NOTE: It looks like the line promises.push(org.createTag({ name: tag })) is actually creating the tags, and the line return myOrg.setTags(tags) is not necessary. When I fetch this record with a findOne query, all the tags do in fact exist. So why don't they appear in my log statements, which show the output of updatedOrg?

You can simply use something like this:
function updateOrganization(db, stats) {
  return async (req, res) => {
    try {
      // Get the organization + associations
      const org = await Organization.find({
        where: { id: req.params.id },
        include: ['tags'],
        limit: 1, // added because in version 4.42.0 '.findOne()' is deprecated
      });
      // Check if we have any tags specified in the body
      if (req.body.tags) {
        await Promise.all(req.body.tags.map(tag => org.createTag({ name: tag })));
      }
      // use '.reload()' to refresh associated data; note it returns a
      // promise, so it must be awaited before sending the response
      await org.reload();
      return res.status(HttpStatus.OK).send(org);
    } catch (err) {
      req.log.error(err);
      return handleErr(res, HttpStatus.INTERNAL_SERVER_ERROR, err.message);
    }
  }
}
You can read more about .reload() in the Sequelize docs.
I also recommend using Sequelize transactions in the future; they make it much easier to control your app's flow.
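For example, a minimal sketch of a managed transaction (assuming db.sequelize exposes the Sequelize instance behind the question's db object, which is an assumption of this example):

// A minimal sketch, assuming `db.sequelize` is the Sequelize instance.
// With a managed transaction, any thrown error rolls everything back.
await db.sequelize.transaction(async (t) => {
  const org = await db.organization.findOne({
    where: { id: req.params.id },
    include: ['tags'],
    transaction: t,
  });
  await Promise.all(
    (req.body.tags || []).map(tag => org.createTag({ name: tag }, { transaction: t }))
  );
});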

Are you sure you want a "hasMany" relationship? Usually tags are shared, and the Org would have a "belongsToMany" relationship. Please post your models and schema, and if possible, a working example (with a connection to a database with your schema) or a link to one.
In any case, I believe createTag is only going to work if the tag does not already exist. If the tag exists, you need to setTags or addTags, passing in the tag objects or their primary key IDs.
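For illustration, a minimal sketch that reuses existing tags and creates only the missing ones (a db.tag model and the setTags/reload accessors Sequelize generates for a tags association are assumptions here, not the poster's confirmed schema):

// A sketch, assuming a `db.tag` model and a generated `setTags` accessor.
async function replaceOrgTags(org, tagNames) {
  // findOrCreate resolves to [instance, created]; keep just the instance
  const tags = await Promise.all(
    tagNames.map(name =>
      db.tag.findOrCreate({ where: { name } }).then(([tag]) => tag)
    )
  );
  await org.setTags(tags); // replaces the association set with exactly these tags
  return org.reload({ include: ['tags'] });
}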

Related

Working SQL yields Syntax Error in pg-promise

I have the following code in one of my endpoints:
let setText = ''
for (const [key, value] of Object.entries(req.body.info)) {
  setText = setText.concat(`${key} = ${value}, `)
}
// Last two characters are always an extra comma and whitespace
setText = setText.substring(0, setText.length - 2)
db.one('UPDATE students SET ${setText} WHERE id = ${id} RETURNING *', { setText, id: req.body.id })
  .then(data => {
    res.json({ data })
  })
  .catch(err => {
    res.status(400).json({ 'error': err.message })
  })
It is supposed to dynamically generate the SQL from the request body. When I log the created SQL, it generates correctly. It even works when I directly query the database. However, whenever I run the endpoint, I get a syntax error that's "at or near" whatever setText is. I've tried using slice instead of substring with no change.
You should never concatenate values manually, as they are not escaped properly that way, which opens your code to possible SQL injection.
Use the tools that the library offers. For an UPDATE from a dynamic object, see below:
const cs = new pgp.helpers.ColumnSet(req.body.info, { table: 'students' });
const query = pgp.helpers.update(req.body.info, cs) +
  pgp.as.format(' WHERE id = ${id} RETURNING *', req.body);
db.one(query)
  .then(data => {
    res.json({ data });
  })
  .catch(err => {
    res.status(400).json({ error: err.message })
  });
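To see what the helpers generate, here is a small sketch with a hypothetical request body (the table and values are made up purely for illustration):

const pgp = require('pg-promise')();

// Hypothetical input, just to show the shape of the generated SQL
const info = { name: 'John', age: 21 };
const cs = new pgp.helpers.ColumnSet(info, { table: 'students' });

console.log(pgp.helpers.update(info, cs));
// => UPDATE "students" SET "name"='John',"age"=21
// Every value is escaped by the library, so no manual concatenation is needed.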

How to use dataloader?

I'm trying to figure this out.
I want to get all my users from my database and cache them, and then when making a new request I want to get the ones I've cached plus any new ones that have been created.
So far:
const batchUsers = async ({ user }) => {
  const users = await user.findAll({});
  return users;
};
const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: {
    userLoader: new DataLoader(() => batchUsers(db)), // not sending keys since I'm after all users
  },
});
my resolver:
users: async (obj, args, context, info) => {
  return context.userLoader.load();
}
The load method requires a parameter, but in this case I don't want a specific user; I want all of them.
I don't understand how to implement this. Can someone please explain?
If you're trying to just load all records, then there's not much of a point in utilizing DataLoader to begin with. The purpose behind DataLoader is to batch multiple calls like load(7) and load(22) into a single call that's then executed against your data source. If you need to get all users, then you should just call user.findAll directly.
Also, if you do end up using DataLoader, make sure you pass in a function, not an object, as your context. The function will be run on each request, which will ensure you're using a fresh instance of DataLoader instead of one with a stale cache.
context: () => ({
  userLoader: new DataLoader(async (ids) => {
    const users = await User.findAll({
      where: { id: ids }
    })
    // Note that we need to map over the original ids instead of
    // just returning the results of User.findAll because the
    // length of the returned array needs to match the length of the ids
    return ids.map(id => users.find(user => user.id === id) || null)
  }),
}),
Note that you could also return an instance of an error instead of null inside the array if you want load to reject.
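For example (a sketch of that variation), the last line of the batch function above could become:

// Returning an Error instance makes the corresponding load(id) call
// reject instead of resolving to null
return ids.map(
  id => users.find(user => user.id === id) ||
    new Error(`No user found for id ${id}`)
)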
Took me a while but I got this working:
const batchUsers = async (keys, { user }) => {
  const users = await user.findAll({
    raw: true,
    where: {
      Id: {
        // @ts-ignore
        // eslint-disable-next-line no-undef
        [op.in]: keys,
      },
    },
  });
  const gs = _.groupBy(users, 'Id');
  return keys.map(k => gs[k] || []);
};
const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: () => ({
    userLoader: new DataLoader(keys => batchUsers(keys, db)),
  }),
});
resolver:
user: {
  myUsers: ({ Id }, args, { userLoader }) => {
    return userLoader.load(Id);
  },
},
playground:
{
  users {
    Id
    myUsers {
      Id
    }
  }
}
playground explained:
users basically fetches all users, and then myUsers does the same thing by inheriting the Id from the first call.
I think I chose a poor example here, since I did not see any gains in performance from this. I did see, however, that the query turned into:
SELECT ... FROM User WHERE Id IN (...)

Refresh template in Keystone js

I created two models in Keystone. One is CD_Book, the second is Musician. I'm trying to connect these two collections when I open the CD_Book view. I want to display the musicians from a CD, so I wrote this query:
keystone.list('CD_Book').model.findOne({
  slug: locals.filters.cdbook
}).exec(function (err, result) {
  locals.data.cdbook = result;
  let musiciansString = result.musicians
  musiciansTab = musiciansString.split(',');
  for (let i = 0; i < musiciansTab.length; i++) {
    keystone.list("Musician").model.findOne({
      "title": musiciansTab[i].trim()
    }).exec(function (err, result) {
      locals.data.musicians.push(result);
      console.log(locals.data.musicians);
    });
  }
  next(err);
});
And of course it works: in console.log I get all the musician data I want, but it doesn't display in the .hbs template. How should I refresh/update the template after finding all the musicians? Maybe this isn't the best way to achieve it (using a for loop).
Your findOne calls are asynchronous so next() gets called before they finish.
Therefore, your template will be rendered before the data is available in locals.data.musicians.
You could try using find instead of findOne to get all musicians in one go and then set that to locals once retrieved.
You can then call next() when done to continue on to render the template.
Try something like this:
keystone.list('CD_Book').model
  .findOne({ slug: locals.filters.cdbook })
  .exec()
  .then(result => {
    locals.data.cdbook = result
    let musiciansString = result.musicians
    let musiciansTab = musiciansString
      .split(',')
      .map(musician => musician.trim())
    return keystone.list("Musician").model
      .find({ "title": { $in: musiciansTab } })
      .exec()
  })
  .then(result => {
    locals.data.musicians = result
    console.log(locals.data.musicians)
    next()
  })
  .catch(err => {
    next(err)
  })
I hope this helps.

How to look for a specific value in a DataSnapshot with Firebase Cloud Functions

I'm trying to make a Cloud Function that will trigger on an HTTP request (which is sent on a timer) and remove all children with a specific value.
The database node looks like this:
activities
  4GI1QXUJG0MeQ8Bq19WOdCQFo9r1   // uid
    activity: "hammer"
    id: some ID
    note: "some note"
    timestamp: some timeIntervalSince1970
  7IDUiufhiws8939hdfIUHiuhwdi5
    etc....
I want to look through all the activities, and if the activity value is "hammer", I want to remove the child.
This is what I have so far
exports.deleteHammerNotifications = functions.https.onRequest((req, res) => {
  admin.database().ref('activities').once('value', (snapshot) => {
    console.log(snapshot.val())
  });
});
which prints:
{
  '4GI1QXUJG0MeQ8Bq19WOdCQFo9r1': {
    activity: 'nn',
    id: '4GI1QXUJG0MeQ8Bq19WOdCQFo9r1',
    note: 'Blank note...',
    timestamp: 1498032472
  },
  M6xQU5XWTEVbSqBnR3HBAEhA9hI3: {
    activity: 'hammer',
    id: 'M6xQU5XWTEVbSqBnR3HBAEhA9hI3',
    note: 'some note here...',
    timestamp: 1497973839
  },
}
My problem is I don't know how to cycle through the DataSnapshot and look for all the children that have the activity: "hammer" value.
I have done a similar function in my Xcode project with arrays, but I don't know how to do it in JavaScript.
Any help is appreciated!
To cycle through the matching child nodes, use snapshot.forEach():
exports.deleteHammerNotifications = functions.https.onRequest((req, res) => {
  admin.database().ref('activities').once('value', (snapshot) => {
    snapshot.forEach((childSnapshot) => {
      console.log(childSnapshot.val())
    });
  });
});
But you're still missing a query here to select the correct nodes. Without such a query you might as well call admin.database().ref('activities').remove().
To most efficiently delete a number of items from the database and write a single response back to the user, use this function (which I modified from something I needed recently):
exports.cleanup = functions.https.onRequest((req, res) => {
  var query = admin.database().ref("activities").orderByChild("activity").equalTo("hammer");
  query.once("value").then((snapshot) => {
    console.log("cleanup: " + snapshot.numChildren() + " activities");
    var updates = {};
    snapshot.forEach((child) => {
      updates["activities/" + child.key] = null;
    });
    admin.database().ref().update(updates).then(() => {
      res.status(200).send(snapshot.numChildren() + " activities deleted");
    }).catch((error) => {
      res.status(500).send(error);
    })
  });
});
Learn more:
Firebase documentation on querying
Firebase blog post on multi-location updates
I am not sure if it's possible, but if you can start a "child_added" listener once the HTTPS trigger has run, you can do it like this:
ref.on('child_added', function (snapshot) {
  if (snapshot.child("activity").val() === 'hammer') {
    var value = snapshot.child("activity").val();
    // handle (e.g. remove) the matching child here
  }
});
I am doing the exact same thing to see if people are still subscribed to my mailing list or not, and if they are, they will receive a mail.
Hope that helps :-)

Batch update in knex

I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was using node-mysql previously, which allowed multiple statements. While using that I simply built a multiple-statement query string and just sent it over the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately. That's bad because it means there are going to be lots of round trips from the server to the DB.
I can use the raw() facility of Knex and probably do something similar to what I did with node-mysql. However, that defeats the whole purpose of Knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want to have partial updates in case something went wrong).
I resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
  const queries = [];
  users.forEach(user => {
    const query = connection('users')
      .where('id', user.id)
      .update({
        lastActivity: user.lastActivity,
        points: user.points,
      })
      .transacting(trx); // This makes every update be in the same transaction
    queries.push(query);
  });
  Promise.all(queries) // Once every query is written
    .then(trx.commit) // We try to execute all of them
    .catch(trx.rollback); // And rollback in case any of them goes wrong
});
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(table)
        .where('id', tuple.id)
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(options.table)
        .where(options.column, tuple[options.column])
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version:
// the function must be async, since it awaits the transaction
const batchUpdate = async (options, collection) => {
  const { table, column } = options;
  const trx = await knex.transaction();
  try {
    await Promise.all(collection.map(tuple =>
      knex(table)
        .where(column, tuple[column])
        .update(tuple)
        .transacting(trx)
    ));
    await trx.commit();
  } catch (error) {
    await trx.rollback();
  }
}
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes you can run them in parallel, but your bottleneck becomes the time it takes for the db to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one object for each row we want to update) whose property names line up with the columns in the database table you want to update:
var knex = require('knex'),
  _ = require('underscore');

function bulkUpdate(records) {
  var updateQuery = [
      'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
      _.map(records, () => '(?)').join(','),
      'ON DUPLICATE KEY UPDATE',
      'col2 = VALUES(col2),',
      'colN = VALUES(colN)'
    ].join(' '),
    vals = [];
  _(records).map(record => {
    vals.push(_(record).values());
  });
  return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
  { primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
  { /* some other record, same props */ }
];
Please note that if your record has additional properties beyond the ones you specified in the query, you cannot do:
_(records).map(record => {
  vals.push(_(record).values());
});
Because you will hand too many values to the query per record, and knex will fail to match the property values of each record with the ? characters in the query. Instead you will need to explicitly push the values from each record that you want to insert into an array, like so:
// assume a record has an additional property `type` that you don't want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
  vals.push(record.primaryKeyCol);
  vals.push(record.col2);
  vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references, but this is just an example. Hope this helps!
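For instance, one less repetitive option (a sketch; the column whitelist is an assumption of this example) is to keep the expected columns in an array and pick them off each record in order:

// Whitelist the columns the query expects, in order
var columns = ['primaryKeyCol', 'col2', 'colN'];
_(records).map(record => {
  columns.forEach(col => vals.push(record[col]));
});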
The solution works great for me! I just include an ID parameter to make it dynamic across tables with custom ID tags. Chenhai, here's my snippet including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
  return knex.transaction((trx) => {
    const queries = collection.map(async (tuple) => {
      const [tupleId] = await knex(table)
        .where(`${id}`, tuple[id])
        .update(tuple)
        .transacting(trx)
        .returning(id);
      return tupleId;
    });
    return Promise.all(queries).then(trx.commit).catch(trx.rollback);
  });
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, e.g. 1000 rows per batch.
And as long as it is done in batches, bluebird's map can be used.
For more information on bluebird map: http://bluebirdjs.com/docs/api/promise.map.html
// bluebird's map gives controlled concurrency over the batches
const { map } = require('bluebird');

const limit = 1000;
const totalRows = 50000;
const seq = count => [...Array(Math.ceil(count / limit)).keys()];

const updateTable = async (dbTable, page) => {
  let offset = limit * page;
  return knex(dbTable).pluck('id').limit(limit).offset(offset).then(ids => {
    return knex(dbTable)
      .whereIn('id', ids)
      .update({ date: new Date() })
      .then((rows) => {
        console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
      })
      .catch((err) => {
        console.log({ err });
      });
  })
  .catch((err) => {
    console.log({ err });
  });
};

// run the batches one page at a time; dbTable is the table name
map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Here pluck() is used to get the ids as an array.
