Working SQL yields Syntax Error in pg-promise - javascript

I have the following code in one of my endpoints:
let setText = ''
for (const [ key, value ] of Object.entries(req.body.info)) {
    setText = setText.concat(`${key} = ${value}, `)
}
// Last two characters are always an extra comma and whitespace
setText = setText.substring(0, setText.length - 2)
db.one('UPDATE students SET ${setText} WHERE id = ${id} RETURNING *', { setText, id: req.body.id })
    .then(data => {
        res.json({ data })
    })
    .catch(err => {
        res.status(400).json({'error': err.message})
    })
It is supposed to dynamically generate the SQL from the request body. When I log the created SQL, it generates correctly. It even works when I directly query the database. However, whenever I run the endpoint, I get a syntax error that's "at or near" whatever setText is. I've tried using slice instead of substring with no change.

You should never concatenate values manually, as they are not escaped properly that way, which leaves your code open to SQL injection.
Use the tools that the library offers. For an UPDATE from a dynamic object, see below:
const cs = new pgp.helpers.ColumnSet(req.body.info, {table: 'students'});
const query = pgp.helpers.update(req.body.info, cs) +
              pgp.as.format(' WHERE id = ${id} RETURNING *', req.body);
db.one(query)
    .then(data => {
        res.json({ data });
    })
    .catch(err => {
        res.status(400).json({error: err.message})
    });
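For context on why the original code fails: pg-promise formats ${setText} as an escaped text value, not as raw SQL, so the whole SET fragment ends up wrapped in single quotes. A minimal sketch (with hypothetical values) of what actually gets sent to the server:
const pgp = require('pg-promise')();
// hypothetical fragment, as the loop in the question would build it
const setText = 'name = Ann, age = 21';
console.log(pgp.as.format('UPDATE students SET ${setText} WHERE id = ${id}', { setText, id: 1 }));
// => UPDATE students SET 'name = Ann, age = 21' WHERE id = 1
//    the quoted fragment after SET is what triggers the "syntax error at or near ..." message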

Related

String literals in JS API query

I'm currently using the MTG API SDK (https://api.magicthegathering.io), which has queries structured like this:
// partial name match
mtg.card.where({name: 'Archangel Avacyn'})
    .then(results => {
        console.log(results)
    })
// exact name match
mtg.card.where({name: '"Archangel Avacyn"'})
    .then(results => {
        console.log(results)
    })
I have a REST API trying to access the SDK on the server by passing in a string value for the exact name of the card you want to find, like this:
exports.getCard = (req, res) => {
    const card = req.params.card;
    let query = "'\"" + card + "\"'";
    console.log(query);
    mtg.card.where({name: query})
        .then(result => {
            res.send(result);
        });
}
My problem is that if I use the variable req.params.card in the query I get the partial-match result, and if I use the query variable I get nothing. I need an exact match.
How can I get the exact match from passing in a variable instead of static text?
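One likely fix, sketched from the SDK's own exact-match example above: wrap the raw card name in double quotes only. The extra single quotes in the question's query string become part of the value itself, which is likely why nothing matches. (This assumes the same mtg SDK and route wiring as in the question.)
exports.getCard = (req, res) => {
    const card = req.params.card;
    // mirror the exact-match example: "Archangel Avacyn"
    // i.e. double quotes around the name, no surrounding single quotes
    const query = `"${card}"`;
    console.log(query);
    mtg.card.where({name: query})
        .then(result => {
            res.send(result);
        });
}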

How to update hasMany associations when updating model?

Is there a way to update an object (organization) and its associations (tags) in a single call? In my case I have Orgs -> Tags. One org can have many tags.
I can't figure out how to update the tags as well as the organization in one simple call:
function updateOrganization(db, stats) {
    return function (req, res) {
        let myOrg
        db.organization.findOne({
            where: {
                id: req.params.id
            },
            include: ['tags']
        })
        .then(org => {
            myOrg = org
            let promises = []
            if (req.body.tags) {
                req.body.tags.forEach(tag => {
                    promises.push(org.createTag({ name: tag }))
                })
            }
            return Promise.all(promises)
        })
        .then(tags => {
            console.log('tags = ', tags)
            return myOrg.setTags(tags) // <-- DOES NOT SEEM TO BE WORKING
        })
        .then(updatedOrg => {
            console.log('updatedOrg.get() = ', updatedOrg.get()) // <-- DOES NOT CONTAIN NEW TAGS
            console.log('myOrg final = ', myOrg.get()) // <-- DOES NOT CONTAIN NEW TAGS
            return res.status(HttpStatus.OK).send(myOrg)
        })
        .catch(err => {
            req.log.error(err)
            return handleErr(res, HttpStatus.INTERNAL_SERVER_ERROR, err.message)
        })
    }
}
NOTE: It looks like the line promises.push(org.createTag({ name: tag })) is actually creating the tags, and the line return myOrg.setTags(tags) is not necessary. When I fetch this record with a findOne query, all the tags do actually exist. So why don't they appear in my log statements, which print the output of updatedOrg?
You can simply use something like this:
function updateOrganization(db, stats) {
    return async (req, res) => {
        try {
            // Get the organization + associations
            const org = await Organization.find({
                where: { id: req.params.id },
                include: ['tags'],
                limit: 1, // added this because in the 4.42.0 version '.findOne()' is deprecated
            });
            // Check if we have any tags specified in the body
            if (req.body.tags) {
                await Promise.all(req.body.tags.map(tag => org.createTag({ name: tag })));
            }
            // use '.reload()' to refresh associated data; it returns a promise, so await it
            return res.status(HttpStatus.OK).send(await org.reload());
        } catch (err) {
            req.log.error(err);
            return handleErr(res, HttpStatus.INTERNAL_SERVER_ERROR, err.message);
        }
    }
}
You can read more about .reload() here.
Also, I recommend using Sequelize transactions in the future; they make it much easier to control your app flow.
Are you sure you want a "hasMany" relationship? Usually tags are shared, and the Org would have a "belongsToMany" relationship. Please post your models and schema, and if possible a working example (with a connection to a database with your schema) or a link to one.
In any case, I believe createTag is only going to work if the tag does not already exist. If the tag exists, you need setTags or addTags, passing in the tag objects or their primary key IDs, as sketched below.
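A minimal sketch of that setTags/addTags approach, meant to run inside the async handler from the answer above. It assumes the tags already exist and that the Tag model is exposed as db.tag; both names are illustrative, not taken from the question:
// hypothetical: look up existing tags by name, then associate them with the org
const tags = await db.tag.findAll({
    where: { name: req.body.tags } // an array value acts as an IN (...) filter
});
await org.addTags(tags); // or org.setTags(tags) to replace the current set entirely
const updated = await org.reload({ include: ['tags'] });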

How to get multiple DOM elements with chrome-remote-interface node js?

I am trying to build a crawler with chrome-remote-interface, but I don't know how to get multiple DOM elements by specific targets such as ids or classes.
For example:
price = document.getElementById('price')
name = document.getElementById('name')
Code
const CDP = require('chrome-remote-interface');
CDP((client) => {
    // Extract used DevTools domains.
    const {Page, Runtime} = client;
    // Enable events on domains we are interested in.
    Promise.all([
        Page.enable()
    ]).then(() => {
        return Page.navigate({url: 'http://example.com'})
    });
    // Evaluate outerHTML after page has loaded.
    Page.loadEventFired(() => {
        Runtime.evaluate({expression: 'document.body.outerHTML'}).then((result) => {
            // How to get multiple DOM elements?
            console.log(result.result.value);
            client.close();
        });
    });
}).on('error', (err) => {
    console.error('Cannot connect to browser:', err);
});
Update
const CDP = require('chrome-remote-interface');
CDP((client) => {
    // Extract used DevTools domains.
    const {DOM, Page, Runtime} = client;
    // Enable events on domains we are interested in.
    Promise.all([
        Page.enable()
    ]).then(() => {
        return Page.navigate({url: 'https://someDomain.com'});
    })
    Page.loadEventFired(() => {
        const expression = `({
            test: document.getElementsByClassName('rows')),
        })`
        Runtime.evaluate({expression, returnByValue: true}).then((result) => {
            console.log(result.result) // Error
            client.close()
        })
    })
}).on('error', (err) => {
    console.error('Cannot connect to browser:', err);
});
Error
{ type: 'object',
  subtype: 'error',
  className: 'SyntaxError',
  description: 'SyntaxError: Unexpected token )',
  objectId: '{"injectedScriptId":14,"id":1}' }
Actually I want to iterate over the list of elements, but I don't know where it goes wrong.
You cannot move DOM objects from the browser context to the Node.js context; all you can do is pass a property or whatever can be serialized as a JSON object. Here I'm assuming you're interested in the computed HTML.
A possible solution is:
const CDP = require('chrome-remote-interface');
CDP((client) => {
    // Extract used DevTools domains.
    const {Page, Runtime} = client;
    // Enable events on domains we are interested in.
    Promise.all([
        Page.enable()
    ]).then(() => {
        return Page.navigate({url: 'http://example.com'});
    });
    // Evaluate outerHTML after page has loaded.
    Page.loadEventFired(() => {
        const expression = `({
            name: document.getElementById('name').outerHTML,
            price: document.getElementById('price').outerHTML
        })`;
        Runtime.evaluate({
            expression,
            returnByValue: true
        }).then(({result}) => {
            const {name, price} = result.value;
            console.log(`name: ${name}`);
            console.log(`price: ${price}`);
            client.close();
        });
    });
}).on('error', (err) => {
    console.error('Cannot connect to browser:', err);
});
The key point is returning a JSON object using returnByValue: true.
Update: You have an error in your expression, a trailing ) in ...('rows')),. But even if you fix it, you'd still end up in a wrong situation, because you're attempting to pass an array of DOM objects (see the first paragraph of this answer). Again, if you just want the outer HTML you can do something like:
// Evaluate outerHTML after page has loaded.
Page.loadEventFired(() => {
    const expression = `
        // fetch an array-like of DOM elements
        var elements = document.getElementsByTagName('p');
        // create and return an array containing
        // just a property (in this case outerHTML)
        Array.prototype.map.call(elements, x => x.outerHTML);
    `;
    Runtime.evaluate({
        expression,
        returnByValue: true
    }).then(({result}) => {
        // this is the returned array
        const elements = result.value;
        elements.forEach((html) => {
            console.log(`- ${html}`);
        });
        client.close();
    });
});
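To tie this back to the original question (the price and name ids plus the rows class), here is a sketch that returns several fields from a single evaluate call; the selectors are the ones from the question and are assumed to exist on the page:
Page.loadEventFired(() => {
    const expression = `({
        price: document.getElementById('price') && document.getElementById('price').outerHTML,
        name: document.getElementById('name') && document.getElementById('name').outerHTML,
        rows: Array.prototype.map.call(document.getElementsByClassName('rows'), x => x.outerHTML)
    })`;
    Runtime.evaluate({expression, returnByValue: true}).then(({result}) => {
        const {price, name, rows} = result.value;
        console.log(price, name, rows.length);
        client.close();
    });
});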

Parameterized/Prepared Statements usage pg-promise

I'm using koa v2 with pg-promise. I'm trying to run a simple SELECT 2 + 2; as a Parameterized/Prepared Statement to test my setup:
// http://127.0.0.1:3000/sql/2
router.get('/sql/:id', async (ctx) => {
    await db.any({
        name: 'addition',
        text: 'SELECT 2 + 2;',
    })
        .then((data) => {
            console.log('DATA:', data);
            ctx.state = { title: data }; // => I want to return the number 4 instead of [Object Object]
        })
        .catch((error) => {
            console.log('ERROR:', error);
            ctx.body = '::DATABASE CONNECTION ERROR::';
        })
        .finally(pgp.end);
    await ctx.render('index');
});
This renders [Object Object] in the template and prints the following to the console via pg-monitor:
17:30:54 connect(postgres#postgres)
17:30:54 name="addition", text="SELECT 2 + 2;"
17:30:54 disconnect(postgres#postgres)
DATA: [ anonymous { '?column?': 4 } ]
My problem:
I want to store the result 4 in ctx.state, but I don't know how to access it within [ anonymous { '?column?': 4 } ].
Thank You for your help!
Edit:
I found other recommended ways (1, 2) of dealing with named parameters in the official wiki.
// http://127.0.0.1:3000/sql/2
router.get('/sql/:id', async (ctx) => {
    const obj = {
        id: parseInt(ctx.params.id, 10),
    };
    await db.result('SELECT ${id} + ${id}', obj)
        .then((data) => {
            console.log('DATA:', data.rows[0]['?column?']);
            ctx.state = { title: data.rows[0]['?column?'] }; // => 4
        })
        .catch((error) => {
            console.log('ERROR:', error);
            ctx.body = '::DATABASE CONNECTION ERROR::';
        })
        .finally(pgp.end);
    await ctx.render('index');
});
I changed db.any to db.result, which returns the raw result, and then I access the number 4 like a plain JavaScript object property. Am I doing something wrong? Is there another way to access this value?
What is the recommended, faster, safer way to do this?
Since you are requesting just one value, you should use method one:
const {value} = await db.one({
    name: 'addition',
    text: 'SELECT 2 + 2 as value',
}); // value = 4
And for such an example you cannot use the PreparedStatement or ParameterizedQuery types, because they format the query on the server side, and PostgreSQL does not support syntax like $1 + $1.
The real question is - do you really need those types?
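Putting the answer together with the route from the question's edit, here is a sketch (same koa/pg-promise setup as above) that uses db.one and a column alias, so the awkward '?column?' key disappears:
// http://127.0.0.1:3000/sql/2
router.get('/sql/:id', async (ctx) => {
    const obj = { id: parseInt(ctx.params.id, 10) };
    try {
        // alias the computed column so it can be destructured directly
        const { value } = await db.one('SELECT ${id} + ${id} AS value', obj);
        ctx.state = { title: value }; // => 4 for /sql/2
    } catch (error) {
        console.log('ERROR:', error);
        ctx.body = '::DATABASE CONNECTION ERROR::';
    }
    await ctx.render('index');
});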

Batch update in knex

I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was using node-mysql previously, which allows multiple statements. With that, I simply built a multiple-statement query string and sent it over the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately. That's bad because it means lots of round trips between the server and the DB.
I can use Knex's raw() and probably do something similar to what I did with node-mysql. However, that defeats the whole purpose of knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want to have partial updates in case something went wrong).
I resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
    const queries = [];
    users.forEach(user => {
        const query = connection('users')
            .where('id', user.id)
            .update({
                lastActivity: user.lastActivity,
                points: user.points,
            })
            .transacting(trx); // This makes every update be in the same transaction
        queries.push(query);
    });
    Promise.all(queries) // Once every query is written
        .then(trx.commit) // We try to execute all of them
        .catch(trx.rollback); // And rollback in case any of them goes wrong
});
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
    return knex.transaction(trx => {
        const queries = collection.map(tuple =>
            knex(table)
                .where('id', tuple.id)
                .update(tuple)
                .transacting(trx)
        );
        return Promise.all(queries)
            .then(trx.commit)
            .catch(trx.rollback);
    });
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
    return knex.transaction(trx => {
        const queries = collection.map(tuple =>
            knex(options.table)
                .where(options.column, tuple[options.column])
                .update(tuple)
                .transacting(trx)
        );
        return Promise.all(queries)
            .then(trx.commit)
            .catch(trx.rollback);
    });
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version:
const batchUpdate = async (options, collection) => {
    const { table, column } = options;
    const trx = await knex.transaction();
    try {
        await Promise.all(collection.map(tuple =>
            knex(table)
                .where(column, tuple[column])
                .update(tuple)
                .transacting(trx)
        ));
        await trx.commit();
    } catch (error) {
        await trx.rollback();
    }
}
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes you can run them in parallel, but your bottleneck becomes the time it takes for the db to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one object per row we want to update) whose property names line up with the columns in the database table you want to update:
var knex = require('knex'),
    _ = require('underscore');

function bulkUpdate (records) {
    var updateQuery = [
            'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
            _.map(records, () => '(?)').join(','),
            'ON DUPLICATE KEY UPDATE',
            'col2 = VALUES(col2),',
            'colN = VALUES(colN)'
        ].join(' '),
        vals = [];
    _(records).map(record => {
        vals.push(_(record).values());
    });
    return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
    { primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
    { /* some other record, same props */ }
];
Please note that if your records have additional properties beyond the ones you specified in the query, you cannot do:
_(records).map(record => {
    vals.push(_(record).values());
});
Because you will hand too many values to the query per record, and knex will fail to match each record's property values with the ? placeholders in the query. Instead, you will need to explicitly push, for each record, only the values you want to insert, like so:
// assume a record has an additional property `type` that you don't want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
    vals.push(record.primaryKeyCol);
    vals.push(record.col2);
    vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references (one possible variant is sketched below), but this is just an example. Hope this helps!
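A sketch of one such less repetitive variant: drive the pushes from a column whitelist instead of naming each property by hand. The column list here simply mirrors the columns used in the query above:
// only these columns end up in vals, regardless of extra properties on each record
var columns = ['primaryKeyCol', 'col2', 'colN'];
records.forEach(record => {
    columns.forEach(col => vals.push(record[col]));
});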
This solution works great for me! I just included an ID parameter to make it dynamic across tables with custom ID columns. Chenhai, here's my snippet, including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
    return knex.transaction((trx) => {
        const queries = collection.map(async (tuple) => {
            const [tupleId] = await knex(table)
                .where(`${id}`, tuple[id])
                .update(tuple)
                .transacting(trx)
                .returning(id);
            return tupleId;
        });
        return Promise.all(queries).then(trx.commit).catch(trx.rollback);
    });
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, e.g. 1000 rows per batch.
As long as it is done in batches, bluebird's Promise.map can be used to control concurrency.
For more information on bluebird map: http://bluebirdjs.com/docs/api/promise.map.html
const Promise = require('bluebird'); // bluebird's Promise.map is used below

const limit = 1000;
const totalRows = 50000;
// 0, 1, 2, ... one page index per batch
const seq = count => Array.from(Array(Math.ceil(count / limit)).keys());

const updateTable = async (dbTable, page) => {
    const offset = limit * page;
    return knex(dbTable).pluck('id').limit(limit).offset(offset)
        .then(ids => {
            return knex(dbTable)
                .whereIn('id', ids)
                .update({ date: new Date() })
                .then((rows) => {
                    console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
                })
                .catch((err) => {
                    console.log({ err });
                });
        })
        .catch((err) => {
            console.log({ err });
        });
};

Promise.map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Here pluck() is used to get the ids as an array.
