How to pass an SQL function in a bulk insert query in node - javascript

I have to execute an INSERT INTO query using mysql or mysql2 in node.js.
The values passed into the query use a spatial geometry function, ST_GeomFromGeoJSON(), on the geoMap column, which is of type GEOMETRY.
Here is the simplified code:
const insertQuery = `INSERT INTO maps (name, geoMap) VALUES ?`;
const insertData = [];
geoMapList.forEach((geoMap) => {
  insertData.push([
    geoMap.name,
    `ST_GeomFromGeoJSON(${JSON.stringify(geoMap.map)}, 2)`
  ]);
});
this.connection.query(insertQuery, [insertData]);
The above code does not work and throws the error:
Cannot get geometry object from data you send to the GEOMETRY field
I believe this is because ST_GeomFromGeoJSON() is not parsed as a function call but as a string by MySQL.
How can this be resolved?

You can't put MySQL function calls in the parameter array; that data is all treated as literals.
Call the function in the SQL, with the JSON string as its parameter.
I don't think you can use node-mysql's bulk insert for this, though, so you'll need to insert each row separately.
const insertQuery = `INSERT INTO maps(name, geoMap) VALUES (?, ST_GeomFromGeoJSON(?))`
geoMapList.forEach((geomap) =>
  this.connection.query(insertQuery, [geomap.name, JSON.stringify(geomap.map)])
);
To avoid calling query() thousands of times, you can put multiple values lists in the query.
const values = Array(geoMapList.length).fill('(?, ST_GeomFromGeoJSON(?))').join(',');
const insertQuery = `INSERT INTO maps(name, geoMap) VALUES ${values}`;
const params = geoMapList.flatMap(({name, map}) => [name, JSON.stringify(map)]);
this.connection.query(insertQuery, params);
Since this will create a very long INSERT query, make sure you increase the MySQL max_allowed_packet setting. You can also split geoMapList into smaller batches, rather than doing everything in a single query.
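Splitting into batches can be sketched like this (a minimal sketch; the chunk helper and the batch size of 500 are my own choices, not part of mysql2):

```javascript
// Split an array into chunks of at most `size` elements.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// Hypothetical usage: one multi-row INSERT per batch of 500 rows.
// for (const batch of chunk(geoMapList, 500)) {
//   const values = Array(batch.length).fill('(?, ST_GeomFromGeoJSON(?))').join(',');
//   const params = batch.flatMap(({name, map}) => [name, JSON.stringify(map)]);
//   await this.connection.promise().query(
//     `INSERT INTO maps(name, geoMap) VALUES ${values}`, params);
// }
```

Each batch stays well under max_allowed_packet while still avoiding a round trip per row.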


TypeORM createQueryBuilder: Is it possible to use a table name as a string instead of an entity?

In my current scenario, a third party will be creating tables dynamically in my DB, storing the table name as a varchar and the column names as jsonb in another table, which is defined as an Entity in my NestJS backend.
This is so I can keep track of (and query) these tables, since I have no control over their creation.
For this purpose I'd like to use TypeORM's createQueryBuilder instead of raw queries, as it's easier for me to work with the abstraction.
As far as I know, TypeORM's createQueryBuilder needs a defined Entity in the from clause.
Something like this:
return await getManager()
  .createQueryBuilder()
  .select('*')
  .from(MyDefinedModel, 'modelAlias')
  .getRawMany();
So I'd like to do something like:
const tableName = await getDynamicallyGenetaredTableNames().getFirstOne()
// now tableName points to a string that is a table name, i.e. 'table-data-192239'
return await getManager()
  .createQueryBuilder()
  .select('*')
  .from(tableName, 'tableAlias')
  .getRawMany();
By passing the table name I point to the right table, but TypeORM (and TypeScript) complains because tableName is a string, not an Entity (class) type.
I really don't want to type-cast and start doing nasty things if there is a cleaner way to achieve this.
I didn't find any solution in the official docs
Any brilliant ideas out there?
Thanks, y'all
We can pass a table name instead of an Entity to getRepository:
let tableName = 'user';
let query = getRepository(tableName).createQueryBuilder(tableName);
You can select from a table by its table name without defining an entity before like this:
const res = await manager
  .createQueryBuilder()
  .select()
  .from("tableName", null)
  .where("tableName.id = :id", { id: 1 })
  .getRawMany();
Be sure that you set the second parameter of the from() explicitly to null.
You could try using a raw query:
import { getManager } from 'typeorm'

const tableName = await getDynamicallyGenetaredTableNames().getFirstOne()
// Use the table name in a raw SQL query
const result = await getManager().query(`SELECT * FROM ${tableName}`)
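Since tableName is interpolated directly into the SQL here, it's worth guarding against injection before running the raw query. A minimal sketch (the whitelist pattern is an assumption about what the generated names look like; adjust it to match reality):

```javascript
// Allow only letters, digits, underscores and dashes in a table name,
// so a value like "users; DROP TABLE x" is rejected before it reaches SQL.
function assertSafeTableName(name) {
  if (!/^[A-Za-z0-9_-]+$/.test(name)) {
    throw new Error(`Unsafe table name: ${name}`);
  }
  return name;
}

// Hypothetical usage before the raw query:
// const result = await getManager().query(
//   `SELECT * FROM "${assertSafeTableName(tableName)}"`);
```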

update column values with call to javascript function using knex update

I'm anonymising data using knex migrations. I'm using the faker library to generate random street names that will replace the real street names. I need the knex update statement to call the faker.address.streetAddress() function the way it would call any other MySQL function (RAND(), for example).
The workaround I have found is significantly less performant.
const faker = require("faker/locale/de");

exports.up = function (knex) {
  // generate a random character in the set ('A', 'B', 'C')
  return knex("Address").update(
    "street",
    knex.raw("CHAR(FLOOR(65 + RAND() * 3))")
  );

  // alternative is not very performant
  return knex("Address")
    .pluck("id")
    .then(function (ids) {
      return Promise.all(
        ids.map((id) =>
          knex("Address")
            .where({ id: id })
            .update({ street: faker.address.streetAddress() })
        )
      );
    });
};

exports.down = function (knex) {};
You simply cannot call the backend's JavaScript functions from the SQL server side to generate data.
If you want to randomize every street name in a single query, each with a different value, you cannot use faker (or any other JavaScript-side library); you need to generate the fake address on the DB side (as you do in the first RAND() example).
That can be done, for example, by creating an SQL procedure for address generation and using it instead of faker. The address generator is probably easy to write using some predefined word lists, randomizing names from those, and adding a street number, etc.
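Even without a stored procedure, the word-list idea can be sketched by building a single MySQL expression from JavaScript and passing it through knex.raw (a sketch; the word lists and the use of MySQL's ELT() and RAND() are my own, not from faker):

```javascript
// Build a MySQL expression that yields a random street address,
// e.g. "42 Oak Street", entirely on the DB side.
function randomStreetSql(names, suffixes) {
  const quote = (w) => `'${w.replace(/'/g, "''")}'`;
  // ELT(n, a, b, ...) returns the n-th argument; pair it with RAND()
  // to pick one word per row.
  const pick = (words) =>
    `ELT(FLOOR(1 + RAND() * ${words.length}), ${words.map(quote).join(', ')})`;
  return `CONCAT(FLOOR(1 + RAND() * 200), ' ', ${pick(names)}, ' ', ${pick(suffixes)})`;
}

// Hypothetical usage in the migration, updating every row in one statement:
// return knex('Address').update('street',
//   knex.raw(randomStreetSql(['Oak', 'Main', 'Elm'], ['Street', 'Avenue'])));
```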

NodeJS, sqlite3, and WHERE IN parameters

Does the sqlite3 library for NodeJS support parameters for WHERE IN queries?
I have the following small program.
const sqlite3 = require('sqlite3');
const db = new sqlite3.Database('./data/data.db');

const sql = `
  SELECT * FROM accounts
  WHERE
    name IN ('Business Expense - Internet Service', 'RCSB Checking')`;

db.all(sql,
  function getAccout(error, rows) {
    console.log(error);
    console.log(rows);
  });
The program works: it queries the database, and the rows variable is successfully populated with data from two separate rows.
However, I don't want hard-coded values; I want to parameterize them. The following code works for that.
const sql = `
  SELECT * FROM accounts
  WHERE
    name = ? OR name = ?`;

db.all(sql,
  ['Business Expense - Internet Service', 'RCSB Checking'],
  function getAccout(error, rows) {
    console.log(error);
    console.log(rows);
  });
However, I want this to work with any number of names. i.e. I want to parameterize the original WHERE IN query. However, the following program
const sql = `
  SELECT * FROM accounts
  WHERE
    name IN (?)`;

db.all(sql,
  [['Business Expense - Internet Service', 'RCSB Checking']],
  function getAccout(error, rows) {
    console.log(error);
    console.log(rows);
  });
does not work. It runs without error but returns zero rows. I also tried using a concatenated string as the parameter
const sql = `
  SELECT * FROM accounts
  WHERE
    name IN (?)`;

db.all(sql,
  ["'Business Expense - Internet Service','RCSB Checking'"],
  function getAccout(error, rows) {
    console.log(error);
    console.log(rows);
  });
and this also didn't work.
Does the sqlite3 library support parameterizing WHERE IN queries? If so, what's the syntax for doing so? If not, are there common workarounds in the NodeJS community?
I don't know if there is support for it, but in case there is not:
const params = [...]
const sql = `
  SELECT * FROM accounts
  WHERE
    name IN (${new Array(params.length).fill('?').join(',')})`;
I was going to suggest using JS string literals as well. Seems more like the 'node way' of doing things.
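Putting the placeholder trick together with the parameter list, a complete call can be sketched like this (the placeholders helper is my own name, not part of the sqlite3 API):

```javascript
// Build "?,?,?" with exactly one placeholder per value,
// so the parameter array lines up with the IN (...) list.
function placeholders(values) {
  return values.map(() => '?').join(',');
}

// Hypothetical usage with sqlite3's db.all:
// const names = ['Business Expense - Internet Service', 'RCSB Checking'];
// const sql = `SELECT * FROM accounts WHERE name IN (${placeholders(names)})`;
// db.all(sql, names, (error, rows) => console.log(error, rows));
```

Only the placeholder string is built dynamically; the values themselves still go through the driver's parameter binding, so there is no injection risk.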

get records after creation using sequelize raw query

I am using sequelize for a migration. Here I execute an INSERT query with the following options, but it does not return the created records:
const res = oldUsers.map(u => sequelize.query(
  `INSERT INTO ${LP_LOCATIONS_TABLE} (name, address, city)
   VALUES ('${u.email}', '${u.address}', '${u.city}');`,
  { type: DataTypes.QueryTypes.INSERT, raw: true },
))
The output is an array of arrays like below:
[ [[0],[1]] ]
I expect to get the created records, especially the PK. How can I fix it?
I forgot to put RETURNING * at the end of the raw SQL query.
From the docs, you may have to specify the option returning: true to make this happen. I use MySQL, so I can't test (the returning option is only for Postgres).
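For Postgres, appending RETURNING * to the statement makes the insert hand back the created row, including the generated PK. A sketch of building such a query (the helper name is mine; the commented usage with sequelize's named replacements is an assumption, shown instead of interpolating values into the string):

```javascript
// Build a parameterized INSERT that returns the created row (Postgres only).
// Uses :name style named placeholders, which sequelize fills from `replacements`.
function insertReturningSql(table, columns) {
  const cols = columns.join(', ');
  const params = columns.map((c) => `:${c}`).join(', ');
  return `INSERT INTO ${table} (${cols}) VALUES (${params}) RETURNING *;`;
}

// Hypothetical usage:
// const [rows] = await sequelize.query(
//   insertReturningSql(LP_LOCATIONS_TABLE, ['name', 'address', 'city']),
//   { replacements: { name: u.email, address: u.address, city: u.city },
//     type: QueryTypes.INSERT });
```

Using replacements instead of template-literal interpolation also removes the SQL injection risk present in the original snippet.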

Scan multiple email id from table - DynamoDB

I'm using @awspilot/dynamodb to fetch data from a customer table where customer_id is the primary key.
I need to get the customer id based on multiple customer_email values.
dynamodb
  .table('bc_customer')
  .select('customer_id')
  .having('email').eq('test@gmail.com')
  .scan(function (err, data) {
    console.log(data);
  });
The above code allows me to pass a single email; is there any way to search on multiple emails?
DynamoDB supports query, which allows you to get data from one and only one partition (i.e. one partition key), or scan which returns every item in the table (i.e. a full table scan).
If you know the partition keys to query, it will be faster to do multiple queries and combine your result set. If you don't care about performance or you are happy with the speed (bear in mind a scan will scale poorly as your table grows) you can use a scan.
Note that above you are actually doing a scan, so you are not using your partition key as an index.
A query would be like this:
DynamoDB
  .table('bc_customer')
  .where('email').eq('test@gmail.com')
  .query(function (err, data) {
    console.log(err, data)
  })
And a scan (I think - the awspilot documents are not too clear) should be something like:
DynamoDB
  .table('bc_customer')
  .having('email').eq('test@gmail.com')
  .having('someattribute').eq('something')
  .scan(function (err, data) {
    console.log(err, data)
  })
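If the emails are known up front, running one lookup per email and merging the result sets is usually faster than a scan (assuming email is queryable, e.g. via an index). Merging can be sketched like this (findByEmails and queryByEmail are hypothetical names; the commented awspilot usage is an assumption based on the snippets above):

```javascript
// Run one lookup per email in parallel and flatten the results
// into a single array.
async function findByEmails(emails, queryByEmail) {
  const results = await Promise.all(emails.map(queryByEmail));
  return results.flat();
}

// Hypothetical usage, wrapping the callback-style awspilot query in a Promise:
// const rows = await findByEmails(
//   ['a@example.com', 'b@example.com'],
//   (email) => new Promise((resolve, reject) =>
//     DynamoDB.table('bc_customer').where('email').eq(email)
//       .query((err, data) => (err ? reject(err) : resolve(data)))));
```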
