update column values with call to javascript function using knex update - javascript

I'm anonymising data using knex migrations. I'm using the faker library to generate random street names to replace the real ones. I need the knex update statement to call faker.address.streetAddress() the way it would call any other MySQL function (RAND(), for example).
The workaround I have found is significantly less performant.
const faker = require("faker/locale/de");

exports.up = function (knex) {
  // Fast: generate a random character in the set ('A', 'B', 'C') on the DB side
  return knex("Address").update(
    "street",
    knex.raw("CHAR(FLOOR(65 + RAND() * 3))")
  );

  // Alternative (unreachable as written) is not very performant:
  // one UPDATE per row, each with a faker-generated street name
  return knex("Address")
    .pluck("id")
    .then(function (ids) {
      return Promise.all(
        ids.map((id) =>
          knex("Address")
            .where({ id: id })
            .update({ street: faker.address.streetAddress() })
        )
      );
    });
};

exports.down = function (knex) {};

You simply cannot call the backend's JavaScript functions from the SQL server side to generate data.
If you want to randomize every street name in a single query, each with a different value, you cannot use faker (or any other JavaScript-side library); you need to generate the fake address on the DB side (as you already do in your first RAND() example).
That can be done, for example, by creating an SQL procedure for address generation and using it instead of faker. Such an address generator is fairly easy to write: pick random entries from a few predefined word lists and append a random street number.
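For instance, a minimal sketch of a DB-side generator written directly into the migration, assuming MySQL; the street-name word list and the house-number range are placeholder assumptions:
exports.up = function (knex) {
  // ELT(n, s1, s2, s3) returns the n-th string; FLOOR(1 + RAND() * 3) yields 1..3.
  // CONCAT then appends a random house number between 1 and 200.
  return knex("Address").update(
    "street",
    knex.raw(
      "CONCAT(ELT(FLOOR(1 + RAND() * 3), 'Hauptstrasse', 'Bahnhofstrasse', 'Gartenweg'), ' ', FLOOR(1 + RAND() * 200))"
    )
  );
};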

Related

How to pass an SQL function on bulk insert query in node

I have to execute an INSERT INTO query using mysql or mysql2 in Node.js.
The values passed into the query use the spatial geometry function ST_GeomFromGeoJSON() on the geoMap column, which is of type GEOMETRY.
Here is the simplified code:
const insertQuery = `INSERT INTO maps (name, geoMap) VALUES ?`;
const insertData = [];
geoMapList.forEach((geoMap) => {
  insertData.push([
    geoMap.name,
    `ST_GeomFromGeoJSON(${JSON.stringify(geoMap.map)}, 2)`
  ]);
});
this.connection.query(insertQuery, [insertData]);
The above code does not work and throws the error
Cannot get geometry object from data you send to the GEOMETRY field
I believe this is because ST_GeomFromGeoJSON() is treated by MySQL as a string rather than being parsed as a function call.
How can this be resolved?
You can't put MySQL function calls in the parameter array; that data is all treated as literals.
Call the function in the SQL, with the JSON string as its parameter.
I don't think you can use node-mysql's bulk-insert for this, though, so you'll need to insert each row separately.
const insertQuery = `INSERT INTO maps(name, geoMap) VALUES (?, ST_GeomFromGeoJSON(?))`;
geoMapList.forEach((geomap) =>
  this.connection.query(insertQuery, [geomap.name, JSON.stringify(geomap.map)])
);
To avoid calling query() thousands of times, you can put multiple values lists in the query.
const values = Array(geoMapList.length).fill('(?, ST_GeomFromGeoJSON(?))').join(',');
const insertQuery = `INSERT INTO maps(name, geoMap) VALUES ${values}`;
const params = geoMapList.flatMap(({name, map}) => [name, JSON.stringify(map)]);
this.connection.query(insertQuery, params);
Since this will create a very long INSERT query, make sure you increase MySQL's max_allowed_packet setting. You can also split geoMapList into smaller batches, rather than doing everything in a single query, as sketched below.
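A minimal sketch of that batching idea, assuming an arbitrary chunk size of 500 rows per INSERT:
const CHUNK_SIZE = 500;
for (let i = 0; i < geoMapList.length; i += CHUNK_SIZE) {
  const chunk = geoMapList.slice(i, i + CHUNK_SIZE);
  const values = Array(chunk.length).fill('(?, ST_GeomFromGeoJSON(?))').join(',');
  const sql = `INSERT INTO maps(name, geoMap) VALUES ${values}`;
  const params = chunk.flatMap(({ name, map }) => [name, JSON.stringify(map)]);
  this.connection.query(sql, params);
}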

How to read all nested collections of all users on firestore? [duplicate]

I thought I read that you can query subcollections with the new Firebase Firestore, but I don't see any examples. For example I have my Firestore setup in the following way:
Dances [collection]
    danceName
    Songs [collection]
        songName
How would I be able to query "Find all dances where songName == 'X'"
Update 2019-05-07
Today we released collection group queries, and these allow you to query across subcollections.
So, for example in the web SDK:
db.collectionGroup('Songs')
  .where('songName', '==', 'X')
  .get()
This would match documents in any collection where the last part of the collection path is 'Songs'.
Your original question was about finding dances where songName == 'X', and this still isn't possible directly; however, for each Song that matches you can load its parent, as sketched below.
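A minimal sketch of that parent lookup with the web SDK, using the Dances/Songs names from the question:
db.collectionGroup('Songs')
  .where('songName', '==', 'X')
  .get()
  .then((snapshot) =>
    // Each matching song lives at Dances/{danceId}/Songs/{songId};
    // ref.parent is the Songs collection, ref.parent.parent the owning Dance document.
    Promise.all(snapshot.docs.map((songDoc) => songDoc.ref.parent.parent.get()))
  )
  .then((danceDocs) =>
    danceDocs.forEach((doc) => console.log(doc.id, ' => ', doc.data()))
  );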
Original answer
This is a feature which does not yet exist. It's called a "collection group query" and would allow you to query all songs regardless of which dance contains them. This is something we intend to support but don't have a concrete timeline on when it's coming.
The alternative structure at this point is to make songs a top-level collection and store which dance a song belongs to as a property of the song; a rough sketch follows.
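A hedged sketch of that flat alternative; the danceId field name is an assumption, not part of the original schema:
// Songs as a root-level collection, each song pointing back at its dance.
db.collection('Songs')
  .where('songName', '==', 'X')
  .get()
  .then((snap) => snap.docs.forEach((d) => console.log(d.data().danceId)));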
UPDATE
Now Firestore supports array-contains
Having these documents
{danceName: 'Danca name 1', songName: ['Title1','Title2']}
{danceName: 'Danca name 2', songName: ['Title3']}
do it this way
collection("Dances")
.where("songName", "array-contains", "Title1")
.get()...
@Nelson.b.austin Since Firestore does not have that yet, I suggest you use a flat structure, meaning:
Dances = {
  danceName: 'Dance name 1',
  songName_Title1: true,
  songName_Title2: true,
  songName_Title3: false
}
With it structured that way, you can get it done:
var songTitle = 'Title1';
var dances = db.collection("Dances");
var query = dances.where("songName_"+songTitle, "==", true);
I hope this helps.
UPDATE 2019
Firestore has released Collection Group Queries. See Gil's answer above or the official Collection Group Query documentation.
Previous Answer
As stated by Gil Gilbert, it seems collection group queries are currently in the works. In the meantime it is probably better to use root-level collections and link between these collections using the documents' UIDs.
For those who don't already know, Jeff Delaney has some incredible guides and resources for anyone working with Firebase (and Angular) on AngularFirebase.
Firestore NoSQL Relational Data Modeling - Here he breaks down the basics of NoSQL and Firestore DB structuring
Advanced Data Modeling With Firestore by Example - These are more advanced techniques to keep in the back of your mind. A great read for those wanting to take their Firestore skills to the next level
What if you store songs as an object instead of as a collection? Each dance would then have a songs field of type Object (not a collection):
{
  danceName: "My Dance",
  songs: {
    "aNameOfASong": true,
    "aNameOfAnotherSong": true,
  }
}
then you could query for all dances with aNameOfASong:
db.collection('Dances')
  .where('songs.aNameOfASong', '==', true)
  .get()
  .then(function(querySnapshot) {
    querySnapshot.forEach(function(doc) {
      console.log(doc.id, " => ", doc.data());
    });
  })
  .catch(function(error) {
    console.log("Error getting documents: ", error);
  });
NEW UPDATE July 8, 2019:
db.collectionGroup('Songs')
  .where('songName', isEqualTo: 'X')
  .get()
I have found a solution.
Please check this.
var museums = Firestore.instance
    .collectionGroup('Songs')
    .where('songName', isEqualTo: "X");
museums.getDocuments().then((querySnapshot) {
  setState(() {
    songCounts = querySnapshot.documents.length.toString();
  });
});
You can then see the Data, Rules, Indexes and Usage tabs for your Cloud Firestore at console.firebase.google.com.
Finally, you should set up the index in the Indexes tab.
Fill in the collection ID and the field name there.
Then select the collection group option.
Enjoy it. Thanks
You can always search like this:
this.key$ = new BehaviorSubject(null);
return this.key$.switchMap(key =>
  this.angFirestore
    .collection("dances").doc("danceName").collection("songs", ref =>
      ref.where("songName", "==", "X")
    )
    .snapshotChanges()
    .map(actions => {
      if (actions.toString()) {
        return actions.map(a => {
          const data = a.payload.doc.data() as Dance;
          const id = a.payload.doc.id;
          return { id, ...data };
        });
      } else {
        return false;
      }
    })
);
Query limitations
Cloud Firestore does not support the following types of queries:
Queries with range filters on different fields.
Single queries across multiple collections or subcollections. Each query runs against a single collection of documents. For more information about how your data structure affects your queries, see Choose a Data Structure.
Logical OR queries. In this case, you should create a separate query for each OR condition and merge the query results in your app.
Queries with a != clause. In this case, you should split the query into a greater-than query and a less-than query. For example, although the query clause where("age", "!=", "30") is not supported, you can get the same result set by combining two queries, one with the clause where("age", "<", "30") and one with the clause where("age", ">", "30").
I'm working with Observables here and the AngularFire wrapper but here's how I managed to do that.
It's kind of crazy, I'm still learning about observables and I possibly overdid it. But it was a nice exercise.
Some explanation (not an RxJS expert):
songId$ is an observable that will emit ids
dance$ is an observable that reads that id and then gets only the first value.
it then queries the collectionGroup of all songs to find all instances of it.
Based on the instances it traverses to the parent Dances and gets their ids.
Now that we have all the Dance ids we need to query them to get their data. But I wanted it to perform well, so instead of querying one by one I batch them in buckets of 10 (the maximum Firestore will accept for an in query).
We end up with N buckets and need to do N queries on Firestore to get their values.
Once we do the queries on Firestore we still need to actually parse the data from the results.
And finally we can merge all the query results to get a single array with all the Dances in it.
// Imports added for clarity; exact paths depend on your RxJS / Firebase versions.
import { Observable, of } from 'rxjs';
import { take, switchMap, bufferCount, mergeMap, reduce } from 'rxjs/operators';
import * as firebase from 'firebase/app';

type Song = { id: string, name: string };
type Dance = { id: string, name: string, songs: Song[] };

const songId$: Observable<Song> = new Observable();

const dance$ = songId$.pipe(
  take(1), // Only take 1 song name
  switchMap(v =>
    // Query across the collectionGroup to get all instances.
    this.db.collectionGroup('songs', ref => ref.where('id', '==', v.id)).get()
  ),
  switchMap(v => {
    // Map each Song to its parent Dance and return the Dance ids.
    const obs: string[] = [];
    v.docs.forEach(docRef => {
      // We invoke parent twice to go from doc -> collection -> doc.
      obs.push(docRef.ref.parent.parent.id);
    });
    // Because we return an array here, this one emit becomes N emits.
    return obs;
  }),
  // Firestore's IN operator supports up to 10 values, so we partition the ids.
  bufferCount(10),
  mergeMap(v => { // Query every partition in parallel.
    return this.db.collection('dances', ref => {
      return ref.where(firebase.firestore.FieldPath.documentId(), 'in', v);
    }).get();
  }),
  switchMap(v => {
    // Almost there; now just extract the data from the QuerySnapshots.
    const obs: Dance[] = [];
    v.docs.forEach(docRef => {
      obs.push({
        ...docRef.data(),
        id: docRef.id
      } as Dance);
    });
    return of(obs);
  }),
  // And finally we reduce the docs fetched into a single array.
  reduce((acc, value) => acc.concat(value), []),
);
const parentDances = await dance$.toPromise();
I copy-pasted my code and changed the variable names to yours, so I'm not sure if there are any errors, but it worked fine for me. Let me know if you find any errors or can suggest a better way to test it, maybe with some mock Firestore.
var songs = [];
db.collection('Dances')
  .where('songs.aNameOfASong', '==', true)
  .get()
  .then(function(querySnapshot) {
    var songLength = querySnapshot.size;
    var i = 0;
    querySnapshot.forEach(function(doc) {
      songs.push(doc.data());
      i++;
      if (songLength === i) {
        console.log(songs);
      }
      console.log(doc.id, " => ", doc.data());
    });
  })
  .catch(function(error) {
    console.log("Error getting documents: ", error);
  });
It could be better to use a flat data structure.
The docs specify the pros and cons of different data structures on this page.
Specifically about the limitations of structures with sub-collections:
You can't easily delete subcollections, or perform compound queries across subcollections.
Contrasted with the purported advantages of a flat data structure:
Root-level collections offer the most flexibility and scalability, along with powerful querying within each collection.

Programmatically add products to a cart – Odoo 13

I have a custom module that creates a form. Based on the answers inside this form I generate an order line. After the user submits this form I create a sale order with all the products from the generated order line.
So from JavaScript I send a JSON with the products to buy:
const order_data = [{product_id: 1, amount: 10, …}, {product_id: 2, …}, …];
const note = '';
this._rpc({
  route: '/api/create_order',
  params: { order_products: order_data, note: note }
}).then((data) => {
  window.location = '/contactus-thank-you';
}).catch((error) => {
  console.error(error);
});
And then in Python I create a sale order based on that JSON:
@http.route('/api/create_order', type='json', auth='user', website=True)
def create_order(self, **kw):
    uid = http.request.env.context.get('uid')
    partner_id = http.request.env['res.users'].search([('id', '=', uid)]).partner_id.id
    order_products = kw.get('order_products', [])
    note = kw.get('note', '')
    order_line = []
    for product in order_products:
        amount = 0
        if 'custom_amount' in product:
            amount = product['custom_amount']
        else:
            amount = product['amount']
        if amount > 0:
            order_line.append(
                (0, 0, {
                    'product_id': product['product_id'],
                    'product_uom_qty': amount,
                }))
    order_data = {
        'name': http.request.env['ir.sequence'].with_user(SUPERUSER_ID).next_by_code('sale.order') or _('New'),
        'partner_id': partner_id,
        'order_line': order_line,
        'note': note,
    }
    result_insert_record = http.request.env['sale.order'].with_user(SUPERUSER_ID).create(order_data)
    return result_insert_record.id
But instead of generating the sale order directly I need to use the workflow from Odoo's eCommerce addon. That way the user can, for example, edit the delivery address, choose a payment method, etc. So I think I just need to programmatically put all the products inside a cart, and the rest will be taken care of by Odoo's built-in functionality.
But how? I've tried to find something in the Odoo source code, but it is quite hard to grasp anything.
Odoo uses a typical Sale Order for handling the products inside a cart, but the process isn't as simple as just creating a Sale Order with some products. Odoo needs to know which order is linked with which cart, etc.
Luckily Odoo has a method for dealing with this: sale_get_order() lets you get the order currently linked with the cart, or create a new one if there isn't any.
I'm not sure if it is documented anywhere outside the source code, so here is a slice of the code (/addons/website_sale/models/website.py):
def sale_get_order(self, force_create=False, code=None, update_pricelist=False, force_pricelist=False):
    """ Return the current sales order after modifications specified by params.
    :param bool force_create: Create sales order if not already existing
    :param str code: Code to force a pricelist (promo code)
                     If empty, it's a special case to reset the pricelist with the first available else the default.
    :param bool update_pricelist: Force to recompute all the lines from sales order to adapt the price with the current pricelist.
    :param int force_pricelist: pricelist_id - if set, we change the pricelist with this one
    :returns: browse record for the current sales order
    """
    # ...
I'm using it alongside another method, _cart_update(), which lets me easily update the products inside this order. There is also sale_reset(), and I'm using it just to make sure the current session is updated with the particular sale order every time.
sale_order = request.website.sale_get_order(force_create=True)
request.website.sale_reset()
sale_order.write({'order_line': [(5, 0, 0)]})
for product in order_products:
    sale_order._cart_update(product_id=product['product_id'], line_id=None, add_qty=None, set_qty=product['amount'])

In sails/waterline get maximum value of a column in a database agnostic way

While using Sails as an ORM (version 1.0), I notice that there is a function called Model.avg (as well as sum). However, there is no maximum or minimum function to get the maximum or minimum of a column in a model; is that because it is already covered by other functions?
Now in my database I need to get the "maximum id" in a list; and I have it working for postgresql by using a native query:
const maxnum = await Order.getDatastore().sendNativeQuery('SELECT MAX(\"orderNr\") FROM \"order\"')
While this isn't the most difficult thing, it is not what I truly want: it is limited to SQL-based datastores (so we wouldn't be able to move easily to MongoDB), and the syntax might even differ between SQL database types.
So I wonder - can this be transformed in such a way it doesn't rely on sendNativeQuery?
You can try .query() to execute a raw SQL query using the specified model's datastore, and if you want you can try pg, an npm package used for communicating with PostgreSQL databases:
Pet.query('SELECT pet.name FROM pet WHERE pet.name = $1', [ 'dog' ],
  function(err, rawResult) {
    if (err) { return res.serverError(err); }
    sails.log(rawResult);
    // (The result format depends on the SQL query that was passed in
    // and the adapter you're using.)
    // Then parse the raw result and do whatever you like with it.
    return res.ok();
  });
You can use the limit and sort options Waterline provides to get a single record with the maximal value (then just extract that value).
const [orderModel] = await Order.find({
  where: {},
  select: ['orderNr'],
  limit: 1,
  sort: 'orderNr DESC'
});
console.log(orderModel.orderNr);
Like most things in Waterline, it's probably not as efficient as a SQL SELECT MAX query (or some equivalent in Mongo, etc.), but it should allow swapping out the database with no maintenance. Last note: don't forget to handle the case where no models are found, as in the sketch below.
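A hedged sketch of that empty-table case, assuming 0 is an acceptable fallback when no orders exist:
const [latest] = await Order.find({
  select: ['orderNr'],
  limit: 1,
  sort: 'orderNr DESC'
});
// latest is undefined when the table is empty, so fall back to a default.
const maxOrderNr = latest ? latest.orderNr : 0;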

Scan multiple email id from table - DynamoDB

I'm using @awspilot/dynamodb to fetch data from a customer table where customer_id is the primary key.
I need to get the customer id based on multiple customer_email values.
dynamodb
  .table('bc_customer')
  .select('customer_id')
  .having('email').eq('test@gmail.com')
  .scan(function(err, data) {
    console.log(data);
  });
The above code allows me to pass a single email id; is there any way to search on multiple email ids?
DynamoDB supports query, which allows you to get data from one and only one partition (i.e. one partition key), or scan, which returns every item in the table (i.e. a full table scan).
If you know the partition keys to query, it will be faster to do multiple queries and combine your result sets. If you don't care about performance, or you are happy with the speed (bear in mind a scan will scale poorly as your table grows), you can use a scan.
Note that above you are actually doing a scan, so you are not using your partition key as an index.
A query would be like this:
DynamoDB
  .table('bc_customer')
  .where('email').eq('test@gmail.com')
  .query(function(err, data) {
    console.log(err, data);
  })
And a scan (I think - the awspilot documents are not too clear) should be something like:
DynamoDB
  .table('bc_customer')
  .having('email').eq('test@gmail.com')
  .having('someattribute').eq('something')
  .scan(function(err, data) {
    console.log(err, data);
  })
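For the multiple-email case, a hedged sketch that simply runs one scan per address (reusing the awspilot calls shown in the question) and combines the results; the exact awspilot API surface may differ:
const emails = ['test@gmail.com', 'other@gmail.com'];
Promise.all(
  emails.map((email) =>
    new Promise((resolve, reject) =>
      dynamodb
        .table('bc_customer')
        .select('customer_id')
        .having('email').eq(email)
        .scan((err, data) => (err ? reject(err) : resolve(data)))
    )
  )
).then((results) => console.log([].concat(...results)));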
