Scan multiple email IDs from a table - DynamoDB - JavaScript

I'm using @awspilot/dynamodb to fetch data from a customer table where customer_id is the primary key.
I need to get the customer_id based on multiple customer_email values.
dynamodb
    .table('bc_customer')
    .select('customer_id')
    .having('email').eq('test@gmail.com')
    .scan(function (err, data) {
        console.log(data);
    });
The above code allows me to pass a single email ID; is there any way to search on multiple email IDs?

DynamoDB supports query, which lets you fetch data from one and only one partition (i.e. one partition key), and scan, which reads every item in the table (i.e. a full table scan).
If you know the partition keys to query, it will be faster to run multiple queries and combine the result sets. If you don't care about performance, or you are happy with the speed (bear in mind a scan scales poorly as the table grows), you can use a scan.
Note that above you are actually doing a scan, so you are not using your partition key as an index.
A query would be like this:
DynamoDB
    .table('bc_customer')
    .where('email').eq('test@gmail.com')
    .query(function (err, data) {
        console.log(err, data);
    });
And a scan (I think - the awspilot documents are not too clear) should be something like:
DynamoDB
    .table('bc_customer')
    .having('email').eq('test@gmail.com')
    .having('someattribute').eq('something')
    .scan(function (err, data) {
        console.log(err, data);
    });
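To combine multiple queries, one email per query, a sketch like this should work (the emails array and the Promise wrapper are illustrative, building on the query form above):
var emails = ['test@gmail.com', 'other@gmail.com'];
Promise.all(emails.map(function (email) {
    return new Promise(function (resolve, reject) {
        DynamoDB
            .table('bc_customer')
            .where('email').eq(email)
            .query(function (err, data) {
                if (err) return reject(err);
                resolve(data);
            });
    });
})).then(function (results) {
    // Flatten the per-email result sets into one combined array
    var combined = [].concat.apply([], results);
    console.log(combined);
}).catch(function (err) {
    console.log(err);
});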

Node SQL Server IN query not working with request params

I want to run a query in Node SQL Server that uses an IN clause. The string used for querying is 'a','b','c'. This code works fine, but since the user supplies the data I can't use it as-is; it may lead to SQL injection attacks:
const dbResult = await request.query(`
    SELECT OrderID, ParentSKURefNum, SKURefNum, OrderCompleteTime
    FROM ${tables.ORDERS}
    WHERE OrderID IN (${idsWithQuotes})
`);
I want to use request.input('OrderIDs', ids), so the code would look like this:
request.input('OrderIDs', ids);
const dbResult = await request.query(`
    SELECT OrderID, ParentSKURefNum, SKURefNum, OrderCompleteTime
    FROM ${tables.ORDERS}
    WHERE OrderID IN (@OrderIDs)
`);
But the code above always shows: No data found. What am I doing wrong? In the second version I also tried removing the first and last quote from the string, assuming request adds them automatically.
Thanks for your help!
I'm using SQL Server 2012, which doesn't support the STRING_SPLIT function for splitting a CSV string into a table that the IN operator can then work on.
I found on Stack Overflow that the values can be split using XML, which I didn't fully understand but which did the trick:
SELECT OrderID, ParentSKURefNum, SKURefNum, OrderCompleteTime
FROM ${tables.ORDERS}
WHERE OrderID IN (
    SELECT Split.a.value('.', 'NVARCHAR(MAX)') DATA
    FROM
    (
        SELECT CAST('<X>' + REPLACE(@OrderIDs, ',', '</X><X>') + '</X>' AS xml) AS STRING
    ) AS A
    CROSS APPLY STRING.nodes('/X') AS Split(a)
)
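On the Node side, the CSV string can then be bound as a single parameter, so no user input is concatenated into the SQL. A sketch, assuming the mssql package and that ids is an array of ID strings:
const sql = require('mssql');
// Bind the whole CSV as one NVARCHAR parameter; the XML trick splits it server-side
request.input('OrderIDs', sql.NVarChar, ids.join(','));
const dbResult = await request.query(`
    SELECT OrderID, ParentSKURefNum, SKURefNum, OrderCompleteTime
    FROM ${tables.ORDERS}
    WHERE OrderID IN (
        SELECT Split.a.value('.', 'NVARCHAR(MAX)')
        FROM (
            SELECT CAST('<X>' + REPLACE(@OrderIDs, ',', '</X><X>') + '</X>' AS xml) AS STRING
        ) AS A
        CROSS APPLY STRING.nodes('/X') AS Split(a)
    )
`);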

How to sort the relations returned using get work items API?

I am writing a TFS extension in JavaScript where I am using the 'getWorkItem' function from the 'TFS/WorkItemTracking/RestClient' library.
wiRestClient.getWorkItem(<workItemID>, null, null, Contracts.WorkItemExpand.All)
    .then(function success(workItem) {
        console.log(workItem);
    });
(screenshot of the logged work item omitted)
The work item is a PBI with about 40 tasks within it, and its relations are fetched in arbitrary order by the API.
Is there a way to fetch these relations in the order of their IDs?
I process the relations returned in the result, fetch the ID from each forward relation, get the workItemId, and add it to an array. This array then holds information about all the child work items of the parent PBI.
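Roughly, the collection step looks like this (a sketch; the relation type name and the id-at-the-end-of-the-URL convention are assumptions about the payload shape):
// Collect the ids of child work items from the forward relations
var childIds = workItem.relations
    .filter(function (rel) {
        return rel.rel === 'System.LinkTypes.Hierarchy-Forward';
    })
    .map(function (rel) {
        // The work item id is the last segment of the relation URL
        return parseInt(rel.url.split('/').pop(), 10);
    });
// Each id is then fetched with getWorkItem and its fields pushed into
// the childWorkItems array used below.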
I tried to sort this array based on System.Id in the fields property. This is the function I use to sort the data:
childWorkItems.sort(function (a, b) {
    return a["System.Id"] > b["System.Id"];
});
console.log(childWorkItems);
This doesn't seem to work; the array is still in random order.
I solved it by changing the sort function to
childWorkItems.sort(function (a, b) {
    return a["System.Id"] - b["System.Id"];
});
console.log(childWorkItems);
This works because a comparator passed to Array.prototype.sort must return a negative, zero, or positive number; the earlier version returned a boolean, which doesn't give sort enough information to order the elements.

In sails/waterline get maximum value of a column in a database agnostic way

While using Sails as an ORM (version 1.0), I notice that there is a function called Model.avg (as well as sum). However, there is no maximum or minimum function to get the maximum or minimum of a column in a model; does this mean it is considered unnecessary because it is covered by other functions already?
Now in my database I need to get the maximum id in a list, and I have it working for PostgreSQL by using a native query:
const maxnum = await Order.getDatastore().sendNativeQuery('SELECT MAX("orderNr") FROM "order"')
While this isn't the most difficult thing, it is not what I truly want: it is limited to SQL-based datastores (so we wouldn't be able to move easily to MongoDB), and the syntax might differ for another SQL database type.
So I wonder: can this be transformed in such a way that it doesn't rely on sendNativeQuery?
You can try .query() to execute a raw SQL query using the specified model's datastore, and if you want you can try pg, an NPM package used for communicating with PostgreSQL databases:
Pet.query('SELECT pet.name FROM pet WHERE pet.name = $1', [ 'dog' ],
    function (err, rawResult) {
        if (err) { return res.serverError(err); }
        sails.log(rawResult);
        // (result format depends on the SQL query that was passed in,
        // and the adapter you're using)
        // Then parse the raw result and do whatever you like with it.
        return res.ok();
    });
You can use the limit and sort options Waterline provides to get a single record holding the maximal value (then just extract that value). Note that .find() resolves to an array, so take its first element:
const [orderModel] = await Order.find({
    where: {},
    select: ['orderNr'],
    limit: 1,
    sort: 'orderNr DESC'
});
console.log(orderModel.orderNr);
Like most things in Waterline, it's probably not as efficient as an SQL SELECT MAX query (or some equivalent in Mongo, etc.), but it should allow swapping out the database with no maintenance. Last note: don't forget to handle the case where no records are found.
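A minimal guard for the empty-table case (a sketch building on the query above; the fallback value 0 is an assumption about what makes sense for the application):
const [orderModel] = await Order.find({
    select: ['orderNr'],
    limit: 1,
    sort: 'orderNr DESC'
});
// Fall back to a default when the table is empty
const maxOrderNr = orderModel ? orderModel.orderNr : 0;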

MongoDb bulk insert limit issue

I'm new to Mongo and Node. I was trying to upload a CSV into MongoDB.
Steps include:
Reading the CSV.
Converting it into JSON.
Pushing it to MongoDB.
I used the 'csvtojson' module to convert the CSV to JSON and pushed it using this code:
MongoClient.connect('mongodb://127.0.0.1/test', function (err, db) { // connect to mongodb
    var collection = db.collection('qr');
    collection.insert(jsonObj.csvRows, function (err, result) {
        console.log(JSON.stringify(result));
        console.log(JSON.stringify(err));
    });
    console.log("successfully connected to the database");
    // db.close();
});
This code works fine with CSV files up to about 4 MB; beyond that it fails.
I tried to log the error with
console.log(JSON.stringify(err));
and it returned {}.
Note: mine is a 32-bit system.
Is it because there is a document size limit of 4 MB on 32-bit systems?
I'm in a scenario where I can't restrict the size and number of attributes in the CSV file (i.e., the code will be handling various kinds of CSV files). How do I handle that? Are there any modules available?
If you are not having a problem parsing the CSV into JSON, which presumably you are not, then perhaps just restrict the size of the list being passed to insert.
As I can see, the .csvRows element is an array, so rather than sending all of the elements at once, slice it up and batch the elements in the call to insert. It seems likely that the number of elements, rather than their size, is the cause of the problem, so splitting the array into several inserts rather than one should help.
Experiment with 500, then 1000, and so on until you find a happy medium.
Sort of coding it:
var batchSize = 500;
for (var i = 0; i < jsonObj.csvRows.length; i += batchSize) {
    // slice() excludes the end index, so this takes exactly batchSize rows
    var docs = jsonObj.csvRows.slice(i, i + batchSize);
    collection.insert(docs, function (err, result) {
        // Don't JSON.stringify the error; log it directly
        console.log(err);
        // Whatever
    });
}
Processing the rows in chunks like this keeps each insert call to a manageable size.
You can make the data into an array of elements and then simply use the MongoDB insert function, passing this array to it.
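With a newer driver, a batched variant using insertMany might look like this (a sketch; assumes a driver version that provides insertMany and async/await support):
const batchSize = 500;
async function insertInBatches(collection, rows) {
    for (let i = 0; i < rows.length; i += batchSize) {
        // Each call stays well within the driver's per-request limits
        await collection.insertMany(rows.slice(i, i + batchSize));
    }
}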

How do I get a model from a Backbone.js collection by its id?

In my app, everything I do with data is based on the primary key, since that is how the data is stored in the database. I would like to grab a model from a collection based on this key.
Using Collection.at() requires the array index, and Collection.getByCid() requires the client ID that Backbone randomly generates.
What is the best way to grab the model I want from the collection given its id value? I figure the worst I could do would be to iterate over each item, call .get('id'), and return the match.
Take a look at the get method; it may be of some help :)
http://backbonejs.org/#Collection-get
get collection.get(id)
Get a model from a collection, specified by an id, a cid, or by passing in a model.
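For example (a minimal sketch; the library collection and the id 488 are illustrative):
// Look up a model by the id attribute stored in the database
var book = library.get(488);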
If your data requires you to use a different kind of key, or a set of criteria that doesn't mesh well with at(), getByCid() or get(), there is also where(). Something like this might work:
window.lib = new Library;
window.lib.fetch({
    success: function (model, response) {
        console.log(window.lib.where({ 'BookID': 488, 'Rev': 2, 'Status': 'Active' }));
    }
});
