I have tried various ways to insert the data (see the // comments), but it still doesn't insert in either Chromium or Firefox (on Ubuntu).
Full example:
```
<!doctype html>
<html><head>
<meta charset="UTF-8">
<script type = "text/javascript" src="jquery-1.11.0.js"></script>
<script type = "text/javascript" src="jquery.indexeddb.js"></script>
</head><body><script type = "text/javascript">
var key = null;
// Simply open the database once so that it is created with the required tables
$.indexedDB("BibleWay", {
  "schema": {
    "1": function(versionTransaction){
      var catalog = versionTransaction.createObjectStore("context", {
        keyPath: 'keyPath', autoIncrement: true
      });
      catalog.createIndex("bid");
      catalog.createIndex("bk");
      catalog.createIndex("c");
      catalog.createIndex("v");
      catalog.createIndex("t");
    }
  }
}).done(function(){
  // Once the DB is opened with the object stores set up, show data from all tables
  window.setTimeout(function(){
    downloadCatalog();
  }, 200);
});
function downloadCatalog(){
  $.getJSON("3.json", function(data){
    $.indexedDB("BibleWay").transaction("context").then(function(){
      console.log("Transaction completed, all data inserted");
      // loadFromDB("catalog");
    }, function(err, e){
      console.log("Transaction NOT completed", err, e);
    }, function(transaction){
      var catalog = transaction.objectStore("context"), $doadd, i2 = 0;
      catalog.clear();
      /*$.each(data, function(i){
        _(catalog.add(this));
      })*/
      $.each(data, function( index, value ) {
        // bible id [bid]
        var split_bid = index;
        $.each(value, function( index, value ) {
          // bible book name
          var split_bk = index;
          $.each(value, function( index, value ) {
            // bible book chapter
            var split_c = index;
            $.each(value, function( index, value ) {
              //var $doadd={"bid":split_bid,"bk":split_bk,"c"=split_c ,"v"=index,"t"=value};
              //$doadd="{bid:\""+split_bid + "\",bk:\"" + split_bk + "\",c=" + split_c + ",v=" + index + ",t=\"" + value+"\"}";
              $doadd = new Object();
              $doadd.bid = split_bid;
              $doadd.bk = split_bk;
              $doadd.c = split_c;
              $doadd.v = index;
              $doadd.t = value;
              catalog.add($doadd);
              if (i2 < 10) {
                console.log($doadd);
                ++i2;
              }
              //catalog.add(JSON.stringify($doadd));
              //catalog.add({bid:split_bid,bk:split_bk,c:split_c,v:index,t:value});
              //console.log(split_bid + " " + split_bk + " " + split_c + " " + index + ": " + value );
            })
          })
        })
      })
    });
  });
}
//$.indexedDB("BibleWay").deleteDatabase();
</script></body></html>
```
The JSON file "3.json":
{"3":{"GEN":{"1":{"1":"In the begynnynge God created heaven and erth."}}}}
Console Messages:
XHR finished loading: "http://host.host/3.json". jquery-1.11.0.js:9666
Object {bid: "3", bk: "GEN", c: "1", v: "1", t: "In the begynnynge God created heaven and erth."}
Transaction completed, all data inserted.
I first found a bug in the jquery-indexeddb plugin itself, which I fixed by commenting out line 90 of the plugin:
//e.name = "exception";
dfd.rejectWith(idbRequest, ["exception", e]);
This test is based on the code found in this example: http://nparashuram.com/jquery-indexeddb/example/
Thank you for your help.
It's important and common for a library to surface any underlying errors, so if you've got any error messages or other warnings, please update your question to include them. I am wondering in particular whether your transaction has autocommitted before the write was attempted.
I'm not too familiar with this API, but I see a common antipattern here: you are chaining your store creation and your data insertion. Unless your jQuery plugin is designed to handle that pattern, you could be running into trouble here.
In IDB, all action happens in a transaction. When you create or modify anything related to a store or index, you need what in IDB parlance is called a versionchange transaction. While such a transaction is capable of writing, reading and modifying schema, a phenomenon many developers notice is that it ends up "autocommitting" despite your best attempts to keep it alive using closures, which maintain a reference to the transaction and thereby should (I believe) keep it from committing.
In case that's the problem, I suggest you decompose your callback hell into smaller helldoms: one for creating the object store, one for inserting the objects. This tends to create two separate closures, one around the versionchange transaction and one around the readwrite transaction, and makes sure each operation completes successfully. A rough illustration follows.
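Here is a minimal sketch of that separation using the native IndexedDB API rather than the jQuery plugin. The database, store and index names come from your question; the hard-coded record just mirrors the single row in 3.json, and in practice you would loop over your parsed JSON inside the readwrite transaction:

```
// Sketch: schema work happens only in onupgradeneeded (the versionchange
// transaction); rows are inserted later in a separate readwrite transaction.
var openRequest = indexedDB.open("BibleWay", 1);

openRequest.onupgradeneeded = function (event) {
  var db = event.target.result;
  // versionchange transaction: schema changes only
  var store = db.createObjectStore("context", { keyPath: "keyPath", autoIncrement: true });
  store.createIndex("bid", "bid");
  store.createIndex("bk", "bk");
  store.createIndex("c", "c");
  store.createIndex("v", "v");
  store.createIndex("t", "t");
};

openRequest.onsuccess = function (event) {
  var db = event.target.result;
  // separate readwrite transaction: data insertion only
  var tx = db.transaction("context", "readwrite");
  var store = tx.objectStore("context");
  store.add({ bid: "3", bk: "GEN", c: "1", v: "1", t: "In the begynnynge God created heaven and erth." });
  tx.oncomplete = function () { console.log("All rows written"); };
  tx.onerror = function (e) { console.log("Write failed", e); };
};
```

The point is simply that nothing except createObjectStore/createIndex runs inside the upgrade callback, so the write can never race the versionchange transaction's autocommit.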
I am trying to develop an Excel JavaScript add-in that processes all selected ranges.
When selecting a large range with data, such as A1:O1000000, desktop Excel hangs when I call the ctx.sync method after getting the ctx.workbook.getSelectedRanges() object.
The problem also reproduces in the Excel 2016 and Excel 2019 desktop versions.
In Excel Online (web), this code works normally.
My Excel version is 16.0.12430.20172 (64-bit).
onTestClick(e)
{
  console.log("onTestClick");
  Excel.run(function (ctx) {
    let selectedRanges = ctx.workbook.getSelectedRanges();
    selectedRanges.load(["areas", "areaCount", "address", "addressLocal"]);
    selectedRanges.areas.load({ $all: false, address: true, addressLocal: true });
    return ctx.sync().then(function () {
      for (let index = 0; index < selectedRanges.areaCount; index++) {
        let area = selectedRanges.areas.items[index];
        console.log("area[" + index + "] = " + area.address);
      }
    }).catch(function (error) {
      console.log("onTestClick sync error: " + error);
    }).then(() => {
      console.log("onTestClick end");
    });
  }).catch(function (error) {
    console.log("onTestClick Excel.run error: " + error);
  });
}
In your script above, loading "areas" on selectedRanges is not necessary. Change "selectedRanges.load(["areas", "areaCount", "address", "addressLocal"]);" to "selectedRanges.load(["areaCount", "address", "addressLocal"]);" instead. With that change, the result is returned quickly.
The reason is that "areas" is an Excel.RangeCollection (https://learn.microsoft.com/en-us/javascript/api/excel/excel.rangecollection?view=excel-js-preview), i.e. the collection of rectangular ranges that make up the RangeAreas object. For a range like A1:O1000000, that return value is far too big to process.
Calling selectedRanges.load("areas") is equivalent to calling selectedRanges.areas.load(), which is an empty load. An empty load loads all scalar properties, which can have significant performance overhead.
References:
https://learn.microsoft.com/en-us/office/dev/add-ins/excel/performance#load-necessary-properties-only
https://learn.microsoft.com/en-us/office/dev/add-ins/excel/excel-add-ins-advanced-concepts#scalar-and-navigation-properties
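If it helps, here is a sketch of the question's handler with just that one change applied (the explicit load on the child RangeCollection is kept, so the per-area addresses are still available after sync):

```
// Sketch: load only the scalar properties you need on the RangeAreas,
// and request address/addressLocal explicitly on the child RangeCollection.
Excel.run(function (ctx) {
  let selectedRanges = ctx.workbook.getSelectedRanges();
  // "areas" removed here: loading it triggers an empty load of every
  // scalar property on the (potentially huge) RangeCollection.
  selectedRanges.load(["areaCount", "address", "addressLocal"]);
  selectedRanges.areas.load({ $all: false, address: true, addressLocal: true });
  return ctx.sync().then(function () {
    for (let index = 0; index < selectedRanges.areaCount; index++) {
      console.log("area[" + index + "] = " + selectedRanges.areas.items[index].address);
    }
  });
}).catch(function (error) {
  console.log("Excel.run error: " + error);
});
```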
I'm eventually trying to create a PHP script to update MySQL. As I go step by step, setting the variables and getting the JS to behave correctly, I'm having difficulty with changing the 'resource' rooms in the scheduler.
If I change the location of the event and change the room, I get a JS alert saying "Title was dropped on to [new time] from [old time]. Old room [oldresource]. New Room [newresource]". That's working well.
However, if I move the event to a location on the same day, I get errors, because info.oldResource and info.newResource are only available IF the event has moved to a NEW resource. If it moves within the same resource, those objects don't exist.
I tried working in an IF statement. The console log shows null, but my if statement is not stopping the rest of the code from running. This will eventually (I think) result in the commands not being run correctly once I pass them to PHP.
I plan on having an 'else' statement that does not include oldResource or newResource dialogue to process changes that stay within the same resource.
My code is as such:
eventDrop: function(info) {
  if (typeof info.oldResource !== "null") {
    var oll = console.log(info.oldResource);
    var nww = console.log(info.newResource);
    var oldroom = (info.oldResource.id);
    var newroom = (info.newResource.id);
    var newtitle = info.event.title;
    /*if (oldroom = newroom) {alert ("Same Room");}*/
    alert(newtitle + " was dropped on " + info.event.start.toString() + ". Old Time: " + info.oldEvent.start.toString() + ". From: " + oldroom + ". To: " + newroom);
    if (!confirm("Are you sure about this change?")) {
      info.revert();
    }
  }
},
You don't need typeof here. Just write
if (info.oldResource != null)
instead.
P.S. If you call typeof on a property that's set to null, I would expect the typeof call to return "object", not "null". But like I said, it's irrelevant because you don't need typeof here; just check for the value null directly.
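For completeness, here is a sketch of your handler with that check applied; the wording of the same-room branch is just an example of where your "else" logic would go:

```
eventDrop: function(info) {
  var newtitle = info.event.title;
  if (info.oldResource != null) {
    // Event moved to a different resource: oldResource and newResource both exist.
    var oldroom = info.oldResource.id;
    var newroom = info.newResource.id;
    alert(newtitle + " was dropped on " + info.event.start.toString() +
          ". Old Time: " + info.oldEvent.start.toString() +
          ". From: " + oldroom + ". To: " + newroom);
  } else {
    // Event moved within the same resource: no room change to report.
    alert(newtitle + " was moved to " + info.event.start.toString() +
          " (same room). Old Time: " + info.oldEvent.start.toString());
  }
  if (!confirm("Are you sure about this change?")) {
    info.revert();
  }
},
```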
I am experiencing a "callback hell" situation in Node.js.
Basically what I want is:
Read data from a static local JSON file --> query MongoDB to get two records from two separate collections --> compare the returned data --> add the comparison result to a result object --> go to the next step in the loop --> repeat.
Please review the code and let me know where the problem is.
jsonfile.readFile(file, function(err, staticData) {
  if (err) {
    console.log("Error while loading Tower Details from Static Data " + err);
  }
  else {
    var staticD = staticData.Teams;
    var l = staticData.Teams.length;
    // console.log('*******************Getting Tower Level Data from Static File*******************');
    //console.log('*******************Tower Name received is ******************* ' + staticData.Tower);
    if (counter == l) {
      console.log('Inside the couneter loop');
      res.json(testObject);
    }
    for (var i = 0; i < l; i++) {
      var trackName = staticD[i].name;
      console.log('Counter--------->>>>' + counter);
      //console.log("Team name " + staticD[i].name);
      ++counter;
      for (var j = 0; j < staticD[i].applications.length; j++) {
        //var RObj;
        //var AObj;
        //console.log("Application Name " + staticD[i].applications[j]);
        var applicationName = staticD[i].applications[j];
        var test = new Object();
        test.data = [];
        var resultSet;
        var response = reference.find({'appname': applicationName, 'track': trackName}).sort({'_id': -1});
        var promise = response.exec();
        var alertT = alert.find({'appname': applicationName, 'track': trackName}).sort({'_id': -1}).limit(1);
        var promise1 = alertT.exec();
        promise.then(function allRefRecords(recordAlerts) {
          if (recordAlerts.length > 0) {
            //console.log('Ref Length' + recordAlerts.length);
            recordAlerts.forEach(function refRecord(R) {
              testObject.data.testInfra.push(R);
              //console.log('testObject' + testObject.data.testInfra);
            });
          }
        });
        promise1.then(function allAlertsRecords(alerts) {
          if (alerts.length > 0) {
            alerts.forEach(function refRecord(a) {
              // console.log('a' + a)
              testObject.data.testCustom.push(a);
              //console.log('testObject' + testObject.data.testCustom);
              // res.json(testObject);
            });
          }
        })
        .then(function() {
          resultSet = compareData(testObject.data.testCustom, testObject.data.testInfra);
          test.data.push(resultSet);
        })
        .then(function() {
          res.json(test);
        });
      }
    }
  }
});
});
Don't nest functions, give them names and place them at the top level of your program. Use function hoisting to your advantage to move functions 'below the fold'. Handle every single error in every one of your callbacks and use a linter like standard to help you with this. Create reusable functions and place them in a module to reduce the cognitive load required to understand your code. Splitting your code into small pieces like this also helps you handle errors, write tests, forces you to create a stable and documented public API for your code, and helps with refactoring.
Source: http://callbackhell.com/
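As a small illustration of that advice applied to the shape of your code (the helper names here are hypothetical, just to show the structure):

```
// Hypothetical decomposition: each step gets a named, top-level function
// instead of an anonymous callback nested inside the previous one.
jsonfile.readFile(file, onStaticData);

function onStaticData(err, staticData) {
  if (err) return handleError(err);
  staticData.Teams.forEach(processTeam);
}

function processTeam(team) {
  team.applications.forEach(function (appName) {
    processApplication(team.name, appName);
  });
}

function processApplication(trackName, applicationName) {
  // query MongoDB, compare, collect results...
}

function handleError(err) {
  console.log("Error while loading static data: " + err);
}
```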
It is possible to avoid callback hell with async libraries, with promises, with better design, and in many other ways...
But 99% of the time, good design is the best answer (IMHO) and you don't need anything else.
Some links:
How to Elegantly Solve the Callback Hell
Avoiding Callback Hell in Node.js
Remember that callback hell is not inevitable ;)
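As an illustration of the design/promise approach, here is a rough sketch of the same flow using the promises that mongoose's exec() already returns. It assumes the reference and alert models and the compareData helper from your question, and that this runs inside your route handler where res is in scope; it collects one comparison per (track, application) pair and calls res.json() exactly once:

```
// Sketch only: one promise per (track, application) pair, gathered with
// Promise.all so res.json() runs once, after every query has finished.
function compareApplication(trackName, applicationName) {
  var refQuery = reference.find({ appname: applicationName, track: trackName }).sort({ _id: -1 }).exec();
  var alertQuery = alert.find({ appname: applicationName, track: trackName }).sort({ _id: -1 }).limit(1).exec();
  return Promise.all([refQuery, alertQuery]).then(function (results) {
    // results[0] = reference records, results[1] = latest alert record(s)
    return compareData(results[1], results[0]);
  });
}

jsonfile.readFile(file, function (err, staticData) {
  if (err) {
    console.log("Error while loading static data: " + err);
    return res.json({ error: String(err) });
  }
  var jobs = [];
  staticData.Teams.forEach(function (team) {
    team.applications.forEach(function (appName) {
      jobs.push(compareApplication(team.name, appName));
    });
  });
  Promise.all(jobs)
    .then(function (comparisons) { res.json({ data: comparisons }); })
    .catch(function (error) {
      console.log("Comparison failed: " + error);
      res.json({ error: String(error) });
    });
});
```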
To prevent further callback hell, you can look into the following libraries (a tiny Async.js sketch follows this list):
Async.js: lets you execute functions in series or in parallel without nesting them.
Bluebird: makes asynchronous logic more manageable with mapping and enqueueing.
Q: provides promises to manage nested calls with less effort.
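For example, a tiny Async.js sketch (items and doSomethingAsync are placeholders, not from the question):

```
// Run one async task per item without nesting, and get a single callback
// when they have all finished (or one of them fails).
var async = require("async");

async.each(items, function (item, done) {
  doSomethingAsync(item, function (err, result) {
    done(err); // call done exactly once per item
  });
}, function (err) {
  if (err) return console.log("A task failed:", err);
  console.log("All tasks completed");
});
```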
I'm trying to compare a new object with the original using a Cloud Code beforeSave function. I need to compare a field sent in the update with the existing value. The problem is that I can't fetch the object correctly: when I run the query, I always get the value from the sent object.
UPDATE: I tried a different approach and could get the old record (the one already saved in Parse). But the new one, sent in the request, was overridden by the old one. WHAT?! Another issue is that, even though the code called response.success(), the update wasn't saved.
I believe that I'm missing something pretty obvious here. Or I'm facing a bug or something...
NEW APPROACH
Parse.Cloud.beforeSave('Tasks', function(request, response) {
  if ( !request.object.isNew() ) {
    var Task = Parse.Object.extend("Tasks");
    var newTask = request.object;
    var oldTask = new Task();
    oldTask.set("objectId", request.object.id);
    oldTask.fetch()
      .then(function( oldTask ) {
        console.log(">>>>>> Old Task: " + oldTask.get("name") + " version: " + oldTask.get("version"));
        console.log("<<<<<< New Task: " + newTask.get("name") + " version: " + newTask.get("version"));
        response.success();
      }, function( error ) {
        response.error( error.message );
      });
  }
});
OBJ SENT {"name":"LLL", "version":333}
LOG
I2015-10-02T22:04:07.778Z]v175 before_save triggered for Tasks for user tAQf1nCWuz:
Input: {"original":{"createdAt":"2015-10-02T17:47:34.143Z","name":"GGG","objectId":"VlJdk34b2A","updatedAt":"2015-10-02T21:57:37.765Z","version":111},"update":{"name":"LLL","version":333}}
Result: Update changed to {}
I2015-10-02T22:04:07.969Z]>>>>>> Old Task: GGG version: 111
I2015-10-02T22:04:07.970Z]<<<<<< New Task: GGG version: 111
NOTE: I'm testing the calls via cURL and in the Parse console.
CloudCode beforeSave
Parse.Cloud.beforeSave("Tasks", function( request, response) {
  var query = new Parse.Query("Tasks");
  query.get(request.object.id)
    .then(function (oldObj) {
      console.log("-------- OLD Task: " + oldObj.get("name") + " v: " + oldObj.get("version"));
      console.log("-------- NEW Task: " + request.object.get("name") + " v: " + request.object.get("version"));
    }).then(function () {
      response.success();
    }, function ( error) {
      response.error(error.message);
    });
});
cURL request
curl -X PUT \
-H "Content-Type: application/json" \
-H "X-Parse-Application-Id: xxxxx" \
-H "X-Parse-REST-API-Key: xxxxx" \
-H "X-Parse-Session-Token: xxxx" \
-d "{\"name\":\"NEW_VALUE\", \"version\":9999}" \
https://api.parse.com/1/classes/Tasks/VlJdk34b2A
JSON Response
"updatedAt": "2015-10-02T19:45:47.104Z"
LOG
The log prints both the original and the new values, but I don't know how to access them from there either.
I2015-10-02T19:57:08.603Z]v160 before_save triggered for Tasks for user tAQf1nCWuz:
Input: {"original":{"createdAt":"2015-10-02T17:47:34.143Z","name":"OLD_VALUE","objectId":"VlJdk34b2A","updatedAt":"2015-10-02T19:45:47.104Z","version":0},"update":{"name":"NEW_VALUE","version":9999}}
Result: Update changed to {"name":"NEW_VALUE","version":9999}
I2015-10-02T19:57:08.901Z]-------- OLD Task: NEW_VALUE v: 9999
I2015-10-02T19:57:08.902Z]-------- NEW Task: NEW_VALUE v: 9999
After a lot of trial and error I figured out what was going on.
It turns out that Parse merges any objects with the same class and id into one instance. That is why I always got either the object stored in the DB or the one sent by the user, never both. I honestly can't make sense of such behavior, but anyway...
The Parse JavaScript SDK offers a method called Parse.Object.disableSingleInstance (link) that disables this "feature". But once the method is called, all objects already defined become undefined, including the sent object. Which means that you can't even keep the sent object around for later reference.
The only option was to save the keys and values of the sent object and recreate it later. So I needed to capture the request before calling disableSingleInstance, turn it into JSON, then disable single-instance mode, fetch the object saved in the DB, and recreate the sent object using the saved JSON.
It's not pretty and definitely isn't the most efficient code, but I couldn't find any other way. If someone out there has another approach, by all means tell me.
Parse.Cloud.beforeSave('Tasks', function(request, response) {
  if ( !request.object.isNew() ) {
    var id = request.object.id;
    var jsonReq;
    var Task = Parse.Object.extend("Tasks");
    var newTask = new Task();
    var oldTask = new Task();
    // getting new Obj
    var queryNewTask = new Parse.Query(Task);
    queryNewTask.get(id)
      .then(function (result) {
        newTask = result;
        // Saving values as JSON for later reference
        jsonReq = result.toJSON();
        // Disable the merge of objects w/ same class and id.
        // It will also undefine all Parse objects,
        // including the one sent in the request.
        Parse.Object.disableSingleInstance();
        // getting object saved in DB
        oldTask.set("objectId", id);
        return oldTask.fetch();
      }).then(function (result) {
        oldTask = result;
        // Recreating the new Task that was sent
        for (var key in jsonReq) {
          newTask.set(key, jsonReq[key]);
        }
        // Do your job here
      }, function (error) {
        response.error( error.message );
      });
  }
});
If I were you, I would pass in the old value as a parameter to the cloud function so that you can access it under request.params.(name of parameter). I don't believe that there is another way to get the old value. An old SO question said that you can use .get(), but you're claiming that that is not working. Unless you actually already had 9999 in the version...
Edit: I guess beforeSave isn't called like a normal function... so create an "update version" cloud function that passes in the current Task and the version you're trying to update to, perhaps? A rough sketch of that idea follows.
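For what it's worth, here is a minimal sketch of that idea using the same Cloud Code API as elsewhere in this thread. The function name updateVersion and its parameters are hypothetical, not part of any existing API:

```
// Hypothetical "updateVersion" cloud function: the client sends the task id,
// the version it last saw, and the new values, so the old value is available
// via request.params instead of having to be fetched inside beforeSave.
Parse.Cloud.define("updateVersion", function (request, response) {
  var query = new Parse.Query("Tasks");
  query.get(request.params.taskId).then(function (task) {
    // The stored value is still untouched here; the candidate new values
    // arrive separately in request.params.
    if (task.get("version") !== request.params.expectedVersion) {
      return Parse.Promise.error("Version conflict: the task is at version " + task.get("version"));
    }
    task.set("name", request.params.name);
    task.set("version", request.params.newVersion);
    return task.save();
  }).then(function () {
    response.success("Task updated");
  }, function (error) {
    response.error(error.message || error);
  });
});
```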
Rather than performing a query, you can see the modified attributes by checking which keys are dirty, meaning they have been changed but not saved yet.
The JS SDK includes dirtyKeys(), which returns the keys that have been changed. Try this out.
var attributes = request.object.attributes;
var changedAttributes = [];
for (var attribute in attributes) {
  if (request.object.dirty(attribute)) {
    changedAttributes.push(attribute);
    // request.object.get(attribute) has changed, so the key is pushed to the array
  }
}
For clarification, to get the original attribute's value, you will have to call get() to load those pre-save values. It should be noted that this will count as another API request.
Hey, this worked perfectly for me:
var dirtyKeys = request.object.dirtyKeys();
var query = new Parse.Query("Question");
var clonedData = null;
query.equalTo("objectId", request.object.id);
query.find().then(function(data){
  var clonedPatch = request.object.toJSON();
  clonedData = data[0];
  clonedData = clonedData.toJSON();
  console.log("this is the data : ", clonedData, clonedPatch, dirtyKeys);
  response.success();
}).then(null, function(err){
  console.log("the error is : ", err);
});
For those coming to this thread in 2021-ish: if you have the server data loaded in the client SDK before you save, you can resolve this issue by passing that server data from the client SDK in the context option of the save() function and then using it in the beforeSave / afterSave cloud functions.
// e.g. JS client SDK
const options = {
  context: {
    before: doc._getServerData() // object data, as loaded
  }
};
doc.save(null, options);

// #beforeSave cloud fn
Parse.Cloud.beforeSave(className, async (request) => {
  const { before } = request.context;
  // ... do something with before ...
});
Caveat: this wouldn't help you if you didn't already have the attributes loaded via _getServerData() in the client.
Second caveat: Parse will not handle (de)serialization for you in your cloud function, e.g.:
{
  before: { // < posted as context
    status: {
      is: 'atRisk',
      comment: 'Its all good now!',
      at: '2021-04-09T15:39:04.907Z', // string
      by: [Object] // pojo
    }
  },
  after: {
    status: { // < posted as doc's save data
      is: 'atRisk',
      comment: 'Its all good now!',
      at: 2021-04-09T15:39:04.907Z, // instanceOf Date
      by: [ParseUser] // instanceOf ParseUser
    }
  }
}
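So the cloud function has to rehydrate those values itself. A minimal sketch, assuming the hypothetical status shape shown above (className as in the earlier snippet):

```
// Sketch: values posted through context arrive as plain JSON, so dates,
// pointers, etc. must be rebuilt by hand before comparing them.
Parse.Cloud.beforeSave(className, async (request) => {
  const { before } = request.context;
  if (before && before.status) {
    const beforeAt = new Date(before.status.at);      // string -> Date
    const afterAt = request.object.get('status').at;  // already a Date
    if (beforeAt.getTime() !== afterAt.getTime()) {
      console.log('status.at changed from', beforeAt, 'to', afterAt);
    }
  }
});
```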
First off: I'm new to Node, and a relative programming beginner.
I'm trying to create a small web app with Express, whose only goal is to fetch and reformat data from a website that doesn't have an open API.
To do so, I've decided to learn about scraping, and that brought me to Cheerio and Request.
I'm using reddit as an example, to learn on. The end goal in this example is to gather the name and href of the posts on the front page as well as the url leading to the comments, then to go on that page to scrape the number of comments.
What follows is the route that is called on a GET request to / (please excuse the variable names and the comments/console.logs, I got frustrated):
/*
 * GET home page.
 */
exports.index = function(req, res){
  var request = require('request')
    , cheerio = require('cheerio')
    , mainArr = []
    , test = "test"
    , uI
    , commentURL;

  function first() {
    request("http://www.reddit.com", function(err, resp, body) {
      if (!err && resp.statusCode == 200) {
        var $ = cheerio.load(body);
        $('.thing', '#siteTable').each(function(){
          var url = $('a.title', this).attr('href')
            , title = $('a.title', this).html()
            , commentsLink = $('a.comments', this).attr('href')
            , arr = [];
          arr.push(title);
          arr.push(url);
          arr.push(commentsLink);
          mainArr.push(arr);
        });
        second();
      };
    });
  }

  function second() {
    for (i = mainArr.length - 1; i >= 0; i--) {
      uI = mainArr[i].length - 1;
      commentURL = mainArr[i][uI];
      console.log(commentURL + ", " + uI + ", " + i);
      var foo = commentURL;
      request(foo, function(err, resp, body) {
        console.log("what the shit");
        // var $ = cheerio.load(body);
        // console.log(mainArr.length + ", " + commentURL + ", " + i + ", " + uI);
        // var test = $('span.title', 'div.content').html();
        console.log(test + ", "+ foo + ", " + commentURL + ", " + i + ", " + uI);
        // mainArr[1][2] = test;
      });
    };
    if (i<=0) {
      res.render('index', {title: test});
    };
  }

  first();
};
The function first() works as intended. It puts the title, the href and the URL to the comments in an array, then pushes that array into a master array containing those data points for all of the posts on the front page. It then calls the function second().
That function's goal is to loop through the master array (mainArr[]), select each of the URLs leading to comments (mainArr[i][uI]) and launch a request() with that URL as its first parameter.
The loop works, but during the second call of request() inside the second() function, everything breaks down. The variable i gets stuck permanently at -1, and commentURL (the variable that is set to the URL of the comments of the current post) is permanently defined as the first URL in mainArr[]. There are also weird behaviors with mainArr.length: depending on where I place it, it tells me that mainArr is undefined.
I have a feeling that I'm missing something obvious (probably to do with asynchronicity), but for the life of me, I can't find it.
I would be really grateful for any suggestions!
You are correct in your guess: it's the infamous "JavaScript loop gotcha". See here, for example, for an explanation:
Javascript infamous Loop issue?
Besides that, it seems that only your debug prints are affected. The commented-out code regarding var test ought to work.
Finally, that kind of language is frowned upon on SO; you would do well to take two minutes and change your variable names in this post.
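In case it helps, here is a sketch of second() rewritten with forEach so each iteration keeps its own copy of commentURL and the index, plus a counter so res.render() only runs after the last response has arrived. It assumes the function still sits inside exports.index where mainArr, res and test are in scope; the posts key passed to the template is just an example:

```
// Sketch: forEach gives every iteration its own closure over commentURL / i,
// and the pending counter delays rendering until all requests have finished.
function second() {
  var pending = mainArr.length;
  mainArr.forEach(function (post, i) {
    var commentURL = post[post.length - 1];
    request(commentURL, function (err, resp, body) {
      if (!err && resp.statusCode == 200) {
        var $ = cheerio.load(body);
        // e.g. read something from the comments page and store it on this post
        post.push($('span.title', 'div.content').html());
      }
      console.log(i + ": " + commentURL);
      if (--pending === 0) {
        res.render('index', { title: test, posts: mainArr });
      }
    });
  });
}
```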