Asynchronous AJAX calls in ordered manner - javascript

Hi, firstly sorry for my bad English. I already searched on SO, but I didn't get the exact answer I needed.
My issue is that I need to synchronize my Ajax requests. I know we can use "asynch : false", but that locks the browser. I have a folder tree in my web app (I am using the "tafel tree" JS library). The tree nodes are generated at run time: each time
the user clicks a node, it sends a request to the server and adds the node to the tree.
The issue is that if the page is refreshed (F5), I need to reload the tree structure that was previously selected.
I implemented this using "asynch : false", but it makes the browser too slow.
Here is what I have:
function loadPage() {
    /* some other code is here */
    /* here I make an Ajax call to get the list of inner folders in the correct order (I am using Prototype) */
    new Ajax.Request(ajaxGetInsideFolderIdsUrl, {
        parameters: { branchId: CurrentSelectfolderId },
        onSuccess: function(res) {
            /* on success, call another function */
            var brs = res.responseText.split(","); // branch ids in correct order
            syncFolder(brs);
        }
    });
}
function syncFolder(brs) {
    for (var i = 0; i < brs.length; i++) {
        var tempbranch = tree.getBranchById(brs[i]); // brs[i] is already an id string
        selectAfterChange(tempbranch);
        /*
        selectAfterChange selects the current branch; it calls the "tafel tree" select() function inside.
        I created a copy of the "select()" and "_openPopulate()" functions used in "tafel tree" and modified
        them with "asynch : false", and now it works fine.
        */
    }
}
function selectAfterChange(branch) {
    branch.chk_select();
    /* selects the branch (I created a copy of the current "select()" function used in the "tafel tree" js
       and modified it with "asynch : false", and now it works fine) */
    showList(); // lists the items inside that folder (in another Ajax page)
}
My problem is that if a user has opened a long branch and then refreshes the page, it takes a long time to load because of the synchronous Ajax calls.
Taking a long time is not a big issue for me, but the browser stays locked until all the requests have executed.
Is there any other way to do this?

I'm not familiar with the tree library you're using, but in general, the way you'd solve this is to store the currently expanded path(s) in the browser (local storage, or cookies, or wherever), and when you refresh the page, load just the nodes that are visible.
Let's say that the user is currently looking at path one/two/three/four. You save that path somewhere, and then when the page loads again, you create a queue of paths to request from the back end, by splitting the path, and appending the components of the path one by one:
["one", "one/two", "one/two/three"]
You then send the AJAX request for "one", and when you get the result back, update that node in the tree, and send a request for "one/two". When that request returns, update the tree, and send the request for "one/two/three".
Depending on your design, you can then start filling in the rest of the tree with more async requests...
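A minimal sketch of that chaining, using Prototype's Ajax.Request like your code does (the endpoint, parameter name, and openNode helper here are assumptions, not your actual API):
// Hypothetical helper: expand/select one node in the tree for the given
// path, e.g. via your tafel tree copies of select()/showList().
function openNode(path, response) { /* update the tree for this path */ }

function restoreExpandedPath(fullPath) {
    var parts = fullPath.split("/");
    var queue = [];
    for (var i = 1; i < parts.length; i++) {
        queue.push(parts.slice(0, i).join("/")); // ["one", "one/two", "one/two/three"]
    }
    (function next() {
        if (queue.length === 0) return; // done: the tree is restored
        var path = queue.shift();
        new Ajax.Request(ajaxGetInsideFolderIdsUrl, { // assumed endpoint
            parameters: { branchPath: path },         // assumed parameter
            onSuccess: function(res) {
                openNode(path, res);
                next(); // only request the next level once this one is done
            }
        });
    })();
}
Because each request is fired from the previous one's onSuccess callback, the requests stay strictly ordered, yet the browser never blocks between them.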

Related

Attempting to use a global array inside of a JS file shared between 2 HTML files and failing

So I have one HTML page which consists of a bunch of form elements for the user to fill out. I push all the selections that the user makes into one global variable, allTheData[], inside my only JavaScript file.
Then I have a 2nd HTML page which loads in after a user clicks a button. This HTML page is supposed to take some of the data inside the allTheData array and display it. I am calling the function to display allTheData by using:
window.onload = function () {
    if (window.location.href.indexOf('Two') > -1) {
        carousel();
    }
}
function carousel() {
    console.log("oh");
    alert(allTheData.toString());
}
However, I am finding that nothing gets displayed on my 2nd HTML page, and the allTheData array appears to be empty despite having been filled on the 1st HTML page. I am pretty confident that I am correctly pushing data into the allTheData array, because when I use alert(allTheData.toString()) while I'm still on my 1st HTML page, all the data gets displayed.
I think something happens during the transition from the 1st to the 2nd HTML page that causes the allTheData array to be emptied, but I am not sure what it is. Please help a newbie out!
Web Storage: This sounds like a job for the window.sessionStorage object, which along with its cousin window.localStorage allows data-as-strings to be saved in the user's browser for use across pages on the same domain.
However, keep in mind that they are both Cookie-like features and therefore their effectiveness depends on the user's Cookie preference for each domain.
A simple condition will determine if the web storage option is available, like so...
if (window.sessionStorage) {
    // continue with app ...
} else {
    // inform user about web storage
    // and ask them to accept Cookies
    // before reloading the page (or whatever)
}
Saving to and retrieving from web storage requires conversion to-and-from String data types, usually via JSON methods like so...
// save to...
var array = ['item0', 'item1', 2, 3, 'IV'];
sessionStorage.myApp = JSON.stringify(array);
// retrieve from...
var array = JSON.parse(sessionStorage.myApp);
There are more specific methods available than these. Further details, compatibility tables, etc. are in Using the Web Storage API on MDN.
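Applied to your case, a minimal sketch (assuming both pages include the same script and that allTheData contains only JSON-serializable values; goToPageTwo and pageTwo.html are made-up names):
// Page 1: persist the array right before navigating away.
function goToPageTwo() {
    sessionStorage.allTheData = JSON.stringify(allTheData);
    window.location.href = 'pageTwo.html'; // hypothetical second page
}

// Page 2: restore the array before using it.
window.onload = function () {
    if (window.location.href.indexOf('Two') > -1) {
        allTheData = JSON.parse(sessionStorage.allTheData || '[]');
        carousel();
    }
};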
Hope that helps. :)

Converting text to hyperlinks on the fly or store them in a DB?

I have a comments section on a webpage (similar to the one on SO) where people can leave text but also links. It works using a standard text field which does not allow HTML to be included. There are no formatting options etc.
Because no HTML is allowed, I need to parse links (basically text that begins with http or www) and then wrap them in an <a> tag.
To do this on the server would mean using something like JSoup to parse the text and then do the wrapping before inserting it, with the rest of the comment text, into my DB.
Alternatively, I was thinking that jQuery could scan ALL the comments on the page and wrap anything beginning with http or www in the <a> tag.
Which one is the correct/better/more efficient method? I have a hunch that the jQuery way will cause some page slowdowns, because it's updating the DOM constantly on the fly, but I would like some confirmation from an expert!
When you consider you might at some point want to parse markdown or replace emoticons with images etc., thinking about the performance impact is a good decision.
Usually, you have 3 options:
1. Transform hyperlinks in the backend
I'll assume PHP here, but the principle stays the same:
function renderComment($comment) {
    $commentHtml = transformLinks($comment->comment_text);
    // render $commentHtml
}
This has the disadvantage of having to replace texts for every comment, on every page request, and is therefore not recommended. On the plus side, compared to option 2, you only have to store the comment text once.
2. Transform hyperlinks in the backend and store them in the database
Assuming PHP and MySQL:
CREATE TABLE comments (id ..., comment_text TEXT, comment_html TEXT, ...)
                                                  ^^^^^^^^^^^^
function saveComment($comment) {
    $comment->comment_html = transformLinks($comment->comment_text);
    saveToDatabase($comment);
}
function renderComment($comment) {
    $commentHtml = $comment->comment_html;
    // render $commentHtml
}
This means you have to store the comment text twice in the database, once as text, once as html - assuming you want some sort of "edit" button; if not, only saving the html is fine.
While taking up a bit more database space, this solution only fetches the pre-rendered HTML from the database and is therefore better performance-wise.
3. Transforming hyperlinks in the front-end
jQuery(document).ready(function() {
    jQuery('.comment').each(function(index, commentElement) {
        // Do magic
        transformCommentToHTML(commentElement);
    });
});
This should be fine performance-wise when you only have a few hundred comments.
If there are thousands of comments loaded into the DOM at once (which might be a performance hit in itself, but let's ignore that for now), the parsing might be noticeable, since it happens synchronously and blocks the browser.
In such a case you can parse/replace one batch of comments at a time (e.g. 300) and return control to the browser between batches.
3.b Transform hyperlinks in the frontend, batch-by-batch
jQuery(document).ready(function() {
    var $comments = jQuery('.comment');
    var currentComment = 0;
    (function transformCommentBatch() {
        for (var batchLimit = currentComment + 300; currentComment < $comments.length && currentComment < batchLimit; currentComment++) {
            // Do magic
            transformCommentToHTML($comments[currentComment]);
        }
        if (currentComment < $comments.length) {
            // Don't freeze the browser, continue in the next frame
            setTimeout(transformCommentBatch, 1);
        }
    }());
});
That way, the browser can handle events and does not appear "frozen" to the user, while the first 300 comments in the DOM are transformed first - these are most likely at the top of the page and the only ones visible on page load.
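As for the "Do magic" part, in all the frontend variants transformCommentToHTML can be a simple regex replacement. A minimal sketch (the regex is deliberately simplistic, my assumption rather than a battle-tested URL matcher):
function escapeHtml(text) {
    // Escape user text first so it cannot inject markup.
    return text.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
}

function transformCommentToHTML(commentElement) {
    var $el = jQuery(commentElement);
    var html = escapeHtml($el.text()).replace(
        // Match tokens starting with http(s):// or www. up to the next whitespace.
        /(^|\s)((?:https?:\/\/|www\.)\S+)/g,
        function(match, lead, url) {
            var href = url.indexOf('http') === 0 ? url : 'http://' + url;
            return lead + '<a href="' + href + '">' + url + '</a>';
        }
    );
    $el.html(html);
}
The same function works server-side in spirit: transformLinks in options 1 and 2 would apply an equivalent escape-then-replace in PHP.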

Parsing a large JSON array in Javascript

I'm supposed to parse a very large JSON array in JavaScript. It looks like:
mydata = [
    {'a': 5, 'b': 7, ... },
    {'a': 2, 'b': 3, ... },
    ...
]
Now the thing is, if I pass this entire array to my parsing function parseJSON(), then of course it works, but it blocks the tab's process for 30-40 seconds (in the case of an array with 160000 objects).
During this entire process of requesting the JSON from the server and parsing it, I'm displaying a 'loading' gif to the user. Of course, after I call the parse function, the gif freezes too, leading to a bad user experience. I guess there's no way to get around the time itself, but is there a way to somehow (at least) keep the loading gif from freezing?
Something like calling parseJSON() on chunks of my JSON every few milliseconds? I'm unable to implement that, though, being a noob in JavaScript.
Thanks a lot, I'd really appreciate if you could help me out here.
You might want to check this link. It's about multithreading.
Basically:
var url = 'http://bigcontentprovider.com/hugejsonfile';
// Build the worker source as a string: the worker fetches the JSON via
// importScripts (JSONP-style) and posts the result back to the main thread.
var f = '(function() {' +
        '    send = function(e) {' +
        '        postMessage(e);' +
        '        self.close();' +
        '    };' +
        '    importScripts("' + url + '?format=json&callback=send");' +
        '})();';
var _blob = new Blob([f], { type: 'text/javascript' });
var _worker = new Worker(window.URL.createObjectURL(_blob));
_worker.onmessage = function(e) {
    // Do what you want with your JSON (it arrives in e.data)
};
// No postMessage needed: the worker script runs as soon as it is created.
Haven't tried it myself to be honest...
EDIT about portability: Sebastien D. posted a comment with a link to MDN; I just added a reference to the compatibility section there.
I have never encountered a complete page lockdown of 30-40 seconds; I'm almost impressed! Restructuring your data to be much smaller, or splitting it into many files on the server side, is the real answer. Do you actually need every little byte of the data?
Alternatively, if you can't change the file, @Cyrill_DD's worker-thread answer will be able to parse the data for you and send it to your primary JS. This is not a perfect fix, as you might guess. Passing data between the two threads requires the information to be serialised and reinterpreted, so you could see a significant slowdown when the data crosses between the threads, and be back to square one again if you try to pass all the data across at once. Building a query system into your worker thread for requesting chunks of the data when you need them, and using the message callback, will prevent the slowdown from parsing on the main thread and give you complete access to the data without loading it all into your main context; see the sketch below.
I should add that worker threads are relatively new; mainstream browser support is good, but mobile is terrible... just a heads-up!
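A minimal sketch of such a chunk-query protocol (the message shapes, file name, and render function are my assumptions, not an established API):
// ----- worker.js (assumed filename): parse once, serve slices on demand -----
var data = null;
onmessage = function(e) {
    var msg = e.data;
    if (msg.type === 'load') {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', msg.url, false); // synchronous is fine off the main thread
        xhr.send();
        data = JSON.parse(xhr.responseText); // the expensive parse happens here
        postMessage({ type: 'ready', length: data.length });
    } else if (msg.type === 'getChunk') {
        // Only a small slice is serialized across the thread boundary.
        postMessage({ type: 'chunk', start: msg.start, items: data.slice(msg.start, msg.start + msg.count) });
    }
};

// ----- main.js: the loading gif keeps animating while the worker parses -----
var worker = new Worker('worker.js');
worker.onmessage = function(e) {
    if (e.data.type === 'ready') {
        worker.postMessage({ type: 'getChunk', start: 0, count: 500 });
    } else if (e.data.type === 'chunk') {
        render(e.data.items); // hypothetical rendering function
    }
};
worker.postMessage({ type: 'load', url: '/hugejsonfile' });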

Losing context reference

I am encountering a 'small' problem when making a new object in the options page.
In the options page I create a few objects and save them as general settings. These objects have methods to communicate with different APIs. But as soon as I get one of those objects to work with, I lose the context of the page I am on.
For example:
On the options page I create an object that has a 'request' method, and I send an ajax request to some API with this method. When I call this on another page, the ajax request is logged within the options page. When I close the options page, I lose all context of the logs it makes.
Is there a way to force the context reference to the current page? Or did I make a mistake in creating objects on the wrong pages, saving them, and retrieving them on a page that needs them? (I.e. should I only save the data I need and create the objects on the page itself? That seems like a lot of overhead for the same thing.)
Thanks in advance.
EDIT
I have an options page which creates an object, let's call it MyApi. I create a new MyApi and store it in chrome.storage.local. When a user has some text selected and clicks the context menu, I open a new page (selectedText.html) which shows the selected text, and some API calls are made, which are mostly ajax requests. The moment I get the object from storage in selectedText.html and make any request with MyApi, I see no logs of the ajax requests in the network tab, nor any console logs. But with my options page open I see everything there.
EDIT2
save: function()
{
    var obj = { 'api': this.data };
    chrome.storage.local.set(obj, function() {
        if (chrome.runtime.lastError) console.warn(chrome.runtime.lastError);
    });
}
This is in the background script.
You could achieve what you want this way:
Define your object/function/whatever in the background page.
Use chrome.runtime.getBackgroundPage() to get hold of the background page's window object.
Execute the desired method, passing as argument an object of the local context. (Of course you have to modify the function to accept and make use of such an argument.)
Example:
In background.js:
function myFunc(winObj, msg) {
    winObj.console.log(msg);
}
In otherPage.js:
chrome.runtime.getBackgroundPage(function(bgPage) {
    bgPage.myFunc(window, "This will be logged in 'otherPage.html' !");
});
It is not the cleanest solution, but it might work...

Javascript Auto Update

In my stream page, I have one current song script, but it doesn't update... The user needs to refresh the page.
The script is:
<script name="whasong" id="whasongid" src="http://xxxx.xxxx.net/js/song/u4:2134" type="text/javascript">
You appear to have javascript turned off.
</script>
src="http://xxxx.xxxx.net/js/song/u4:2134" code:
document.write('SONG NAME');
Is it possible to auto-update just this script without refreshing the whole page?
You might be in need of Ajax functionality. Do you mean that, at the end of a song playing, you want to modify the page content to reflect the song change? If it is so, you might gain a lot by looking into a JavaScript framework like jQuery or Backbone.js and bind some behavior to the state change in question. So you will want the song change (for example) to trigger a certain function you will have prepared, which will use jQuery to query the server via Ajax to update the page, and change the DOM accordingly.
EDIT: If, instead of waiting for an explicit state change in your web page, you want to update it periodically, you might be looking for a strategy such as:
function updatePage(/* arguments */) {
    // Call the server to get some data ...
    // Update the page accordingly ...
    if (/* continue updating the page? */) setTimeout(updatePage, 60000 /* 60 seconds */);
}
If you were using jQuery for your Ajax needs, for instance, that function could look like so:
function updatePage(/* arguments */) {
    jQuery.get("/your/url", { foo: "bar" }, function(data) {
        // Update the page accordingly ...
        if (/* continue updating the page? */) setTimeout(updatePage, 60000 /* 60 seconds */);
    });
}
If you want some more specific help, don't hesitate to be more precise as to what your issue is.
