My scenario:
I have a page—let's call it items.aspx—that has an HTML table that is generated with a $.get() call to my ASP.NET Web API controller. When a user clicks on one of the items in the HTML table, the user is sent to a page representing that item's details—let's call that one itemDetails.aspx. I predict it will be common to view the itemDetails.aspx page and then decide to cancel and redirect back to the items.aspx page.
The problem:
The problem is that each time the items.aspx page is loaded, the ajax call runs again and it takes a couple seconds to populate the table.
The question:
Is it possible, when the user clicks the item to go to the itemDetails.aspx page, to store/cache the HTML output of the table for a limited time? Since I only want this to happen if the user clicks the item link, I figured JavaScript would be the best tool.
You could use a function that returns a jQuery Promise for the HTML, whether it is loaded from a local cache or an AJAX request.
A simple example:
// Refers to cached HTML when available
var cached;

// Returns a promise for the HTML
function getHtml() {
    var defer;
    if (cached) {
        defer = $.Deferred();
        defer.resolve(cached);
    } else {
        defer = $.get(/* ... */);
        defer.then(function (response) {
            cached = response;
        });
    }
    return defer.promise();
}
Usage:
getHtml().then(function (html) {
    $container.html(html);
});
A real implementation would probably do something to prevent concurrent $.get calls, and expire the cache when required.
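One way to sketch those two improvements is to cache the promise itself rather than the response: concurrent callers then share the one in-flight request, and a TTL handles expiry. This uses native Promises, and fetchHtml() is a hypothetical stand-in for the real $.get call:

```javascript
// Sketch: cache the promise so concurrent callers share one request,
// and expire the cache after ttlMs milliseconds.
// fetchHtml must be a function returning a promise for the HTML.
function makeCachedGetter(fetchHtml, ttlMs) {
    var cachedPromise = null;
    var cachedAt = 0;
    return function getHtml() {
        var now = Date.now();
        if (!cachedPromise || now - cachedAt > ttlMs) {
            cachedAt = now;
            cachedPromise = fetchHtml().catch(function (err) {
                cachedPromise = null; // don't cache failures
                throw err;
            });
        }
        return cachedPromise;
    };
}
```

Callers use it exactly like the function above: `makeCachedGetter(function () { return $.get('/items-table'); }, 60000)` returns a `getHtml` whose second call within the TTL reuses the first request.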
Related
The JavaScript on my webpage executes the code shown below when the page loads. The call to get can take a couple of seconds, so I would like to delay loading the page until the call has finished.
Is it possible to postpone loading the page until the get call has finished? Or, even better, is there a way to show a spinner (instead of a white page), so that the user is aware that some process is going on? Is this possible?
document.addEventListener("DOMContentLoaded", function () {
    if (!sessionStorage.getItem("userID")) {
        // Get new userID
        $.get("/new_user").done(function (data) {
            sessionStorage.setItem("userID", data);
        });
    }
});
I'm not sure your implementation with the "DOMContentLoaded" event listener is right; I think we are missing some context here, and you may be able to check session storage before this. But I will assume that part is correct and address your question about a loading spinner.
I also won't go into detail about how to build a loading spinner, as there are plenty of examples out there. As for showing a spinner while your Ajax call is running: make the handler async, set the page's HTML to the spinner right before the call, await the call, store the data, and then set the HTML to the loaded content.
document.addEventListener('DOMContentLoaded', async (event) => {
    if (!sessionStorage.getItem("userID")) {
        document.getElementById('container').innerHTML = '<div>loadspinnerhtml</div>';
        var data = await $.get("/new_user");
        sessionStorage.setItem("userID", data);
        document.getElementById('container').innerHTML = '<div>theloadedhtml</div>';
    }
});
I am visiting a site that emits a (large) JSON response. A click triggers the request:
casper.then(function li10() {
    casper.click(SEARCH_BUTTON_CSS);
});
But according to my web proxy, the client closes the connection before receiving the entire response. I've tried waiting for the URL to appear. It waits for the URL as expected, but that doesn't appear to be sufficient:
casper.then(function li11() {
    casper.waitForUrl(/\/search-results\/p\?/,
        function () {
            var search_url = casper.getCurrentUrl();
            console.log('found search results, url = ' + search_url);
        },
        function () {
            console.log('failed to find search results');
            casper.exit();
        },
        10000);
});
So: what is something dependable that I can wait for that will guarantee that the JSON code has completely loaded before proceeding to the next step?
I'm assuming that you fill in a search field, click a button, and the JavaScript makes an Ajax request that receives a JSON response, which it then parses to display the results.
casper.waitForUrl() is used to wait for the page URL to change to the specified one. It has nothing to do with resources that are loaded separately, such as Ajax responses.
You either need to
find out the specific URL that is requested for the search action and use casper.waitForResource() to wait for that specific resource or
devise a specific selector that appears when the search data is parsed and injected into the page with casper.waitForSelector().
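For example, a sketch of both options (the resource pattern /search-results/ and the selector '.search-result-item' are assumptions; substitute the actual JSON endpoint and result markup from your page):

```javascript
casper.then(function () {
    casper.click(SEARCH_BUTTON_CSS);
});

// Option 1: wait for the JSON resource the button triggers to finish loading
casper.waitForResource(/search-results/, function () {
    console.log('search JSON fully loaded');
}, function () {
    console.log('timed out waiting for the search JSON');
    casper.exit();
}, 10000);

// Option 2: wait for markup that only exists once results are rendered
// casper.waitForSelector('.search-result-item');
```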
I'm using javascript but not jQuery.
Let's say I have 3 users in my database [Kim,Ted,Box] and 3 buttons as below:
<button class="user">Kim</button>
<button class="user">Ted</button>
<button class="user">Box</button>
<div id="displayArea"></div>
If a person clicks on any of the buttons it will display the user information in the div.
Assume I click on the Kim button: it uses Ajax and displays Kim's information. Now I click Ted, which also makes a new Ajax call to get the data. But when I click Kim again, a new Ajax call is made rather than reading the data from a cache or some other place. How can I avoid calling the Ajax function again if the data was already loaded before?
The reason I need this is that I don't want users to wait for data they have already loaded.
Add one level of abstraction by creating a function that takes care of the caching and either returns the data from the cache or makes an Ajax request. For example:
var getDataForUser = (function () {
    /**
     * We use an object as cache. The user names will be keys.
     * This variable can't be accessed outside of this function.
     */
    var cache = {};

    /**
     * The function that actually fetches the data
     */
    return function getDataForUser(user, callback) {
        if (cache.hasOwnProperty(user)) { // cache hit
            callback(cache[user]);
        } else {
            // somehow build the URL including the user name
            var url = ...;
            makeAjaxRequest(url, function (data) { // cache miss
                cache[user] = data; // store in cache
                callback(data);
            });
        }
    };
}());
Then you make the call
getDataForUser('John', function(data) { /*...*/ });
twice, and the second time it will hit the cache.
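One caveat with caching the raw data: if the same button is clicked twice before the first response arrives, both clicks miss the cache and two requests go out. A variation, sketched here with native Promises and a hypothetical fetchUser helper in place of the Ajax call, caches the promise per user instead, so concurrent clicks share one request:

```javascript
// Sketch: cache per-user promises instead of raw data, so a second click
// while the first request is still in flight reuses the same request.
// fetchUser must be a function taking a user name and returning a promise.
function makeUserLoader(fetchUser) {
    var cache = {}; // user name -> promise for that user's data
    return function getDataForUser(user) {
        if (!Object.prototype.hasOwnProperty.call(cache, user)) {
            cache[user] = fetchUser(user);
        }
        return cache[user];
    };
}
```

Usage mirrors the callback version: `getDataForUser('Kim').then(function (data) { /* render */ });`.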
I need to prevent the user from refreshing the page while an Ajax call is in progress.
Here is the code I have:
$('#submit').click(function () {
    $(function () {
        // This shows a mask; here I would like to prevent the user from doing any sort of operation.
        $(".col1").mask("Generating csv....");
        var to = $('#filters_date_to').val();
        var from = $('#filters_date_from').val();
        $.ajax({
            url: "../dailyTrade/createCsv?filters[date][to]=" + to + "&filters[date][from]=" + from,
            success: function (result) {
                if (result) {
                    $(".col1").unmask(); // Here we can unlock the user from using the refresh button.
                    window.location = '../dailyTrade/forceDownload?file=' + result;
                    setTimeout(function () { location.reload(true); }, 5000);
                }
            }
        });
    });
});
Any suggestions?
The best you can do is use onbeforeunload to present the user with a message saying that a request is in progress and asking whether they are sure they want to proceed.
e.g.
var req;

window.onbeforeunload = function () {
    if (req) {
        return 'Request in progress... are you sure you want to continue?';
    }
};

// at some point in your code
req = // your request...
You cannot, in any way, prevent the user from leaving your page using JS or anything else.
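To make that pattern concrete, here is a small sketch (plain JavaScript; the helper name and message are my own) that counts in-flight requests and only returns a warning while one is pending. In the page you would wire message() into window.onbeforeunload and wrap each request with track():

```javascript
// Sketch: count in-flight requests; warn only while at least one is pending.
function makeUnloadGuard(message) {
    var pending = 0;
    return {
        // Register any thenable (e.g. a jqXHR or Promise)
        track: function (promise) {
            pending += 1;
            function done() { pending -= 1; }
            promise.then(done, done);
            return promise;
        },
        // Returns the warning text, or undefined when nothing is pending
        message: function () {
            return pending > 0 ? message : undefined;
        }
    };
}
```

In the browser: `window.onbeforeunload = function () { return guard.message(); };` and `guard.track($.ajax({ /* ... */ }));`.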
I doubt that you should do that.
$(window).bind('beforeunload', function () {
    return 'are you sure you want to leave?';
});
If you are talking about a refresh "HTML button" on your web page, that can easily be done. Just disable your refresh button before you make your Ajax call, and re-enable it in the Ajax call's success/error handlers.
Disable button
$("#refreshBtn").attr("disabled", "disabled");
Enable button
$("#refreshBtn").removeAttr("disabled");
You cannot do it just by inserting JavaScript code.
The only ways I can think of are:
Use a synchronous Ajax call; that way the browser will freeze (however, it will notify the user that a script is taking too long to process, and the user will be able to stop execution).
Write a browser plugin that modifies browser behavior (e.g. prevents refreshing the page for URLs that you preset).
Both ways are ugly and I wouldn't recommend them.
Instead, you should modify your script so it can resume execution if the page has been refreshed (use HTML5 localStorage).
http://www.w3schools.com/html/html5_webstorage.asp
In your case, I would put a simple boolean state in localStorage to track whether the Ajax call has already happened. If it has, just call the same URL again and you will get the file name. But on the server side (if you haven't done so already) you should implement caching, so that when the same URL is called twice you don't need to generate two separate files; it can be the same file (and the server will use far fewer hardware resources).
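A sketch of that idea follows; the key name is my own, and the storage object is injected so the logic is testable (in the browser you would pass window.localStorage):

```javascript
// Sketch: remember that the CSV export was started, so a refreshed page
// can resume (re-request the same URL) instead of starting over.
function makeCsvState(storage) {
    var KEY = 'csvExportStarted';
    return {
        markStarted: function () { storage.setItem(KEY, 'true'); },
        wasStarted: function () { return storage.getItem(KEY) === 'true'; },
        clear: function () { storage.removeItem(KEY); }
    };
}
```

Call markStarted() just before the $.ajax call, check wasStarted() on page load to decide whether to resume, and clear() once the download has been delivered.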
I am running into a weird issue where I have 4 async loading calls that populate certain parts of a page. I am using the URL hash to perform JavaScript navigation without reloading the page. When I enter the page that makes the async calls and immediately try to navigate away, the page waits until all the calls are done before navigating to the other page. It should navigate regardless of whether the async calls have finished. I am using knockout.js to populate the HTML that contains the redirect JavaScript, and I know it's being called because I put logging statements in to make sure it executes. Thoughts?
app.viewModel.members.container().html("<div>" + html + "</div>");
I fixed this by keeping track of the async calls, then looping through them and aborting them on page navigation.
Ajax Calls
config.request = $.ajax({
    // ...
});
if (!def.cancelRequest) {
    config.requests.push(config.request);
}
Abort Function
app.abortRequests = function () {
    $(config.requests).each(function (index, item) {
        item.abort();
    });
    config.requests = [];
};
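The same idea as a self-contained sketch (the names are my own): keep a registry of anything with an abort() method, such as jqXHR objects, and abort everything on navigation:

```javascript
// Sketch: a small registry of abortable requests. track() registers a
// request; abortAll() aborts every tracked request and empties the list.
function makeRequestRegistry() {
    var requests = [];
    return {
        track: function (req) {
            requests.push(req);
            return req;
        },
        abortAll: function () {
            requests.forEach(function (req) { req.abort(); });
            requests = [];
        }
    };
}
```

With jQuery this would be used as `registry.track($.ajax({ /* ... */ }))`, with `registry.abortAll()` wired to the hash-change navigation handler.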