There are plenty of reasons to want to avoid <iframe>s (and frames in general), but what are the best alternatives? (The intent here is to avoid full page reloads.)
Facebook, for instance, seems to keep its top bar and side menu intact (for the most part), and a full page reload is incredibly rare.
Searching for explanations with little idea of what to use as search terms has given me little insight, so I thought it best to raise the question here. Is this all Ajax, or is there more to it than that?
AJAX
The more traditional approach is AJAX. In a nutshell, your JavaScript code can request specific content from the server on a timer (every x seconds) or when a user event happens (e.g. a button click).
A very basic implementation in jQuery would look something like:
function updateShouts() {
    // Assuming we have a #shoutbox element on the page
    $('#shoutbox').load('latestShouts.php');
}

// Poll the server every 10 seconds
setInterval(updateShouts, 10000);
This will update a div with id "shoutbox" every 10 seconds with whatever content is retrieved from latestShouts.php.
A more advanced implementation would involve retrieving only data (not presentation) in a format like JSON or XML, and then updating the existing HTML with the data that was received.
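For example, a minimal sketch of the data-only approach (this assumes latestShouts.php can return a JSON array of shout objects; the user and text field names are made up for illustration):

$.getJSON('latestShouts.php', function (shouts) {
    var html = '';
    $.each(shouts, function (i, shout) {
        // rebuild only the markup that changed, from raw data
        html += '<p><b>' + shout.user + ':</b> ' + shout.text + '</p>';
    });
    $('#shoutbox').html(html);
});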
WebSockets
More recently, browsers have started supporting something called WebSockets. WebSockets allow you to keep a bidirectional connection open between the browser and the server, and it allows the server to push information to the browser without the browser requesting it.
This is more efficient in many ways, mainly because you don't have to waste server calls every x seconds to check whether new data is there. WebSockets allow you to display information from the server almost as soon as it becomes available.
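A minimal sketch (the wss://example.com/shouts endpoint and the message format are assumptions, not a real API):

// hypothetical endpoint that pushes JSON messages as shouts arrive
var socket = new WebSocket('wss://example.com/shouts');

socket.onmessage = function (event) {
    // the server pushes data as soon as it becomes available
    var shout = JSON.parse(event.data);
    $('#shoutbox').append('<p><b>' + shout.user + ':</b> ' + shout.text + '</p>');
};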
I hope that helps. Cheers!
Injecting partial content using AJAX is your best and easiest bet; I recommend jQuery too.
Related
I'm new to ASP.NET and web development. I have an application that makes some calculations based on the dates selected on the form and gets some data from a SQL database. Every time the user changes the date, the form should be updated with the latest calculations, which means a postback, which is a bit annoying. What I want to know is: what is the best way/method/language to retrieve the data into my form without posting back the form? Is it possible with JavaScript? I tried the UpdatePanel, but it is still annoying and slow with big calculations. Could you please give me an example to start with? I have the stored procedure below on my database; let's say that I have a TextBox that will get the returned value of @category:
ALTER PROCEDURE Test2sp
    @id_ref nvarchar(50),
    @category nvarchar(50) OUTPUT
AS
BEGIN
    SELECT @category = category FROM employees WHERE id_ref = @id_ref
END
GO
Well, software is part science, but part art, so to speak.
So, in software, it is RARE that "one solution" is the best.
So, in your question the goal is to avoid a postback.
But you should supply more information as to why that postback is to be avoided:
means a postback, which is a bit annoying,
Hmm, OK, not really a solid narrative here.
The issue is we have quite a few choices, and thus which road and choice is best will depend.
For example, OFTEN there is a quick and easy solution:
We can introduce what is called an UpdatePanel.
They are easy to set up, and they reduce effort (developer cost) on your part.
However, if you abuse UpdatePanels, you may wind up not saving performance, but you might save developer cost and time!
We don't all have unlimited time or unlimited government budgets for our software choices.
So, UpdatePanels are OFTEN lamented by many in ASP.NET land, but I happen to be a great fan of them. In fact, I find them downright amazing!
The reasons for using an UpdatePanel (UP in this discussion) are that you eliminate the need to write client-side browser code, you get to use nice code-behind, and in MOST cases they are a viable choice and solution.
So, what a UP does is allow you to update only part of the browser page (just in case everyone is wondering: Razor Pages attempt the same goal, but do it in a different way).
You don't mention when you want this calculation to occur. For example, do you want to use the control's text-changed event?
Or does the user change the value and then have to hit some button?
Anyway, we assume now that you have working code-behind that calls and gets the information you want.
Since you spent that time, money, and effort to build that code?
Then give a UP a try!
Just enclose that date control and button inside of a UP, and you get what is called a "partial page" postback. That means that ONLY that part of the page will be sent to the server. And this means no flicker on the page, and no scroll-position changes. In fact, very nearly everything you would achieve with an AJAX call will be the result.
But keep in mind that while this looks and walks and quacks like an AJAX call?
The page life cycle DOES run. That certainly means that Page_Load will run. But the page should have been designed so that you have the usual:
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // the REAL first-page-load code goes here:
        //   code to load drop-downs, combos, pick lists, etc.
        //   code to load data and grids, etc.
    }
}
So, we design all pages with the above in mind, since there is no need to re-load data on each postback and round trip, right?
So, to use a UP:
Drag in a ScriptManager (put it right below the page's form tag).
Then take your date field/control (whatever) and the button you NOW have that runs the code-behind to get that data.
The controls you need to operate on (update) need to be included inside of this UpdatePanel. (Don't put more of the page into the UP than you need.)
The pattern looks like this:
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <!-- put the markup for the button and controls here -->
    </ContentTemplate>
</asp:UpdatePanel>
So, in the above, put your button and date control inside the ContentTemplate. Try running it.
You will not see a page postback, you are free to use code-behind, and you will not see any page flicker. It really is like magic.
Behind the scenes, you get an AJAX setup, but without all the hassle.
Do give the above a try. You will find the results amazing, beautiful, and effortless, since you get to use and write 100% server-side code.
The next choice?
Create a static method in the current web page, and then use an AJAX call.
This approach is about the best choice, but it is more effort, and in the code-behind you don't have any use of the controls on the current web page (so the AJAX POST call has to pass any values required from the web page, and on return of that AJAX call, the client-side JS code has to update the controls on the page). A sketch of the client side follows below.
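For what it's worth, a minimal sketch of the client side of that pattern (the page name MyPage.aspx, the method name GetCategory, and the control IDs are assumptions; the server side would be a static method decorated with [WebMethod]):

$.ajax({
    type: 'POST',
    url: 'MyPage.aspx/GetCategory',  // hypothetical page + page-method name
    data: JSON.stringify({ idRef: $('#txtIdRef').val() }),  // pass page values yourself
    contentType: 'application/json; charset=utf-8',
    dataType: 'json',
    success: function (response) {
        // ASP.NET wraps a page method's return value in a "d" property
        $('#txtCategory').val(response.d);
    }
});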
So, the best choice? Well, an AJAX call is "somewhat" better, but then again, it requires far more effort (and thus more cost).
Try the UpdatePanel. You will be rather amazed, since it is oh so easy to remove a postback on given WebForms pages.
However, do keep in mind that a page life cycle STILL occurs, but now only for the part you placed in that UP.
Currently, I'm using setTimeout() to pause a for loop over a huge list so that I can add some styling to the page. For instance:
http://imdbnator.com/process?id=wtf&redirect=false
What I use setTimeout() for:
I use setTimeout() to add images, text, and a CSS progress bar (see "Why doesn't Progress Bar dynamically change unlike Text?").
Clearly, as you can see, it is quite painful for a user to just browse through the page and hover over a few images. It gets extremely laggy. Is there any workaround for this?
My FOR Loop:
Each iteration of the for loop makes an AJAX request in the background to a PHP API. It definitely costs me some efficiency, but how do all other websites pull it off with such elegance? I mean, I've seen websites show a nice loading image, with no user interference, while they make an API request. When I try to do something like that, I have to set a timeout every time.
Is it that they use better server-client interaction technologies, like the Node.js I've heard about?
Also, I've thought of a few alternatives, but each runs into other complications. I would greatly appreciate it if you could help me with each of these possible alternatives.
Method 1:
Instead of making an AJAX call to my PHP API through jQuery, I could do the whole thing as a server-side script. But then the problem I run into is that I cannot make a good client-side page (as in my current page) which updates the progress bar and adds dynamic images as each item of the list is processed. Or is this possible?
Method 2: (Edited)
Like one of the useful answers below suggests, I think the biggest problem is the server API and client interaction. WebSockets, as suggested there, look promising to me. Will they necessarily be a better fix than a setTimeout? Is there any significant time difference if, let's say, I replace my current 1000 AJAX requests with a WebSocket?
Also, I would appreciate it if there is anything other than WebSockets that is better than an AJAX call.
How do professional websites achieve such fluid server and client interactions?
Edit 1: Please explain how professional websites (such as http://www.cleartrip.com, when you request flight details) provide a smooth client side while processing on the server side.
Edit 2: As @Syd suggested, that is something that I'm looking for. I think there is a lot of delay in my current client and server interaction. WebSockets seem to be a fix for that. What are the other/best ways of improving server-client interaction apart from standard AJAX?
Your first link doesn't work for me, but I'll try to explain a couple of things that might help you if I understand your overall problem.
First of all, it is bad to have synchronous calls with a large amount of data that requires processing in your main UI thread, because the user experience might suffer a lot. For reference you might want to take a look at "Is it feasible to do an AJAX request from a Web Worker?"
If I understand correctly, you want to load some data on demand based on an event.
Here you might want to sit back and think about what the best event for your need is; firing an AJAX request on every event is quite different from firing one every once in a while, especially when you have a lot of traffic. Also, you might want to check that your previous request has completed before you initialize the next one (this might not be needed in some cases though). Have a look at async.js if you want to create chained asynchronous code execution without facing the JavaScript "pyramid of doom" effect and messy code.
Moreover, you might want to "validate, then halt" the event before making the actual request. For example, let's assume a user triggers a "mouseenter": you should not just fire an AJAX call. Hold your breath, use setTimeout, and check that the user didn't fire any other "mouseenter" event for the next 250 ms; this will allow your server to breathe. The same applies to implementations that load content based on scroll: you should not fire an event if the user scrolls like a maniac. So validate the events. A minimal debounce sketch follows below.
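For example, a debounce sketch along those lines (the .thumb selector, the 250 ms window, and loadPreview are assumptions for illustration):

var hoverTimer = null;

$('.thumb').on('mouseenter', function () {
    var el = this;
    clearTimeout(hoverTimer);  // cancel the pending call if the user keeps moving
    hoverTimer = setTimeout(function () {
        loadPreview(el);  // hypothetical function that fires the actual AJAX call
    }, 250);
});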
Also, loops and iterations: we all know that if the damn loop is too long and does heavy lifting, you might experience unwanted results. In order to overcome this you might want to look into timed loops (take a look at the snippet below): basically, loops that break after x amount of time and continue after a while. Here are some references that helped me with a three.js project: "optimizing-three-dot-js-performance-simulating-tens-of-thousands-of-independent-moving-objects" and "Timed array processing in JavaScript".
//Copyright 2009 Nicholas C. Zakas. All rights reserved.
//MIT Licensed
function timedChunk(items, process, context, callback){
    var todo = items.concat(); // create a clone of the original array

    setTimeout(function chunk(){
        var start = +new Date();

        // process items for at most ~50 ms per chunk
        do {
            process.call(context, todo.shift());
        } while (todo.length > 0 && (+new Date() - start < 50));

        if (todo.length > 0){
            // yield to the browser, then continue with the next chunk
            setTimeout(chunk, 25);
        } else {
            callback(items);
        }
    }, 25);
}
cleartrip.com probably uses some of these techniques. From what I've seen, it gets a chunk of data when you visit the page and then fetches further chunks as you scroll. The trick here is to fire the request a little before the user reaches the bottom of the page, in order to provide a smooth experience. Regarding the left-side filters, they only filter data that is already in the browser; no more requests are made. So you fetch, and you keep something like a cache (in other scenarios, though, caching might be unwanted, e.g. for live data feeds).
Finally, if you are interested in further reading and a smaller overhead in data transactions, you might want to take a look into "WebSockets".
You must use async AJAX calls. Right now, user interaction is blocked while the HTTP request is in flight.
Q: "how professional websites (such as cleartrip.com) provide a smooth client side while processing the server side."
A: By using async AJAX calls
I'm using the setTimeout() function in JavaScript to allow a popup that says "Loading" to be shown while I parse some XML data. I found that at small enough delay values (below 10 ms) it doesn't have time to show before the browser freezes for a moment to do the actual work.
At 50ms, it has plenty of time, but I don't know how well this will translate to other systems. Is there some sort of "rule of thumb" that would dictate the amount of delay necessary to ensure a visual update without causing unnecessary delay?
Obviously, it'll depend on the machine on which the code is running etc., but I just wanted to know if there was anything out there that would give a little more insight than my guesswork.
The basic code structure is:
showLoadPopup();
var t = setTimeout(function()
{
parseXML(); // real work
hideLoadPopup();
}, delayTime);
Thanks!
UPDATE:
It turns out that parsing XML is not something that Web Workers can usually do, since they don't have access to the DOM or the document. In order to accomplish this, I found a different question here on Stack Overflow about parsing XML inside a Web Worker.
By serializing my XML object into a string, I can pass it into the Web Worker through a message post. Then, using the JavaScript-only XML parser that I found in the aforementioned link, I turn it back into an XML object within the Web Worker, do the parsing I need, and pass back the desired text as a string, without making the browser hang at all.
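A minimal sketch of that hand-off (the worker file name, displayResult, and parseXmlString are assumptions; the pure-JS parser itself comes from the linked question):

// main page: serialize the XML and hand it to the worker
var worker = new Worker('xmlWorker.js');  // hypothetical worker file
var xmlString = new XMLSerializer().serializeToString(xmlDoc);  // assumes xmlDoc already exists

worker.onmessage = function (e) {
    hideLoadPopup();        // from the original snippet
    displayResult(e.data);  // hypothetical UI update with the returned text
};

showLoadPopup();
worker.postMessage(xmlString);

// xmlWorker.js: no DOM access in here, so a pure-JS XML parser is assumed
self.onmessage = function (e) {
    var result = parseXmlString(e.data);  // hypothetical pure-JS parser
    self.postMessage(result);
};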
Ideally you would never have to parse something on the client side that actually causes the browser to hang. I would look into moving this to an AJAX request that pulls part of the parsed XML (child nodes as JSON), or look at using Web Workers or another client-side asynchronous option.
There appears to be no rule of thumb for this question, simply because setTimeout() was not the best solution for the problem. Using alternative methods to do the real meat of the work was the real solution, not using a setTimeout() call to allow for a visual update to the page.
The given options were:
HTML5's new Web Worker option
Using an AJAX request
Thanks for the advice, all.
I am in the middle of the design/development of a web store and am thinking my way through the best way of handling a transparent load of a couple of megabytes of product items. It seems the asynchronous bit of AJAX doesn't mean parallel, so I have to be a little creative here.
Rather than just pull down one large lump of data, I was thinking of breaking it into pages of, say, 50-100 items and allowing the browser some time to process any internal messages.
The loader would pull down a page of data, then fire a custom event to itself to get the next page. The theory is that if the browser has other messages to process, this event would queue up behind them, allowing the browser to do anything else it has to do. A loss of a bit of speed, but a smoother user experience.
Rinse and repeat.
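A minimal sketch of that kind of loader (the /products endpoint, the JSON shape, and renderItems are assumptions):

function loadPage(pageNum) {
    // hypothetical endpoint returning one page of 50-100 items as JSON
    $.getJSON('/products', { page: pageNum }, function (data) {
        renderItems(data.items);  // hypothetical function that builds the DOM
        if (data.hasMore) {
            // yield to the browser's message queue before fetching the next page
            setTimeout(function () { loadPage(pageNum + 1); }, 0);
        }
    });
}

loadPage(1);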
Add in some smoke-and-mirrors engineering (a loading icon or some such) to keep the user from noticing any delays, and I should be right.
Before I dive into what is starting to sound like a fun bit of code, can anyone think of a better way to pull down a large lump of data in as smooth and friendly a way as possible? I am an ancient old programmer, but JavaScript is a bit new to me.
Am I reinventing the wheel? Does AJAX already do all this and I just don't know about it?
There are two ways to improve the situation:
a) Reduce the data coming from the database: if there is information which is not used, you don't need to load it. Also, if there is non-changing data, you may cache it and request it only once, at the beginning.
b) Load only the information which you need to show: that's the approach you are thinking about, except for the fact that you want to trigger new data loading automatically (or at least that's what I understood). I suggest keeping the AJAX requests to as few as possible, and making a new one only when the user needs more data. For example, if the user stays on page 1 of 20, you don't need to fire loading of pages 3 and 4. It may be a good idea to load page 2, though, so the user can switch quickly.
I've been getting more and more into high-level application development with JavaScript/jQuery. I've been trying to learn more about the JavaScript language and dive into some of its more advanced features. I was just reading an article on memory leaks when I read this section of the article:
JavaScript is a garbage collected language, meaning that memory is allocated to objects upon their creation and reclaimed by the browser when there are no more references to them. While there is nothing wrong with JavaScript's garbage collection mechanism, it is at odds with the way some browsers handle the allocation and recovery of memory for DOM objects.
This got me thinking about some of my coding habits. For some time now I have been very focused on minimizing the number of requests I send to the server, which I feel is just good practice. But I'm wondering if sometimes I go too far. I am largely unaware of the kinds of efficiency issues/bottlenecks that come with the JavaScript language.
Example
I recently built an impound management application for a towing company. I used the jQuery UI dialog widget and populated a datagrid with specific ticket data. Now, this sounds very simple on the surface... but there is a LOT of data being passed around here.
(and now for the question... drumroll please...)
I'm wondering what the pros/cons are for each of the following options.
1) Make only one request for a given ticket and store it permanently in the DOM, simply showing/hiding the modal window. This means only one request is sent out per ticket.
2) Make a request every time a ticket is opened, and destroy the content when it's closed.
My natural inclination was to store the tickets in the DOM, but I'm concerned that this will eventually start to hog a ton of memory if the application goes a long time without being reset (which it will).
I'm really just looking for the pros/cons of those two options (or something neat I haven't even heard of =P).
The solution here depends on the specifics of your problem, as the 'right' answer will vary based on the length of time the page is left open, the size of the DOM elements, and request latency. Here are a few more things to consider:
Keep only the newest n items in the cache. This works well if you are only likely to redisplay items within a short period of time.
Store the data for each element instead of the DOM element, and reconstruct the DOM on each display.
Use HTML5 storage to store the data instead of DOM or variable storage. This has the added advantage that data can be stored across page loads (a minimal sketch follows this list).
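For example, a minimal sketch of the storage idea (the /tickets endpoint and the JSON shape are assumptions):

function getTicket(id, callback) {
    var cached = sessionStorage.getItem('ticket-' + id);
    if (cached) {
        // serve from HTML5 storage without touching the network
        callback(JSON.parse(cached));
        return;
    }
    $.getJSON('/tickets/' + id, function (data) {  // hypothetical endpoint
        sessionStorage.setItem('ticket-' + id, JSON.stringify(data));
        callback(data);
    });
}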
Any caching strategy will need to consider when to invalidate the cache and re-request updated data. Depending on your strategy, you will need to handle conflicts that result from multiple editors.
The best way is to get started using the simplest method, and add complexity to improve speed only where necessary.
The third path would be to store the data associated with a ticket in JS, and create and destroy DOM nodes as the modal window is summoned/dismissed (jQuery templates might be a natural solution here.)
That said, the primary reason you avoid network traffic seems to be user experience (the network is slower than RAM, always). But that experience might not actually be degraded by making a request every time, if it's something the user intuits involves loading data.
I would say number 2 would be best, because that way, if the ticket changes after you open it, that change will appear the next time the ticket is opened.
One important factor is the number of redraws/reflows that are triggered by DOM manipulation. It's much more efficient to build up your content changes and insert them in one go than to do it incrementally, since each increment causes a redraw/reflow.
See: http://www.youtube.com/watch?v=AKZ2fj8155I to better understand this.
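For example, a minimal sketch of the "build off-DOM, insert once" idea (the items array and the #list element are assumptions):

var frag = document.createDocumentFragment();

for (var i = 0; i < items.length; i++) {  // items is an assumed array of records
    var li = document.createElement('li');
    li.textContent = items[i].name;
    frag.appendChild(li);  // appends happen off-DOM, so no reflow yet
}

document.getElementById('list').appendChild(frag);  // one insertion, one reflow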