I am writing a JavaScript-based upload progress meter. I want to use the standard multipart submit method (rather than submitting the file in an iframe). During the submit, I send AJAX requests that return the % complete of the upload and then update the progress meter accordingly.
This all works smoothly in Firefox and IE. However, Safari seems to prevent the completion of AJAX requests after the main form has been submitted. In the debugger, I can see the request headers, but it appears as though the response is never received.
Anyone aware of this, or how to get around it?
Yes, this is how Safari and any browser based on WebKit (e.g. Google Chrome) behave. I recently ran into this on a file upload progress meter as well. I ended up using the same technique seen at http://drogomir.com/blog/2008/6/30/upload-progress-script-with-safari-support to get the AJAX to work. In my case, I didn't want to change the look of my application to the one Drogomir uses, but the technique itself worked. Essentially, the solution is to create a hidden iframe, only in Safari, that loads jQuery and your AJAX script. Then, the top frame calls a function in that frame on form submit. All other browsers still work the same as before.
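A rough sketch of the idea (the frame page, form id, and startPolling function are hypothetical placeholders; the linked post has the full working version):

$(function () {
    // Only WebKit browsers need the separate frame; others can poll from the top page.
    if (navigator.userAgent.indexOf('AppleWebKit') !== -1) {
        $('body').append('<iframe id="progressFrame" src="/progress-frame.html" style="display:none"></iframe>');
    }

    $('#uploadForm').submit(function () {
        var frame = document.getElementById('progressFrame');
        if (frame) {
            // Polling runs inside the hidden frame, so Safari keeps servicing its
            // AJAX requests even after the top-level form submission has started.
            frame.contentWindow.startPolling();
        } else {
            startPolling(); // defined in the top page for non-WebKit browsers
        }
        return true; // let the multipart submit proceed as usual
    });
});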
This is a WebKit bug. See https://bugs.webkit.org/show_bug.cgi?id=23933
Are you submitting your form to an iframe? I'm guessing that once the form is submitted, the page enters a state where no more modifications to the DOM can be made.
Check a tutorial such as this one for more information.
This actually sounds like correct behaviour to me, and I'm surprised that Firefox and IE behave otherwise.
It is akin to you attempting to leave a page and the page still interacting with you; that sounds naughty!
I can see why this would be of benefit, but I would hope it is only the case if you are performing a POST to the URI you are currently accessing, or at worst to the same domain.
I know that every browser reacts in a different way and that the behaviour can change with new versions.
I am trying to find a way to disable the basic-authentication dialog of Chrome and IE, because I want to handle it on my own in a JavaScript (AngularJS) client. I found this nice module; with it, the popup is not shown in Firefox, but in Chrome and IE it still is.
I would like to handle it in JavaScript. There seems to be a common approach of having the server send not an HTTP 401 status code but another one instead (e.g. HTTP 418), but I don't like this, because then the services cannot easily be tested with a browser. Also, I think that when the server does not send the WWW-Authenticate: Basic realm="test" header, the Basic-Authentication popup is not shown.
As I stated before, it would be nice to have it handled by JavaScript.
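For the client side, a minimal sketch of what I have in mind, assuming AngularJS 1.x and a server that is configured not to trigger the browser dialog (the module name and login route are made up):

angular.module('app').factory('authInterceptor', ['$q', function ($q) {
    return {
        responseError: function (rejection) {
            if (rejection.status === 401) {
                // Handle the failed authentication in JavaScript instead of
                // letting the browser show its own Basic-Auth popup.
                window.location.href = '#/login'; // hypothetical login route
            }
            return $q.reject(rejection);
        }
    };
}]).config(['$httpProvider', function ($httpProvider) {
    $httpProvider.interceptors.push('authInterceptor');
}]);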
I've made a webpage whose URLs have the form http://www.example.com/module/content
It's a very dynamic webpage; actually, it is a web app.
To make it as responsive as possible, I want to use AJAX instead of normal page requests. This also enables me to use JavaScript to add a layer that provides offline capabilities.
My question is only: How should I make the URLs? Should they be http://www.example.com/module/content or http://www.example.com/#!/module/content?
What follows is only my thinking in both directions. You don't need to read it if you already have a clear opinion about this.
I want to use the first version because I want to support the new HTML5 standard. It is easy to use, and the URLs look pretty. But more importantly, it allows me to do this:
If the user requests a page, they will get a full HTML page back.
If the user then clicks a link, only the contents are inserted into the container div via AJAX.
This will enable users without JavaScript to use my website, since it does not REQUIRE the user to have JavaScript; it will simply use the plain old "click a link, request, get a full HTML page back" approach.
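A rough sketch of that pattern, assuming jQuery and made-up class/id names:

$(function () {
    // Without JavaScript the links work as ordinary page loads; with JavaScript,
    // only the contents of the container div are replaced via AJAX.
    $('a.nav-link').click(function (e) {
        e.preventDefault();
        // Fetch the target page and insert just the part that belongs in the container.
        $('#content').load(this.href + ' #content > *');
    });
});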
While this is great, the problem is of course Internet Explorer. Only the latest version of IE supports the History API, and to get this working in older versions, one needs to use a hash fragment. (50% of my users will be using IE, so I need to support it...) So then I have to use /#!/ to get it working.
If one uses both of these URL versions, the problem arises that if an IE user posts such a link on a website, Google will request the page from the server with an _escaped_fragment_ parameter (or similar), and it will index the IE version of the page with the hash. And some pages will be indexed without the hash.
And as we remember, a non-hash URL is better in many respects. So, can this search engine problem be circumvented? Is it possible to tell GoogleBot that if it reaches the hash version of the site, it should be redirected to the non-hash version of the page? That way, one could get all the benefits of a non-hash URL and still support IE6-IE9.
What do you think is the best approach? Maybe you have tried it in practice yourself?
Thanks for your answer!
If you want Google to index your AJAX content, then you should use the #!key=value form, like this. That is what Google prefers for AJAX navigation.
If you really prefer the pretty HTML5 URLs without #!, then yes, you can support both without indexing problems! Just add:
<link rel="canonical" href="preferredurl" />
to the <head> section of each page (for the initial load), to help Google know which version of the URL you would prefer them to index. Read more about canonical URLs here.
In that case the solution is very easy. You use the first URL scheme, and you don't use AJAX enhancements for older IE browsers.
If your precious users don't want to upgrade, it's their problem, but they can't complain about not having these kewl effects and dynamics.
You can throw a "Your browser is severely outdated!" notice for legacy browsers as well.
I would not use /#!/ in the URL. First make sure the page works normally, with full page requests (that should be easy). Once that works, you can check for the window.history object and, if it is present, add AJAX. The AJAX calls can go to the same URL, and the main difference is the server-side handling. The check is simple: if HTTP_X_REQUESTED_WITH is set, the request is an AJAX request; if it is not set, you're dealing with a standard request.
You don't need to worry about duplicate content, because GoogleBot does not set the HTTP_X_REQUESTED_WITH request header.
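A minimal sketch of that server-side branch, shown here with Node/Express purely as an illustration (the same check works in PHP via $_SERVER['HTTP_X_REQUESTED_WITH']; route and view names are made up):

var express = require('express');
var app = express();

app.get('/module/content', function (req, res) {
    if (req.xhr) {
        // jQuery sends X-Requested-With: XMLHttpRequest, so this is an AJAX call:
        // return only the fragment that goes into the container div.
        res.render('content-fragment');
    } else {
        // Plain navigation (including GoogleBot, which never sets that header):
        // return the complete HTML page.
        res.render('content-page');
    }
});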
While watching stackoverflow.com with the help of Firebug, I noticed that when someone accepted my answer, my points suddenly increased without any AJAX hit being made to any method. Amazing! How is that possible?
Please advise, so that I can try to implement this technique in my upcoming projects. Thanks in advance.
Make sure you have looked into the Net panel. There are two ways I can tell:
Web Sockets
iFrame
Please have a look at http://www.html5rocks.com/en/features/connectivity & http://html5demos.com/web-socket
But WebSockets will only work in a limited set of browsers.
Using an iframe with a simple GET request, no AJAX call will be made, but you will be able to see it in Firebug's Net panel. This is what Facebook uses, and it is compatible with all browsers.
It's using WebSockets instead of AJAX XMLHttpRequest in modern browsers. You can find more details about Stack Overflow's implementation on meta.stackoverflow.com.
The main advantage of WebSockets is the server can send an update to the browser the moment you receive an upvote. Other methods, such as XHR and hidden iframes, require the browser to poll the server at regular intervals to get an updated vote count.
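A minimal sketch of the push approach (the endpoint URL and message format below are invented, not Stack Overflow's actual protocol):

// Open a long-lived connection; the server can push a message whenever something changes.
var socket = new WebSocket('wss://example.com/updates');

socket.onmessage = function (event) {
    var data = JSON.parse(event.data);
    if (data.type === 'reputation') {
        // Update the counter in place: no polling, no extra requests, no page reload.
        document.getElementById('rep-count').textContent = data.newTotal;
    }
};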
You could use an image submit button and submit to a small iframe that displays the number.
Otherwise you'd still be messing around with a hidden iframe and form submits or GET/POST requests inside that hidden iframe.
If you really want a JavaScript-less solution, form submits into a hidden/small iframe are the way to go.
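A rough sketch of that JavaScript-less setup (URLs and field names are made up):

<!-- The form posts into the small iframe, so the main page itself never reloads. -->
<form action="/vote" method="post" target="resultFrame">
  <input type="hidden" name="postId" value="123" />
  <input type="image" src="vote-up.png" alt="Vote up" />
</form>
<!-- The server's response (e.g. the new count) is displayed inside this frame. -->
<iframe name="resultFrame" style="width:60px; height:20px; border:none;"></iframe>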
I don't want the page to reload when going back in history on my web pages.
When visitors click the back button in the browser or press the backspace key, my pages reload. How can I disable reloading on history back, or how can I activate real caching?
Thanks...
This behaviour stems from the browser's MO, not from your end.
You cannot prevent the page reload. If your problem is POST pages reloading, with messages alerting the user that POSTed data will be resent, then you should look at the "Redirect after POST" principle, with a 303 redirect on POST. It can fix some of these behaviours.
The second thing you should look at is the cache headers you are sending with your page responses. Use the PageSpeed extension for Firebug or other tools; you'll get good hints about what headers you are actually sending and what settings you could adjust. When your cache headers are fine, you'll see that some pages won't be re-requested and that some browser requests do not generate a real GET + 200 response but rather 304 Not Modified responses to conditional requests. And if you go deeper in the analysis, you'll find that the way the browser cache works depends a lot on the browser.
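A small sketch of both points, using Node/Express only as an illustration (route names, the saveComment helper, and the cache lifetime are made up):

var express = require('express');
var app = express();
app.use(express.urlencoded({ extended: false })); // parse form posts

// Redirect after POST: the 303 makes the browser re-request the result page with GET,
// so Back/refresh never triggers the "resend POST data?" warning.
app.post('/comments', function (req, res) {
    saveComment(req.body);           // hypothetical persistence step
    res.redirect(303, '/comments');  // 303 See Other -> the browser issues a GET
});

// Explicit cache headers: lets the browser reuse its cached copy (or revalidate
// with a 304) instead of doing a full reload when the user goes back.
app.get('/comments', function (req, res) {
    res.set('Cache-Control', 'private, max-age=600');
    res.render('comments');
});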
The page is not reloaded when following an HTML anchor within the same document. That is, all the browsing must happen using JavaScript only, and the URL must stay the same up to the # character. To handle the Back button correctly, you may need to use the onpopstate event. If you don't want the # character in the URL at all, you can use history.pushState().
If JavaScript is not supported by the browser, you can do some tricks using the CSS :target selector, or just navigate the user to another page with a reload.
Note: I have not coded a page like this; it is just my guess after reading an API reference page.
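A rough sketch of that idea, assuming jQuery and a made-up container id and fragment endpoint:

// Load a content fragment into the page and (optionally) record it in the history,
// without a full reload and without putting a # in the URL.
function showPage(url, push) {
    $('#content').load(url + '?fragment=1', function () {
        if (push && window.history && history.pushState) {
            history.pushState({ url: url }, '', url);
        }
    });
}

// The Back/Forward buttons fire popstate instead of reloading the document.
window.onpopstate = function (e) {
    if (e.state && e.state.url) {
        showPage(e.state.url, false);
    }
};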
I have experienced a weird problem with JavaScript.
The problem page uses some jQuery code to collect data, and it does input-checking validation. If the validation passes, it posts to the server. Some of our users (10% or maybe a lot fewer) reported that they could not submit on the website.
We talked with one of the users who had the problem, and were even more confused afterwards.
Tester's PC: XP, IE8, Firefox
The first time, he used IE and the JavaScript validation did not fire; he was NOT able to submit data to the server either, because the validation result was set to false by default.
(An error message is supposed to show up if the validation is false.)
Afterwards he tested with Firefox (Firefox worked straight away).
Coming back to IE again, the validation script started working and the submit was again successful.
So after all that, the tester doesn't have the problem any more, and couldn't replicate it either.
I am wondering whether there is any software or program that may stop the JS file from downloading properly.
The page is also hosted in an iframe on another website, which is why I'm thinking some antivirus software may treat this as a cross-domain threat and stop the posting from working.
If so, how can I check that all the required JS files have been downloaded before the user submits?
What else should I look into, since the problem happens on the client end only, with no server-side validation yet?
@drachenstern: thanks for the edit
You could disable the submit button and enable it only after jQuery is fully loaded and executed.
For example:
<input type="submit" disabled />
Then, in your JavaScript:
$(function () {
$('input:submit').attr('disabled', false);
});
However, be advised that:

- Users will not be able to submit anything in a browser that doesn't support JavaScript.
- You should not depend on JavaScript to verify the user content; always validate the data again on the server side.
It is possible that there is some delay in loading the JavaScript on the client side. Antivirus / "Internet security" products (may) do a lot of checks.
It is quite possible that the Internet security product scans a call, decides "OK, this is safe", and only then lets the JavaScript file download. There might be a delay in this.
How to avoid the situation?
1. Don't tie your form submit to JavaScript. Let it happen always, with or without JavaScript. If the JavaScript is ready, the user will have a good experience (immediate validation). If it is not yet ready, the user will still be able to submit; do the validation and throw error messages the "traditional" way, by refreshing the page.
2. Make the user wait until the JavaScript is loaded. You can have a small "loading" icon somewhere on the page to tell the user he has to wait. The user can enter the data, but can't submit yet. In the background, keep checking whether the JavaScript is loaded (setTimeout and checking for a specific variable; see the sketch after this list). Once it is loaded, you can use the JavaScript validations.
3. A combination of the two: allow non-JavaScript submits until you know that the JavaScript is loaded. Once it is, use the JavaScript validations.
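A rough sketch of the second option (the element ids and the window.validationReady flag are made up; the validation script would set that flag when it finishes loading):

var submitButton = document.getElementById('submit-btn');
var spinner = document.getElementById('loading-icon');

submitButton.disabled = true;               // block submits until the script is ready

(function waitForValidation() {
    if (window.validationReady) {
        spinner.style.display = 'none';     // hide the "loading" hint
        submitButton.disabled = false;      // JavaScript validation is now available
    } else {
        setTimeout(waitForValidation, 200); // keep polling until the script arrives
    }
})();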
I would suggest first of all that you should always validate everything on the server. The only reason to validate on the client is to make the response to the user faster on bad input.
Additionally, to ensure that each file is downloaded and processed, you could always put a global variable in each file, then check them in the document proper to see whether each variable has been set. It's a crude hack, but it would work.
You didn't specify what version of IE the user was using, but the problem of the file not being loaded right away in IE sounds like normal, if quirky, behavior to me. I've run into that many times, and the only solution for me is a Ctrl-F5. I don't know what else to say there. It would be WONDERFUL if we could always have every browser respond the same, but we can't, so we go on. Also, what OS were they doing all this testing on? And what browser do you test on?
What behavior do you see in IE? If you're using IE8 or later, you'll have debug tools for sure, and you could always use Firebug Lite to debug your pages in IE without using the IE tools. Then you could see what the page is doing in IE. Perhaps it's throwing a JavaScript parsing error? Are there any icons in the window chrome in IE that would give a tip-off?
But I think that if you're trying to fix the issue in your second paragraph by relying on JavaScript to process the validations, you're doing it wrong. But I'm just one guy.