How to check if a JS file has been downloaded on the client end - JavaScript

I have run into a weird problem with JavaScript.
The problem page uses some jQuery code to collect data and perform input validation. If the validation passes, it posts to the server. Some of our users (10%, or maybe a lot less) reported that they could not submit on the website.
We talked with one of the users who had the problem, and were even more confused afterwards.
Tester's PC: XP, IE8, Firefox
The first time he used IE, the JavaScript validation did not fire, and he was NOT able to submit data to the server either, because the validation was set to false by default.
(An error message is supposed to show up when the validation is false.)
Afterwards he tested with Firefox, which worked straight away.
Coming back to IE again, the validation script started working and the submit succeeded.
So after all that, the tester doesn't have the problem any more, and couldn't replicate it either.
I am wondering if there is any software or program that may stop the JS file from downloading properly.
The page is also hosted in an iframe on another website, which is why I suspect some antivirus product may treat this as a cross-domain threat and stop the post from working.
If so, how can I check that all the required JS files have been downloaded before the user submits?
What else should I look into, given that the problem happens on the client end only, with no server-side validation yet?

You could disable the submit button and enable it only after jQuery is fully loaded and executed.
For example:
<input type="submit" disabled />
then, in your JavaScript,
$(function () {
    // runs once the DOM is ready, i.e. after jQuery has loaded and executed
    // (.prop requires jQuery 1.6+; on older versions use .attr('disabled', false))
    $('input:submit').prop('disabled', false);
});
However, be advised that:
- Users will not be able to submit anything on a browser that doesn't support JavaScript.
- You should not depend on JavaScript to verify the user content; always validate the data again on the server side.

It is possible that there is some delay in loading the JavaScript on the client side. Anti-virus / "internet security" products (may) do a lot of checks.
It is quite possible that the internet security product scans a call, decides "OK, this is safe", and only then lets the JavaScript file download. There might be a delay in this.
How to avoid the situation?
1. Don't tie your form submit to JavaScript. Let it happen always, with or without JavaScript. If JavaScript is ready, the user will have a good experience (immediate validation). If it is not yet ready, the user will still be able to submit; do the validation and throw error messages the "traditional" way, by refreshing the page.
2. Make the user wait till the JavaScript is loaded. You can have a small "loading" icon somewhere on the page to tell the user he has to wait. The user can enter the data, but can't submit yet. In the background, keep checking whether the JavaScript is loaded (setTimeout and checking for a specific variable; see the sketch after this list). Once it is loaded, you can use the JavaScript validations.
3. A combination of the two: allow non-JavaScript submit till you know that JavaScript is loaded. Once done, use the JavaScript validations.
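Here is a minimal sketch of option 2, assuming the validation script sets a global flag as its very last statement (the names validationReady and submit-btn are made up for illustration):
// last line of the validation script (hypothetical):
window.validationReady = true;

// in the page, after the button markup:
var submitButton = document.getElementById('submit-btn'); // hypothetical id
submitButton.disabled = true; // block submits until the script has run

(function waitForValidation() {
    if (window.validationReady) {
        submitButton.disabled = false; // JS validation is now available
    } else {
        setTimeout(waitForValidation, 100); // poll again in 100 ms
    }
})();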

I would suggest at first that you should always validate everything on the server. The only reason to validate on the client is to make the response to the user faster on bad inputs.
Additionally, to ensure that each file has been downloaded and processed, you could always put a global variable in each file, then check in the document proper whether each variable exists. It's a crude hack, but it would work.
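For example, a minimal sketch of that idea (the file and variable names are made up):
// last line of validate.js:
window.validateJsLoaded = true;

// last line of form-helpers.js:
window.formHelpersLoaded = true;

// in the document proper, before allowing a submit:
if (window.validateJsLoaded && window.formHelpersLoaded) {
    // every required script has downloaded and executed
}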
You didn't specify which version of IE the user was on, but the problem of the file not being loaded right away in IE sounds like normal, if quirky, behavior to me. I've run into it many times, and the only solution for me is a Ctrl+F5. I don't know what else to say there. It would be WONDERFUL if we could always have every browser respond the same, but we can't, so we go on. Also, what OS were they doing all this testing on? And what browser do you test on?
What behavior do you see in IE? If you're using IE8 or later, you'll have debug tools for sure, and you could always use Firebug Lite to debug your pages in IE without using the IE tools. Then you could see what the page is doing in IE. Perhaps it's throwing a JavaScript parsing error? Are there any icons in the window chrome in IE that would give a tip-off?
But I think that if you're trying to fix the issue in your second paragraph by relying on JavaScript to process the validations, you're doing it wrong. But I'm just one guy.

Related

How to prevent browser Ctrl+U?

I want to disable Ctrl+U in the browser to stop users from viewing the source (HTML + JavaScript) of a page.
This unfortunately is not how it works.
When a user visits your website, there's a lot going on behind the scenes:
The user requests a page on your site.
Your server does some fancy things.
Your server transforms those fancy things into something for the user's browser to use.
Your server sends its final product back to the browser.
The browser then gets a bunch of code, such as HTML or JavaScript.
The browser then reads that HTML and JavaScript and organizes it to look and work how it's supposed to on the user's screen.
Basically, another way of saying all this is that the HTML and JavaScript you want to hide are executed client-side. This means that the browser gets a bunch of code, executes it, and then displays the results to the user. If someone really wanted to see the source code of your website, they could easily bypass your Ctrl+U prevention. All they have to do is somehow tell the browser not to execute the code!
Ultimately, if a user really wants to see your source code, they will do it. There is no way to stop it. For this reason, it is recommended to keep things you need to remain a secret on the server-side code (such as your PHP).
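As a minimal sketch of keeping the secret server-side (Node.js with Express here; the endpoint and variable names are made up), the browser only ever receives the result, never the secret:
// server.js
var express = require('express');
var app = express();

var API_KEY = process.env.API_KEY; // the secret never leaves the server

app.get('/data', function (req, res) {
    // use API_KEY here to talk to an upstream service,
    // then return only the result to the browser
    res.json({ ok: true });
});

app.listen(3000);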
You cannot really prevent the user from viewing the HTML source content. A site can block right-click, but the fact is you can still press Ctrl+U in Firefox and Chrome to view the source!
It is impossible to effectively hide the HTML, JavaScript, or any other resource sent to the client. Impossible, and it isn't all that useful either.
Furthermore, don't try to disable right-click, as there are many other items on that menu (such as print!) that people use regularly.
Please have a look at this; I think it may help you.
Unfortunately, Ctrl+U is "View Source"; you can't disable browser functionality like that. What you can do is keep anything you don't want shown out of the code you send to the client.

(Semi-)Automatically logging in to websites

I want to automatically log in to specific websites, e.g. the groupware webinterface at work. My browser (Chrome on Linux, if that matters) saves passwords for me, but I want a complete auto-login, so that I don't even have to click the "login" button anymore.
I have investigated multiple ways to approach this, but none of them has turned out to be satisfying:
1. Use a Tampermonkey JavaScript which clicks the "login" button on the website
I wrote a custom JavaScript which was supposed to just click the "submit" button once I load the login page; Chrome was supposed to fill in the password fields. The idea sounded pretty straightforward. However, this is bad for two reasons. On the one hand, I cannot use Chrome's saved password: Chrome has a policy whereby the password field already displays the circles, but the password is not actually filled in and is not accessible from JavaScript until the user has performed a gesture such as clicking (see this Chromium issue), which kind of defeats the purpose of my JavaScript. I could get around this by additionally saving the password in localStorage (security wouldn't be compromised, as the saved passwords are not encrypted either), but this doesn't feel good. On the other hand, this breaks an (imho) significant security feature of Chrome. It is the same feature mentioned above which prevents XSS attacks from stealing login passwords, because whenever I load the login page, the password would be filled in and it would log me in.
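For reference, a minimal sketch of approach 1 as a Tampermonkey userscript (the URL and selector are made up; note the Chrome caveat above about the password not actually being filled in yet):
// ==UserScript==
// @name   Auto-login clicker
// @match  https://groupware.example.com/login*
// ==/UserScript==
window.addEventListener('load', function () {
    // click the submit button once the login page has loaded
    var button = document.querySelector('form#login input[type="submit"]');
    if (button) {
        button.click();
    }
});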
So what I would rather want is a special (if possible local) page which I can bookmark, but which will (probably) never be known to anyone performing an XSS attack on me.
2. Use a local HTML page which loads the login page, fills out the form and logs me in
This is a simple idea and would accomplish my goal, but of course it doesn't work because of the same-origin policy.
3. Use a script/program
This would theoretically work. I could write a program which downloads the login page, reads the form, submits it, and then transfers the cookies (or the login URL, if the form uses GET) to the browser. However, this would be a major piece of work, especially for the case where the forms use the POST method (I'd have to transfer cookies to a possibly running instance of Chrome).
Plus, I'd have to somehow tie this program to a local webserver or turn it into an extension so I could access it from within my browser. After all, opening a shell and typing a command is not really easier than clicking a login button.
4. Use cookies
This is not really an approach, but I mention it here for completeness' sake. By default, Chrome removes all cookies when I exit the browser. I can configure it to keep the cookies of specific websites so I don't have to log in again when I restart it. Some websites use only session cookies, though, so closing the last tab already (correctly) removes the cookies and I have to log in again. As a result, cookies only solve my problem for a few websites, but not all.
So my question is: Is there an easier way to accomplish automatic log-in without having to circumvent security features or write a large program?
P.S.: I know, this is a lot of effort to get around clicking a single button every now and then :)

Why does reloading a page in Firefox or Chrome cause the unload event handler to trigger after the browser calls the server?

I noticed something odd in the way Firefox and Chrome handle reloads, and I was wondering if anybody else has encountered this and perhaps knows why.
I have a window.onunload event handler where I set a cookie (in this case using YUI, but native JS or jQuery would work the same). This cookie normally gets sent in the HTTP request to the server, where the server-side code looks for it. If the cookie exists, the server can take a special action.
window.onunload = function () {
    Y.Cookie.set('reset_function', 'true', { path: '/' });
};
This works fine when the user is going from one page to another via a link. However, when the user reloads the page, the cookie is being set in Firefox and Chrome (i.e. I verified the code is being executed via Firebug/Chrome DevTools), but the cookie is NOT sent to the server, so the server can't take the special action.
Has anybody encountered this and know why? Is this behavior baked into these two browsers?
EDIT: When I debugged this further, on reload, Chrome and Firefox go to the server first and then go through the onunload event handler. I'm still unsure why the browser behaves like this.
I could be wrong, but my guess would be that it's just a performance optimization.
First, you should be aware that the onunload handler is (last I checked) not allowed to prevent the user from navigating off the page. If it were, I think there'd be a lot more malicious, inescapable websites!
What the related onbeforeunload handler can do (on some browsers, anyway) is prompt the user with a message and give the user the opportunity to cancel navigation.
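As a quick illustration of that prompt (a sketch; modern browsers show their own generic text rather than your string):
window.onbeforeunload = function () {
    // returning a string asks the browser to show a confirmation prompt,
    // giving the user a chance to cancel the navigation
    return 'Are you sure you want to leave this page?';
};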
Since this prompt takes some non-zero amount of time, the browser developers (for Chrome and Firefox) may have decided to go ahead and make the request first, so that if the user pauses for a moment and then confirms, the subsequent page will load as fast as possible.
I don't really know if this is accurate or not, but it's one possible explanation. Note that the onunload method is not part of any standard, so its behavior isn't exactly clearly defined anyway, and the browser makers are free to put the request before or after the event handler, as far as I can tell.

How to detect the event when a user has disabled JavaScript in his browser?

How can I detect the event when the user has disabled JavaScript in his browser? Until the user reloads the page, I can still send a command to the server not to use JavaScript.
Thanks for the answer.
Consider this scenario: the user loaded the page, then changed his mind about using JavaScript and changed the browser settings. The script is still running on the page until the page is reloaded. The question is how to catch the moment when the user changes the browser settings. Then I would be able to send a command to the server, via AJAX, not to use JavaScript.
You can't. JavaScript is required to detect events.
Follow the principles of Progressive Enhancement and Unobtrusive JavaScript instead.
The best you can do is a noscript tag. The noscript tag is only displayed if there is no JavaScript. Beyond that, you can't do much if the user has JavaScript disabled.
<noscript>JavaScript is disabled. Enable it or the world cannot function.</noscript>
I am not sure if it is deprecated or something.
Not sure what you mean, but detecting it is not possible, since JavaScript is disabled. I suggest you make the server wait for a JavaScript "command" (maybe you mean an AJAX call?), and if it doesn't arrive, then JS is off.
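A minimal sketch of that heartbeat idea (the endpoint name is made up): the page pings the server periodically, and when the pings stop arriving after a reload, the server can assume JS is off.
setInterval(function () {
    // tell the server that JavaScript is still running on this page
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/js-heartbeat', true);
    xhr.send();
}, 30000); // every 30 seconds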
If you want to know good practice, check Quentin's links.

Safari doesn't allow AJAX Requests after form submit?

I am writing a JavaScript-based upload progress meter. I want to use the standard multipart submit method (rather than submitting the file in an iframe). During the submit, I send AJAX requests that return the % complete of the upload and then update the progress meter accordingly.
This all works smoothly in Firefox and IE. However, Safari seems to prevent the completion of AJAX requests after the main form has been submitted. In the debugger, I can see the request headers, but it appears as though the response is never received.
Anyone aware of this, or how to get around it?
Yes, this is how Safari and any browser based on WebKit (i.e. Google Chrome) behave. I recently ran into this on a file upload progress meter also. I ended up using the same technique seen at http://drogomir.com/blog/2008/6/30/upload-progress-script-with-safari-support to get the AJAX to work. In my case, I didn't want to change the look of my application to the one Drogomir uses, but the technique itself worked. Essentially, the solution is to create a hidden iframe, only in Safari, that loads jQuery and your AJAX script. Then the top frame calls a function in that frame on form submit. All other browsers still work the same as before.
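A minimal sketch of that workaround (the frame page, form id, and startPolling function are all made up; the frame page is assumed to load jQuery and define startPolling):
// only WebKit-based browsers need the workaround
if (navigator.userAgent.indexOf('WebKit') !== -1) {
    var progressFrame = document.createElement('iframe');
    progressFrame.src = '/progress-frame.html'; // loads jQuery + the polling script
    progressFrame.style.display = 'none';
    document.body.appendChild(progressFrame);

    document.getElementById('upload-form').onsubmit = function () {
        // delegate progress polling to the hidden frame, whose requests
        // Safari does not suspend along with the submitting page
        progressFrame.contentWindow.startPolling();
    };
}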
This is a WebKit bug. See https://bugs.webkit.org/show_bug.cgi?id=23933
Are you using an iframe to submit your form to? I'm guessing that once the form is submitted, the page enters a state where no more modifications to the DOM can be made.
Check a tutorial such as this one for more information.
This actually sounds like correct behaviour to me, and I'm surprised that Firefox and IE behave otherwise.
It is akin to you attempting to leave a page and the page still interacting with you - sounds naughty!
I can see why this would be of benefit, but I would hope it is only the case if you are performing a POST to the URI you are currently accessing, or at worst the same domain.
