Asynchronous webRequest.onBeforeRequest URL inspection using Firefox native messages - javascript

I'm trying to create a URL inspector using Firefox native messages. The problem is that by the time the native application sends a verdict, the onBeforeRequest listener has already released the request, so the redirection doesn't happen.
Can you please help me make my extension wait for the reply for up to 2 seconds and redirect the request if the answer is "0"?
var port = browser.runtime.connectNative("ping_pong");

function inspectURL(requestDetails) {
  console.log("Loading: <" + requestDetails.url + ">");
  port.postMessage(requestDetails.url);
  console.log("Posting complete <" + requestDetails.url + ">");
}

port.onMessage.addListener((response) => {
  console.log("Received: <" + response + ">");
  if (response == "1") {
    console.log("Good url!!!");
  } else {
    console.log("BAD url - redirecting!!!");
    return {
      redirectUrl: "https://38.media.tumblr.com/tumblr_ldbj01lZiP1qe0eclo1_500.gif"
    };
  }
});

browser.webRequest.onBeforeRequest.addListener(
  inspectURL,
  { urls: ["<all_urls>"] },
  ["blocking"]
);

Firefox
Firefox supports asynchronous webRequest blocking/modification listeners from Firefox 52 onward (MDN: webRequest: Modifying requests, and multiple other pages):
From Firefox 52 onwards, instead of returning BlockingResponse, the listener can return a Promise which is resolved with a BlockingResponse. This enables the listener to process the request asynchronously.
So, in your webRequest.onBeforeRequest listener, you return a Promise which you resolve with the BlockingResponse.
You will need to store a list of requests for which you have asked your port for information. The response from your port needs to uniquely identify the request it is responding to. Keep in mind that this is all asynchronous, so you can have multiple requests in flight at the same time; you must track them appropriately and resolve only the matching Promise. Assuming the responses from your port don't change rapidly, you should also cache the URLs which have already been checked (both good and bad), so you can respond immediately when a URL comes up again.
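A rough sketch of that approach, assuming the native application echoes back the URL it was asked about together with its verdict ("1" for good, anything else for bad); the pending-request map, the 2-second timeout and the reply format are illustrative assumptions, not part of any API:

var port = browser.runtime.connectNative("ping_pong");
var pending = new Map();   // url -> array of resolve callbacks still waiting
var verdicts = new Map();  // url -> cached verdict for already-checked URLs

port.onMessage.addListener(function (response) {
  // Assumption: the native app replies with an object like {url: "...", verdict: "0"|"1"}.
  verdicts.set(response.url, response.verdict);
  var resolvers = pending.get(response.url) || [];
  pending.delete(response.url);
  resolvers.forEach(function (resolve) { resolve(response.verdict); });
});

function askNative(url) {
  return new Promise(function (resolve) {
    if (verdicts.has(url)) {          // answer already known, respond immediately
      resolve(verdicts.get(url));
      return;
    }
    var list = pending.get(url) || [];
    list.push(resolve);
    pending.set(url, list);
    port.postMessage(url);
    setTimeout(function () { resolve("1"); }, 2000); // give up after 2 s and let the request through
  });
}

browser.webRequest.onBeforeRequest.addListener(
  function (requestDetails) {
    // Returning a Promise<BlockingResponse> is supported from Firefox 52 onward.
    return askNative(requestDetails.url).then(function (verdict) {
      return verdict === "1"
        ? {}
        : { redirectUrl: "https://38.media.tumblr.com/tumblr_ldbj01lZiP1qe0eclo1_500.gif" };
    });
  },
  { urls: ["<all_urls>"] },
  ["blocking"]
);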
Chrome
What you desire is not possible in Chrome. You will need to resolve your issue in some other way.

Originally: there's no way at all. It has been possible since Firefox 52, though, see the answer above. It's still not possible in Chrome.
Native Messaging is an asynchronous API. After posting a message, you will not receive a reply until you return to the event loop (i.e. until your current code terminates).
However, the blocking WebRequest API requires a synchronous reply. This is a core limitation: an asynchronous reply may never come, or may come after an uncertain delay, and the network stack won't wait for that to happen. It could, in principle, but the design of the API deliberately forbids it.
Basically: even if the reply is ready, your code will not receive it until inspectURL terminates, at which point WebRequest has already carried on with the request. There is no way in JavaScript to make it synchronous.

Related

Run a bash script through CGI on closing of browser window [duplicate]

I'm trying to find out when a user left a specified page. There is no problem finding out when he used a link inside the page to navigate away, but I also need to capture something like when he closed the window or typed another URL and pressed Enter. The second one is not so important, but the first one is. So here is the question:
How can I see when a user closed my page (capture the window.close event), and then... it doesn't really matter (I need to send an AJAX request, but if I can get it to run an alert, I can do the rest).
Updated 2021
TL;DR
Beacon API is the solution to this issue (on almost every browser).
A beacon request is supposed to complete even if the user exits the page.
When should you trigger your Beacon request ?
This will depend on your use case. If you are looking to catch any user exit, visibilitychange (not unload) is the last event reliably observable by developers in modern browsers.
NB: As long as implementation of visibilitychange is not consistent across browsers, you can detect it via the lifecycle.js library.
<!-- lifecycle.js (1K) for cross-browser compatibility -->
<!-- https://github.com/GoogleChromeLabs/page-lifecycle -->
<script defer src="/path/to/lifecycle.js"></script>
<script defer>
  lifecycle.addEventListener('statechange', function(event) {
    if (event.originalEvent == 'visibilitychange' && event.newState == 'hidden') {
      var url = "https://example.com/foo";
      var data = "bar";
      navigator.sendBeacon(url, data);
    }
  });
</script>
Details
Beacon requests are supposed to run to completion even if the user leaves the page - switches to another app, etc - without blocking user workflow.
Under the hood, it sends a POST request along with the user credentials (cookies), subject to CORS restrictions.
var url = "https://example.com/foo";
var data = "bar";
navigator.sendBeacon(url, data);
The question is when to send your Beacon request. Especially if you want to wait until the last moment to send session info, app state, analytics, etc.
It used to be common practice to send it during the unload event, but changes to page lifecycle management - driven by mobile UX - killed this approach. Today, most mobile workflows (switching to new tab, switching to the homescreen, switching to another app...) do not trigger the unload event.
If you want to do things when a user exits your app/page, it is now recommended to use the visibilitychange event and check for transitioning from passive to hidden state.
document.addEventListener('visibilitychange', function() {
  if (document.visibilityState == 'hidden') {
    // send beacon request
  }
});
The transition to hidden is often the last state change that's reliably observable by developers (this is especially true on mobile, as users can close tabs or the browser app itself, and the beforeunload, pagehide, and unload events are not fired in those cases).
This means you should treat the hidden state as the likely end to the user's session. In other words, persist any unsaved application state and send any unsent analytics data.
Details of the Page Lifecycle API are explained in this article.
However, implementations of the visibilitychange event, as well as of the Page Lifecycle API, are not consistent across browsers.
Until browser implementation catches up, using the lifecycle.js library and page lifecycle best practices seems like a good solution.
<!-- lifecycle.js (1K) for cross-browser compatibility -->
<!-- https://github.com/GoogleChromeLabs/page-lifecycle -->
<script defer src="/path/to/lifecycle.js"></script>
<script defer>
  lifecycle.addEventListener('statechange', function(event) {
    if (event.originalEvent == 'visibilitychange' && event.newState == 'hidden') {
      var url = "https://example.com/foo";
      var data = "bar";
      navigator.sendBeacon(url, data);
    }
  });
</script>
For more numbers about the reliability of vanilla page lifecycle events (without lifecycle.js), there is also this study.
Adblockers
Adblockers seem to have options that block sendBeacon requests.
Cross site requests
Beacon requests are POST requests that include cookies and are subject to CORS spec. More info.
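For illustration, a beacon payload can be a string, a Blob, FormData or URLSearchParams, and the resulting Content-Type is what decides how the CORS rules apply; the URL and data below are placeholders:

// A plain string is sent as text/plain, a CORS-safelisted type that needs no preflight.
navigator.sendBeacon("https://example.com/exit", "session=1234");

// A Blob lets you declare an explicit type such as application/json; for a
// cross-origin endpoint that type is not CORS-safelisted, so the request may
// be subject to a CORS preflight the server has to handle.
var blob = new Blob([JSON.stringify({ event: "exit" })], { type: "application/json" });
navigator.sendBeacon("https://example.com/exit", blob);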
There are unload and beforeunload javascript events, but these are not reliable for an Ajax request (it is not guaranteed that a request initiated in one of these events will reach the server).
Therefore, doing this is strongly discouraged, and you should look for an alternative.
If you definitely need this, consider a "ping"-style solution. Send a request every minute basically telling the server "I'm still here". Then, if the server doesn't receive such a request for more than two minutes (you have to take into account latencies etc.), you consider the client offline.
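A minimal client-side sketch of that ping approach (the endpoint name and the interval are made up for illustration):

// Tell the server every 60 seconds that this client is still alive.
setInterval(function () {
  navigator.sendBeacon("/api/heartbeat", "alive"); // or a small fetch/XHR
}, 60 * 1000);
// Server side: if no heartbeat arrives for ~2 minutes, treat the client as gone.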
Another solution would be to use unload or beforeunload to do a Sjax request (Synchronous JavaScript And XML), but this is not recommended at all. Doing this will basically freeze the user's browser until the request is complete, which they will not like (even if the request takes little time).
1) If you're looking for a way that works in all browsers, then the safest way is to send a synchronous AJAX request to the server. It is not a good method, but at least make sure that you are not sending too much data to the server, and that the server is fast.
2) You can also use an asynchronous AJAX request, and use the ignore_user_abort function on the server (if you're using PHP). However, ignore_user_abort depends a lot on server configuration. Make sure you test it well.
3) For modern browsers you should not send an AJAX request. You should use the new navigator.sendBeacon method to send data to the server asynchronously, without blocking the loading of the next page. Since you want to send data to the server before the user moves away from the page, you can use this method in an unload event handler.
$(window).on('unload', function() {
  var fd = new FormData();
  fd.append('ajax_data', 22);
  navigator.sendBeacon('ajax.php', fd);
});
There also seems to be a polyfill for sendBeacon. It resorts to sending a synchronous AJAX request if the method is not natively available.
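The idea behind such a polyfill is roughly the following (a hedged sketch of the concept, not the actual polyfill's code):

if (!("sendBeacon" in navigator)) {
  navigator.sendBeacon = function (url, data) {
    // Fall back to a synchronous XHR so the request completes before the page unloads.
    var xhr = new XMLHttpRequest();
    xhr.open("POST", url, false); // false = synchronous
    try {
      xhr.send(data);
    } catch (e) {
      return false;
    }
    return true;
  };
}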
IMPORTANT FOR MOBILE DEVICES: Please note that the unload event handler is not guaranteed to fire on mobile devices, but the visibilitychange event is. So for mobile devices, your data collection code may need a bit of tweaking.
You may refer to my blog article for the code implementation of all the 3 ways.
I also wanted to achieve the same functionality and came across this answer from Felix ("it is not guaranteed that a request initiated in one of these events will reach the server").
To make the request reach the server, we tried the code below:
onbeforeunload = function() {
  // Your code goes here.
  return "";
}
We are using the IE browser, and now when the user closes the browser he gets the confirmation dialogue because of return "";. The dialogue waits for the user's confirmation, and this waiting time lets the request reach the server.
Years after posting the question, I made a much better implementation using Node.js and socket.io (https://socket.io) (you can use any kind of socket for that matter, but that was my personal choice).
Basically I open up a connection with the client, and when it hangs up I just save data / do whatever I need. Obviously this cannot be used to show anything / redirect the client (since you are doing it server side), but it is what I actually needed back then.
io.on('connection', function(socket){
  socket.on('disconnect', function(){
    // Do stuff here
  });
});
So... nowadays I think this would be a better approach than the unload version (although harder to implement because you need Node, sockets, etc.; it is not that hard, though, and should take 30 minutes or so the first time you do it).
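For completeness, the client side of that setup is just a connection (the server URL is a placeholder):

// Client side: load /socket.io/socket.io.js, then connect once on page load.
// When the tab closes or the user navigates away, the connection drops and the
// server-side 'disconnect' handler above runs.
var socket = io("https://your-server.example.com");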
The selected answer is correct that you can't guarantee that the browser sends the xhr request, but depending on the browser, you can reliably send a request on tab or window close.
Normally, the browser closes before xhr.send() actually executes. Chrome and Edge appear to wait for the JavaScript event loop to empty before closing the window. They also fire the xhr request in a different thread than the JavaScript event loop. This means that if you can keep the event loop busy for long enough, the xhr will successfully fire. For example, I tested sending an xhr request, then counting to 100,000,000. This worked very consistently in both Chrome and Edge for me. If you're using AngularJS, wrapping your call to $http in $apply accomplishes the same thing.
IE seems to be a little different. I don't think IE waits for the event loop to empty, or even for the current stack frame to empty. While it will occasionally send a request correctly, what seems to happen far more often (80%-90% of the time) is that IE closes the window or tab before the xhr request has completely executed, which results in only a partial message being sent. Basically, the server receives a POST request, but there's no body.
For posterity, here's the code I used attached as the window.onbeforeunload listener function:
var xhr = new XMLHttpRequest();
xhr.open("POST", <your url here>);
xhr.setRequestHeader("Content-Type", "application/json;charset=UTF-8");
var payload = {id: "123456789"};
xhr.send(JSON.stringify(payload));
for(var i = 0; i < 100000000; i++) {}
I tested in:
Chrome 61.0.3163.100
IE 11.608.15063.0
Edge 40.15063.0.0
Try this one. I solved this problem in JavaScript by sending an ajax call to the server on browser or tab closing. I had a problem with page refreshes, because the onbeforeunload function also fires when the page is refreshed. performance.navigation.type == 1 should isolate a refresh from a close (on the Mozilla browser).
$(window).bind('mouseover', (function () { // detecting DOM elements
  window.onbeforeunload = null;
}));

$(window).bind('mouseout', (function () { // detecting event out of DOM
  window.onbeforeunload = ConfirmLeave;
}));

function ConfirmLeave() {
  if (performance.navigation.type == 1) { // detecting page refresh (doesn't work in every browser)
  }
  else {
    logOutUser();
  }
}

$(document).bind('keydown', function (e) { // detecting Alt+F4 closing
  if (e.altKey && e.keyCode == 115) {
    logOutUser();
  }
});

function logOutUser() {
  $.ajax({
    type: "POST",
    url: GWA("LogIn/ForcedClosing"), // example controller/method
    async: false
  });
}
I agree with Felix's idea, and I have solved my problem with that solution. Now I want to clarify the server-side part (a rough sketch follows below):
1. Send a request from the client side to the server.
2. Save the time of the last received request in a variable.
3. Check the server time and compare it with the time of the last received request.
4. If the difference is more than the time you expect, start running the code you want to run when the window is closed.
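A rough sketch of those steps in Node.js with Express; the endpoint name, the two-minute threshold and the in-memory store are all illustrative assumptions:

var express = require("express");
var app = express();

var lastSeen = {}; // userId -> timestamp of the last heartbeat received

// Steps 1-2: the client pings this endpoint periodically; store the time of the last request.
app.post("/heartbeat/:userId", function (req, res) {
  lastSeen[req.params.userId] = Date.now();
  res.sendStatus(204);
});

// Steps 3-4: periodically compare the server time with the stored timestamps.
setInterval(function () {
  var now = Date.now();
  Object.keys(lastSeen).forEach(function (userId) {
    if (now - lastSeen[userId] > 2 * 60 * 1000) { // no ping for 2 minutes
      delete lastSeen[userId];
      // ...run the "window was closed" code for this user here...
    }
  });
}, 30 * 1000);

app.listen(3000);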
Use:
<body onUnload="javascript:">
It should capture everything except shutting down the browser program.

Wait for an event to trigger inside Node.js process

First of all, I am well aware that Node.js is non-blocking before anything else, but in this very specific case, it must be blocking and waiting here.
I have an authentication process that works that way (using APIs, I didn't design this so I cannot modify the way the auth works):
I send a PUT request to a REST API, and I get an HTTP response code that indicates whether the API understood the request.
The server I just requested through its API sends the full response (including error codes, etc) through an XMPP protocol.
This means, when I send the request, I cannot know what happened next, and must wait for the XMPP event to trigger (basically, an on("message", callback) event).
I'd like to know how to work with this with Node.js.
Two more things to know:
1) I'm working on a client/server architecture, and my Node.js server is doing this authentication process, and sending a response through the websocket to the client and waiting for a socket answer is out of the question (not my call, but my boss wants this process to be done in one pass).
2) It must not be done with the client socket and must go through the full Node.js process for various reasons.
Thanks a lot for your help people! \o/
Sorry for not answering previously, we had some severe hardware failure at work.
Anyway, I'm not answering one of your comments directly because I found a solution I prefer, though I thank you for your help. I've decided to use a promise and to await its result to be sure to get a proper response.
Here is the code:
var answer = await new Promise((accept, reject) => {
  // If there is no stanza within 30 seconds, then the process failed or the stanza got missed.
  var timer = setTimeout(() => {
    reject("timed out");
  }, (30 * 1000));
  // Waiting for the xmpp event to trigger.
  xmpp.on("stanza", function(stanza) {
    // Processing of the received stanza goes here, then resolve with the result.
    clearTimeout(timer);
    accept(stanza);
  });
});
@gkatzioura's solution was interesting, but it looked a little heavy on bandwidth and we are working on large-scale applications; or maybe I didn't fully understand it, in which case it's my mistake.
@pspi's solution was also interesting, but it would be a problem because the XMPP event listener would sit inside the request, and the PUT request needs to send a body on its end() event, so it wouldn't really work for what I want to do. I think that's because my original post was somewhat unclear.
Thanks a lot for your help guys! :]
I don't know enough XMPP, but would this just be a case of "putting the dependent logic inside the callback"?
request.put(..., function () {
  // dependent xmpp logic inside request callback
  xmpp.on(..., function () {
    // xmpp and put response received, now talk back to socket client
    socket.send(...);
  });
});
In your case I would proceed with an event emitter (or anything in a publish/subscribe fashion).
Fire your HTTP call, and inside its handler add an emitter listener with a check that the event is for the corresponding authentication.
Meanwhile, once your XMPP connection receives the authorization, it emits a message.
The listener receives that message and uses the callback of the HTTP call.
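A hedged sketch of that pattern using Node's built-in EventEmitter; the event name, the id used to correlate the stanza with the request, and the sendPutRequest helper are assumptions for illustration:

var EventEmitter = require("events").EventEmitter;
var authEvents = new EventEmitter();

// XMPP side: when the authorization stanza arrives, publish it.
xmpp.on("stanza", function (stanza) {
  // Assumption: the stanza carries an id that matches the original HTTP request.
  authEvents.emit("auth:" + stanza.attrs.id, stanza);
});

// HTTP side: fire the PUT, then wait for the matching event before answering the client.
function authenticate(authId, callback) {
  authEvents.once("auth:" + authId, function (stanza) {
    callback(null, stanza); // both responses received, now talk back to the client
  });
  sendPutRequest(authId);   // placeholder for the actual REST call
}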

Javascript / angular: perform asynchronous http from onunload

I'm working with a web app that locks resources on a server. To unlock them, it has to delete a resource on the server with an HTTP DELETE. I know this isn't reliable and there's a periodic cleanup running as well which unlocks them, but the goal is to unlock the resource as soon as possible.
I cannot change the locking architecture (it's not my system), I just have to make the best stab at unlocking.
One point where I need to unlock is when the tab or browser is closed. First, I'm handling the onbeforeunload and if the document is dirty, prompting the user for confirmation that they want to close:
$window.onbeforeunload = function() {
if (documentIsDirty) {
return "Some prompt text";
}
};
I can't unlock from within onbeforeunload, as the user may choose to cancel the close. But there's no event (correct me if I'm wrong) between onbeforeunload and onunload.
If I try to make the call from in onunload, then the tab/session gets destroyed as soon as the onunload function returns. Trouble is, that's before the http request has completed, and it turns out that the resource doesn't actually get unlocked.
$window.onunload = function() {
$http.delete('/a/lock/url');
// I've tried forcing a digest cycle to try and flush the request
// through, but it never gets sent either way
$rootScope.$digest();
};
Now, I know it's anathema to actually block in Javascript, but it appears that once onunload returns, that's all she wrote.
Is there any way to block until the http request has actually completed, and prevent onunload from returning until then?
[UPDATE]
Solution was as below - use XMLHttpRequest synchronously. It's noisily deprecated but does (at the time of writing) still work at least in Chrome.
var request = new XMLHttpRequest();
request.open('DELETE', url, false);
request.setRequestHeader('X-XSRF-TOKEN', myXSRFToken);
request.send();
$http will always do requests asynchronously because internally it's just using XMLHttpRequest and always passing true as the third parameter to a request's open function. From the MDN documentation for XMLHttpRequest's open function:
An optional Boolean parameter, defaulting to true, indicating whether or not to perform the operation asynchronously. If this value is false, the send() method does not return until the response is received.
If you want to do a synchronous request you can just use XMLHttpRequest directly and pass false as the third parameter to open. Since this is when the site is closing it's not really necessary to use Angular's $http anyway.
Recently, I faced the same issue while doing this kind of activity and resolved it by using navigator.sendBeacon(). The navigator.sendBeacon() method asynchronously sends a small amount of data over HTTP to a web server. For recent browsers you could do something like:
window.addEventListener('beforeunload', function (event) {
  var data = new FormData();
  // for CSRF token
  var token = $('meta[name="csrf-token"]').attr('content');
  data.append("key", value);
  data.append("authenticity_token", token);
  navigator.sendBeacon("URL", data);
  // Cancel the event as stated by the standard.
  event.preventDefault();
  // Chrome requires returnValue to be set.
  event.returnValue = 'Are you sure you want to leave this page without saving?';
});
For more details, check out Navigator.sendBeacon() and the Window: beforeunload event.

Handling HTTP 401 with WWW-Authenticate in Cordova / Ionic iOS application

I'm currently working on a mobile app built on Cordova and Ionic. I am dealing with a third-party API (i.e. it cannot, and will not be changed for this app to work).
When a user of the app is unauthenticated - be that if their session has expired or otherwise - the API responds with an HTTP 401, with a WWW-Authenticate header.
In a browser while developing this is fine, but on an iPhone or in a simulator the authentication prompt never appears, and the app has to wait out the timeout period for the request. When that timeout is reached, the request is cancelled. This means that in the JavaScript we simply get back an HTTP status of 0, with no real data to identify whether there was a timeout or an authentication issue.
Currently, I've put in place some educated guesswork like checking if the phone has connectivity when a timeout occurs etc, but this is not an ideal solution as the user still has to wait for that timeout, and it's not always correct.
How can I check when the HTTP 401 dialog has appeared and is expecting a response? I need to be able to identify when an actual 401 occurs, and when a request simply times out.
If there is a method in JavaScript to accomplish then, then that'd be great. A native solution would also work, be it a plugin or otherwise.
I am dealing with the same issue at the moment.
The issue is that when a 401 is returned, it carries a WWW-Authenticate header, which tells the browser to pop up that little login prompt for you to try again. This isn't handled "right" (depends on who you ask, I guess) in the iOS web container, which results in Cordova never seeing the response.
The cordova bug is logged here: https://issues.apache.org/jira/browse/CB-2415
I don't understand how such a major issue hasn't been resolved yet, but I am sure there are some technicalities around it. If you check the updates, you will see that Tobias Bocanegra ( https://github.com/tripodsan/cordova-ios/commit/5f0133c026d6e21c93ab1ca0e146e125dfbe8f7e ) added a "quick hack" to solve the problem. Maybe that helps you further. It didn't help me in my situation.
What I did in my case for a temporary fix:
I passed the http requests into the loading modal, which has a cancel button. So, it is up to the user to cancel or just wait. It's horrible, but it worked in my case. (Internal app, etc.)
Well, I am not sure why the third-party API wouldn't send you normal HTTP error codes. If you are connecting to the API with $http, you can add response interceptors to it; for example, read the next article:
http://codingsmackdown.tv/blog/2013/01/02/using-response-interceptors-to-show-and-hide-a-loading-widget/
Within the following error handler you can add some code to evaluate the HTTP status code:
function error(response) {
  // Added code for specific HTTP error codes
  if (response.status === 401) {
    console.log('Received 401');
  }
  // get $http via $injector because of circular dependency problem
  $http = $http || $injector.get('$http');
  if ($http.pendingRequests.length < 1) {
    $('#loadingWidget').hide();
  }
  return $q.reject(response);
}
Also see the AngularJS documentation about interceptors.
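Registering such an interceptor looks roughly like this ('myApp' is a placeholder module name, and only the 401 check is shown):

angular.module('myApp').config(function ($httpProvider) {
  $httpProvider.interceptors.push(function ($q) {
    return {
      responseError: function (rejection) {
        if (rejection.status === 401) {
          console.log('Received 401');
          // redirect to login, refresh the session, etc.
        }
        return $q.reject(rejection);
      }
    };
  });
});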
A native solution would also work, be it a plugin or otherwise.
I had the same problem and I was able to fix it with the Cordova HTTP plugin. Just install it via ionic plugin add cordova-plugin-advanced-http (check the documentation here). Then your HTTP calls will be done natively and not out of the webView. Responses with a 'WWW-Authenticate' header will then no longer time out, and you can properly handle a 401.
You can use it in your code like this:
try { // important since it will not work in dev mode in your browser (e.g. Chrome)
  if (cordova && cordova.plugin && cordova.plugin.http) {
    var server_url = 'https://example.com';
    var your_endpoint = '/auth-me-example';
    cordova.plugin.http.useBasicAuth('some_user', 'and_password');
    cordova.plugin.http.setHeader(server_url, 'Content-type', 'application/json');
    cordova.plugin.http.get(server_url + your_endpoint, {}, {}, function (response) {
      console.log('response: ', JSON.stringify(response));
    }, function (response) {
      console.error('error response: ', JSON.stringify(response));
    });
  }
} catch (e) {
  console.error('could not make native HTTP request!');
}
PS: if you are using Angular 2+, npm install --save @ionic-native/http is also quite useful.

IE hang for 5 minutes when calling synchronous xmlhttprequest

I have a web application and use ajax to call back to my webserver to fetch data.
Sometimes (at rather unpredictable moments, but it can be reproduced) IE hangs completely for 5 minutes (the window says "Not Responding") and then comes back, and the xmlhttprequest object responds with error 12002.
The way I can reproduce it is as follows.
Open window (B) from the main window (A) using a button.
Window A calls synchronous ajax (PROC1) when the button is clicked to open window B. PROC1 runs fine.
The new window (B) has ajax code (PROC2) and calls the server asynchronously. Runs fine.
The user closes window B after PROC2 completed but before the data is returned.
In the main window (A) the user clicks the button again. PROC1 runs again, but now the send() call blocks for 5 minutes.
Please help. I've been looking for 3 days.
Please note:
* I can't test it in firefox (the app is not firefox compatible)
* I have to use synchronous calls (that's the way the app is constructed and it would take too much developer effort to rewrite it)
Why does this happen and how do I fix it?
You're right Jaap, this is related to Internet Explorer's connection limit of 2. For some reason, IE doesn't release connections to AJAX requests performed in closed windows.
I have a very similar situation, only slightly simpler:
User clicks in Window A to open Window B
Window B performs an Ajax call that takes awhile
Before the Ajax call returns, user closes Window B. The connection to this call "leaks".
Repeat 1 more time until both available connections are "leaked"
Browser becomes unresponsive
One technique you can try (mentioned in the article you found) that does seem to work is to abort the XmlHttp request in the unload event of the page.
So something like:
var xhr = null;

function unloadPage() {
  if (xhr !== null) {
    xhr.abort();
  }
}

// attach the handler so any in-flight request is aborted when the page unloads
window.onunload = unloadPage;
Another option is to use synchronous AJAX calls, which will block until the call returns, essentially locking the browser. This may or may not be acceptable given your particular situation.
// the 3rd param is whether the call is asynchronous
xhr.open( 'get', 'url', false );
Finally, as mentioned elsewhere, you can adjust the maximum number of connections IE uses in the registry. Expecting visitors to your site to do this however isn't realistic, and it won't actually solve the problem -- just delay it from happening. As a side-note, IE8 is going to allow 6 concurrent connections.
Thanks for answering Martijn.
It didn't solve my issues. I think what I'm seeing is best described on this website:
http://bytes.com/groups/javascript/643080-ajax-crashes-ie-close-window
In my situation I have an unstable connection or a slow webserver; when the connection is too slow and the browser and the webserver still hold an open connection, the browser freezes.
By default, Internet Explorer only allows two concurrent connections to the same website for download purposes. If you try to fire up more than this, IE stalls until one of the previous requests finishes, at which point the next request will complete. I believe (although I could be wrong) this was put in place to prevent overloading websites with many concurrent downloads at a time. There is a registry hack to circumvent this lock.
I found these instructions kicking around the internet which alleviated my problems - I can't promise it will work for your situation, but the multi-connection limit you're facing appears related:
1. Click on the Start button and select Run.
2. On the Run line, type Regedt32.exe and hit Enter. This will launch the Registry Editor.
3. Locate the following key in the registry: HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings
4. Click on the Internet Settings key.
5. Go to the Edit menu and point to NEW.
6. Click DWORD Value.
7. Type MaxConnectionsPer1_0Server for the name of this DWORD Value.
8. Double-click on the MaxConnectionsPer1_0Server key you just created and enter the following information: Value data: 10. Base: Decimal.
9. When finished, press OK.
10. Repeat steps 4 through 9, this time naming the key MaxConnectionsPerServer and assigning it the same value as indicated in step 8.
11. When finished, press OK.
12. Close the Registry Editor.
Of course, I would use these in conjunction with the abort() call previously mentioned. In tandem, they should fix the issue.
IE5 and IE6 do indeed hang when attempting to receive data from a PHP script. The reason is that these browsers cannot decide when all of the data has been received and the connection can be closed, so they wait until the connection expires (hence the 5 or 10 minute hang). A way to solve this is to tell the browser how much data it will receive. In PHP you can do that using output buffering, for example as follows:
ob_start();
echo $html_content;
header( 'Connection: close' );
header( 'Content-Length: '.ob_get_length() );
flush();
ob_end_flush();
This is a solution when one is just loading a normal web page. When one is using AJAX GET via the Microsoft.XMLHTTP object, it is enough to send the "Connection: close" header with the GET request, like:
r.request.open( "GET", url, true );
r.request.setRequestHeader( "Connection", "close" );
r.request.send();
Winsock error 12002 means the following, according to MSDN:
ERROR_INTERNET_TIMEOUT (12002): The request has timed out.
Winsock is the underlying socket transfer object for XMLHTTP in IE, so any error that's not in the HTTP error range (300, 400, 500, etc.) is almost always a Winsock error.
What wasn't clear from your question is whether the same resource is being queried the second time round. You could force a new, uncached resource by appending:
'?uid=' + Math.random()
to the URL, which might solve the issue.
Another solution might be to attach a function to the "onbeforeunload" event on the window object to call abort() on any active XMLHTTP request just before window B is closed.
Hope these 2 pointers solve your bug.
All these posts about disabling the PDF reader and that sort of thing will not resolve your problem.
The sure shot is to run Windows Update and keep the system up to date; this issue gets resolved by itself.
Experience speaks ;)
HydTechie
