Web workers behave differently in WebKit than in Firefox - javascript

I have a web application that works just fine in modern WebKit-based browsers (http://montecarlo-tester.appspot.com/). Basically it uses a web worker to fetch data from a server and then sends results back after performing some computations.
It works just fine in Chrome/Safari (no errors in the console), but when I try to use it in Firefox, it doesn't. I've deduced that somehow the variable 'iterations' is not set properly in Firefox. Unfortunately, Firefox lacks a debugger for web workers, and JavaScript has function scoping, so it's really hard to pinpoint where the problem is. I've posted the JavaScript code for my web worker, and I was wondering if anybody could point out where I went wrong:
importScripts('/static/js/mylibs/jquery.hive.pollen-mod.js');

$(function (data) {
    main();
    //while(main());
    close();
});

function main() {
    //make an ajax call to get a param
    var iterations; //value will be set by server response
    var key;        //key of the datastore object
    var continueloop = true;
    p.ajax.post({
        url: '/getdataurl',
        dataType: "json",
        success: function (responseText) {
            if (responseText === null) {
                var workermessage = {
                    "log": "responseText is null. Either the server has issues or we have run out of stuff to compute."
                };
                $.send(workermessage);
                continueloop = false;
            }
            iterations = responseText.iterationsjob;
            key = responseText.key;
        }
    });
    if (continueloop === false) {
        return false;
    }
    //here is where I think the problems begin. In chrome/safari, iterations = 1000.
    //In Firefox however, iterations = null. As a result, everything after that does not work.
    var i, x, y, z;
    var count = 0;
    var pi;
    start = new Date();
    for (i = 0; i < iterations; i++) {
        x = Math.random();
        y = Math.random();
        z = x*x + y*y;
        if (z <= 1.0) {
            count++;
        }
    } //end for loop
    pi = count/(iterations)*4.0;
    end = new Date();
    result = {
        "estimated_pi": pi,
        "num_iter": iterations,
        "duration_ms": end.valueOf() - start.valueOf(),
        "key": key
    };
    //send results to the server
    p.ajax.post({
        url: "/resultshandler",
        dataType: 'json',
        data: result,
        success: function () {
            //do nothing!
        }
    });
    $.send(result);
    return true; //persists the loop
}

You're doing an async XHR, then immediately doing a loop trying to use its results. I have no idea why this possibly works in Chrome, but it's definitely racy. Have you tried passing "sync:true" in your post options?
Edit: Oh, never mind. I see why your code works. The hive.pollen script has this wonderful bit:
sync: navigator.userAgent.toLowerCase().indexOf('safari/') != -1 ? false : true,
So by default it's doing a sync XHR in Chrome/Safari and an async one in everything else. (It passes options.sync as the value for the async argument to XMLHttpRequest.open, which is backwards, but whatever; it does mean that you actually need to pass sync: false at the call site to get synchronous behavior.) And since you don't specify whether you want sync or async, you get sync in Chrome/Safari and async in Firefox.
Oh, and the script has this wonderful comment before that line:
// TODO: FIX THIS.
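Either way, the robust fix is not to depend on the library's sync default at all: do everything that needs the response inside the success callback. Here's a rough sketch, reusing only the helpers the question already uses (p.ajax.post and $.send); it's an illustration, not a drop-in patch:

function main() {
    p.ajax.post({
        url: '/getdataurl',
        dataType: "json",
        success: function (responseText) {
            if (responseText === null) {
                $.send({ "log": "responseText is null. Nothing left to compute." });
                return;
            }
            // Everything that needs the response now runs *after* it arrives.
            var iterations = responseText.iterationsjob;
            var key = responseText.key;
            var count = 0;
            var start = new Date();
            for (var i = 0; i < iterations; i++) {
                var x = Math.random();
                var y = Math.random();
                if (x*x + y*y <= 1.0) {
                    count++;
                }
            }
            var result = {
                "estimated_pi": count / iterations * 4.0,
                "num_iter": iterations,
                "duration_ms": new Date().valueOf() - start.valueOf(),
                "key": key
            };
            p.ajax.post({ url: "/resultshandler", dataType: 'json', data: result });
            $.send(result);
        }
    });
}

With this shape the commented-out while(main()) looping idea no longer applies; you'd kick off the next round from inside the success callback instead.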

Related

Firefox CORS issue with JSON

I'm trying to use the API from https://www.themoviedb.org/. (The key is free and can be changed easily, so I'll include it because without it, you can’t even test their functions).
Now my JavaScript is working fine in FF when it's hosted locally, but not on GitHub Pages.
Here is a function that doesn't work. The error is:
NetworkError: A network error occurred.
…and it appears to happen after bhttp.send().
function getMovieDetails() {
    var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
    var bhttp = new XMLHttpRequest();
    bhttp.open("GET", reqURL, false);
    bhttp.setRequestHeader("Content-type", "json");
    bhttp.send();
    var response = JSON.parse(bhttp.responseText);
    var str = JSON.stringify(response, null, 2);
    return response;
}
console.log(getMovieDetails());
It works fine in Chrome. Googling appears to indicate it’s a CORS problem, but as far as I know GitHub pages supports CORS, so I don't know what I'm doing wrong.
I'm not a Firefox user, so you will need to test this. But if the theory that the blocking synchronous request is the problem is true, this should work.
I've modified it to use a simple callback. Personally I wouldn't use callbacks but would turn this into promises, but that's another question :)
function getMovieDetails(callback) {
    var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
    var bhttp = new XMLHttpRequest();
    bhttp.open("GET", reqURL, true);
    bhttp.setRequestHeader("Content-type", "json");
    bhttp.onload = function () {
        if (bhttp.readyState === 4) {
            if (bhttp.status === 200) {
                callback(JSON.parse(bhttp.responseText));
            } else {
                console.error(bhttp.statusText);
            }
        }
    };
    bhttp.send();
}
getMovieDetails(function (movie) {
    console.log(movie);
});
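For what it's worth, a minimal promise-based sketch of the same request (same URL and behaviour as above, just wrapping the XHR in a Promise) might look like this:

function getMovieDetails() {
    var reqURL = "https://api.themoviedb.org/3/movie/latest?api_key=afe4e10abbb804e2b4a4f8a3ef067ad5&language=en-US";
    return new Promise(function (resolve, reject) {
        var bhttp = new XMLHttpRequest();
        bhttp.open("GET", reqURL, true); // asynchronous
        bhttp.onload = function () {
            if (bhttp.status === 200) {
                resolve(JSON.parse(bhttp.responseText));
            } else {
                reject(new Error(bhttp.statusText));
            }
        };
        bhttp.onerror = function () {
            reject(new Error("Network error"));
        };
        bhttp.send();
    });
}

getMovieDetails()
    .then(function (movie) { console.log(movie); })
    .catch(function (err) { console.error(err); });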
Alright, it looks like skirtle was correct: the issue was the Firefox add-on Privacy Badger blocking the API. I feel pretty stupid now, but at least my code is now pretty clean.

Non-blocking function calls with Boost

I have this function that is called from Javascript (Browser):
STDMETHODIMP CActivexObject::WriteToREST(BSTR data, BSTR* retstr)
{
    std::string sdata = ConvertToString(data);
    RESTClient restclient;
    RESTClient::response resp = restclient.post("somewhere.com", "/post", sdata);
    CComBSTR bstrResult(resp.body.c_str());
    *retstr = bstrResult.Detach();
    return S_OK;
}
This method is called from JavaScript like this:
for (var i = 0; i < rowElems.length; i++) {
    var resp = ActivexObject.WriteToREST(_rowToData(rowElems[i]));
}
The function works fine unless the REST call or the server gets slow, which causes the JavaScript host (the browser, i.e. Internet Explorer) to show a "Not Responding" error box. Then the browser shuts down. If I remove the "post" call and just log the data, there's no such error.
Sometimes WriteToREST is called two or three times.
What could be a possible solution for this on the C++ side?

Internet Explorer issues running code if debugger closed

Yesterday I encountered an interesting issue with Internet Explorer. A script runs perfectly in Chrome, Firefox, and Safari, but with Internet Explorer 11 it doesn't do anything. If I open the debugger it runs smoothly and everything is as it should be, but the moment I close the debugger it stops working, and I have no idea why this is. My first thought was the IE extensions, but I disabled them to no avail. I tried running in safe mode and with admin rights, but nothing seems to work.
To summarize: in IE the script runs ONLY while the debugger is on. No error is produced; it just doesn't work.
I would be really glad for any ideas about what I can do regarding this. Thank you in advance.
--------------EDIT---------------
Here is the script that doesn't run.
for (var i = 0; i < AllStrategyGrids.length; i++) {
    try {
        isChange = true;
        var data = $("#objectives").data("kendoGrid").select().data();
        if (AllStrategyGrids[i].ID == data.uid) {
            var jsonData = new Object();
            jsonData.StrategicID = "1";
            jsonData.ObjectiveID = $("#ObjectiveID").val();
            jsonData.HeaderID = "00000000-0000-0000-0000-000000000000";
            jsonData.PeriodID = "00000000-0000-0000-0000-000000000000";
            jsonData.Strategic = "Please enter strategic";
            jsonData.TaskStatus = "";
            jsonData.TaskStatusID = "1";
            jsonData.Position = "";
            jsonData.Sorted = "1";
            jsonData.SessionID = "00000000-0000-0000-0000-000000000000";
            tmpGrid = AllStrategyGrids[i].Grid.data("kendoGrid");
            var dataRows = tmpGrid.items();
            var rowIndex = dataRows.index(tmpGrid.select());
            $.ajax({
                url: "CreateStrategy",
                type: 'POST',
                data: {
                    strategics: jsonData,
                    VersionID: $("#VersionUID").val(),
                    index: rowIndex
                },
                success: function () {
                    tmpGrid.dataSource.read();
                }
            });
        }
    } catch (e) { }
}
Just a guess, but this is likely because you have a console.log in that script, and in IE the console object doesn't exist if your debugger is closed... we've all been there :)
An easy fix is to just add a small shim as early on in your site as you can.. it won't do a thing in any browser except IE and will just stop the execution error that's probably blocking your other JS code from running...
<script>
if (!window.console) { window.console = { log: function () {} }; }
</script>
A more robust solution here:
'console' is undefined error for Internet Explorer
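In the same spirit, a slightly fuller shim (just a sketch; it only stubs methods that are missing, so real consoles are left untouched) would be:

(function () {
    // Create a console object if the browser doesn't provide one,
    // then stub out the commonly used methods that are missing.
    if (!window.console) {
        window.console = {};
    }
    var methods = ["log", "info", "warn", "error", "debug"];
    for (var i = 0; i < methods.length; i++) {
        if (typeof window.console[methods[i]] !== "function") {
            window.console[methods[i]] = function () {};
        }
    }
})();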
---- EDIT
Okay, one thing I can see from your code is that you're going to fail silently, because you've used a try/catch but do nothing with it. This is going to catch any exception you are getting (blocking it from reaching your window, thus making it seem like you have no errors). I would perhaps alert the error message at the very least while testing (so you don't need to open the debugger) and see if anything is thrown...
I'd be suspecting your ajax request myself.. that or an undefined in IE8.. so add some logging alerts (brute force, I know) to test your assumptions at certain points, e.g.
alert("Reached here and data=" + data);
Alternatively, I can also see that your ajax request has no callback for failures, which might be a good idea to add to your call. It might be that success isn't firing for some reason...
$.ajax({
    url: "CreateStrategy",
    type: 'POST',
    data: {
        strategics: jsonData,
        VersionID: $("#VersionUID").val(),
        index: rowIndex
    },
    success: function () {
        tmpGrid.dataSource.read();
    }
})
.fail(function () {
    alert("error");
    //handle a failed load gracefully here
})
.always(function () {
    alert("complete");
    //useful for any clean up code here
});
Final food for thought.
Since you're checking the DOM for an item, and I have no idea when this code is called, but just in case it's called directly AFTER a page reload.. and let's assume something in IE isn't ready at 'that' point.. it might be that the DOM isn't ready to be queried yet. Try executing this code when the DOM is ready.. there are lots of ways to achieve this: $(document).ready(), setTimeout(function(){}, 1), or vanilla JS (see the sketch below)... but you get the idea..
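For example (a sketch; wrapping the loop above in a hypothetical function called createStrategies purely for illustration):

// jQuery flavour: run the grid/strategy loop only once the DOM is ready.
$(document).ready(function () {
    createStrategies(); // hypothetical wrapper around the for-loop above
});

// Vanilla flavour (works in IE9+):
document.addEventListener("DOMContentLoaded", function () {
    createStrategies();
});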
Note: Debugging Script with the Developer Tools (MSDN):
To enable script debugging for all instances of Internet Explorer, on the Internet Options menu, click the Advanced tab. Then, under the Browsing category, uncheck the Disable script debugging (Internet Explorer) option, and then click OK. For the changes to take effect, close all instances of Internet Explorer, then reopen them.

How to implement a getter-function (using callbacks)

I have to request data for a JS script from a MySQL database (based on a user ID).
I did not find a simple solution for JavaScript and it was not possible to load the data using ajax, because the database is available under a different domain.
I implemented a workaround using PHP and curl.
Now the JS has to "wait" for the request to finish, but the script is of course running asynchronously and does not wait for the response.
I know that it's not really possible to wait in JS, but it must be possible to return a value like this.
I also tried using a return as another callback, but of course that didn't work, because the getter function runs on regardless.
How can I implement a simple getter that "waits" and returns the response from the HTTP request?
Thanks for any other clues. I'm really lost at the moment.
This is an excerpt from the source code:
/**
 * Simple getter which requests external data
 */
function simple_getter() {
    // http request using a php script, because ajax won't work crossdomain
    // this request takes some time. function finishes before request is done.

    /* Example */
    var url = "http://example-url.com/get_data.php?uid=1234";
    var response_callback = handle_result_response;
    var value = send_request(url, response_callback);

    value = value.split('*')[0];
    if (value === '' || value == const_pref_none) {
        return false;
    }

    /* 1. returns undefined, because value is not yet set.
       2. this as a callback makes no sense, because this function
          will run asynchronously anyway. */
    return value;
}
Additional information about the used functions:
/**
 * Callback for the send_request function.
 * Basically returns only the responseText (string).
 */
function handle_result_response(req) {
    // do something more, but basically:
    return req.responseText;
}

/**
 * Requests data from a database (different domain) via a PHP script
 */
function send_request(url, response_callback) {
    var req = createXMLHTTPObject();
    if (!req)
        return;
    var method = (postData) ? "POST" : "GET";
    req.open(method, url, true);
    req.setRequestHeader('User-Agent', 'XMLHTTP/1.0');
    // More not relevant source code
    // ...
    req.onreadystatechange = function () {
        // More not relevant source code
        // ...
        response_callback(req);
    }
    if (req.readyState == 4)
        return;
    req.send(postData);
}
Not really relevant code, but required for the HTTP-request:
var XMLHttpFactories = [
    function () { return new XMLHttpRequest(); },
    function () { return new ActiveXObject("Msxml2.XMLHTTP"); },
    function () { return new ActiveXObject("Msxml3.XMLHTTP"); },
    function () { return new ActiveXObject("Microsoft.XMLHTTP"); }
];

function createXMLHTTPObject() {
    var xmlhttp = false;
    for (var i = 0; i < XMLHttpFactories.length; i++) {
        try {
            xmlhttp = XMLHttpFactories[i]();
        } catch (e) {
            continue;
        }
        break;
    }
    return xmlhttp;
}
You really, really shouldn't try to synchronously wait for a network request to complete. The request may never complete, may hang and take a long time, and so on. Since JavaScript is single threaded, and in fact all major browser engines are single threaded, this will cause your entire page to hang while waiting for the request, and in some browsers, may cause the entire browser to hang.
What you should do is replace code like this:
var returned = some_request('http://example.com/query');
do_something_with(returned);
with code like this:
some_request('http://example.com/query', function (returned) {
    do_something_with(returned);
});
That way, you will never cause your page or the browser to hang waiting for the request, and can simply do the work once the response comes in.
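Applied to the code in the question, a rough sketch (keeping the question's send_request helper as-is, and assuming const_pref_none is defined elsewhere) would be:

/**
 * Callback-based getter: instead of returning the value,
 * it hands the value to whoever asked for it once the response arrives.
 */
function simple_getter(on_value) {
    var url = "http://example-url.com/get_data.php?uid=1234";
    send_request(url, function (req) {
        var value = req.responseText.split('*')[0];
        if (value === '' || value == const_pref_none) {
            on_value(false);
            return;
        }
        on_value(value);
    });
}

// Usage: the code that needed the "return value" now lives in the callback.
simple_getter(function (value) {
    if (value === false) {
        return; // nothing useful came back
    }
    // ... use value here ...
});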
I don't see what's wrong with your code in general.
When you make a request, provide a callback. When a response comes back, which you can easily detect, execute the callback and pass it the result.
This is the way client-side apps work. It is not procedural, but works by events:
1. You present the screen to the user and wait.
2. The user makes an action.
3. You call the server, set a callback, and wait.
4. The response comes, you execute the callback, and you wait for another step 2.
Rather than trying to change that, you need to fit in with it, or it will be a painful experience.
JavaScript is not multithreaded; a single statement runs at a time. The real asynchronism comes from the time the server takes to respond and call the callback. You never know which call will come back first, and you need to build your program with that in mind.

Consecutive Ajax requests without jQuery/ JS library

I have an issue, mainly with IE.
I need to be able to handle n queries one after another. But if I simply call my function below in a for loop, IE does some strange things (like only completing some of the calls).
If I use an alert box, it proves that the function gets all of the calls, and surprisingly IT WORKS!
My guess is that IE needs more time than other browsers, and the alert box gives it just that.
Here is my code:
var Ajax = function (all) {
    this.xhr = new XMLHTTPREQUEST(); // Function returns xhr object/ activeX
    this.uri = function (queries) {  // Takes an object and formats query string
        var qs = "", i = 0, len = size(queries);
        for (value in queries) {
            qs += value + "=" + queries[value];
            if (++i <= len) { qs += "&"; }
        }
        return qs;
    };
    xhr.onreadystatechange = function () { // called when content is ready
        if (this.readyState === 4) {
            if (this.status === 200) {
                all.success(this.responseText, all.params);
            }
            this.abort();
        }
    };
    this.post = function () { // POST
        xhr.open("POST", all.where, true);
        xhr.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
        xhr.send(uri(all.queries));
    };
    this.get = function () { // GET
        xhr.open("GET", all.where + "?" + uri(all.queries), true);
        xhr.send();
    };
    if (this instanceof Ajax) {
        return this.Ajax;
    } else {
        return new Ajax(all);
    }
};
This function works perfectly for a single request, but how can I get it to work when called so many times within a loop?
I think the problem might be related to the 2 concurrent connections limit that most web browsers implement.
It looks like the latency of your web service to respond is making your AJAX requests overlap, which in turn is exceeding the 2 concurrent connections limit.
You may want to check out these articles regarding this limitation:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
This limit is also suggested in the HTTP/1.1 spec (RFC 2616, section 8.1.4, last paragraph), which is probably the main reason why most browsers impose it.
To work around this problem, you may want to consider the option of relaunching your AJAX request ONLY after a successful response from the previous AJAX call. This will prevent the overlap from happening. Consider the following example:
function autoUpdate() {
    var ajaxConnection = new Ext.data.Connection();
    ajaxConnection.request({
        method: 'GET',
        url: '/web-service/',
        success: function (response) {
            // Add your logic here for a successful AJAX response.
            // ...
            // ...

            // Relaunch the autoUpdate() function in 100ms. (Could be less or more)
            setTimeout(autoUpdate, 100);
        }
    });
}
This example uses ExtJS, but you could very easily use just XMLHttpRequest.
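For instance, a rough equivalent with a bare XMLHttpRequest (a sketch; the URL and the 100ms delay are just placeholders carried over from the example above) might be:

function autoUpdate() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/web-service/", true); // asynchronous
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // Add your logic here for a successful response.
            // ...

            // Only schedule the next request once this one has finished,
            // so requests never overlap.
            setTimeout(autoUpdate, 100);
        }
    };
    xhr.send();
}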
Given that the limit to a single domain is 2 concurrent connections in most browsers, it doesn't confer any speed advantage launching more than 2 concurrent requests. Launch 2 requests, and dequeue and launch another each time one completes.
I'd suggest throttling your requests so you only have a few (4?) outstanding at any given time. You're probably seeing the result of multiple requests being queued and timing out before your code can handle them all. Just a guess, though. We have an ajax library that has built-in throttling and queues the requests so we only have 4 outstanding at any one time, and we don't see any problems. We routinely queue lots per page. A simple queue sketch is shown below.
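A minimal sketch of that idea (plain JavaScript, not the library mentioned above; runRequest stands in for whatever actually fires the Ajax call and must invoke done() when it finishes):

function RequestQueue(maxConcurrent) {
    var active = 0;
    var pending = [];

    function next() {
        if (active >= maxConcurrent || pending.length === 0) {
            return;
        }
        active++;
        var job = pending.shift();
        job(function done() {
            // Called by the job when its request has completed.
            active--;
            next();
        });
    }

    this.add = function (job) {
        pending.push(job);
        next();
    };
}

// Usage: queue each Ajax call; at most 4 run at once.
var queue = new RequestQueue(4);
for (var i = 0; i < 20; i++) {
    queue.add(function (done) {
        runRequest(done); // hypothetical: fire the Ajax call and call done() in its callback
    });
}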
Your code looks like it's put together using the constructor pattern. Are you invoking it with the new operator like var foo = new Ajax(...) in your calling code? Or are you just calling it directly like var foo = Ajax(...) ?
If the latter, you're likely overwriting state on your later calls. It looks like it's designed to be called to create an object, on which the get/post methods are called. This could be your problem if you're "calling it within a loop" as you say.
