The client requests a web page from the server, then asks for extra calculations to be performed; the server runs a series of calculations and sends partial results as soon as they are available (in text format, each line containing a separate, complete item). The client updates the web page (with JavaScript and the DOM) using the information provided by the server.
This seems to fit the HTTP Streaming pattern (the current version) described on the Ajaxpatterns site.
The question is how to do it in a cross-browser (browser-agnostic) way, preferably without a JavaScript framework, or with some lightweight framework such as jQuery.
The problem begins with generating the XMLHttpRequest object in a cross-browser fashion, but I think the main issue is that not all browsers implement onreadystatechange on XMLHttpRequest correctly: not all of them fire the onreadystatechange event on each server flush. (By the way, how does one force a server flush from within a CGI script, e.g. in Perl?) The example code on Ajaxpatterns deals with this by using a timer; should I drop the timer-based solution if I detect a partial response from onreadystatechange?
Added 11-08-2009
Current solution:
I use the following function to create XMLHttpRequest object:
function createRequestObject() {
    var ro;
    if (window.XMLHttpRequest) {
        ro = new XMLHttpRequest();
    } else {
        ro = new ActiveXObject("Microsoft.XMLHTTP");
    }
    if (!ro)
        debug("Couldn't start XMLHttpRequest object");
    return ro;
}
If I were to use some (preferably lightweight) JavaScript framework such as jQuery, I'd like to have a fallback in case jQuery is not available.
I use the following code to start the AJAX connection; setInterval is used because some browsers call onreadystatechange only after the server closes the connection (which can take as long as tens of seconds), rather than as soon as the server flushes data (around every second or more often).
// Globals used throughout: http, pollTimer, prevDataLength, nextLine, inProgress
function startProcess(dataUrl) {
    http = createRequestObject();
    http.open('get', dataUrl);
    http.onreadystatechange = handleResponse;
    http.send(null);
    pollTimer = setInterval(handleResponse, 1000);
}
The handleResponse function is the most complicated one; a sketch of it follows. Can it be done better? How would it be done with some lightweight JavaScript framework (like jQuery)?
function handleResponse() {
    if (http.readyState != 4 && http.readyState != 3)
        return;
    if (http.readyState == 3 && http.status != 200)
        return;
    if (http.readyState == 4 && http.status != 200) {
        clearInterval(pollTimer);
        inProgress = false;
    }
    // In Konqueror http.responseText is sometimes null here...
    if (http.responseText === null)
        return;

    while (prevDataLength != http.responseText.length) {
        if (http.readyState == 4 && prevDataLength == http.responseText.length)
            break;
        prevDataLength = http.responseText.length;
        var response = http.responseText.substring(nextLine);
        var lines = response.split('\n');
        nextLine = nextLine + response.lastIndexOf('\n') + 1;
        // Discard the last element if it is an incomplete
        // (not yet newline-terminated) line.
        if (response[response.length - 1] != '\n')
            lines.pop();
        for (var i = 0; i < lines.length; i++) {
            // Process one complete line...
        }
    }

    if (http.readyState == 4 && prevDataLength == http.responseText.length) {
        clearInterval(pollTimer);
        inProgress = false;
    }
}
The solution you linked to is not AJAX at all, actually. They call it HTTP Streaming but it's essentially just long polling.
In the example they link to, you can see for yourself quite easily with Firebug. Turn on the Net panel: there are no XHR entries, yet it takes just a hair over 10 seconds to load the original page. That's because they're using PHP behind the scenes to delay the output of the HTML. This is the essence of long polling: the HTTP connection stays open, and the periodic HTML sent back is JavaScript commands.
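To illustrate, what trickles back over that open connection is just script fragments that the browser executes as each chunk arrives; roughly something like this (updateStatus is a hypothetical function defined earlier in the page):

<script type="text/javascript">updateStatus('calculation 1 done');</script>
<!-- the server sleeps, then flushes the next fragment -->
<script type="text/javascript">updateStatus('calculation 2 done');</script>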
You can opt to do the polling completely on the client side, though, with setTimeout() or setInterval(). A jQuery example:
<script type="text/javascript">
$(document).ready(function()
{
    var ajaxInterval = setInterval( function()
    {
        $.getJSON(
            'some/service/url.ext'
            , { sample: "data" }
            , function( response )
            {
                $('#output').append( response.whatever );
            }
        );
    }, 10000 );
});
</script>
I would take a look at Orbited.
They use several Comet transport implementations, chosen based on configuration and browser sniffing.
See http://orbited.org/svn/orbited/trunk/daemon/orbited/static/Orbited.js
and look for "Orbited.CometTransports".
Some of the transports must be matched by the backend implementation, so have a look at the server side of Orbited as well.
Related
I have an application that gathers live JSON data every second from another server via XMLHttpRequest. After checking the Network panel in Chrome, I found that each response is about 697 bytes. I am unsure whether this is high or low, and whether there are any potential problems with running my application this way.
Example:
var exhaitch = new XMLHttpRequest();
var exlink = "wheremydatais.com";

exhaitch.onreadystatechange = function() {
    if (this.readyState == 4 && this.status == 200) {
        console.log(JSON.parse(this.responseText));
    }
}

exhaitch.open("GET", exlink, true);
exhaitch.send();
This JavaScript code is placed inside an interval that is set to run every 1.5 seconds. The console log contains the updated data I want to use in my application.
I understand that ideally this would be done using Node.js and Socket.IO. However, much of this application has already been built on a LAMP stack, so I am wondering what my options are if this method is unsustainable over the long term.
One thing I have looked into recently is Socket.IO without Node, though I am still unclear how to go about that.
I think this is preferable and will scale better:
function getMyData() {
    var exhaitch = new XMLHttpRequest();
    var exlink = "wheremydatais.com";

    exhaitch.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            console.log(JSON.parse(this.responseText));
            // Start the next request only once this one has completed.
            // Wrap in setTimeout(getMyData, 1500) to keep the original
            // 1.5-second cadence instead of polling back to back.
            getMyData();
        }
    }

    exhaitch.open("GET", exlink, true);
    exhaitch.send();
}
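Because each new request is issued only after the previous response has arrived, requests never overlap, so you never pile up queued connections against the browser's per-server limit the way a fixed interval can when the server responds slowly.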
In my application I use the MVC model, and Views are built with the JavaScript DOM API.
On each page I have to check the user's information to find out whether the session is active and whether the user's role gives them access to that page.
To make this happen, each page has an "onload" function that triggers a "sessionCheck" function, which sends an AJAX request to the controller and returns the information with which the application makes its decisions.
As I said, JavaScript is also used to build the Views, which means that after the "sessionCheck" function I also have "headerView", "sectionView" and other functions that build the structure of the page.
The problem is that the other functions run before "sessionCheck" has finished, so for 1-2 seconds users can see what happens on that page, and only after that does the application transport them away if needed.
I have read that jQuery offers solutions such as "ajaxComplete" callbacks, but I couldn't find the same in pure JavaScript. Can someone help me solve this problem? (A sketch of one approach follows the code below.)
This is the HTML:
<body onload="sessionCheckAdmin(); adminHeaderView(); adminUser(); modalView();">
The sessionCheckAdmin function looks like this:
function sessionCheckAdmin()
{
    var formData = new FormData();
    formData.append("Code", "3");
    formData.append("Sequence", "27");
    formData.append("TaskId", "All");

    var xmlHttp = new XMLHttpRequest();
    xmlHttp.onreadystatechange = function()
    {
        if(xmlHttp.readyState == 4 && xmlHttp.status == 200)
        {
            var array = JSON.parse(xmlHttp.responseText);
            if(array["userRole"] != "Administrator")
                window.location.href = "task.php";
        }
    }
    xmlHttp.open("POST", "../Controller.php");
    xmlHttp.send(formData);
}
Part of the PHP controller:
case 27:
    $array = json_encode($_SESSION);
    echo $array;
    break;
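In pure JavaScript, the equivalent of jQuery's completion callbacks is simply code placed inside the onreadystatechange handler, so that the view builders run only once the session check has finished. A minimal sketch of that approach (the onAuthorized callback parameter is hypothetical):

function sessionCheckAdmin(onAuthorized)
{
    var formData = new FormData();
    formData.append("Code", "3");
    formData.append("Sequence", "27");
    formData.append("TaskId", "All");

    var xmlHttp = new XMLHttpRequest();
    xmlHttp.onreadystatechange = function()
    {
        if(xmlHttp.readyState == 4 && xmlHttp.status == 200)
        {
            var array = JSON.parse(xmlHttp.responseText);
            if(array["userRole"] != "Administrator")
                window.location.href = "task.php";
            else
                onAuthorized(); // build the views only after the check passes
        }
    }
    xmlHttp.open("POST", "../Controller.php");
    xmlHttp.send(formData);
}

The body tag then defers the view builders to that callback:

<body onload="sessionCheckAdmin(function() { adminHeaderView(); adminUser(); modalView(); });">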
My script is sending a GET request to a page (http://example.org/getlisting/), and the page, in turn, responds with a JSON object: {"success":true, "listingid":"123456"}
Here's an example snippet:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);

function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

listingAjax.open("GET", "http://example.org/getlisting/", true);
listingAjax.send();
Simple enough. The script works perfectly too! The issue arises when I want to do this:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);

function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

window.setInterval(function() {
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
What I imagine should happen is that my script creates a steady flow of GET requests to the server, the server responds to each one, and my script receives the responses and hands them to the callback.
To be more exact: say I let this script run for 5 seconds and it sends out 20 GET requests in that time. I would expect my callback (listingCallback) to be called 20 times as well.
The issue is, it isn't. It almost seems that if I send out two GET requests before receiving a response from the server, a response is ignored or discarded.
What am I doing wrong or misunderstanding here?
Many browsers have a built-in maximum number of open HTTP connections per server. You might be hitting that wall.
Here is an example from Mozilla, but most browsers have something like this built in: http://kb.mozillazine.org/Network.http.max-connections-per-server
An earlier question regarding Chrome:
Increasing Google Chrome's max-connections-per-server limit to more than 6
If you are on Windows, take a look at a tool like Fiddler; you might be able to see whether all of the requests are actually being issued, or whether the browser is queueing or killing some of them.
You can't reuse the same XMLHttpRequest object to open a new connection while another is in progress; doing so causes an abrupt abort (tested in Chrome). Using a new XMLHttpRequest object for each call solves that:
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}

window.setInterval(function() {
    var listingAjax = new XMLHttpRequest();
    listingAjax.addEventListener("load", listingCallback, false);
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
This will work nicely, queueing a new AJAX request on each interval.
Note that too-frequent calls may cause slowdowns due to the browser's limit on concurrent AJAX calls.
Modern browsers have a fairly generous limit and very good parallelism, though, so as long as you're fetching just a small JSON object, they should be able to keep up even over dial-up.
The last time I wrote an AJAX polling script, I started a new request in the success handler of the previous request instead of using an interval, to minimize AJAX calls. I'm not sure whether that logic is applicable to your app, though.
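For reference, a rough sketch of that chained approach, reusing the names from the question (no error handling; if a request never fires its load event, the chain stops):

function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
    poll(); // this request is done, so launch the next one
}

function poll() {
    var listingAjax = new XMLHttpRequest();
    listingAjax.addEventListener("load", listingCallback, false);
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}

poll(); // kick off the chain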
I currently have a web page that uses JavaScript; however, when I use AJAX to query the DB, my responseText is always empty.
The JS that sets the flag and sends the query:
objAjaxUpdates.main_flag = "getNames";
objAjaxUpdates.SendQuery(query);
Next in the flow (the url is an aspx file):
this.SendQuery = function(data) {
    this.Initialize();
    if (this.req != null) {
        //alert(data);
        //alert(this.url + " " + this.main_flag);
        this.req.open("POST", this.url);
        this.req.setRequestHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8");
        this.req.onreadystatechange = this.processData;
        this.req.send(data);
    }
}
Next
this.processData = function() {
    if (objAjaxUpdates.req.readyState == 4) {
        if (objAjaxUpdates.req.status == 200) {
            if (objAjaxUpdates.req.responseText == "") {
                alert('No Return');
            }
            else {
                ...
Any help would be appreciated.
I don't know the library you are using, but the procedure looks correct to me. (There are libraries, like jQuery, that save you from working with the low-level AJAX API, checking response codes and so on.)
I advise you to work with something like Firebug for Mozilla, or the Chrome Developer Tools, to debug your AJAX machinery.
There still is that good old technique:
First test the requests to the server manually.
Then, do a dumb AJAX request.
At the end, test both components.
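For the "dumb AJAX request" step, a minimal sketch that bypasses the wrapper object entirely and shows exactly what the server sends back (the URL and payload here are placeholders for your own):

var req = new XMLHttpRequest();
req.onreadystatechange = function() {
    if (req.readyState == 4) {
        // Show the status and the raw body, even if it is empty.
        alert(req.status + ": [" + req.responseText + "]");
    }
};
req.open("POST", "YourPage.aspx", true); // placeholder URL
req.setRequestHeader("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8");
req.send("main_flag=getNames"); // placeholder payload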
I have an issue, mainly with IE.
I need to be able to handle n queries one after another, but if I simply call my function below in a for loop, IE does some strange things (like performing only some of the calls).
If I use an alert box, it proves that the function receives all of the calls, and surprisingly IT WORKS!
My guess is that IE needs more time than other browsers, and the alert box gives it just that.
Here is my code:
var Ajax = function(all) {
    var xhr = this.xhr = new XMLHTTPREQUEST(); // custom cross-browser factory: returns an XHR/ActiveX object

    var uri = this.uri = function(queries) { // Takes an object and formats a query string
        var qs = "", i = 0, len = size(queries); // size(): helper that counts the keys
        for (var value in queries) {
            qs += value + "=" + queries[value];
            if (++i < len) { qs += "&"; } // no trailing "&" after the last pair
        }
        return qs;
    };

    xhr.onreadystatechange = function() { // called when content is ready
        if (this.readyState === 4) {
            if (this.status === 200) {
                all.success(this.responseText, all.params);
            }
            this.abort();
        }
    };

    this.post = function() { // POST
        xhr.open("POST", all.where, true);
        xhr.setRequestHeader("Content-type","application/x-www-form-urlencoded");
        xhr.send(uri(all.queries));
    };

    this.get = function() { // GET
        xhr.open("GET", all.where + "?" + uri(all.queries), true);
        xhr.send();
    };

    if (this instanceof Ajax) {
        return this;
    } else {
        return new Ajax(all); // note: by this point the body above has already
                              // run against the wrong `this`
    }
};
This function works perfectly for a single request, but how can I get it to work when it is called many times within a loop?
I think the problem might be related to the limit of two concurrent connections per server that most web browsers implement.
It looks like the latency of your web service is making your AJAX requests overlap, which in turn exceeds that two-connection limit.
You may want to check out these articles regarding this limitation:
The Dreaded 2 Connection Limit
The Two HTTP Connection Limit Issue
Circumventing browser connection limits for fun and profit
This limit is also suggested in the HTTP/1.1 spec (section 8.1.4, last paragraph), which is probably the main reason why most browsers impose it.
To work around this problem, consider relaunching your AJAX request only after a successful response from the previous call. This prevents the overlap from happening. Consider the following example:
function autoUpdate () {
    var ajaxConnection = new Ext.data.Connection();
    ajaxConnection.request({
        method: 'GET',
        url: '/web-service/',
        success: function (response) {
            // Add your logic here for a successful AJAX response.
            // ...
            // ...

            // Relaunch the autoUpdate() function in 100ms. (Could be less or more)
            setTimeout(autoUpdate, 100);
        }
    });
}
This example uses ExtJS, but you could very easily use just XMLHttpRequest.
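For instance, a rough sketch of the same chained pattern with a bare XMLHttpRequest (same placeholder URL as above):

function autoUpdate() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            // Handle the successful response here.
            // ...
            // Only now schedule the next poll, so requests never overlap.
            setTimeout(autoUpdate, 100);
        }
    };
    xhr.open("GET", "/web-service/", true);
    xhr.send(null);
}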
Given that the limit is two concurrent connections per domain in most browsers, launching more than two concurrent requests confers no speed advantage. Launch two requests, then dequeue and launch another each time one completes, as in the sketch below.
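A sketch of that dequeueing idea (no error handling; MAX_ACTIVE matches the two-connection limit):

var MAX_ACTIVE = 2; // match the per-server connection limit
var pending = [];   // requests waiting to be launched
var active = 0;     // requests currently in flight

function enqueue(url, callback) {
    pending.push({ url: url, callback: callback });
    dequeue();
}

function dequeue() {
    while (active < MAX_ACTIVE && pending.length > 0) {
        var job = pending.shift();
        active++;
        send(job.url, job.callback);
    }
}

function send(url, callback) {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4) {
            callback(xhr);
            active--;
            dequeue(); // a slot has freed up: launch the next queued request
        }
    };
    xhr.open("GET", url, true);
    xhr.send(null);
}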
I'd suggest throttling your requests so you only have a few (four, say) outstanding at any given time. You're probably seeing the result of multiple requests being queued and timing out before your code can handle them all. Just a guess, though. We have an AJAX library with built-in throttling that queues requests so only four are outstanding at any one time, and we don't see any problems; we routinely queue lots per page.
Your code looks like it's put together using the constructor pattern. Are you invoking it with the new operator, like var foo = new Ajax(...), or are you just calling it directly, like var foo = Ajax(...)?
If the latter, you're likely overwriting state on your later calls. The function looks like it's designed to be called with new to create an object on which the get/post methods are then called. This could be your problem if you're calling it within a loop, as you say.
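The usual fix is a scope-safe guard at the top of the constructor, so that nothing runs against the wrong this before delegating (a sketch, with the body elided):

var Ajax = function(all) {
    // If called without `new`, delegate before touching any state.
    if (!(this instanceof Ajax)) {
        return new Ajax(all);
    }
    // ... constructor body as above ...
};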