Viability of running XMLHttpRequest every second?

I have an application that gathers live JSON data every second from another server via XMLHttpRequest. After checking the Network panel in Chrome, I found that each response is about 697 bytes. I am unsure whether this is a high or a low number, and whether there are any potential problems with running my application like this.
Example:
var exhaitch = new XMLHttpRequest();
var exlink = "wheremydatais.com";
exhaitch.onreadystatechange = function () {
    if (this.readyState == 4 && this.status == 200) {
        console.log(JSON.parse(this.responseText));
    }
};
exhaitch.open("GET", exlink, true);
exhaitch.send();
This JavaScript code runs inside an interval set to fire every 1.5 seconds. The console log contains the updated data I want to use in my application.
I understand that ideally this would have been done using Node.js and Socket.io. However, much of this application has already been built over LAMP stack. So I am wondering what my options are if this method is unsustainable over the long term.
One thing I have looked into recently is socket.io without Node. Though I am still unclear how to go about that.

I think this approach is preferable and will scale better:
function getMyData() {
    var exhaitch = new XMLHttpRequest();
    var exlink = "wheremydatais.com";
    exhaitch.onreadystatechange = function () {
        if (this.readyState == 4 && this.status == 200) {
            console.log(JSON.parse(this.responseText));
            getMyData();
        }
    };
    exhaitch.open("GET", exlink, true);
    exhaitch.send();
}
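One refinement to the chained version: it re-fires immediately on each response, so it can hammer the server. A minimal sketch of chaining with a fixed pause between polls, assuming a `fetchJson` helper that returns a Promise (the helper name, URL, and interval are placeholders, not from the question):

```javascript
// Sketch: chain the next poll from the completion of the previous one,
// with a fixed pause, so requests never overlap and never pile up.
function pollEvery(fetchJson, url, delayMs, onData, state) {
    state = state || { stopped: false };
    fetchJson(url)
        .then(function (data) { onData(data); })
        .catch(function (err) { console.error("poll failed:", err); })
        .then(function () {
            if (!state.stopped) {
                setTimeout(function () {
                    pollEvery(fetchJson, url, delayMs, onData, state);
                }, delayMs);
            }
        });
    return state; // caller can set state.stopped = true to stop polling
}
```

Because each poll waits for the previous response, a slow server simply slows the loop down instead of stacking up requests.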


What's the best way to check the connectivity in pure JavaScript?

I've tried a lot of things to check the internet connection of my app's users. My first research brought me to the navigator.onLine property. It works sometimes, but sometimes it doesn't. Of course this is nothing new; I have seen multiple people complaining about it.
Then I tried an XHR request. It worked on every device, whatever the browser (unlike the previous method). But I got warnings from Chrome and Firefox because it was synchronous and could slow down the whole app.
So I converted my function to an asynchronous one:
function verification() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "//" + window.location.hostname + "/ressources/favicon.ico?rand=" + Date.now(), true);
    xhr.onload = function (e) {
        if (xhr.readyState === 4 && xhr.status === 200) {
            reconnection();
        }
    };
    xhr.onerror = function (e) {
        deconnexion();
    };
    xhr.send(null);
}
The idea is simple: I check whether I can access the favicon (with a random query string so the resource is never served from cache). If the request succeeds, I consider myself connected; if not, I'm not connected. So far, it seems to work pretty well.
My question is: is this the best way to do it in pure JS? Or should I consider using fetch instead? Or is there a better way I haven't found?
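Since fetch comes up: the same favicon probe can be written with fetch, which gives a Promise-based result instead of callbacks. A sketch under the assumption of a same-origin asset path (the URL argument is a placeholder):

```javascript
// Sketch: connectivity probe with fetch. `cache: "no-store"` plus a
// random query string keeps the response from being served from cache.
function checkOnline(url) {
    return fetch(url + "?rand=" + Date.now(), { cache: "no-store" })
        .then(function (res) { return res.ok; }) // HTTP 2xx => online
        .catch(function () { return false; });   // network error => offline
}
```

As with the XHR version, a successful response only proves you can reach your own server, not the internet at large.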

Hit Web API URL multiple times (approximately 2500 times)

I'm doing a Web API project in MVC in which I want to test the maximum hit limit of a Web API method, which is 2500 per day. I want to test this limit by hitting the URL 2500 times, but I haven't found a proper solution yet.
The url of my web api method is: http://localhost:63091/api/CustomerSite/GetSiteList?accessToken=123456789
Please suggest an online tool, or guide me through a JS script. Thanks in advance.
Postman is an excellent tool to send requests and test an API.
You can use the collection runner to run a request multiple times.
You can find the full guide here:
https://www.getpostman.com/docs/v6/postman/collection_runs/running_multiple_iterations
You could install Apache Bench on your local machine and use that.
Here is a quick article that walks you through using this tool.
You can do this in JavaScript: create a .js file, include it in an .html file, and open that file in the browser:
for (var i = 0; i < 2500; i++) {
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function () {
        if (this.readyState == 4 && this.status == 200) {
            // do something
        }
    };
    xhttp.open("GET", "http://localhost:63091/api/CustomerSite/GetSiteList?accessToken=123456789", true);
    xhttp.send();
}
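Be aware that the loop above fires all 2500 requests at once, which will hit the browser's per-host connection limit and may drop or queue some of them. A sketch of a sequential variant, assuming a fetch-capable environment (the URL below is the one from the question):

```javascript
// Sketch: send the requests one at a time, awaiting each response,
// so the per-host connection limit is never exceeded.
async function hitSequentially(url, times) {
    let okCount = 0;
    for (let i = 0; i < times; i++) {
        const res = await fetch(url); // wait for this hit to finish
        if (res.ok) okCount++;        // tally successful responses
    }
    return okCount;                   // number of 2xx responses seen
}

// usage (uncomment to run against the API):
// hitSequentially("http://localhost:63091/api/CustomerSite/GetSiteList?accessToken=123456789", 2500)
//     .then(function (n) { console.log(n + " successful hits"); });
```

Sequential hits also give the server a realistic per-request load instead of a single burst.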

XMLHttpRequest does not seem to do anything

I have been trying, very unsuccessfully, to download a binary file from my server using jquery-ajax, which I finally gave up on. So now I am trying to use XMLHttpRequest instead. However, I cannot even get a simple example working.
Strangely enough, this code does not appear to do anything. I copied/pasted it from w3schools, and this example is near-identical to many other examples. It does not work for me in either Chrome or FF:
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (xhttp.readyState == 4 && xhttp.status == 200) {
        // Action to be performed when the document is ready
    }
};
xhttp.open("GET", '/blah/blah/output.png', true);
xhttp.send();
We go into the onreadystatechange function only once, on the open() call, with xhttp.readyState equal to one, but never on the send() step. I would think it would at least throw some kind of error rather than do nothing at all.
Also, as an experiment, I purposely fed open() a bad URL, but again no reply.
Can anybody tell me what I might be doing wrong?
Thank you very much.
Your code looks correct to me, which points to some external cause.
Is your code flowing all the way through to the end of the execution context? Browsers will hold onto network requests until the engine yields back to the browser.
For instance:
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (xhttp.readyState == 4 && xhttp.status == 200) {
        // Action to be performed when the document is ready
    }
};
xhttp.open("GET", '/blah/blah/output.png', true);
xhttp.send();
while (true) {}
will never send the call.

Not Receiving Asynchronous AJAX Requests When Sent Rapidly

My script is sending a GET request to a page (http://example.org/getlisting/) and the page, in turn, responds back with a JSON object. ({"success":true, "listingid":"123456"})
Here's an example snippet:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
listingAjax.open("GET", "http://example.org/getlisting/", true);
listingAjax.send();
Simple enough. The script works perfectly too! The issue arises when I want to do this:
var listingAjax = new XMLHttpRequest();
listingAjax.addEventListener("load", listingCallback, false);
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
window.setInterval(function () {
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
What I imagine should happen is my script would create a steady flow of GET requests that get sent out to the server and then the server responds to each one. Then, my script will receive the server's responses and send them to the callback.
To be more exact, say I let this script run for 5 seconds and my script sent out 20 GET requests to the server in that time. I would expect that my callback (listingCallback) would be called 20 times as well.
The issue is, it isn't. It almost seems that, if I sent out two GET requests before I received a response from the server, then the response is ignored or discarded.
What am I doing wrong/misunderstanding from this?
Many browsers have a built in maximum number of open HTTP connections per server. You might be hitting that wall?
Here is an example from Mozilla but most browsers should have something like this built in: http://kb.mozillazine.org/Network.http.max-connections-per-server
An earlier question regarding Chrome:
Increasing Google Chrome's max-connections-per-server limit to more than 6
If you have Windows, take a look at a tool like Fiddler - you might be able to see if all of the requests are actually being issued or if the browser is queueing/killing some of them.
You can't reuse the same XMLHttpRequest object to open a new connection while one is still in progress; doing so abruptly aborts the pending request (tested in Chrome). Using a new XMLHttpRequest object for each call solves that:
function listingCallback(event) {
    if (this.readyState == 4 && this.status == 200) {
        console.log(this.responseText);
    }
}
window.setInterval(function () {
    var listingAjax = new XMLHttpRequest();
    listingAjax.addEventListener("load", listingCallback, false);
    listingAjax.open("GET", "http://example.org/getlisting/", true);
    listingAjax.send();
}, 250);
This will work nicely, queueing a new ajax request on each interval.
Note that too-frequent calls may cause slowdown due to the browser's inherent limit on concurrent ajax calls.
Modern browsers have a fairly generous limit and good parallelism, though, so as long as you're fetching just a small JSON object they should be able to keep up even on dial-up.
Last time I made an ajax polling script, I'd start a new request in the success handler of the previous request instead of using an interval, in order to minimize ajax calls. Not sure if this logic is applicable to your app though.

Cross-browser implementation of "HTTP Streaming" (push) AJAX pattern

The client requests a web page from the server. The client then asks for extra calculations to be done; the server performs a series of calculations and sends partial results as soon as they are available (text format, each line containing a separate complete item). The client updates the web page (with JavaScript and the DOM) using the information provided by the server.
This seems to fit HTTP Streaming (current version) pattern from Ajaxpatterns site.
The question is how to do it in cross-browser (browser agnostic) way, preferably without using JavaScript frameworks, or using some lightweight framework like jQuery.
The problem begins with generating XMLHttpRequest in a cross-browser fashion, but I think the main item is that not all browsers implement onreadystatechange from XMLHttpRequest correctly; not all browsers fire the onreadystatechange event on each server flush. (BTW, how do I force a server flush from within a CGI script in Perl?) The example code on Ajaxpatterns deals with this by using a timer; should I drop the timer solution if I detect a partial response from onreadystatechange?
Added 11-08-2009
Current solution:
I use the following function to create XMLHttpRequest object:
function createRequestObject() {
    var ro;
    if (window.XMLHttpRequest) {
        ro = new XMLHttpRequest();
    } else {
        ro = new ActiveXObject("Microsoft.XMLHTTP");
    }
    if (!ro)
        debug("Couldn't start XMLHttpRequest object");
    return ro;
}
If I were to use some (preferably light-weight) JavaScript framework like jQuery, I'd like to have fallback if user chooses not to install jQuery.
I use the following code to start AJAX; setInterval is used because some browsers call onreadystatechange only after the server closes the connection (which can take tens of seconds), and not as soon as the server flushes data (around every second or more often).
function startProcess(dataUrl) {
    http = createRequestObject();
    http.open('get', dataUrl);
    http.onreadystatechange = handleResponse;
    http.send(null);
    pollTimer = setInterval(handleResponse, 1000);
}
The handleResponse function is the most complicated one; a sketch of it looks like the following. Can it be done better? How would it be done using some lightweight JavaScript framework (like jQuery)?
function handleResponse() {
    if (http.readyState != 4 && http.readyState != 3)
        return;
    if (http.readyState == 3 && http.status != 200)
        return;
    if (http.readyState == 4 && http.status != 200) {
        clearInterval(pollTimer);
        inProgress = false;
    }
    // In Konqueror http.responseText is sometimes null here...
    if (http.responseText === null)
        return;
    while (prevDataLength != http.responseText.length) {
        if (http.readyState == 4 && prevDataLength == http.responseText.length)
            break;
        prevDataLength = http.responseText.length;
        var response = http.responseText.substring(nextLine);
        var lines = response.split('\n');
        nextLine = nextLine + response.lastIndexOf('\n') + 1;
        if (response[response.length - 1] != '\n')
            lines.pop();
        for (var i = 0; i < lines.length; i++) {
            // process each complete line
        }
    }
    if (http.readyState == 4 && prevDataLength == http.responseText.length) {
        clearInterval(pollTimer);
        inProgress = false;
    }
}
The solution you linked to is not AJAX at all, actually. They call it HTTP Streaming but it's essentially just long polling.
In the example they link to, you can see for yourself quite easily with Firebug. Turn on the Net panel: there are no XHR entries, but it takes just a hair over 10 seconds to load the original page. That's because they're using PHP behind the scenes to delay the output of the HTML. This is the essence of long polling: the HTTP connection stays open, and the periodic HTML sent back is JavaScript commands.
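The client side of that long-polling pattern can be written without any framework: re-issue the request as soon as the held-open response completes, so the server controls the pacing. A sketch, assuming a `fetchJson` helper that returns a Promise (the helper name, URL, and back-off interval are illustrative, not from the answer):

```javascript
// Sketch: long-poll loop. Each response immediately triggers the next
// request; on error we back off briefly before reconnecting.
function longPoll(fetchJson, url, onMessage) {
    fetchJson(url)
        .then(function (msg) {
            onMessage(msg);                      // deliver the pushed data
            longPoll(fetchJson, url, onMessage); // wait for the next message
        })
        .catch(function () {
            setTimeout(function () {             // brief back-off on failure
                longPoll(fetchJson, url, onMessage);
            }, 2000);
        });
}
```

Unlike interval polling, the request spends most of its life waiting on the server, so messages arrive as soon as the server emits them.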
You can opt to do the polling completely on the client side, though, with setTimeout() or setInterval()
A jQuery example
<script type="text/javascript">
$(document).ready(function () {
    var ajaxInterval = setInterval(function () {
        $.getJSON(
            'some/servie/url.ext',
            { sample: "data" },
            function (response) {
                $('#output').append(response.whatever);
            }
        );
    }, 10000);
});
</script>
I would take a look at orbited
They use several comet transport implementations, chosen based on configuration and browser sniffing.
See http://orbited.org/svn/orbited/trunk/daemon/orbited/static/Orbited.js
and look for "Orbited.CometTransports"
Some of the different transports must be matched by the backend implementation, so have a look at the server side for orbited also.
