How can I "ping" the user in JavaScript? - javascript

I need to measure the time it takes a request from the client to reach the server, and the other way around.
I have been able to do this in Python, but that does not help me here on a webpage.
I heard that it can be done using Ajax. Is that true?
If so, can you give me some details or pointers on where I should start?
Thank you, looking forward to an answer!

You could emulate a ping via HTTP, although that is not very accurate. A simple way would be to record the current timestamp in ms, send a request, and wait for the response. The difference between the time the response arrives and the time you sent the request is the so-called round trip time (RTT). Divide it by two and you get an estimate of the one-way latency, which is roughly what ping reports.
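As a rough illustration, a minimal sketch might look like the following. It assumes a hypothetical /ping endpoint on your own server that simply returns an empty response as fast as possible.

function measureRtt(callback) {
    var start = Date.now();                  // timestamp in ms right before the request
    $.ajax({
        url: '/ping',                        // hypothetical lightweight endpoint on your server
        cache: false,                        // avoid a cached response skewing the measurement
        success: function () {
            var rtt = Date.now() - start;    // full round trip time in ms
            callback(rtt, rtt / 2);          // RTT and a rough one-way ("ping") estimate
        }
    });
}

measureRtt(function (rtt, oneWay) {
    console.log('RTT: ' + rtt + ' ms, approx. one-way: ' + oneWay + ' ms');
});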

On newer browsers, you can use the Navigation Timing API (see the HTML5 Rocks article on navigation timing) to get a fine-grained breakdown of the time it takes to load your page. You can simply subtract the relevant fields of performance.timing to get the timing you are interested in. Either (secureConnectionStart - connectStart) or (connectEnd - connectStart), depending on whether the connection uses SSL, looks like a reasonable approximation of ping time (though, as rekire points out, you probably want to include more than just that if you are trying to measure the overall user-visible latency of your website for end users).
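For instance, a minimal sketch using the legacy performance.timing object described in that article:

if (window.performance && performance.timing) {
    var t = performance.timing;
    // The TCP handshake is roughly one round trip, so it approximates ping.
    var pingApprox = t.secureConnectionStart > 0
        ? t.secureConnectionStart - t.connectStart   // SSL connection: stop before TLS negotiation
        : t.connectEnd - t.connectStart;             // plain HTTP connection
    console.log('approx. ping: ' + pingApprox + ' ms');
}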

For your reference, an Ajax ping works as shown below.
Unfortunately, it is quite possible you will be blocked by the browser's same-origin policy and never get a success.
function ping() {
    $.ajax({
        url: 'http://website.com',
        success: function (result) {
            alert('replied');
        },
        error: function (result) {
            alert('error');
        }
    });
}
Please see this code as an example of a seemingly successful implementation for what you are trying to do.

Related

Race conditions during simultaneous link click and asynchronous AJAX request?

I'm currently facing a situation similar to the relatively-simple example shown below. When a user clicks on a link to a third-party domain, I need to capture certain characteristics present in the user's DOM and store that data on my server. It's critical that I capture this data for all JS-enabled users, with zero data loss.
I'm slightly concerned that my current implementation (shown below) may be problematic. What would happen if the external destination server was extremely fast (or my internal /save-outbound-link-data endpoint was extremely slow), and the user's request to visit the external link was processed before the internal AJAX request had enough time to complete? I don't think this would be a problem (because in this situation, the browser doesn't care about receiving a response from the AJAX request), but getting some confirmation from fellow developers would be much appreciated.
Also, would the answer to the question above vary if the <a> link pointed to an internal URL rather than an external one?
<script type="text/javascript">
    $(document).ready(function() {
        $('.record-outbound-click').on('click', function(event) {
            var link = $(this);
            $.post(
                '/save-outbound-link-data',
                {
                    destination: link.attr('href'),
                    category: link.data('cat')
                },
                function() {
                    // Link tracked successfully.
                }
            );
        });
    });
</script>
<a href="http://www.stackoverflow.com" class="record-outbound-click" data-cat="programming">
Visit Stack Overflow
</a>
Please note that using event.preventDefault(), along with window.location.href = link.attr('href') inside $.post's success callback, isn't a viable solution for me. Neither is sending the user to a preliminary script on my server (for instance, /outbound?cat=programming&dest=http://www.stackoverflow.com), capturing their data, and then redirecting them to their destination.
Edit 2
Also consider the handshake step (Google's docs):
Time it took to establish a connection, including TCP handshakes/retries and negotiating SSL.
I don't think you and the server you're sending the AJAX request to can complete the handshake if your client is no longer open for connection to the server (i.e., you're already at Stackoverflow or whatever website your link navigates to.)
Edit 1
More broadly, though, I was hoping to understand from a theoretical point of view whether or not the risk I'm concerned about is a legitimate one.
That's an interesting question, and the answer may not be as obvious as it seems.
That's just a sample request/response from my network tab; it definitely shouldn't be taken as representative of requests/responses in general.
I think the gap we might be most concerned with is the 1.933 ms stall time. There are also other steps that need to happen before the actual request is sent (the send itself took about 0.061 ms).
I'd be worried if there's an interruption in any of the 3 steps leading up to the actual request (which took about 35ms give or take).
I think the question is, if you go somewhere else before the "stalled", "DNS Lookup", and "Initial connection" steps happen, is the request still going to be sent? That part, I don't know. But what about any general computer or browser lag beforehand?
Like you mentioned, the idea that the request/response cycle to/from Stack Overflow would somehow be faster than what's happening on your client (i.e., the mere initiation, not even the complete cycle, of a network request to your own server) is probably a bit far-fetched. But theoretically (which, as you said, is what you're interested in), it's probably a bad idea in general to depend on these kinds of race conditions.
Original answer
What about making the AJAX request synchronous?
$.ajax({
    type: "POST",
    url: url,
    async: false
});
This is generally a terrible idea, but if, in your case, the legacy code is so limiting that you have no way to modify it and this is your last option (think, zombie apocalypse), then consider it.
See jQuery: Performing synchronous AJAX requests.
The reason it's a bad idea is because it's completely blocking (in normal circumstances, you don't want potentially un-completeable requests blocking your main thread). But in your case, it looks like that's actually exactly what you want.

javascript/ajax – ajax requests very slow - time out issue?

I programmed an experiment in javascript/jquery extensively using ajax to communicate with the server / execute php scripts on the server.
Now a friend of mine wants to use the scripts in Japan and is running into problems that I haven't encountered before.
For example, he has to press submit buttons several times instead of simply being redirected to the next page, or he sees placeholder symbols instead of the data that is stored in the MySQL database and should have been fetched (to replace the placeholders).
In the latter case, the script is quite straightforward. A $.post request is executed in the document ready function, which then executes a function in which the JavaScript replacements take place.
Now we are somewhat lost as to what could be causing this. It works fine on my side.
The server is located in Europe, and he has also had some issues with proxy servers.
Could that be a problem?
Could it be that the $.post commands are just very slow or even time out?
Is there anything that can be done about this to make the scripts run reliably on his side?
I am happy to provide more specific information if needed, however currently I am a little bit lost and don't know what to look for specifically.
Thanks for any help!
edit:
Here is the jquery code for one specific example where we encountered this problem:
function updateContent() {
    $.post("php/earnings.php",
        {
            type: "contribution",
            grp: group,
            pbnr: PbNr,
            multiplier: multiplier
        }, processAndShow);

    function processAndShow(data) {
        // here are just jquery commands updating html elements
    }
}

$(document).ready(function() {
    updateContent();
});
The earnings.php file contains roughly 150 lines of code with 15 mysql_query calls.
I also had a look at the connection diagnostics in Firefox for the earnings.php $.post request:
Connection: "Keep-Alive";
Keep-Alive: "timeout=1, max=96"
Could it have something to do with the timeout / max settings of the Apache server?
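One thing that might help narrow it down on his side (a diagnostic sketch only; the 30-second timeout is an arbitrary assumption): replace the bare $.post with $.ajax so you get an explicit timeout and an error callback, and log what actually comes back:

function updateContent() {
    $.ajax({
        type: 'post',
        url: 'php/earnings.php',
        data: {
            type: 'contribution',
            grp: group,
            pbnr: PbNr,
            multiplier: multiplier
        },
        timeout: 30000,              // fail after 30 s instead of hanging silently
        success: processAndShow,     // your existing handler
        error: function (xhr, status, err) {
            console.log('earnings.php failed: ' + status + ' ' + err);
        }
    });
}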

Long Polling: How do I calm it down?

I'm working on a simple chat implementation: a function makes an ajax call and, on success, invokes setTimeout to call itself again. This runs every 30 seconds or so. It works fine, but I'd like a more immediate notification when a message has arrived. I'm seeing a lot of examples of long polling with jQuery code that looks something like this:
function poll()
{
    $.ajax(
    {
        data: {"foo": "bar"},
        url: "webservice.do",
        success: function(msg)
        {
            doSomething(msg);
        },
        complete: poll
    });
}
I understand how this works, but this will just keep repeatedly sending requests to the server immediately. Seems to me there needs to be some logic on the server that will hold off until something has changed, otherwise a response is immediately sent back, even if there is nothing new to report. Is this handled purely in javascript or am I missing something to be implemented server-side? If it is handled on the server, is pausing server execution really a good idea? In all of your experience, what is a better way of handling this? Is my setTimeout() method sufficient, maybe with just a smaller timeout?
I know about websockets, but as they are not widely supported yet, I'd like to stick to current-gen techniques.
Do not pause the server execution... it will drain server resources if a lot of people try to chat...
Use the client side to manage the pause time, as you did with setTimeout, but with a lower delay, as sketched below.
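A sketch of that client-managed delay, reusing the poll() example from the question (the 2-second delay is arbitrary):

function poll() {
    $.ajax({
        data: { "foo": "bar" },
        url: "webservice.do",
        success: function (msg) {
            doSomething(msg);
        },
        complete: function () {
            setTimeout(poll, 2000);   // wait 2 s before the next request instead of re-polling immediately
        }
    });
}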
You missed the long part in "long polling". It is incumbent on the server to not return unless there's something interesting to say. See this article for more discussion.
You've identified the trade-off: holding connections open on the web server, therefore consuming HTTP connections (i.e. the response must block server-side), vs. frequent "is there anything new" requests, therefore consuming bandwidth. WebSockets may be an option if your browser base can support them (most "modern" browsers: http://caniuse.com/websockets).
There is no proper way to handle this on the javascript side through traditional ajax polling as you will always have a lag at one end or the other if you are looking to throttle the amount of requests being made. Take a look at a nodeJS based solution or perhaps even look at the Ajax Push Engine www.ape-project.org which is PHP based.
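As a rough idea of what the server-side half of long polling looks like, here is a minimal Node.js sketch (mentioned only because the answer above suggests a Node-based solution; it is a hypothetical example, not a drop-in for a PHP setup). The server holds each /poll response open until a new message arrives or a 25-second timeout elapses.

var http = require('http');
var waiting = [];   // responses from clients that are currently long-polling

http.createServer(function (req, res) {
    if (req.url === '/poll') {
        waiting.push(res);                       // hold this response open
        setTimeout(function () {                 // give up after 25 s so the client simply re-polls
            var i = waiting.indexOf(res);
            if (i !== -1) {
                waiting.splice(i, 1);
                res.end(JSON.stringify({ messages: [] }));
            }
        }, 25000);
    } else if (req.url === '/send') {
        // a new message arrived: answer everyone who is waiting
        waiting.forEach(function (pending) {
            pending.end(JSON.stringify({ messages: ['new message'] }));
        });
        waiting = [];
        res.end('ok');
    }
}).listen(8080);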

$.ajax cache true duration

I'm trying to get a list of tweets (in this case 25 with a specific hashtag) using $.ajax and twitter's search functionality.
There's 2 things that are a bit unclear to me:
How long is the duration for cache: true (how long does it take until it sends a new request instead of using the cache)? At the moment it seems to be every few seconds, but I'd like to know the precise duration and if/how this can be altered.
Retrieving this information from Twitter will be done by about 2000-3000 people at random moments, but probably not by that same person for a little while after. How will Twitter respond to this (block the IP address because of too many requests, maybe)? This is why I want to cache the information for about 1-5 minutes, to reduce the number of requests.
TL;DR: I'm making a lot of requests to the Twitter search functionality; how do I lessen the load?
The cache time for $.ajax is most likely browser dependent, so you should not rely on it to balance your load against the Twitter API. This caching can be turned off by passing { cache: false } in the settings parameter. Instead you should implement your own caching system.
Twitter has a page describing this: https://dev.twitter.com/docs/rate-limiting.
It says that Twitter sends some headers to tell you how many requests you have left within a certain timespan. What you will have to do is cache long enough that you do not exceed the limits Twitter imposes on you. You should do some testing to see how different load on your service affects the total number of calls to the Twitter API and design your cache system accordingly. The optimal solution here will probably be something like "from a to b o'clock I will need to cache for c minutes, and from d to e o'clock I will need to cache for f minutes". Another solution might be to implement a system that keeps track of load as it happens and determines cache times in real time. If both of these solutions seem too daunting, you can probably get away with just picking a cache time long enough that you never exceed the limit.
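A very small client-side version of that idea might look like the following sketch; the searchUrl variable, the 2-minute window, and the callback shape are all placeholders you would adapt:

var cachedTweets = null;
var cachedAt = 0;
var cacheMs = 2 * 60 * 1000;   // e.g. cache for 2 minutes; tune against the rate limit

function getTweets(callback) {
    var now = Date.now();
    if (cachedTweets && (now - cachedAt) < cacheMs) {
        callback(cachedTweets);          // still fresh, no request made
        return;
    }
    $.ajax({
        url: searchUrl,                  // placeholder for your Twitter search URL
        dataType: 'json',
        cache: false,                    // let our own cache control freshness instead
        success: function (data) {
            cachedTweets = data;
            cachedAt = now;
            callback(data);
        }
    });
}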

jQuery: Using a single Ajax call, receive progressive statuses instead of one single response?

I'm just wondering... is it possible to receive multiple responses from a single ajax call?
I'm thinking purely for aesthetic purposes to update the status on the client side.
I have a single ajax method that's called on form submit
$.ajax({
    url: 'ajax-process.php',
    data: data,
    dataType: 'json',
    type: 'post',
    success: function (j) {
    }
});
I can only get one response from the server-side. Is it possible to retrieve intermittent statuses? Such as:
Default (first): Creating account
Next: Sending email confirmation
Next: Done
Thanks for your help! :)
From a single ajax call, I don't think it is possible.
What you could do is check frequently where the process is (that's what is used for the upload bars in Gmail, for example). You do a first ajax request to launch the process, and then a series of ajax requests to ask the server how it is doing. When the server answers "I'm done", you're good to go; until then, you can have the server respond with its current state.
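A rough sketch of that pattern, where ajax-process.php kicks off the work and a hypothetical status.php endpoint reports progress as JSON (both the endpoint and its response shape are assumptions):

// Launch the long-running job; its success callback only fires when everything is finished.
$.post('ajax-process.php', data, function () {
    $('#status').text('Done');
}, 'json');

// Meanwhile, poll the separate status endpoint for intermediate progress.
function checkStatus() {
    $.getJSON('status.php', function (state) {   // e.g. { step: "Sending email confirmation", done: false }
        if (!state.done) {
            $('#status').text(state.step);       // show the current step to the user
            setTimeout(checkStatus, 2000);       // ask again in 2 s
        }
    });
}
checkStatus();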
There is something called comet which you can set up to "push" requests to client, however it is probably way more than what you are wanting to invest in, time-wise.
You can open up a steady stream from the server, so that it continues to output, however I'm not sure how client-side script can handle these as individual "messages". Think about it like a server that outputs some info to the browser, does more work, outputs some more to the browser, does more work, etc. This shows up more or less in real time to the browser as printed text. It is one long response, but it is still one response. I think ajax only handles a response once it finished being sent, but maybe someone else will know more than me on the topic.
But you couldn't have the server output several individual responses without reloading itself, at least not with PHP, because once you start outputting the response, the response has begun and you can't chop that up without finishing the response, which happens when the script is done executing.
Your best bet is with the steady stream, but again, I'm not sure how ajax handles getting responses in chunks.
Quick Update
Based on the notes for this plugin:
[http://plugins.jquery.com/project/ajax-http-stream]
things don't look promising. Specifically:
Apparently the trend is to disallow access to the xmlhttprequest.responseText before the request is complete (stupid imo). Sorry there's nothing I can do to fix this
Thus, not only can you not get what you want in one request, you probably can't get it multiple requests, unless you want to break up the actual server-side process into several parts, and only have it continue to the next step when an ajax function triggers it.
Another option would be to have your script write its status at specific points to another file on the server, call it "status.xml" or "status.txt". Have your first ajax function initialize the process, and have a second ajax function that queries this status file and outputs the result to the user.
It is possible, but it has more to do with your backend script. As Anthony mentioned, there is a technique called comet. Another term I've heard is "long polling". The idea is that you delay the point at which your PHP (insert language of choice) script finishes processing.
In php you can do something like this:
// Keep the script alive until there is something to report.
while ($response !== "I'm done") {
    sleep(1);   // pause for 1 second, then check again
    // ... re-check $response here (e.g. query the database) ...
}
echo $some_value;   // send the long-awaited response
exit();
This code stops your script from finishing immediately. sleep(1) pauses the script and lets the server rest for 1 second before it loops back through. You can adjust the sleep time based on your needs. In PHP, the time the script spends sleeping is not counted against the script's execution time limit.
You'll obviously need to add more checks to your code. You'll probably also want to allow for an abort call, something like sending a GET request to kill the backend script, maybe on the JavaScript unload event.
In the tests that I've done, I made the initial ajax call, and when the value was returned, I made another ajax call; that way your back-end script won't time out.
I've only played around with this on my local server, so I'm not sure how real-world this is, but it works.
