I have a simple Node server (Node.js + Express + EJS rendering). When a user loads a particular page, the server compiles the HTML and sends it in the response.
In order to track users I have added 2 counters:
a. a counter which increments when the server receives a request for that page
b. when the client loads the page, it contains some code which makes an HTTP request back to my server, which I use as a counter
Now the issue is that, as time passes, the difference between the sentResponse counter and the clientLoad counter increases, so much so that I get sentResponse = 7000 and clientLoad = 3600.
Any suggestions on what could cause that kind of behavior?
Note: I also have Cloudflare set up before requests reach my server. I paused it, but I was still getting huge differences.
Note: I also noticed that lots of users are making requests to the page frequently, like multiple times under 4 s, but I am sure that I am sending valid HTML, and the difference is almost 50%, so I don't think that every user who visits the page is pressing Ctrl+R 2 or more times under 4 s.
Code:
server.js
app.get('/dashboard', (req, res) => {
  ...
  ...
  ...
  Tracker.incrementCount('sentResponse');
  res.render('page.ejs', {...});
});
app.get('/client-load-start', (req, res) => {
  Tracker.incrementCount('clientLoadStart');
  res.sendStatus(200);
});
page.ejs
<html>
<head>
...
<script src='/client-load-start?version=<some_random_4_digit_number>'></script>
...
</head>
<body>
...
...
</body>
</html>
I can think of the following possibilities on why the count could be different.
Setting caching headers (Cache-Control) could cause the browser to cache requests.
JS is not enabled in the client's browser, so there is no way to make an AJAX call to the server to update the clientLoad counter (this is less likely, but still a possibility).
People interacted with the page even before it finished loading, causing the AJAX call to not fire, or people reloaded the page before it could load.
If you are simply looking to track:
Page hits, you can use the "sentResponse" counter.
Active users, think about web sockets (Counting the number of hits to my webpage); a rough sketch follows below.
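For the active-user case, a minimal sketch with WebSockets could look something like this (socket.io and the event names are assumptions, not part of your current setup):
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');   // assumed library

const app = express();
const server = http.createServer(app);
const io = new Server(server);

let activeUsers = 0;

io.on('connection', (socket) => {
  activeUsers++;                          // a browser opened the page
  io.emit('activeUsers', activeUsers);    // broadcast the new count

  socket.on('disconnect', () => {
    activeUsers--;                        // the tab or browser was closed
    io.emit('activeUsers', activeUsers);
  });
});

server.listen(3000);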
You can try cache-busting. Every time the browser loads /dashboard, it gets a script whose source points to your /client-load-start route. The problem is that /client-load-start is only actually requested the first time the site loads, or after the browser's cache is cleared. This is why the two counts can differ greatly over time.
Try to make the browser think /client-load-start is a new script that needs to be downloaded. You can do this by adding query parameters to the src.
<script src='/client-load-start?<%- Date.now()%>=1'></script>
Every time the browser goes to /dashboard, it will attempt to download the script again because the query parameters have changed, thus busting the cache.
Sequence of Events
Browser requests /dashboard
Server increases sentResponse, renders and sends page.ejs to the browser
Browser receives page.ejs and detects the <script> tag
Browser checks if the script URL is in the browser cache
The sequence ends here if the script URL is found in the cache; clientLoadStart remains unchanged
Browser requests the script from your server
Server handles the GET request and increases clientLoadStart
Problem
The "problem" here is that the browser - or any proxy between the browser and your server - will cache the returned script unless you instruct it otherwise.
Note that caching is actually expected and intended behaviour in most cases. It’s also the browser’s default behaviour.
Solution
To instruct the browser (or proxies) to avoid caching the file, you can set cache control headers.
app.get('/client-load-start', (req, res) => {
  Tracker.incrementCount('clientLoadStart');
  res
    // Cache-Control directives are comma-separated
    .set('cache-control', 'no-cache, max-age=0, private')
    .sendStatus(200);
});
This would instruct the browser to never cache the request and always get a new copy, thus increasing clientLoadStart with every request.
Another way is "cache busting", meaning you intentionally change the script’s URL, e.g. by appending a random number as you did above, thus "breaking" the cache key.
Related
So I have a quite expensive and complex PHP process whose execution is long-lasting; let's call it function "expensive_process()".
I have an interface which, through a press of a button, fires an ajax request to a PHP script which in turn initiates "expensive_process()". Here's the javascript code:
$('#run_expensive_process_button').click( function(){
var url = "initiate_expensive_process.php";
$.ajax({
url: url
});
});
And initiate_expensive_process.php code:
<?php
session_start();
run_expensive_process();
?>
Simple and trivial. Now the issue with this is that while expensive_process() is running, the browser loses the ability to navigate the domain. If I refresh the browser window, it hangs indefinitely while the process lasts. If I redirect to a different URL under the same domain, same thing. This happens in all browsers. However, if I relaunch the browser (close and open a new window, not a tab), navigation works normally, even though expensive_process() is still running.
I've inspected network traffic, and the HTTP request to initiate_expensive_process.php doesn't get a response while expensive_process() is running, but I'm assuming this shouldn't be locking the browser, given the asynchronous nature of the request.
One more thing, which I believe is relevant. This situation is happening on a replica server. On my local machine, where I run WAMP and the same source code, this is not happening, i.e., while expensive_process() is running, I'm still able to navigate the hosting domain without having to relaunch the browser. This seems to be an indication of a server configuration problem of some sort, but I'm not sure I can rule out other possible reasons.
Anyone know what might be causing this or what can be done to figure out the source of the problem?
Thanks
Most likely the other PHP scripts also use session variables. Only one script can access a session at a time; if a second script tries to access the session while the first script is still running, it will be blocked until the first script finishes.
The first script can unlock the session by calling session_write_close() when it's done using the session. See "If call PHP page via ajax that takes a while to run/return (and it sets session variables), will a 2nd ajax call see those session changes?" for more details about how you can construct the script.
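For example, a minimal sketch of initiate_expensive_process.php that releases the lock before the heavy work starts (assuming the process itself doesn't need to write to the session afterwards):
<?php
session_start();

// Read anything you need from the session here, before releasing it.

// Release the session lock so other requests from the same browser
// are no longer blocked while the long-running work executes.
session_write_close();

run_expensive_process();
?>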
I wonder whether it might be due to ajax, since the JavaScript is being executed client-side.
Maybe you might consider a stringified JSON call instead of ajax?
1) I need the following requirement to be satisfied:
The client's request (a long-running process) should wait while the server is serving the request.
Current solution:
The client initiates the request, followed by a ping request every 5 seconds to check the request status, which also maintains the session.
2) If the client moves to another tab in the application and comes back, the client should still show the process status and the server should continue working on the request.
3) If the client closes the browser or logs out, the server should stop the process.
PS: The functionality is needed for all browsers after IE 9, plus Chrome and Firefox.
There are many ways to skin a cat, but this is how I would accomplish it (a rough client-side sketch follows after these steps).
1. Assign a unique identifier to the request (you most likely have done this already, since you're requesting the ready state every few seconds).
2. Set a member of their session data to the unique ID.
3. Have all your pages load the JS needed to continually check the process, but the JS should NOT use any identifier.
4. In the script that handles the ajax request, check the session for the unique identifier, update an internal store (file or database) with the unique identifier and the time of the last request, and push back details if there are details to be pushed.
5. In another system (like a cron job), or within the process itself (if it runs in a loop, for example), check the same database or file for the unique identifier and its last timestamp. If the timestamp is too old, let's say 15 seconds (remember that page load times may delay the 5-second interval), kill the process if cron'd, or have the process terminate itself if the check lives inside the process script.
6. Logout will destroy the session data, making the updating of the table/file impossible (and there should be a check for this), so within a few seconds of logout the process stops.
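A rough client-side sketch of that heartbeat, assuming jQuery and a hypothetical /process-status endpoint (the server side would look up the unique ID from the session, record the ping time, and return any progress details):
function pollProcessStatus() {
  $.ajax({
    url: '/process-status',   // hypothetical endpoint; the server finds the ID in the session
    dataType: 'json'
  }).done(function (status) {
    if (status && status.details) {
      console.log('progress:', status.details);   // push back details, if any
    }
  }).always(function () {
    setTimeout(pollProcessStatus, 5000);           // keep the 5-second heartbeat going
  });
}

pollProcessStatus();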
You will not be able to find a reliable solution for logout. window.onbeforeunload will not allow you to communicate with the server (you can only prompt the user with the built-in dialog, and that's pretty much it). Perhaps, instead of trying to capture logout/abandon, add some logic to the server's process to wait for those pings (maybe allow 30 seconds of no communication before abandoning); that way you're not wasting the server's cycles as much and you still have the monitoring working as before.
I've been working on an automatic log-out functionality for my web page. As a part of that, I implemented the window.location.reload(true) in 3 different places in my code. Two of them are automatic, one is attached to a link. The link always works, but the automatic ones don't always work, and I don't understand why.
For the automatic logouts, one is set by a debouncer:
var userActionTimeout = debounce(function(e) {
  console.log("inaction timeout, reloading page");
  window.location.reload(true);
}, 15000);
$(document.body).on('mousemove keydown click scroll',userActionTimeout);
Which theoretically should reload the page after a certain amount of inactivity.
The other two happen when certain types of AJAX data submission (e.g. blatantly wrong data that could only be sent if the client was modified) trigger a log-out. Of course, any further AJAX submissions are ignored by the server, and the next page the server will serve the client is a login page. In the event this happens inadvertently, the AJAX response sends the client an error message that includes the following:
refresh to continue session
I also implemented a timeout that fires if this link is served, which happens after the AJAX response is received:
if (typeof response._forceRefresh !== 'undefined' && response._forceRefresh) {
console.log('reload firing');
/*
some code to insert the link into a spotlight here
*/
setTimeout(function(){console.log('reloading in 3s...');},7000);
setTimeout(function(){
console.log('reloading...');
window.location.reload(true);
},10000);
}
However, the issue I'm having is this: most of the time the debounce page reload works (tested in Firefox and Chrome), but occasionally it doesn't. The link always works, but the AJAX response reload is about 50/50. I know it receives the response from the server, since the link shows up, but quite often it doesn't actually reload the page automatically.
What is going on?
Whenever I get inconsistency on a web page, it usually involves caching that I didn't realize was happening. If you haven't already, look through your project with that in mind and see if there is an affected location where you can force it not to cache a page.
Another idea might be to try using the meta refresh element. There is another thread where this is suggested: auto logout idle timeout using jquery php
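For reference, the meta refresh is a single tag in the page's <head>; the 900-second timeout and the /logout URL below are just placeholders:
<meta http-equiv="refresh" content="900;url=/logout">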
I have a static page (by static, I mean html, css and javascript are fixed on the page).
On the server side, I can tell (based on server-side session info) whether this is the first loading of the page or not by a particular user. Now, based on this detail, I want to signal the front end (e.g. by setting something in the response header, maybe?), so that a javascript function on the page can perform some action based on whether this is the first time the user has landed on this page.
The server that actually serves the page is Nginx, and the server that handles the logic, session, etc is Tornado. So presumably I need to do something in tornado, and then instruct nginx to deliver the static page.
Is this doable? If so, what is the most robust way of doing so?
You could achieve this with cookies: http://www.quirksmode.org/js/cookies.html
With cookies you would send a specific cookie with the response headers on the first view. Then, each time the page loads, have your JavaScript check for that specific cookie.
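A minimal sketch of that check, assuming the server sets a hypothetical seen_before cookie in its Set-Cookie header on the first response:
window.onload = function () {
  // Look for the "seen_before" cookie the server sets on the first view.
  var seenBefore = document.cookie.split(';').some(function (c) {
    return c.trim().indexOf('seen_before=') === 0;
  });
  if (seenBefore) {
    // Not the first time viewed
  } else {
    // First view
  }
};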
Or use local storage: http://diveintohtml5.info/storage.html
Similar to the cookie method, but on the first view have javascript store a key=value pair in local storage, and check against that each time the page loads. Something along these lines:
(function () {
window.onload = function () {
if (localStorage.getItem("SeenBefore")) {
// Not first time viewed
} else {
localStorage.setItem("SeenBefore", true);
// First view
}
}
}())
Another option is, on the server side, you could set a session variable the first time the page is viewed, then check against that each time a user requests the page. This could also be achieved using cookies.
I have an ASP.NET page where a request is made and, after a while, the server returns either a new page or just a file for download. I want to indicate on screen that the server is "Processing..." during the time before it returns data.
Calling JavaScript when the user hits submit is easy. Also, the reload of the page on postback causes any "Processing..." indicators (some DIVs popping up at the top of the page) to go away.
My problem is mostly with the cases where the data returned by the server is not a page but a file to store. How can I catch the moment the server starts to return data and run some JavaScript to remove the "Processing" DIV? Is there even a way to do so when the reply has a different MIME type?
In which cases is it even possible?
There are a couple of ways to approximate what you're trying to do with timers and assumptions about what happened, but to really do what you're describing, you need to be polling the server for an indication that the download occurred.
What I would do is take the file, Response.WriteFile it, and then write a flag to some store (a db, the file system, or whatever) that uniquely identifies that the transaction has completed. On the client side, your script polls the server, and on the server, the poll response checks the store for the flag indicating that the download has occurred.
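On the client side, the polling half might look roughly like this, assuming jQuery and a hypothetical /download-status endpoint keyed by whatever unique identifier you attach to the transaction:
function waitForDownload(token) {
  // Ask the server every couple of seconds whether the completion flag was written.
  var timer = setInterval(function () {
    $.getJSON('/download-status', { token: token }, function (status) {
      if (status && status.completed) {
        clearInterval(timer);
        // hide the "Processing..." indicator here
      }
    });
  }, 2000);
}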
The key here is that you have to take finer control of the download process itself...merely redirecting to the file is not going to give you the control you need. If you need more specifics on how to accomplish any of these steps, let me know.