1) I need the following requirement to be satisfied:
The client's request (a long-running process) should wait until the server has finished serving the request.
Current solution:
The client initiates the request and then sends a ping request every 5 seconds to check the request status,
which also keeps the session alive.
2) If the client moves to another tab in the application and comes back, the client should still show the process status and the server should continue working on the request.
3) If the client closes the browser or logs out, the server should stop the process.
PS: The functionality is needed in all browsers after IE 9, plus Chrome and Firefox.
There are many ways to skin a cat, but this is how I would accomplish it.
1. Assign a unique identifier to the request (you have most likely done this already, since you're requesting the ready state every few seconds).
2. Store the unique ID in the user's session data.
3. Have all your pages load the JS needed to continually check the process, but the JS should NOT use any identifier.
4. In the script that handles the AJAX request, check the session for the unique identifier, update an internal store (file or database) with the identifier and the time of the last request,
and push back details if there are details to be pushed.
5. In another system (a cron job, for example) or within the process itself (if it runs in a loop) check the same database or file system for the last timestamp recorded against the unique identifier. If the timestamp is too old, let's say 15 seconds (remember page load times may delay the 5-second interval), then kill the process from the cron job, or have the process terminate itself if the check runs inside the process script (a sketch of such a watcher follows below).
6. Logout kills the session data, which makes updating the table/file impossible (and a check should be there for this), so within a few seconds of logout the process stops.
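Purely as an illustration, here is a minimal PHP sketch of that watcher; the process_heartbeats table, its columns, and the idea that the worker stored its OS pid are all assumptions about how the pings get recorded. It would be run from cron every few seconds:

<?php
// watcher.php -- hypothetical cron job that kills a long-running process once
// its client has stopped pinging. Table, columns and the stored OS pid are
// assumptions about how the pings get recorded.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stale = $db->query(
    "SELECT process_uid, pid FROM process_heartbeats
     WHERE last_ping < NOW() - INTERVAL 15 SECOND"
);

foreach ($stale->fetchAll(PDO::FETCH_ASSOC) as $row) {
    // 15 = SIGTERM; requires the POSIX extension.
    if (function_exists('posix_kill')) {
        posix_kill((int) $row['pid'], 15);
    }
    $db->prepare("DELETE FROM process_heartbeats WHERE process_uid = ?")
       ->execute([$row['process_uid']]);
}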
You will not be able to find a reliable solution for logout. window.onbeforeunload will not allow you to communicate with the server (you can only prompt the user using the built-in dialog, and that's pretty much it). Perhaps, instead of trying to capture logout/abandon, add some logic to the server's process to wait for those pings (maybe allow 30 seconds of no communication before abandoning); that way you're not wasting the server's cycles that much and you still have the monitoring working as before.
Related
I have a PHP page named update_details.php?id=xyz which runs a query to fetch the details and update the user's login time.
The users have a profile page named profile.php?id=xyz, so for different users the profile page is different, like profile.php?id=abc, profile.php?id=def, etc. This profile.php has an AJAX function that sends the user ID to update_details.php through an AJAX call so that update_details.php can update the record.
Now, for example, if I have 2000 users and all of them open their profile pages simultaneously, will the update_details page be able to handle this? I mean, is it one update_details.php, or is each of update_details.php?id=abc, update_details.php?id=def, etc. considered a separate one?
To be more precise, when 2000 users are updating their records through 2000 AJAX calls, do the calls go to one update_details.php or to separate ones according to their IDs, like update_details.php?id=abc, update_details.php?id=def, etc.? TIA
Okay, let's look at how a request travels from the browser until it is served and the browser gets a response.
The client clicks on a link, or maybe a button.
The browser builds an HTTP request and sends it to the server (Apache, nginx, whatever you use).
The server analyzes the request and checks its rules, e.g. "when I hit a URL with a .php extension, I run a PHP interpreter and pass it the request info".
The server spawns a new process or assigns the request to one of its workers (this depends on the internals of the server).
How many concurrent PHP processes will run? That depends on the web server's configuration and design.
So to answer your question: each running PHP process has its own isolated memory segment, even if they are all executing the same instructions from update_details.php.
Think of it like 10 workers in a factory crafting chairs from the same instructions, but each one uses a different paint color, wood type, etc.
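To make the isolation concrete, here is a hedged sketch of what update_details.php itself might look like (PDO, the users table, and its columns are assumptions, not your actual code). All 2000 requests run this same file, but each in its own process with its own copy of $_GET['id']:

<?php
// update_details.php -- every request runs this same file, but each one runs
// in its own PHP process/worker with its own copy of these variables.
// The users table and its columns are assumptions for illustration.
$id = $_GET['id'] ?? '';
if ($id === '') {
    http_response_code(400);
    exit('missing id');
}

$db = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass',
              [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// A prepared statement updates only this request's row.
$stmt = $db->prepare('UPDATE users SET last_login = NOW() WHERE user_id = ?');
$stmt->execute([$id]);

echo 'ok';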
I created a tabulation system for beauty pageants that shows the judges' scores on a projector. I created a presentation page for it using CodeIgniter.
The HTML of that presentation page is generated purely in JavaScript. The page refreshes each second to get real-time data sent by the judges.
The not-so-cool thing about this logic is that when the page writes a lot of data, it blinks every second, so the refreshing of the page is noticeable and somewhat disturbing.
This is a snippet of the code I'm working on.
$(document).ready(function() {
    getJudgesScore();
    setInterval(function() {
        if (getNumFinalists() == 0)
            getJudgesScore();
        else {
            window.open('presentationFinalists', '_self', false);
        }
    }, 1000);
});
You can imagine how much data is being sent and received every time this code is executed.
To sum this up, what I want to accomplish is that instead of the client asking for data every second, the server initiates the connection every time new data is saved to the database. Thank you for taking the time to read my concern.
This might help you take the necessary data from the MySQL server and send it to the client page.
The jQuery timer runs after a particular interval of time.
<script src="../JS/Timer/jquery.timer.js"></script>
var timer = $u.timer(function() {
    getJudgesScore();
});
timer.set({time: 1000, autostart: true});
Also refer to this link:
https://code.google.com/p/jquery-timer/source/browse/trunk/jquery.timer.js?r=12
What you are attempting is a tricky -- but not impossible -- proposition.
@Chelsea is right -- the server can't genuinely initiate a connection to the client -- but there are several technologies that can emulate that functionality, using client connections that are held open for future events.
Those that come to mind are EventSource, Web Sockets, and long polling... all of which have various advantages and disadvantages. There's not one "correct" answer, but Google (and Stack Overflow) are your friends.
Of course, by "server," above, I'm referring to the web server, not the database. The web server would need to notify the appropriate clients of data changes as it posts them to the database.
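As a rough sketch of the EventSource option, assuming a PHP endpoint behind the web server (the script name, the scores table, and its columns are assumptions), the server keeps the connection open and pushes an event whenever it sees new rows:

<?php
// scores_stream.php -- a sketch of the EventSource option; the browser would
// connect with new EventSource('scores_stream.php'). The scores table and its
// columns are assumptions.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0);

$db = new PDO('mysql:host=localhost;dbname=pageant', 'user', 'pass');
$lastId = 0;

while (!connection_aborted()) {
    $stmt = $db->prepare('SELECT id, judge, score FROM scores WHERE id > ? ORDER BY id');
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($rows) {
        $lastId = (int) $rows[count($rows) - 1]['id'];
        echo 'data: ' . json_encode($rows) . "\n\n";   // one SSE message per batch
        @ob_flush();
        flush();
    }
    sleep(1);
}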
To get real-time notification of events from the MySQL server itself (delivered to the web server) is also possible, but requires access to and decoding of the replication event stream, which is a complicated proposition. The discovered events would then need to result in an action by the web server to notify the listening clients over the already-established connections using one of the mechanisms above.
You could also continue to poll the server from the browser, but only exchange enough data via AJAX to update what you need. If you included some kind of indicator in your refresh requests, such as a timestamp you received in the prior update, or some kind of monotonic global version ID such as the one MySQL's UUID_SHORT() function generates, you could send a very lightweight 204 No Content response to the client whenever nothing has changed, indicating that the browser does not need to update anything.
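A minimal sketch of such a polling endpoint in PHP, assuming the client echoes back a version number it received earlier (the scores table, its version column, and the since parameter are assumptions):

<?php
// check_scores.php -- hypothetical lightweight polling endpoint. The client
// sends ?since=<version it already has>; table/column names are assumptions.
$since = (int) ($_GET['since'] ?? 0);

$db = new PDO('mysql:host=localhost;dbname=pageant', 'user', 'pass');
$current = (int) $db->query('SELECT COALESCE(MAX(version), 0) FROM scores')->fetchColumn();

if ($current <= $since) {
    // Nothing changed: 204 tells the browser there is nothing to redraw.
    http_response_code(204);
    exit;
}

// Otherwise send only what the client is missing.
$stmt = $db->prepare('SELECT judge, score, version FROM scores WHERE version > ?');
$stmt->execute([$since]);
header('Content-Type: application/json');
echo json_encode(['version' => $current, 'scores' => $stmt->fetchAll(PDO::FETCH_ASSOC)]);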
As some of you probably know, Facebook uses this kind of "system", where a popup is displayed when a user session is lost due to inactivity or a remote session close. I already saw and read this Node.js question but didn't find anything.
I am working for a Canadian computer business; our main product is a CRM, and everything is coded in Classic ASP.
I know.
The whole web-based application works great, and since we host the site on our own servers, it is possible, if necessary, to open ports and use sockets.
Here goes the main question: is there a way (using a JavaScript library or maybe a jQuery plug-in) to trigger a client-side event when the session expires or is simply lost, due to a server reset for example?
Of course, the best would be to use a solution other than sending an AJAX request every second to validate whether the user session still exists. If it helps, there is a maximum of about 3,500 users connected at the same time, and our servers can easily handle more traffic. The servers run Windows Server 2008 with IIS 7.
Unfortunately, I cannot provide any code blocks or screenshots for this question since there is nothing to debug.
One idea would be to send an AJAX request to a file that does not return anything and just hangs there. If the session is lost (inactivity or server reset), the AJAX request will fail and the "error" callback will be triggered. Would that be something to consider?
Or any other suggestions?
One way to do it is to set a client-side timer to the same duration as the session expiration time.
Let's say your session is set to expire after 20 minutes. When the page loads, a client-side timer set to 20 minutes kicks in. If the user does any server interaction (submits a form, etc.), the timer is reset. But if nothing happens during those 20 minutes, the timer counts down and you get your event.
You could do the following to achieve this, assuming you have a default session timeout of 20 minutes:
Ensure that each user has a "session cookie" issued by you, NOT the default ASP session cookie:
dim live_session_id
live_session_id = Request.Cookies("livesession")
if live_session_id = "" then
    live_session_id = create_unique_session_id()
    Response.Cookies("livesession") = live_session_id
end if
Save this live_session_id in a database along with the expected session expiry date:
call updateSession(live_session_id, dateadd("n", 20, now())) ' = now()+20min
Output the live_session_id somewhere on your page so you can access it via JS.
Implement a server-side ASP script that checks the current session state for a given live_session_id, and make it accessible in IIS on a DIFFERENT subdomain, so calls to this check will NOT refresh the ASP session. The script could return the time difference between now and the session end, so you could show how long the session will remain valid.
Add some AJAX code to call the check script every other second, so you can issue a warning when the session time draws to an end.
To make it detect an IIS reset, you could clear the saved sessions in the database by implementing Application_OnStart in global.asa. This way, your clients would detect a session loss caused by an IIS reset.
Another quick and dirty method:
On every page load, you could let a JavaScript timer count down from 20:00 minutes and display a session-lost warning after this time. This is what my online banking system uses... :)
As far as I understand you, the main problem is that the user has to fill out enormous forms. That can take some time, and during that time the session could expire.
Furthermore, the session could be ended by something else (an iisreset, for example) while the user fills out the form.
In my understanding, you do not have to notify the client that the session is lost/expired/ended/abandoned. It would be enough to just show a login form (via AJAX or something) when the user submits the form OR when the next request (by AJAX, as you mentioned) is made by the client.
The called ASP script checks whether the session is valid; if not, a popup or overlay is shown to log the user in via AJAX first, and the form is submitted afterwards.
You could use an HTTP status code 401 or something to send back to the client, and the client then shows the mentioned AJAX login form...
What makes a session expire in your CRM? Expiring after X time has passed since the last [user] action is pretty conventional and will allow you to use AJAX to keep the session alive. Let's say a session is good for 5 minutes as part of the security requirements for a super-secret NSA banking CRM with kittens and YouTube videos. A good scenario of how a session can be extended would be this:
A page is opened, validating the session for another 5 minutes.
A timeout is set with JS to make an AJAX request every 4 minutes.
[4 minutes later] the request is made, returning a very lightweight response (an endpoint like this is sketched after this answer).
If the response says everything is OK and the session is still valid, schedule another "ping" and carry on as usual. If the response comes back with an error (session invalidated on the server because of logging in from a different PC, etc.), handle it gracefully. Allow users to retain what they were working on; don't just kick them out to a login screen with an error.
The user navigates away from the page (clicks a link, submits a form): repeat from the beginning. If the user navigates to an external site or closes the browser, their session will self-destruct in no more than 5 minutes :)
Obviously you can piggy-back any additional information onto the AJAX call in step 3, e.g. notifying users of new items assigned to them in the CRM.
Google is your friend; one of the first results gives a decent overview of the basics of the approach.
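The OP's stack is Classic ASP, so treat this purely as a sketch of the idea; in PHP, the ping endpoint from step 3 might look like this (the user_id session key and the JSON response shape are assumptions):

<?php
// keepalive.php -- sketch only; the OP's stack is Classic ASP, this just
// illustrates the ping endpoint from step 3. The user_id session key and the
// response shape are assumptions.
session_start();                      // touching the session extends it

if (empty($_SESSION['user_id'])) {
    // Session is gone: let the JS show a re-login overlay instead of a redirect.
    http_response_code(401);
    exit;
}

// Very lightweight "still alive" answer; extra data (new CRM items, etc.)
// could be piggy-backed here.
header('Content-Type: application/json');
echo json_encode(['ok' => true]);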
I'm working on something similar to a pastebin (yeah, it's that generic) but allowing for multiple-user editing. The obvious problem is that of multiple users attempting to edit the same file. I'm thinking along the lines of locking down the file when one user is working on it (it's not the best solution, but I don't need anything too complex), but to prevent/warn the user I'd obviously need a system for monitoring each user's edit sessions. Working with a database and AJAX, I'm thinking of two solutions.
The first would be to have the edit page ping the server at an arbitrary interval, say a minute, and have it update the edit-session entry in the db. Then the next time a script requests to edit, it checks the most recent ping, and if the most recent was more than some arbitrary time ago, say five minutes, we assume that the previous user has quit and the file can be edited again. Of course, the problem with this method is that the assumption that the previous user has quit is simply an assumption. He could have a flaky Wi-Fi connection and simply drop out for ten minutes, all the while with the window still open.
Of course, to deal with this problem, we'd have to have the server respond to new requests from previously closed sessions with an error, telling the client side to point out to the user that his session has ended, and then deal with it by, say, saving it as another file on the server and asking the user to manually merge it, etc. It goes without saying that this is rather horrible for the end user.
So I've come around to thinking of another solution. It may also be possible to get an unload event to fire when the user's session ends, but I cannot be sure whether this will work reliably.
Does anybody have any other, more elegant solution to this problem?
If you expect the number of concurrent edits to the file to be minor, you could just store a version number for the file in the db, and when the user downloads the file into their browser they also get the version number. They are only allowed to upload their changes if the version number matches. The first one to upload wins. When a conflict is detected, you should send back the latest file along with the user's changes so that the user can manually merge them in. The advantage is that this works even if it's the same user making two simultaneous edits. If this feature ends up being used frequently, you could add client-side merging similar to what a diff tool uses (but you might need to keep the old revisions in that case).
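A rough PHP sketch of that "first one to upload wins" check, assuming a documents table with content and version columns (all names here are made up for illustration):

<?php
// save_document.php -- hypothetical "first one to upload wins" handler.
// The documents table and the request parameters are assumptions.
$docId       = (int) ($_POST['doc_id'] ?? 0);
$baseVersion = (int) ($_POST['version'] ?? 0);
$content     = $_POST['content'] ?? '';

$db = new PDO('mysql:host=localhost;dbname=pastebin', 'user', 'pass');
header('Content-Type: application/json');

// The version check and the bump happen in one statement, so two simultaneous
// uploads against the same base version cannot both succeed.
$stmt = $db->prepare('UPDATE documents SET content = ?, version = version + 1
                      WHERE id = ? AND version = ?');
$stmt->execute([$content, $docId, $baseVersion]);

if ($stmt->rowCount() === 1) {
    echo json_encode(['status' => 'saved', 'version' => $baseVersion + 1]);
} else {
    // Someone saved first: return the latest copy so the user can merge by hand.
    $latest = $db->prepare('SELECT content, version FROM documents WHERE id = ?');
    $latest->execute([$docId]);
    http_response_code(409);
    echo json_encode(['status' => 'conflict', 'latest' => $latest->fetch(PDO::FETCH_ASSOC)]);
}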
You're probably better off going for a "merge" solution. Using this approach you only need to check for changes when the user posts their document to the server.
The basic approach would be:
1. User A gets the document for editing, document is at version 1
2. User B gets the document for editing, document is at version 1
3. User B posts some changes, including the base version number of 1
4. Server updates the document, which is now at version 2
5. User A posts some changes, including the base version number of 1
6. Server responds saying the document has changed since the user started editing, and sends the user the new document and their version - the user will then need to merge their changes into document version 2 and post back to the server. The user is essentially now editing document version 2
7. User A posts some changes, including the version number of 2
8. Server updates the document, which is now at version 3
You can still do a "ping" every minute to get the current version number - you already know which version they're editing, so if a new version is available you can let them know and let them download the latest version to merge their changes into.
The main benefit of this approach is that users never lock files, so you don't need any arbitrary "time-outs".
I would say you are on the right track. I would probably implement a hybrid solution:
Have a single table called "active_edits" or something like that, with columns for document_id, user, and last_update_time. Let's say your ping time is 1 minute and your timeout is 5 minutes. A use case would look like this:
Bob opens a document. The code checks last_update_time. If it is over 5 minutes ago, update the table with Bob and the current time. If it is not, someone else is working on the document, so give an error message. Assuming it is not being edited, Bob works on the document for a while and the client pings an update time every minute.
I would say do include a "finish editing" button and an onunload handler. Onunload, from what I understand, can be flaky, but you might as well add it. Both of these would send a single send-only POST to the server saying that Bob is done. Even if Bob doesn't hit "finish editing" and onunload flakes out, the worst case is that another user has to wait 5 more minutes to edit. The advantage is that if these normally work (a fair assumption), the system works a bit better.
In the case you described, where Bob is on a bad wireless connection or takes a break: I would say this isn't a big deal. Your ping function should make sure that the document hasn't been taken over by someone else since Bob's last ping. If it has, just give Bob a message saying "someone else has started working on the document" and give him the option to reload.
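A hedged PHP sketch of that check against the active_edits table, assuming PDO and the 5-minute timeout from the answer (REPLACE is used, which assumes document_id is the table's primary key):

<?php
// start_edit.php -- hypothetical "may I edit this document?" check against the
// active_edits table described above. Assumes PDO and that document_id is the
// table's primary key (so REPLACE overwrites the existing row).
$docId = (int) ($_POST['doc_id'] ?? 0);
$user  = $_POST['user'] ?? '';

$db = new PDO('mysql:host=localhost;dbname=pastebin', 'user', 'pass');

$stmt = $db->prepare('SELECT user, last_update_time FROM active_edits WHERE document_id = ?');
$stmt->execute([$docId]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// Free if nobody holds it, the holder's last ping is over 5 minutes old,
// or the holder is this same user.
$free = !$row
     || strtotime($row['last_update_time']) < time() - 300
     || $row['user'] === $user;

header('Content-Type: application/json');
if ($free) {
    // Take (or refresh) the edit slot; the 1-minute ping reuses this same query.
    $db->prepare('REPLACE INTO active_edits (document_id, user, last_update_time)
                  VALUES (?, ?, NOW())')->execute([$docId, $user]);
    echo json_encode(['status' => 'ok']);
} else {
    echo json_encode(['status' => 'locked', 'by' => $row['user']]);
}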
EDIT: Also, I would look into window.onbeforeunload, not onunload. I believe it executes earlier. I believe this is the function websites (Slashdot included) use to let you confirm that you actually want to leave the page. I think it works in the major browsers except Opera.
As with this SO question, How do you manage concurrent access to forms?, I would not try to implement pessimistic locking. It is simply too difficult to get working reliably in a stateless environment. Instead, I would use optimistic locking. In this case, I would use something like a SHA hash of the file to determine whether the file has changed since the user last read it. For each request to change the file, you would compute a SHA hash of the file bytes and compare it with the version you pulled when you first read the data. If it has changed, you reject the change and either force the user to do their edits again (pulling a fresh copy of the file contents) or provide fancier conflict resolution.
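For illustration, a minimal PHP sketch of that hash check; sha1_file() is the built-in, while the file path, parameter names, and response shape are assumptions:

<?php
// save_file.php -- hypothetical handler using a content hash instead of a
// version counter. sha1_file() is built in; paths and parameter names are
// assumptions.
$path     = '/var/data/pastes/' . basename($_POST['file'] ?? '');
$baseHash = $_POST['base_hash'] ?? '';   // hash the client received when it loaded the file
$content  = $_POST['content'] ?? '';

header('Content-Type: application/json');

if (!is_file($path) || sha1_file($path) !== $baseHash) {
    // The file changed since the client read it: reject the write and hand
    // back the current contents so the user can redo or merge their edits.
    http_response_code(409);
    echo json_encode(['status' => 'conflict',
                      'current' => is_file($path) ? file_get_contents($path) : null]);
    exit;
}

file_put_contents($path, $content, LOCK_EX);
echo json_encode(['status' => 'saved', 'hash' => sha1($content)]);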
I made a chat using PHP and JavaScript, and there is a disconnect button which removes a user from the chat by removing him from the user list first. But if the user closes the browser, he will remain in the user list. How do I check whether he left?
This must be done without putting any handlers on page closing in JS, because if the user kills the browser he will remain in the chat.
By the way, the JS script always sends a request to the PHP page, which constantly checks for new messages in a loop; when there are some, the script prints them out and exits. Then it repeats all over again.
EDIT: How do I make a heartbeat thing in PHP? If a user closes the page, the script execution will be terminated, so we won't be able to check whether the user is still connected in the same script.
Sorry, there is no reliable way of doing this; that's the way HTTP was built - it's a "pull" protocol.
The only solution I can think of is that "valid" and logged-in clients must query the server at a very small interval. If they don't, they're logged out.
You could send a tiny AJAX call to your server every 5 seconds; users who don't are no longer in the room.
You answered your own question: if you don't detect a request for new messages from a user over a given length of time (more than a few seconds), then they have left the room.
The nature of HTTP dictates that you need to do some AJAX type of communication. If you don't want to listen for the "give me more messages" request (not sure why you wouldn't want to), then build in a heartbeat type of communication.
If you can't modify the JS code for some reason, there really is little you can do. The only thing you can do with PHP is to check whether there has been, for example, more than 15 minutes since the last activity and, if so, treat the user as having left. But this is in no way a smart thing to do: a user might just sit and watch the conversation for 15 minutes.
The only proper way to do it is to use AJAX polling at set intervals, if you want to do it reliably.
You noted that a user polls the server for new messages constantly; can't you use that to detect whether a user has left?
Maintain a list of active users on the server, along with the last time each of them connected to the chat to request new messages.
When a user connects to check for messages, update their time.
Whenever your code runs, iterate through this list and remove users who haven't connected in too long (see the sketch below).
The only failure is that if the number of users in the channel drops to zero, the server won't notice until someone comes back.
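A sketch of that bookkeeping in PHP, meant to run inside the message-polling script the chat already uses (the chat_users table and its columns are assumptions, and REPLACE assumes username is the primary key):

<?php
// Runs inside the PHP page the chat JS already polls for new messages.
// The chat_users table and its columns are assumptions; REPLACE assumes
// username is the primary key.
session_start();
$user = $_SESSION['username'] ?? 'anonymous';

$db = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');

// 1. Mark this user as seen right now.
$db->prepare('REPLACE INTO chat_users (username, last_seen) VALUES (?, NOW())')
   ->execute([$user]);

// 2. Drop anyone who has not polled in the last 15 seconds.
$db->exec('DELETE FROM chat_users WHERE last_seen < NOW() - INTERVAL 15 SECOND');

// ...then carry on with the normal "check for new messages" loop.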
To address your edit: you can ignore client termination by using ignore_user_abort().
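A small sketch of how ignore_user_abort() could fit into that long-running message loop; remove_user_from_chat() is a placeholder for the OP's own cleanup code, not a real function:

<?php
// Long-polling message check that keeps running after the browser disconnects,
// so it can clean up. remove_user_from_chat() is a placeholder for the OP's
// own cleanup query, not a real function.
session_start();
ignore_user_abort(true);   // don't let a closed page kill the script
set_time_limit(0);

while (true) {
    echo " ";               // a tiny heartbeat so PHP can detect the lost connection
    flush();

    if (connection_aborted()) {
        remove_user_from_chat($_SESSION['username'] ?? '');
        exit;
    }

    // ... the normal "any new messages for this user?" check goes here ...
    sleep(2);
}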
Using JavaScript you can do the following:
<script type="text/javascript">
    window.onunload = unloadPage;

    function unloadPage()
    {
        alert("unload event detected!");
    }
</script>
Make the necessary AJAX call to your PHP script in the unloadPage() function.
Request a PHP script that goes a little something like this, with AJAX:
// disconnect_current_user() is the OP's own cleanup routine; it runs when the
// script ends, including when the client closes the page and PHP aborts.
register_shutdown_function("disconnect_current_user");

header('Content-type: multipart/x-mixed-replace; boundary="pulse"');
while (true) {
    echo "--pulse\r\n.\r\n";
    flush();   // push the heartbeat so PHP notices a closed connection
    sleep(2);
}
This way, you won't constantly be opening/closing connections.
The answers to all the questions asked by the OP are covered in the section in the manual about connection handling:
http://uk3.php.net/manual/en/features.connection-handling.php
No AJAX.
No JavaScript.
No keep-alives.
C.