Doing some stuff right before the user exits the page - javascript

I have seen some questions here regarding what I want to achieve and have based what I have so far on those answers. But there is a slight misbehavior that is still irritating me.
What I have is sort of a recovery feature. Whenever you are typing text, the client sends a sync request to the server every 45 seconds. It does two things. First, it extends the lease the client has on the record (only one person may edit at one time) for another 60 seconds. Second, it sends the text typed so far to the server in case the server crashes, the internet connection fails, etc. In that case, the next time the user enters our application, the user is notified that something has gone wrong and that some text was recovered. Think of Microsoft Office or OpenOffice recovery whenever they crash!
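For reference, the periodic part looks roughly like this (a simplified sketch, not the exact code; sync() and Core are explained in the clarifications below):
setInterval(function () {
    if (Core.sync.autosave_textelement) {
        sync(); // async by default: extends the 60-second lease and stores the text typed so far
    }
}, 45000); // every 45 seconds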
Of course, if the user leaves the page willingly, the user does not need to be notified and as a result, the recovery is deleted. I do that final request via a beforeunload event.
Everything went fine until I was asked to make a final adjustment... the same behavior you have here at Stack Overflow when you exit the editor... a confirm dialogue.
This works so far, BUT the confirm dialogue is shown twice. Here is the code.
The event
if (local.sync.autosave_textelement) {
    window.onbeforeunload = exitConfirm;
}
The function
function exitConfirm() {
    var local = Core;
    if (confirm('blub?')) {
        local.sync.autosave_destroy = true;
        sync(false);
        return true;
    } else {
        return false;
    }
};
Some clarifications that are irrelevant to the problem:
Core is a global Object that contains a lot of variables that are used everywhere.
sync makes an ajax request. The values are based on the values that the Core.sync object contains. The parameter determines if the call should be async (default) or sync.
Edit 1
I did try to separate both things (recovery deletion and user confirmation, that is) into beforeunload and unload. The problem there was that unload is a bit too late. The user gets informed that there is a recovery even though it is scheduled to be deleted. If you refresh the page 1 second later, the dialogue disappears, as the file was deleted by then.
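Roughly, that attempt looked like the sketch below (reconstructed from memory, not the exact code):
window.onbeforeunload = function () {
    return 'blub?'; // browser shows its own leave/stay prompt
};
window.onunload = function () {
    // Too late: the delete request may not reach the server before the next visit checks for a recovery.
    Core.sync.autosave_destroy = true;
    sync(false); // synchronous request to delete the recovery
};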

Did you try this:
if (local.sync.autosave_textelement) {
    window.onunload = exitConfirm;
}

Related

How to call API only when user reload/leave site from the browser alert and not on click of cancel?

I am trying to do an API call when the user is trying to close/reload the browser/tab. I don't want to call the API if the user clicks on cancel. I followed JavaScript, browsers, window close - send an AJAX request or run a script on window closing, but it didn't solve my issue. I followed catching beforeunload confirmation canceled? for differentiating between confirm and cancel. I have no idea how to make the API call when the user reloads/closes the browser and not call the API when the user clicks on cancel. I followed JavaScript, browsers, window close - send an AJAX request or run a script on window closing and tried the following.
For showing the alert on reload or closing the tab:
<script>
window.addEventListener("beforeunload", function(evt) {
    evt.preventDefault();
    const string = '';
    evt.returnValue = string;
    return string;
});
</script>
On click of cancel, nothing should happen. If the user is forcefully closing the browser or reloading, the API should be called:
<script type="module">
import lifecycle from 'https://cdn.rawgit.com/GoogleChromeLabs/page-lifecycle/0.1.1/dist/lifecycle.mjs';
lifecycle.addEventListener('statechange', function(event) {
if (event.originalEvent === 'visibilitychange' && event.newState === 'hidden') {
var URL = "https://api.com/" //url;
var data = '' //payload;
navigator.sendBeacon(URL, data);
}
});
</script>
But it's not happening. Any help is appreciated. Thanks
Your problem is happening because you're using beforeunload to present a prompt.
I can see that you're handling the beforeunload event properly, so you must already be aware that browser vendors have deliberately limited the ability of script authors to do custom stuff when the user wants to leave the page. This is to prevent abuse.
Part of that limitation is that you don't get to find out what the user decides to do. And there will not be any clever workarounds, either. Once you tell the browser to present the beforeunload prompt, you lose all your power. If the user clicks the Okay button (i.e. decides to leave the page), the browser will refuse to run any more of your code.
Presenting the prompt creates a fork in the road that you are prevented from observing. So, put a laser tripwire there instead of a fork:
window.addEventListener("onbeforeunload", function(evt) {
navigator.sendBeacon(url, payload)
})
This is guaranteed to run when the user actually leaves the page, and only when the user actually leaves the page. But, you sacrifice the ability to try to talk the user out of leaving. You can't have it both ways.
You can't always get what you want, but if you try, sometimes you just might find you get what you need. -- The Rolling Stones
I can only think of one way to accomplish what you need, but it requires help from the server. This is not an option for most people (usually because the beacon goes to a third-party analytics provider who won't do this), but I'm including it here for completeness.
Before the beforeunload handler returns, fire a beacon message that says "user is maybe leaving the page".
After firing that beacon, and still before returning, set up a document-wide mousemove handler that fires a second beacon message that says "the user is still here" (and also de-registers itself).
Return false to present the prompt (see the sketch after this list).
Modify your server so that it will reconcile these two events after some kind of delay:
If the server receives beacon 1 and then also receives beacon 2 (within some reasonably short time-frame, e.g. 5 minutes), it means the user tried to leave but then changed their mind, so the server should delete the record of beacon 1.
If the server receives beacon 1 but doesn't receive beacon 2 within the time-frame, then the user really did leave, so the server would rewrite the previous beacon datapoint to say "user actually departed"; you wouldn't need to actually write beacon 2 to your datastore.
(Or, depending on expected traffic and your infrastructure, maybe the server just holds the beacon 1 datapoint in RAM for the 5 minutes and commits it to your datastore only if beacon 2 never shows up. Or you could write both beacons to the database and then have a different process reconcile the beacons later. The outcome is identical, but they have different performance characteristics and resource requirements.)
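A rough client-side sketch of the first three steps (the endpoint paths and payload here are made up for illustration):
window.onbeforeunload = function (evt) {
    // Beacon 1: the user might be leaving.
    navigator.sendBeacon('/track/maybe-leaving', JSON.stringify({ userId: 123 }));

    // If the user cancels the prompt and stays, the next mouse move fires beacon 2 and removes itself.
    function stillHere() {
        navigator.sendBeacon('/track/still-here', JSON.stringify({ userId: 123 }));
        document.removeEventListener('mousemove', stillHere);
    }
    document.addEventListener('mousemove', stillHere);

    // Trigger the browser's leave/stay prompt.
    evt.preventDefault();
    evt.returnValue = '';
    return '';
};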
P.S.: Never use "URL" (all caps) as a variable name in JavaScript. URL is actually a useful web API, so if you use that exact variable name, you're clobbering a useful ability. It's just like doing let navigator = 'Henry': yes, it will execute without error, but it shadows a useful native capability.

keep session after login - selenium - javascript

I am trying to automate a couple of pages using Selenium WebDriver and Node.js. I was able to log in, but after login I want to use the same session initiated by the web driver so that I can do automated testing on a session-protected page. This is my attempt:
const { Builder, By } = require('selenium-webdriver');

async function login() {
    let d = await new Builder()
        .forBrowser('chrome')
        .build();
    await d.get('https://demo.textdomain.com/');
    await d.findElement(By.id('username')).sendKeys('admin ');
    await d.findElement(By.id('password')).sendKeys('admin');
    await d.findElement(By.css('button[type="submit"]')).click();
    d.getPageSource().then(function(content) {
        if (content.indexOf('Welcome text') !== -1) {
            console.log('Test passed');
            console.log('landing page');
            d.get('https://demo.textdomain.com/landingpage'); // this is still going to the login page as I cannot use the previous session
        } else {
            console.log('Test failed');
            return false;
        }
        // driver.quit();
    });
}
login();
Am I accidentally discarding the browser session after login?
From a similar question on SQA StackExchange, you can store and restore the current session's cookies:
Using JavaScript:
// Storing cookies:
let allCookies;
driver.manage().getCookies().then(function (cookies) {
    allCookies = cookies;
});

// Restoring cookies (getCookies() resolves to an array of cookie objects, which addCookie accepts directly):
for (const cookie of allCookies) {
    driver.manage().addCookie(cookie);
}
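Applied to the code in the question, a sketch might look something like this (inside an async function, with Builder from selenium-webdriver as above; the second driver instance d2 is just for illustration, and you must be on the site's domain before addCookie will work):
// Capture the session cookies from the logged-in driver `d`...
const cookies = await d.manage().getCookies();

// ...and replay them in a fresh driver instance.
const d2 = await new Builder().forBrowser('chrome').build();
await d2.get('https://demo.textdomain.com/'); // must visit the domain before adding its cookies
for (const cookie of cookies) {
    await d2.manage().addCookie(cookie);
}
await d2.get('https://demo.textdomain.com/landingpage'); // should now load with the existing session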
You might just be dealing with timing issues. Selenium moves very fast, way faster than a user can interact, so it often acts in what seems like unpredictable ways. In order to work around this, you should make good use of Selenium's built-in driver.wait. For example:
const button = driver.wait(
    until.elementLocated(By.id('my-button')),
    20000
);
button.click();
The above waits until the button with id my-button is present in the DOM, and then clicks it. It will wait for a maximum of 20000 milliseconds, but will finish as soon as the button becomes available.
So in your case, if there is something that becomes available after the user is successfully logged in, you could wait on that element before going to the new page in your code.
As an aside, I'm also not so sure why you are using getPageSource()? That seems like a very heavy-handed way to get what you are looking for. Isn't that content inside an element you could get the contents of?
I wrote an article, How to write reliable browser tests using Selenium and Node.js, which explains the code example above in more detail, along with other techniques you can use to wait reliably for a variety of conditions in the browser.
I believe your problem is that you're not properly waiting for the login to complete.
Selenium doesn't wait for asynchronous actions to finish; it moves on to the next line. So when you ask for the page source, there is a good chance the login action hasn't completed on the server and the result is not what you expect it to be.
You have to explicitly tell Selenium to wait, so you need to add some code between the login and the code that checks whether the user is logged in. To test this assumption, add a 10-second timeout.
If this works, you won't want to simply waste time on a fixed delay, so instead wait for certain elements on the page that change because of the login; for example, wait for the presence (or visibility, if it is already in the DOM) of the user photo in the header.
Also, I'm not sure how getPageSource behaves; it might use the existing page, or it might ask for a fresh copy.
I would advise you to use other ways to test whether the user is logged in, by inspecting the DOM.
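Something along these lines, for example (the .user-photo selector is just a placeholder for whatever element only exists once logged in; until comes from selenium-webdriver):
// After clicking the submit button, wait up to 10 seconds for an element that only
// appears when the user is logged in, then navigate on.
await d.wait(until.elementLocated(By.css('.header .user-photo')), 10000);
await d.get('https://demo.textdomain.com/landingpage');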
I suggest re-using the session cookie after the first login in other WebDriver instances.
First, store the cookie value:
var cookieValue = firstWebDriver.Manage().Cookies.GetCookieNamed("cookie_name").Value;
Then you can pass it to any WebDriver instance, set it, and drive the web app as if it were the same user with different browser instances:
anotherWebDriver.Manage().Cookies.AddCookie(new Cookie("cookie_name", cookieValue));
If you want to use the same browser instance, you have to synchronize them, because WebDriver invocations are in general not thread-safe and would probably often lead to exceptions (e.g. a stale element because an element was changed, or not found because one web driver navigated to a different page).
In that case I suggest just using the window handle for the next instance, without caring about the session. The first one opens and the last one closes the session (count the referenced handles), and make sure only one driver uses the handle at a time. You can also create new browser windows; this will keep the session and give you a new handle:
var handle = firstWebDriver.CurrentWindowHandle;
otherWebDriver.SwitchTo().Window(handle);
I wrote the code in C#, but it should be easily adaptable to JavaScript.

Node.js error handling with socket.emit

I have some node.js client side code like this:
socket.emit('clickAccept', { myrecid: recid });
Server side node.js code gets it fine and all is well.
If I take the server down to simulate a server side outage, then click the button that fires this socket.emit on the client side, this happens:
Nothing really, I guess it might eventually time out
When I bring the server back up, the clicks end up being sent to the server and the server acts on them (TCP-like, I guess).
What I want is for those socket.emit calls to die after a short timeout and not be sent when the server comes back up. It causes all sorts of confusion: if they click 3 times, nothing happens, then when/if the connection or server comes back up they get 3 reactions all at once.
Also, if they click and it times out because the server is down, I would like to show an error to the client user to let them know that basically the click didn't work and to try again.
I know how to act on and show an error if the socket goes down, but I don't want to do this if they aren't trying to click something at that time. No sense in firing errors at the user because the socket went down briefly if they have no need to do anything at that moment.
So, to be clear, I only want to show an error if they click on the button and the socket between the client and server is down. AND... If they get an error, I want to kill that emit, not save it all up and fire it and all the other clicks when the server comes back up a few seconds later.
Thanks in advance and I hope that was at least reasonably clear.
The root of your issue is that socket.io attempts to buffer any data that it can't currently send to the server (because the connection to the server is disconnected) and when the server comes back up and the connection is restored, it then sends that data.
You can see the technical details for how this works here: socket.io stop re-emitting event after x seconds/first failed attempt to get a response
You have several implementation options:
If socket.io already knows the client is not connected to the server, then don't buffer the data (perhaps even get back an error you can show to your user).
When socket.io reconnects and there was data buffered while the connection was down, clear that data and throw it away so old data isn't sent on a reconnect.
Implement a timer that does one of the above after some sort of timeout.
So, to be clear, I only want to show an error if they click on the button and the socket between the client and server is down. AND... If they get an error, I want to kill that emit, not save it all up and fire it and all the other clicks when the server comes back up a few seconds later.
Probably, the simplest way to do that is to implement a version of what is shown in the above referenced answer:
Socket.prototype.emitWhenConnected = function(msg, data) {
    if (this.connected) {
        this.emit(msg, data);
        return null;
    } else {
        return new Error("not connected");
    }
};
Then, switch your code from using .emit() to use .emitWhenConnected() and check the return value when using it. If the return value is null, then no error was detected. If the return value is not null, then there was an error.
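For example, applied to the click handler from the question (showError is a placeholder for however you display errors to the user):
var err = socket.emitWhenConnected('clickAccept', { myrecid: recid });
if (err) {
    // The socket is down right now, so nothing was queued; tell the user the click didn't go through.
    showError('Connection lost - please try again.');
}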
Thanks for the other answers and help. I ended up solving this in a super simple way. See below:
if (socket.connected) {
    // Do your thing here
} else {
    // Throw an error here that tells the user their internet is likely down
}
Hope this helps someone out there. It was a huge improvement in our code to make sure that users get proper feedback if they have brief network/internet outages.

What is the best way to implement idle time out for web application (auto log off)

I want to implement an idle time-out for the web application that we are building. I had earlier achieved this using AsynchronousSessionAuditor from CodePlex, which essentially watches the forms authentication and session cookie timeouts by constant polling.
But it has the drawback of not respecting client-side events; it only looks at the last postback to decide when to log off.
The jQuery plugin jquery-idle-timeout-plugin from erichynds solves the client-side events issue but suffers from another drawback: it cannot recognize that the user is active in some other tab.
Has anyone already fixed the tabbed-browsing issue with jquery-idle-timeout-plugin? Or is there a better approach to application time-out for web applications? (By the way, this web app is built on the ASP.NET framework.)
If I understand your question right, it is not possible, since there are no events triggered in JavaScript for activity outside of the current window/tab.
The exception would be an add-on for each browser that goes along with your website and monitors all activity in the browser, but that is not really a practical approach.
Well, you'd have to code it by hand, which is not really hard. You can use the onfocus and onblur functions to do something like this:
$(function() {
    window.isActive = true;
    $(window).focus(function() { this.isActive = true; });
    $(window).blur(function() { this.isActive = false; });
    showIsActive();
});

function showIsActive() {
    console.log(window.isActive);
    window.setTimeout(showIsActive, 2000);
}

function doWork() {
    if (!window.isActive) { /* Check for idle time */ }
}
If you do a little searching, you can find that varieties of this question have already been asked and answered; you can probably find a solution you can implement with one of the plugins you mentioned.
Try:
Run setTimeout only when tab is active
or
How to tell if browser/tab is active
EDIT--> ADDED:
Or I'd try a different approach. You could create a cookie with some hash and save that hash in your DB with a timestamp that updates whenever the window is active (you could check every 5 seconds or something; it's not an intensive request).
Then, do another check before that (but in the same request) to see how much time has passed since the last timestamp, and log them out if necessary.
It won't log them out instantly when the time has passed, but it will when they try to access the site again, either by opening it again or by focusing on the tab/window.
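A rough client-side sketch of that heartbeat, reusing the window.isActive flag from the snippet above (the /heartbeat endpoint is made up for illustration):
setInterval(function () {
    if (window.isActive) {    // only ping while this tab/window is focused
        $.post('/heartbeat'); // server refreshes the "last active" timestamp tied to the session cookie
    }
}, 5000);                     // every 5 seconds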

Node.js how to keep connection between page loads

On my website I have a list of all online users, updated in real-time by node.js (I'm using now.js)
The problem is, when a user navigates my site, they of course disconnect for a couple of seconds while the new page is loading. This means they disappear from the list for all other clients, only to pop back in a few seconds later.
Is there any way to set a timeout on the disconnect function, e.g. if user has not reconnected in 30 seconds, remove from the list otherwise don't?
Or is there a better way to accomplish this? Can someone please point me in the right direction :)
EDIT:
Came up with a working solution, if anyone would like to know. On the server side I have this function:
nowjs.on('disconnect', function() {
    everyone.now.clientDisconnected();
});
which, whenever a user disconnects, calls this function on the client:
now.clientDisconnected = function() {
    setTimeout(function() { now.serverUpdateUsers(); }, 20000);
};
So instead of updating the users right away, we wait 20 seconds. By then the user should have finished loading the new page, and no difference will show for all other clients.
serverUpdateUsers() is the server-side function that gathers all user data and pushes it out to all clients.
I'm not exactly sure whether you can modify Socket.IO's settings through now.js (which uses Socket.IO under the hood), but if you can (not sure, never used now.js), you should set the heartbeat interval to be bigger:
https://github.com/LearnBoost/Socket.IO/wiki/Configuring-Socket.IO
heartbeat interval defaults to 20 seconds
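If now.js gives you access to the underlying Socket.IO server, the 0.x-era configuration (the versions now.js wrapped) looked roughly like this; treat the exact option names as a sketch and check them against the wiki page above:
io.configure(function () {
    io.set('heartbeat interval', 25); // how often the server sends heartbeats
    io.set('heartbeat timeout', 60);  // how long a silent client is tolerated before disconnect
    io.set('close timeout', 30);      // how long a disconnected client may reconnect and keep its session
});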
