How to use pubnub timetoken to handle page refreshes - javascript

When pubnub is connected, users inside my app receive messages without a problem. However, let's say they refresh the page or go to another page within the app. There are 2-5 seconds of downtime before the user can connect to pubnub again, during which some pubnub messages may be missed.
Thankfully, the pubnub subscribe API allows you to specify a timetoken to point to a past time (e.g. 10 seconds ago).
pubnub.subscribe({
    channels: ['my_channel'],
    timetoken: '13534398158620385'
});
Description:
Specifies timetoken from which to start returning any available cached messages.
Question:
What's the safest way to specify this timetoken such that few messages are missed?

First, listen for beforeunload events on the window. These fire just before the page is navigated away from. In the handler, create a cookie that saves the current timetoken:
window.addEventListener('beforeunload', () => {
    const now = new Date().getTime();
    // pubnub timetokens are 17 digits long (units of 100ns), so pad
    // the 13-digit millisecond timestamp with four zeros
    const timetoken = `${now}0000`;
    createCookie(PUBNUB_TIMETOKEN_COOKIE, timetoken, EXPIRATION);
});
Note: PUBNUB_TIMETOKEN_COOKIE and EXPIRATION are constants of your choosing. I set my cookie to expire after 10 seconds to prevent clashes. You'll also need to define createCookie and getCookie helper functions (the original answer linked to an external example).
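Something like this minimal pair would do (a sketch; the originally linked implementation may differ):
// Minimal cookie helpers; expirationSeconds keeps a stale timetoken
// from being reused on a much later visit.
function createCookie(name, value, expirationSeconds) {
    const expires = new Date(Date.now() + expirationSeconds * 1000);
    document.cookie = `${name}=${value}; expires=${expires.toUTCString()}; path=/`;
}

function getCookie(name) {
    const match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
    return match ? match[1] : null;
}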
Next, when subscribing to pubnub on page load, use this cookie if it exists:
pubnub.subscribe({
    channels: ['my_channel'],
    timetoken: getCookie(PUBNUB_TIMETOKEN_COOKIE)
});
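Since getCookie returns null once the cookie has expired, a guarded variant (a sketch) avoids passing an empty timetoken on a normal first visit:
// Only include the timetoken when a saved value actually exists.
const savedTimetoken = getCookie(PUBNUB_TIMETOKEN_COOKIE);
const options = { channels: ['my_channel'] };
if (savedTimetoken) {
    options.timetoken = savedTimetoken;
}
pubnub.subscribe(options);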
This way, if the user refreshes or navigates to another page, missed messages should be caught.

Related

Is there a way to tell in the page that a service worker has recently upgraded?

I have a SPA with a service worker.
I don't want to force users to update, so usually the new worker updates in the background, but I notify users and they can click a button to message the worker to skipWaiting. When this happens I rely on oncontrollerchange to force any other tabs they have open to refresh.
I want to show a link to release notes after the worker has upgraded, either after all tabs refresh or after the user forces the refresh.
However, I don't want to show these notes the first time they visit, or every time the service worker activates. If they don't read the release notes I don't want to keep nagging them about it.
Is there an event or reliable design pattern I can use to tell (in the page) that the service worker has updated to a new version?
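For reference, a minimal sketch of the update flow described above (the button element, message shape, and registration variable are assumptions, not code from the question):
// Page side: ask the waiting worker to activate; `registration` is assumed
// to come from navigator.serviceWorker.register(...).
updateButton.addEventListener('click', () => {
    if (registration.waiting) {
        registration.waiting.postMessage({ type: 'SKIP_WAITING' });
    }
});

// Refresh every open tab once the new worker takes control.
navigator.serviceWorker.addEventListener('controllerchange', () => {
    location.reload();
});

// Worker side (sw.js):
self.addEventListener('message', (event) => {
    if (event.data && event.data.type === 'SKIP_WAITING') {
        self.skipWaiting();
    }
});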
A straightforward solution would be to add a query parameter to the current URL and, instead of calling location.reload() to reload your page, change the location value to that URL.
Something like:
// Inside your `controllerchange` listener, call reloadWithNotes()
function reloadWithNotes() {
    const url = new URL(location.href);
    url.searchParams.set('showNotes', '');
    location = url.href;
}

// Elsewhere...
window.addEventListener('load', () => {
    const url = new URL(location.href);
    if (url.searchParams.has('showNotes')) {
        // Show your release notes.
        // Remove the parameter.
        url.searchParams.delete('showNotes');
        window.history.replaceState({}, document.title, url.href);
    }
});

Will running jQuery.get too often cause performance issues for the browser?

So you understand what I'm trying to do, I've written a web page which shows events logged in MySQL as soon as they are inserted into the database (basically monitoring Windows & Mac logon/logoff on the network). Every time a new event is inserted the PHP script connects to a web socket and sends a message to all connected browsers to notify them of the new event. When the browsers receive the notification message they run jQuery.get("liveeventsearch.php", ...); to fetch the new event from the database (see javascript code below).
In a nutshell: when a web socket message is received, fetch the new record from the database, append the result to a table, and play a notification sound.
socket.onmessage = function(msg) {
    if (msg.data == "#all new event") {
        var newLastUpdate = new Date().toISOString().slice(0, 19).replace('T', ' ');
        jQuery.get("liveeventsearch.php", <?php echo '{ Key:"' . $Value . '", etc... , lastupdate:lastUpdate }'; ?>, function(result) {
            jQuery('#LiveResults tbody').append(result);
            // result > "" is true only for a non-empty result string
            if (jQuery('#chkNotification-Audio').is(':checked') && result > "") {
                jQuery("#Notification-Audio").trigger("play");
            }
            lastUpdate = newLastUpdate;
        });
    }
};
My concern is that there are currently approx. 1200 devices on the network, and most, if not all, of them are expected to log on/off within a 5 to 10 minute period, in large clumps hourly, with a few more scattered here and there. So the browser (depending on the supplied search criteria) will likely receive a large number of web socket messages in a short period of time, if not simultaneously (and obviously fetch liveeventsearch.php that many times). Is this likely to cause a problem for the browser, fetching results so frequently?
I can provide the code in liveeventsearch.php if necessary.
Alternate Methods
I had thought about adding something like this in the socket.onmessage function to reduce the frequency.
//[PSEUDO CODE] with times in milliseconds:
if (Date.now() > lastFetchTime + 3000) {
    jQuery.get(...);
}
But then the last set of events will not appear until another web socket message is received, which could be a lot longer than 3 seconds away. I could possibly use a timer instead, but that kind of defeats the purpose of having a web socket providing 'live' updates.
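For what it's worth, a trailing-edge throttle would give both: at most one fetch per interval, plus one final deferred fetch after a burst (a sketch; fetchEvents is a hypothetical wrapper around the jQuery.get call above):
var THROTTLE_MS = 3000;
var lastFetch = 0;
var pending = null;

function fetchEventsThrottled() {
    var now = Date.now();
    if (now - lastFetch >= THROTTLE_MS) {
        // Enough time has passed: fetch immediately.
        lastFetch = now;
        fetchEvents();
    } else if (!pending) {
        // Inside the quiet window: schedule exactly one deferred fetch
        // so the last burst of messages is never left unfetched.
        pending = setTimeout(function() {
            pending = null;
            lastFetch = Date.now();
            fetchEvents();
        }, THROTTLE_MS - (now - lastFetch));
    }
}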
Another option I thought of is to create a new MySQL table (e.g. liveUpdates) which contains only an ID field, then run a cron job every X seconds which inserts a new ID into that table (or run a script on the server with a continuous loop doing the same thing?). My events table could then have an additional field tying each event to the latest liveUpdates.ID, and the cron job could send the web socket message each time a new update ID was created, instead of every time an event is logged. But this again would have the same effect as using a timer.

Firebase disconnects on cordova pause

A pretty major problem for my app is that roughly 50% of launches have no Firebase connection unless the app is paused. So I would:
1. Launch the app, with no data loaded
2. Pause it to load the data on the current page
3. Resume it to see the data
4. Repeat steps 2 and 3 each time I need data sent or received
code:
var onPause = function() { console.log("paused"); };
var onResume = function() { console.log("resumed"); };
document.addEventListener("pause", onPause, false);
document.addEventListener("resume", onResume, false);

var connectionRef = new Firebase(FB + "/.info/connected");
connectionRef.on("value", function(snap) {
    if (snap.val() == true) {
        console.log("connected -------[]------");
    } else {
        console.log("not connected --------[ ]------------");
    }
});
logs:
resumed
not connected --------[ ]------------
paused
connected -------[]------
resumed
not connected --------[ ]------------
paused
connected -------[]------
The reason it works half the time is that it also works the opposite way, which I assume is the intended way. Is there any way to prevent it from disconnecting at all, or alternatively force it to connect on resume?
I found the cause of the problem: a plugin called phonegap-plugin-push. When it registers for GCM, it causes Firebase to go offline (or online if paused).
I still haven't found a genuine solution, but as a hack fix I delay GCM registration with a timer so the data can load initially, before the disconnect happens. After that it relies on the user not using the app for extended periods of time, so it can pause and sync up on a regular basis.
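A sketch of that hack (the PushNotification.init options and the 10-second delay are assumptions, not values from the original post):
// Delay push registration so Firebase can finish its initial sync first.
document.addEventListener('deviceready', function() {
    setTimeout(function() {
        var push = PushNotification.init({
            android: { senderID: 'YOUR_GCM_SENDER_ID' } // hypothetical sender ID
        });
        push.on('registration', function(data) {
            console.log('GCM registration id: ' + data.registrationId);
        });
    }, 10000); // arbitrary delay; tune to how long the initial sync takes
}, false);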

Server saturation with Ajax calls

I'm using PHP over IIS 7.5 on Windows Server 2008.
My web application is repeatedly requesting 3 different JSON pages with Ajax in the background:
page 1 Every 6 seconds
page 2 Every 30 seconds
page 3 Every 60 seconds
They retrieve data related with the current state of some tables. This way I keep the view updated.
Usually I don't have much trouble with it, but lately I've seen my server saturated with hundreds of unanswered requests, and I believe the problem may be due to a delay in one of the requests.
If page1, which is requested every 6 seconds, needs 45 seconds to respond (due to slow database queries or whatever), then it seems to me that the requests start piling up one after another.
If I have multiple users connected to the web application at the same time (or with multiple tabs) things can turn bad.
Any suggestion about how to avoid this kind of problem?
I was thinking about using something such as ZMQ together with Sockets.io on the client side, but as the data I'm requesting doesn't get fired from any user action, I don't see how this could be triggered from the server side.
I was thinking about using something such as ZMQ together with Sockets.io on the client side...
This is almost definitely the best option for long-running requests.
...but as the data I'm requesting doesn't get fired from any user action, I don't see how this could be triggered from the server side.
In this case, the 'user action' in question is connecting to the socket.io server. This cut-down example is taken from one of the socket.io getting started docs:
var io = require('socket.io')(http);

io.on('connection', function(socket) {
    console.log('a user connected');
});
When the 'connection' event is fired, you could start listening for messages on your ZMQ message queue. If necessary, you could also start the long-running queries.
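Building on that, a sketch of the wiring (this assumes the legacy zmq npm package and that the server-side code publishes change notifications on tcp://127.0.0.1:5555; both are assumptions, not details from the question):
var zmq = require('zmq');

// Subscribe to the queue that the server-side code publishes table changes to.
var sub = zmq.socket('sub');
sub.connect('tcp://127.0.0.1:5555'); // assumed endpoint
sub.subscribe(''); // empty filter = receive every message

// Forward each queue message to every connected browser.
sub.on('message', function(msg) {
    io.emit('table-update', msg.toString());
});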
I ended up solving the problem following the recommendation of @epascarello, improving it a bit so that it fires anyway if I get no response within X time.
If the request has not come back, do not send another. But fix the serverside code and speed it up.
Basically I did something like the following:
var ELAPSED_TIME_LIMIT = 5; // 5 minutes
var responseAnswered = true;
var prevTime = new Date().getTime();

setInterval(function() {
    // Fetch if the last request was answered, or if more than
    // ELAPSED_TIME_LIMIT minutes have passed since the last call
    // (i.e. treat the outstanding request as timed out).
    if (responseAnswered || elapsedTime() > ELAPSED_TIME_LIMIT) {
        getData();
        updateElapsedTime();
    }
}, 6000);

function getData() {
    responseAnswered = false;
    $.post("http://whatever.com/action.json", function(result) {
        responseAnswered = true;
    });
}

// Returns the elapsed time, in minutes, since prevTime was last updated.
function elapsedTime() {
    var curTime = new Date().getTime();
    // time difference between the previous request and now
    var timeDiff = curTime - prevTime;
    // convert milliseconds to minutes
    return (timeDiff / 1000) / 60;
}

// Updates prevTime with the current time.
function updateElapsedTime() {
    prevTime = new Date().getTime();
}
This is a very bad setup. You should always avoid polling if possible. Instead of sending a request every 6 seconds from the client to the server, send data from the server to the clients: check on the server side whether there is any change in the data, then transfer the data to the clients using websockets. You can use nodejs on the server side to monitor any changes in the data.
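On the browser side that collapses the polling loop into a push handler, for example (a sketch; the event name matches the server sketch above, and the address and updateView are hypothetical):
var socket = io('http://example.com:3000'); // assumed server address
socket.on('table-update', function(data) {
    updateView(data); // hypothetical: re-render the affected tables
});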

SignalR Stops Working After A While

For some reason, SignalR will just stop calling client methods after a short period of time (about 1 hour or less I estimate). I have a page that shows Alerts... a very simple implementation. Here's the Javascript:
$(function () {
    // enable logging for debugging
    $.connection.hub.logging = true;

    // Declare a proxy to reference the hub.
    var hub = $.connection.alertHub;

    hub.client.addAlert = function (id, title, url, dateTime) {
        console.log(title);
    };

    $.connection.hub.start().done(function () {
        console.log("Alert Ready");
    });
});
If I refresh the page, it works again for about an hour, then it will stop calling the client event addAlert. There are no errors in the log, no warnings. The last event in the log (other than the pings to the server) is:
[15:18:58 GMT-0600 (CST)] SignalR: Triggering client hub event 'addAlert' on hub 'AlertHub'.
Many of these events will come in for a short while, then just stop, even though the server should still be sending them.
I am using Firefox 35.0.1 on Mac and SignalR 2.0.0.
I realize that a work-around is to force a page refresh every 10 mins or so, but I'm looking for a way to fix the root cause of the problem.
I enabled SignalR tracing on the server. I created an "alert" on the server after a fresh refresh of the Alert page and the alert came through. I waited about 10 mins and I tried it again, and it failed to come through. Here's what the logs read (sorry for the verbosity, not sure what was relevant):
SignalR.Transports.TransportHeartBeat Information: 0 : Connection b8b21c4c-22b4-4686-9098-cb72c904d4c9 is New.
SignalR.Transports.TransportHeartBeat Verbose: 0 : KeepAlive(b8b21c4c-22b4-4686-9098-cb72c904d4c9)
SignalR.Transports.TransportHeartBeat Verbose: 0 : KeepAlive(b8b21c4c-22b4-4686-9098-cb72c904d4c9)
SignalR.Transports.TransportHeartBeat Verbose: 0 : KeepAlive(b8b21c4c-22b4-4686-9098-cb72c904d4c9)
SignalR.Transports.TransportHeartBeat Verbose: 0 : KeepAlive(b8b21c4c-22b4-4686-9098-cb72c904d4c9)
There are dozens more of the SignalR.Transports.TransportHeartBeat messages, but nothing else.
I think there's a default timeout of 110 seconds for SignalR. You can try the SignalR disconnected event to reconnect:
$.connection.hub.disconnected(function () {
    setTimeout(function () {
        startHub();
    }, 5000);
});
In startHub() you can start the connection again.
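For example (a sketch, reusing the start call from the question):
function startHub() {
    $.connection.hub.start().done(function () {
        console.log("Alert Ready");
    });
}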
References: https://github.com/SignalR/SignalR/issues/3128 and "How to use SignalR events to keep connection alive in the right way?"
As it turns out, the problem was the way I was handling the AlertHub connections. I am using Enterprise Library Caching to store the connections backing the AlertHub, and I was expiring the cache entries 20 minutes after they were created. Ergo, when the server called the client method, no errors were reported because there were no clients to send the messages to.
I have since increased the cache expiration to a reasonable value, which solved the problem.
You can refresh the page if the client is inactive, i.e. no mouse movement (roughly every 15-30 min). I had the same problem and solved it that way. It was a nasty workaround, but later I forgot about it and never fixed it completely ;)
