I have a react app which updates data from a database on a 2 second timer:
componentDidMount() {
  this.interval = setInterval(() => this.getSelPos(), 2000);
}

componentWillUnmount() {
  clearInterval(this.interval);
}

async getSelPos() {
  const response = await axios.get("http://localhost:5000/selector", { crossdomain: true });
  console.log(response.data);
  this.setState({ SelectorPos: parseInt(response.data.selector) });
  this.setState({ Pressure: parseInt(response.data.pressure) });
  this.setState({ Temperature: parseInt(response.data.temperature) });
}
This just sends a request to a Node API that gets a value from my database and responds. It's currently set to poll every 2 seconds and update the page.
The problem I'm running into is that when I tab out or minimise the browser for a few minutes, it'll hang for a few seconds on returning, or it'll crash entirely. I suspect this is due to the browser being smart and freezing the tab when not in use, but I'm stumped as to how to fix this issue.
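One way to sidestep that throttling would be to pause the polling while the tab is hidden, using the Page Visibility API. A minimal sketch against the component above (illustrative only; the startPolling/stopPolling helpers are made up, not part of the original app):
// Sketch: stop the interval while the tab is hidden so callbacks don't
// pile up while the browser throttles or freezes the background tab.
componentDidMount() {
  this.startPolling();
  document.addEventListener('visibilitychange', this.handleVisibility);
}

componentWillUnmount() {
  this.stopPolling();
  document.removeEventListener('visibilitychange', this.handleVisibility);
}

handleVisibility = () => {
  document.hidden ? this.stopPolling() : this.startPolling();
};

startPolling() {
  if (!this.interval) this.interval = setInterval(() => this.getSelPos(), 2000);
}

stopPolling() {
  clearInterval(this.interval);
  this.interval = null;
}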
I am developing a PWA with node.js.
It is installable and it runs nicely.
Lately I tried to implement Push Notifications as explained here:
https://developer.mozilla.org/en-US/docs/Web/Progressive_web_apps/Re-engageable_Notifications_Push
This is the express route to trigger a notification for all subscriptions (only for testing purposes)
app.get('/notifyall/', function(req, res) {
subscriptions.forEach(subObject => {
webPush.sendNotification(subObject.sub, "notify All");
})
res.send("sent");
});
This is the event handler in the service worker:
self.addEventListener('push', function(event) {
const payload = event.data ? event.data.text() : 'no payload';
console.debug("sw push event");
event.waitUntil(
self.registration.showNotification('ServiceWorker Cookbook', {
body: payload,
renotify: true,
tag: 'test'
})
);
});
This is how I register a subscription to the push service:
sub = navigator.serviceWorker.ready
  .then(function(registration) {
    return registration.pushManager.getSubscription()
      .then(async function(subscription) {
        console.debug("pwasetup: getSubscription");
        if (subscription) {
          console.debug("sub found");
          sub = subscription;
          return subscription;
        }
        const response = await fetch('/vapidPublicKey/');
        const vapidPublicKey = await response.text();
        // Chrome doesn't accept the base64-encoded (string) vapidPublicKey yet
        const convertedVapidKey = frontendfuncs.urlBase64ToUint8Array(vapidPublicKey);
        const newSub = await registration.pushManager.subscribe({
          userVisibleOnly: true,
          applicationServerKey: convertedVapidKey
        });
        console.debug("sub: " + JSON.stringify(newSub));
        return newSub;
      });
  })
  .then(function(subscription) {
    // Send the subscription details to the server using the Fetch API.
    console.debug("sub: " + JSON.stringify(subscription));
    Notification.requestPermission(function(result) {
      if (result === "granted") {
        fetch('/register/', {
          method: 'post',
          headers: {
            'Content-type': 'application/json'
          },
          body: JSON.stringify({
            subscription: subscription
          })
        });
      }
    });
  });
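For context, the /register/ route that the client POSTs to above isn't shown in the question. A minimal sketch of what it might look like, assuming subscriptions is the same in-memory array that the /notifyall/ route iterates over (the { sub: ... } shape matches the subObject.sub used there):
// Illustrative sketch only, not the original server code.
app.use(express.json()); // or body-parser, depending on the Express version

app.post('/register/', function(req, res) {
  // Keep the shape { sub: <PushSubscription> } so /notifyall/ can read subObject.sub
  subscriptions.push({ sub: req.body.subscription });
  res.sendStatus(201);
});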
I tested receiving push notifications with three different mobile browsers on my Samsung Galaxy S10+: Chrome, Firefox and Samsung Internet.
(Of course in the installed/A2HS version of my app. The push notification was triggered by the code in the /notifyall/ route mentioned above.)
When the app is open, or the service worker is still in a running state, the notification is received and displayed without any problems.
But after that things become less easy in Chrome for Android and Samsung Internet:
Chrome:
Firing a notification about 2 minutes after the app was closed, the notification will not be displayed until I open the PWA again. (Bonus question: why after 2 minutes? I thought Chrome closes the service worker after 30 seconds.)
Internet (Samsung):
After the sw is stopped, the notification arrives on display unlock. Not earlier.
Firefox:
Seems to get the push notification at any time and displays it, as it should.
Does anyone know what the problems are here in Chrome and Samsung Internet?
I just want to notify my users at any time, not only under certain circumstances...
Thanks!
Problem solved. Easier than it sounds...
Since the PWA needs to be installed/A2HS to be able to receive notifications at any time, I tried to give all the needed settings to the PWA itself (notifications allowed, background optimization disabled, etc.).
Because of that, I never thought of disabling background optimization for the main browser app too, which is necessary to wake up the service worker of the PWA.
All three browsers work now as intended.
Code Sandbox: https://codesandbox.io/s/new-breeze-d260k
The code:
This is a simple auth app
When you click Login, the accessToken is exchanged and stored in memory while the refreshToken is stored in localStorage.
While the accessToken is valid (here a timestamp), the Home page shows the protected content.
At every page reload (i.e. App initialization), the refreshToken is sent to the server and, if it is valid, a new accessToken is exchanged.
The problem:
To refresh the token on App initialization, I have an onRefreshToken() function in a useEffect that should be executed once (I wanted to pass an empty array as the dependency, but typescript/eslint complains and suggests that onRefreshToken() should be the dependency. I admit that I don't understand why it is recommended to always have a dependency when you want the effect to be executed only once).
Once the token is renewed, I store the accessToken and the user profile in their respective context.
An infinite re-render loop begins. On my local server this is due to setProfile() and not setAccessToken(), but I don't understand why.
Side note
The above issue is the main issue of this post, but on a side note, the login/logout process doesn't sync between tabs, so if you have any idea why, I would be happy to hear your advice on that point as well.
Happy new year
One way to fix this would be to check to see if you have an access token and only refresh it if you need to:
export default function App() {
  const { accessToken } = useAuthContext();
  const { onRefreshToken, onSyncLogin, onSyncLogout } = useAuth();

  useEffect(() => {
    const refresh = async () => {
      await onRefreshToken();
    };
    !accessToken && refresh();
  }, [onRefreshToken, accessToken]);

  // ...rest of the component
}
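For the side note about login/logout not syncing between tabs: one common approach (a sketch only, not part of the sandbox) is to write a marker key to localStorage on login/logout and listen for the storage event in the other tabs. Here I assume onSyncLogin / onSyncLogout (already exposed by useAuth above) update the auth context accordingly; the 'auth-event' key is made up:
// Sketch: `storage` events fire in the *other* tabs when localStorage changes,
// so writing a marker on login/logout lets every open tab react to it.
useEffect(() => {
  const onStorage = (e) => {
    if (e.key !== 'auth-event' || !e.newValue) return;
    e.newValue.startsWith('logout') ? onSyncLogout() : onSyncLogin();
  };
  window.addEventListener('storage', onStorage);
  return () => window.removeEventListener('storage', onStorage);
}, [onSyncLogin, onSyncLogout]);

// elsewhere, right after a successful login or logout:
// localStorage.setItem('auth-event', 'login:' + Date.now());
// localStorage.setItem('auth-event', 'logout:' + Date.now());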
I'm working on a Django project and the Javascript for one of my pages makes some XHR requests that can take a while to complete. If the user goes to this page, they have to wait for these requests to finish before the site lets them go to another page.
I've been trying to have the page cancel these requests when clicking through to another page, but even when the requests cancel, the page still waits quite a while before changing URLs. This is how my requests are structured:
const controller = new AbortController()
const signal = controller.signal

let get_salesperson_sales = async () => {
    return await fetch(`/reports/salesperson-sales/${fromDate && toDate ? `?from_date=${fromDate}&to_date=${toDate}` : ''}`, { signal })
        .then(response => response.json())
        .then(response => {
            document.getElementById('total_rep_sales').innerText = '$' + Number(response.total_sales.toFixed(2)).toLocaleString()
            return response
        })
        .catch(error => {
            if (error.name === 'AbortError') console.log('Salesperson sales fetch was aborted.')
            else errorMessage.innerHTML += '<p>Error getting sales by rep: ' + (error.responseJSON ? error.responseJSON.details : 'Unexpected error') + '</p>'
        })
}
There are 4 of these functions in total that I call with Promise.allSettled(), roughly as in the sketch below (only get_salesperson_sales is shown above; the other loader names in the sketch are placeholders).
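// Rough sketch only: get_salesperson_sales is shown above, the other three
// loader names are placeholders. All four share the same AbortController
// signal, so a single controller.abort() cancels every pending request.
Promise.allSettled([
    get_salesperson_sales(),
    get_customer_sales(),   // placeholder
    get_product_sales(),    // placeholder
    get_monthly_sales()     // placeholder
]).then(results => console.log('Report requests settled:', results))
At the bottom of the page, I bind a couple of functions to abort the requests: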
$('#date-filter').submit(() => {
// Stop loading current report before submitting
controller.abort()
return true
})
$(window).bind('beforeunload', () => {
controller.abort()
})
The requests cancel successfully; I can see this both in the Network tab of my browser and from my console logs showing up in the console. But for some reason the page continues to wait after the requests are cancelled.
The part that's really tripping me up is that, while this happens in my production build, the page seems to change immediately upon cancelling requests when I run the project on localhost. The only major configuration difference between the web app in both builds is that Django's DEBUG is True in the local build and False in the production build. I've tried 4 different methods of cancelling requests and 2 different kinds of request ($.ajax and fetch), each time successfully cancelling the requests and running into this problem.
What could possibly be causing this issue and how can I force my page to change immediately instead of waiting like this?
I know that the Background Sync API is not supported in the Apple ecosystem, so how would you get around that and build a solution that works in the Apple ecosystem as well as on other platforms? I currently have a solution that uses the Background Sync API, and for some reason it does literally nothing on iOS: it just saves the failed requests and then never syncs. Could I just access the sync queue somehow, with an IndexedDB wrapper, and then sync at an arbitrary time?
I tried it once and it broke everything. Do you have an idea how?
const bgSyncPlugin = new workbox.backgroundSync.Plugin('uploadQueue', {
maxRetentionTime: 60 * 24 * 60,
onSync: async ({ queue }) => {
return getAccessToken().then((token) => {
replayQueue(queue, token).then(() => {
return showNotification();
});
});
},
});
This is the code I have. Every part of it has a purpose: since my token has a timeout, I have to check whether the token is expired, proceed accordingly, and replace the token in the headers if it has expired, and I also have to change data in the request bodies when I sync. It all works fine on anything other than Apple devices; Apple devices never trigger the onSync. I tried listening to fetch events and triggering the sync with:
self.registration.sync.register('uploadQueue');
But to no avail. I also tried to register the sync on service worker registration, and nothing seems to help.
If sync registration is not viable on iOS, can I access the upload queue table somehow?
P.S.: I'm using dexie.js as an IndexedDB wrapper; it is a Vue.js app with a Laravel API, and the sync process is quite complex, but it is working. I just have to figure out how to do it on iOS!
I have found an answer to this after about 2 weeks of it being on my mind and on my to-do list.
Now get some popcorn and strap yourself the heck in, because this is quite a chonker.
In my case the sync process was pretty complex, as my users could be away from any connection for so long that my access tokens would expire, so I had to check for access token expiration as well and re-fetch it.
Furthermore, my users could add new people to the database of people, which all had their own unique server-side IDs, so I had to order my requests in a way that the person registrations are sent first, and then the tasks and campaigns that were completed for them, so I can receive the respective IDs from the API.
Now for the fun part:
Firstly, you can't use a bgSyncPlugin, because you can't access the replayQueue; you have to use a normal queue, like this:
var bgSyncQueue = new workbox.backgroundSync.Queue('uploadQueue', {
maxRetentionTime: 60 * 24 * 60,
onSync: () => syncData(),
});
And push the failed requests to the queue inside the fetch listener:
this.onfetch = (event) => {
  let requestClone = event.request.clone();
  if (requestClone.method === 'POST' && 'condition to match the requests you need to replay') {
    event.respondWith(
      (() => {
        const promiseChain = fetch(requestClone).catch(() => {
          // Workbox queues entries of the shape { request }, so hand it the original request
          return bgSyncQueue.pushRequest({ request: event.request });
        });
        event.waitUntil(promiseChain);
        return promiseChain;
      })()
    );
  } else {
    event.respondWith(fetch(event.request));
  }
};
When the user has a connection, we trigger the syncData() function. On iOS this is a bit complicated (more on that later); on Android it happens automatically, as the service worker sees it has a connection. Now let's look at what syncData does:
async function syncData() {
  if (bgSyncQueue) // is there data to sync?
    return getAccessToken() // then get the access token, refreshing it if expired
      .then((token) => replayQueue(bgSyncQueue, token).then(() => showNotification({ body: 'Successful sync', title: 'Data synced to server' })))
      .catch(() => showNotification({ title: 'Sync unsuccessful', body: 'Please find an area with better coverage' })); // replay the requests and show a notification
  return Promise.resolve('empty'); // if there are no requests to replay, return 'empty'
}
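getAccessToken isn't shown here; a rough sketch of the idea, assuming the token and its expiry timestamp are cached in a Dexie (IndexedDB) table and re-fetched from the API when stale. The table name, record shape and refresh endpoint below are placeholders, not the real ones:
// Rough sketch only; names are placeholders. In the service worker you would
// need Dexie available via importScripts or a bundler.
async function getAccessToken() {
  const db = new Dexie('appDb');
  db.version(1).stores({ auth: 'id' });

  const cached = await db.auth.get('token');
  if (cached && cached.expiresAt > Date.now()) return cached.value; // still valid

  // Expired or missing: fetch a fresh token from the API (placeholder endpoint)
  const res = await fetch('/api/token/refresh', { method: 'POST' });
  const { token, expiresIn } = await res.json();
  await db.auth.put({ id: 'token', value: token, expiresAt: Date.now() + expiresIn * 1000 });
  return token;
}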
For the Android/desktop side of things we are finished; you can be happy with your modified data being synced. On iOS, though, we can't just have the user's data be uploaded only when they restart the PWA, as that's bad user experience. But we are playing with JavaScript, so everything is possible one way or another.
There is a message event that can be fired every time the client code sees that it has internet, which looks like this:
if (this.$online && this.isIOSDevice) {
if (window.MessageChannel) {
var messageChannel = new MessageChannel();
messageChannel.port1.onmessage = (event) => {
this.onMessageSuccess(event);
};
} else {
navigator.serviceWorker.onmessage = (event) => {
this.onMessageSuccess(event);
};
}
navigator.serviceWorker.ready.then((reg) => {
try {
reg.active.postMessage(
{
text: 'sync',
port: messageChannel && messageChannel.port2,
},
[messageChannel && messageChannel.port2]
);
} catch (e) {
//firefox support
reg.active.postMessage({
text: 'sync',
});
}
});
}
This is inside a Vue.js watch function, which watches whether we have a connection or not. If we have a connection, it also checks whether this is a device from the Apple ecosystem, like so:
isIOSDevice() {
  return !!navigator.platform && /iPad|iPhone|MacIntel|iPod/.test(navigator.platform) && /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
}
And so it tells the service worker that it has internet and that it has to sync; in that case this bit of code gets activated:
this.onmessage = (event) => {
if (event.data.text === 'sync') {
event.waitUntil(
syncData().then((res) => {
if (res !== 'empty') {
if (event.source) {
event.source.postMessage('doNotification'); // tell the client code to show a notification (I have a notification system built into the app that doesn't use push notifications; it just shows a little pill at the bottom of the app with the message)
} else if (event.data.port) {
event.data.port.postMessage('doNotification'); //same thing
}
return res;
}
})
);
}
};
Now for the most useful part in my opinion, the replayQueue function. This guy gets the queue and the token from getAccessToken, and then it does its thing like clockwork:
const replayQueue = async (queue, token) => {
let entry;
while ((entry = await queue.shiftRequest())) {//while we have requests to replay
let data = await entry.request.clone().json();
try {
//replay the person registrations first and store them into indexed db
if (isPersonRequest) {
//if new person
await fetchPerson(entry, data, token);
//then replay the campaign and task submissions
} else if (isTaskOrCampaignRequest) {
//if task
await fetchCampaigns(entry, data, token);
}
} catch (error) {
showNotification({ title: 'no success', body: 'go for better internet plox' });
await queue.unshiftRequest(entry); //put failed request back into queue, and try again later
}
}
return Promise.resolve();
};
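fetchPerson and fetchCampaigns aren't shown above; a rough sketch of what fetchPerson does conceptually. The endpoint, header and Dexie table names here are placeholders; the important part is replaying the queued request with a fresh token and remembering the server-side id the API returns:
// Rough sketch only; names are placeholders, not the real implementation.
const db = new Dexie('appDb');
db.version(1).stores({ people: 'localId' });

const fetchPerson = async (entry, data, token) => {
  const response = await fetch(entry.request.url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Bearer ' + token, // swap in the (possibly refreshed) token
    },
    body: JSON.stringify(data),
  });
  if (!response.ok) throw new Error('person upload failed'); // caught by replayQueue, request is re-queued

  // Remember the server-generated id next to the locally created person,
  // so the queued task/campaign requests can be rewritten to point at it.
  const { id: serverId } = await response.json();
  await db.people.update(data.localId, { serverId });
};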
Now this is the big picture of how to use this on iOS devices and make Apple mad as heck :) I am open to any related questions. In this time I think I have become pretty good with service-worker-related stuff, as this was not the only difficult part of this project, but I digress, that's a story for another day.
(You may see that the error handling is not perfect and maybe this thing is not the most secure of them all, but this project has a pretty small number of users, a fixed group who know how to use it and what it does, so I'm not really afraid of security in this case. You may want to improve on things if you use it in a more serious project.)
Hope I could help, and have a great day, all of you.
Why can't I close the server by requesting localhost:13777/close in the browser (it continues to accept new requests), while it does gracefully close on the 15000 ms timeout? Node version is 0.10.18. I ran into this problem while trying to use the code example from the docs on exception handling with domains (it gave me a 'Not running' error every time I requested the error page a second time) and eventually reduced it to this code.
var server
server = require("http").createServer(function(req,res){
if(req.url == "/close")
{
console.log("Closing server (no timeout)")
setTimeout(function(){
console.log("I'm the timeout")
}, 5000);
server.close(function(){
console.log("Server closed (no timeout)")
})
res.end('closed');
}
else
{
res.end('ok');
}
});
server.listen(13777,function(){console.log("Server listening")});
setTimeout(function(){
console.log("Closing server (timeout 15000)")
server.close(function(){console.log("Server closed (timeout 15000)")})
}, 15000);
The server is still waiting on requests from the client. The client is utilizing HTTP keep-alive.
I think you will find that while the existing client can make new requests (as the connection is already established), other clients won't be able to.
Node.js doesn't implement a complex service layer on top of http.Server. By calling server.close() you are instructing the server to no longer accept any "new" connections. When an HTTP Connection: keep-alive is issued, the server will keep the socket open until the client terminates it or the timeout is reached. Additional clients will not be able to issue requests.
The timeout can be changed using server.setTimeout() https://nodejs.org/api/http.html#http_server_settimeout_msecs_callback
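For example (a sketch; the value is arbitrary):
// Drop idle keep-alive sockets after 5 seconds instead of the default
// (2 minutes on Node versions of that era), so server.close() can finish sooner.
server.setTimeout(5000);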
Remember that if a client created a connection before close was called, that connection can continue to be used.
It seems that a lot of people do not like this current functionality but this issue has been open for quite a while:
https://github.com/nodejs/node/issues/2642
As the other answers point out, connections may persist indefinitely and the call to server.close() will not truly terminate the server if any such connections exist.
We can write a simple wrapper function which attaches a destroy method to a given server that terminates all connections, and closes the server (thereby ensuring that the server ends nearly immediately!)
Given code like this:
let server = http.createServer((req, res) => {
// ...
});
later(() => server.close()); // Fails to reliably close the server!
We can define destroyableServer and use the following:
let destroyableServer = server => {
// Track all connections so that we can end them if we want to destroy `server`
let sockets = new Set();
server.on('connection', socket => {
sockets.add(socket);
socket.once('close', () => sockets.delete(socket)); // Stop tracking closed sockets
});
server.destroy = () => {
for (let socket of sockets) socket.destroy();
sockets.clear();
return new Promise((rsv, rjc) => server.close(err => err ? rjc(err) : rsv()));
};
return server;
};
let server = destroyableServer(http.createServer((req, res) => {
// ...
}));
later(() => server.destroy()); // Reliably closes the server almost immediately!
Note the overhead of entering every unique socket object into a Set.