I'm working on a Django project, and the JavaScript for one of my pages makes some XHR requests that can take a while to complete. If the user visits this page, they have to wait for these requests to finish before the site lets them navigate to another page.
I've been trying to have the page cancel these requests when the user clicks through to another page, but even when the requests are cancelled, the page still waits quite a while before changing URLs. This is how my requests are structured:
const controller = new AbortController()
const signal = controller.signal

let get_salesperson_sales = async () => {
    // Note: the query string needs a leading '?' before its parameters
    return await fetch(`/reports/salesperson-sales/${fromDate && toDate ? `?from_date=${fromDate}&to_date=${toDate}` : ''}`, { signal })
        .then(response => response.json())
        .then(response => {
            document.getElementById('total_rep_sales').innerText = '$' + Number(response.total_sales.toFixed(2)).toLocaleString()
            return response
        })
        .catch(error => {
            if (error.name === 'AbortError') console.log('Salesperson sales fetch was aborted.')
            // The ternary must be parenthesized: '+' binds tighter than '?:', so without
            // parentheses the whole concatenated string becomes the condition
            else errorMessage.innerHTML += '<p>Error getting sales by rep: ' + (error.responseJSON ? error.responseJSON.details : 'Unexpected error') + '</p>'
        })
}
There are four of these functions in total that I call with Promise.allSettled().
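Roughly, the wiring looks like this (a sketch: the other three fetch functions follow the same shape as get_salesperson_sales, and their names here are assumptions):

let load_reports = async () => {
    // Fire off all four report requests at once; allSettled resolves even
    // if some of them are aborted or fail
    return await Promise.allSettled([
        get_salesperson_sales(),
        get_product_sales(),   // hypothetical name
        get_region_sales(),    // hypothetical name
        get_monthly_totals()   // hypothetical name
    ])
}

At the bottom of the page, I bind a couple of functions to abort the requests.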
$('#date-filter').submit(() => {
    // Stop loading current report before submitting
    controller.abort()
    return true
})
$(window).bind('beforeunload', () => {
    controller.abort()
})
The requests cancel successfully (I can see this both in the Network tab of my browser and from my console logs showing up in the console), but for some reason the page continues to wait after the requests are cancelled.
The part that's really tripping me up is that, while this happens in my production build, the page changes immediately upon cancelling the requests when I run the project on localhost. The only major configuration difference between the two builds is that Django's DEBUG is True locally and False in production. I've tried four different methods of cancelling requests and two different kinds of request ($.ajax and fetch); each time the requests cancel successfully, and each time I run into this problem.
What could be causing this issue, and how can I force the page to change immediately instead of waiting like this?
I know that the Background Sync API is not supported in the Apple ecosystem, so how would you work around that and build a solution that works on Apple platforms as well as everywhere else? I currently have a solution that uses the Background Sync API, and on iOS it literally does nothing: it saves the failed requests and then never syncs. Could I just access the sync queue somehow, with an IndexedDB wrapper, and then sync at an arbitrary time?
I tried that once and it broke everything. Does anyone have an idea how to do this?
const bgSyncPlugin = new workbox.backgroundSync.Plugin('uploadQueue', {
    maxRetentionTime: 60 * 24 * 60,
    onSync: async ({ queue }) => {
        return getAccessToken().then((token) => {
            // return the inner promise so onSync waits for the replay to finish
            return replayQueue(queue, token).then(() => {
                return showNotification();
            });
        });
    },
});
This is the code I have. It all has a purpose: since my token has a timeout, I have to check whether the token is expired, proceed accordingly, and replace the token in the headers if it has expired; I also have to change data in the request bodies when I sync. It all works fine on anything other than Apple devices. Apple devices never trigger onSync. I tried listening to fetch events and triggering the sync myself with:
self.registration.sync.register('uploadQueue');
But to no avail. I also tried to register the sync on service worker registration; nothing seems to help.
If sync registration is not viable on iOS, can I access the upload queue table somehow?
P.S.: I'm using Dexie.js as an IndexedDB wrapper. It's a Vue.js app with a Laravel API, and the sync process is quite complex, but it works; I just have to figure out how to do it on iOS!
I found an answer to this after it had been on my mind and on my to-do list for about two weeks.
Now get some popcorn and strap yourself in, because this is quite a chonker.
In my case the sync process was pretty complex: my users could be away from any connection for so long that my access tokens would expire, so I had to check for access token expiration as well and re-fetch the token.
Furthermore, my users could add new people to the database of people, each of whom gets their own unique server-side ID, so I had to order my requests so that the person registrations are sent first, then the tasks and campaigns completed for them, so I can receive the respective IDs from the API.
Now for the fun part:
Firstly, you can't use bgSyncPlugin, because you can't access its replayQueue; you have to use a plain queue, like this:
var bgSyncQueue = new workbox.backgroundSync.Queue('uploadQueue', {
    maxRetentionTime: 60 * 24 * 60,
    onSync: () => syncData(),
});
And push the failed requests to the queue inside the fetch listener:
this.onfetch = (event) => {
    let requestClone = event.request.clone();
    // replace the string below with your own condition matching the requests you need to replay
    if (requestClone.method === 'POST' && 'condition to match the requests you need to replay') {
        event.respondWith(
            (() => {
                const promiseChain = fetch(requestClone).catch(() => {
                    // pushRequest expects an object with a `request` property
                    return bgSyncQueue.pushRequest({ request: event.request });
                });
                event.waitUntil(promiseChain);
                return promiseChain;
            })()
        );
    } else {
        event.respondWith(fetch(event.request));
    }
};
When the user has connection, we trigger the syncData() function. On iOS this is a bit complicated (more on this later); on Android it happens automatically, as the service worker sees it has connection. Now let's look at what syncData does:
async function syncData() {
    if (bgSyncQueue) // is there data to sync?
        return getAccessToken() // then get the access token, refreshing it if expired
            .then((token) => replayQueue(bgSyncQueue, token).then(() => showNotification({ body: 'Successful sync', title: 'Data synced to server' })))
            .catch(() => showNotification({ title: 'Sync unsuccessful', body: 'Please find an area with better coverage' })); // replay the requests and show a notification
    return Promise.resolve('empty'); // if there are no requests to replay, return 'empty'
}
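getAccessToken isn't shown in the answer; here is a minimal sketch of what it could look like, assuming a hypothetical /api/token/refresh endpoint and a token cached in the service worker scope (the names and storage choice are assumptions, not the author's):

let cachedToken = null;
let tokenExpiresAt = 0; // epoch milliseconds

async function getAccessToken() {
    // reuse the cached token while it is still valid
    if (cachedToken && Date.now() < tokenExpiresAt) {
        return cachedToken;
    }
    // otherwise fetch a fresh one (endpoint name is hypothetical)
    const response = await fetch('/api/token/refresh', { method: 'POST' });
    if (!response.ok) {
        throw new Error('Could not refresh access token');
    }
    const { token, expiresIn } = await response.json();
    cachedToken = token;
    tokenExpiresAt = Date.now() + expiresIn * 1000;
    return cachedToken;
}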
For the Android/desktop side of things we are finished; you can be happy with your modified data being synced. On iOS, though, we can't just have the user's data be uploaded only when they restart the PWA; that's bad user experience. But we are playing with JavaScript, and everything is possible one way or another.
There is a message event that can be fired every time the client code sees that it has internet; it looks like this:
if (this.$online && this.isIOSDevice) {
    if (window.MessageChannel) {
        var messageChannel = new MessageChannel();
        messageChannel.port1.onmessage = (event) => {
            this.onMessageSuccess(event);
        };
    } else {
        navigator.serviceWorker.onmessage = (event) => {
            this.onMessageSuccess(event);
        };
    }
    navigator.serviceWorker.ready.then((reg) => {
        try {
            reg.active.postMessage(
                {
                    text: 'sync',
                    port: messageChannel && messageChannel.port2,
                },
                [messageChannel && messageChannel.port2]
            );
        } catch (e) {
            // firefox support
            reg.active.postMessage({
                text: 'sync',
            });
        }
    });
}
This is inside a Vue.js watch function that watches whether we have connection or not. If we have connection, it also checks whether this is a device from the Apple ecosystem, like so:
isIOSDevice() {
    return !!navigator.platform && /iPad|iPhone|MacIntel|iPod/.test(navigator.platform) && /^((?!chrome|android).)*safari/i.test(navigator.userAgent);
}
And so the client tells the service worker that it has internet and needs to sync; in that case this bit of code gets activated:
this.onmessage = (event) => {
    if (event.data.text === 'sync') {
        event.waitUntil(
            syncData().then((res) => {
                if (res !== 'empty') {
                    if (event.source) {
                        // tell the client code to show a notification (the app has a built-in
                        // notification system that just shows a little pill at the bottom of
                        // the app with the message; no push notifications involved)
                        event.source.postMessage('doNotification');
                    } else if (event.data.port) {
                        event.data.port.postMessage('doNotification'); // same thing, via the MessageChannel port
                    }
                    return res;
                }
            })
        );
    }
};
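onMessageSuccess isn't shown in the answer either; a hypothetical sketch of the client-side handler, where showPill stands in for the app's built-in notification pill mentioned above:

onMessageSuccess(event) {
    // the service worker posts 'doNotification' after a successful sync
    if (event.data === 'doNotification') {
        this.showPill('Data synced to server'); // showPill is a hypothetical helper
    }
}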
Now for the most useful part, in my opinion: the replayQueue function. It gets the queue and the token from getAccessToken, and then it works like clockwork:
const replayQueue = async (queue, token) => {
    let entry;
    while ((entry = await queue.shiftRequest())) { // while we have requests to replay
        let data = await entry.request.clone().json();
        try {
            // isPersonRequest / isTaskOrCampaignRequest are placeholders for
            // your own checks on the request URL or body
            if (isPersonRequest) {
                // replay the person registrations first and store them into IndexedDB
                await fetchPerson(entry, data, token);
            } else if (isTaskOrCampaignRequest) {
                // then replay the campaign and task submissions
                await fetchCampaigns(entry, data, token);
            }
        } catch (error) {
            showNotification({ title: 'no success', body: 'go for better internet plox' });
            await queue.unshiftRequest(entry); // put the failed request back into the queue to try again later
        }
    }
    return Promise.resolve();
};
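fetchPerson and fetchCampaigns aren't shown either; a hypothetical sketch of fetchPerson, re-issuing the queued request with a fresh Authorization header (the response shape and the ID bookkeeping are assumptions based on the description above):

async function fetchPerson(entry, data, token) {
    // re-issue the queued request with a fresh Authorization header
    const response = await fetch(entry.request.url, {
        method: entry.request.method,
        headers: {
            'Content-Type': 'application/json',
            'Authorization': `Bearer ${token}`,
        },
        body: JSON.stringify(data),
    });
    if (!response.ok) {
        // throwing makes replayQueue put the entry back into the queue
        throw new Error(`Person sync failed with status ${response.status}`);
    }
    // the API responds with the new server-side ID; the author stores it
    // (e.g. via Dexie) so the later task/campaign requests can reference it
    const { id } = await response.json();
    return id;
}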
Now, this is the big picture of how to use this on iOS devices and make Apple mad as heck :) I'm open to any related questions. By now I think I've become pretty good with service-worker-related stuff, as this was not the only difficult part of this project, but I digress; that's a story for another day.
(You may notice that the error handling is not perfect, and this is maybe not the most secure setup of them all, but this project has a pretty small, fixed number of users who know how to use it and what it does, so I'm not really worried about security in this case. You may want to improve on things if you use this in a more serious project.)
Hope I could help, and have a great day, all of you.
Is it possible to make two backend requests at once from React?
The code below is the first backend call. The POST request gets sent to the backend, and then I would like to make another request. Is that possible at all, or do I have to wait for the backend response before the next request can be made?
What I basically want is to get information about how many files have been uploaded. The upload can take 3 minutes, and right now the user only sees a loading icon. I want to additionally show a text like "50 of 800 literatures uploaded" and, 10 seconds later, "100 of 800 literatures uploaded".
This is basically my code:
class ProjectLiterature extends Component {
    constructor(props) {
        super(props);
        this.state = {
            isLoading: false,  // was the string "false", which is truthy
        };
    }

    addLiterature(data, project_name) {
        this.setState({ isLoading: true }, () => {
            axios.post("http://127.0.0.1:5000/sendLiterature", data)  // stray '}' removed
                .then(res => {
                    this.setState({ isLoading: false });
                });
        });
    }
}
If both requests do not depend on each other, you can make use of JavaScript's Promise.all() for the above purpose.
const request1 = axios.get('http://127.0.0.1:5000/sendLiterature');
const request2 = axios.get(url2);

// note the parentheses around the destructured parameter
Promise.all([request1, request2]).then(([res1, res2]) => {
    // handle the rest
}).catch((error) => {
    console.error(error);
    // carry out error handling
});
If the second request relies on the response of the first, you will have to wait for the first request to complete, since the two must be carried out in sequence:
// inside an async function
const res = await axios.get('http://127.0.0.1:5000/sendLiterature');
// carry out the rest
See the axios docs for this; axios supports multiple simultaneous requests out of the box.
You can use Promise.all instead of axios.all as well, but if one of the requests fails, you won't be able to get the responses of the successful calls. If you want the successful responses even when some calls fail, use Promise.allSettled, sketched below.
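A minimal sketch of the Promise.allSettled variant, reusing request1 and request2 from above:

Promise.allSettled([request1, request2]).then((results) => {
    results.forEach((result, i) => {
        if (result.status === 'fulfilled') {
            // result.value is the axios response object
            console.log(`request ${i + 1} succeeded`, result.value.data);
        } else {
            // result.reason is the error; the other response is still usable
            console.error(`request ${i + 1} failed`, result.reason);
        }
    });
});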
I'm building a PWA with limited offline capability, and I'm using this code to save page content to a dynamic cache every time the user visits a new URL:
self.addEventListener('fetch', function(event) {
    event.respondWith(
        fetch(event.request)
            .then(function(res) {
                return caches.open('cache')
                    .then(function(cache) {
                        cache.put(event.request.url, res.clone());
                        return res;
                    })
            })
            .catch(function(err) {
                console.log(err);
                return caches.match(event.request);
            })
    );
});
This works great: after a page is loaded, all of its assets are cached and can be seen in offline mode.
But I would also like to add the option to automatically cache some of the more important URLs when the user comes back online.
I do that by putting the list of URLs in an array, looping through it, and sending a fetch request to each URL, so those pages can be cached without the user visiting/revisiting them.
The problem is that when I do this, some of the assets on some pages are not cached, for example the Google map on one page. Is there a way to simulate a real visit to a page, i.e. fetch all of the assets of a URL with a fetch request?
Fetch code:
function fillDynamicCache(user_id = false) {
    let urls = [
        '/homepage',
        '/someotherpage',
        '/thirdpage',
        '/...',
    ];
    urls.map((url, id) => (
        fetch(url)
            .then(function(response) {
                if (response.status !== 200) {
                    console.log('Looks like there was a problem. Status Code: ' + response.status);
                    return;
                }
                console.log('in fetch: ' + url);
            })
            .catch(function(err) {
                console.log('Fetch Error :-S', err);
            })
    ));
}
self.addEventListener('message', (event) => {
    // refresh cache when user comes back online
    if (event.data == 'is_online') {
        fillDynamicCache();
    } else if (event.data == 'is_updated') {
        self.skipWaiting();
    }
});
Typically, if you have important assets you want to provide to your users even when they are offline, you should consider an offline-first strategy, meaning you prefetch those resources while the service worker is installing.
This way the matching requests will be served from the cache, improving performance because you skip the corresponding network calls entirely.
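A minimal sketch of that precaching step with the plain service worker API (the cache name and the URL list are assumptions; the question's own important URLs would go in the array):

const PRECACHE = 'precache-v1';
const IMPORTANT_URLS = [
    '/homepage',
    '/someotherpage',
    '/thirdpage',
];

self.addEventListener('install', (event) => {
    // prefetch the important pages while the service worker installs,
    // so matching requests can be served from the cache even offline
    event.waitUntil(
        caches.open(PRECACHE).then((cache) => cache.addAll(IMPORTANT_URLS))
    );
});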
In case the target resources tend to update or change frequently on the server, you can opt for a stale-while-revalidate strategy (after the data is served from the cache, the SW updates its value with a newer one from the network, if available), or even network-first falling back to cache, the latter if you always want to provide the latest values and serve cached data only when the network connection times out or is unavailable.
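A sketch of the stale-while-revalidate variant in a plain fetch handler, under the same assumptions (serve the cached copy immediately, refresh it from the network in the background):

self.addEventListener('fetch', (event) => {
    event.respondWith(
        caches.open('cache').then(async (cache) => {
            const cached = await cache.match(event.request);
            // always start a network request so the cache gets refreshed
            const networkFetch = fetch(event.request)
                .then((res) => {
                    cache.put(event.request, res.clone());
                    return res;
                })
                .catch(() => cached); // offline: fall back to the cached copy
            // serve the cached response if we have one, otherwise wait for the network
            return cached || networkFetch;
        })
    );
});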
I wrote an article about service worker and caching strategies, in case you want to go deeper into the topic.
The application that I'm making loads 3 JSON files in order to get information about a game's characters, spells, and more. Currently I have 3 functions that use axios to make GET requests and then store the responses. However, I'm wondering if I'm hurting my load time, because frankly I'm not sure whether these JSON files are loaded simultaneously or one after another. Each file takes about 45 ms to load, so if they're loaded one after another I'm looking at around 135 ms of load time, and I'm not liking that.
So far I've tried 2 ways, but frankly I don't see a difference in the loading time in Chrome's network tab. If you're wondering, the functions live in my Vue.js Vuex store and the calls are executed in App.vue's mounted hook.
The first way uses 3 separate functions, each making its own GET request. These functions are then called one after another.
The call:
this.$store.dispatch('getChampions')
this.$store.dispatch('getSummonerSpells')
this.$store.dispatch('getSummonerRunes')
The functions:
getChampions({commit, state}){
    axios.get("https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/champion.json")
        .then((response) => {
            commit('champions', {
                champions: response.data.data
            })
        })
        .catch(function (error) {
            console.log(error);
        })
},
getSummonerSpells({commit, state}){
    axios.get("http://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/summoner.json")
        .then((response) => {
            commit('summonerSpells', {
                summonerSpells: response.data.data
            })
        })
        .catch(function (error) {
            console.log(error);
        })
},
getSummonerRunes({commit, state}){
    axios.get("https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/runesReforged.json")
        .then((response) => {
            commit('summonerRunes', {
                summonerRunes: response.data
            })
        })
        .catch(function (error) {
            console.log(error);
        })
}
Using the second way, I have one function, like this:
The call:
this.$store.dispatch('getRequirements')
The function:
getRequirements({commit, state}){
    axios.all([
        axios.get('https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/champion.json'),
        axios.get('http://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/summoner.json'),
        axios.get('https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/runesReforged.json')
    ])
    .then(axios.spread((response1, response2, response3) => {
        commit('champions', {
            champions: response1.data.data
        })
        commit('summonerSpells', {
            summonerSpells: response2.data.data
        })
        commit('summonerRunes', {
            summonerRunes: response3.data
        })
    }))
}
You're executing the requests in parallel, so your browser will attempt to execute them simultaneously. Whether or not it does so is up to the browser.
You can use your browser's Network console timing column (aka Waterfall in Chrome) to see what's going on.
If your question is
"is there a difference between these?"
the answer is "no" as far as timing goes.
If you start running into errors with any particular request, your first option is more robust, since axios.all (like Promise.all) rejects its promise as soon as any one request fails.
If you want to speed this up, you could create a service that combines the three results into one so you're only making a single request, then throw in a cache for an extra speed-up; for example:
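A sketch of that idea from the client side (the combined /api/requirements endpoint and the sessionStorage cache are assumptions, not part of the original setup):

getRequirements({ commit }) {
    const cached = sessionStorage.getItem('requirements');
    if (cached) {
        // cache hit: skip the network entirely
        const data = JSON.parse(cached);
        commit('champions', { champions: data.champions });
        commit('summonerSpells', { summonerSpells: data.summonerSpells });
        commit('summonerRunes', { summonerRunes: data.summonerRunes });
        return;
    }
    // hypothetical endpoint that bundles the three JSON files into one response
    return axios.get('/api/requirements')
        .then((response) => {
            sessionStorage.setItem('requirements', JSON.stringify(response.data));
            commit('champions', { champions: response.data.champions });
            commit('summonerSpells', { summonerSpells: response.data.summonerSpells });
            commit('summonerRunes', { summonerRunes: response.data.summonerRunes });
        })
        .catch((error) => {
            console.log(error);
        });
}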
When all requests are complete, you'll receive an array containing the response objects in the same order they were sent, and the .then() callback runs only after all of your requests have completed.
I've written a simple service worker to defer requests that fail for my JS application (following this example) and it works well.
But I still have a problem when requests succeed: the requests are made twice, once normally and once by the service worker, due to the fetch() call I guess.
It's a real problem, because when the client wants to save data, it gets saved twice...
Here is the code:
const queue = new workbox.backgroundSync.Queue('deferredRequestsQueue');

const requestsToDefer = [
    { urlPattern: /\/sf\/observation$/, method: 'POST' }
]

function isRequestAllowedToBeDeferred (request) {
    for (let i = 0; i < requestsToDefer.length; i++) {
        if (request.method && request.method.toLowerCase() === requestsToDefer[i].method.toLowerCase()
            && requestsToDefer[i].urlPattern.test(request.url)) {
            return true
        }
    }
    return false
}

self.addEventListener('fetch', (event) => {
    if (isRequestAllowedToBeDeferred(event.request)) {
        const requestClone = event.request.clone()
        const promiseChain = fetch(requestClone)
            .catch((err) => {
                console.log(`Request added to queue: ${event.request.url}`)
                queue.addRequest(event.request)
                event.respondWith(new Response({ deferred: true, request: requestClone }))
            })
        event.waitUntil(promiseChain)
    }
})
How can I do this properly?
EDIT:
I think I shouldn't re-fetch() the request (because THAT is the cause of the second request), but instead wait for the response of the initial request that triggered the fetchEvent; however, I have no idea how to do that. The fetchEvent seems to have no way to wait for (and read) the response.
Am I on the right track? How can I know when the request that triggered the fetchEvent has a response?
You're calling event.respondWith(...) asynchronously, inside of promiseChain.
You need to call event.respondWith() synchronously, during the initial execution of the fetch event handler. That's the "signal" to the service worker that it's your fetch handler, and not another registered fetch handler (or the browser default) that will provide the response to the incoming request.
(While you're calling event.waitUntil(promiseChain) synchronously during the initial execution, that doesn't actually do anything with regards to responding to the request—it just ensures that the service worker isn't automatically killed while promiseChain is executing.)
Taking a step back, I think you might have better luck accomplishing what you're trying to do if you use the workbox.backgroundSync.Plugin along with workbox.routing.registerRoute(), following the example from the docs:
workbox.routing.registerRoute(
    /\/sf\/observation$/,
    workbox.strategies.networkOnly({
        plugins: [new workbox.backgroundSync.Plugin('deferredRequestsQueue')]
    }),
    'POST'
);
That will tell Workbox to intercept any POST requests that match your RegExp, attempt to make those requests using the network, and if it fails, to automatically queue up and retry them via the Background Sync API.
Piggybacking on Jeff Posnick's answer: you need to call event.respondWith() and include the fetch() call inside its async function().
For example:
self.addEventListener('fetch', function(event) {
    if (isRequestAllowedToBeDeferred(event.request)) {
        event.respondWith(async function() {
            const promiseChain = fetch(event.request.clone())
                .catch(function(err) {
                    return queue.addRequest(event.request);
                });
            event.waitUntil(promiseChain);
            return promiseChain;
        }());
    }
});
This will avoid the issue you're having with the second AJAX call.