Requests through service-worker are done twice - javascript

I've written a simple service worker to defer requests that fail for my JS application (following this example) and it works well.
But I still have a problem when requests succeed: the requests are made twice. Once normally, and once by the service worker, due to the fetch() call I guess.
It's a real problem, because when the client wants to save data, it gets saved twice...
Here is the code:
const queue = new workbox.backgroundSync.Queue('deferredRequestsQueue');

const requestsToDefer = [
  { urlPattern: /\/sf\/observation$/, method: 'POST' }
]

function isRequestAllowedToBeDeferred (request) {
  for (let i = 0; i < requestsToDefer.length; i++) {
    if (request.method && request.method.toLowerCase() === requestsToDefer[i].method.toLowerCase()
        && requestsToDefer[i].urlPattern.test(request.url)) {
      return true
    }
  }
  return false
}
self.addEventListener('fetch', (event) => {
  if (isRequestAllowedToBeDeferred(event.request)) {
    const requestClone = event.request.clone()
    const promiseChain = fetch(requestClone)
      .catch((err) => {
        console.log(`Request added to queue: ${event.request.url}`)
        queue.addRequest(event.request)
        event.respondWith(new Response({ deferred: true, request: requestClone }))
      })
    event.waitUntil(promiseChain)
  }
})
How can I do this properly?
EDIT:
I think I shouldn't re-fetch() the request (because THAT is the cause of the second request), and should instead wait for the response of the initial request that triggered the fetchEvent, but I have no idea how to do that. The fetchEvent seems to offer no way to wait for (and read) the response.
Am I on the right track? How can I know when the request that triggered the fetchEvent has received a response?

You're calling event.respondWith(...) asynchronously, inside of promiseChain.
You need to call event.respondWith() synchronously, during the initial execution of the fetch event handler. That's the "signal" to the service worker that it's your fetch handler, and not another registered fetch handler (or the browser default) that will provide the response to the incoming request.
(While you're calling event.waitUntil(promiseChain) synchronously during the initial execution, that doesn't actually do anything with regards to responding to the request—it just ensures that the service worker isn't automatically killed while promiseChain is executing.)
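For illustration, here's a minimal sketch of that pattern, reusing your queue and isRequestAllowedToBeDeferred(); the synthetic JSON body is an assumption about what your client can handle:

self.addEventListener('fetch', (event) => {
  if (isRequestAllowedToBeDeferred(event.request)) {
    // respondWith() is called synchronously; the promise passed to it
    // settles later with the network response or a fallback.
    event.respondWith(
      fetch(event.request.clone())
        .catch(() => {
          // The network failed: queue the original request for background
          // replay and answer with a synthetic JSON response instead.
          return queue.addRequest(event.request).then(() => {
            return new Response(JSON.stringify({ deferred: true }), {
              headers: { 'Content-Type': 'application/json' }
            });
          });
        })
    );
  }
});

Note that only one fetch() is issued per request here, so nothing gets saved twice.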
Taking a step back, I think you might have better luck accomplishing what you're trying to do if you use the workbox.backgroundSync.Plugin along with workbox.routing.registerRoute(), following the example from the docs:
workbox.routing.registerRoute(
  /\/sf\/observation$/,
  workbox.strategies.networkOnly({
    plugins: [new workbox.backgroundSync.Plugin('deferredRequestsQueue')]
  }),
  'POST'
);
That will tell Workbox to intercept any POST requests that match your RegExp, attempt to make those requests using the network, and if it fails, to automatically queue up and retry them via the Background Sync API.
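Note that the exact names depend on the Workbox version you're loading; in more recent releases (v5+), the same registration looks roughly like this:

workbox.routing.registerRoute(
  /\/sf\/observation$/,
  new workbox.strategies.NetworkOnly({
    plugins: [new workbox.backgroundSync.BackgroundSyncPlugin('deferredRequestsQueue')]
  }),
  'POST'
);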

Piggybacking on Jeff Posnick's answer: you need to call event.respondWith() and include the fetch() call inside its async function.
For example:
self.addEventListener('fetch', function(event) {
  if (isRequestAllowedToBeDeferred(event.request)) {
    event.respondWith(async function() {
      const promiseChain = fetch(event.request.clone())
        .catch(function(err) {
          return queue.addRequest(event.request);
        });
      event.waitUntil(promiseChain);
      return promiseChain;
    }());
  }
});
This will avoid the issue you're having with the second ajax call.

Related

How to enable Service Worker only for HTML requests?

I am developing the code for a Service Worker using Cloudflare Workers (JS).
I want to fire the Service Worker only for HTML requests, so that I can optimize the number of requests being evaluated. Right now I am using this code:
addEventListener('fetch', async event => {
  if (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html')) {
    event.respondWith(handleRequest(event.request));
  }
});
async function handleRequest(request) {
  const response = await fetch(request);
  // Clone the response so that it's no longer immutable
  const newResponse = new Response(response.body, response);
  // Add a custom header with a value
  newResponse.headers.append('x-h-w', 'hello world');
  return newResponse;
}
While it only adds the custom header to the response for the HTML document, the Service Worker still evaluates every request from the website (styles, images, scripts, etc.).
Is there a way to evaluate only HTML requests? (without consuming quota evaluating the other type of requests)
This is not possible, I fear. As soon as you add an event listener on the "fetch" event, you'll receive all events. But just not invoking the event.respondWith is the right thing to do if you are not interested.
Why would you be worried about the "quota"? You should not notice the performance impact or can you really measure any difference?
As a tiny hint: You don't need to check the "accept" header, because only the initial request is usually an HTML request and this request has some special mode:
event.request.mode === 'navigate'
This should be even less performance overhead 🙃
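For illustration, a minimal sketch of that check (handleRequest() is the function from your question):

addEventListener('fetch', event => {
  // Only intercept top-level page loads; for anything else we never call
  // event.respondWith(), so the request passes through untouched.
  if (event.request.mode === 'navigate') {
    event.respondWith(handleRequest(event.request));
  }
});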

Angular, how to make consecutive, dependent http get requests

I'd like to execute one http get request after another has completed. The endpoint URL for the second request will need to depend on the first request.
I've tried nesting the requests with something like this in Angular:
this.http.get(endpoint1).subscribe(
  success => {
    this.http.get(endpoint2 + success).subscribe(
      anotherSuccess => console.log('hello stackoverflow!')
    );
  }
);
First question here on stackoverflow, please let me know if I should provide more detail.
Here you can find how to do that; you have various options.
With subscribe:
this.http.get('/api/people/1').subscribe(character => {
  this.http.get(character.homeworld).subscribe(homeworld => {
    character.homeworld = homeworld;
    this.loadedCharacter = character;
  });
});
With mergeMap (note that mergeMap must be imported):

import { mergeMap } from 'rxjs/operators';

this.homeworld = this.http
  .get('/api/people/1')
  .pipe(mergeMap(character => this.http.get(character.homeworld)));
There is also a version with forkJoin, but it isn't as explicit as subscribe or mergeMap.
You should probably try using ngrx (redux), triggering the second request based on a success action.
The flow should be something like this: dispatch an action from a component -> call the first request -> the request triggers a success action -> success effect triggers the second request with a payload from the previous request.
Read the docs here
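For illustration, a rough sketch of that flow with @ngrx/effects might look like this (all action names, endpoints, and the actions file are hypothetical, not taken from your code):

// request.effects.ts -- hypothetical names throughout
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Actions, createEffect, ofType } from '@ngrx/effects';
import { map, mergeMap } from 'rxjs/operators';
import { loadFirst, loadFirstSuccess, loadSecondSuccess } from './actions';

@Injectable()
export class RequestEffects {
  // First request: runs when a component dispatches loadFirst.
  loadFirst$ = createEffect(() =>
    this.actions$.pipe(
      ofType(loadFirst),
      mergeMap(() =>
        this.http.get<string>('/api/endpoint1').pipe(
          map(result => loadFirstSuccess({ result }))
        )
      )
    )
  );

  // Second request: runs on the success action of the first and uses
  // its payload to build the dependent URL.
  loadSecond$ = createEffect(() =>
    this.actions$.pipe(
      ofType(loadFirstSuccess),
      mergeMap(({ result }) =>
        this.http.get('/api/endpoint2/' + result).pipe(
          map(data => loadSecondSuccess({ data }))
        )
      )
    )
  );

  constructor(private actions$: Actions, private http: HttpClient) {}
}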

How to properly prefetch a json endpoint in Chrome?

I am trying to speed up the network critical path on a website, and found out about the great <link rel="preload">. So I try to anticipate the call that my single-page application makes as soon as the JS kicks in, and I have put this in my index.html:
<link rel="preload" href="/api/searchItems" as="fetch" />
Then as the JS starts I make the same call with the help of the axios library:
await axios.get(`/api/searchItems`, { params: queryParams });
I would expect the Axios call to return the preloaded JSON file instantly, but instead I see this:
As you can see, the same call is loaded twice.
What am I doing wrong?
EDIT: I have added cache-control: public and nothing changes.
EDIT2: I also tried this code instead of axios:
let data = await fetch('/api/searchItems')
  .then(response => {
    if (response.ok) {
      return response.json();
    }
    throw new Error('HTTP error ' + response.status);
  })
  .catch(() => {
    data = null; // Just clear it and if it errors again when
                 // you make the call later, handle it then
  });

And nothing changed.
Three options for you:

1. It looks like your response has headers making it uncacheable for some reason. You may be able to fix things so it's cacheable.
2. Use a service worker.
3. Another approach, if this is really critical path, is to have some inline JavaScript that actually does the call, and modify the code that will do the call later to check whether the previous result is available, like this:
let firstLoad = fetch("/api/searchItems")
.then(response => {
if (response.ok) {
return response.json();
}
throw new Error("HTTP error " + response.status);
})
.catch(() => {
firstLoad = null; // Just clear it and if it errors again when
// you make the call later, handle it then
});
(I'm using fetch there because you may want to do it before you've loaded axios.)
Then in the code that wants this data:
(firstLoad || axios.get("/api/searchItems").then(response => response.data))
.then(/*...*/)
.catch(/*...*/);
firstLoad = null;
If the content requires revalidation (and you're using no-cache, so it does¹), #2 and #3 have the advantage of not requiring a second request to the server.
¹ From MDN:
no-cache
The response may be stored by any cache, even if the response is normally non-cacheable. However, the stored response MUST always go through validation with the origin server first before using it...
(my emphasis)
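If you go with option 1, the fix happens server-side: the JSON needs a Cache-Control header that allows reuse without revalidation. A minimal sketch, assuming a hypothetical Express backend (your server isn't shown in the question):

// Hypothetical Express route; pick a max-age matching how stale
// the search data is allowed to be.
app.get('/api/searchItems', (req, res) => {
  // 'private' keeps shared caches out of it; 'max-age=60' lets the
  // browser reuse the preloaded response for 60 seconds without
  // revalidating against the server.
  res.set('Cache-Control', 'private, max-age=60');
  res.json(searchItems); // searchItems: whatever your endpoint returns
});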

How to fallback to browser's default fetch handling within event.respondWith()?

Within the service worker my fetch handler looks like this:
self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (response) {
      return response || fetch(event.request); // <-- is this the browser's default fetch handling?
    })
  );
});
The method event.respondWith() forces me to handle all requests myself, including XHR requests, which is not what I'd like to do. I only want cached resources to be returned if available, and let the browser handle the rest using its default fetch handling.
I have two issues with fetch(event.request):
Only when devtools is open, it produces an error while fetching the initial URL shown in the address bar (https://test.de/x/#/page). It happens both on the initial install and on every reload:

Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'ServiceWorkerGlobalScope': 'only-if-cached' can be set only with 'same-origin' mode

and I don't understand why, because I am not setting anything.
It seems to violate the HTTP protocol, because it tries to request a URL with a fragment in it:

Console: {"lineNumber":0, "message":"The FetchEvent for \"https://test.de/x/#/page\" resulted in a network error response: the promise was rejected.", "message_level":2, "sourceIdentifier":1, "sourceURL":""}
How does fetch() differ from the browser's default fetch handling and are those differences the cause for those errors?
Additional information and code:
My application also leverages the good old AppCache in parallel with the service worker (for backwards compatibility). I am not sure if the AppCache interferes with the service worker installation on the initial page load. The rest of the code is pretty straightforward:
My index.html at https://test.de/x/#/page uses appcache and a base-href:
<html manifest="appcache" lang="de">
<head>
  <base href="/x/"/>
</head>
...
Service worker registration within the body script:

window.addEventListener('load', function () {
  navigator.serviceWorker.register('/x/sw.js')
});
Install and activate events:

let MY_CACHE_ID = 'myCache_v1';
let urlsToCache = ['js/main.js'];

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(MY_CACHE_ID)
      .then(function (cache) {
        return cache.addAll(
          urlsToCache.map(url => new Request(url, { credentials: 'include' }))
        )
      })
  );
});
self.addEventListener('activate', function (event) {
  // delete old caches
  let cacheWhitelist = [MY_CACHE_ID];
  event.waitUntil(
    caches.keys().then(function (cacheNames) {
      return Promise.all(
        cacheNames.map(function (cacheName) {
          if (cacheWhitelist.indexOf(cacheName) === -1) {
            return caches.delete(cacheName);
          }
        })
      );
    })
  );
});
fetch(event.request) should be really close to the default. (You can get the actual default by not calling respondWith() at all. The difference should mostly not be observable, but it is with CSP and some referrer bits.)
Given that, I'm not sure how you're ending up with 1. That should not be possible. Unfortunately, you've not given enough information to debug what is going on.
As for 2, it passes the fragment on to the service worker, but that won't be included in the eventual network request. That matches how Fetch is defined and is done that way to give the service worker a bit of additional context that might be useful sometimes.
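If the goal is to only special-case some requests, the simplest way to get the true default behavior is to return from the handler without calling event.respondWith() at all. A minimal sketch (shouldHandle() is a hypothetical predicate standing in for your own logic):

self.addEventListener('fetch', function (event) {
  if (!shouldHandle(event.request)) {
    // No respondWith() call: the browser performs its default fetch
    // handling for this request, exactly as if no handler had run.
    return;
  }
  event.respondWith(
    caches.match(event.request).then(function (response) {
      return response || fetch(event.request);
    })
  );
});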

Use ServiceWorker cache only when offline

I'm trying to integrate service workers into my app, but I've found the service worker tries to retrieve cached content even when online, but I want it to prefer the network in these situations. How can I do this? Below is the code I have now, but I don't believe it is working. SW Install code is omitted for brevity.
var CACHE_NAME = 'my-cache-v1';
var urlsToCache = [
  /* my cached file list */
];

self.addEventListener('install', function(event) {
  // Perform install steps
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function(cache) {
        console.log('Opened cache');
        return cache.addAll(urlsToCache);
      })
  );
});
/* request is being made */
self.addEventListener('fetch', function(event) {
  event.respondWith(
    // first try to run the request normally
    fetch(event.request).catch(function() {
      // catch errors by attempting to match in cache
      return caches.match(event.request).then(function(response) {
        // Cache hit - return response
        if (response) {
          return response;
        }
      });
    })
  );
});
This seems to lead to warnings like The FetchEvent for "[url]" resulted in a network error response: an object that was not a Response was passed to respondWith(). I'm new to service workers, so apologies for any mistaken terminology or bad practices, would welcome any tips. Thank you!
Without testing this out, my guess is that you're not resolving respondWith() correctly in the case where there is no cache match. According to MDN, the code passed to respondWith() is supposed to "resolve by returning a Response or network error to Fetch." So why not try just doing this:
self.addEventListener('fetch', function(event) {
  event.respondWith(
    fetch(event.request).catch(function() {
      return caches.match(event.request);
    })
  );
});
Why don't you open the cache for your fetch event?
I think the process of a service worker is:

1. Open your cache
2. Check if the request matches an answer in your cache
3. Then you answer

OR (if the answer is not in the cache):

1. Fetch the request via the network
2. Clone the answer from the network
3. Put the request and the clone of the answer in your cache for future use

I would write:
self.addEventListener('fetch', event => {
  event.respondWith(
    caches.open(CACHE_NAME).then(cache => {
      return cache.match(event.request).then(response => {
        return response || fetch(event.request)
          .then(response => {
            // Cache a clone for future use and return the original
            // network response to the page.
            const responseClone = response.clone();
            cache.put(event.request, responseClone);
            return response;
          })
      })
    })
  );
});
event.respondWith() expects a promise that resolves to Response. So in case of a cache miss, you still need to return a Response, but above, you are returning nothing. I'd also try to use the cache first, then fetch, but in any case, as the last resort, you can always create a synthetic Response, for example something like this:
return new Response("Network error happened", {"status" : 408, "headers" : {"Content-Type" : "text/plain"}});
