How to enable Service Worker only for HTML requests? - javascript

I am developing the code for a Service Worker using Cloudflare Workers (JS).
I want to fire the Service Worker only for HTML requests, so that I can optimize the number of requests being evaluated. Right now I am using this code:
addEventListener('fetch', async event => {
  if (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html')) {
    event.respondWith(handleRequest(event.request));
  }
});

async function handleRequest(request) {
  const response = await fetch(request);
  // Clone the response so that it's no longer immutable
  const newResponse = new Response(response.body, response);
  // Add a custom header with a value
  newResponse.headers.append('x-h-w', 'hello world');
  return newResponse;
}
While this only adds the custom header to the response for the HTML document, the worker still evaluates every request from the website (styles, images, scripts, etc.).
Is there a way to evaluate only HTML requests? (without consuming quota evaluating the other types of requests)

This is not possible, I fear. As soon as you add an event listener for the "fetch" event, you'll receive all events. But simply not invoking event.respondWith() is the right thing to do if you are not interested in a request.
Why would you be worried about the "quota"? You should not notice any performance impact, or can you really measure a difference?
As a tiny hint: you don't need to check the "accept" header, because usually only the initial navigation request is an HTML request, and that request has a special mode:
event.request.mode === 'navigate'
This should have even less performance overhead 🙃
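Putting that hint together with the code above, a minimal sketch (assuming the runtime populates request.mode the way a browser does for navigations):
addEventListener('fetch', event => {
  // Only navigation requests (the initial HTML document) are handled here;
  // everything else falls through to the default handling.
  if (event.request.mode === 'navigate') {
    event.respondWith(handleRequest(event.request));
  }
});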

Related

How to enable SharedArrayBuffer in Microsoft Edge Javascript

So the other day, I asked this question about JavaScript web workers:
Javascript Webworker how to put json information into array buffer. One of the answers I received was to use a SharedArrayBuffer to share memory between the main JavaScript thread and the web worker. I know that for a time this was usable in Microsoft Edge, but it was disabled a while back over a security concern. My Edge version is 96.0.1054.62. Is there any way to enable the use of shared array buffers, in the browser configuration or settings? Currently, when I try to use it, it says that SharedArrayBuffer is undefined.
In order for SharedArrayBuffer support to be enabled, your web page needs to be cross-origin isolated. To do this, you need your server to send the following headers: Cross-Origin-Opener-Policy: same-origin and Cross-Origin-Embedder-Policy: require-corp. You can read more about it on MDN.
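Purely as an illustration of that server-side route, here is what sending the two headers could look like with an Express middleware (Express is an assumption here, not something the answer requires):
const express = require('express');
const app = express();

// Attach the two headers needed for cross-origin isolation to every response.
app.use(function (req, res, next) {
  res.setHeader('Cross-Origin-Opener-Policy', 'same-origin');
  res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp');
  next();
});

app.use(express.static('public'));
app.listen(3000);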
Changing the headers on the server is the recommended way, but if you cannot manage headers on the server at all, you can modify them through a Service Worker. This blog post describes enabling SharedArrayBuffer via header modification in a Service Worker. It works in the following order:
When the page is loaded for the first time, a Service Worker is registered
The page is reloaded
SharedArrayBuffer becomes available because the Service Worker now controls the headers of every response
The Service Worker modifies all responses by adding the COOP/COEP headers (the example is taken from the mentioned blog post):
self.addEventListener("install", function() {
self.skipWaiting();
});
self.addEventListener("activate", (event) => {
event.waitUntil(self.clients.claim());
});
self.addEventListener("fetch", function(event) {
if (event.request.cache === "only-if-cached" && event.request.mode !== "same-origin") {
return;
}
event.respondWith(
fetch(event.request)
.then(function(response) {
// It seems like we only need to set the headers for index.html
// If you want to be on the safe side, comment this out
// if (!response.url.includes("index.html")) return response;
const newHeaders = new Headers(response.headers);
newHeaders.set("Cross-Origin-Embedder-Policy", "require-corp");
newHeaders.set("Cross-Origin-Opener-Policy", "same-origin");
const moddedResponse = new Response(response.body, {
status: response.status,
statusText: response.statusText,
headers: newHeaders,
});
return moddedResponse;
})
.catch(function(e) {
console.error(e);
})
);
});
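For completeness, a sketch of the page-side part of steps 1 and 2 (registering the worker and reloading once it takes control). The file name coi-sw.js is only an illustration, not something from the blog post:
if ('serviceWorker' in navigator && !navigator.serviceWorker.controller) {
  // Register the worker, then reload once so the document itself is served
  // through it and picks up the injected COOP/COEP headers.
  navigator.serviceWorker.register('coi-sw.js').then(function () {
    navigator.serviceWorker.ready.then(function () {
      window.location.reload();
    });
  });
}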

Modify POST request body in service worker

I am trying to add a parameter to the body of a POST request in a service worker, but only the original body is sent. I use the following code:
let token = '';

self.addEventListener('message', function (event) {
  if (event.data && event.data.type === 'SET_TOKEN') {
    token = event.data.token;
  }
});

self.addEventListener('fetch', function (event) {
  const destURL = new URL(event.request.url);
  const headers = new Headers(event.request.headers);
  if (token) headers.append('Authorization', token);
  if (destURL.pathname === '/logout/') {
    const promiseChain = event.request.json().then((originalBody) => {
      return fetch(event.request.url, {
        method: event.request.method,
        headers,
        // this body is not sent to the server, only the original body is
        body: JSON.stringify(Object.assign(originalBody, { token })),
      });
    });
    event.respondWith(promiseChain);
    return;
  }
  const authReq = new Request(event.request, {
    headers,
    mode: 'cors',
  });
  event.respondWith(fetch(authReq));
});
Generally speaking, that should work. Here's a very similar live example that you can run and confirm:
https://glitch.com/edit/#!/materialistic-meadow-rowboat?path=sw.js%3A18%3A7
It will just POST to https://httpbin.org/#/Anything/post_anything, which will in turn echo back the request body and headers.
If your code isn't working, I would suggest using that basic sample as a starting point and slowly customizing it with your own logic. Additionally, it would be a good idea to confirm that your service worker is properly in control of the client page when it makes that request. Using Chrome DevTools' debugger interface, you should be able to put breakpoints in your service worker's fetch event handler and confirm that everything is running as expected.
Taking a step back, you should make sure that your web app isn't coded in such a way that it requires the service worker to be in control in order to do things like expiring auth tokens. It's fine to have special logic in the service worker to account for auth, but make sure your code paths work similarly when the service worker doesn't intercept requests, as might be the case when a user force-reloads a web page by holding down the Shift key.
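As a small debugging aid along those lines, the page itself can report whether it is currently controlled; a minimal check:
// Logs whether a service worker currently controls this page.
// If the controller is null (e.g. right after a shift-reload),
// the fetch handler will not see this page's requests at all.
if ('serviceWorker' in navigator) {
  console.log('Controlled by a service worker:', navigator.serviceWorker.controller !== null);
}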

Should I return a promise in a JS Service Worker onFetch callback function if I don't want to do anything with the original request?

My scenario is the following:
I have a Progressive Web App that uses a Service Worker where I need to catch the request and do something with it every time the user requests a resource or leaves the current URL
I'm handling that through adding a callback to the fetch event of the worker
I only care about requested resources within our domain (e.g. example.com)
If the requested resource is within our domain I return the promise result from a regular fetch, so that's already covered
But, if the requested resource is outside my domain (as shown in the below snippet) I want the original request to just continue
I'm currently just doing a simple return if the scenario in bullet 5 is true
Snippet of my current code:
function onFetch(event) {
  if (!event.request.url.startsWith("example.com")) {
    return;
  } else {
    event.respondWith(
      fetch(event.request)
        .then(req => {
          // doing something with the request
        })
        .catch((error) => {
          // handle errors etc.
        })
        .finally(() => {
          // cleanup
        })
    );
  }
}

self.addEventListener('fetch', onFetch);
My question: Is it OK if I just return nothing like in the snippet, or, do I need to return something specific, like a new promise by fetching the original request (like I'm doing on the else block)?
Thanks!
It is absolutely okay to do what you're doing. Not calling event.respondWith() is a signal to the browser that a given fetch handler is not going to generate a response to a given request, and you can structure your code to return early to avoid calling event.respondWith().
You might have multiple fetch handlers registered, and if the first one returns without calling event.respondWith(), the next fetch handler will then get a chance to respond. If all of the fetch handlers have executed and none of them call event.respondWith(), the browser will automatically handle the request as if there were no service worker at all, which is what you want.
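For illustration, a small sketch of that hand-off between two registered fetch handlers (the URL patterns are made up):
self.addEventListener('fetch', event => {
  // First handler: only interested in API calls; otherwise it returns early.
  if (!event.request.url.includes('/api/')) return;
  event.respondWith(fetch(event.request));
});

self.addEventListener('fetch', event => {
  // Second handler: gets a chance to respond because the first one did not.
  if (event.request.url.includes('/api/')) return; // already handled above
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});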
In terms of observed behavior, not calling event.respondWith() at all ends up looking similar to what would happen if you called event.respondWith(fetch(event.request)). But there is overhead involved in making a fetch() request inside of a service worker and then passing the response body from the service worker thread back to the main thread, and you avoid that overhead if you don't call event.respondWith(). So, I'd recommend the approach you're taking.

How to fallback to browser's default fetch handling within event.respondWith()?

Within the service worker my fetch handler looks like this:
self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (response) {
      return response || fetch(event.request); // <-- is this the browser's default fetch handling?
    })
  );
});
The method event.respondWith() forces me to handle all requests myself, including XHR requests, which is not what I'd like to do. I only want cached resources to be returned if available and to let the browser handle the rest with its default fetch handling.
I have two issues with fetch(event.request):
Only when DevTools is open does it produce an error while fetching the initial URL that is visible in the address bar (https://test.de/x/#/page). It happens both on the initial install and on every reload:
Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'ServiceWorkerGlobalScope': 'only-if-cached' can be set only with 'same-origin' mode
and I don't understand why, because I am not setting anything like that.
It seems to violate the HTTP protocol because it tries to request a URL with an anchor inside:
Console: {"lineNumber":0, "message":"The FetchEvent for \"https://test.de/x/#/page\" resulted in a network error response: the promise was rejected.", "message_level":2, "sourceIdentifier":1, "sourceURL":""}
How does fetch() differ from the browser's default fetch handling and are those differences the cause for those errors?
Additional information and code:
My application also leverages the good old AppCache in parallel with the service worker (for backwards compatibility). I am not sure whether the AppCache interferes with the service worker installation on the initial page load. The rest of the code is pretty straightforward:
My index.html at https://test.de/x/#/page uses appcache and a base-href:
<html manifest="appcache" lang="de">
<head>
<base href="/x/"/>
</head>
...
Service Worker registration within the body script
window.addEventListener('load', function () {
  navigator.serviceWorker.register('/x/sw.js');
});
Install and activate event
let MY_CACHE_ID = 'myCache_v1';
let urlsToCache = ['js/main.js'];

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(MY_CACHE_ID)
      .then(function (cache) {
        return cache.addAll(
          urlsToCache.map(url => new Request(url, { credentials: 'include' }))
        );
      })
  );
});

self.addEventListener('activate', function (event) {
  // delete old caches
  let cacheWhitelist = [MY_CACHE_ID];
  event.waitUntil(
    caches.keys().then(function (cacheNames) {
      return Promise.all(
        cacheNames.map(function (cacheName) {
          if (cacheWhitelist.indexOf(cacheName) === -1) {
            return caches.delete(cacheName);
          }
        })
      );
    })
  );
});
fetch(event.request) should be really close to the default. (You can get the actual default by not calling respondWith() at all. The difference should mostly not be observable, but it is with CSP and some referrer bits.)
Given that, I'm not sure how you're ending up with issue 1. That should not be possible. Unfortunately, you haven't given enough information to debug what is going on.
As for issue 2, the browser passes the fragment on to the service worker, but it won't be included in the eventual network request. That matches how Fetch is defined, and it is done that way to give the service worker a bit of additional context that might be useful sometimes.
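If the fragment ever matters inside the worker, the URL API makes it easy to inspect or strip; a tiny sketch for illustration:
self.addEventListener('fetch', function (event) {
  const url = new URL(event.request.url);
  // The fragment is visible to the service worker (e.g. "#/page")...
  console.log('Fragment seen by the worker:', url.hash);
  // ...but it is never sent over the network. Removing it only matters
  // if you want to use the URL as a cache key, for example.
  url.hash = '';
  console.log('URL without the fragment:', url.href);
});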

Refresh page after load on cache-first Service Worker

I'm currently considering adding service workers to a Web app I'm building.
This app is, essentially, a collection manager. You can CRUD items of various types and they are usually tightly linked together (e.g. A hasMany B hasMany C).
sw-toolbox offers a toolbox.fastest handler which goes to the cache and the network in parallel (in 99% of cases the cache will be faster), updating the cache in the background. What I'm wondering is how you can be notified that a new version of the page is available. My intent is to show the cached version and then, if the network fetch returned a newer version, suggest that the user refresh the page to see the latest edits. I saw something in a YouTube video a while ago, but the presenter gave no clue about how to deal with this.
Is that possible? Is there some event handler or promise that I could bind to the request so that I know when the newer version is retrieved? I would then post a message to the page to show a notification.
If not, I know I can use toolbox.networkFirst along with a reasonable timeout to make the pages available even on Lie-Fi, but it's not as good.
I just stumbled across the Mozilla Service Worker Cookbook, which includes more or less what I wanted: https://serviceworke.rs/strategy-cache-update-and-refresh.html
Here are the relevant parts (not my code: copied here for convenience).
Fetch methods for the worker
// On fetch, use the cache but update the entry with the latest contents from the server.
self.addEventListener('fetch', function (evt) {
  console.log('The service worker is serving the asset.');
  // You can use respondWith() to answer ASAP…
  evt.respondWith(fromCache(evt.request));
  // ...and waitUntil() to prevent the worker from being killed until the cache is updated.
  evt.waitUntil(
    update(evt.request)
      // Finally, send a message to the client to inform it that the resource is up to date.
      .then(refresh)
  );
});
// Open the cache where the assets were stored and search for the requested resource.
// Notice that if there is no match, the promise still resolves, but with undefined as its value.
function fromCache(request) {
  return caches.open(CACHE).then(function (cache) {
    return cache.match(request);
  });
}

// Update consists of opening the cache, performing a network request and storing the new response data.
function update(request) {
  return caches.open(CACHE).then(function (cache) {
    return fetch(request).then(function (response) {
      return cache.put(request, response.clone()).then(function () {
        return response;
      });
    });
  });
}
// Sends a message to the clients.
function refresh(response) {
  return self.clients.matchAll().then(function (clients) {
    clients.forEach(function (client) {
      // Encode which resource has been updated. By including the ETag the client can check if the content has changed.
      var message = {
        type: 'refresh',
        url: response.url,
        // Notice not all servers return the ETag header. If this is not provided you should use other cache headers or rely on your own means to check if the content has changed.
        eTag: response.headers.get('ETag')
      };
      // Tell the client about the update.
      client.postMessage(JSON.stringify(message));
    });
  });
}
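Note that these snippets rely on a CACHE constant that the cookbook defines elsewhere; a minimal companion install step could look like this (the cache name and precache list are placeholders):
const CACHE = 'cache-and-update-v1';

// Pre-populate the cache so fromCache() has something to serve on the first load.
self.addEventListener('install', function (evt) {
  evt.waitUntil(
    caches.open(CACHE).then(function (cache) {
      return cache.addAll(['./', './index.html']);
    })
  );
});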
Handling of the "resource was updated" message
navigator.serviceWorker.onmessage = function (evt) {
  var message = JSON.parse(evt.data);
  var isRefresh = message.type === 'refresh';
  var isAsset = message.url.includes('asset');
  var lastETag = localStorage.currentETag;
  // The ETag header usually contains the hash of the resource, so it is a very effective way to check for fresh content.
  var isNew = lastETag !== message.eTag;
  if (isRefresh && isAsset && isNew) {
    // Skip the first time (when there is no ETag yet)
    if (lastETag) {
      // Inform the user about the update.
      notice.hidden = false;
    }
    // For teaching purposes: although this information is in the offline cache and could be retrieved from the service worker, keeping track of the header in localStorage keeps the implementation simple.
    localStorage.currentETag = message.eTag;
  }
};
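The notice variable above is a page element that the cookbook wires up separately; a hypothetical version of that banner and its refresh action might be:
// Hypothetical markup: <div id="update-notice" hidden>New version available <button>Reload</button></div>
var notice = document.getElementById('update-notice');
notice.querySelector('button').addEventListener('click', function () {
  window.location.reload();
});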
