I want to get the response headers of a cached response inside a service worker. The purpose is to read a custom header called 'Modified' and decide whether a fresh copy of the data needs to be fetched, by comparing it against the response headers of a 'HEAD' fetch for the same URL.
On install of the service worker, I populate a cache called v1::fundamentals with some responses. I then register a fetch listener which looks for the request in the cache and, if it's there, serves it. I then want to asynchronously update the cache with non-stale content, but only if the 'Modified' header contains a newer timestamp than the one in the cache. In the simplified code below, I try to access the headers with headers.get(), but I always get null in return. Why is this?
When I look at the cache in Chrome devtools, the headers are very much there, I just can't get to them from within the service worker.
self.addEventListener('fetch', event => {
  console.log('%c[SW] Fetch caught: ', 'color: #42d9f4', event.request.url);
  // Let the browser do its default thing for non-GET requests.
  if (event.request.method !== 'GET') {
    return;
  } else {
    // Prevent the default, and handle the request ourselves.
    event.respondWith(async function() {
      // Try to get the response from a cache.
      const cache = await caches.open('v1::fundamentals');
      const cachedResponse = await cache.match(event.request);
      if (cachedResponse) {
        // Try to get the headers
        var cacheDate = cachedResponse.headers.get('Modified');
        // Print the header; this logs 'null'
        console.log(cacheDate);
        event.waitUntil(cache.add(event.request));
        return cachedResponse;
      }
      return fetch(event.request);
    }());
  }
});
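For reference, the conditional refresh I'm ultimately aiming for would look roughly like this (a sketch only; refreshIfStale is a hypothetical helper, and it assumes the 'Modified' header is actually readable and holds a parseable timestamp):

async function refreshIfStale(cache, request, cachedResponse) {
  // Probe the server with a HEAD request for the same URL.
  const head = await fetch(request.url, { method: 'HEAD' });
  const cachedModified = cachedResponse.headers.get('Modified');
  const liveModified = head.headers.get('Modified');
  // Re-fetch and re-cache only when the server copy is newer.
  if (cachedModified && liveModified &&
      new Date(liveModified) > new Date(cachedModified)) {
    await cache.add(request);
  }
}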
I'm trying to get just a simple working example of this going, but I feel like I'm misunderstanding something.
My page is dynamically generated (Django), but all I want is to register a service worker to have a fallback page if the user is offline anywhere in the app. I'm testing this on http://localhost:8000, so maybe this is keeping it from working?
This is the sample I based my code on; I've copied it 99% verbatim, aside from the location of the offline HTML file. That file is correctly getting cached, so I can verify that part works:
https://googlechrome.github.io/samples/service-worker/custom-offline-page/
The SW is registered at the bottom of my HTML's body:
<script>
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/static/js/service-worker.js');
  }
</script>
For /static/js/service-worker.js:
const OFFLINE_VERSION = 1;
const CACHE_NAME = 'offline';
// Customize this with a different URL if needed.
const OFFLINE_URL = '/static/offline/offline.html';

self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    // Setting {cache: 'reload'} in the new request will ensure that the response
    // isn't fulfilled from the HTTP cache; i.e., it will be from the network.
    await cache.add(new Request(OFFLINE_URL, {cache: 'reload'}));
  })());
});

self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    // Enable navigation preload if it's supported.
    // See https://developers.google.com/web/updates/2017/02/navigation-preload
    if ('navigationPreload' in self.registration) {
      await self.registration.navigationPreload.enable();
    }
  })());
  // Tell the active service worker to take control of the page immediately.
  self.clients.claim();
});

self.addEventListener('fetch', (event) => {
  // We only want to call event.respondWith() if this is a navigation request
  // for an HTML page.
  if (event.request.mode === 'navigate') {
    event.respondWith((async () => {
      try {
        // First, try to use the navigation preload response if it's supported.
        const preloadResponse = await event.preloadResponse;
        if (preloadResponse) {
          return preloadResponse;
        }
        const networkResponse = await fetch(event.request);
        return networkResponse;
      } catch (error) {
        // catch is only triggered if an exception is thrown, which is likely
        // due to a network error.
        // If fetch() returns a valid HTTP response with a response code in
        // the 4xx or 5xx range, the catch() will NOT be called.
        console.log('Fetch failed; returning offline page instead.', error);
        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(OFFLINE_URL);
        return cachedResponse;
      }
    })());
  }
  // If our if() condition is false, then this fetch handler won't intercept the
  // request. If there are any other fetch handlers registered, they will get a
  // chance to call event.respondWith(). If no fetch handlers call
  // event.respondWith(), the request will be handled by the browser as if there
  // were no service worker involvement.
});
The worker successfully installs and activates. The offline.html page is successfully cached and I can verify this in Chrome Inspector -> Application -> Service Workers. I can also verify it's the correct service-worker.js file and not an old one.
If I switch Chrome to "Offline" and refresh the page, I still get the standard "No Internet" page. The "fetch" event also doesn't appear to fire on normal page loads: a "console.log" inside the handler never runs.
Is the sample code I'm using outdated? Is this a limitation of trying this on Localhost? What am I doing wrong? Thank you.
I am trying to set up my website to have a fallback page when it is loaded without an internet connection. To do that, I am following this guide on web.dev: "Create an offline fallback page"
I modified the example ServiceWorker in the article to fit my purposes, including being able to serve external CSS and images in the fallback offline page:
// Incrementing OFFLINE_VERSION will kick off the install event and force
// previously cached resources to be updated from the network.
const OFFLINE_VERSION = 1;
const CACHE_NAME = "offline";
// Customize this with a different URL if needed.
const OFFLINE_URL = "offline.html";

self.addEventListener("install", (event) => {
  event.waitUntil(
    (async () => {
      const cache = await caches.open(CACHE_NAME);
      // Setting {cache: 'reload'} in the new request will ensure that the response
      // isn't fulfilled from the HTTP cache; i.e., it will be from the network.
      await cache.add(new Request(OFFLINE_URL, { cache: "reload" }));
      await cache.add(new Request("offline.css", { cache: "reload" }));
      await cache.add(new Request("logo.png", { cache: "reload" }));
      await cache.add(new Request("unsupportedCloud.svg", { cache: "reload" }));
    })()
  );
});

self.addEventListener("activate", (event) => {
  // Tell the active service worker to take control of the page immediately.
  self.clients.claim();
});

self.addEventListener("fetch", (event) => {
  // We only want to call event.respondWith() if this is a navigation request
  // for an HTML page.
  if (event.request.mode === "navigate") {
    if (event.request.url.match(/SignOut/)) {
      return;
    }
    event.respondWith(
      (async () => {
        try {
          const networkResponse = await fetch(event.request);
          return networkResponse;
        } catch (error) {
          // catch is only triggered if an exception is thrown, which is likely
          // due to a network error.
          // If fetch() returns a valid HTTP response with a response code in
          // the 4xx or 5xx range, the catch() will NOT be called.
          console.log("Fetch failed; returning offline page instead.", error);
          const cache = await caches.open(CACHE_NAME);
          const cachedResponse = await cache.match(OFFLINE_URL);
          return cachedResponse;
        }
      })()
    );
  }
});
However, when the offline.html page loads, it is unable to load the images and the CSS; the images fail with a 404 error, and the request for the CSS doesn't even show up in the Network tab of the browser dev console.
I would expect the images and CSS to be fetched from the ServiceWorker cache, but it seems that neither is.
Am I missing something on how ServiceWorkers cache requests or how they fetch them? Or on how to design the offline fallback page to work?
Turns out there were a few reasons why the assets were not being found.
The first reason is that when the assets were saved to the cache, they were saved under the entire path where they are stored alongside the Service Worker file.
So the path that was saved was something along the lines of static/PWA/[offline.css, logo.png, unsupportedCloud.svg] but the path of the page that requested them was in the root. In offline.html I had to reference them as such: <img src="static/PWA/unsupportedCloud.svg" class="unsupported-cloud" />.
The second reason is that the Service Worker only handled fetch events that were of type navigation. In my example you can see I had written if (event.request.mode === "navigate") {...}, so the cache we set up was only consulted for navigation events, which would never catch the fetch events for assets. To fix that, I added a new check for no-cors event modes: else if (event.request.mode === "no-cors") {...}.
These two fixes let me get assets from the offline cache that I set up on Service Worker installation. With some other minor fixes, this addresses my question!
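The resulting fetch handler looked roughly like this (a sketch reconstructed from the description above; CACHE_NAME is the "offline" cache from the question):

self.addEventListener("fetch", (event) => {
  if (event.request.mode === "navigate") {
    // ... navigation handling exactly as in the question ...
  } else if (event.request.mode === "no-cors") {
    // Serve asset requests (CSS, images) cache-first, falling back to the network.
    event.respondWith(
      (async () => {
        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(event.request);
        return cachedResponse || fetch(event.request);
      })()
    );
  }
});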
I have a service worker that is caching requests from the browser, so the page works offline. However, each time the user logs out and back in, a new CSRF token is generated and all previous cached data is useless since the requests contain the CSRF token as part of the querystring. This necessitates re-caching all the same data again, so we're left with multiple copies of the data in the cache, each copy simply having a different request URL due to the different CSRF tokens.
I'm querying the network-first, then failing over to cache if the network is unavailable.
How should I handle the caching of these responses in relation to the CSRF token? Should I manually remove the CSRF token from the event.request value before calling cache.put() and cache.match()? Is that even permitted? By modifying the request URL, it seems the previously cached value for that request could still be returned even after the user has logged out and back in, which would be the desired behavior.
Also, how can I remove all cached requests that don't match the current CSRF token, without purging all entries from the cache?
Here's the pertinent Service Worker code:
// 'fetch' event listener: if the network is UP, fetch the data across the network
// and cache the result. If the network is unavailable, attempt to fetch from cache.
self.addEventListener('fetch', function(event) {
  // Send a response, first by trying the network, then by looking in the cache.
  // If both fail, an error occurs.
  event.respondWith(
    // Try to fetch the request from the network:
    fetch(event.request)
      // If successful, cache a clone of the response, then return it.
      .then(function(response) {
        var r = response.clone();
        caches.open('offline-cache')
          .then(function(cache) {
            cache.put(event.request, r);
          })
          .catch(function(error) {
            console.log("Unable to cache item: ", error);
          });
        return response;
      })
      // If the network fails, try to pull the item from the cache.
      .catch(function(error) {
        // Open the cache
        return caches.open('offline-cache')
          // When the cache is open, attempt to match the desired request.
          .then(function(cache) {
            // If successful, return the match. Errors bubble up to the main event.
            return cache.match(event.request);
          })
          .catch(function(error) {
            console.log("Cached entry not found. Error.");
          });
      })
  ); // END event.respondWith
});
If resources to be cached are identical for all requests/users and can be identified by removing certain URL parameters (or removing the query string entirely), your code can generate a cache key URL and use that String instead of Request objects for the Fetch and Cache APIs' methods:
// Returns the URL string used for caching, which excludes authentication-specific URL params, etc.
function getCacheKeyUrl(request) {
  // Strip the query string from the URL
  return request.url.replace(/\?.*/, '');
}

self.addEventListener('fetch', event => {
  const cacheKeyUrl = getCacheKeyUrl(event.request);
  ...
});
Then you can use fetch(cacheKeyUrl), cache.put(cacheKeyUrl, r), cache.match(cacheKeyUrl), etc in your code.
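For illustration, the network-first handler from the question might then look roughly like this (a sketch; 'offline-cache' is the cache name from the question):

self.addEventListener('fetch', function(event) {
  var cacheKeyUrl = getCacheKeyUrl(event.request);
  event.respondWith(
    // Network first: send the real request, but cache the result under the key URL.
    fetch(event.request)
      .then(function(response) {
        var r = response.clone();
        caches.open('offline-cache').then(function(cache) {
          cache.put(cacheKeyUrl, r);
        });
        return response;
      })
      // On network failure, look the key URL up in the cache instead.
      .catch(function() {
        return caches.open('offline-cache').then(function(cache) {
          return cache.match(cacheKeyUrl);
        });
      })
  );
});

Since entries are keyed without the token, each new login simply overwrites the previous entry for a given URL rather than adding a duplicate, which also sidesteps the purging problem: there is only ever one copy per URL.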
I have a website for which I don't want to make people create accounts. It is a news feed with each news article categorized. I want to allow people to tag the categories they are interested in, so that the next time they visit the site it only shows news for the tagged categories.
I'm saving the tags in IndexedDB, which I understand is available in a service worker.
Hence, in my service worker I want to "intercept" requests to www.my-url.com, check IndexedDB for which categories this person is interested in, and add a header like 'x-my-custom-header': 'technology,physics,sports' so that my server can respond with dynamic HTML for those categories only.
However, I'm struggling to get the service worker to properly cache my root response. In my serviceworker.js, I console.log every event.request in the onFetch handler; there are no requests related to my root URL. I'm testing on localhost right now, and I only see fetch requests for CSS & JS files.
Here is my onFetch:
function onFetch(event) {
  console.log('onFetch', event.request.url);
  event.request.headers["X-my-custom-header"] = "technology,sports";
  event.respondWith(
    // try to return the untouched request from the network first
    fetch(event.request).catch(function() {
      // if it fails, try to return the request from the cache
      return caches.match(event.request).then(function(response) {
        if (response) {
          return response;
        }
        // if not found in the cache, return default offline content for navigation requests
        if (event.request.mode === 'navigate' ||
            (event.request.method === 'GET' && event.request.headers.get('accept').includes('text/html'))) {
          return caches.match('/offline.html');
        }
      });
    })
  );
}
I'm using Rails, so there is no index.html file to be cached; when a user hits my URL, the page is dynamically served from my news#controller.
I'm actually using the serviceworker-rails gem.
What am I doing wrong? How can I have my service worker save a root file and intercept the request to add headers? Is this even possible?
Credit here goes to Jeff Posnick for his answer on constructing a new Request. You'll need to respond with a fetch that creates a new Request to which you can add headers:
self.addEventListener('fetch', event => {
  event.respondWith(customHeaderRequestFetch(event));
});

function customHeaderRequestFetch(event) {
  // decide for yourself which values you provide to mode and credentials
  // Copy the existing headers
  const headers = new Headers(event.request.headers);
  // Set a new header
  headers.set('x-my-custom-header', 'The Most Amazing Header Ever');
  // Delete a header
  headers.delete('x-request');
  const newRequest = new Request(event.request, {
    mode: 'cors',
    credentials: 'omit',
    headers: headers
  });
  return fetch(newRequest);
}
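Constructing a fresh Request is necessary because a Request's headers are immutable: an assignment like event.request.headers["X-my-custom-header"] = "...", as in the question, just sets a plain JavaScript property on the Headers object and never touches the actual HTTP headers.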
Hi, I have a simple service worker, but my Range header is not being sent, even though I can log it on the request object.
self.addEventListener('fetch', function(event) {
  if (event.request.headers.get("range")) {
    event.respondWith(
      fetch(event.request.clone())
        .then((response) => getPartialResponse(event.request, response))
    );
    return;
  }
  event.respondWith(fetch(event.request));
});

async function getPartialResponse(req, res) {
  // Parse the start offset out of a "bytes=N-" Range header.
  const pos = Number(/^bytes=(\d+)-$/g.exec(req.headers.get("range"))[1]);
  const ab = await res.arrayBuffer();
  const headers = new Headers(res.headers);
  headers.set("Content-Range", `bytes ${pos}-${ab.byteLength - 1}/${ab.byteLength}`);
  headers.set("Content-Length", String(ab.byteLength - pos));
  return new Response(ab.slice(pos), {
    status: 206,
    statusText: "Partial Content",
    headers
  });
}
Here you can see the request caught by the service worker and the second one sent to the API, where you'll notice the Range header is missing. Why? My browser: Chrome/59.0.3071.104
Caveat: I don't know what a Range header is, so bear with me.
I know some headers are not available to the service worker for security reasons; basically, you cannot tamper with the request and response. For example, you do not have access to Cache-Control in response objects from third-party domains, so you cannot adjust the HTTP caching logic. This might be your issue, but I can't say for sure.
Look up the rules concerning opaque requests & responses.
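For what it's worth, you can see this restriction directly by inspecting a no-cors response (a sketch; the URL is a placeholder):

// Fetching a cross-origin resource without CORS yields an "opaque" response:
// the status reads as 0 and the headers are hidden from the service worker.
fetch('https://third-party.example/resource', { mode: 'no-cors' })
  .then(function(response) {
    console.log(response.type);                         // "opaque"
    console.log(response.status);                       // 0
    console.log(response.headers.get('Cache-Control')); // null
  });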