I have developed a JavaScript application with a custom service worker and am facing a problem fetching the service worker file behind authentication.
My application is deployed behind cookie-based authentication (after login, the user has a cookie which is checked on every request). This is fine for fetching all resources, except the service worker file, for which the browser doesn't send any cookies. Note: this only happens if the service worker is loaded with {type: "module"}, and works fine with {type: "classic"}.
Here is a minimal example reproducing my issue:
backend: index.js
// Minimal Express setup; cookie-parser is needed for req.cookies
const express = require('express');
const cookieParser = require('cookie-parser');
const app = express();
app.use(cookieParser());

// Service worker file
app.get('/test.js', (req, res) => {
  if (req.cookies.token != null) {
    res.sendFile('./public/test.js', { root: __dirname });
  } else {
    res.sendStatus(401);
  }
});

// Index page
app.get('/', (req, res) => {
  // initial login check goes here
  res.cookie('token', "<user token>").sendFile('./public/index.html', { root: __dirname });
});
frontend: index.html
<body>
  <h1>Test</h1>

  <!-- Doesn't work with the auth -->
  <script type="module">
    const registration = await navigator.serviceWorker.register('test.js', { type: "module" });
    console.log(registration);
  </script>

  <!-- Works with the auth -->
  <script src="test2.js"></script>

  <!-- Works with the auth -->
  <script type="module">
    const registration = await navigator.serviceWorker.register('test.js', { type: "classic" });
    console.log(registration);
  </script>
</body>
When the request is made, the browser doesn't attach any cookie information.
Why doesn't the browser treat the service worker module request like every other request? Why is there a difference between the two types?
Related
Basically, I have an online app that uses an .htaccess file to silently redirect all requests in a given /folder/ to the same HTML file. Then, to decide what to show the user, the page calls
var page_name = location.href.split('/').pop();
This works well online, but could I use a ServiceWorker to support this folder/file model while the page is offline? Or will I always get a "page cannot be found" error unless I explicitly cache the URLs?
What you describe can be accomplished using the App Shell model.
Your service worker's exact code might look a little different, and tools like Workbox can automate some of this for you, but a very basic, "vanilla" example of a service worker that accomplishes this is:
self.addEventListener('install', (event) => {
  const cacheShell = async () => {
    const cache = await caches.open('my-cache');
    await cache.add('/shell.html');
  };
  event.waitUntil(cacheShell());
});

self.addEventListener('fetch', (event) => {
  // If this is a navigation request...
  if (event.request.mode === 'navigate') {
    // ...respond with the cached shell HTML.
    event.respondWith(caches.match('/shell.html'));
    return;
  }

  // Any other caching/response logic can go here.
});
Regardless of what the location.href value is, when this service worker is in control, the App Shell HTML will be used to fulfill all navigation requests.
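For completeness, here is a minimal page-side registration sketch; the '/sw.js' path is only an assumption, so use whatever path the worker file is actually served from:
// Hypothetical registration on the page; '/sw.js' is a placeholder path.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then((registration) => console.log('App Shell worker registered with scope:', registration.scope))
    .catch((err) => console.error('Service worker registration failed:', err));
}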
I'm using a service worker from PWABuilder for my website https://digimoncard.io/.
The cache-first network service worker JS file contains the following code:
// This is the service worker with the Cache-first network
const CACHE = "pwabuilder-precache";

importScripts('https://storage.googleapis.com/workbox-cdn/releases/5.0.0/workbox-sw.js');

self.addEventListener("message", (event) => {
  if (event.data && event.data.type === "SKIP_WAITING") {
    self.skipWaiting();
  }
});

workbox.routing.registerRoute(
  new RegExp('/*'),
  new workbox.strategies.CacheFirst({
    cacheName: CACHE
  })
);
I then have the following code in my index.php file under the body:
<!-- PWA -->
<script type="module">
  import 'https://cdn.jsdelivr.net/npm/@pwabuilder/pwaupdate';

  const el = document.createElement('pwa-update');
  document.body.appendChild(el);
</script>
<!-- END PWA -->
The service worker never seems to update. No matter what I change on any page (content, file versioning, etc.), the service worker won't update if it has already been cached. I can manually fix this by clearing the browser cache, but I'm either missing something or this is intended? For example, the version I visited on my phone has had out-of-date content for 2 days now.
That is the intended behavior when using a cache-first strategy. Assuming there's a match in the cache, that's the response that will be used.
If you're looking for a "use the cached response if present, but also update it in the background" approach, you can switch to the stale-while-revalidate strategy.
The full list of strategies supported by Workbox out of the box can be found at https://developers.google.com/web/tools/workbox/modules/workbox-strategies
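As a rough sketch, assuming the same Workbox 5 setup loaded via importScripts above, the route registration would change to something like:
workbox.routing.registerRoute(
  new RegExp('/*'),
  new workbox.strategies.StaleWhileRevalidate({
    cacheName: CACHE
  })
);
With this strategy, the cached response is served immediately while an updated copy is fetched in the background for the next visit.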
I have a React app created using create-react-app. By default, this tool creates a serviceWorker.js file and I am using it to register a service worker. Furthermore, the docs suggest using Google's Workbox wizard to create a service-worker.js used to manage my website for offline purposes. The goal is to store an offline.html page in the browser's cache and, whenever there is no connection, render the cached offline.html page.
I am successfully storing offline.html in the cache, and it shows up among the precached URLs.
I can also manually navigate to offline.html if I change the URL in my browser.
However, I am having trouble automatically grabbing this file and rendering it whenever there isn't a connection.
In the serviceWorker.js code that is generated for me by CRA, there's a function called checkValidServiceWorker:
function checkValidServiceWorker(swUrl, config) {
  // Check if the service worker can be found. If it can't, reload the page.
  fetch(swUrl)
    .then(response => {
      // Ensure service worker exists, and that we really are getting a JS file.
      const contentType = response.headers.get('content-type');
      if (
        response.status === 404 ||
        (contentType != null && contentType.indexOf('javascript') === -1)
      ) {
        // No service worker found. Probably a different app. Reload the page.
        navigator.serviceWorker.ready.then(registration => {
          registration.unregister().then(() => {
            window.location.reload();
          });
        });
      } else {
        // Service worker found. Proceed as normal.
        registerValidSW(swUrl, config);
      }
    })
    .catch(() => {
      console.log(
        'No internet connection found. App is running in offline mode.'
      );
      const OFFLINE_URL = '/.offline/offline.html';
      return caches.match(OFFLINE_URL).then((response) => {
        console.log(response);
      });
    });
}
So in the catch part of the function, I want to do my redirect, because that's the logic that runs when we are offline. I read a lot of docs and my current solution doesn't work. Any ideas on how to redirect in my serviceWorker?
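One detail worth keeping in mind: checkValidServiceWorker runs in the page (window) context, not inside the worker. A minimal sketch of one possible approach, reusing the OFFLINE_URL from the snippet above, is to navigate the window once the cached copy is confirmed:
.catch(() => {
  console.log('No internet connection found. App is running in offline mode.');
  const OFFLINE_URL = '/.offline/offline.html';
  // Sketch only: if the offline page is in the cache, navigate to it.
  // The navigation request itself can then be fulfilled from the cache.
  return caches.match(OFFLINE_URL).then((response) => {
    if (response) {
      window.location.href = OFFLINE_URL;
    }
  });
});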
I'm trying to figure out what happens if I have a service worker registered on a live site called sw.js and then I rename the service worker to service-worker.js. Now the old one isn't found but it is still showing the old cached version.
How long does it take for it to register the new renamed service worker or how does this work at all?
Edit
This is how I have registered the service worker in a React application:
componentDidMount() {
  if ("serviceWorker" in navigator) {
    navigator.serviceWorker
      .register("/service-worker.js")
      .then(registration => {
        console.log("service worker registration successful: ", registration);
      })
      .catch(err => {
        console.warn("service worker registration failed", err.message);
      });
  }
}
The newly created service worker (renamed) cannot take over the old one because the old one is still active and controlling the client.
The new service worker (the renamed one) will wait until the existing worker is controlling zero clients.
Now imagine a service worker sw.js installed and active (controlling the client).
Chrome will visualize the process for you like this:
1. The service worker is registered and active
2. Now let's rename the service worker file to sw2.js
You can see that Chrome is telling you that something has changed about the service worker, but the current one will keep controlling the client until you force the new one to take control, either by clicking the skipWaiting button or by flushing your cache. Clicking the button will cause sw2.js to take control over sw.js.
Now if you need to do this programmatically, you can do it in the install event inside your service worker by calling self.skipWaiting().
self.addEventListener('install', (e) => {
  let cache = caches.open(cacheName).then((c) => {
    c.addAll([
      // my files
    ]);
  });
  self.skipWaiting();
  e.waitUntil(cache);
});
The animated diagram in Jake Archibald's article The Service Worker Lifecycle can make the idea clearer.
You also have to update the instance creation code wherever your shared worker is initialized and used to reflect this change. For example, your current code would look like
var worker = new SharedWorker("ws.js");
That will need to be updated to
var worker = new SharedWorker("service-worker.js");
I was able to solve it by setting up a server which listens to both /service-worker.js and /sw.js get requests.
Since the service worker was renamed from sw.js to service-worker.js, the old service worker was no longer found at http://example.com/sw.js, so what I did was the following:
// Assumed setup: `app`, `handle` and `port` come from the surrounding custom
// server (for example, a Next.js custom server where handle = app.getRequestHandler()).
const { createServer } = require("http");
const { parse } = require("url");
const { join } = require("path");

createServer((req, res) => {
  const parsedUrl = parse(req.url, true);
  const { pathname } = parsedUrl;

  // new service worker
  if (pathname === "/service-worker.js") {
    const filePath = join(__dirname, "..", pathname);
    app.serveStatic(req, res, filePath);
  // added new endpoint to fetch the new service worker but with the old path
  } else if (pathname === "/sw.js") {
    const filePath = join(__dirname, "..", "/service-worker.js");
    app.serveStatic(req, res, filePath);
  } else {
    handle(req, res, parsedUrl);
  }
}).listen(port, err => {
  if (err) throw err;
  console.log(`> Ready on http://localhost:${port}`);
});
As you can see, I serve the same service worker from a second path as well: one endpoint for the old /sw.js and another for the newer /service-worker.js.
Now visitors that still have the old sw.js active will download the newer worker from the old path, and on a later visit they will automatically fetch the renamed service-worker.js.
I'm using workbox-webpack-plugin to register service worker.
My frontend app is a react-redux app configured with webpack. If you visit the app URL, you always see the login view first.
My plugin inside webpack.config.js:
new InjectManifest({
swSrc: path.join('src', 'service-worker.js')
})
Service worker:
workbox.skipWaiting();
workbox.clientsClaim();
workbox.precaching.precacheAndRoute(self.__precacheManifest);
My service worker caches all my code-split routes. But that doesn't matter: even if they are all cached, a user without a connection who visits my app cannot log in. That's why I need a way to check if the user is in offline mode and, instead of returning the login view, return the 'offline.html' page.
I found out that my env.config.js file (which contains API URLs and is requested on the login page) is not cached, so I thought it would be easy to catch the error when this file can't be fetched. So I added the following to my service worker:
workbox.routing.registerRoute(
  new RegExp('/env.config.js'),
  ({event}) => {
    return networkFirstHandler.handle({event})
      .catch(() => caches.match('/offline.html'));
  }
);
But it doesn't show offline.html as a page in the browser. Instead, the contents of 'offline.html' seem to be returned as the response to the 'env.config.js' request.
How to accomplish this? I'm new to workbox plugin and it would be great to see some suggestions.
importScripts("/precache-manifest.81b400bbc7dc89de30f4854961b64d1d.js", "https://storage.googleapis.com/workbox-cdn/releases/3.4.1/workbox-sw.js");
workbox.skipWaiting();
workbox.clientsClaim();
const STATIC_FILES = [
'/env.config.js',
];
self.__precacheManifest = STATIC_FILES.concat(self.__precacheManifest || []);
workbox.precaching.precacheAndRoute(self.__precacheManifest);
Update: since I decided to cache the env.config.js file, I'm now only getting an API error while using the app offline. Maybe this API call (which returns an error because there is no connection) is a good trigger to display the offline page? I think it is, but I still don't know how.
When I try something like this:
workbox.routing.registerRoute(
  new RegExp(API_REGEX_GOES_HERE),
  ({event}) => {
    return networkFirstHandler.handle({event})
      .catch(() => caches.match('/offline.html'));
  }
);
The "offline.html" page will be returned instead of API request. So it will not be displayed like a page...