Angular PWA with Firebase - Page does not work offline

Recently, my PWA has been getting a warning in Chrome that it cannot be installed:
The linked page states that, in the future, the service worker needs three things:
Updated offline detection logic
The updated offline detection logic checks:
That there is a service worker installed for the page.
That the installed service worker has a fetch event.
That the installed service worker fetch event returns an HTTP 200 status code (indicating a successful fetch) in simulated offline mode.
The last point is the new one.
Using the code from the page that Google links to:
self.addEventListener('fetch', (event) => {
  if (event.request.method === 'GET') {
    event.respondWith(
      (async () => {
        try {
          // First, try to use the navigation preload response if it's supported.
          const preloadResponse = await event.preloadResponse;
          if (preloadResponse) {
            return preloadResponse;
          }

          // Always try the network first.
          const networkResponse = await fetch(event.request);
          return networkResponse;
        } catch (error) {
          // catch is only triggered if an exception is thrown, which is likely
          // due to a network error.
          // If fetch() returns a valid HTTP response with a response code in
          // the 4xx or 5xx range, the catch() will NOT be called.
          console.log("Fetch failed; returning offline page instead.", error);

          const cache = await caches.open(CACHE_NAME);
          const cachedResponse = await cache.match(OFFLINE_URL);
          return cachedResponse;
        }
      })()
    );
  }
});
Unfortunately, this leads to:
What needs to be done to comply with the new requirements?
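For reference, the handler above relies on CACHE_NAME and OFFLINE_URL and on the offline page being pre-cached during install; if those pieces are missing, cache.match(OFFLINE_URL) returns nothing and the simulated-offline check never sees a 200 response. A minimal sketch of the install side, following the Google sample the snippet comes from (the constant values here are assumptions):

const CACHE_NAME = 'offline';        // assumed cache name
const OFFLINE_URL = 'offline.html';  // assumed path to the offline fallback page

self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    // {cache: 'reload'} bypasses the HTTP cache so the offline page is
    // always taken fresh from the network during install.
    await cache.add(new Request(OFFLINE_URL, { cache: 'reload' }));
  })());
});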

Related

Service Worker for Static HTML fallback - Refreshing page Offline just shows "No Internet"

I'm trying to get just a simple working example of this going, but I feel like I'm misunderstanding something.
My page is dynamically generated (Django), but all I want is to register a service worker to have a fallback page if the user is offline anywhere in the app. I'm testing this on http://localhost:8000, so maybe this is keeping it from working?
This is what I based my code on; I've copied it almost verbatim, aside from the location of the offline HTML file, which is correctly getting cached, so I can verify that part works.
https://googlechrome.github.io/samples/service-worker/custom-offline-page/
The SW is registered at the bottom of my HTML's body:
<script>
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/static/js/service-worker.js');
  }
</script>
For /static/js/service-worker.js:
const OFFLINE_VERSION = 1;
const CACHE_NAME = 'offline';
// Customize this with a different URL if needed.
const OFFLINE_URL = '/static/offline/offline.html';

self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    // Setting {cache: 'reload'} in the new request will ensure that the response
    // isn't fulfilled from the HTTP cache; i.e., it will be from the network.
    await cache.add(new Request(OFFLINE_URL, {cache: 'reload'}));
  })());
});

self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    // Enable navigation preload if it's supported.
    // See https://developers.google.com/web/updates/2017/02/navigation-preload
    if ('navigationPreload' in self.registration) {
      await self.registration.navigationPreload.enable();
    }
  })());

  // Tell the active service worker to take control of the page immediately.
  self.clients.claim();
});

self.addEventListener('fetch', (event) => {
  // We only want to call event.respondWith() if this is a navigation request
  // for an HTML page.
  if (event.request.mode === 'navigate') {
    event.respondWith((async () => {
      try {
        // First, try to use the navigation preload response if it's supported.
        const preloadResponse = await event.preloadResponse;
        if (preloadResponse) {
          return preloadResponse;
        }

        const networkResponse = await fetch(event.request);
        return networkResponse;
      } catch (error) {
        // catch is only triggered if an exception is thrown, which is likely
        // due to a network error.
        // If fetch() returns a valid HTTP response with a response code in
        // the 4xx or 5xx range, the catch() will NOT be called.
        console.log('Fetch failed; returning offline page instead.', error);

        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(OFFLINE_URL);
        return cachedResponse;
      }
    })());
  }

  // If our if() condition is false, then this fetch handler won't intercept the
  // request. If there are any other fetch handlers registered, they will get a
  // chance to call event.respondWith(). If no fetch handlers call
  // event.respondWith(), the request will be handled by the browser as if there
  // were no service worker involvement.
});
The worker successfully installs and activates. The offline.html page is successfully cached and I can verify this in Chrome Inspector -> Application -> Service Workers. I can also verify it's the correct service-worker.js file and not an old one.
If I switch Chrome to "Offline" and refresh the page, I still get the standard "No Internet" page. It also doesn't look like the "fetch" event fires on any normal page loads, since a "console.log" in the handler never gets triggered.
Is the sample code I'm using outdated? Is this a limitation of trying this on Localhost? What am I doing wrong? Thank you.
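One detail worth checking, offered as an assumption rather than a confirmed cause: a service worker only controls pages under the path it is served from, so a worker served from /static/js/ never receives navigate fetch events for pages at the site root (localhost is not the problem; it is treated as a secure origin). A sketch of registering with an explicit root scope, which only works if the server also sends a Service-Worker-Allowed: / header for the worker script, or if the worker is served from the root instead:

<script>
  if ('serviceWorker' in navigator) {
    // scope: '/' requires the response for /static/js/service-worker.js to
    // carry a "Service-Worker-Allowed: /" header; without it, either move
    // the worker to a root URL such as /service-worker.js or keep the
    // default scope.
    navigator.serviceWorker
      .register('/static/js/service-worker.js', { scope: '/' })
      .then(() => console.log('Service worker registered'))
      .catch((err) => console.error('Service worker registration failed:', err));
  }
</script>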

Using images and CSS in an offline fallback page

I am trying to set up my website to have a fallback page when it is loaded without an internet connection. To do that, I am following this guide on web.dev: "Create an offline fallback page"
I modified the example ServiceWorker in the article to fit my purposes, including being able to serve external CSS and images in the fallback offline page:
// Incrementing OFFLINE_VERSION will kick off the install event and force
// previously cached resources to be updated from the network.
const OFFLINE_VERSION = 1;
const CACHE_NAME = "offline";
// Customize this with a different URL if needed.
const OFFLINE_URL = "offline.html";

self.addEventListener("install", (event) => {
  event.waitUntil(
    (async () => {
      const cache = await caches.open(CACHE_NAME);
      // Setting {cache: 'reload'} in the new request will ensure that the response
      // isn't fulfilled from the HTTP cache; i.e., it will be from the network.
      await cache.add(new Request(OFFLINE_URL, { cache: "reload" }));
      await cache.add(new Request("offline.css", { cache: "reload" }));
      await cache.add(new Request("logo.png", { cache: "reload" }));
      await cache.add(new Request("unsupportedCloud.svg", { cache: "reload" }));
    })()
  );
});

self.addEventListener("activate", (event) => {
  // Tell the active service worker to take control of the page immediately.
  self.clients.claim();
});

self.addEventListener("fetch", (event) => {
  // We only want to call event.respondWith() if this is a navigation request
  // for an HTML page.
  if (event.request.mode === "navigate") {
    if (event.request.url.match(/SignOut/)) {
      return false;
    }
    event.respondWith(
      (async () => {
        try {
          const networkResponse = await fetch(event.request);
          return networkResponse;
        } catch (error) {
          // catch is only triggered if an exception is thrown, which is likely
          // due to a network error.
          // If fetch() returns a valid HTTP response with a response code in
          // the 4xx or 5xx range, the catch() will NOT be called.
          console.log("Fetch failed; returning offline page instead.", error);
          const cache = await caches.open(CACHE_NAME);
          const cachedResponse = await cache.match(OFFLINE_URL);
          return cachedResponse;
        }
      })()
    );
  }
});
However, when the offline.html page loads, it is unable to load the images and the CSS; the images fail to load with a 404 error and the request for the CSS doesn't even show up in the Network tab of the browser dev console.
I would expect the images and CSS to be fetched from the ServiceWorker cache, but it seems that neither is.
Am I missing something on how ServiceWorkers cache requests or how they fetch them? Or on how to design the offline fallback page to work?
Turns out there were a few reasons why the assets were not being found.
The first reason is that, when the assets were saved to the cache, they were saved under the full path where they live alongside the Service Worker file.
So the path that was saved was something along the lines of static/PWA/[offline.css, logo.png, unsupportedCloud.svg], but the page that requested them was at the root. In offline.html I had to reference them as such: <img src="static/PWA/unsupportedCloud.svg" class="unsupported-cloud" />.
The second reason is that the Service Worker only handled fetch events whose mode was "navigate". In my example you can see I had written if (event.request.mode === "navigate") {...}, so the cache set up at install time was only used for navigation events, which does not catch the fetch events for the assets. To fix that, I added a new branch for "no-cors" request modes: else if (event.request.mode === "no-cors") {...}.
These two fixes let me get assets from the offline cache that I set up on Service Worker installation. With some other minor fixes, this addresses my question!
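A minimal sketch of what that extra branch can look like, assuming the rest of the handler stays as in the question: for "no-cors" asset requests, try the network and fall back to whatever was pre-cached during install:

self.addEventListener("fetch", (event) => {
  if (event.request.mode === "navigate") {
    // ... navigation handling as shown in the question ...
  } else if (event.request.mode === "no-cors") {
    // Asset requests (images, CSS) issued by the offline page:
    // try the network first, then fall back to the install-time cache.
    event.respondWith(
      (async () => {
        try {
          return await fetch(event.request);
        } catch (error) {
          const cache = await caches.open(CACHE_NAME);
          return await cache.match(event.request);
        }
      })()
    );
  }
});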

Service Worker failing Fetch requests

Fairly new to Service workers and JS promises so any help is appreciated.
The webpage that's failing:
https://icbmaeronautics.co.uk
The Error as shown in Chrome Dev Tools:
The FetchEvent for "http://localhost:3005/" resulted in a network error
response: the promise was rejected.
Promise.catch (async)
(anonymous) # sw.js:29
sw.js:1 Uncaught (in promise) TypeError: Failed to fetch
The markup for registering the service worker (seems to work fine):
if ('serviceWorker' in navigator) {
  navigator.serviceWorker
    .register('/sw.js')
    .then(function() { console.log("Service Worker Registered"); });
}
The install code within the sw.js script file
self.addEventListener('install', function(e) {
  e.waitUntil(
    caches.open('airhorner').then(function(cache) {
      /* Particular urls which all install with code 200s */
    })
  );
});
And finally, the fetch code, which seems to have an issue with the event.respondWith() function:
self.addEventListener('fetch', event => {
  // Let the browser do its default thing
  // for non-GET requests.
  if (event.request.method != 'GET') return;

  // Prevent the default, and handle the request ourselves.
  event.respondWith(async function() {
    // Try to get the response from a cache.
    const cache = await caches.open('dynamic-v1');
    const cachedResponse = await cache.match(event.request);

    if (cachedResponse) {
      // If we found a match in the cache, return it, but also
      // update the entry in the cache in the background.
      event.waitUntil(cache.add(event.request));
      return cachedResponse;
    }

    // If we didn't find a match in the cache, use the network.
    return fetch(event.request);
  }());
});
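The error quoted above says the promise passed to respondWith() was rejected; in this handler that happens whenever the request is not in the cache and fetch(event.request) itself throws (for example while offline), because nothing catches the rejection. A sketch of the same handler with a fallback; the '/offline.html' URL is an assumption and would need to be pre-cached during install:

self.addEventListener('fetch', event => {
  if (event.request.method != 'GET') return;

  event.respondWith(async function() {
    const cache = await caches.open('dynamic-v1');
    const cachedResponse = await cache.match(event.request);
    if (cachedResponse) {
      event.waitUntil(cache.add(event.request));
      return cachedResponse;
    }
    try {
      return await fetch(event.request);
    } catch (err) {
      // Network failed and nothing was cached: return a pre-cached fallback
      // (assumes '/offline.html' was added to the cache during install).
      return caches.match('/offline.html');
    }
  }());
});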

How to handle 206 responses in Firefox service workers

While testing service workers for a project and using this example from Google:
/*
Copyright 2016 Google Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
// Names of the two caches used in this version of the service worker.
// Change to v2, etc. when you update any of the local resources, which will
// in turn trigger the install event again.
const PRECACHE = 'precache-v1';
const RUNTIME = 'runtime';

// A list of local resources we always want to be cached.
const PRECACHE_URLS = [
  'index.html',
  './', // Alias for index.html
  'styles.css',
  '../../styles/main.css',
  'demo.js'
];

// The install handler takes care of precaching the resources we always need.
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(PRECACHE)
      .then(cache => cache.addAll(PRECACHE_URLS))
      .then(self.skipWaiting())
  );
});

// The activate handler takes care of cleaning up old caches.
self.addEventListener('activate', event => {
  const currentCaches = [PRECACHE, RUNTIME];
  event.waitUntil(
    caches.keys().then(cacheNames => {
      return cacheNames.filter(cacheName => !currentCaches.includes(cacheName));
    }).then(cachesToDelete => {
      return Promise.all(cachesToDelete.map(cacheToDelete => {
        return caches.delete(cacheToDelete);
      }));
    }).then(() => self.clients.claim())
  );
});

// The fetch handler serves responses for same-origin resources from a cache.
// If no response is found, it populates the runtime cache with the response
// from the network before returning it to the page.
self.addEventListener('fetch', event => {
  // Skip cross-origin requests, like those for Google Analytics.
  if (event.request.url.startsWith(self.location.origin)) {
    event.respondWith(
      caches.match(event.request).then(cachedResponse => {
        if (cachedResponse) {
          return cachedResponse;
        }

        return caches.open(RUNTIME).then(cache => {
          return fetch(event.request).then(response => {
            // Put a copy of the response in the runtime cache.
            return cache.put(event.request, response.clone()).then(() => {
              return response;
            });
          });
        });
      })
    );
  }
});
source: https://github.com/GoogleChrome/samples/blob/gh-pages/service-worker/basic/service-worker.js
I discovered that Firefox (in contrast to Safari and Chrome) throws errors within event.waitUntil() as well as event.respondWith() if something goes wrong with the fetch requests (even if it's just a 206 Partial Content response):
Service worker event waitUntil() was passed a promise that rejected with 'TypeError: Cache got basic response with bad status 206 while trying to add request'
That behaviour breaks the installer. If I add a .catch() to the installer like this:
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(PRECACHE)
      .then(cache => cache.addAll(PRECACHE_URLS))
      .then(self.skipWaiting())
      .catch(function(err) {
        console.log(err);
        self.skipWaiting();
      })
  );
});
I presume the first 206 makes the precaching stop(?).
Also, after that the SW gets installed, but once in a while I get a
Service worker event respondWith() was passed a promise that rejected with 'TypeError: Cache got basic response with bad status 206 while trying to add request'
And even if that does not happen, if I try to open a link to the URL that threw the 206 error during installation/precaching, I get:
Failed to load ‘https://xxxx/yyyyy.mp3’. A ServiceWorker passed a promise to FetchEvent.respondWith() that rejected with ‘TypeError: Cache got basic response with bad status 206 while trying to add request https://xxxx/yyyyy.mp3’.
How can I handle this kind of error properly? Catching it as above doesn't make much sense, since it breaks the forced precaching. And even if that were acceptable, it seems to interfere with every request happening from then on and might cause trouble later.
One half of the problem I could solve by moving the return statement from inside the cache.put() callback to outside of it:
self.addEventListener('fetch', event => {
  // Skip cross-origin requests, like those for Google Analytics.
  if (event.request.url.startsWith(self.location.origin)) {
    event.respondWith(
      caches.match(event.request).then(cachedResponse => {
        if (cachedResponse) {
          return cachedResponse;
        }

        return caches.open(RUNTIME).then(cache => {
          return fetch(event.request).then(response => {
            // Put a copy of the response in the runtime cache.
            cache.put(event.request, response.clone()).then(() => {
              console.log("logged a file into RUNTIME:");
              console.log(response);
            });
            return response; // and return anyhow whatever came back
          });
        });
      })
    );
  }
});
This way the SW does not wait for cache.put() to succeed, and yet the response gets cached most of the time.
This solves the most urgent issue, but the remaining problems are:
a) forced precaching still gets cancelled by 206 responses
b) if I want to make sure that requests are cached at runtime, I would still need to write some retry() function or similar.
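Regarding the two remaining problems: the Cache API refuses to store partial (206) responses, which is what produces the TypeError in the first place. A sketch, under the assumption that it is acceptable to simply skip caching partial responses: precache each URL individually so one 206 doesn't abort the rest, and guard cache.put() on the response status:

// a) Precache resources one by one so a single 206 doesn't abort everything.
self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(PRECACHE).then(cache =>
      Promise.all(PRECACHE_URLS.map(url =>
        fetch(url).then(response => {
          // The Cache API only accepts full responses; skip anything else.
          if (response.status === 200) {
            return cache.put(url, response);
          }
          console.log('Not precaching', url, '(status ' + response.status + ')');
        })
      ))
    ).then(() => self.skipWaiting())
  );
});

// b) In the runtime fetch handler, apply the same guard before cache.put():
//    if (response.status === 200) {
//      cache.put(event.request, response.clone());
//    }
//    return response;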

Best practices for detecting offline state in a service worker

I have a service worker that is supposed to cache an offline.html page that is displayed if the client has no network connection. However, it sometimes believes the navigator is offline even when it is not. That is, navigator.onLine === false. This means the user may get offline.html instead of the actual content even when online, which is obviously something I'd like to avoid.
This is how I register the service worker in my main.js:
// Install service worker for offline use and caching
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/service-worker.js', {scope: '/'});
}
My current service-worker.js:
const OFFLINE_URL = '/mysite/offline';
const CACHE_NAME = 'mysite-static-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    // Cache the offline page when installing the service worker
    fetch(OFFLINE_URL, { credentials: 'include' }).then(response =>
      caches.open(CACHE_NAME).then(cache => cache.put(OFFLINE_URL, response)),
    ),
  );
});

self.addEventListener('fetch', (event) => {
  const requestURL = new URL(event.request.url);

  if (requestURL.origin === location.origin) {
    // Load static assets from cache if network is down
    if (/\.(css|js|woff|woff2|ttf|eot|svg)$/.test(requestURL.pathname)) {
      event.respondWith(
        caches.open(CACHE_NAME).then(cache =>
          caches.match(event.request).then((result) => {
            if (navigator.onLine === false) {
              // We are offline so return the cached version immediately, null or not.
              return result;
            }
            // We are online so let's run the request to make sure our content
            // is up-to-date.
            return fetch(event.request).then((response) => {
              // Save the result to cache for later use.
              cache.put(event.request, response.clone());
              return response;
            });
          }),
        ),
      );
      return;
    }
  }

  if (event.request.mode === 'navigate' && navigator.onLine === false) {
    // Uh-oh, we navigated to a page while offline. Let's show our default page.
    event.respondWith(caches.match(OFFLINE_URL));
    return;
  }

  // Passthrough for everything else
  event.respondWith(fetch(event.request));
});
What am I doing wrong?
navigator.onLine and the related events can be useful when you want to update your UI to indicate that you're offline and, for instance, only show content that exists in a cache.
But I'd avoid writing service worker logic that relies on checking navigator.onLine. Instead, attempt to make a fetch() unconditionally, and if it fails, provide a backup response. This will ensure that your web app behaves as expected regardless of whether the fetch() fails due to being offline, due to lie-fi, or due to your web server experiencing issues.
// Other fetch handler code...

if (event.request.mode === 'navigate') {
  return event.respondWith(
    fetch(event.request).catch(() => caches.match(OFFLINE_URL))
  );
}

// Other fetch handler code...
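The same principle can be applied to the static-asset branch from the question so that navigator.onLine disappears entirely; a sketch (keeping the question's cache name and file-extension test) that goes to the network first and falls back to the cache only when the fetch rejects:

// Static assets: try the network, fall back to the cache if the fetch fails.
if (/\.(css|js|woff|woff2|ttf|eot|svg)$/.test(requestURL.pathname)) {
  event.respondWith(
    caches.open(CACHE_NAME).then(cache =>
      fetch(event.request)
        .then((response) => {
          // Keep a copy for later offline use.
          cache.put(event.request, response.clone());
          return response;
        })
        .catch(() => cache.match(event.request))
    )
  );
  return;
}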
