Axios response not triggered and Node execution silently stops - javascript

I make an HTTP call with Axios that should respond with 200 (verified in Postman).
But the response is never handled by the try/catch/finally logic, and execution stops silently.
The code looks like this:
(async () => {
  const axios = require('axios');
  const url = ''; // The URL responds with an HTML page
  const cookies = ''; // The cookies are checked with Postman
  const config = {
    headers: {
      Cookie: cookies
    }
  };
  try {
    let res = await axios.get(url, config);
    console.log('Response received.');
    console.log(res);
  } catch (err) {
    console.log('Error happened.');
    console.log(err);
  } finally {
    console.log('Finally block.');
  }
  console.log('End of execution.');
})();
None of the console.log statements are called. I even tried adding debugger statements and other actions unrelated to stdout, but they aren't reached either.
The process exits with status 0.

I just found the solution: my program was stopping because my request was never resolved.
I needed to set the Connection header to keep-alive, and now it works properly.
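For reference, the fix amounts to adding the Connection header to the request config (the cookies placeholder is from the question; the behavior is consistent with an exit code of 0, since Node exits cleanly once the event loop drains with the request still pending):

```javascript
// Same config shape as in the question, with the Connection header added.
// Without it the request never resolved; with keep-alive the response
// comes back and the try/catch/finally logic runs as expected.
const cookies = ''; // placeholder, as in the question
const config = {
  headers: {
    Cookie: cookies,
    Connection: 'keep-alive',
  },
};
```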


Intercept fetch for the first time but not afterwards using serviceWorker

I need some guidance with a service worker.
When the service worker is installed, it caches the assets. On the next reload, any request is intercepted by the service worker, which first checks the cache; if the resource isn't found there, we make a network call. But this second network call is again intercepted by the service worker, and so it has turned into an infinite loop.
I don't want the next fetch call to be intercepted again. I hope I've been able to explain the issue.
Here is the serviceWorker.js:
const cacheVersion = "v11";

self.addEventListener('install', (event) => {
  self.skipWaiting();
  event.waitUntil(caches.open(cacheVersion).then((cache) => {
    cache.addAll([
      '/',
      '/index.html',
      '/style.css',
      '/images/github.png',
    ])
    .then(() => console.log('cached'), (err) => console.log(err));
  }));
});

self.addEventListener('activate', event => {
  event.waitUntil(
    (async () => {
      const keys = await caches.keys();
      return keys.map(async (cache) => {
        if (cache !== cacheVersion) {
          console.log("service worker: Removing old cache: " + cache);
          return await caches.delete(cache);
        }
      });
    })()
  );
});

const cacheFirst = async (request) => {
  try {
    const responseFromCache = await caches.match(request);
    if (responseFromCache) {
      return responseFromCache;
    }
  } catch (err) {
    return fetch(request);
  }
  return fetch(request);
};

self.addEventListener("fetch", (event) => {
  event.respondWith(cacheFirst(event.request));
});
The reason is your cacheFirst; it's a bit wrong. What do we want to do inside it (the high-level algorithm)? It should be something like this, right?
check the cache and, if a match is found, return it
otherwise, fetch from the server, cache the result, and return it
if the network fails, return some "dummy" response
const cacheFirst = async (request) => {
  // First try to get the resource from the cache
  const responseFromCache = await caches.match(request);
  if (responseFromCache) {
    return responseFromCache;
  }
  // Next try to get the resource from the network
  try {
    const responseFromNetwork = await fetch(request);
    // a response body may be consumed only once, so
    // we save a clone to put one copy in the cache
    // and serve the other one
    putInCache(request, responseFromNetwork.clone());
    return responseFromNetwork;
  } catch (error) {
    // well, the network failed, but we need to return something, right?
    return new Response('Network error happened', {
      status: 408,
      headers: { 'Content-Type': 'text/plain' },
    });
  }
};
This is not a ready-to-use solution! Think of it as pseudo-code; for instance, you might need to implement putInCache first.
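For completeness, a minimal putInCache sketch. The extra cacheStorage parameter is only there so the helper can be exercised outside a service worker; inside the worker it defaults to the global caches:

```javascript
const cacheVersion = 'v11'; // same constant as in the question

// Open the versioned cache and store the request/response pair.
// cacheFirst fires this without awaiting it, so a failed put
// delays nothing and never breaks the response being served.
const putInCache = async (request, response, cacheStorage = caches) => {
  const cache = await cacheStorage.open(cacheVersion);
  await cache.put(request, response);
};
```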

Forward body from request to another url

I am wondering if someone might be able to help me figure out how to pass a POST body to another endpoint with Cloudflare Workers.
I am trying to forward the incoming request's POST body to url.
const url = 'https://webhook.site/#!/b2f75ce2-7b9e-479a-b6f0-8934a89a3f3d'
const body = {
  results: ['default data to send'],
  errors: null,
  msg: 'I sent this to the fetch',
}

/**
 * gatherResponse awaits and returns a response body as a string.
 * Use await gatherResponse(..) in an async function to get the response body.
 * @param {Response} response
 */
async function gatherResponse(response) {
  const { headers } = response
  const contentType = headers.get('content-type') || ''
  if (contentType.includes('application/json')) {
    return JSON.stringify(await response.json())
  } else if (contentType.includes('application/text')) {
    return response.text()
  } else if (contentType.includes('text/html')) {
    return response.text()
  } else {
    return response.text()
  }
}

async function handleRequest() {
  const init = {
    body: JSON.stringify(body),
    method: 'POST',
    headers: {
      'content-type': 'application/json;charset=UTF-8',
    },
  }
  const response = await fetch(url, init)
  const results = await gatherResponse(response)
  return new Response(results, init)
}

addEventListener('fetch', (event) => {
  return event.respondWith(handleRequest())
})
I created a worker at https://tight-art-0743.ctohm.workers.dev/, which basically forwards your POST request's body to a public requestbin. You can check what it is receiving at: https://requestbin.com/r/en5k768mcp4x9/24tqhPJw86mt2WjKRMbmt75FMH9
addEventListener("fetch", (event) => {
  event.respondWith(
    handleRequest(event.request).catch(
      (err) => new Response(err.stack, { status: 500 })
    )
  );
});

async function handleRequest(request) {
  let { method, headers } = request,
    url = new URL(request.url)
  // methods other than POST will return early
  if (method !== 'POST') return new Response(`Your request method was ${method}`);
  const forwardRequest = new Request("https://en5k768mcp4x9.x.pipedream.net/", request)
  forwardRequest.headers.set('X-Custom-Header', 'hey!')
  return fetch(forwardRequest)
}
You can see it working with a simple curl request:
curl --location --request POST 'https://tight-art-0743.ctohm.workers.dev/' \
--header 'Content-Type: application/json' \
--data-raw '{"environment": {"name": "Sample Environment Name (required)"}}'
Two things are worth noting in the worker's code:
I'm passing the original request as the init parameter, through which the original headers and body are transparently forwarded to the requestbin, also allowing for some extra header manipulation if needed.
In this example I'm not actually doing anything with the request body, so there's no need to await it. You just connect the incoming and outgoing streams and let them deal with each other.
Another example: let's add a /csv route. Requests to /csv will not forward your POST body. Instead they will download a remote CSV attachment and POST it to the requestbin. Again, we aren't awaiting the actual CSV contents; we pass a handle to the response body to the forwarding request.
async function handleRequest(request) {
  let { method, headers } = request,
    url = new URL(request.url)
  if (method !== 'POST') return new Response(`Your request method was ${method}`);
  const forwardRequest = new Request("https://en5k768mcp4x9.x.pipedream.net/", request)
  if (url.pathname.includes('/csv')) {
    const remoteSource = `https://cdn.wsform.com/wp-content/uploads/2018/09/country_full.csv`,
      remoteResponse = await fetch(remoteSource)
    return fetch(forwardRequest, { body: remoteResponse.body })
  }
  forwardRequest.headers.set('X-Custom-Header', 'hey!')
  return fetch(forwardRequest)
}
While your code should theoretically work, the fact that you're unwrapping the response means your worker could be aborted for hitting time, CPU, or memory limits. By contrast, with the streams-based approach your worker's execution finishes as soon as it returns the forwarding fetch. Even if the outgoing POST is still running, it isn't subject to CPU or time limits.

Service worker returns offline html page for javascript files

I'm new to service workers and offline capabilities. I created a simple service worker to handle network requests and return an offline HTML page when offline. It was created following Google's guide on PWAs.
The problem is that the service worker returns offline.html when a (non-cached) JavaScript file is requested. It should instead return a network error or something. Here is the code:
const cacheName = 'offline-v1900'; // increment version to update cache

// cache these files needed for offline use
const appShellFiles = [
  './offline.html',
  './css/bootstrap.min.css',
  './img/logo/logo.png',
  './js/jquery-3.5.1.min.js',
  './js/bootstrap.min.js',
];

self.addEventListener("fetch", (e) => {
  // We only want to call e.respondWith() if this is a navigation request
  // for an HTML page.
  // console.log(e.request.url);
  e.respondWith(
    (async () => {
      try {
        // First, try to use the navigation preload response if it's supported.
        const preloadResponse = await e.preloadResponse;
        if (preloadResponse) {
          // console.log('returning preload response');
          return preloadResponse;
        }
        const cachedResponse = await caches.match(e.request);
        if (cachedResponse) {
          // console.log(`[Service Worker] Fetching cached resource: ${e.request.url}`);
          return cachedResponse;
        }
        // Always try the network first.
        const networkResponse = await fetch(e.request);
        return networkResponse;
      } catch (error) {
        // catch is only triggered if an exception is thrown, which is likely
        // due to a network error.
        // If fetch() returns a valid HTTP response with a response code in
        // the 4xx or 5xx range, the catch() will NOT be called.
        // console.log("Fetch failed; returning offline page instead.", error);
        const cachedResponse = await caches.match('offline.html');
        return cachedResponse;
      }
    })()
  );
});
When offline, I open a URL on my site; it loads the page from the cache, but not all assets are cached for offline use. So when a network request is made for, say, https://www.gstatic.com/firebasejs/9.1.3/firebase-app.js, the response I get is the HTML of the offline.html page. This breaks the page with JavaScript errors.
It should instead return a network error or something.
I think the relevant sample code is from https://googlechrome.github.io/samples/service-worker/custom-offline-page/
self.addEventListener('fetch', (event) => {
  // We only want to call event.respondWith() if this is a navigation request
  // for an HTML page.
  if (event.request.mode === 'navigate') {
    event.respondWith((async () => {
      try {
        // First, try to use the navigation preload response if it's supported.
        const preloadResponse = await event.preloadResponse;
        if (preloadResponse) {
          return preloadResponse;
        }
        const networkResponse = await fetch(event.request);
        return networkResponse;
      } catch (error) {
        // catch is only triggered if an exception is thrown, which is likely
        // due to a network error.
        // If fetch() returns a valid HTTP response with a response code in
        // the 4xx or 5xx range, the catch() will NOT be called.
        console.log('Fetch failed; returning offline page instead.', error);
        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(OFFLINE_URL);
        return cachedResponse;
      }
    })());
  }
  // If our if() condition is false, then this fetch handler won't intercept the
  // request. If there are any other fetch handlers registered, they will get a
  // chance to call event.respondWith(). If no fetch handlers call
  // event.respondWith(), the request will be handled by the browser as if there
  // were no service worker involvement.
});
Specifically, that fetch handler checks whether event.request.mode === 'navigate' and only returns HTML when offline in that case. That check is what's required to make sure you don't end up returning offline HTML for other types of resources.
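Applied to the code in the question, the essential guard can be isolated into a small predicate (the helper name is mine, not from the sample):

```javascript
// True only for top-level navigations (address bar, link clicks), i.e.
// the requests whose failure should fall back to offline.html.
// Subresource requests (scripts, styles, images) have other modes such
// as 'cors' or 'no-cors' and should be allowed to fail normally.
const shouldServeOfflineFallback = (request) => request.mode === 'navigate';
```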

The lambda function, after an API call, doesn't return the result to my main script

I made a web app that uses an API.
To hide the API key when hosting on Netlify, I've used a lambda function:
exports.handler = async event => {
  const apiKey = process.env.apiKey
  const response = await fetch(`https://api.waqi.info/feed/${cityName}/?token=${apiKey}`)
  const result = await response.json()
  const pass = (body) => {
    return {
      statusCode: 200,
      body: JSON.stringify(body)
    }
  }
  return pass(result)
}
It makes the call to the API and passes the result back to my main script, which processes the response.
async function checkAir() {
  let cityName = document.getElementById("cityName").value;
  // Call API
  const response = await fetch("../netlify/functions/lambda")
  const result = await response.json()
  console.log("response" + response)
  console.log("result" + result)
}
When it runs, it doesn't work and gives the error:
GET 'url/.netlify/functions/lambda' 404
Try using the developer tools in your browser:
Inspect > Network tab, then refresh your page to run the script.
Find the HTTP request for lambda and investigate it; maybe the path isn't correct.
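In this case the path is the likely culprit: Netlify serves deployed functions under /.netlify/functions/&lt;name&gt;, not at a relative filesystem path like ../netlify/functions/lambda. A sketch, using a hypothetical helper for building the path:

```javascript
// Hypothetical helper: builds the URL at which Netlify exposes a
// deployed function, relative to the site root.
const functionUrl = (name) => `/.netlify/functions/${name}`;

// In checkAir(), the fetch would then become:
// const response = await fetch(functionUrl('lambda'));
```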

Get only HTML in single fetch request in service worker

I'm using Cloudflare service workers and I want on every request to:
request only the HTML (therefore count as only 1 request)
search the response for a string
Purge that page's cache if the message exists
I've solved points #2 and #3. Can't figure out if #1 is feasible or possible at all.
I need it as only one request because there is a limit per day on the number of free requests. Otherwise I have about 50-60 requests per page.
My current attempt for #1, which doesn't work right:
async function handleRequest(request) {
  const init = {
    headers: {
      'content-type': 'text/html;charset=UTF-8',
    },
  };
  const response = await fetch(request);
  await fetch(request.url, init).then(function(response) {
    response.text().then(function(text) {
      console.log(text);
    })
  }).catch(function(err) {
    // There was an error
    console.warn('Something went wrong.', err);
  });
  return response;
}

addEventListener('fetch', event => {
  return event.respondWith(handleRequest(event.request))
});
You can't request "only the HTML"; the worker will act on any request that matches the route it is deployed at. If you only care about the HTML, you will need to set up your worker path to filter to only the endpoints you want the worker to run on.
Alternatively, you can run the worker on every request and only apply your logic when the response Content-Type is one you care about. That would look something along these lines:
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
})

async function handleRequest(request) {
  let response = await fetch(request);
  let type = response.headers.get("Content-Type") || "";
  if (type.startsWith("text/")) {
    // this is where your custom logic goes
  }
  return response;
}
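Since the goal here is to act on HTML only, the check can be narrowed to text/html so CSS and plain-text responses pass through as well (the helper name is mine):

```javascript
// True only for HTML responses, so CSS, JS, JSON, and image responses
// never trigger the search-and-purge logic.
const isHtmlResponse = (response) =>
  (response.headers.get('Content-Type') || '').includes('text/html');
```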
