Brief Project Overview
I am working on a full-page caching system using Node and Express. I have this working; however, I am now trying to add cache "hole-punching", where portions of a document can have different caching policies.
Attempted Approach
I am attempting to achieve this by injecting the rendered output of one Express route into the rendered output of another. Unfortunately, the most popular ways of firing an HTTP request in Node are asynchronous and promise-based. I have not been able to get promises working in a synchronous request because... well, they don't work that way.
From what I can tell, I am left with two options:
Make the entire route/request/response asynchronous (is this possible?)
Find a synchronous way of rendering an Express route
I don't know how to achieve either of these. I know many folks will say not to do #2 because of the performance cost; I'm open to other suggestions.
Code
In the code below, I override the res.render() method to inspect the output of the current response and determine whether a secondary request should be made, rendered, and injected.
const axios = require('axios');

async function getRemoteInclude(req, content) {
  // remote include format: {{ includeRemote('path') }}
  const handlebarRegex = /(?<={{)(.*?)(?=}})/;
  const includeRegex = /(?<=includeRemote[(])(.*?)(?=[)])/;
  const replaceHandlebarRegex = /\{\{[^\}]+\}\}/g;
  const parsedContent = content;
  const foundHandlebars = content.match(handlebarRegex); // does a string with {{ * }} exist?
  if (foundHandlebars) {
    const foundInclude = foundHandlebars[0].match(includeRegex); // does a string with includeRemote('*') exist?
    if (foundInclude) {
      const path = 'http://expresscache.com:3000/test'; // sample hardcoded route
      const response = await axios.get(path); // fire async request to the Express route
      // axios puts the body on response.data (not response.body)
      return parsedContent.replace(replaceHandlebarRegex, response.data); // replace remote include string with route response
    }
  }
  return content;
}
function interceptRender() {
  return function (req, res, next) {
    res.render = function (view, options, callback) {
      var self = this;
      options = options || res.locals;
      self.app.render(view, options, function (err, content) {
        if (!callback) {
          // PROBLEM: getRemoteInclude is async, so this assigns a pending
          // Promise rather than the modified content; this is the crux of the question
          content = getRemoteInclude(req, content);
          self.locals.body = content;
          self.send(content);
        } else {
          callback(err, content);
        }
      });
    };
    next();
  };
}
module.exports = {
  interceptRender: interceptRender
};
Obviously this code fires an async request via the axios library. I am expecting that this will not be part of the final solution, unless option 1 from above is possible.
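If option 1 is possible, I imagine it would look something like this untested sketch, where the app.render callback itself is made async so the promise can be awaited (error handling omitted):

function interceptRender() {
  return function (req, res, next) {
    res.render = function (view, options, callback) {
      var self = this;
      options = options || res.locals;
      // the callback is async, so the promise from getRemoteInclude can be awaited;
      // a rejected promise here would surface as an unhandled rejection without a try/catch
      self.app.render(view, options, async function (err, content) {
        if (!callback) {
          content = await getRemoteInclude(req, content);
          self.locals.body = content;
          self.send(content);
        } else {
          callback(err, content);
        }
      });
    };
    next();
  };
}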
Ask
Does anyone have an idea how to either get this async request working in a synchronous context, or modify the request so that it is synchronous itself? Or maybe a completely different approach?
Related
The application I am currently developing uses Express. I want to get the response before it is sent and modify it (for JWT purposes). The application has a dozen endpoints, and I don't want to write my own function like sendAndSign() and replace res.send() everywhere in the code. I have heard there is a way to override/modify the logic of the res.send(...) method.
I found an example of this kind of modification, but in my case it doesn't work. Is there any other option (maybe a plugin) to accomplish this?
You can intercept the response body in Express by temporarily overriding res.send:
function convertData(originalData) {
  // ...
  // return something new
}

function responseInterceptor(req, res, next) {
  var originalSend = res.send;
  res.send = function () {
    arguments[0] = convertData(arguments[0]);
    originalSend.apply(res, arguments);
  };
  next();
}

app.use(responseInterceptor);
I tested in Node.js v10.15.3 and it works well.
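For the JWT use case specifically, convertData might look something like this sketch (the jsonwebtoken package is assumed to be installed, and 'secret' is a placeholder for your real signing key):

const jwt = require('jsonwebtoken');

// Hypothetical convertData: wrap the outgoing body in a signed JWT
function convertData(originalData) {
  return jwt.sign({ payload: originalData }, 'secret');
}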
I have created an NPM package called express-response-hooks that provides response hooks.
You can register a hook in a middleware, before all your other routes, that lets you change the response body when send() is called.
For example:
const responseHooks = require('express-response-hooks');

// response hooks initialization
app.use(responseHooks());

// register a middleware that modifies the response body before it is sent to the client
app.use(function (req, res, next) {
  // hook on the "send()" function
  res.hooks.on('send', (args) => {
    args[0] = 'new-body'; // args[0] is the body passed to send()
  });
  next(); // continue to the route handlers
});
So, this is the first time I'm trying to take a large JS app written in one file and modularize it into separate files. The goal is a more organized set of files rather than one big one.
There are a lot of API calls and a lot of shared information. I'm making use of module.exports, but I'm not sure it's the best way to go about it. I'd like some advice on how to do this more correctly, or whether I should use some other method. I'm using module.exports to pass back specific data rather than functions.
For example, here's the authentication function, which was in the larger file and is now in authenticate.js (some irrelevant parts were taken out):
const fs = require('fs');
const request = require('request');

module.exports.authenticate = (logger) => {
  return new Promise((resolve, reject) => {
    const authentication = new logger('Authentication Service');
    fs.createReadStream('auth.json').pipe(request.post('https://example.com/auth', function (error, response, body) {
      authentication.log('Authenticating API access');
      body = JSON.parse(body);
      const token = body.response.token;
      if (typeof token === 'undefined' || token === '') {
        return reject('No Token Available'); // return so we don't continue after rejecting
      }
      authentication.log('Successfully logged in.');
      module.exports.token = token;
      resolve();
    }));
  });
};
So specifically, I'm using module.exports.token = token; to pass back the token info that was just retrieved from the API call. I'm doing this in quite a few modules, though, for different pieces of information.
Is this proper and good practice?
Thanks!
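One commonly suggested alternative is to resolve the promise with the token instead of mutating module.exports, so each caller receives the value explicitly. A sketch, reusing the same hypothetical logger and endpoint:

const fs = require('fs');
const request = require('request');

module.exports.authenticate = (logger) => {
  return new Promise((resolve, reject) => {
    const authentication = new logger('Authentication Service');
    fs.createReadStream('auth.json').pipe(request.post('https://example.com/auth', (error, response, body) => {
      if (error) {
        return reject(error);
      }
      const token = JSON.parse(body).response.token;
      if (typeof token === 'undefined' || token === '') {
        return reject('No Token Available');
      }
      authentication.log('Successfully logged in.');
      resolve(token); // hand the value back to the caller
    }));
  });
};

A caller can then write authenticate(logger).then(token => ...) and pass the token wherever it is needed, which avoids hidden shared state between modules.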
I'm using Node.js simply so that I can run scheduled tasks and make GET requests. I'll paste some code that shows what I want to do, although it doesn't work, for an obvious reason:
const http = require("http");
const request = require("request");

http.createServer(function (req, res) {
  res.writeHead(200, {"Content-Type": "text/html"});
  res.write("Hello, World!");
  let a = getRequest();
  console.log(a); // undefined: the request hasn't completed yet
  res.end();
}).listen(8080);

function getRequest() {
  let b;
  request("http://www.google.com", function(err, res, body) {
    b = body; // assigned only after getRequest has already returned
  });
  return b; // returns before the callback runs
}
So the b from the body does not work because the request is asynchronous, which leaves b undefined when it is eventually printed. I know these callback functions are meant to be chained, since the callback is the only place where the contents of body can be accessed. However, I don't want to keep chaining functions because it completely destroys the structure of the program; I want to keep all my Node server commands inside the http.createServer block rather than in functions called from inside the callback. In this example it doesn't really make sense for the process to be asynchronous anyway, since there is only one GET request and nothing can be logged until the body has been received.
I just need a simple way to scrape data with GET requests. What would be perfect is a function I could give a bunch of links; it would get the raw HTML from each and wait for them all to be done so that I can process all the data at once.
How can something like this be implemented in Node.js?
You can do that using the sync-request module.
With this module you can make synchronous web requests from your Node.js code.
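For example, a minimal sketch (the URLs are placeholders):

// assumes: npm install sync-request
const request = require('sync-request');

const urls = [
  'http://www.google.com',
  'http://www.example.com'
];

// fetch each page one after another; execution blocks until each response arrives
const pages = urls.map(function (url) {
  const res = request('GET', url);
  return res.getBody('utf8'); // raw HTML as a string
});

console.log(pages.length + ' pages fetched');

Be aware that sync-request blocks the event loop for the duration of each request, so it suits scripts and scheduled tasks rather than servers handling concurrent traffic. If you later want an asynchronous version that still "waits for them all", Promise.all over an array of request promises gives the same collect-then-process shape.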
So I've been doing some reading and I think I have a general grasp of this subject, but I could use some insight from someone more experienced. I've been trying to write a simple RSS reader in Meteor and have been facing some issues with calling the Meteor method asynchronously. I currently define the method on the server (synchronously) and call it on the client (asynchronously). What I don't understand is that when I make the HTTP call on the server, an undefined value is returned to my client if I pass a callback into the request, but when I make the API request synchronously everything seems to work fine. Is this the normal behavior I should expect, and the way I should be making the API call?
Meteor.methods({
  getSubReddit(subreddit) {
    this.unblock();
    const url = 'http://www.reddit.com/r/' + subreddit + '/.rss';
    const response = HTTP.get(url, {}, (err, res) => {
      // these returns go to the HTTP.get callback, not the method,
      // so the method itself returns undefined
      if (!err) {
        //console.log(res.content);
        return res;
      } else {
        return err;
      }
    });
  }
});
Here's the method defined on the server side. Note that logging res.content shows that I'm actually getting the right content back from the call. I've read some other answers on the topic and seen mentions of using Future/wrapAsync, but I'm not sure I get it. Any help would be greatly appreciated!
HTTP.get does asynchronous work, so the callback passed to it is invoked outside of the Meteor method's call context; whatever you return from that callback is lost.
To get the desired result, wrap the call with Meteor.wrapAsync:
Meteor.methods({
  getSubReddit(subreddit) {
    // IMPORTANT: unblock methods call queue
    this.unblock();
    const url = 'http://www.reddit.com/r/' + subreddit + '/.rss';
    const httpGetSync = Meteor.wrapAsync(HTTP.get);
    try {
      const response = httpGetSync(url, {});
      //console.log(response.content);
      return response.content;
    } catch (err) {
      // pass error to client
      throw new Meteor.Error(...);
    }
  }
});
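On the client, the method call remains asynchronous; a minimal sketch of consuming it:

// client side: the method still looks async from here
Meteor.call('getSubReddit', 'javascript', function (err, content) {
  if (err) {
    console.error(err);
  } else {
    console.log(content); // the RSS XML returned by the method
  }
});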
Question: Would you consider dangling callbacks bad Node.js style, or even dangerous? If so, under what premise?
Case: as described below, imagine you need to make a call to a DB from an Express server that updates some data, yet the client doesn't need to be informed about the result. In this case you could return a response immediately, without waiting for the asynchronous call to complete. Call it a "dangling callback", for lack of a better name.
Why is this interesting? Because tutorials and documentation mostly show the waiting case, in the worst cases teaching callback hell. Recall your first experiences with, say, Express, MongoDB, and Passport.
Example:
'use strict'

const assert = require('assert')
const express = require('express')
const app = express()

function longOperation (value, cb) {
  // might fail and: return cb(err) ...here
  setTimeout(() => {
    // after some time, invokes the callback
    return cb(null, value)
  }, 4000)
}

app.get('/ping', function (req, res) {
  // do some declarations here
  //
  // do some request processing here
  //
  // call a long op, such as a DB call, here;
  // however the client does not need to be
  // informed about the result of the operation
  longOperation(1, (err, val) => {
    assert(!err)
    assert(val === 1)
    console.log('...fired callback here though')
    return
  })
  console.log('sending response here...')
  return res.send('Hello!')
})

let server = app.listen(3000, function () {
  console.log('Starting test:')
})
Yeah, this is basically what is called a "fire and forget" service in other contexts, and it could also be the first step in a good design implementing command-query responsibility segregation.
I wouldn't consider it a "dangling callback"; the response in this case acknowledges that the request was received. Your best bet here is to make sure the response includes some kind of hypermedia that lets clients check the status of their request later and, if the failure is something they can fix, have the content at that status URL tell them how.
Think of it in the case of a user registration workflow where the user has to be approved by an admin, or has to confirm their email before getting access.
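A minimal sketch of that idea, reusing longOperation from the question (the /status/:id route and the in-memory job store are hypothetical):

'use strict'

const express = require('express')
const crypto = require('crypto')
const app = express()

const jobs = {} // in-memory job store, for illustration only

app.get('/ping', function (req, res) {
  const id = crypto.randomBytes(8).toString('hex')
  jobs[id] = { state: 'pending' }

  // fire and forget: the callback only updates the job record
  longOperation(1, (err, val) => {
    jobs[id] = err ? { state: 'failed', error: String(err) } : { state: 'done', value: val }
  })

  // 202 Accepted: acknowledge receipt and point at the status resource
  res.status(202).send({ status: '/status/' + id })
})

app.get('/status/:id', function (req, res) {
  const job = jobs[req.params.id]
  if (!job) {
    return res.status(404).send({ error: 'unknown job' })
  }
  res.send(job)
})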