Node.js function without callback - javascript

I have a Node.js server. When a user requests a page, I call a function that pulls some info from the database and services the request: a simple function with a callback, which then executes response.send.
I also need to perform secondary computation/database updates which are not necessary for rendering the page request. I don't want the user to wait for these secondary ops to complete (even though they take only 200 ms).
Is there a way to call a function and exit gracefully without waiting for its callback?

You can simply do something like this:
app.get('/path', function(req, res){
    getInfoFromDatabase(function(err, data){ // get info from the database
        res.render('myview', {info: data});
        // perform post-render operations; the response has already been sent
        postRenderingCode();
    });
});
Note that getInfoFromDatabase is asynchronous, so the render and the post-render work belong inside its callback.

If I understand your problem correctly you can use setTimeout with a value of 0 to place the maintenance code at the end of the execution queue.
function service(user, callback) {
    // This will be done later
    setTimeout(function() {
        console.log("Doing some maintenance work now...");
    }, 0);
    // Service the user
    callback("Here's your data " + user);
}

service("John", function(data) { console.log(data); });
service("Jane", function(data) { console.log(data); });
The output will be:
Here's your data John
Here's your data Jane
Doing some maintenance work now...
Doing some maintenance work now...
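A setTimeout of 0 works, but in Node.js, setImmediate is the more idiomatic way to defer work until after the current operation completes. A minimal runnable sketch of the same pattern:

```javascript
// Deferring maintenance work with setImmediate instead of setTimeout(fn, 0).
// The deferred callbacks run only after the current synchronous work is done.
const order = [];

function service(user, callback) {
  // This will be done later, after the response is delivered
  setImmediate(function () {
    order.push('maintenance for ' + user);
  });
  // Service the user right away
  callback("Here's your data " + user);
}

service('John', function (data) { order.push(data); });
service('Jane', function (data) { order.push(data); });

// Prints the two responses first, then the two maintenance lines
setImmediate(function () {
  console.log(order);
});
```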

You can call your extra asynchronous function before or after your actual response; for example:
yourCoolFunction(); // does awesome stuff...
response.writeHead(200, 'OK');
response.write('some cool data response');
response.end();
Note that yourCoolFunction must be asynchronous; otherwise, the rest of the code will wait for it to complete before responding.

Assuming you're using Express:
function(req, res, next) {
    doSomeAsyncWork(function(e, d) {
        // Some logic.
        doSomeMoreAsyncWork(function() {})
        res.send(/* some data */)
    })
}
Basically, you don't really care about the result of the additional async work, so you can pass a function that does nothing as the callback.
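Rather than a completely empty callback, it is usually worth at least logging a failure from the fire-and-forget work. A minimal sketch, where doSomeMoreAsyncWork is just a stand-in (simulated with setImmediate) for your own async function:

```javascript
const results = [];

// Stand-in for some async work we don't need to wait for
function doSomeMoreAsyncWork(callback) {
  setImmediate(function () {
    callback(null, 'done');
  });
}

// Fire and forget, but still surface errors in the log
function fireAndForget() {
  doSomeMoreAsyncWork(function (err, data) {
    if (err) {
      console.error('background work failed:', err);
      return;
    }
    results.push(data);
  });
}

fireAndForget();
```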

Since the answers so far don't quite cover this, and to avoid confusion: what I suggest is to use an event on the object you are working with:
function doStuff() {
    myObj.emit('myEvent', param);
}
function callback(param) {
    // do the secondary work here
}
myObj.on('myEvent', callback);

Well, just do what you said: render the page, respond to the request, and do whatever you have to do. Your code isn't suddenly going to die because you responded to the request.
With Express:
function handleTheRequest(req, res) {
    res.status(200).send("the response")
    // do whatever you like here
}

Related

Service Worker 'Hello World' example

I am learning about service workers, as I have a use case to create a fetch listener that will pass back an arbitrary binary response.
I have no idea where to start. The examples I have seen online talk about making a server request, caching it in the service worker, and passing it back. What I really want to do is just pass back my own response, not make a query to the server and cache it!
What I am looking for, as a start, is something such that once the service worker is active, when the user enters the following URL in the browser (or uses the Fetch API to get it):
http://myapp/helloworld
it will show 'helloworld' in the browser. The service worker will be something like the following, but I don't have a clue how to make it work.
self.addEventListener('fetch', event => {
    // compare end of url with /helloworld
    // if match, respond with 'helloworld', otherwise fetch the response from the server
});
This is just going to be a very brief, broad overview of how I would tackle the problem.
First, I would follow a guide like this:
https://css-tricks.com/add-a-service-worker-to-your-site/
// Listen for request events
self.addEventListener('fetch', function (event) {
    // Get the request
    let request = event.request;
    ...
});
Then you'll use this bit of code as a guideline for what you want to do:
event.respondWith(
    fetch(request).then(function (response) {
        return response;
    }).catch(function (error) {
        return caches.match(request).then(function (response) {
            return response;
        });
    })
);
With some modifications.
First, you'll want to check if it's a normal non-/helloworld type of request, and if it is, do something like this:
if (normalRequest) {
    event.respondWith(
        fetch(request).then(function (response) {
            return response;
        })
    );
} else {
    ... TODO
}
And in the TODO section, you'll do your helloworld code - it's not clear to me what you want to do with that, so I can't really go into more detail. But this is the overall structure I would use.
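As a sketch of the TODO branch: for the /helloworld case you don't fetch at all, you construct a Response yourself. The handler below is a hypothetical standalone version (so the branching is easy to see and test); in a real service worker you would register it with self.addEventListener:

```javascript
// Hypothetical fetch handler: answers /helloworld locally, lets everything
// else fall through to the network (no respondWith call = normal handling).
function handleFetch(event) {
  const url = new URL(event.request.url);
  if (url.pathname.endsWith('/helloworld')) {
    // Synthesize the response locally; no server round trip, no cache.
    event.respondWith(new Response('helloworld', {
      status: 200,
      headers: { 'Content-Type': 'text/plain' }
    }));
  }
}

// In a real service worker file:
if (typeof self !== 'undefined' && self.addEventListener) {
  self.addEventListener('fetch', handleFetch);
}
```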

Node JS - Async - Response is sent while queries are being executed

I am using the piece of code below to execute a set of queries and send a response after doing some validations on the query results. For this scenario, I am using the async module in Node.js.
async.series([
    function(callback){
        common.commonValidations(db);
        callback();
    },
    function(callback){
        console.log('second function');
        res.end(JSON.stringify(gErrors));
        callback();
    }
], function(err){
    console.log('sending res to client');
    console.log(err);
});
The common.commonValidations(db) function is used to execute a few db2 queries.
My issue is that, although I am using the async module, the response is sent to the client while the query execution is still going on. As per my understanding of the async module, the second function in the array is executed once the first function is done with its job.
Can someone help me with this? Thanks in advance.
It looks like common.commonValidations(db) is an asynchronous function, but you are not waiting for it to finish: you are calling callback() before the answer from commonValidations comes back.
One possible change might look like this:
common.commonValidations(db, function(err, data) {
    // check error
    // process data
    // and then
    callback()
})
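The general shape of the fix: the series step must not call callback() until the async work reports back. A runnable sketch with a simulated commonValidations (setImmediate stands in for the db2 queries):

```javascript
const steps = [];

// Simulated async validation; setImmediate stands in for the db2 queries
function commonValidations(db, done) {
  setImmediate(function () {
    steps.push('validations finished');
    done(null, 'validation results');
  });
}

// The series step: only signal completion from inside the inner callback
function firstStep(callback) {
  commonValidations('fake-db', function (err, data) {
    if (err) { return callback(err); }
    steps.push('now safe to send the response');
    callback();
  });
}

firstStep(function (err) {
  console.log(steps);
});
```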

Request functions are executing out of order in node.js

I am trying to make a scraper, but I can't seem to get the code to execute in the right order. I need the album/album-art request function to execute after the title and artist function. I know Node.js handles this sort of thing asynchronously, but I've tried moving things all over and still have no luck.
Here's the Code
Please pardon the mess and excess debug code.
Current output:
TESTED!!!
req
No error
Pentemple - Pazuzu 2
Now Playing: Pentemple - Pazuzu 2
10
Pentemple
10
Pentemple
1
{ artist: '',
title: '',
album: '',
albumArt: '',
testval: 'TESTED!!!' }
xtest
Because of the asynchronous request calls, the responses might not arrive in order; therefore, to preserve the order, you need to make the next request call inside the previous request's callback. Below is an example:
request(url1, function(err, res, html){
    if(!err)
    {
        // url1 successfully returned, call another dependent url
        request(url2, function(err2, res2, html2){
            if(!err2)
            {
                // url2 successfully returned, go on with another request call and so on ...
            }
        });
    }
    else
    {
        // first call failed, return gracefully here --
        callback(err); // if you have any
    }
})
However, as suggested in an earlier answer as well, this is an anti-pattern and will result in messy, cluttered code known as the pyramid of doom or callback hell.
I would suggest going with the wonderful async npm module, and then the same code can be written as:
var async = require('async');
async.waterfall([
    function(callback) {
        request(url1, function(error, res, html){
            callback(null, res, html);
        });
    },
    function(res1, html1, callback) {
        request(url2, function(error, res, html){
            callback(null, res1, html1, res, html);
        });
    } // ... AND SO ON
], function (err, result) {
    // the result contains the response sent by the last request callback
    if(!err)
    {
        // use your data
    }
});
JavaScript is asynchronous. If the requests are dependent on each other I recommend using callbacks so that when one request is complete, it calls the next one.
In most cases, performing a request in JavaScript is asynchronous in nature. That means requests do not block the entire process. To perform an action when the request is done, callbacks are used. Callbacks are functions that are added to the event loop queue once the request reaches a finished state. The easiest way (but for sure not the best one) to make requests run one after another is to call the second request in the first callback, the third request in the second callback, and so on.
request(profileurl, function (error, response, html) {
    console.log("req");
    if (!error) {
        // ...
        request(albumurl, function (error, response, html) {
            if (!error) {
                // ...
                request(albumurl, function (error, response, html) {
                    // ...
                });
            }
        });
    } else {
        console.log("ERROR: " + error);
    }
});
But such a practice is considered an anti-pattern, called the Pyramid of Doom, because nested callbacks make the code unreadable, hard to test, and hard to maintain.
Good practice is to use promises. They come out of the box with ES2015. But if you use ES5, you should use an additional module for them, like request-promise or Q.
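To make the promise suggestion concrete, here is the same sequential-requests shape with promise chaining; httpGet is a stand-in (simulated with setImmediate) for a promise-returning request such as request-promise or fetch:

```javascript
// Stand-in for a promise-returning HTTP request (e.g. request-promise, fetch)
function httpGet(url) {
  return new Promise(function (resolve) {
    setImmediate(function () {
      resolve('body of ' + url);
    });
  });
}

const pages = [];

// Flat chain instead of nested callbacks: each .then waits for the previous
const done = httpGet('profileurl')
  .then(function (profileHtml) {
    pages.push(profileHtml);
    return httpGet('albumurl'); // starts only after the profile request
  })
  .then(function (albumHtml) {
    pages.push(albumHtml);
  })
  .catch(function (err) {
    console.error('ERROR: ' + err);
  });
```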

Waiting for MongoDB findOne callback to complete before finishing app.get()

I'm relatively new to JavaScript and I am having trouble understanding how to use a MongoDB callback with an Express.js get. My problem seems to be that if the database search takes too long, the process falls out of the app.get() and gives the webpage an "Error code: ERR_EMPTY_RESPONSE".
Currently it works with most values, either finding the value or properly returning a 404 - not found, but there are some cases where it hangs for a few seconds before returning the ERR_EMPTY_RESPONSE. In the debugger, it reaches the end of the app.get(), where it returns ERR_EMPTY_RESPONSE, and after that the findOne callback finishes and goes to the 404, but by then it is too late.
I've tried using async and introducing waits with no success, which makes me feel like I am using app.get and findOne incorrectly.
Here is a general version of my code below:
app.get('/test', function (req, res) {
    var value = null;
    if (req.query.param)
        value = req.query.param;
    else
        value = defaultValue;
    var query = {start: {$lte: value}, end: {$gte: value}};
    var data = collection.findOne(query, function (err, data) {
        if (err){
            res.sendStatus(500);
        }
        else if (data) {
            res.end(data);
        }
        else {
            res.sendStatus(404);
        }
    });
});
What can I do to have the response wait for the database search to complete? Or is there a better way to return a database document from a request? Thanks for the help!
You should measure how long the db query takes.
If it's slow (> 5 seconds) and you can't speed it up, it might be a good idea to decouple it from the request by using some kind of job framework.
Return a redirect to the URL where the job status/result will be available.
I feel silly about this, but I completely ignored the fact that when using http.createServer(), I had a timeout of 3000 ms set. I misunderstood what this timeout was for, and this is what was causing my connection to close prematurely. Increasing this number allowed my most stubborn queries to complete.

Issuing internal express request

I'm curious if there is any way to issue an internal request in express without going through all the actual overhead of a real request. An example probably shows the motivation better:
app.get("/pages/:page", function(req, res)
{
    database_get(req.params.page, function(result)
    {
        // "Page" has an internal data reference, which we want to inline with the actual data:
        request(result.user_href, function(user_response)
        {
            result.user = user_response.json;
            res.send(result);
        });
    });
});
/// ....
app.get("/user/:name", function() ... );
So what we have here is a route whose data requires making another request to get further data. I'd like to access it by just doing something like app.go_get(user_href) instead of the heavy weight actual request. Now, I've asked around and the going strategy seems to be "split out your logic". However, it actually requires me to duplicate the logic, since the recursive data is referenced properly through URLs (as in the example above). So I end up having to do my own routing and duplicating routes everywhere.
Can you avoid the overhead of a real request? No. If you need the href from the first request in order to go to get a user object, you absolutely need to follow that link by making a second "real request."
If you have a database of users, you CAN avoid the request by including the user's ID on the page, and making a regular database call instead of following your own href.
Demo refactor on splitting out logic:
// Keep as little logic as possible in your routes:
app.get('/page/:page', function(req, res){
    var pageId = req.params.page;
    makePage(pageId, function(err, result){
        if(err){ return res.send(500) }
        res.send(result)
    })
})

// Abstract anything with a bunch of callback hell:
function makePage(pageId, callback){
    database_get(pageId, function(result) {
        // Since it's only now you know where to get the user info, the second request is acceptable
        // But abstract it:
        getUserByHref(result.user_href, function(err, data){
            if(err){ return callback(err) };
            result.user = data.json;
            callback(null, result);
        });
    });
}

// Also abstract anything used more than once:
function getUserByHref(href, callback){
    request(href, function(err, response, body){
        if(response.statusCode != 200){
            return callback(err);
        }
        var user = JSON.parse(body);
        return callback(null, user);
    })
}

// It sounds like you don't have local users
// If you did, you would abstract the database call, and use getUserById
function getUserById(id, callback){
    db.fetch(id, function(err, data){
        return callback(err, data);
    })
}
I've made a dedicated middleware for this use case; see my detailed answer here: https://stackoverflow.com/a/59514893/133327
