This code works like it should, but after the fifth GET request it still does what it should on the backend (stores the data in the db), yet it logs nothing on the server and nothing changes on the frontend (React).
const express = require('express');
const router = express.Router();
const mongoose = require('mongoose');
const User = require('./login').User;

mongoose.connect('mongodb://localhost:27017/animationsdb');

router.get('/', async (req, res) => {
    await User.findOne({ username: req.query.username }, (err, result) => {
        if (result) {
            // when the user goes to his profile we send him the list of animations he liked
            // the list is stored in an array in the db, field likedAnimations
            res.send({ animationList: result.likedAnimations });
            console.log("Liked animations:", result.likedAnimations);
        } else {
            console.log("no result found");
            res.sendStatus(404);
        }
    });
});

router.put('/', async (req, res) => {
    console.log("username:", req.body.username);
    console.log("link:", req.body.link);
    // if the animation is already liked, then dislike it
    // if it's not liked, then store it in the db
    const user = await User.findOne({ username: req.body.username });
    if (user.likedAnimations.indexOf(req.body.link) === -1) {
        user.likedAnimations.push(req.body.link);
    } else {
        user.likedAnimations = arrayRemove(user.likedAnimations, user.likedAnimations[user.likedAnimations.indexOf(req.body.link)]);
    }
    user.save();
});

function arrayRemove(arr, value) {
    return arr.filter((item) => {
        return item != value;
    });
}

module.exports = router;
module.exports = router;
For the first five requests I get this output:
Liked animations: ["/animations/animated-button.html"]
GET /animation-list/?username=marko 200 5.152 ms - 54
Liked animations: ["/animations/animated-button.html"]
GET /animation-list/?username=marko 304 3.915 ms - -
After that I don't get any output in the server console and no changes on the frontend until I refresh the page, even though the db operations still work and the data is saved.
It appears you have a couple of issues going on. First, this request handler is not coded to handle errors properly, so when an error occurs it never sends a response and the request stays pending until the client eventually times it out. Second, you likely have some sort of database concurrency usage error that is the root issue here. Third, you're not using await properly with your database: you either use await or you pass a callback to your database, not both. You need to fix all three of these.
To address the first and third issues:
router.get('/', async (req, res) => {
    try {
        let result = await User.findOne({ username: req.query.username });
        if (result) {
            console.log("Liked animations:", result.likedAnimations);
            res.send({ animationList: result.likedAnimations });
        } else {
            console.log("no database result found");
            res.sendStatus(404);
        }
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
For the second issue, the particular database error you mention appears to be some sort of concurrency/locking issue internal to the database, triggered by the sequence of database operations your code executes. You can read more about that error in the discussion here. Since the code you show only contains a single read operation, we would need to see a much larger context of relevant code, including the code that writes to the database for this operation, to be able to offer any ideas on how to fix the root cause of this issue.
We can't see the whole flow here, but you need to use atomic update operations in your database. The PUT handler you show is an immediate race condition. In multi-client databases, you don't get a value, modify it and then write it back. That's an opportunity for a race condition, because someone else could modify the value while you're sitting there holding it; when you then write back your held value, you overwrite the change that the other client just made. That's a race condition. Instead, you use an atomic operation that performs the update directly in one database call, or you use transactions to make a multi-step operation safe.
I'd suggest you read this article on atomic operations in MongoDB. You probably want to use something like .findAndModify() so you can find and change an item in the database in one atomic operation. If you search for "atomic operations in mongodb", there are many other articles on the topic.
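As a rough sketch of what that could look like for the PUT handler above (untested, and assuming a reasonably recent Mongoose where updateOne resolves with a modifiedCount), using $pull/$addToSet so the toggle happens inside the database rather than in application code, and sending a response so the request doesn't hang:

router.put('/', async (req, res) => {
    try {
        const { username, link } = req.body;
        // Atomically remove the link if it is already in the array ("dislike").
        const pulled = await User.updateOne(
            { username, likedAnimations: link },
            { $pull: { likedAnimations: link } }
        );
        if (pulled.modifiedCount === 0) {
            // Nothing was removed, so the link wasn't there yet: add it ("like").
            // $addToSet avoids duplicates even if two requests race here.
            await User.updateOne(
                { username },
                { $addToSet: { likedAnimations: link } }
            );
        }
        res.sendStatus(200);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});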
Currently I am dealing with the following situation:
I have a ShareDB backend up and running in order to realize real-time collaboration (text writing).
Every time a user connects I would like to check whether the document the user intends to work on exists in the database. If it does NOT exist, create it first; if it DOES exist, proceed normally. This should be done in the "connect" middleware:
var backend = new ShareDB();

backend.use('connect', function(context, next) {
    console.log('connect');
    var connection = backend.connect();
    var doc = connection.get('collection_name', 'document_id');
    doc.fetch(function(err) {
        if (err) throw err;
        if (doc.type === null) {
            doc.create({content: ''});
            return;
        }
    });
    next();
});
But it triggers an infinite loop, because I trigger a connect action inside the connect middleware.
So I have no idea how to access the database in the middleware... any ideas?
Thanks!
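One way around the recursion (a sketch only, not a verified answer, using just the ShareDB calls already shown above and assuming the loop is caused by backend.connect() re-triggering the 'connect' middleware) is to create a single server-side connection once, outside the middleware, and reuse it inside the hook so the hook never creates new connections itself:

var backend = new ShareDB();
// Created once at startup, before the middleware is registered;
// the middleware below reuses it instead of calling backend.connect() per request.
var serverConnection = backend.connect();

backend.use('connect', function(context, next) {
    var doc = serverConnection.get('collection_name', 'document_id');
    doc.fetch(function(err) {
        if (err) return next(err);
        if (doc.type === null) {
            // Document doesn't exist yet: create it before letting the client proceed.
            doc.create({content: ''}, next);
            return;
        }
        next();
    });
});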
Say I have this endpoint on an express server:
app.get('/', async (req, res) => {
    var foo = await databaseGetFoo();
    if (foo == true) {
        foo = false;
        somethingThatShouldOnlyBeDoneOnce();
        await databaseSetFoo(foo);
    }
})
I think this creates a race condition if the endpoint is called twice simultaneously?
If so how can I prevent this race condition from happening?
OK, so based on the comments, I've got a little better understanding of what you want here.
Assuming that somethingThatShouldOnlyBeDoneOnce is doing something asynchronous (like writing to a database), you are correct that a user (or users) making multiple calls to that endpoint will potentially cause that operation to happen repeatedly.
Going by your comment about allowing a single comment per user, and assuming you've got middleware earlier in the stack that can uniquely identify a user by session or something, you could naively implement something like this, which should keep you out of trouble (usual disclaimers that this is untested, etc.):
let processingMap = {};

app.get('/', async (req, res, next) => {
    if (!processingMap[req.user.userId]) {
        // add the user to the processing map
        processingMap = {
            ...processingMap,
            [req.user.userId]: true
        };
        const hasUserAlreadySubmittedComment = await queryDBForCommentByUser(req.user.userId);
        if (!hasUserAlreadySubmittedComment) {
            // we now know we're the only comment in process
            // and the user hasn't previously submitted a comment,
            // so submit it now:
            await writeCommentToDB();
            delete processingMap[req.user.userId];
            res.send('Nice, comment submitted');
        } else {
            delete processingMap[req.user.userId];
            const err = new Error('Sorry, only one comment per user');
            err.statusCode = 400;
            next(err);
        }
    } else {
        // note: don't delete the map entry here; it belongs to the request
        // that is already in flight for this user
        const err = new Error('Request already in process for this user');
        err.statusCode = 400;
        next(err);
    }
})
Since insertion into the processingMap is all synchronous, and Node can only be doing one thing at a time, the first request for a user to hit this route handler will essentially lock for that user until the lock is removed when we're finished handling the request.
BUT... this is a naive solution and it breaks the rules for a 12 factor app. Specifically, rule 6, which is that your applications should be stateless processes. We've now introduced state into your application.
If you're sure you'll only ever run this as a single process, you're fine. However, the second you go to scale horizontally by deploying multiple nodes (via whatever method: PM2, Node's cluster module, Docker, K8s, etc.), you're hosed with the above solution. Node Server 1 has no idea about the local state of Node Server 2, so multiple requests hitting different instances of your multi-node application can't co-manage the state of the processing map.
The more robust solution would be to implement some kind of queue system, likely leveraging a separate piece of infrastructure like Redis. That way all of your nodes could use the same Redis instance to share state and now you can scale up to many, many instances of your application and all of them can share info.
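For illustration only, here is a minimal sketch of that shared-state idea. It is not the full queue system described above, just a simple per-user lock in Redis, using ioredis (my choice, not something from this answer) and Redis's SET ... NX EX; key names and the 30-second TTL are made-up values:

const Redis = require('ioredis');
const redis = new Redis(); // assumes a Redis instance on localhost:6379

app.get('/', async (req, res, next) => {
    const lockKey = `comment-lock:${req.user.userId}`; // hypothetical key scheme
    // SET ... NX only succeeds if the key doesn't exist yet; EX 30 makes the
    // lock expire after 30 seconds so a crashed worker can't hold it forever.
    const acquired = await redis.set(lockKey, '1', 'EX', 30, 'NX');
    if (!acquired) {
        const err = new Error('Request already in process for this user');
        err.statusCode = 400;
        return next(err);
    }
    try {
        const alreadySubmitted = await queryDBForCommentByUser(req.user.userId);
        if (alreadySubmitted) {
            const err = new Error('Sorry, only one comment per user');
            err.statusCode = 400;
            return next(err);
        }
        await writeCommentToDB();
        res.send('Nice, comment submitted');
    } finally {
        // release the lock whatever happened, so the user isn't locked out
        await redis.del(lockKey);
    }
});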
I don't really have all the details on exactly how to go about building that out and it seems out of scope for this question anyway, but hopefully I've given you at least one solution and some idea of what to think about at a broader level.
I have a Node.js server which queries a MySQL database. It serves both as an API endpoint that returns JSON and as the backend server for my Express application, where it returns the retrieved list as an object to the view.
I am looking into implementing flat-cache for increasing the response time. Below is the code snippet.
const flatCache = require('flat-cache');
var cache = flatCache.load('productsCache');

// get all products for the given customer id
router.get('/all/:customer_id', flatCacheMiddleware, function(req, res) {
    var customerId = req.params.customer_id;
    // implemented custom handler for querying
    queryHandler.queryRecordsWithParam('select * from products where idCustomers = ? order by CreatedDateTime DESC', customerId, function(err, rows) {
        if (err) {
            res.status(500).send(err.message);
            return;
        }
        res.status(200).send(rows);
    });
});
// caching middleware
function flatCacheMiddleware(req, res, next) {
    var key = '__express__' + req.originalUrl || req.url;
    var cacheContent = cache.getKey(key);
    if (cacheContent) {
        res.send(cacheContent);
    } else {
        res.sendResponse = res.send;
        res.send = (body) => {
            cache.setKey(key, body);
            cache.save();
            res.sendResponse(body);
        };
        next();
    }
}
I ran the node.js server locally and the caching has indeed greatly reduced the response time.
However there are two issues I am facing that I need your help with.
Before adding the flatCacheMiddleware middleware, I received the response as JSON; now when I test, it sends me HTML. I am not too well versed with JS strict mode (planning to learn it soon), but I am sure the answer lies in the flatCacheMiddleware function.
So what do I modify in the flatCacheMiddleware function so it would send me JSON?
I manually added a new row to the products table for that customer, and when I called the endpoint, it still showed me the old rows. So at what point do I clear the cache?
In a web app it would ideally be when the user logs out, but if I am using this as an API endpoint (and even in the web app there is no guarantee that the user will log out the traditional way), how do I determine that new records have been added and the cache needs to be cleared?
Appreciate the help. If there are any other node.js caching related suggestions you all can give, it would be truly helpful.
I found a solution to the issue by parsing the cached content before sending it.
Change line:
res.send(cacheContent);
To:
res.send(JSON.parse(cacheContent));
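The HTML you're seeing is just Express defaulting to a text/html content type when res.send is given a string. As an alternative sketch (my suggestion, not part of the original answer), if the cached value comes back as a string (which the JSON.parse fix above suggests), you could keep it as-is and set the content type explicitly:

if (cacheContent) {
    // tell Express the cached string is JSON so the client parses it correctly
    res.type('application/json').send(cacheContent);
}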
I created a "brute force" cache invalidation method. Calling the clear function will clear both the cache file and the data stored in memory. You have to call it after a db change. You can also try deleting a specific key using cache.removeKey('key');.
function clear(req, res, next) {
    try {
        cache.destroy();
        res.status(200).json({'message': 'cache invalidated'});
    } catch (err) {
        logger.error(`cache invalidation error ${JSON.stringify(err)}`);
        res.status(500).json({
            'message': 'cache invalidation error',
            'error': JSON.stringify(err)
        });
    }
}
Note that calling cache.save() will remove the entries cached by other API functions. Changing it to cache.save(true) will "prevent the removal of non visited keys" (as mentioned in the flat-cache documentation).
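To tie this to the second question (when to clear the cache): one common approach is to invalidate just the affected key right after the write succeeds. A sketch of a hypothetical helper, using the key scheme of the middleware above; note that the middleware keys on req.originalUrl, so the exact prefix depends on where the router is mounted and must be adjusted to your app:

// Hypothetical helper: call this right after any write to the products table.
function invalidateProductsCache(customerId) {
    // key must match what flatCacheMiddleware computed for the GET route
    cache.removeKey('__express__/all/' + customerId);
    cache.save(true); // true = don't prune other, non-visited keys
}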
I have a database query (function) which is asynchronous, and as a result I need to use a callback function (no problem with that). However, in Node.js I need to make two separate queries in the same POST function. Both are asynchronous, so I'm having trouble figuring out how to continue with the execution of the POST.
Goal:
Validate form entries for malformations, etc.
Check if username exists in db before saving (must be unique)
Check if email exists in db before saving (must be unique)
Save user if everything checks out, else throw some errors
Normally, I would have something like this (oversimplified):
postSearch = function(req, res, next) {
    var searchCallback = function(err, results) {
        // Do stuff with the results (render page, console log, whatever)
    };
    // This is the db query - async. Passes search results to callback
    defaultSearch(input, searchCallback);
};
Which only has one async query, so only one callback. Normally I would just get the db results and render a page. Now I have to validate some form data, so my POST function looks something like this:
postUser = function(req, res, next) {
    // Some static form validation (works, no issues)

    var usernameExistsCallback = function(err, exists) {
        // Does the username exist? True/false
    };
    // DB query - passes true or false to the callback
    usernameExists(username, usernameExistsCallback);

    var emailExistsCallback = function(err, exists) {
        // Does the email exist? True/false
    };
    // DB query - passes true or false to the callback
    emailExists(email, emailExistsCallback);

    // Check if ALL validation constraints check out, implement error logic
};
The node-postgres module is async, and as a result the queries need callbacks (if I want to return any value, otherwise I can just run the query and disconnect). I have no problem executing both of those queries. I can console.log() the correct results in the callbacks. But now I don't know how to access those results later on in my postUser function.
I've read all about async JavaScript functions, but I've been scratching my head on this one for three hours now trying ridiculous things (like setting global variables [oh my!]) to no avail.
The results I need from these two queries are simply true or false. How can I organize this code to be able to use those results in the postUser function? It seems to me that I need something like a third callback, but I have no clue how to implement something like that. Is it necessary for me to start using async? Would it be a good idea? Nothing in this application is super complex thus far, and I'd like to keep dependencies low where it makes sense.
How about this:
postUser = function(req, res, next) {
    // Some static form validation (works, no issues)

    var emailExistsCallback = function(err, exists) {
        // Does the email exist? True/false

        var usernameExistsCallback = function(err, exists) {
            // Does the username exist? True/false
            // DO STUFF HERE
        };
        // DB query - passes true or false to the callback
        usernameExists(username, usernameExistsCallback);
    };
    // DB query - passes true or false to the callback
    emailExists(email, emailExistsCallback);

    // Check if ALL validation constraints check out, implement error logic
};
The simplest way is to nest the functions like this:
postUser = function(req, res, next) {
    var emailExistsCallback = function(err, exists) {
        // Does the email exist? True/false
        // Check if ALL validation constraints check out, implement error logic
        next(); // <= you should finally call the "next" callback in order to proceed
    };
    var usernameExistsCallback = function(err, exists) {
        // Does the username exist? True/false
        emailExists(email, emailExistsCallback); // <= note this
    };
    usernameExists(username, usernameExistsCallback);
};
Or you can use async, Q, seq or yaff (which is Seq reimplemented). There are a number of libs to make your life easier. It's better to try them and decide which one is right for you, your style, your requirements and so on.
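For comparison (my addition, not one of the libraries named above), the same "wait for both checks" logic can be written with plain built-in Promises by wrapping the callback-style usernameExists/emailExists helpers from the question, roughly like this:

postUser = function(req, res, next) {
    // wrap the callback-style queries in Promises so both can run in parallel
    var usernameCheck = new Promise(function(resolve, reject) {
        usernameExists(username, function(err, exists) {
            if (err) return reject(err);
            resolve(exists); // true/false: does the username exist?
        });
    });
    var emailCheck = new Promise(function(resolve, reject) {
        emailExists(email, function(err, exists) {
            if (err) return reject(err);
            resolve(exists); // true/false: does the email exist?
        });
    });

    Promise.all([usernameCheck, emailCheck]).then(function(results) {
        var usernameTaken = results[0];
        var emailTaken = results[1];
        // Check if ALL validation constraints check out, implement error logic,
        // save the user and send the response here
    }).catch(next); // any query error goes to Express's error handler
};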
You can use a common var to keep track of how many responses you've got back. Once you have them both, you can do the stuff that needs them in a third "callback", which I called done():
postUser = function(req, res, next) {
    var hops = 0, total = 2, usernameTaken, emailTaken;

    function done() {
        // both responses are in; usernameTaken and emailTaken now hold the results,
        // so check if ALL validation constraints check out and implement error logic here
    }

    // Some static form validation (works, no issues)

    var usernameExistsCallback = function(err, exists) {
        // Does the username exist? True/false
        usernameTaken = exists;
        if (++hops >= total) { done(); }
    };
    // DB query - passes true or false to the callback
    usernameExists(username, usernameExistsCallback);

    var emailExistsCallback = function(err, exists) {
        // Does the email exist? True/false
        emailTaken = exists;
        if (++hops >= total) { done(); }
    };
    // DB query - passes true or false to the callback
    emailExists(email, emailExistsCallback);
};
You should probably add error handling as needed by your app, specifically in both of the SQL callbacks, but this is a nice parallel I/O pattern that should do what you need.
I'm writing my first (non-tutorial) Node application and am at the point where I'm writing a function that should take a username and password as parameters and query them against the user table of my database, returning either true or false. The database is set up, and the app is connecting to it successfully.
However, I haven't worked with SQL very much, nor node, and I'm unsure how to proceed with this function (and short surrounding script). Here it is:
console.log('validator module initialized');

var login = require("./db_connect");

function validate(username, password) {
    connection.connect();
    console.log('Connection with the officeball MySQL database opened...');

    connection.query(' //SQL query ', function(err, rows, fields) {
        //code to execute
    });

    connection.end();
    console.log('...Connection with the officeball MySQL database closed.');

    if () { //not exactly sure how this should be set up
        return true;
    } else { //not exactly sure how this should be set up
        return false;
    }
}

exports.validate = validate;
This is using node-mysql. I'm looking for a basic example of how I might set the query and validation up.
I think you'll want to rethink your app in a more Node-like way (i.e. one that recognizes that many/most things happen asynchronously, so you're not usually "returning" from a function like this, but doing a callback from it). Not sure what you plan to get from node-mysql, but I would probably just use the plain mysql module. The following code is still most likely not entirely what you want, but will hopefully get you thinking about it correctly.
Note that the use of 'return' below is not actually returning a result (the callback itself should not return anything, and thus it's like returning undefined). The return statements are there so you exit the function, which saves a lot of tedious if/else blocks.
Hope this helps, but I'd suggest looking at various node projects on github to get a better feel for the asynchronous nature of writing for node.
var mysql = require('mysql'); // the plain mysql module mentioned above

function validate(username, password, callback) {
    var connection = mysql.createConnection({
        user: 'foo',
        password: 'bar',
        database: 'test',
        host: '127.0.0.1'
    });

    connection.connect(function(err) {
        if (err) return callback(new Error('Failed to connect'), null);
        // if no error, you can do things now.

        connection.query('select username, password from usertable where username=?',
            username,
            function(err, rows, fields) {
                // we are done with the connection at this point, so we can close it
                connection.end();

                // here is where you process results
                if (err)
                    return callback(new Error('Error while performing query'), null);
                if (rows.length !== 1)
                    return callback(new Error('Failed to find exactly one user'), null);

                // test the password you provided against the one in the DB.
                // note this is terrible practice - you should not store
                // passwords in the clear, obviously. You should store a hash,
                // but this is trying to get you on the right general path
                if (rows[0].password === password) {
                    // you would probably want a more useful callback result than
                    // just returning the username, but again - an example
                    return callback(null, rows[0].username);
                } else {
                    return callback(new Error('Bad Password'), null);
                }
            });
    });
}
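A hypothetical usage example, e.g. from an Express route handler (names like req.body.username are assumptions about how you collect the credentials, not part of the original question):

// somewhere in your Express app, assuming body parsing is set up
app.post('/login', function(req, res) {
    validate(req.body.username, req.body.password, function(err, username) {
        if (err) {
            return res.status(401).send('Login failed: ' + err.message);
        }
        res.send('Welcome, ' + username);
    });
});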